Soliya’s CEO Waidehi Gokhale is interviewed by All Tech Is Human

Waidehi Gokhale, Soliya CEO

This interview was originally published by All Tech is Human in their HX Report: Aligning Our Tech Future With Our Human Experience.


Technology stands to play an ever-increasing role in our world: Harnessing it as a vehicle for increasing connection and authentic discourse is a collective responsibility.


Tell us about your current role.

As the CEO of Soliya, I am responsible for setting our strategy and navigating its implementation, engaging with our Board, ensuring the wellbeing of the team, and holding overall fiduciary responsibility for the organization, which includes all our fundraising efforts.

A major goal for HX might be to change our approach to talking about, engaging with and designing technology in a way that's aligned with our needs as humans — not users. In your opinion, what needs to happen in order to improve the ways we currently approach technology?

Technology stands to play an ever-increasing role in our world: Harnessing it as a vehicle for increasing connection and authentic discourse is a collective responsibility. I think technology gets viewed as a quick fix: if something is broken, let's find a tech solution. In thinking about what that solution should be, we need to consider more closely what makes something broken and for whom. The person creating the solution has to understand what is “broken” from the same perspective as the person being impacted by the “brokenness.” Currently, there is a disconnect on this front. It's not just about user experience; it's about the core problem we're trying to solve and for whom, and right now that feels mismatched. The other concern is misinformation: there is a massive lack of ethical conduct and transparency in the development of technology solutions across all arenas. It could be argued that these two things go hand in hand. If we can come to a place where there is an understanding of technology as a vehicle with utility that enhances wellbeing collectively, then we will have moved the needle toward a healthier relationship with technology.

How would you describe your own relationship with technology?

Technology was always part of my household, as my father was keenly interested in all things technological. I was much less so. Fast-forward to my second career, and I found myself in an organization that was at the pioneering forefront of using technology as a vehicle to generate thoughtful and meaningful human connection. Before the dawn of ubiquitous video conferencing and overwhelming social media citizenship, back in 2003, they chose to design a technology focused on bringing diverse groups of people together to deliberately engage in provocative conversations under the guidance of trained facilitators. The tech application that was designed, and then iterated on over the years, was unique in both its intended function and its interface. I found myself jumping on board with this organization and working as part of a remote, incredibly diverse global team using technology all day, every day, both operationally, as an employee, and functionally, in terms of program delivery. I have been with them now for 14 years.

For me, technology is a means to an end. What I choose to use is fairly deliberate, but there is no doubt that it is increasingly becoming something that intersects all facets of life.

If we can come to a place where there is an understanding of technology as a vehicle with utility that enhances wellbeing collectively, then we will have moved the needle toward a healthier relationship with technology.

A frequent criticism of social media platforms is that they prioritize profit over people. Do you agree with that statement? And, if so, what needs to change in order for platforms to have people at the center of their decision-making?

I do agree with the statement. I don't believe they intended this to be the case from the outset; however, once the business model took off and proved astronomically lucrative, it became impossible to expect the platforms themselves to deviate. They are answerable to too many for whom profit has become the goal. I'm a little bit cynical in that I don't believe social media, in any of its current incarnations, can be intended to “bring people together.” So for current platforms, beyond a determined commitment to the safety of users, most specifically children, and a more deliberate approach to transparency of process and intention, I don't think there is much to be done. I think social media would have to be reimagined to include a very clear human element in the moderation of interactions if it is indeed to be something that prioritizes people.

Social psychology tells us negative information is “stickier” than the positive kind and many societies are experiencing a fairly dark view of tech, so can you help surface some positives? What would you cite as your top 3-5 positive uses of digital tech and connected media?

  1. When designed and implemented deliberately, technology can truly be used to bring people together to foster genuine and constructive engagement. Technology can enable a limitless number and diversity of people to be brought together for this purpose. When this is done well, it's transformational. The skills built and the seeds sown are the pillars of what will bring about lasting change in how the world engages.

  2. In a globalized world, it provides a vital vehicle for families and loved ones to maintain contact and connection with one another. This was surely proven over the past two years as the world grappled with a universal health crisis.

  3. Notwithstanding the perils of misinformation and disinformation, there has been a great leveling in terms of how many people can access how much information. If human nature didn't get in the way of progress, we would see that we have actually come a long way on this front - information is not available to just the lucky few, and a vast cross-section of information can be at one's fingertips. If used well, technology can in fact be a very powerful tool for dissemination.

  4. Technology has joined hands with science to make tremendous breakthroughs in diagnostics, in the delivery of treatment, and in radical new approaches and tools for medical use. Again, when done under a framework of ethics and compassion, technology can be a powerful ally for scientific progress.

Which organizations do you admire that are doing valuable work toward improving our tech future?

Build Up is doing excellent work.

If one person’s free speech is another’s harm and content moderation can never be perfect, what will it take to optimize human and algorithmic content moderation for tech users as well as policymakers? What steps are needed for optimal content moderation?

Ah, the ultimate conundrum: how to moderate content? If people cannot be trusted to author content with transparency and integrity, and consumers cannot be trusted to think critically about what they are reading, then who gets to speak without any limitations? Who doesn't get to speak, for whatever reason? And who decides? I think content moderation would optimally be done as a combination of human and machine/algorithmic tactics. The folks engaged in content moderation should be trained in key practices of critical thinking and multi-cultural insights and communication. This training should be done externally to the entity they work for, otherwise, they are moderating content with the agenda of their specific platform. There should be agreed-upon industry standards linked to understood societal agreements on basic ethics of content. AI will only be able to play a partial role in this practice. Even in its most advanced form, an AI database will ultimately struggle with nuance. As such, there needs to be a multi-layered approach to content moderation, recognizing that this will have implications for the speed of content delivery, which in turn means that, unless all buy into a methodology, no one platform will take the leap for fear of falling out of the race, as it were. True strides in solving this quagmire of moderation will require meaningful collaboration across the sphere of technology platforms, which may be too tall an order.

The folks engaged in content moderation should be trained in key practices of critical thinking and multi-cultural insights and communication. This training should be done externally to the entity they work for, otherwise, they are moderating content with the agenda of their specific platform.

What makes YOU optimistic that we, as a society, can build a tech future aligned with our human values?

The work we do at Soliya shows me on a daily basis that, if we use technology to create meaningful encounters that foster exchange and learning, more and more people will understand how best to harness technology to bring out the best in us as digital and in-person citizens.

Technology can enable truly difficult conversations to take place, provided they are well designed and implemented. These conversations in turn can foster a commitment to the practice of dialogue. If we can begin to use the practice of dialogue as the starting point for all design and decision-making processes, then perhaps the outcomes will allow us to align our efforts across all sectors with basic human values.

