
AI May Change the World, but First – Ethics and Empathy


It’s no secret that in recent years, SXSW has become more about what cool tech can do than why it should do it. That said, the absence of a substantive, actionable discussion of digital ethics was felt more acutely this year than in years past because of how advanced and immersive the technology has become. By the time the first talk was done, it was clear that this year’s speakers were no longer pondering whether human-machine symbiosis would happen, but rather when and how.

AI, machine learning, and enabling systematic collective thinking in humans

One of the biggest arguments I encountered at SXSW this year was over differing models of the human-machine intelligence relationship. Traditional models tend to position humans and machines on distinctly separate tracks that intersect at key points in time: AI informs human decisions through access to broader, more accurate data sets, or humans guide AI analysis with insights and questions that align to broader human goals. For many, this model simply “feels” safe: human contribution in the future state is assured, as are our authenticity and autonomy.

Newer models (such as those presented this year), however, fuse humans and AI together in a single system of collective knowledge sharing. Often referred to as “swarm intelligence,” these models blur the line between human and AI contributions by placing both in a single, multidirectional and exponentially expanding grid. Images of the “Borg” from Star Trek come to mind for many; and while human participation is assured, human authenticity and autonomy are not.
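To make the mechanics of that shared grid concrete, here is a minimal, purely illustrative sketch of swarm-style aggregation in Python. It is not a description of any platform presented at SXSW; the participant names, the conviction weight, and the update rule are all assumptions made for illustration. The point is simply that human and AI estimates live in one shared structure and are nudged, round by round, toward a collective answer.

```python
# Illustrative sketch only: human and AI participants each hold a numeric
# estimate and repeatedly nudge it toward the emerging group view. The
# names, weights, and update rule are invented for this example.

def swarm_round(values, conviction=0.7):
    """One update step: every participant moves toward the current group mean."""
    consensus = sum(values.values()) / len(values)
    return {pid: conviction * v + (1 - conviction) * consensus
            for pid, v in values.items()}

# Humans and AI models contribute to the same grid of estimates.
estimates = {"human_1": 40, "human_2": 55, "human_3": 48,
             "ai_model_a": 52, "ai_model_b": 45}

for _ in range(10):
    estimates = swarm_round(estimates)

spread = max(estimates.values()) - min(estimates.values())
consensus = sum(estimates.values()) / len(estimates)
print(f"collective estimate: {consensus:.1f}, remaining disagreement: {spread:.2f}")
```

Even in this toy version, the ethical question from the talks is visible: once the estimates converge, it is no longer obvious which part of the answer came from a human and which from a machine.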

The debate between these two models is less about their validity or likelihood of coming to exist and more about their inherent ethics. The first model struggles to keep bias out of its algorithms and datasets. The newer, collective-knowledge models could solve this and create greater empathy because they are inclusive by design; however, they can seem more daunting from a participation perspective. The burden of ensuring that humans are ready to contribute to these new systems will ultimately fall on designers. Unfortunately, this is where the conversation typically stopped.

As exciting as it is to imagine the future that might be, the reality of achieving that future will be just as daunting. An entire mindset change will need to happen before people can truly realize the value of a system that prizes transparency and access for all over competition and hierarchical rewards. While only a few of the speakers took these challenges head-on, all were confident that empathy would sit at the heart of not only the question but also the answer. Coming out of SXSW, the key takeaway for this future vision was that there is real work to be done, and it will require aggressive collaboration among us all to move things forward.

Human digital immersion in humane contexts

The other big topic at SXSW this year was a class of new immersive experiences, powered by AI, that can change people’s perceptions of themselves and the world around them. For participants, these experiences feel so real that the line between where the immersion ends and reality “picks back up” can become unclear. Participants don’t carry a simulated experience with them when they exit immersion; they carry a very real and tangible experience that is theirs and will always be theirs. While this degree of immersion creates incredible opportunities to help people understand scenarios they would otherwise have no exposure to, very real concerns come along with this exciting new capability.

As with the discussion of collective intelligence, if we are not careful how we create these experiences, we risk losing our sense of self in the mix. In this scenario, however, the human-machine relationship has the ability to alter the perception of what is real, what is not, and whether the distinction even matters. These experiences also leverage data to make them feel authentic, and that data can carry the biases of the past it is based on. The bias becomes part of the “new” reality, informing the participant’s understanding of that reality and becoming part of his or her personal truth.

Almost entirely absent from the discussion was the need for AR/VR experiences to have a clearly articulated intent and desired outcome. If participants are unclear as to whether they have left the experience, how are they to know what its purpose was? The value of the experience for participants is its authenticity; “tricking” the mind into believing the experience is real is what changes them so profoundly. However, this opens the possibility of manipulation, real or perceived, which can lead to a lack of trust in what was experienced and a loss of clarity about next steps. Instead of positive actions and behaviors, emotional trauma can result from the experience.

While the idea of experiential transparency was sometimes hinted at, it was rarely clear whether anyone had even a hypothesis about what it might look like or how it could be applied without eroding the impact and value of the experience. In addition to on-ramps, these experiences need clearly marked, smoothly designed off-ramps that transition us back to reality, off-ramps that ensure we know where we are presently, where we have been (virtually as well as in reality), and where we should go next. One thing was very clear leaving these sessions: this new crop of digital experiences needs to be designed with limits that prevent us from losing our sense of self and our experiential history.

In the end, perhaps the most startling thing to say about the AI conversation at SXSW this year is that there really was not anything new to say. Whether this was for the benefit of the audience, because speakers felt there was still not enough foundational understanding to move the conversation forward, or because not enough progress had been made overall, was unclear. But the omission of actionable discussion was felt.


Sessions and resources

  1. Designing with Bias. (SXSW March 9, 2018). Erin Muntzert, Robert Murdock, Pam Scheideler, Ted Selker
  2. Evolving Responsive with Spatial Design. (SXSW March 9, 2018). Trista Yard
  3. Dianagster. (March 9, 2018). On SXSW 2018 talk, “Choice Architects: Design for humanity”. Retrieved from https://twitter.com/search?f=tweets&q=%40dianagster&src=typd.
  4. ENCORE: Scaling Design Systems: Pixels to People. (SXSW March 9, 2018). Salih Abdul-Karim, Tim Allen, Hayley Hughes, Jane Makich
  5. Why Ethereum is Going to Change the World. (SXSW March 9, 2018). Joseph Lubin
  6. Beyond Design Process: Deciphering the Intangibles. (SXSW March 9, 2018). Carissa Carter
  7. Design in Tech Report 2018. (SXSW March 10, 2018). John Maeda
  8. Business on the Blockchain. (SXSW March 10, 2018). Amber Baldet, Brian Behlendorf
  9. SXSW Interactive Keynote Melinda Gates. (SXSW March 11, 2018). Melinda Gates
  10. AI: Ready to Disrupt Experience Design. (SXSW March 12, 2018). Yann Caloghiris
  11. Changing Minds: Behavioral Science for Designers. (SXSW March 12, 2018). Steph Habif, David Ngo, Matt Wallaert
  12. Crafting Conversation: Design in the Age of AI. (SXSW March 13, 2018). Daniel Padgett
  13. SXSW Convergence Keynote Nonny de la Peña. (SXSW March 13, 2018). Nonny de la Peña
  14. The Niche Guys. (March 12, 2018). Eddy Cue’s SXSW 2018 discussion on emerging tech trends. Retrieved from https://twitter.com/thenicheguys.
  15. When AI is Not Your Assistant: Meet Agentive Tech. (SXSW March 13, 2018). Christopher Noessel
  16. SXSW Convergence Keynote Whurley. (SXSW March 13, 2018). Will Hurley
  17. SXSW Interview with the Director of Engineering at Google, Ray Kurzweil. (SXSW March 13, 2018). Ray Kurzweil
  18. SXSW Interview with the Director of the Digital Currency Initiative at the MIT Media Lab, Neha Narula. (SXSW March 14, 2018). Neha Narula
  19. What AI Reveals about our place in the Universe. (SXSW March 2018). Louis Rosenberg, David Eagleman, Shawn Carroll, Nikos Acuna. Retrieved from https://www.youtube.com/watch?v=lovOYBOfrY4
  20. Regulating AI: How to Control the Unexplainable. (SXSW March 2018). Andrew Burt