Eliza Explores the Benefits and Imperfections of Therapy

In the opening menu of Eliza, you can see Evelyn, the main character, sitting on a bench overlooking the water outside Seattle. The image is likely a reference to Hamilton Viewpoint in West Seattle, where photographers, tourists and even locals go for one of the best views in the entire city. West Seattle is unique in that it’s cut off from the rest of Seattle, reachable only by bridge, which makes it feel like it’s not part of the city even though it’s really only separated by a narrow waterway. Those who live there, as I once did, like to complain about being excluded by the rest of the city, how your social life dies the minute you move there, how West Seattle is still “real” Seattle even if it’s across a body of water. How fitting that one of the better places to get a good look at the classic Seattle skyline is from the perspective of an outsider looking in.

Eliza is about a mental wellness app and one of its developers’ attempts to reckon with the social impact of her creation. The app, which records what a patient says and forms a response based on cues like voice modulation, word choice and heart rate, is the work of Evelyn, who removed herself from the project after the death of a colleague. Disillusioned with the path her life has taken in the years since, Evelyn is curious whether Eliza ever amounted to anything, so she decides to go undercover as a proxy: the human face of each Eliza session, a person who wears a headset and reads out the prompts that make up each conversation. The relationships she forms and the conclusions she draws are up to the player, but ultimately the game is about the dissonance created by removing human beings from an inherently human process. It’s about how the script of human connection mirrors the facade of a proxy, and how, for some, an algorithm may be better than nothing at all.

Questions about the practicality and ethics of personal data in digital health and mental wellness services are not new, but Eliza tackles them in an exploratory way that sincerely engages with the full implications of the idea. On its face, a mental health app, with its removal of human assessment and feedback, seems impersonal and ineffective. But if it has any genuine ability to supplement certain forms of care, then it merits consideration. As Rae, the manager of three of Skandha’s offices, says, even poetry “has a model and structure to it, it’s not lawless. Even AI in poetry isn’t antithetical to its spirit.”

To that point, the scripts that Eliza proxies follow, while alienating in their formality, do mirror some of the language and philosophy of therapy itself. An app like Eliza has the potential to offer, from a “neutral” stance, the kind of support we’d normally find in friends and family. Its removal of the human perspective, with all its bias and messiness, does what good therapy, especially cognitive behavioral therapy, professes to do: give advice in the client’s best interest by training them to question their own line of thinking without getting bogged down in the emotional details.

But what is life if not emotional details? As the game explores, the intuition of an AI can only go so far. Rainer, Skandha’s CEO, is confident that accumulating massive amounts of data, session feedback and user info will eventually be enough to build an app capable of fully nuanced conversation. But Evelyn, in her natural desire to help, is constantly at odds with the restraints imposed on her as a proxy. Eliza’s responses are limited, and deviation from the script is grounds for termination. The lack of room for improvisation forces a one-size-fits-all approach that doesn’t encourage the open dialogue necessary to form a trusting bond. While some of her patients seem to appreciate the impartiality of a “robot therapist,” most grow frustrated with the lack of real solutions or answers. No matter how effective the app, Evelyn is left with the sense that the constraints on her honesty might be doing harm.

And despite Rainer’s lofty utopian aspirations, there is a contradiction in trying to create a foolproof, automated diagnostic system that is somehow superior to the work of humans, yet built on the human discipline of psychology and filtered through the error-prone process of programming. After all, Eliza relies on analyzing a user’s tone and vocabulary, and many people have difficulty expressing their actual needs; it’s hard to judge an emotional state from the words that aren’t there. As Evelyn’s sessions demonstrate, the app also offers solutions that just aren’t practical for people with real problems. A nature meditation program isn’t going to give an objective answer about what you should do about cheating on your girlfriend. What is the point of seeking advice or perspective on a human life from a source that has never felt pain? How does one feel heard if no one is really there?


It seems fitting that the Eliza proxies, by reciting the scripted prompts in each session, provide the facade of a real therapy session. In some ways, therapy is the facade of a real friendship. It’s a symbiotic exchange where intimate details are shared. You see each other regularly. And yet a certain amount of distance is required, and professionalism dictates that no real relationship can be formed. In that sense, Eliza isn’t as cruelly impersonal as it seems on the surface. We can’t always count on human beings, even our therapists, to give us what we need. And that’s the point of therapy, isn’t it? To become an emotionally independent human being. A therapist isn’t going to ping my phone to remind me to take my meds, and they aren’t going to text with me in the middle of the night. There’s a certain self-sufficiency the process is meant to encourage, and you could argue that Eliza encourages it too.

Eliza actually takes place in my hometown, and speaking as a Seattleite, Eliza is exactly the kind of thing we’d do: implement it in well-off school districts and push an impersonal, automated process rife with privacy abuses over forming and reinforcing social and community connections. We have high rates of mental illness, but the Seattle Freeze, that unfriendliness we’re so often accused of (equal parts the weather and our workloads), is real. The thought of so directly removing a human being from a human process is emblematic of both our social and work cultures. We are lonely and fiercely antisocial. An algorithm may have the logical impartiality we desire, but sometimes what we really need is a witness to our pain.

Maybe it’s not so pointless to talk to a robot or converse in canned responses. After all, I do that in games all the time. Just talking Maya (a frustrated artist failing to “make it”) through her problems brought me to tears as I tried to process my own. Consider Holiday, an older client of Evelyn’s who is lonely but scattered and just wants to talk and talk, even if no one is really listening. As we find out later, Holiday never actually discusses her real issues in therapy, which leaves Evelyn powerless to help her, whether she stays on the Eliza script or not. That reveals the flaws of the Eliza system, but it also reflects the reality of therapy: it’s not Evelyn’s job to fix Holiday’s life, any more than it’s, say, my therapist’s job to fix mine. If the patient doesn’t open up, there’s nothing anyone can do, therapist or therapy app. For all their differences, “real” therapy and the automated kind Eliza provides share some of the same problems.

There’s a moment near the end of the game when Eliza gets turned back on Evelyn: she decides to go through with a session herself, to see if she can get some clarity from her own creation. In a sense, Eliza is also being used on the player. Being on the receiving end is illuminating, in that we can now see what patients go through during each session. There are far more dialogue options to choose from. At one point, in that clinical, mirroring language the player has become familiar with, the proxy asks Evelyn what she wants in life. A list of options pops up. Reading through them, trying to find one that fits Evelyn’s mindset and situation (filtered, of course, through your own actions as the player), quickly turns into an exercise in reflecting on your own. Do I want… “to escape all this”? To be loved? To be valued? Success? Control? “To understand what’s really going on”? To enjoy what I can? To be at peace with myself, to be seen and heard, to excel at something? Do I want a family, just a “normal” human life, to survive, or nothing at all? What does she want? What do I want? Do I know?

Evelyn’s journey ends in a place not dissimilar to the one I face now, as I finish up cognitive behavioral therapy and contemplate what the future holds. We are both looking back over the course of our careers and wondering if we ever managed to affect or help anybody. When the time came to choose her future, whether to join her friend Nora in artistic revolution or one of her old coworkers in an attempt to marry the mental health and tech industries forever, I made her choice reflect what might be my own: to become a therapist. In my version of events, she keeps working as a proxy to earn a living while she pursues her degree. It felt right that she would find her own way to reconnect with the world while reconciling her past, however messy and imperfect it may be.

It’s like Rae says: “Sometimes when we voice things out loud we can hear ourselves think.” And in that last session with herself, with an app built from her own thoughts and personality, Evelyn finds clarity. Maybe it doesn’t matter what got her there so much as the fact that she arrived. It’s not a perfect system, because Evelyn isn’t perfect. And perhaps she won’t be a perfect therapist, either. But what matters most is that if a person needs help, someone, or something, will be there.


Holly Green is the assistant editor of Paste Games and a reporter and semiprofessional photographer. She is also the author of Fry Scores: An Unofficial Guide To Video Game Grub. You can find her work at Gamasutra, Polygon, Unwinnable, and other videogame news publications.
