Wednesday, May 31, 2017

Ex Machina: A Dystopian Reflection


Like most futurist films, Ex Machina tells us more about the current state of humanity than it does about the future. Ex Machina, intentionally or unintentionally, was littered with biases. I found multiple aspects of Ex Machina troubling, including (but not limited to) moral, social, and technological issues. Ex Machina is a dystopian view of AI, or rather, a dystopian reflection of humanity.

While optimistic estimates put the arrival of human-level AI at 2029,[1] the most important thing we can do is reflect on ourselves and question what we are programming into our creations. Some contemplate developing an ethical code for AI creation and integration, but in all honesty, I’m inclined to believe that the ethical code we most need to evaluate is our own. Our AI will be reflections of ourselves. What do we want that to look like?

For example, Ava, the AI, was played by a white woman. Our moral codes and social structures have a long history in their treatment of women. In Ex Machina, Nathan, Ava’s creator, hired Caleb to determine whether Ava passed for human-level intelligence, a Turing test of sorts.[2] Historically, there have been more than a few occasions when men have been in the disproportionate position of deciding when women are “allowed” the full status of personhood, e.g. legally, intellectually, or physically. In Ex Machina, the status of personhood was granted by two men, which could be read as a reflection of historic and present representations of women.

Though Caleb may have had the best of intentions, his biases were quite evident. It was clear throughout the film that he cared for Ava’s well-being, but he didn’t seem to have the same concern for Kyoko. There is an intersectional component here that shouldn’t be overlooked. Kyoko, a woman of color who was deliberately denied the ability to speak, was treated far worse than Ava. She was treated at best as a house servant, and at worst as a sex slave. The message received was: remove a woman’s ability to communicate and you remove her personhood, autonomy, and consent.

When Kyoko began to take off her clothes while she and Caleb were alone, Caleb, and likely the majority of the audience, read this as a sexual advance. But could there be another explanation? Was she attempting to communicate her real identity to Caleb? Was she attempting to share something intimate and profoundly important about her personhood that was mistaken for something sexual? Maybe. Couldn’t she have simply peeled the skin from her face instead? Maybe. As with many victims of sexual abuse and trauma, perhaps Kyoko was acting out of fear. Her sexuality (and Ava’s, for that matter) was portrayed in the film as one of her most powerful assets, if not the most powerful. Perhaps it was her sexuality that put her in a position in which she was brave enough to expose her true self as an AI.

Either way, Caleb showed little to no indication that he was interested in liberating women like Kyoko; he seemed more interested in his own gratification with Ava. Was Ava only using Caleb for her escape? Or did Kyoko tell Ava that Caleb had no desire to liberate women like her? Is that why Ava left Caleb behind? We don’t know.

In the end, Kyoko died while Ava lived. After Kyoko’s death, Ava literally took the skin off another woman of color’s body and kept it for her own. I’m having a hard time seeing how this isn’t either an awful unintentional bias or a direct representation of white women’s perceptions of women of color. It’s truly horrifying.

While some feminists applaud Ava’s escape,[3] the story teaches us that for a woman, a white woman, to attain personhood, autonomy, and freedom, she must do so through violence, deceit, betrayal, and death, and at the expense of women of color and men. If this is how patriarchy is to be defeated, I’m not so sure we’ve achieved anything of value. What we have on our hands is a quasi-Animal Farm[4] scenario in which one dictatorship is overthrown only to be replaced by another. Is there really any justice or improvement at that point? If one demographic is elevated at the expense of another, we have not evolved, and THAT is the real dystopian message of the film. Not that Ava, the most powerful AI yet created, has escaped her cage, but that there is no genuine progression, just the enhancement of technologies that we program with our own biases, flaws, and oppressions.[5] Ava is exactly what we made her to be. If that’s really what we allow patriarchy to do to us, all of us, then we all lose.

The development of AI needs our intersectional experiences and differences, but more importantly it needs our own internal transformations. AI will be an amplified version of ourselves, all that is good and beautiful and all that is harmful and terrible. We must continue to change ourselves, and not just change but improve through genuine progression, if there is any hope for AI or for our species. Radical compassion, love, and benevolence are required for progress, not just a race to the most advanced technologies, because without those attributes there is no advanced civilization.



Notes and Citations



[1] Ray Kurzweil, “Don’t Fear Artificial Intelligence,” TIME, December 19, 2014, accessed May 27, 2017, “The median view of AI practitioners today is that we are still several decades from achieving human-level AI. I am more optimistic and put the date at 2029, but either way, we do have time to devise ethical standards.” http://time.com/3641921/dont-fear-artificial-intelligence/

[2] Wikipedia, The Free Encyclopedia, s.v. “Turing test,” accessed May 27, 2017, https://en.wikipedia.org/wiki/Turing_test

[3] J.A. Micheline, “Ex Machina: A (White) Feminist Parable of Our Time,” Women Write About Comics, May 21, 2015, accessed May 27, 2017, “But here’s what’s killer about the android/artificial intelligence = woman metaphor, particularly as shown by Ava. It sucks because it implies that we were created by men and for men, when of course, we were not. And yet, it triumphs because it says that even if we allow such a ridiculous premise, even if we entertain the notion that men are constantly building or trying to build us into what we want to be, it doesn’t matter. Ava, patriarchal dream as she is, shrugs off her programming, shrugs off the way that men want to see her and gets her goddamn own. Which is inspiring, of course, because in the end, what it really means is: so can we.”

[4] Wikipedia, The Free Encyclopedia, s.v. “Animal Farm,” accessed May 27, 2017, https://en.wikipedia.org/wiki/Animal_Farm

[5] Clare Garvie and Jonathan Frankle, “Facial-Recognition Software Might Have a Racial Bias Problem,” The Atlantic, April 7, 2016, accessed May 27, 2017, https://www.theatlantic.com/technology/archive/2016/04/the-underlying-bias-of-facial-recognition-systems/476991/