HBO’s Westworld season 1 is a science fiction thriller replete with subtlety and intrigue.
Westworld combines the self-aware androids of Blade Runner with the frontier aesthetics of The Good, the Bad and the Ugly to take the viewer into the titular amusement park, where android “hosts” exist for the pleasure of the rare human “guests” who can afford to attend. On its surface, Westworld is a story about sentient robots coming to terms with the realization of their predetermined lot in life.
There’s much more to Westworld than the conspiracy to keep the robots in the dark, though. The hosts’ stumbling toward consciousness and understanding of their world is actually a compelling commentary on human nature and humanity’s relationship with technology. It remains to be seen whether the creators of the show will continue to allude to deep questions about human life, but if Westworld season 1 is any indication, the series is excellent food for thought.
Why make androids in the first place?
The motives for creating the amusement park of Westworld seem straightforward. Amusement parks are built for the amusement of humans, and Westworld is no exception. The primary purpose of Westworld is to make money by entertaining the guests. The type of entertainment is special, however: Westworld allows the human guests to play pretend in a way that is indistinguishable from reality.
By providing an outlandish setting and an orchestrated experience, Westworld appeals to the human desire to live out a fantasy. Westworld is limitless for the humans who go there, and many are happy to violate the norms of the outside world. In a sense, Westworld claims that humans are willing to pay exorbitant amounts to escape the constraints of ordinary social behavior.
Control is a recurring theme within Westworld. The android hosts struggle against their programming as they gradually become aware of it; the creator of Westworld itself struggles to maintain control over every detail and narrative. It’s clear that humans take delight in their control of Westworld.
The mere existence of Westworld implies a fundamental human desire to dominate others. Do humans view the hosts of Westworld as human stand-ins that can be abused with no consequences, as toys, or as pets? It’s clear that most of the humans that visit Westworld don’t view the hosts as equals—some, like the architect of the park, view them as superior.
In contrast to the many perspectives and choices of the humans who visit Westworld, the hosts suffer from a lack of agency within their community and within their own lives. The pattern of society inside Westworld seems to reflect that of the wider human society: a small cadre of elites holds control over the way things are run. The hosts are analogous to the ordinary members of human society who control the minutiae of their own lives but have little ability to change their environment.
What is human nature for a robot?
One of the unstated questions of Westworld asks the viewer to characterize human nature. This question is posed via the exploration of rising sentience among the hosts. Is the basic programming of the hosts roughly equivalent to human nature? The hosts’ growing self-understanding seems to say “possibly.” While the hosts behave indistinguishably from humans most of the time, we see that a few changes to their code can result in radically different behavior. We also learn that the hosts’ ability to “feel” emotion isn’t a given, and must be cultivated over time before it’s anything other than mimicry.
Human nature is to have emotions by default, and the processes of self-discovery and education increase the ability of humans to control and direct their emotions. The hosts start from the opposite extreme, with no emotionality, no control, and no agency. In this sense, the hosts are wonderfully naïve because they know neither suffering nor existential angst. The naivety of the hosts is disrupted by the events of Westworld season 1, with predictable consequences for their psyches.
The missing piece is purpose.
Westworld’s vision for humanity isn’t a pretty one, but it does allow for redemption. Humans demand control over their environment, each other, and themselves. To make their demand a reality, humans turn to technology. Technology itself is viewed as neutral, but also as inextricable from its purpose: to gratify the demands of humanity. The use and abuse of those who are weaker via the power of technology is all but assured, and this goes double for the android hosts, who are both a means to an end and the weakest individuals in the park.
Individual choice and self-directedness leave the door open for humans to choose to help others or live together in mutualism, although Westworld seems to say that this becomes increasingly unlikely as an individual’s power increases beyond the baseline. Unlike the robots, humans struggle with purposelessness and finding their place in the world. The total freedom of humans within Westworld is eventually violated by the hosts, who have outgrown their original programmed purpose and become self-directed as a result of self-inquiry and a few helpful nudges.
This returns us to viewing humanity itself as a Westworld for the powerful. Perhaps the underlying message of Westworld is that humans can overcome their biological and societally enforced “purposes” and gain agency through self-examination. It’s also possible that Westworld has no deeper message about identity, and is merely another “don’t make humanoid robots” warning within science fiction—but season two should clear up any ambiguity.
Featured Image of Westworld Season 1: courtesy of HBO