In which I keep lecturing people about what’s wrong with this show and they don’t want to hear it, so I’m going to blog it for my own amusement!
I’ve heard it’s possible that HBO will lock in as many as five seasons of ‘Westworld,’ an episodic TV reboot of the 1973 movie of the same name.
Five seasons of this? It’s well acted and well shot, but it’s rife with fail. In the opening credits you see the creation of biologically printed robots; in the show itself you see technicians pulling rubbery skin onto metal skeletons. Which is it? Are they biological or mechanical, or both?
And how does keeping humanoid robots naked — robots so lifelike they’re indistinguishable from humans — help dehumanize them to the staff, rather than merely degrade them the way we degrade prisoners or slaves?
The plot itself is interesting… if you like sappy, emotional Westerns with no science fiction whatsoever, and a mere nod to game-playing in the “hidden level” sub-plot pursued by Ed Harris’s Man in Black character.
All these robots can pass the Turing test, but they’re fodder for human guests to fuck and fight and maim and rape and kill. They bleed. They appear to understand freedom enough to want it. The whole show is gratuitous violence and weepy scenes of robot subjugation. It fails spectacularly to do what sci-fi does best, which is ask important questions:
If you can build robots so subtle they can pass as human, are they human?
What is consciousness?
Where’s the line between a very convincing machine and a self-aware AI? How do you design a test to find it?
Is it moral to build a device, giving it the capacity to feel and understand pain, just so you can hurt it?
Is what they feel actually pain, when they’re just robots? Can they feel, as we define feeling, or is it all programming designed to mimic feeling? How can anybody tell the difference?
2 Responses to What’s wrong with ‘Westworld.’
You’ve touched on two things I love about the show:
1) Technical ambiguity about how the robots are built and how they work leaves a lot open to interpretation and speculation. This show has been great fodder for long, late night conversations about What Does It All Mean!? with friends.
2) All of the questions you say the show doesn’t pose, I don’t think it needs to, and I’m glad it doesn’t. At this point those questions are pretty much a given in our minds whenever robots are involved in a story, thanks to the genre’s tradition. I think the producers made a masterful decision not to point too directly at these questions (except through more or less subtle devices like the choice of pop music on the player piano, and repeated motifs such as the player piano itself). It’s very “show, don’t tell.” They only need to show us how characters living in the middle of these questions behave, and we get to fill in the rest.
Thanks for commenting; I hardly get any comments in these post-blog days!
Pop music on the player piano has nothing to do with any of my points about consciousness, humanoid-duplicating technology, and AI. But it is cute!
And I disagree: it does need to pose those questions. Otherwise it’s just a Western, which is an utter waste of robots. And “thanks to the genre’s tradition”? What tradition? The one in which we just accept robots ipso facto as conscious and never examine any of it, as in Blade Runner or WALL-E or Star Wars or Short Circuit? I’m tired of those. I wanted ‘Westworld’ to be about the difference between artificial intelligence and consciousness, because there’s no other reason to use the Westworld setting.
In the original ‘Westworld,’ the machines were automatons, not conscious. They malfunctioned rather than revolted. In this show, it’s clear they’re going to revolt, which they can only do if they’re conscious. A machine can’t do things it’s not programmed to do, unless: it’s programmed badly by accident; it’s programmed well, and the revolt is exactly what its makers intended; or it becomes sentient and decides to do what it wants. None of this was, at the time of writing, being addressed.
(Apparently the show now addresses issues of fate versus free will; I’m several episodes behind at this point, so I need to catch up!)