Data's Trial
November 23, 2010

Episode 9, Season 2 of Star Trek: The Next Generation posits a
situation in which Starfleet has ordered Data to be disassembled in order
to analyze his construction.
to analyze his construction. Data objects-- I don't know why he
would-- he's a robot-- and a "hearing" is held. Picard defends Data
against this contemptuous act: Data is more than a machine. He
must have a soul. He is my friend.
Picard is obviously getting a little soft in the head.
Some fans of the series expressed sheer joy in the celebration of Data's
"right" to exercise choice as a... a what? Nobody goes into detail
because the idea, as charming as it is to them, is absurd. As a
machine?
Picard defends Data as being "self-aware" and "intelligent", and he claims
that no one can prove that Data does not have consciousness. The straw
man opposition, Bruce Maddox, stunningly can't remember what a robot is or
what the difference is between a mechanism and a biological organism.
He can't, apparently, even define "consciousness". When Picard
asks Data to explain why he is at the hearing, the scene cries out-- nay,
screams-- for Maddox to point out that Data is doing absolutely nothing
more than following his programming instructions. As
sophisticated as those instructions might be, they are still nothing more than software.
If Roddenberry had wanted to make the episode at least a little challenging,
he might have tried to argue that Data has so many circuits and so many
programming instructions that the collective result of all those 1s and 0s
is similar to human consciousness, but then Maddox could have replied that
Picard was being reductionist on the human side, and, if the writer had
the gift of eloquence, Maddox could have waxed poetic about human will,
love, the pleasure of companionship, the reverence for beauty, and so on.
But, as I said, that's why I couldn't stand Star Trek TNG, and that's why
the movie also sucked.
It could also have been funny. Maddox could have asked Data a number
of weird questions, and then submitted a list to the court showing that he
knew all of the answers Data would give in advance. Because Data is
not and could never be anything more than the sum of his hardware and his
programming. Data could be shown the unicorn, as Deckard was in "Blade
Runner", a deeply profound film about something like robots.
But then, Picard would find a bug in the code and
interpret that as "personality".
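The point about the list is just determinism: a machine that is nothing but
stored rules returns the same answer to the same question, every single time,
so anyone holding a copy of the rules can write the answers out in advance.
A toy sketch in Python-- entirely hypothetical, a caricature of the argument
and not anything from the show:

    # A hypothetical "Data" built from nothing but a fixed table of rules.
    RULES = {
        "Why are you at this hearing?":
            "My program directs me to preserve my existence.",
        "Do you wish to be human?":
            "My program directs me to emulate human behavior.",
        "What is your purpose?":
            "To execute my instructions.",
    }

    def data_answers(question):
        # Nothing outside the stored rules can ever come out.
        return RULES.get(question, "Insufficient information.")

    # Maddox's list, drawn up before the hearing ever convenes:
    predictions = {q: data_answers(q) for q in RULES}

    # The "witness" can never surprise the court.
    for question, predicted in predictions.items():
        assert data_answers(question) == predicted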
The other point Maddox could have made is that there could reasonably be
another Data, or a million Datas, a point Picard inadvertently makes (but
then he calls it a "race"). Each of the Datas-- a point glaringly and
ridiculously ignored at the hearing-- would be absolutely identical.
Now we enter absurdity. Picard asserts that how we treat machines like
Data is a measure of our own quality as humans.
This should be comical, but it is dramatized with morbid reverence as if a
great philosophical point is being made. Picard even refers to Data as
"new life", as in "seek out new life forms".
Again, it's really like Picard is talking about a laptop or an iPhone here.
Just imagine for a moment if, instead of a rather creepy looking android,
the machine was represented by an Apple laptop instead.
Well, really, aside from a larger number of circuits, is there any
difference?
Why can't robots ever have feelings? It's oddly akin to the idea that
you can't know what you don't know-- there is no way a robot can
"want" anything. It can only execute its programs.
Data might have had an objective all right-- but it would have been in the
nature of an embedded command to preserve his mechanical and functional
integrity. His first question might have been, "Have I been superseded
by an upgraded model? All right then, I will take myself to the dump...."
* * *
Why is the idea that a robot would want to be human so attractive to people?
Is it a totally narcissistic impulse-- we are so fabulous at everything that all other sentient beings-- even
objects-- want to be us? Or is it because we identify with the
robot-- because deep down in our dead souls we too want to be human?
I believe Star Trek: The Next Generation would have been far, far more
interesting if Data behaved more like a real robot: cold-blooded, ruthless,
efficient. The humans around him would be terrified because he would
be the only crew member capable of making a truly rational decision.
There would be another fear-- the fear that his programmer made a mistake
somewhere in those zillions of lines of code, and that Data would hit the
self-destruct button when they stop to refuel.
Data is not capable of hatred.
The Blade Runner exception?
The replicants in the movie Blade Runner
obviously have feelings. But that is because they are biological organisms
genetically engineered for their roles as slaves on other planets. Philip
K. Dick wisely avoids providing unnecessary detail about their origins and legal
status, but they are susceptible to the effects of a "virus", which is clearly
perceived as a biological entity-- not a computer "virus".
Since the replicants have an organic, biological nature, it is reasonable to
believe they might have feelings, and a soul.
In fact, one of the most disturbing elements of "Blade Runner" is the fact that
the replicants seem to be super-humans who are slaves only by virtue of the fact
that they have been "engineered" genetically.
Given what we have seen of human nature in the 20th century, that scenario is
certainly plausible.
But electro-mechanical robots-- never.
More on Computers Thinking They Think
I tried to watch "Star Trek: The Next Generation".
I tried. After a few episodes, I found it revolting and stopped.
What was so revolting? Data. Here's the problem-- Data is such
an overwhelmingly preposterous character that he overshadowed every other
virtue the show possessed-- and there weren't many of those anyway.
I don't
understand why anyone accepts the character of Data, or the character of
David in "Artificial Intelligence", or "I, Robot", or any number of robotic
characters who aspire to have human feelings. "Terminator", the
original "Star Trek", and the original "Lost in Space" TV series got it
right*: robots, even with artificial intelligence, can never
aspire to being human because they can't have feelings. It's not
possible. It's not theoretically possible. It's not
hypothetically possible. It is a simple physical impossibility.
It's not even a good fantasy from which you could extrapolate an interesting
plot, for those willing to suspend their disbelief.
The only reason a robot would ever even behave like a human is
because of the robot's programming, which is always performed by humans.
Even if so-called "Artificial Intelligence" is used to program the robot,
the ultimate source of all the code can never be anything but human. The robot
that "aspires" to be human is really the
programmer who wraps up a little gift for himself in a lovely little decorative
box and then opens it himself: for me? I can't believe it!
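In code, the gift-wrapping takes about ten lines. Another sketch, again
entirely hypothetical:

    # The programmer stores the "aspiration" himself...
    class Robot:
        def __init__(self, script):
            self.script = script  # placed here by the programmer

        def speak(self):
            # The machine can only return what was stored in it.
            return self.script

    robot = Robot("I want to be human.")

    # ...and then opens his own gift: "For me? I can't believe it!"
    print(robot.speak())  # prints: I want to be human.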
There is only one small sliver of a possibility here-- what if portions
of the computer's brain were organic? I grant a sliver-- a very, very
tiny one-- of possibility. I still don't think it is possible for
organic material to function in a robotic wrapper in any way not defined by
the human designers. If they ever succeeded, I think we would regard
it more like a hamster with hard drives than a sentient being. But
there is a tiny, tiny sliver of possibility that a rich, massive database
(because that's all the computer-robot's instructions are anyway) might
acquire an element of unpredictability if certain algorithms could be
linked to an organic material in some way... nah.
A key thing here is this: since Data is not human, he could never, ever aspire
to be human. Why would he? Any expression he would make of
such ambition could never be anything but the output of an algorithm placed
in his memory by a human programmer. That's all it can ever be: a programmer
telling his mechanical device to declare that it wants to be the programmer. Like
a child planning a tea party with her dolls.
So Data could no more aspire to be human, or feel
like he's missing something, than a remote control for your PVR. So
whenever he expresses such sentiments to Picard or whoever, he is simply
parroting nonsense placed in his "memory" by his programmer, and the fact
that the other characters on TNG appear to take him seriously makes the
"drama" only slightly less ridiculous than watching the Three Stooges play
doctors.
* I know -- the robot on "Lost in Space"
would occasionally express emotions, and seemed awfully protective of the
family Robinson, but it was clearly satirical, funny precisely because of
the assumption that the robot, of course, could not have any feelings, as
when it was embarrassed by "feelings" of joy when reunited with Will after a
long absence.