Robots with Choice?
C. Stephen Evans
Imagine you are walking downtown in a major city and you come across an unpleasant protester: an aggressive white supremacist who is handing out racist propaganda. His face seems contorted with hatred as he thrusts a pamphlet toward you. You recoil from this depraved specimen of humanity and quickly go on your way, but it is hard to get his image out of your mind.
You might be even more disturbed, however, if you were to discover that this "protester" was not a human being at all but rather a convincing humanoid robot, the invention of an evil, racist genius who wishes to further his ideology without the inconvenience and risk of making a personal appearance.
If that were the case, your moral disapproval would rest on the inventor. The robot itself (himself?) would be distasteful, but you would no longer think it appropriate to regard the mechanical "person" as evil. For such a mechanism lacks free will. In an immediate causal sense, the robot is "responsible" for its behavior, but in a moral sense, the responsibility lies with the inventor/programmer. The robot can only do what its inventor designed and programmed it to do.
Now, in your imagination, change the scenario one more time. The protester is not a robot, but a real human being. However, this human being has been kidnapped by an evil group of racist genius-scientists, who have implanted tiny electronic sensors in strategic places throughout the protester's brain. This cabal of scientists can completely control the beliefs and acts of will of the protester through remote-control electronic signals. Who is now responsible for the behavior of the protester?
Suppose we assume the protester has no memory of the kidnapping and operation, and no awareness that he is being electronically controlled. (Scientists capable of such sophisticated neural control can surely manage that trick as well.) If we ask the protester if he is doing what he does freely, he answers affirmatively. And he is right about this, in at least one ...