Should Sentient Computers Have Legal Personality?

James Leaver

There is a matter that has been troubling me for some time now, and I think this issue of The Full Bench on technology and the law is a good forum in which to share it. You can probably guess what my trouble is from the title of this article, and lest you accuse me of being silly, please know I am quite earnest about the matter. It started about six months ago when I was watching District 9 and wondered: if extraterrestrials ever found themselves marooned on earth, would they have any standing before the law? Would they be recognised as having legal personality? Or would they have no standing in a court of law, unable to enforce a right, seek a remedy or claim some legal entitlement? Could you or I seek a writ of habeas corpus to protect an extraterrestrial from arbitrary imprisonment? Or would the being be treated as a mere chattel, capable of ownership and possession, like a house elf in Harry Potter? Further, could the law be just if it did not extend legal personality to extraterrestrials? ET was intelligent, felt pain and experienced something akin to human emotion. Most importantly, though, he was sentient. Sentience is the state of being endowed with feeling and consciousness. Or, in the words of Lawrence of Arabia writing of the Arab Revolt, 'the living knew themselves just sentient puppets on God's stage'.1 The possibility of such a question ever arising is negligible. But what if we one day create a sentient computer?

This was canvassed in a Star Trek episode2 in which the Starship Enterprise-D's android, Lieutenant Commander Data, brings a legal action to prevent Starfleet from disassembling him to carry out scientific research. The question for determination was Data's legal status: did he have a right to self-determination, or was he a mere chattel? Here is an excerpt of a cross-examination by Captain Picard, representing Data, of the cyberneticist wanting to disassemble Data:

Picard: Commander, is it your contention that Data is not a sentient being and therefore not entitled to all the rights reserved to all lifeforms within this Federation?
Maddox: Data is not sentient, no.
Picard: Commander, would you enlighten us as to what is required for sentience?
Maddox: Intelligence, self-awareness, consciousness.
Picard: Prove to the court that I am sentient.
Maddox: This is absurd! We all know you're sentient.
Picard: So I'm sentient, but Commander Data is not?
Maddox: That's right.
Picard: Why? Why am I sentient?
Maddox: Well, you are self-aware.
Picard: Ah, that's the second of your criteria. Let's deal with the first: intelligence. Is Commander Data intelligent?
Maddox: Yes. It has the ability to learn and understand, and to cope with new situations.
Picard: Like this hearing?
Maddox: Yes.
Picard: What about self-awareness? What does that mean? Why am I self-aware?
Maddox: Because you are conscious of your existence and actions. You are aware of yourself and your own ego.
Picard: Commander Data, what are you doing now?



1 T E Lawrence, Seven Pillars of Wisdom (1922).

2 'The Measure of a Man', Star Trek: The Next Generation, season 2, episode 9, 13 February 1989.

Data: I am taking part in a legal hearing to determine my rights and status. Am I a person or property?
Picard: And what's at stake?
Data: My right to choose; perhaps my very life.
…
Picard: Commander, you have devoted your life to the study of cybernetics in general, and Data in particular. And now you propose to dismantle him?
Maddox: So that I can learn from it and construct more.
Picard: How many more?
Maddox: As many as are needed. Hundreds, thousands if necessary. There is no limit.
Picard: … Is that becoming a race? And won't we be judged by how we treat that race? Now tell me, Commander, what is Data?
Maddox: I don't understand.
Picard: What is he?
Maddox: A machine.
Picard: Is he? Are you sure?
Maddox: Yes!
Picard: You see, he's met two of your criteria for sentience, so what if he meets a third, consciousness, in even the smallest degree? What is he then? I don't know. Do you?

Picard, apparently unaware of court procedure, did not wait for the witness to answer this question, but instead went straight on to his closing address:

Your Honour, … sooner or later this man, or others like him, will succeed in replicating Commander Data. The decision you reach here today will determine how we will regard this creation of our genius. It will reveal the kind of people we are, what he is destined to be. It will reach far beyond this courtroom and this one android. It could significantly redefine the boundaries of personal liberty and freedom: expanding them for some, savagely curtailing them for others. Are you prepared to condemn him, and all who come after him, to servitude and slavery?

The case put forward by Picard was that if a being is sentient, regardless of whether it is artificial, then it should be regarded as having the right of self-determination.

There is a problem, however, with this being the measure of whether something should have legal personality. Most would accept that non-human animals have sentience, and although there is a movement calling for such beings to be conferred with legal personhood, the call is neither overwhelming nor convincing. It is one thing to give animals certain rights and immunities under the law to protect their dignity; it is an entirely different thing to say an animal should have legal personality, as animals are not independent actors within our society and do not have a desire to be.

In his Commentaries, Blackstone makes reference to legal personhood, saying:

Persons also are divided by the law into either natural persons, or artificial. Natural persons are such as the God of nature formed us: artificial are such as created and devised by human laws for the purposes of society and government; which are called corporations or bodies politic.3

Blackstone also gives us a fuller justification for the law conferring legal personhood upon artificial entities like bodies politic and corporations, saying it is necessary for the advantage of the public to confer on some artificial persons perpetual succession, giving them what he calls a kind of legal immortality:

To shew the advantages of these incorporations, let us consider the case of a college in either of our universities, founded ad studendum et orandum,4 for the encouragement and support of religion and learning. If this was a mere voluntary assembly, the individuals which compose it might indeed read, pray, study, and perform scholastic exercises together, so long as they could agree to do so: but they could neither frame, nor receive, any laws or rules of their conduct; none at least, which would have any

3 Blackstone, Commentaries on the Laws of England (1765), 'Of the Rights of Persons', 119.
4 'For studying and praying.'

binding force, for want of a coercive power to create a sufficient obligation. Neither could they be capable of retaining any privileges or immunities …5

On this view, the conferral of legal personality on the artificial is done to vest it with permanence, so that the purposes and objects of the artificial may serve society in perpetuity in spite of man's impermanence. But this does not help us answer the question of whether a sentient computer should be given legal personality, because it is not necessary to give a computer legal permanence or protections so that it may continue to serve society. A computer carries out this function by the mere fact that it is put to use by man, for man's objects. The existence of the corporation as a distinct legal entity is a means by which certain social objects can be protected and given force in law.

Another example of the conferral of legal personality on the artificial is the Law of the Rights of Mother Earth, a recent Bolivian environmental law that confers legal personality on the earth, thereby giving the earth standing in a court of law. As I understand it, interested parties can bring an action on behalf of, and for the benefit of, Mother Earth to protect its rights. It is a legal fiction by which environmental objectives can be achieved for the benefit of society. The legal personality Mother Earth enjoys in Bolivia illustrates the benefit that the artificial (or, in this case, the natural) may gain from the conferral of legal personality; but it does not help answer whether computers should have the same benefit. It is hard to imagine a world where computers could be free actors beyond the limitations contained within their code.

Science fiction, a genre built on speculation about what artificial intelligence may mean for society, is defined by its exploration of the frontier between artificial intelligence and such limitations. In the 1940s, Isaac Asimov famously put forward the Three Laws of Robotics:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.6

If there is ever a computer that has free will, or is able to act according to want, lust or other folly, however, these laws must cease to have effect, as they are the ultimate barrier to free will. Then we must ask ourselves by what law such a computer should be governed, and by what means. Or perhaps the question is better answered by appealing to our compassion. How akin to us, by which I mean how susceptible to the follies of the human condition, must a computer be before it deserves the protection and guarantees of the rule of law? Or is it beyond us to feel a moral obligation towards something which in reality has no flesh of its own and is physically nothing more than an organised combination of silicon, wiring and other hardware, brought to 'life' by a flow of electrons through an integrated circuit? If we are able to design a sentient computer that transcends the adage 'garbage in, garbage out', and has a degree of free will, should that computer not also have free speech? Similarly, if there were ever a

5 Blackstone, Commentaries on the Laws of England (1765), 'Of the Rights of Persons', 455–6.


6 Isaac Asimov, 'Runaround' in John W Campbell Jr (ed), Astounding Science Fiction (March 1942), 94.

Terminator who could do acts not because it is programmed to do them but because it wants to do them, and that cyborg were accused of a crime, should it be subject to summary decommissioning, or does it deserve the due process of the law, which would entail an enquiry into whether the cyborg had the requisite mental element to establish guilt? These are questions I cannot answer, and Google, Bing and WolframAlpha are unable to answer them for me. I leave them with you as food for thought.

Email James Leaver: Twitter: @lexawkward
