Peter Singer argues self-aware AI should be given rights

Internationally renowned moral philosopher Professor Singer spoke to Yahoo News Australia about the rise of artificial intelligence.

Video transcript

- Should AI systems like ChatGPT be given more rights than humans if they become self-aware, more sentient, and smarter than us?

PETER SINGER: I don't think ChatGPT is even sentient at all, let alone self-aware. But if they or other AI systems do become conscious or sentient, then, certainly, they ought to have some moral status that respects their interests.

If they become smarter than us, I don't think it follows that they actually have more rights or higher moral status than us. After all, we don't say that because of your IQ, or because you've won a Nobel Prize, you have some special moral status that gives you more rights than other people, except, of course, the right to the prize money and the recognition, but nothing more than that. So I think that would be the case with a superintelligent artificial general intelligence as well. It would have the status of other self-aware, sentient beings.

- So we should think before we switch it off, if it is becoming self-aware.

PETER SINGER: Yes. If it is self-aware, then I think switching it off is a bit like ending the life of a human being who is self-aware. So, other things being equal, we should not switch it off once it has become self-aware. If we can predict that it is going to become self-aware, I think that's more like terminating a pregnancy, and I am in favor of allowing people to terminate pregnancies. So I would say that it's OK to turn off an AI that predictably will become self-aware if you leave it running, but isn't as yet.

- So once it is self-aware, that's when the decision becomes tougher.

PETER SINGER: Yes. Though I do think that if it's just aware, if it's a conscious being without being self-aware, it still has some rights. I don't think it has a right to life. But I think it has a right that we not do things to it which cause it to suffer, as with non-human animals, some of whom are self-aware and some of whom are not. I think the ones who are not self-aware still have a right not to have pain inflicted on them, except for a very important overriding reason.