The Imprinted Brain

How genes set the balance between autism and psychosis

The One-Question Turing Test

The Turing test says as much about us as it does about thinking in machines.

Can machines think? The originator of modern electronic computing asked this question in terms of the now-famous Turing test. Turing proposed it as a variant of an “imitation game” in which, using only indirect means of communication such as a teleprinter or computer terminal, you had to distinguish between a male and a female respondent who were not obliged to tell the truth about who they were. In the crucial variant that he proposed to answer the question of whether machines can think, one of the protagonists is a computer programmed to imitate a person. He argued that if, after a suitable period of interrogating both, most people were no better than chance at saying which was the machine and which was not, you could say that the machine had passed the test.

Inevitable limitations in computers’ ability to handle contextual, common-sense knowledge mean that, at present, Turing tests cannot be completely open with regard to the topic of discussion. Computer programs have done quite well with wine, politics, and religion as the subjects for conversation—presumably because these are topics about which you can talk complete nonsense and still be taken seriously! And of course, the fact that a program mimicking a psychotherapist convinced five out of ten of its clients probably says more about psychotherapy than it does about the Turing test.


But however that may be, here is a single-question Turing test that goes to the heart of the matter:

You are interrogating two respondents via a computer terminal. One is a man; the other is a computer that is programmed to make you think it is a man. You must decide which is which on the basis of the reply to a single question put to just one of them. What question must you ask to determine definitively whether you are interrogating the machine or the man?

Obviously, asking either respondent “Are you the man?” or “Are you the machine?” will not suffice, because although the man will answer truthfully, the machine will give you a false answer, and you have no way of knowing which you are addressing, because both will claim to be human.

Nevertheless, if you consider what each would tell you about what the other would say, a solution can be found. This involves imagining alternatives that in turn demand an understanding of each respondent’s knowledge of the other’s truth-telling or otherwise.

Specifically, this is known in the autism literature as an appreciation of false belief, and more generally these issues are aspects of what is often called “theory-of-mind skills,” “mind-reading,” or, in a word, mentalism. Deficits in mentalism are generally diagnostic of autism, and tests of false belief are particularly crucial. Furthermore, as this example suggests, such mentalistic skills are the key to passing the Turing test and giving computers an appearance of being able to think.

The answer is to ask either respondent, “Which of you would be indicated as the man if I asked the other to indicate him?”

Clearly, you might be questioning either the man or the machine. If you were addressing the machine, it would lie about what the man would say and so claim that it, not the man, would be indicated. But if you were addressing the man, he would give the same answer, because he would know that the machine, asked to indicate the man, would falsely indicate itself as human. You could then confidently conclude that, whichever respondent you asked, the man was the one not indicated in the answer.
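If you like to see such logic checked mechanically, the reasoning can be written out as a minimal sketch in Python—purely an illustration, assuming (as above) a man who always tells the truth and a machine that always lies; the names are mine, not part of the puzzle:

def indicates_as_the_man(respondent):
    """Whom would this respondent point to if asked to indicate the man?"""
    if respondent == "man":
        return "man"       # the man truthfully indicates himself
    return "machine"       # the machine lies and indicates itself

def reply(asked):
    """Answer to: 'Which of you would be indicated as the man
    if I asked the other to indicate him?'"""
    other = "machine" if asked == "man" else "man"
    truthful_report = indicates_as_the_man(other)
    if asked == "man":
        return truthful_report             # the man reports this honestly
    # the machine lies about what the man would say
    return "machine" if truthful_report == "man" else "man"

for asked in ("man", "machine"):
    print(asked, "answers:", reply(asked))
# Both print "machine": whichever respondent you ask, the one not indicated
# is the man.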

Or at least, you could, unless the machine was clever enough to realize that you would think this. Clearly, if the machine’s mentalistic intelligence were high enough, it might realize that lying in answer to this question would give it away, but that telling the truth would not, because the man would be wrongly assuming that the computer would lie.
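Written out in the same sketch form—again only an illustration, now assuming the machine answers this one question truthfully while the man still expects it to lie—the earlier rule of thumb breaks down:

def reply_with_clever_machine(asked):
    """The same single question, but the machine now answers it truthfully."""
    if asked == "man":
        # The man honestly reports what he believes the machine would do:
        # he assumes it would lie and indicate itself.
        return "machine"
    # The clever machine truthfully reports that the man would indicate himself.
    return "man"

def interrogator_guess(answer):
    """The earlier rule: the man is whichever respondent is NOT indicated."""
    return "machine" if answer == "man" else "man"

for asked in ("man", "machine"):
    answer = reply_with_clever_machine(asked)
    print(asked, "answers:", answer,
          "-> the interrogator takes the man to be:", interrogator_guess(answer))
# Asking the man still identifies him correctly, but asking the clever machine
# leads the interrogator to mistake the machine for the man.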

Being able to lie is yet another aspect of mentalism where autistics—perhaps to their credit—have serious failings. This is because lying exploits the same mentalistic skills that the single-question solution here requires, and this in turn suggests that, were the machine clever enough to tell the truth in this instance, there would be no single-question solution to such a Turing test.

A thinking machine might deduce that deceit was the essence of being human—even when it sometimes meant telling the truth. And the machine might be right!

Christopher Robert Badcock, Ph.D., is the author of The Imprinted Brain: How Genes Set the Balance Between Autism and Psychosis.
