Michael Chorost, Ph.D.
Neuroscience

Is the Brain Just a Giant Switching Machine?

Can consciousness be maintained by computer chips? Possibly not.

I’m posting my response to an interesting question posed by Silas Busch, a student at Bard College who attended a lecture I gave at Bard in January 2013. Mr. Busch gave me permission to post our exchange. Both emails are slightly edited for conciseness.

Dear Michael,

Do you think consciousness can be maintained in individuals who, hypothetically, have parts of their brain replaced by computer chips? I ask because I took a cognitive science course during the fall semester in which we discussed the hypothetical "Zombie Problem" proposed by Searle in 1992. The long and short of it is that he proposed that if parts of a brain were replaced by silicon computer chips (albeit ones that perfectly replicate the functions of the natural neuronal makeup), the mind (i.e., consciousness) would still eventually be lost, even if external function appeared to remain normal. Essentially the question is: is consciousness a result of morphology or of matter? Can a conscious brain be made of inorganic matter, or of organic matter different from the matter it is made of now?

Silas Busch, Bard College

Dear Silas,

I've thought a lot about that question. Certainly a great many people assume that the brain is fundamentally a giant collection of switches, and therefore it (and any part of it) could in principle be replaced with any other set of switches, regardless of their physical nature. From that perspective, the answer to your question is simple: yes.

But I think the assumption that the brain is just a giant collection of switches overlooks several important points. For one thing, it assumes that a neuron is fundamentally just a simple switching machine, one that can easily be reverse-engineered. It just counts up the number of incoming signals and decides whether to fire or not based on those signals. But there is in fact a great deal of computation going on within individual neurons. The molecular reactions in the cytoplasm and membranes are fantastically complex. I once heard a professor at the University of Pennsylvania characterize the neuron as “a supercomputer in its own right.” If that is true, then the output of a neuron, while simple in itself -- 0 or 1 -- is the outcome of a complex process about which we know very little. And if we know very little about it, it's going to be hard for us to replace it with an equivalent switch.
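To make that "giant collection of switches" picture concrete, here is a toy sketch in Python (my own illustration; the weights and threshold are invented and this is not a model of any real neuron). The "switch" view says a neuron just sums its weighted inputs and fires if the total crosses a threshold:

# Toy "neuron as switch": sum the weighted inputs, fire if they cross a threshold.
# The weights and threshold below are invented for illustration only.
def switch_neuron(inputs, weights, threshold=1.0):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Three incoming signals with different synaptic strengths:
print(switch_neuron([1, 0, 1], [0.6, 0.9, 0.5]))  # prints 1, because 0.6 + 0.5 >= 1.0

Everything I point to below -- the molecular machinery inside the cell, the neurotransmitter gradients, perhaps fields and quantum effects -- is exactly what this little function leaves out.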

You also have to consider the fact that neurons are not the only information-bearing part of the brain. Neurons are bathed in neurotransmitters like serotonin and oxytocin, and those neurotransmitters constitute a complex information system in their own right. What's more, it's fundamentally analog, based on gradients of chemical concentrations rather than on the simple absence or presence of a signal. And it's not immediately clear to me that you can fully characterize an analog system with a digital one. In other words, even if you can fully simulate neural firings, can you also fully simulate the action of the neurotransmitters? And you can't even think about doing that until you have a body, whose physical reactions to the environment trigger a great deal of the neurotransmitter releases.

It gets even more complex than that when you consider the possibility that electrical firing can be influenced by the electromagnetic field created by the brain itself. I wrote about this in a footnote in my book World Wide Mind:

While it is surely true that a neuron’s synaptic inputs are a major determinant of when and whether it fires, they may not be the only determinant. It has been suggested, for example, that the local electric field generated by neural activity may itself influence whether neurons fire. (Such fields certainly exist—they are what is detected by an EEG machine.) In that case, simply counting up a neuron’s incoming action potentials cannot yield enough information to determine whether it will fire. One also has to measure the electrical field. One then gets caught up in a vicious circle of mutually entangled action and reaction. The neural firings change the electrical field, which changes the neural firings, which change the electrical field, and so on—it never ends…A practical mind-to-mind communications technology may have to track more brain events than just neural firings. For a good discussion of the limitations of the synaptic paradigm, read Bell’s “Levels and loops: the future of artificial intelligence and neuroscience.” (p. 76)
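To see why that loop resists simple bookkeeping, here is another toy sketch (again my own illustration; every number and update rule is invented, and nothing here is biophysically realistic). Activity feeds a stand-in "field" variable, the field shifts the firing threshold, and the shifted threshold changes the activity on the next step:

# Toy feedback loop: firings raise a local "field," the field lowers the
# effective firing threshold, and the new threshold changes the firings.
# All values and update rules are invented for illustration only.
activity = 0.5        # fraction of units firing this step
field = 0.0           # stand-in for local field strength
base_threshold = 0.5

for step in range(5):
    field = 0.8 * field + 0.2 * activity          # field tracks recent activity
    threshold = base_threshold - 0.3 * field      # stronger field, lower threshold
    activity = 1.0 if activity >= threshold else 0.8 * activity
    print(f"step {step}: field={field:.2f}  threshold={threshold:.2f}  activity={activity:.2f}")

The point is only that you cannot settle the activity without knowing the field, or the field without knowing the activity; each step depends on the other.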

Most complex of all is the possibility that quantum events may be involved in neural activity. Roger Penrose made the suggestion in his book The Emperor's New Mind. Penrose suggested that neurons are small enough for quantum effects to be important at the synaptic junctions and membranes. And quantum effects, by their nature, can't be fully simulated on a macroscopic system. Penrose's argument is controversial and hasn't been accepted by most neuroscientists, but one at least has to consider it. It could be that in order to fully simulate consciousness, you would need a quantum computer as well as a conventional one. And I don't think anyone has a clue how you would actually program a quantum computer for that purpose.

While the suggestions regarding electrical fields and quantum effects are speculative and disputed, it's clear that consciousness is so complex that it may not be easily explainable in terms of simple switching mechanisms. One may have to go further afield to explain it.

So for all of those reasons, I think it's premature at best to assume that the brains we have could be fully reproduced by the informational systems we have at the moment. There's a whole lot that we don't know about neural behavior and the physical substrate of consciousness.

Thanks for asking a good question. Maybe someday you’ll take us closer to having a real answer.

Best regards,

Mike

About the Author
Michael Chorost, Ph.D.

Michael Chorost, Ph.D., is the author of World Wide Mind: The Coming Integration of Humanity, Machines, and the Internet.
