Mind and Brain

In Sunday’s New York Times Book Review, Adam Frank, an astrophysics professor at the University of Rochester, examined a new book by Michio Kaku in which, like many theorists, Kaku speculates that in this century we’ll be able to essentially replicate the human mind using technology. Frank did an admirable job of applying skepticism to some of the assumptions underlying this idea, without condescending to the speculative value of Kaku’s vision. It sounds as if Kaku believes our minds are simply extremely complex flesh-based computers, and Frank takes exception to this kind of simplification. These are issues we do need to think about, and I loved the gentle hesitation Frank brought to his concerns about Kaku’s thesis.

Many skeptics now consider human consciousness to be nothing but a physical phenomenon, an illusion conjured up by the firing of neurons in the brain, reducible to nothing other than those physical processes. Frank neither supports nor opposes such a view in his review, but he suggests that consciousness could be an emergent aspect of the world, a result of biological evolution that can’t be entirely explained by its physical moving parts; in other words, it may not be reducible to the physical processes that gave rise to it. He also suggests that consciousness might simply be one of the building blocks of the universe. This sounds as if he wouldn’t ignore the suggestion that consciousness could be something closer to an ontological principle that precedes and guides the emergence of life itself rather than an outcome of the world, a notion that people like Kaku would likely abhor.

I thought his explication of all this was concise and beautifully written, well worth quoting at length here:

I’ve spent most of my professional life running supercomputer simulations of events like the collapse of interstellar gas clouds to form new stars, and it seems to me that Kaku has taken a metaphor and mistaken it for a mechanism. There has always been the temptation to take the latest technology, like clockworks in the 17th century, and see it as a model for the mechanics of thought. But simulations are not a self, and information is not experience. Kaku acknowledges the existence of the hard problem but waves it away. “There is no such thing as the Hard Problem,” he writes.

Thus the essential mystery of our lives — the strange sense of presence to which we’re bound till death and that lies at the heart of so much poetry, art and music — is dismissed as a non-problem when it’s exactly the problem we can’t ignore. If we’re to have anything like a final theory of consciousness, we had better be attentive to the complexity of how we experience our being.

When Kaku quotes the cognitive scientist Marvin Minsky telling us that “minds are simply what brains do,” he assumes that scientific accounts of consciousness must reduce to discussions of circuitry and programming alone. But there are other options. For those pursuing ideas of “emergence,” descriptions of lower-level structures, like neurons, don’t exhaust nature’s creative potential. There’s also the more radical possibility that some rudimentary form of consciousness must be added to the list of things the world is built of, like mass or electric charge.

On the ethical front, Kaku does an admirable job of at least raising the troubling issues inherent in the technologies he describes, but there’s one critical question he misses entirely. The deployment of new technologies tends to create its own realities and values. If we treat minds like meat-computers, we may end up in a world where that’s the only aspect of their nature we perceive or value.