The arrogance of knowledge
By David Weinberger
"It's a type of arrogance to think we can understand consciousness." So said the person next to me at dinner as we discussed a presentation by Ray Kurzweil, the technology impresario who has brought us technology that matters, including world-class music synthesizers and reading machines for the blind and dyslexic. To my surprise, I found myself disagreeing.
The question wasn't whether Kurzweil himself is arrogant; he's not, although he seems to be doing OK in the self-esteem department. But that doesn't matter. My dinner companion was referring instead to the human enterprise championed by Kurzweil and many others and embraced widely in our culture. The thinking is that the brain is a marvelous computer. Therefore, once we have hardware large enough to contain all the information in a brain and once we understand the brain's software well enough, we will be able to build intelligent machines.
I'm not alone in having problems with this line of thinking. It seems to me that hardware/software is the wrong paradigm for understanding the brain and consciousness. (Yes, I am using "paradigm" in its original Kuhnian sense.) The brain is no more hardware and software than the liver is. So, running computer software on computer hardware can no more create consciousness than it can create a liver.
The simple reason is that software is symbolic, all the way down to the bit level. I have eight light switches in my house. Like bits, they are either on or off. They are not bits, but they could become bits if I chose to treat them as such. I will count "on" as a one and "off" as a zero. I will begin my 8-bit number by reading my house from the top down. Suddenly, my light switches are bits. Likewise, the transistors in a computer are only bits because we say they are. If I were instead to look at a waving field of wheat and count left as on and right as off, I might by coincidence find that it models Julius Caesar's brain state at the moment Brutus stabbed him, but it would not and could not be conscious, for the stalks are only bits because I took them as such.
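The switch-reading convention above can be sketched in a few lines of Python. The particular switch states and the function name are illustrative, not from the essay; the point the sketch makes is that the number you get depends entirely on the convention you impose, not on anything in the switches themselves:

```python
# Eight household light switches, read from the top of the house down.
# These particular states are made up for illustration.
switches = ["on", "off", "on", "on", "off", "off", "on", "off"]

def as_byte(states, one="on"):
    """Read a list of switch states as an 8-bit number, most significant bit first.

    Which state counts as 1 is purely our choice -- the switches carry
    no numeric meaning of their own.
    """
    value = 0
    for state in states:
        value = (value << 1) | (1 if state == one else 0)
    return value

print(as_byte(switches))             # with "on" as 1: 0b10110010 = 178
print(as_byte(switches, one="off"))  # opposite convention: 0b01001101 = 77
```

The same physical arrangement yields 178 under one reading and 77 under the opposite one, which is the essay's point: the bits live in the interpretation, not the hardware.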
There's a mystery to flesh itself. I don't pretend to understand it at all. I heard the MIT scientist Rodney Brooks on the radio a couple of weeks ago saying that every 10 years or so he switches projects. He's been working on robotics for 10 years. Now he finds himself interested in a different problem: What's the difference between dead flesh and living flesh? We know what their behaviors are, of course, but what causes that difference? In short, what is life? He doesn't know, and I sure don't either. But I do know that flesh and our fleshly lives so surpass our understanding that we can't even wrestle them into a question we don't giggle at.
I don't take Brooks' question as arrogant because I think he recognizes its awesome magnitude. But the attempt to render consciousness into hardware and software is different. It strikes me as an attempt to make the question answerable by diminishing its subject. Just as we sometimes confuse a business with the spreadsheet that represents it and as we sometimes over-systematize knowledge in order to manage it, the belief that someday we'll create conscious computers seems to me not arrogant but ungrateful.
David Weinberger edits "The Journal of the Hyperlinked Organization" (hyperorg.com), e-mail firstname.lastname@example.org