Brian Byrd

Problem Search

CGSC 270

In the November edition of Psychology Today, writer Michael Cross examines the possibility that a computer can smell. The computer, called Magnus, is quite able to process objects, but will it someday be able to process smells? Magnus's creator, Igor Aleksander, who works on intelligent and interactive systems, claims to be close to cracking the problem of "qualia" in artificial intelligence.

Philosophers believe that AI (artificial intelligence) engineers can build mechanisms that function like brains, able to take on "chess grand masters or [pick]... a bad apple out of a box of good ones." They even grant that it may be possible to build a Turing machine whose functions are indistinguishable from those of a conscious entity. What they don't believe is that such machines would create real human consciousness, any more than "a flight simulator will transport you across the Atlantic."

What seems to be missing is the experiential quality of sensation. This means that a computer, or any other electronic device, will never know what it is like to taste food. AI advocates counter by accepting that qualia exist, but arguing that they can be explained in some mechanistic way and thus be re-created. This is what Aleksander is trying to do. He believes that qualia come alongside a set of "special awareness brain cells," and that these evolved to enable the brain to generalize from a limited experience of learning.

This illustrates the dangers of scientists dabbling in philosophy and vice versa. [But see Dennett and Churchland. WF] All we really know is that consciousness exists [Do we know that? WF], as part of the neurochemical tool kit that evolution accumulated inside human skulls to cope with any situation. If philosophers and scientists are still arguing about the ground rules of what this entails, we are nowhere near replicating the trick. [Not necessarily: qualia may just come with the hardware: see Gazzaniga et al. Ch. 14. WF]