That’s one of the questions about consciousness discussed in Gödel, Escher, Bach. I agree with the author that the answer is “yes”, but I think this is somewhat the wrong question to ask and not a good test for consciousness.

I personally believe that the ability to recognize beauty is not something innate, but something we learn, something we are taught. If you grew up believing that when the sunset crimsons the horizon it means the source of all life demands the sacrifice of human blood, you would probably no longer view the sunset as beautiful. I also believe that not only the recognition of beauty but also the appreciation of beauty is something we are taught. Of course, here it is more difficult to define what it would mean for a computer to “appreciate” a nice .jpg file, as this would essentially require a definition of a “feeling”. But maybe a “feeling” can somehow be defined as a different operational mode, i.e., a feeling determines certain rational/computational paradigms according to which our brain operates.
A somewhat related thought: One of the best tests for true artificial intelligence that I’ve heard of is the following:
Ask a computer to explain a joke to you.
But even this might not be the ultimate test, as (i) for some jokes even humans have problems, and (ii) with some simple rules/heuristics you can probably “teach” a computer (at least conceptually) to explain simple jokes (involving blondes etc.). Recognition of irony might be more difficult. But, again, here humans also fail regularly. The good old Turing Test would probably be passed by some simple programs (along the lines of Eliza) if the other person involved in the conversation is not used to dealing with computers and/or to having non-standard conversations. [E.g., the program could use simple “escape phrases” when it is not sure what to answer, such as “I’m really not in the mood to discuss this kind of topic.” or the more Eliza-like “Why do you ask this?”]
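To make the “escape phrase” idea concrete, here is a minimal, purely illustrative sketch in Python of an Eliza-style responder. The keyword rules and canned phrases are invented examples of my own, not any real chatbot; the point is only to show how little machinery such a program needs.

```python
import random
import re

# A few hand-written keyword rules, Eliza-style: if the input matches a
# pattern, answer with one of the canned responses for that pattern.
RULES = [
    (re.compile(r"\b(mother|father|family)\b", re.I),
     ["Tell me more about your family.",
      "How do you feel about your family?"]),
    (re.compile(r"\bwhy\b", re.I),
     ["Why do you think that is?",
      "What makes you ask why?"]),
    (re.compile(r"\b(computer|machine)\b", re.I),
     ["Do computers worry you?",
      "What do you think machines can really do?"]),
]

# "Escape phrases" for when no rule matches and the program has no idea
# what to answer -- exactly the trick mentioned above.
ESCAPE_PHRASES = [
    "I'm really not in the mood to discuss this kind of topic.",
    "Why do you ask this?",
    "Let's talk about something else.",
]

def reply(user_input: str) -> str:
    """Return a canned response if a keyword rule fires, else escape."""
    for pattern, responses in RULES:
        if pattern.search(user_input):
            return random.choice(responses)
    return random.choice(ESCAPE_PHRASES)

if __name__ == "__main__":
    print(reply("Why is the sky blue?"))         # triggers the "why" rule
    print(reply("My mother called yesterday."))  # triggers the family rule
    print(reply("Quantum chromodynamics!"))      # no rule -> escape phrase
```

A conversation partner who isn’t probing for the seams could easily mistake the escape phrases for evasiveness or boredom rather than ignorance, which is exactly why such cheap tricks can carry a program surprisingly far in a casual Turing-style chat.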

Anyway, an endless topic. It is still always interesting to think about what actually defines a conscious mind, what is needed for self-awareness, and whether our kind of intelligence is really qualitatively different from that of, say, apes. (Of course, we’re smarter, but are we more than just “clever apes”? Is the fact that they can recognize their mirror image not sufficient to prove that they have a [low] level of self-awareness?)

Enough random nonsense.
