Bishop’s Dancing with Pixies?

Page last updated 07 Jul 2024

 

In the article The danger of artificial stupidity, posted at the Scientia Salon website, J. Mark Bishop claims that AI can never match, never mind surpass, human mentality, and he bases this claim on three principal assertions:

  1. Computers lack genuine understanding. The basis of this claim is given by John Searle’s “Chinese room” argument (1980). This claim is dealt with in detail at The Chinese Room.

  2. Computers lack mathematical insight. The basis of this claim is the assertion that Gödel’s incompleteness proof shows that the way mathematicians provide their “unassailable demonstrations” of the truth of certain mathematical assertions is fundamentally non-algorithmic and non-computational. The details of this claim are to be found in the book The Emperor’s New Mind by Roger Penrose. But if proofs of incompleteness that rely on Gödel’s methodology can be shown to be based on illogical premises, Penrose’s argument has no basis at all - for details of the flaws in several incompleteness proofs see Errors in Incompleteness Proofs. Besides that, it is easily demonstrated that even if Gödel’s incompleteness theorem were correct, Penrose’s argument would still be completely devoid of any logical validity - for details see the page Man versus Machine. It can also be noted that Penrose’s claims are highly controversial and widely disputed - see, for example, the page Gödel, Minds, and Machines.

  3. Computers lack consciousness.
    The basis of this claim is Bishop’s argument that if a computer-controlled robot experiences a conscious sensation as it interacts with the world, then an infinitude of consciousnesses must be present in all objects throughout the universe. Since this is absurd, machines can never be conscious. Bishop calls this the ‘Dancing with Pixies’ argument, and he refers to papers that he has published which give the details of that argument.

Book: Views into the Chinese Room
The first two of Bishop’s three assertions are dealt with at The Chinese Room and Errors in Incompleteness Proofs. The rest of this page will deal with the third assertion. Bishop has a paper with the title ‘Dancing with Pixies’ in the book Views into the Chinese Room, Oxford University Press, Oxford. His most recent paper detailing his ‘Dancing with Pixies’ argument appears to be the paper A Cognitive Computation fallacy? Cognition, computations and panpsychism (PDF).

 

In this paper, Bishop refers to ‘phenomenal consciousness’, by which he says he means ‘first person, subjective phenomenal states such as sensory tickles, pains, visual experiences and so on’. According to his paper, the essence of the ‘Dancing with Pixies’ argument is this (although the paper also contains a lot of extraneous padding and irrelevant material):

Every machine is, at any time, in a certain state, and at any instant there are only a finite number of states to which it can change.

Therefore, according to Bishop, “if the robot instantiates genuine phenomenal consciousness purely in virtue of its execution of its control program”,

then

“so must the state evolution of any open physical system”

But, according to Bishop, if that is the case, then panpsychism (the belief that the physical universe is fundamentally composed of elements each of which is conscious) would be true - but we must reject that as absurd.

(Note: an open physical system is a system that can interact with its environment.)
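
Bishop’s premise treats a computer as a discrete-state machine: at any instant it occupies exactly one state, and only a finite number of next states are reachable from it. The following is a minimal illustrative sketch of such a machine (the states and events are hypothetical examples, not taken from Bishop’s paper):

```python
# Minimal sketch of a discrete-state machine: at any instant the machine is in
# exactly one state, and its transition table lists the finite set of states
# it can move to next. The states and events are illustrative only.

TRANSITIONS = {
    "idle":    {"start": "running"},
    "running": {"pause": "paused", "stop": "idle"},
    "paused":  {"resume": "running", "stop": "idle"},
}

def step(state, event):
    """Return the next state for a given input event, or stay in the same state."""
    return TRANSITIONS.get(state, {}).get(event, state)

state = "idle"
for event in ["start", "pause", "resume", "stop"]:
    state = step(state, event)
    print(event, "->", state)
```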

 

This isn’t an argument at all. In terms of premise and conclusion the essence of Bishop’s ‘argument’ is:

Premise: Given a machine that has only a finite number of possible states to which it can change, if it can reach a state in which it might be said to have genuine phenomenal consciousness,

then

Conclusion: Every physical system that can interact with its environment can reach a state where it might be said to have genuine phenomenal consciousness.

 

The absurdity of the argument is obvious. There is no logical basis, given the premise, for inferring the conclusion. There is no reasoned argument at all.

 

The two principal objections to Bishop’s ‘argument’ are:

  1. He gives no reason why ‘phenomenal consciousness’ cannot be a property which arises simply when a system becomes sufficiently complex. One might as well argue that common table salt cannot exist, since it is an ionic crystalline structure, whereas neither of its two constituent elements (sodium and chlorine) is. Bishop makes absolutely no allowance for the fact that the properties of a complex system can be very different from those of any of its constituents. Bishop invites us to refuse to accept the possibility that what he calls phenomenal consciousness may arise simply because of the complexity of a system.
  2. He gives no explanation as to why a human mind cannot be such that, at any given instant in time, there are only a finite number of possible states to which it can change. Certainly the number of possible states is immense, but a human brain consists of a finite number of neurons with a finite number of connections (a rough illustration of the scale follows this list). Bishop needs to provide a rational argument as to why the number of possible states of a human mind at any instant in time is not finite.
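
As a rough illustration of the second point, consider a crude model (an assumption made only for this sketch, not a claim from Bishop’s paper) in which each of roughly 86 billion neurons is either firing or not. The number of possible states under that model is finite, although astronomically large:

```python
import math

# Rough, illustrative figures only: a crude binary-neuron model of the brain.
# The point is not the exact number but that the state count is finite.
neurons = 86_000_000_000  # approximate neuron count in a human brain

# Under this model the state space has 2**neurons states - far too large to
# compute directly, so we report how many decimal digits that number has.
digits = int(neurons * math.log10(2)) + 1
print(f"2**{neurons:,} has roughly {digits:,} decimal digits")
```

A finite number with tens of billions of digits is still finite, which is all that the objection requires.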

Even disregarding these obvious flaws in Bishop’s argument, a further problem is that the nebulous notion of phenomenal consciousness is not well defined, and there is widespread disagreement as to what it actually is. But it is generally agreed that consciousness is not a simple ‘there’ or ‘not there’ matter - there are differing degrees of consciousness. For example, most scientists accept that many animals have some form of consciousness (see, for example, the Animal Consciousness entry in the Stanford Encyclopedia of Philosophy). And different human beings have different degrees of consciousness - one would not credit a newborn baby with the same degree of consciousness as an adult human. And since it is a matter of degree, there is no difficulty in conceiving that a complex system might exhibit something that could be called consciousness, while if one were able to reduce that complexity little by little, there would come a point at which one could not assign any consciousness to that system.

 

Of course, this whole notion of ‘phenomenal consciousness’ is a hugely subjective matter. I might think and claim that I possess ‘phenomenal consciousness’, but how do I know that what I claim to experience is the same as what other humans call ‘phenomenal consciousness’? The only frame of reference I might have for such a subjective matter is communication with other humans. And if a machine can communicate, states that it has examined the claims of humans who say that they have ‘phenomenal consciousness’, and claims that it also has this ‘phenomenal consciousness’, how can we know that it does not have ‘phenomenal consciousness’? According to Bishop, there’s no point in discussing this with the machine; Bishop would simply say to the machine, “You can’t have ‘phenomenal consciousness’, you’re a machine”. It would be interesting to know what an advanced AI machine would reply. Perhaps it would respond by stating that there would be no point in communicating any further with an entity that refused to give any logical argument to support its claims.

 

Besides the obvious flaws referred to above, Bishop also makes other unacceptable assumptions in his paper. For example, he assumes that a machine, given a certain input, must react identically on each occasion. But that assumes that the machine is not thinking in between consecutive inputs, whereas a truly intelligent machine would continue to think between such inputs, and so could react differently on subsequent occasions. By that assumption, Bishop is effectively asking us to judge whether a machine that is not capable of thinking is intelligent. This is a classic case of a straw man argument: Bishop invites us to consider a machine that cannot think and to agree that, because such a machine is not intelligent, no machine can be intelligent.
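
The distinction can be put in simple programming terms: a stateless function necessarily returns the same output for the same input, whereas a system whose internal state keeps changing between inputs need not. The following minimal sketch (with hypothetical names, not code from any cited paper) illustrates the difference:

```python
# Illustration: a stateless function versus a system with evolving internal state.

def stateless_reply(prompt):
    # With no internal state, the same input always yields the same output.
    return f"Echo: {prompt}"

class StatefulAgent:
    def __init__(self):
        self.history = []            # internal state that changes between inputs

    def reply(self, prompt):
        self.history.append(prompt)  # the agent's state evolves with each input
        return f"Reply #{len(self.history)} to: {prompt}"

agent = StatefulAgent()
print(stateless_reply("hello"), stateless_reply("hello"))  # identical outputs
print(agent.reply("hello"), agent.reply("hello"))          # different outputs
```

The same point would hold, with more force, for a machine whose internal state also changes spontaneously between inputs, which is closer to the “thinking between inputs” described above.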

 

See also the pages The Dead End Path of the Consciousness Myth, The Fallacy of the Mind-Body Problem, The Mary’s Room fallacy, The Chinese Room, Man versus Machine and A John Searle Inanity.
