John Searle’s homunculus announces phased retirement

John Searle in the Chinese Room. Not pictured: Zhu Tao in the Searle Room.
After 54 years of teaching at Berkeley, the man inside John Searle’s head has announced he will be entering a three-year phased retirement after the end of the current semester. The diminutive Zhu Tao made the announcement at a press conference Monday in a rare out-of-costume appearance.
At the conference Zhu said he is retiring from his current position in order to spend more time with Searle’s family. “I have become quite attached to these people,” Zhu said through a translator. “Although, admittedly, not being able to understand a word they say has limited the intimacy of our relationships.”
Though it never afforded genuine understanding, for the most part Zhu’s English-to-English instruction manual served him well during his time as Searle. One notable exception was the famous Searle-Derrida debate, in which Searle leveled charges of “deliberate obscurantism” against Derrida and other deconstructionists. “Some people do not use words according to their prescribed manner,” said Zhu, reflecting on the exchange. “This results in great confusion.”
While he expressed sadness at the end of an era, Zhu looked back with pride at his time inside John Searle’s head. Zhu is popularly credited with sparking the shift away from brain-based cognition. Today that shift continues apace, with figures such as Andy Clark and David Chalmers outsourcing their thinking to call centers in India as part of a growing movement of philosophers who believe cognition can extend beyond the boundaries of one’s skull.
In the spring semesters of 2014–15 and 2015–16, Zhu will hold a visiting position at Tufts inside the head of Daniel Dennett, after which he will officially become emeritus at Berkeley.
Zhu qua Dennett plans to teach a course on “reconsidering the Cartesian theater.”
fauxphilnews
April 24, 2013 at 9:23 am
Ah Searle, who famously proved that people could never be conscious because their neurons aren’t conscience.
Carl
April 26, 2013 at 1:26 am
Citation? The Chinese Room argument does no such thing.
noen
April 30, 2013 at 6:45 am
Looking at the relevant Stanford Encyclopedia of Philosophy article ( http://plato.stanford.edu/entries/chinese-room/#5.1 ), some have interpreted Searle’s argument as one about not being able to construct a whole with a key property not possessed by the parts. For example, Dennett: “Searle’s view, then, comes to this: take a material object (any material object) that does not have the power of causing mental phenomena; you cannot turn it in to an object that does have the power of producing mental phenomena simply by programming it—reorganizing the conditional dependencies of transitions between its states.” (According to the article, this is from Dennett, 1987, ‘Fast Thinking’, in The Intentional Stance, Cambridge, MA: MIT Press, 324–337.)
So a single transistor/diode/electronic component (as in, say, a thermostat) has no power to create mental states; therefore no computer can, because a computer would just be a formal combination of powerless transistors, and formal programs cannot cause mind (note that only the last premise is claimed to be shown by the Chinese Room argument).
Replace transistor with neuron and we get the original comment.
Allan Olley
May 6, 2013 at 11:41 pm
What Dennett says about Searle’s view is correct. Searle rejects functionalism, the view that having mental states is just a matter of having the right functional organization, regardless of what you are made of. But Dennett does not (because he knows that this is not the way Searle argues) attribute to Searle an argument of the form: 1. Transistors are not conscious. 2. Computers have transistors as parts. 3. Nothing has a property unless all its parts have that property. 4. Therefore, no computers are conscious. If Searle had offered an argument of that form, it would be correct to say that he is committed to saying that we are not conscious because our neurons are not. But he did not. Searle rather offered a counterexample to the claim that understanding something is simply a matter of having the right input-output relations mediated by the right internal functional organization.
K. Ludwig
June 7, 2013 at 7:41 am
Keep in mind I’m only trying to establish that Carl’s flippant remark was an appropriate jocular response to Searle. I’m not trying to establish that it is a fair, carefully considered response.
However, you are right that I erred in equating the formal rearrangement Dennett describes with physical construction. Note, however, that any formal or functional rearrangement of a material thing in the sense used in, say, the Chinese Room (i.e. reprogramming a computer, or making a human obey a complex series of rules of action using a massive paper record-keeping system) is also a physical rearrangement (changes in magnetic domains on disc drives, electrical charge and flow patterns in a network of transistors, the opening and shutting of physical relays in an electromechanical machine, different arrangements of ink on paper, etc.). So Searle is committed to the view that a physical REconstruction of a system cannot give it new causal powers (such as the power of comprehension) if the reconstruction is purely formal or functional in character (how one avoids making this limitation question-begging, I’m really not clear), as when we physically reconstruct the state of a computer’s memory to enter a new program, or as the computer reconstructs/transforms its own state as it carries out such a program.
The point could be made that if a mass of neurons has no comprehension, specifically no power of comprehension, then no formal or functional rearrangement of the neurons and their connections could turn it into a brain with the power of comprehension. So if I believe that an unformed or malformed human brain (perhaps that of a neonate or fetus) has no power of comprehension, if I further think the only difference between the unformed brain and the comprehending brain of a normal human being is the formal/functional arrangement of neural connections, and if I accept the conclusion of the Chinese Room, then I’m forced to conclude that humans do not comprehend anything. That would seem to me a harmless restatement of Carl’s original joke (especially given that it is a joke and not a serious argument).
Searle of course just believes it is a brute fact that biological matter (including possibly the green slime of an alien with a transparent skull) has the causal power of comprehension and electronics does not, so he would just take it that any brain differing from a normal human one only by formal/functional rearrangement still has the power of comprehension (it need not be using that power at the time). I assume that would be his response to Carl’s joke taken in that light.
Finally, some of Searle’s own rhetoric suggests a suppressed argument that commits a fallacy of composition, as when he brings up the question of whether thermostats have a state of mind. If he were just depending on the explicit argument of the Chinese Room, the question of whether simple electronic components have states of mind would be about as relevant as whether starfish or other animals with a nervous system but no central nervous system (i.e. one step above free-floating neurons) have states of mind. His thermostat aside seems irrelevant to the main argument of the Chinese Room, but it is suggestive to me of the more simplistic argument Carl’s joke accuses him of.
Allan Olley
June 9, 2013 at 1:54 pm
Conscious[ness] and conscience are, shall we say, distinct.
ChrisTS
April 30, 2013 at 9:13 pm
To say the least.
Peter Hardy
May 1, 2013 at 5:48 am
[…] Read more at fauxphilnews […]
John Searle’s homunculus announces phased retirement | Philosophy @ MHS
April 29, 2013 at 2:10 am
Zhu will be missed….
Bruno Verbeek
June 6, 2013 at 4:28 am
this is awful, misinformed, and not funny.
shut up
July 5, 2013 at 6:22 am