Recall the contrast between common-sense functionalism and scientific functionalism. Block calls scientific functionalism “psychofunctionalism.” I will use that name here.
In “Troubles with Functionalism,” Block first presents arguments directed against common-sense functionalism (in sections 1.2-1.4). Then (in parts of the article not included in our selection), he goes on to present some arguments directed against psychofunctionalism (in sections 2.1 and 3.0). Finally, in section 3.1 of the paper, he raises worries for both varieties of functionalism, about how to specify inputs and outputs.
Block’s first objection to common-sense functionalism is that it’s too “liberal” — that is, it attributes mental states to too many things, including things which intuitively have no mental life.
Block offers the following as an example:
Homunculi-head. Imagine a body externally like a human body, say yours, but internally quite different. The neurons from sensory organs are connected to a bank of lights in a hollow cavity in the head. A set of buttons connects to the motor-output neurons. Inside the cavity resides a group of little men. Each has a very simple task: to implement a “square” of an adequate machine table that describes you. On one wall is a bulletin board on which is posted a state card, i.e., a card that bears a symbol designating one of the states specified in the machine table. Here is what the little men do: Suppose the posted card has a ‘G’ on it… Suppose the light representing input I17 goes on. One of the G-men has the following as his sole task: when the card reads ‘G’ and the I17 light goes on, he presses output button O191 and changes the state card to ‘M’… In spite of the low level of intelligence required of each little man, the system as a whole manages to simulate you because the functional organization they have been trained to realize is yours… (p. 278)
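The job Block assigns to each little man is, in effect, one entry ("square") of a finite-state machine table. Here is a minimal sketch of that idea in code: the states 'G' and 'M', input I17, and output O191 come from Block's example, while the second table entry is hypothetical filler added to show the shape of the table.

```python
# One "square" of a machine table maps (current state, input) to
# (output, next state). Each homunculus implements exactly one square.
machine_table = {
    ("G", "I17"): ("O191", "M"),   # the G-man's sole task, from Block's example
    ("M", "I3"): ("O5", "G"),      # hypothetical additional square
}

def step(state, signal):
    """Do one homunculus's job: look up the square, press the output
    button, and change the posted state card."""
    output, next_state = machine_table[(state, signal)]
    return output, next_state

# step("G", "I17") presses O191 and posts 'M'
```

The point of the example is that nothing in this lookup requires any intelligence of the individual who performs it; the system's overall behavior is fixed by the table as a whole.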
Block also tells a variant of this story in which the homunculi are replaced by the citizens of China (China-brain). In both cases, he argues, the common-sense functionalist is committed to saying that the system has mental states — indeed the same mental states that you have, since the system has the same functional organization that you have. But this is intuitively the wrong result, Block says. These systems don’t seem to have any mental states. Even if we were willing to grant that the Homunculi-head or the China-brain had propositional attitudes, it is especially doubtful whether they have any qualitative mental states, like pain or perceptual experiences. So common-sense functionalism errs in its claim that any system with the same functional organization as you will have all the same mental states that you have.
(The rest of Block’s article wasn’t included in our reading selection, but I’ll summarize what happens there anyway. Here is the full article if you’re interested.)
Putnam proposed that we emend common-sense functionalism, so that it says that creatures with the same functional organization as you will have the same mental states as you if and only if the internal mechanisms which realize their functional organization do not depend on the activities of things which themselves have minds.
Block objects that this emendation is (i) ad hoc, and (ii) too strong. It is too strong in that it rules out the possibility of some creatures who intuitively would share all of your mental states. (Imagine tiny creatures who build spaceships that mimic the behavior of our elementary physical particles. Imagine that colonists eat food made up of such creatures, and gradually come themselves to be composed of the tiny creatures. According to Putnam’s emendation, this will make a difference to the mental life of the colonists. But that seems the wrong result. In this case, unlike the Homunculi-head case, the presence and activities of the little creatures inside you do not seem to make any intuitive difference to what sort of mental life you would enjoy.)
Block’s next objections against common-sense functionalism occur in section 1.4. These objections all target the fact that common-sense functionalism attempts to define all mental states in terms of widely-known platitudes, or in terms of conceptual analyses of the meanings of mental terms. The worry is that what ordinary people commonly believe about the mind may be too sparse, or simply mistaken, to fix the nature of our mental states.
Block next turns to psychofunctionalism. The psychofunctionalist concerns himself with systems which are functionally equivalent not just in common sense respects, but also in terms of the functional characteristics of their underlying cognitive mechanisms. Since there are important functional differences between the cognitive mechanisms in our brains and the mechanisms in the Homunculi-head and China-brain, psychofunctionalism is not committed to saying that those systems have mental states.
Block thinks that psychofunctionalism still has trouble accounting for qualitative states like pain and perceptual experiences. He gives an inverted spectrum argument to try to show that experiences might differ qualitatively even though they have the same causal role, and are counted by the psychofunctionalist as equivalent. So the qualitative features of experience cannot be defined in functional terms. (We encountered these inverted spectrum arguments before, and may discuss them more in coming classes.) If Block’s argument here works, it will also work against common-sense functionalism.
In addition, Block complains that psychofunctionalism is too “chauvinist” — that is, it denies mental states to too many things, including things which intuitively have those mental states. He tells a story in which we encounter Martians who are equivalent to us in all common-sense functional respects, but not in terms of their underlying cognitive mechanisms:
We develop extensive cultural and commercial intercourse with [the Martians]. We study each other’s science and philosophy journals, go to each other’s movies, read each other’s novels, etc. Then Martian and Earthian psychologists compare notes, only to find that in underlying psychology, Martians and Earthians are very different… Imagine that what Martian and Earthian psychologists find when they compare notes is that Martians and Earthians differ as if they were the end products of maximally different design choices (compatible with rough functional equivalence in adults). Should we reject our assumption that Martians can enjoy our films, believe their own apparent scientific results, etc.?… Surely there are many ways of filling in the Martian/Earthian difference I sketched on which it would be perfectly clear that even if Martians behave differently from us on subtle psychological experiments, they nonetheless think, desire, enjoy, etc. To suppose otherwise would be crude human chauvinism. (pp. 310-11)
So common-sense functionalism and psychofunctionalism each face a number of troubles. In the final section of his paper, Block raises a problem which affects both versions of functionalism. This problem has to do with how the functionalist specifies inputs and outputs. (This is also something we discussed before.)
The common-sense functionalist specifies inputs and outputs in the same way the behaviorist does: input in terms of light and sound falling on one’s sense-organs, and output in terms of movements of arms and legs. Since the common-sense functionalist defines mental states in terms of causal relations to these inputs and outputs, the only creatures capable of having those mental states will also have to have inner states standing in causal relations to inputs and outputs of those sorts. But what about creatures that lack our sense-organs, and lack arms and legs? Do we really want to say that such creatures are incapable of having any of the mental states we have?
The psychofunctionalist specifies inputs and outputs in terms of neural activity. Since the psychofunctionalist defines mental states in terms of causal relations to these inputs and outputs, the only creatures capable of having those mental states will also have to have inner states standing in causal relations to inputs and outputs of those sorts. But what about creatures with different neural structures than ours? Or creatures with no neurons? Do we really want to say that such creatures are incapable of having any of the mental states we have?
Perhaps these functionalists can characterize inputs and outputs in more abstract terms. Then they would count a system as functionally equivalent to you just in case it had some inputs (input1, input2,...) and outputs (output1, output2,...) — no matter what those inputs and outputs are intrinsically like — which are isomorphic to your inputs and outputs, and which stand in the same functional relations to the system’s internal states as your inputs and outputs stand in to your internal states.
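To see how permissive this abstract notion of equivalence is, here is a toy sketch (not from Block; all the labels are invented). Two systems count as functionally equivalent whenever some relabeling of one system's states, inputs, and outputs maps its transition table exactly onto the other's, regardless of what the labels intrinsically stand for.

```python
# Your transition table, with hypothetical sensory inputs and motor outputs.
you = {
    ("S1", "in_light"): ("out_arm", "S2"),
    ("S2", "in_sound"): ("out_leg", "S1"),
}

# A system realized in entirely different stuff, e.g., economic magnitudes.
other = {
    ("credit_state", "influx_a"): ("outflux_a", "debit_state"),
    ("debit_state", "influx_b"): ("outflux_b", "credit_state"),
}

# A proposed relabeling, pairing each of your states/inputs/outputs
# with one of the other system's. It can be as bizarre as we like.
mapping = {
    "S1": "credit_state", "S2": "debit_state",
    "in_light": "influx_a", "in_sound": "influx_b",
    "out_arm": "outflux_a", "out_leg": "outflux_b",
}

def isomorphic(table_a, table_b, f):
    """True if relabeling table_a via f yields exactly table_b."""
    relabeled = {
        (f[s], f[i]): (f[o], f[s2])
        for (s, i), (o, s2) in table_a.items()
    }
    return relabeled == table_b
```

Since nothing constrains what the relabeled states and inputs must intrinsically be, any system with the right pattern of transitions qualifies, which is just the opening the sheik exploits in Block's next example.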
The problem with this approach is that it is too “liberal” — it would count too many things as being functionally equivalent to you. Block offers the following example:
Economy of Bolivia. Economic systems have inputs and outputs, e.g., influx and outflux of credits and debits. And economic systems also have a rich variety of internal states, e.g., having a rate of increase of GNP equal to double the Prime Rate. It does not seem impossible that a wealthy sheik could gain control of the economy of a small country, e.g., Bolivia, and manipulate its financial system to make it functionally equivalent to a person, e.g., himself. If this seems implausible, remember that the economic states, inputs, and outputs designated by the sheik to correspond to his mental state, inputs, and outputs, need not be “natural” economic magnitudes… The mapping from psychological magnitudes to economic magnitudes could be as bizarre as the sheik requires. (p. 315)
Since it is unbelievable that the sheik’s activities could make the economy of Bolivia start to enjoy a mental life, something must be wrong with the view which says that functional equivalence of this liberal sort to you suffices for having the same mental states as you have. There have to be more substantial constraints on what can count as an input, an output, and an internal state.
It is not at all obvious what will be the best route for the functionalist to take here. None of these different ways of specifying inputs and outputs seems fully satisfactory.