Richard Chappell has recently been engaged in a defense of Chalmers's zombie argument; the discussion is ongoing, but an early summary of his position is here. One issue has come to worry me. Chappell keeps emphasizing that he's talking about special non-third-personal facts, which is why he's not impressed by Brown's argument against non-physical third-personal phenomenal facts. Now, there are first-personal facts which are known to be irreducible to anything third-personal, as discussed for example in Lewis's "Attitudes De Dicto and De Se." But no zombie scenario is needed to show this. No duplicate of me is me, no matter how close the physical match (and as a counterpart theorist, I'd say that this holds across possible worlds: no physical duplicate of me in another possible world really is me in any strict sense).
Of course, that nobody except me has my perspective does not show that nobody except me has any perspective at all. At least, so one hopes, though the zombie argument does seem to invite us to take solipsism more seriously. It seems that via some sort of empathy, I can imagine being someone else. And I am inclined to think that, within the limitations of the accuracy of my empathic imaginings, the others I empathize with are actually having something like the consciousness of themselves that I'm imagining having.
A zombie scenario is one in which I would be making some mistake in thus projecting my imagined consciousness onto the zombie. Of course, many such mistakes are always to be expected, since there's so much I don't know about what's really going on with the others, but in the zombie case the error is supposed to be total, and not based on any of the usual ways of going wrong. If I think I'm imagining what it's like to be a zombie, I'm automatically entirely wrong.
This leads me to wonder what is supposed to make me wrong. I suppose Chappell wouldn't be very impressed with the question, as he thinks it's a brute fact whether consciousness is present or not, so he doesn't think any explanation should be expected. But I have to say that my sincere inability to figure out what could make me wrong makes me rather inclined to think I can't really conceive of zombies after all.
I suppose this could be turned against me, and I could be asked how I know that nobody else is me. Of course, perhaps I just don't. But perhaps it is just a simple matter of logic that I am who I am, and nobody else is; if it is thus a priori, then this would be a clear difference from the zombie case, where no a priori rescue seems available. I confess that this does strike me as plausible (interesting though the Buddhist alternative response is).
Indexical facts are odd. It's not clear that they're really facts at all. (Perhaps facts are just true propositions. And there are no indexical propositions. Instead, a belief like 'The sugar is spilling from my trolley' will express different propositions for different people.) Anyway, it's not clear that the kind of 'perspective' involved in de se attitudes is the same as having a phenomenal perspective, or 'something it is like' to be one. One imagines a simple robot might be programmed to reason about its location in the world, and thus to have de se attitudes of a sort, but that wouldn't suffice to show that Strong AI is possible (let alone realized in this simple robot).
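(If it helps, here is a purely illustrative sketch, in Python, of roughly what such a robot's self-locating reasoning might look like. Everything in it is hypothetical and invented for illustration; nothing about it suggests there is anything it is like to be the robot.)

```python
# A toy robot that maintains a self-locating ('de se') belief about
# where it is, updated from (simulated) egocentric sensor readings.
# All names are hypothetical; this only illustrates that self-locating
# reasoning is cheap to implement, not that the robot is conscious.

class Robot:
    def __init__(self, location):
        # The robot's de se belief: "I am here."
        self.believed_location = location

    def sense_and_update(self, sensed_offset):
        # Revise "where I am" on the basis of egocentric sensor data.
        x, y = self.believed_location
        dx, dy = sensed_offset
        self.believed_location = (x + dx, y + dy)

    def report(self):
        # De dicto: "the robot is at (x, y)" -- true of any duplicate.
        # De se: "I am at (x, y)" -- self-ascribed by this robot alone.
        return f"I believe I am at {self.believed_location}"

r = Robot((0, 0))
r.sense_and_update((2, 3))
print(r.report())  # I believe I am at (2, 3)
```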
Anyway, I don't take phenomenal facts to be anything like de se attitudes. 'RC is in phenomenal state X' is a perfectly objective, descriptive (non-indexical, de dicto) fact about the world, no less than 'RC is in physical state Y'.
You ask: "what is supposed to make me wrong [in projecting consciousness onto a zombie?]"
As you predict, my response is that you're wrong about (the fundamental fact) whether the object is conscious. But let me try to bring out your intuitions a bit...
First: what else do you think your claim is about? If you were merely trying to make a claim about the object's behaviour, or its physical or functional structure, that could vindicate your claims to infallibility [modulo "the usual ways of going wrong" -- mistaking a lifelike doll for a human, or other mistakes of physical fact]. But presumably you're trying to talk about something more than just that. So, whatever that 'something more' is (e.g., the fundamental fact of what it is like to be the object), you could be wrong about that.
If you prefer property-talk, how about this: you're mistaken about what phenomenal properties (if any) the zombie instantiates.
At a higher level of explanation, we might wonder, why were we wrong about this? How did the world get to be such that the zombie lacks the phenomenal properties we're inclined to attribute to him? Here the property dualist might say: the zombie world lacks the psycho-physical bridging laws that our world has. Alongside the various physical laws and constants, it's a nomic necessity that physical state X gives rise to phenomenal state Y. But the zombie world lacks this law, so the physical state doesn't give rise to any phenomenal state at all. It's just so much matter. So, when you think about what that world is like (in descriptive terms), don't those facts help make intuitive sense of why one would be mistaken to project consciousness onto its inhabitants? Doesn't it seem to you that there is some additional fact there, as to whether a material organism is conscious or not, for you to be mistaken about?
(Can you imagine being wrong about whether, e.g., various animal species are conscious -- and not just because of any merely physical error about their brain and behaviour? What prevents you from extrapolating this to otherworldly humans?)
Posted by: Richard | May 26, 2008 at 07:30 PM
No, I don't think I can imagine being wrong about whether various animal species are conscious without making some physical error about their brain or behavior (or perhaps some logical error; I make those, too).
Isn't the question of whether phenomenal consciousness is something more than physical and functional structure precisely the point at issue?
Posted by: Aaron Boyden | May 27, 2008 at 11:20 AM
Well, some people theorize that consciousness might be reducible to mere physical and functional structure, and the correctness of this theory is indeed what the ultimate dispute is over. But I thought you were making a more intuitive claim, about what you can pre-theoretically make sense of. I thought most people at least acknowledged that consciousness is conceptually distinct from physical-functional concepts, so I was hoping to bring out your intuitions here.
If really all you mean by 'consciousness' is some complex physical-functional arrangement, then we're talking past each other. But I don't think that's what you mean, or there'd be no need to talk of 'projecting', etc. (We don't need to project our physical properties onto other beings; such properties are already manifest in the object.)
Instead, I suspect, you're implicitly presupposing physicalism and thus reporting your theory-laden judgment that physical facts are the only facts there are for one to be mistaken about. If you've really internalized this theory, I guess zombies won't seem conceivable to you, in the ordinary sense of 'conceivable'. Though I'm not sure that this matters for my argument, which instead appeals to the technical notion of being 'not a priori false'. Or do you think it's a priori that anything in physical state P must also be in phenomenal state Q?
Posted by: Richard | May 27, 2008 at 12:49 PM
I think I'm not happy with your dichotomy. I don't think I mean anything non-standard by consciousness, but I do think I mean some complex physical-functional arrangement. The talk of projection was not intended to do more than reflect the fact that we represent others as conscious largely by maintaining internal simulations of them, which, as you note, is a different procedure from the one we use with most properties. I don't see that anything of metaphysical significance hangs on that, though.
As to whether I think it's a priori that anything in physical state P must also be in phenomenal state Q, I suppose the answer is yes. To bring up another popular analogy, I find the idea of a physical duplicate of our world which doesn't contain consciousness about as sensible as the idea of a physical duplicate of our world which doesn't contain tables. Now, I can't specify the a priori conditions of tablehood which would enable one to derive the existence of tables from the assorted physical facts, but I'm pretty sure they exist. Same with consciousness, as I see it.
Admittedly, I tend to equate the a priori with the analytic, so I have some weird ideas about the a priori. But the standard materialist position was that mind-brain identity is on a par with the alleged a posteriori necessities. You seem to criticize these (and I sympathize), but your criticism seems to be that they're not really a posteriori, not that they're not necessary; so my position is that a materialist should take this on board and say yes, they're a priori.
* Edited to remove serious confusions in final paragraph.
Posted by: Aaron Boyden | May 27, 2008 at 05:09 PM