
May 26, 2008

Comments


Richard

Indexical facts are odd. It's not clear that they're really facts at all. (Perhaps facts are just true propositions. And there are no indexical propositions. Instead, a belief like 'The sugar is spilling from my trolley' will express different propositions for different people.) Anyway, it's not clear that the kind of 'perspective' involved in de se attitudes is the same as having a phenomenal perspective, or 'something it is like' to be one. One imagines a simple robot might be programmed to reason about its location in the world, and thus to have de se attitudes of a sort, but that wouldn't suffice to show that Strong AI is possible (let alone realized in this simple robot).

Anyway, I don't take phenomenal facts to be anything like de se attitudes. 'RC is in phenomenal state X' is a perfectly objective, descriptive (non-indexical, de dicto) fact about the world, no less than 'RC is in physical state Y'.

You ask: "what is supposed to make me wrong [in projecting consciousness onto a zombie?]"

As you predict, my response is that you're wrong about (the fundamental fact) whether the object is conscious. But let me try to bring out your intuitions a bit...

First: what else do you think your claim is about? If you were merely trying to make a claim about the object's behaviour, or its physical or functional structure, that could vindicate your claims to infallibility [modulo "the usual ways of going wrong" -- mistaking a lifelike doll for a human, or other mistakes of physical fact]. But presumably you're trying to talk about something more than just that. So, whatever that 'something more' is (e.g., the fundamental fact of what it is like to be the object), you could be wrong about that.

If you prefer property-talk, how about this: you're mistaken about what phenomenal properties (if any) the zombie instantiates.

At a higher level of explanation, we might wonder, why were we wrong about this? How did the world get to be such that the zombie lacks the phenomenal properties we're inclined to attribute to him? Here the property dualist might say: the zombie world lacks the psycho-physical bridging laws that our world has. Alongside the various physical laws and constants, it's a nomic necessity that physical state X gives rise to phenomenal state Y. But the zombie world lacks this law, so the physical state doesn't give rise to any phenomenal state at all. It's just so much matter. So, when you think about what that world is like (in descriptive terms), don't those facts help make intuitive sense of why one would be mistaken to project consciousness onto its inhabitants? Doesn't it seem to you that there is some additional fact there, as to whether a material organism is conscious or not, for you to be mistaken about?

(Can you imagine being wrong about whether, e.g., various animal species are conscious -- and not just because of any merely physical error about their brain and behaviour? What prevents you from extrapolating this to otherworldly humans?)

Aaron Boyden

No, I don't think I can imagine being wrong about whether various animal species are conscious without making some physical error about their brain or behavior (or perhaps some logical error; I make those, too).

Isn't the question of whether phenomenal consciousness is something more than physical and functional structure precisely the point at issue?

Richard

Well, some people theorize that consciousness might be reducible to mere physical and functional structure, and the correctness of this theory is indeed what the ultimate dispute is over. But I thought you were making a more intuitive claim, about what you can pre-theoretically make sense of. I thought most people at least acknowledged that consciousness is conceptually distinct from physical-functional concepts, so I was hoping to bring out your intuitions here.

If really all you mean by 'consciousness' is some complex physical-functional arrangement, then we're talking past each other. But I don't think that's what you mean, or there'd be no need to talk of 'projecting', etc. (We don't need to project our physical properties onto other beings; such properties are already manifest in the object.)

Instead, I suspect, you're implicitly presupposing physicalism and thus reporting your theory-laden judgment that physical facts are the only facts there are for one to be mistaken about. If you've really internalized this theory, I guess zombies won't seem conceivable to you, in the ordinary sense of 'conceivable'. Though I'm not sure that this matters for my argument, which instead appeals to the technical notion of being 'not a priori false'. Or do you think it's a priori that anything in physical state P must also be in phenomenal state Q?

Aaron Boyden

I think I'm not happy with your dichotomy. I don't think I mean anything non-standard by consciousness, but I do think I mean some complex physical-functional arrangement. The talk of projection was not intended to do more than reflect the fact that we represent others as conscious largely by maintaining internal simulations of them, which is a different procedure from the one we use with most properties, as you note. I don't see that anything of metaphysical significance hangs on that, though.

As to whether I think it's a priori that anything in physical state P must also be in phenomenal state Q, I suppose the answer is yes. To bring up another popular analogy, I find the idea of a physical duplicate of our world which doesn't contain consciousness about as sensible as the idea of a physical duplicate of our world which doesn't contain tables. Now, I can't specify the a priori conditions of tablehood which would enable one to derive the existence of tables from the assorted physical facts, but I'm pretty sure they exist. Same with consciousness, as I see it.

Admittedly, I tend to equate the a priori with the analytic, so I have some weird ideas about the a priori. But the standard materialist position was that mind-brain identity is on a par with the alleged a posteriori necessities. You seem to criticize these (and I sympathize), but your criticism seems to be that they're not really a posteriori, not that they're not necessary, so my position is that a materialist should take this on board and say yes, they're a priori.

* Edited to remove serious confusions in final paragraph.
