Thursday, January 03, 2008

A response to some people on Debunking Christianity

I was explaining the structure of the argument, not attempting to defend the premises. I was making the rather narrow point that Parsons had the structure wrong. At least the argument I have endeavored to defend does not have that kind of structure, and I don't think Lewis's did either.

I dislike using terms like "magical" or "supernatural." I prefer to argue that if there is to be reason in the world, intentional explanations must be basic explanations. Nor am I denying that physical states can be correlated with mental states, or that physical changes can cause mental changes. What I am saying is that when you add up all the truths about how physical states are arranged, they don't entail any unique truths about what the mental states are. Physical states don't, and can't, entail mental states, in much the way they don't and can't entail moral truths. Naturalists like Quine and Dennett agree with me on this. Do you think they are wrong?

Physical states, including the states of a computer playing chess, are indeterminate with respect to mental states. The programmers create a physical system which mimics proper chess-playing given a framework of meaning provided by humans. The move Rf6 on my computer screen, played by Fritz (who kicks my butt on a daily basis, in case anyone is wondering), has a meaning relative to my understanding of chess, a meaning which Fritz itself lacks. It is only by anthropomorphizing the silicon monster that we get determinate meanings for its moves. The laws of chess have nothing to do with what the computer does; human programmers give the physical motions of the computer a context of meaning that allows us to see those motions as chess moves.
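To make the point concrete, here is a minimal sketch in Python (not pretending to be Fritz's actual internals): an engine's "move" is just a handful of small integers, and the reading "Rf6" is supplied entirely by a human convention layered on top.

```python
# A minimal sketch: the machine's state is three integers with no
# intrinsic chess meaning. Only a human-supplied convention turns
# them into the rook move "Rf6".
FILES = "abcdefgh"

def to_algebraic(piece_code, from_sq, to_sq):
    """Human-supplied interpretation: map raw indices to notation.

    Squares are numbered 0-63, a1 = 0, h8 = 63. That 'R' means rook
    and 'f6' means that square lives in the conventions of chess,
    not in the integers the machine shuffles around.
    """
    piece_letters = {1: "N", 2: "B", 3: "R", 4: "Q", 5: "K"}
    file_, rank = to_sq % 8, to_sq // 8
    letter = piece_letters.get(piece_code, "")  # pawns get no letter
    return f"{letter}{FILES[file_]}{rank + 1}"

# Only under our convention does (3, 29, 45) "mean" Rf6.
print(to_algebraic(3, 29, 45))  # -> "Rf6"
```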

But if physical states are indeterminate with respect to meaning, can it be that we have no determinate mental states or propositional attitudes? If so, then it is never literally true that we add, subtract, multiply or divide. Ever read Kripke on Wittgenstein? If we literally perform the operation 2 + 2 = 4, then we understand the meanings of 2, +, =, and 4. What our thoughts are about must be exactly those meanings. But the physical is indeterminate with respect to mental content. This means that determinacy of meaning must come from someplace other than the physical.
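Kripke's own illustration is the "quus" function, which a short sketch can make vivid: two incompatible rules that agree on every calculation anyone has actually performed, so the finite physical record of our past behavior cannot by itself settle which rule we were following.

```python
# A sketch of Kripke's "quus" example. The threshold 57 is Kripke's
# own illustrative choice.

def plus(x, y):
    return x + y

def quus(x, y):
    # Agrees with plus below the threshold, then deviates.
    return x + y if x < 57 and y < 57 else 5

# Every sum we've checked so far is consistent with BOTH rules:
history = [(2, 2), (10, 31), (40, 16)]
assert all(plus(x, y) == quus(x, y) for x, y in history)

# So the finite record of past calculations does not determine
# whether we were "adding" or "quadding".
print(plus(60, 60), quus(60, 60))  # 120 vs 5 -- the rules come apart
```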

Or maybe we don't literally add, subtract, multiply and divide. We only simulate it. But if that's the case, how do we know what it is we're simulating?


2 Comments:

At 1/04/2008 01:01:00 PM , Blogger Doctor Logic said...

Victor,

Physical states, including states of a computer, are indeterminate with respect to mental states.

While this may be technically true, it's not helpful. It's like saying that physical states are indeterminate with respect to my state of wealth. There are many physical states that might correspond to me being a millionaire on paper. However, that does not mean that I cannot be a millionaire in the physical sense. Likewise, there are many physical states of my brain consistent with a single, recognizable mental state within my mind, but that does not mean that my mind cannot be a physical thing.
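A toy sketch of that multiple-realizability point (the figures and asset names are invented for illustration): many distinct physical arrangements can realize one and the same high-level state.

```python
# Many microstates, one macrostate: the high-level state "millionaire"
# is a function of the underlying physical arrangement, and plenty of
# different arrangements realize it.

def net_worth(ledger):
    """The macrostate is computed from the microstate."""
    return sum(ledger.values())

# Three different arrangements of assets...
states = [
    {"cash": 1_000_000},
    {"cash": 200_000, "stocks": 800_000},
    {"house": 600_000, "stocks": 300_000, "cash": 100_000},
]

# ...all realize the same macrostate.
assert all(net_worth(s) >= 1_000_000 for s in states)
```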

Also, I don't think Quine's indeterminacy of translation means what you think it means. Quine says that if X in your language translates to Y in mine (from an omniscient view), then it would take an infinite amount of data for me to be sure I have the correct translation. But this does not speak to whether you yourself have total precision about X in the first place. In reality, you don't have infinite precision, and special cases like this come up all the time ('no true Scotsman' is a fair example). Is a rap artist a singer? Answering such questions forces us to make our definitions more precise. IOW, Quine's idea applies just as well to non-physical minds as physical ones, and to ideas in our own languages.

Here, you take time talking about why Deep Blue doesn't know what it is computing about, relative to a human player. I agree. But again, this is not helpful. Deep Blue is not self-aware. Its awareness is limited to positions on a chess board. If our awareness were limited to positions on a chess board (and not internal states or states beyond the chess board), then we would not appreciate the meaning of the game either.

I think this is a flaw in most of the AfR arguments. They all fail to ask what it is about human thought that makes it intentional.

The answer is that intentional thought is predictive about that intent, and intentional thought is closely linked with the power of recognition. My thoughts about chess are not limited to positions on a board. My thoughts are deeper than Deep Blue's because they relate to observations I might make about my opponent, about my own emotions (and my opponent's), about analogies between chess and warfare, and thousands of other things. I can also devise new theories about all these things on and off the chess board. And these thoughts of mine are such that I will recognize experiences that the thoughts are about.

Now the question you have to ask is not whether Deep Blue understands anything beyond the abstract chess board. Of course it cannot, because it has no facility to compare thoughts to reality, let alone devise fresh theories. The real question is whether a machine that has such abilities can be conscious like we are. If a machine has sensory inputs, is aware of its own mental processes, makes predictions about both, and therefore beliefs about what will be the case, can it then be conscious? That's a much more interesting question.
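A purely schematic sketch of the kind of machine described here, with invented names throughout; nothing about it settles the consciousness question, it only shows the sense-predict-revise loop in miniature.

```python
# Schematic agent: sensory input, predictions about world and self,
# beliefs revised when predictions fail. Whether such a system could
# be conscious is exactly the open question.

class Agent:
    def __init__(self):
        self.beliefs = {}  # propositions the agent currently holds

    def predict(self, situation):
        # Derive an expectation from current beliefs (stubbed here).
        return self.beliefs.get(situation)

    def step(self, situation, observation):
        expected = self.predict(situation)
        if expected != observation:
            # Prediction failed: revise the belief to match experience.
            self.beliefs[situation] = observation
        # Self-monitoring: the agent also forms beliefs about its own
        # states, not just about the external world.
        self.beliefs["last_prediction_correct"] = (expected == observation)
```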

 
At 1/24/2008 10:03:00 AM , Blogger Shackleman said...

doctor logic: Its awareness is limited to positions on a chess board

No it isn't. The computer is not aware of *anything*. It's shuffling around zeros and ones at the prompting of an end user, in an order or pattern that has been pre-determined by a programmer. It has absolutely *no* idea what *any* of the zeros and ones represent, why it's shuffling them around in various positions, or anything else. A computer is quite literally nothing more than an expensive abacus.
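A small sketch of that "expensive abacus" point: the very same bit pattern has no intrinsic meaning, and what it "represents" depends entirely on an interpretation supplied from outside the machine.

```python
# One bit pattern, several readings -- none of which is a fact about
# the hardware itself.

bits = 0b01010010  # the machine just holds this pattern

print(bits)        # as a number:     82
print(chr(bits))   # as ASCII text:   'R' (a rook, to a chess player)
print(bin(bits))   # as raw binary:   '0b1010010'
```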

doctor logic: Now the question you have to ask is not whether Deep Blue understands anything beyond the abstract chess board.

Your premise is wrong. Not only can the computer *not* understand anything *beyond* the chess board, it doesn't understand anything *at all*.

doctor logic, you frequently compare what computers do to the act of thinking. Computers do *no* thinking. None. Again, they shuffle around representational zeros and ones. That's it. End of story.

Unless you're suggesting that "thought" just is the act of shuffling around atoms in the same way that computers shuffle around representational ones and zeros, your entire (frequently repeated) premise is incoherent and reflects a common misunderstanding of how computers actually work and what they actually do.

 
