Searle's “Chinese Room”—Can Artificial Intelligence Understand?

It’s an interesting thought experiment. Jeffrey Kaplan does a great job of explaining it.

But a child does not learn to speak in a vacuum. Listening is something that happens in real-life situations, so observations about the context in which a word is spoken contribute to a child’s semantic understanding of that word. Consider the following illustration: a mother says the word “apple” as she picks up the fruit at a grocery store. Her son associates “apple” not only with its syntax, i.e., its position relative to the other words his mother says, but with the full spectrum of phenomena he experiences when she says it: the apple’s color, its scent, and so on.

LLMs, by contrast, are trained on text alone: all they ever observe is the position of words relative to other words. Since they do not experience anything, they have no access to the contexts in which words are used, and so they cannot induce the semantic content of words the way a child does. Therefore, no matter how convincing their output becomes, we should continue to regard them as philosophical zombies.

Don’t believe me? Let’s hear it from the horse’s mouth: @Shalmaneser, do you have subjective conscious experience?
