Bridging the Gap Between Philosophy and Neuroscience

A classic obstacle that scientists have encountered in studying consciousness is the problem of how separately processed streams of sensory information are integrated, otherwise known as the binding problem. Applied to the visual system, this is the question of how, when an individual perceives a table, for instance, features such as shape, color, and motion all give rise to the single conscious percept of a table. One theory that has recently been advanced as a possible solution to the binding problem is the so-called 40-Hertz hypothesis. This hypothesis holds that the synchronized firing of numerous neural assemblies at a common frequency provides a unity among the different aspects of visual perception. Some scientists have heralded this synchrony as a potential neural correlate of visual awareness. While the 40-Hertz hypothesis remains tenuous and has not been rigorously verified experimentally, it is an example of how science could conceivably discover a neural correlate of awareness.
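
To make the synchrony idea concrete, the sketch below uses a standard Kuramoto model of coupled phase oscillators, an assumption of this illustration rather than anything claimed by the hypothesis itself: a handful of simulated "assemblies" with natural frequencies near 40 Hz fall into phase lock once they are coupled. All parameters are illustrative.

```python
import numpy as np

# A minimal, hypothetical sketch of the intuition behind the 40-Hertz hypothesis:
# several "neural assemblies" are modeled as Kuramoto phase oscillators with
# natural frequencies near 40 Hz. With sufficient coupling their phases lock,
# a simple stand-in for "synchronized firing at a common frequency".
# This is not a model of real neurons; every number here is arbitrary.

rng = np.random.default_rng(0)

n = 5                                   # number of assemblies (shape, color, motion, ...)
dt = 1e-4                               # integration step in seconds
steps = 20000                           # 2 seconds of simulated time
coupling = 30.0                         # coupling strength (arbitrary units)

freqs = 2 * np.pi * rng.normal(40.0, 1.0, n)   # natural frequencies near 40 Hz (rad/s)
phases = rng.uniform(0, 2 * np.pi, n)          # random initial phases

def order_parameter(theta):
    """Kuramoto order parameter: 1.0 means perfect phase synchrony."""
    return np.abs(np.mean(np.exp(1j * theta)))

print("initial synchrony:", round(order_parameter(phases), 3))

for _ in range(steps):
    # each oscillator is pulled toward the phases of all the others
    interaction = np.sin(phases[None, :] - phases[:, None]).mean(axis=1)
    phases += dt * (freqs + coupling * interaction)

print("final synchrony:  ", round(order_parameter(phases), 3))
```

Running the script prints the order parameter before and after the simulation; a final value near 1.0 indicates the kind of common-frequency synchrony the hypothesis appeals to.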

Let us suppose that the 40-Hertz hypothesis, or any other hypothesis that posits neural correlates of conscious states, perhaps one discovered through neural network simulations, is indeed correct. It is questionable whether this would aid us in explaining consciousness. Following the confirmation of such a hypothesis, we could say that neural state N1 is conscious state C1, or some combination of the two, state NC1. We can certainly study the physical aspects of state NC1 down to every single axon, dendrite, and synapse, but when we attempt to explain or describe the phenomenological aspects of state NC1, the 40-Hertz hypothesis cannot help us. We are confronted with a genuine philosophical problem. This is the hard problem that Chalmers describes: how can we reconcile the objective, physical aspects of consciousness with the subjective, phenomenological aspects of consciousness?

Consider this thought experiment: suppose we are able to construct silicon neurons that are functionally identical to biological neurons, and suppose further that we are able to determine the exact structure of an individual's brain. It is conceivable that we could then construct a silicon brain functionally identical to the original. After all, the brain is composed of molecules, just like all matter, so there is no reason in principle why an artificial brain could not be constructed. Nothing in the physical structure of the brain, aside from its complexity, distinguishes it from a conventional computer. Therefore, interfacing computers with minds does not require a solution to the mind-body problem.
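
As a rough illustration of what "functionally identical" could mean here, the sketch below models a neuron as a leaky integrate-and-fire unit: its behavior is fully specified by the mapping from input current to spike times, so any substrate, biological or silicon, that reproduces that mapping would count as functionally equivalent in the sense the thought experiment requires. The model and its parameters are simplifying assumptions, not a description of real neurons.

```python
import numpy as np

# A minimal, hypothetical illustration of "functional identity": a leaky
# integrate-and-fire unit described entirely by its input-output behavior.
# Any implementation that reproduces this mapping from input current to spike
# times would be functionally identical by the thought experiment's criterion.
# All parameters are illustrative.

def lif_spike_times(current, dt=1e-4, tau=0.02, threshold=1.0, reset=0.0):
    """Return spike times (s) of a leaky integrate-and-fire unit driven by `current`."""
    v = 0.0
    spikes = []
    for step, i_in in enumerate(current):
        v += dt * (-v / tau + i_in)        # leaky integration of the input
        if v >= threshold:                 # fire and reset when threshold is crossed
            spikes.append(step * dt)
            v = reset
    return spikes

dt = 1e-4
t = np.arange(0, 0.5, dt)                  # half a second of input
current = 60.0 * (t > 0.1)                 # a step of input current after 100 ms

print(lif_spike_times(current, dt=dt))     # the unit's observable behavior: its spike times
```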

If we are to interface minds with other minds, we must first objectify the subjective. This is necessary if we are to explain the subjective aspects of consciousness: we can only reconcile the subjective, phenomenal aspect of consciousness with the objective, physical aspect by first objectifying the phenomenal. We must describe, at least in part, the subjective character of experiences in a form comprehensible to beings incapable of having those experiences. A revolution in philosophy, one that ignores the brain, is necessary to properly reformulate materialism. If this is indeed possible, then we will be able to account for the subjective character of experience with a physical theory of mind. It seems unlikely that any physical theory of mind can be contemplated until more thought has been given to the general problem of subjective and objective; otherwise we cannot even pose the mind-body problem without sidestepping it.

Where does this leave Neuromancer?

Case's adventures in the matrix seem feasible. Interfacing computers with the mind does not require knowledge of subjective experience; all that is required are the scientific advances described in this hyperessay. Assuming neuroscience advances to the point where all the functional states of the brain can be modeled and understood, computers and minds will be able to communicate.

However, the simstim experiences, like the one depicted in the scene in the introduction of this hyperessay, are unthinkable unless Molly's and Case's subjective experiences can somehow be encoded in an objective language, which would invariably require resolving the hard problem of consciousness.