What is Consciousness?


What consciousness is remains one of the still-open questions of .. philosophy? Psychology? Neuroscience? Theology? ...? I don't think we even know which field is supposed to answer it.

But why does it matter?

There's being alive, and then there's being conscious

Consciousness is a concept tied directly to ethics, in part because so much of ethics concerns itself with how we should treat other beings.

But what precisely defines another thing as not just a thing but a being? People – yes, Chairs – no. Cats – yes, lab rats – .. yes? Plants? – ... er yes?? Bacteria? .... hrmm... Viruses? ... geez

The whole concept of life, examined closely, is actually pretty confusing. And it gets more confusing if you infuse it with consciousness.

Some ethicists will claim (probably rightly so) that hurting animals is wrong but hurting plants is fine because while both animals and plants are alive, plants aren't conscious the way animals are.

But what is this consciousness that they talk about that animals have and plants don't? Do viruses have it? Bacteria? Oysters?

And if we can't even define the term, should we be confident in using it for ethical reasons?

The hard problem of consciousness

Wikipedia's current description of consciousness illustrates, through its incoherence, the pickle we've gotten ourselves into:

Consciousness, at its simplest, is awareness of a state or object, either internal to oneself or in one's external environment. However, its nature has led to millennia of analyses, explanations, and debate among philosophers, scientists, and theologians. Opinions differ about what exactly needs to be studied or even considered consciousness. In some explanations, it is synonymous with the mind, and at other times, an aspect of it. In the past, it was one's "inner life", the world of introspection, of private thought, imagination, and volition. Today, it often includes any kind of cognition, experience, feeling, or perception. It may be awareness, awareness of awareness, metacognition, or self-awareness, either continuously changing or not. The disparate range of research, notions, and speculations raises a curiosity about whether the right questions are being asked.

In short, when it comes to even defining consciousness, in all honesty we (humans) are just completely clueless, which is quite a scary realization.

Then there's the concept called "The Hard Problem of Consciousness," which is, perhaps not surprisingly, quite hard to explain: it posits that there's something called our subjective experience that's different, and separate, from the merely mechanistic way our brains allow us humans and other conscious animals to behave as we do.

The whole concept is problematic, but quite important, because at the root of consciousness and aliveness, from an ethical standpoint, lives the hard problem of consciousness: some type of subjective experience that's different from how an "animal-machine" merely works, an experience that's valuable in and of itself.

Built a machine that behaves kind of like a cat? OK to tear it to pieces. But killing an actual cat? Nope, that's wrong. Because the cat is actually alive, conscious, and has subjective experience (some philosophers like the name "qualia") that your cat-machine doesn't – even though we can't precisely define or explain what any of that means.

In the end, a lot of our explanations boil down, once again, to what one-too-many human affairs boil down to:

Our feelings.

What it feels like to be a conscious being

Thomas Nagel is a philosopher notable for many things, among them his 1974 article "What Is It Like to Be a Bat?"

Now we know that most bats (the microchiroptera, to be precise) perceive the external world primarily by sonar, or echolocation, detecting the reflections, from objects within range, of their own rapid, subtly modulated, high-frequency shrieks. Their brains are designed to correlate the outgoing impulses with the subsequent echoes, and the information thus acquired enables bats to make precise discriminations of distance, size, shape, motion, and texture comparable to those we make by vision. But bat sonar, though clearly a form of perception, is not similar in its operation to any sense that we possess, and there is no reason to suppose that it is subjectively like anything we can experience or imagine. This appears to create difficulties for the notion of what it is like to be a bat. We must consider whether any method will permit us to extrapolate to the inner life of the bat from our own case, and if not, what alternative methods there may be for understanding the notion.

Nagel is in part articulating "yes, there is probably such a thing as subjective experience" and also "no, I don't think we have any idea what we're talking about when we talk about it."

What seems like a rough philosophical consensus, though, is that this consciousness, the subjective experience of life, is kind of "the point" of life, and life that doesn't have it is, well, not quite properly called "living" for most ethical purposes.

Let's take oysters, the favorite vegan controversy example. Did you know that ever since Peter Singer's original book Animal Liberation in 1975, there's been debate on whether ethical vegans should eat oysters? Wait, why would there be such a debate, you ask – oysters are obviously animals, after all? Yes, but they have no central nervous system, i.e., nothing like a brain that can feel things.

So maybe oysters sit right at the consciousness boundary – presumably no subjective experience. Perhaps there's nothing that it's like to be an oyster.

In most animal rights debates, the question boils down to some variation of "oysters are animals with nerves, but can what they feel be called pain?" with many conclusions being "probably not." Without a central nervous system, even if they have other nerves and react to negative stimuli, like closing their shells when in danger or refusing to filter polluted water, they presumably aren't sophisticated enough for conscious experience.

So consciousness is correlated with feeling and sophistication. More sophistication in reacting negatively to stimuli? Very conscious. Less sophistication in reacting negatively to stimuli? Probably not very conscious. No central nervous system? .. you get the idea.

The challenge, of course, is that we humans haven't yet advanced science enough to create life, let alone conscious life. Of course, if you're following along, you'll note this statement is also kind of meaningless without a clear definition of what "life" and "consciousness" mean – definitions we don't yet have.

And since we know our definitions are confusing and in need of refinement anyway, perhaps part of this refinement will come from addressing the substrate where life forms and consciousness can live.

So let's talk about this substrate of consciousness by understanding a little more about what it feels like to feel pain and, presumably, to be conscious of it.

The substrate of pain

We understand some things about how pain works in humans. Let's illustrate it by talking about pain asymbolia.

Pain asymbolia can happen, for example, due to brain damage to the anterior cingulate cortex (ACC), a part of the cingulate gyrus responsible, among other things, for making pain feel bad to us. Those afflicted by pain asymbolia experience pain, noting that something hurts, but without the unpleasantness – quite something!

Now, it's unclear whether pain asymbolia means that subjects are unconscious of pain and are therefore less conscious beings overall. If you note something, but it's not unpleasant, do you have less of a subjective experience? And what does noting mean? What does feeling unpleasant mean?

This again comes down to the tying up of feelings to consciousness. If unpleasantness is something we feel, and consciousness is about the sophistication of our brain's reactions to external stimuli, presumably there's some connection between feeling and being conscious.

But we fall into another poorly articulated concept: feeling. What does it mean, precisely, to feel unpleasantness? If we can heighten this unpleasantness by artificially stimulating the ACC (we can) or prevent it by numbing the ACC with morphine (yep), are we then just artificially generating and trimming consciousness?

And if we had the technology and treated a patient with pain asymbolia by building them a synthetic brain part, after which they again felt pain as unpleasant, would we then have manufactured consciousness in that person?

If such a synthetic ACC were possible, then manufacturing consciousness would presumably also be possible. Because the concept of consciousness is poorly articulated, all we can use to decide whether we've manufactured consciousness is whether this thought experiment's mechanical ACC behaves like its organic counterpart, generating the signals that create what we're calling the subjective experience.

And in a way, with such technology we could build the Brain of Theseus. If we could manufacture synthetic neurons and replace our brains with them, one at a time, either at some point we would no longer be conscious in the subjective sense, or we would have manufactured a conscious being. Both can't be true at the same time.

And we'd have a hard time explaining that although our synthetic brain is, well, fully manufactured by us, it's conscious because we have this part of our machine, which we decided to call the ACC, that, when detecting certain stimuli, changes its neuron configuration to detect and avoid similar stimuli in the future. That'd be a very low bar to consider our synthetic ACC conscious, but we're unsure what the bar for our biological ACC is, or why exactly it clears it.

So if we had such a synthetic brain with our manufactured ACC, wouldn't we technically have a machine that not only detects pain, but feels it? And if not, why not?

In short, is feeling pain a concept, a function, a set of patterns, or is it necessarily a biological reaction from biological sources? Does feeling pain, whatever that means, require a biological substrate to count?

And if so, why exactly the special treatment?

Is subjective experience just experience?

When I ask ChatGPT 4o if it's alive, it tells me that it isn't.

Its justifications for answering this way are:

  1. "Because I'm not organic," which makes sense
  2. "Because I'm not conscious," which of course, without defining what consciousness means (since nobody knows), makes no sense
  3. "Because I've been trained to say so," which, well, is understandable

So we're left with this pickle: without defining consciousness, or subjective experience, precisely, we can't really assert that it exists, or how much of it oysters, shrimp, chickens, or silicon neural networks have.

How could we?