Ask A Genius 61 — Consciousness 2
Scott Douglas Jacobsen and Rick Rosner
January 17, 2017
Scott: Let’s continue on consciousness.
Rick: I think a nice half-definition of consciousness is the feeling of shared information. As conscious beings, we know what it is like to experience consciousness, but it is hard to characterize. You can, however, compactly label it: the feeling of shared information.
Every part of your brain shares information with every other part of your brain. There is, experientially, a certain informational flavor. It feels like something; being conscious feels like being conscious. That feeling is based on massive information-sharing within your awareness.
Every part of your brain is gossiping with every other part about everything going on in the reality you're thinking about from moment to moment. I was thinking about other examples that further characterize different aspects of consciousness.
People like to argue, strongly, that it is not consciousness unless there is self-awareness: that unless you're aware of yourself as an entity, you're not conscious. Does a dog know it's a dog? Does a lizard know it is a lizard?
But that whole thing is a little off base. To be conscious, there has to be a mass of information, a stream of information being shared among specialist sub-systems. In living creatures, a lot of that information pertains to the status of the creature itself.
In living creatures, self-consciousness is itself a big part of consciousness. You can argue it doesn’t have to be. I argue that. With the Go machine, it can be a Go machine without experiencing itself as a Go machine.
Another example: a conscious, sophisticated security system that uses a number of different sensors and heuristics to evaluate the security situation, moment to moment, in a set of warehouses. The system consists of temperature, pressure, and visual sensors, along with analytic programs that examine and evaluate the people in the warehouses.
It has a bunch of sensors and tools to examine the situations in the warehouses. It doesn't have to experience itself as a security system. It could simply experience the security situation in the warehouses. It might have some self-evaluative machinery, such as checking whether it is having power problems or various malfunctions like the loss of a camera.
Even in an engineered system like that, you would expect a degree of self-consciousness, because self-monitoring helps a machine do its job well. But you could design a machine without any of that and have it be conscious only of the doings in the warehouse.
Similarly, you could have some kind of Peeping Tom or security setup that watches a bunch of people in an apartment house. Say the apartment house consists of 24 units and 40 people, and somebody has wired all of the units to a system that observes everybody as they go about their lives. The system may take in visual information and auditory information. It could take in smells, and even touch, via pressure sensors triggered as people walk around.
It could have analytic tools to understand what is happening in the lives of the people in the apartment house. It could even have sentiments about what is going on in their lives, if it were programmed with humanistic sentiments, to be happy when things are going well for people and unhappy when things aren't.
This thing is watching people and is conscious of the people in the apartment house, but doesn’t have to be conscious of itself as an observing system. Eventually, you would think it would discover itself and its limitations as a monitor, but it doesn’t have to have that.
In fact, you could design something specifically without self-awareness: conscious of the people in the apartment house, and not conscious of itself. Is it not conscious just because it lacks self-consciousness?
It is highly aware, gossiping with itself about the goings-on in the apartment house, with what we could consider a weird lack of self-consciousness, similar to somebody who has had a stroke and lost an aspect of awareness that we consider pretty essential to being a conscious being.
There are plenty of examples of people who have had a stroke and lost the idea of left. Every idea about left is gone for them. Ask somebody who is missing left to draw a clock, and they draw the right half of the clock, or they cram all twelve numbers onto the right side of the dial. It is a half-clock with all of the numbers.
You ask them if there is anything weird about this. They say, “No.” They are not conscious of the lack of left because that went with left in general. If you read Oliver Sacks, there are numerous cases of people who have lost large segments of what we would consider a normal identity and who can still function in many other ways, and are still conscious.
Even though a huge portion of their consciousness has been cut out of them by the stroke. As long as you have shared information, you have consciousness. If you take somebody and cut away their auditory, taste, smell, and touch awareness, and leave them with visual awareness, you could still work with them, present them information visually, and see that they are obviously conscious beings, even though four out of their five senses, and the awareness of that sensory information, have been cut away from them.
But any time you have massively shared information among specialist sub-systems, you still have the flavor of consciousness, which equals consciousness.
Scott: Take the 300 sub-systems, the number you threw out earlier. There has to be a sussing out of contradictions among the mutually shared information. You have mutually shared sets of information taking different angles on a particular gestalt. There are going to be contradictions in perspectives.
If you get 300 people in a room and ask them to debate, they are going to have different perspectives. Some are going to be contradictory. So the question is: how does that get sussed out?
Rick: There’s an F. Scott Fitzgerald quote: “The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function.” It is being able to handle contradictions. You can have differing opinions among the specialist sub-systems in consciousness, in your head.
They can continue to disagree, and sometimes you get interesting results behaviorally. I went to Marshalls to get new curtains for our bedroom window, because we live in LA and the old ones turn brown from pollution, probably like our lungs from living here.
I was in Marshalls standing in front of a bunch of curtains. There were probably 3 or 4 good choices, or contenders for the curtains that I would buy. So, I am standing in front of them. I know I can't just pick the one I like at any given moment; I have to spend a full 5 minutes developing the full implications of these stupid curtains before I make my choice.
I stand there. I process, "I don't like the pattern. The horizontal might make the other pattern bad." It was an old-school, 60s oval pattern overlaid on horizontal striping, which annoyed me. It was like, "Just lose the 60s pattern and give me the modern one."
But they couldn’t. So, I had to look at color and analyze things. I have various sub-systems — I would think — that are aesthetic, evaluative systems running. I was aware of the various considerations.
I had to let all of the arguments build. I had to build an imaginary picture of the curtains hanging in the room, even though I got it wrong. The curtains are not blackout; they are somewhat see-through. They are teal. They give the room a new color, with the light coming through blue-ishly.
The color of things is different than with the older beige ones when the curtains are closed. It's fine, but it didn't enter my imaginary picture of what the curtains would look like in the room.
I had to let that internal part of my mind yammer. At some point, I had to come to the point of thinking, "These are the curtains I am choosing," and then buy them. Even though I am saying, "I am choosing," it doesn't mean I am in agreement with myself.
It means that, after all of the arguments play out, I, the construct that functions as myself, decide it is time to go with what I want to go with right now, with the strongest candidate right now.
The I construct that decides on the curtains takes all of the other yammering sub-system Is and makes a choice that is not 100% ideal, but is the best I could do at the time. The curtains are fine. The curtains don't exactly represent a unitary choice or a consensus.
Because the curtains aren't ideal, I have an awareness of the other possible curtains, of each curtain's pluses and minuses. My different specialist sub-systems are giving different scores to different aspects.
Although I'm sure they've calmed down about it, because they've been informed of my choice, see the curtains in the room, are aware of how they work and look, and the curtains are less of an issue than when I was actively considering which curtains to buy.
But there are all sorts of contradictions and disagreements going on over only 3 or 4 different patterns of curtains under consideration. So, that stuff goes on all of the time. That is what consciousness is for.
If a decision were easy, we wouldn't need to throw it into consciousness. I put my left foot down. What foot do I put down next? It's a simple choice. Unless you've got a weird thing, like OCD. My OCD will kick in at the border between concrete and grass.
It will be like, "Which foot has to cross this border?!" Then that Fs up my walking. "Okay, the right one! But you just put your right one down. Now, you're going to have to hop!" OCD makes an unconscious choice conscious.
We were talking about consciousness being an epiphenomenon. I don't think it is. It does something, in that it shares information, but you have these consciousness-gone-wild aspects, where consciousness begins doing its jobs too well.
You have OCD as too much vigilance. It messes with things that should be unconscious. Similarly, Tourette's might be a vigilance thing, where your brain forces a tic to make you imagine or say the worst possible thing.
There are various disorders of consciousness, large and small, where the division of labor between conscious and unconscious tasks becomes a little messed up.
Consciousness serves as the central arena for hashing out complicated, ambiguous, contradictory information and situations, because if something were simple, it would've been taken care of by processes you're not entirely consciously aware of.
Scott Douglas Jacobsen
Editor-in-Chief, In-Sight Publishing
American Television Writer
License and Copyright
In-Sight Publishing and In-Sight: Independent Interview-Based Journal by Scott Douglas Jacobsen is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Based on a work at www.in-sightjournal.com and www.rickrosner.org.
© Scott Douglas Jacobsen, Rick Rosner, and In-Sight Publishing and In-Sight: Independent Interview-Based Journal 2012–2017. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Scott Douglas Jacobsen, Rick Rosner, and In-Sight Publishing and In-Sight: Independent Interview-Based Journal with appropriate and specific direction to the original content.