The Human Brain Is Incapable Of Volition Or Free Will

You are right. All conscious experiences have similarities, but they are not necessarily identical. An eagle's eyesight is much sharper than a human's, but that difference is an evolved genetic advantage for the eagle. I think you are wrong there, though. Almost all people have subjective conscious experiences.
You seem to accept the notion that subjective experiences must be exactly alike or else one party is an emotional zombie. IMO, that is just a limited view of what goes on when we observe and experience cognition, and of how certain sensory experiences are balanced against prior memories.

The fact is that the experience of qualia is generated by the physical senses. When the sensory receptors for color are impaired by a condition such as deuteranopia, that does not make a person a zombie; it makes that person partially disabled.

Let's reverse the question. If I have arthritis and moving my hand causes pain, do I have richer qualia than a person without arthritis who moves his hand without experiencing pain? In that case the increased qualia are an indication of inflammation, a homeostatic warning that something is wrong with my hand.

However, I have demonstrated before that deuteranopia does impair a person's perception (best guess) of reality. When that impairment is filtered for conflicting wavelengths and the brilliant distinction between red and green is accentuated, the emotional response is usually quite obvious: people may cry from the overwhelming experience of color differentiation, which deepens the emotional experience in the observer. But "zombie" is such a useless term.
I was specifically referring to the Color Experience. It would seem that there might be people who do not Experience the Color Qualia and yet have full Color differentiation. They are not Color Blind, even though they do not Experience Color (the Qualia). So I should have said they are Color Zombies, not total Zombies. Anyway, I am still exploring this unbelievable possibility and am not completely sure it could be true. We have no way of Experiencing what another Mind is Experiencing. It is the inability or unwillingness of people to admit to having Color Qualia that raises this as an issue for me.
 
That is a totally incomprehensible observation. It seems to me you are getting things mixed up.
A person either can or cannot experience color qualia. If they can't, it is a physical sensory impairment, not an inability to form experiential emotions, as in autism.

Consider an optical illusion. Is that a lack of qualia? A false experiential (emotional) interpretation? Or an inability to see normal sensory data?

Qualia as social effects of minds

Introduction

Colors, sounds and smells do not exist in the outside world. They are the creations of our brain in response to light waves, rhythmic variations of air pressure and inhaled molecules, respectively. External stimulations are responsible for action potentials whose processing in the brain may then produce colors, sounds and smells.
These so-called qualia [1, 2] are then apparently projected outside around us and constitute our perceptual world, sometimes called the phenomenal world [3]. Although perceived internally, except in the case of induced out-of-body experiences [4], events such as feelings of meanings, as in the tip-of-the-tongue phenomenon, or such as emotions, conscious intentions to act and sensations of our body can also be seen as qualia.
In any case, if the qualia produced by our brains in response to a given stimulus were not similar across individuals, one could call the entire human race delusional since we all go through our everyday lives and interact with others as if they perceive the world in pretty much the same way as we do. As a matter of fact, if the phenomenal world of each individual were unique, the most fundamental social consensus would be lost. Sharing feelings by verbalizing emotions would be an illusion and our use of language as if each word designates the same qualia would be incorrect *. It thus appears reasonable to hypothesize that qualia are similar across individuals and that we are actually living in similar phenomenal worlds.
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5992535.1/

We experience reality by agreement. Our brains make "best guesses" of what it is we are experiencing and project our expectation of reality against the incoming sensory data.
*This is why Anil Seth uses the term "controlled hallucination".
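Seth's "controlled hallucination" idea can be sketched as a toy prediction loop. This is only an illustrative caricature, not Seth's actual model; the gain value and the constant stimulus are invented:

```python
# Toy predictive-processing loop: the "brain" keeps a running prediction
# and corrects it by a fraction of the prediction error from the senses.
def perceive(signals, gain=0.5):
    prediction = 0.0
    history = []
    for sensed in signals:
        error = sensed - prediction   # mismatch with incoming sensory data
        prediction += gain * error    # revise the internal "best guess"
        history.append(prediction)
    return history

# A steady stimulus of 10.0: the guess converges toward the input.
estimates = perceive([10.0] * 5)  # [5.0, 7.5, 8.75, 9.375, 9.6875]
```

The point of the sketch is only that perception here is the prediction, continually disciplined by the error signal, not the raw input itself.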
 
Yes, but I am saying that they don't Experience the Color Qualia and yet can distinguish Colors. Machines can distinguish Colors without using Qualia. They are more like Machines. We don't know what their Experience could be.
 
I think I understand what you are saying. It's true machines can display colors without being conscious of doing so. But when we store the data on a hard drive, we can make the machine recall the entire scene, compare it to the next incoming data, and measure any differences. Example: taking a picture of a barn at sunrise and at sunset.

IMO that's where GPT3 will shine. It functions on tokens, complete bits of memory (pixels, shapes, shadings, etc.), rather than on binary data strings. Is this why GPT3 can imitate Van Gogh and Beethoven? Perhaps some of our electronics experts can assist?
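The "store a scene and compare it to the next one" idea can be sketched in a few lines. The two "images" below are invented grayscale grids standing in for the sunrise and sunset photographs:

```python
# Compare two grayscale "snapshots" (lists of pixel rows, values 0-255)
# by summing the absolute per-pixel differences between them.
def scene_difference(image_a, image_b):
    total = 0
    for row_a, row_b in zip(image_a, image_b):
        for pa, pb in zip(row_a, row_b):
            total += abs(pa - pb)
    return total

barn_at_sunrise = [[200, 180], [90, 60]]
barn_at_sunset  = [[120, 110], [80, 40]]
diff = scene_difference(barn_at_sunrise, barn_at_sunset)  # 180
```

A difference of zero means the stored scene and the incoming one match exactly; a large value flags a change worth attending to.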
 


But it would never come up with either, because GPT3 can't think in terms of 100 million years of evolution.
 
It doesn't need to. We can direct evolution by artificial selection much faster than nature can.

That may not always be advantageous in the long run, but we can cheat nature. And if we can do it, we do it... :eek:
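A toy simulation of that speed: under a harsh, made-up selection rule (keep only the top half of the population each generation), a numeric trait shifts in a handful of generations. The trait, population size, and noise model are all invented for illustration:

```python
import random

# Artificial selection on a numeric trait: each generation, keep the
# top half of the population and refill it with noisy copies (offspring).
def select(population, generations, rng):
    for _ in range(generations):
        survivors = sorted(population, reverse=True)[:len(population) // 2]
        offspring = [p + rng.gauss(0, 1) for p in survivors]
        population = survivors + offspring
    return population

rng = random.Random(42)
start = [rng.gauss(0, 1) for _ in range(100)]
end = select(start, generations=20, rng=rng)
# The mean trait value rises steadily under this regime.
```

Natural selection applies far weaker pressure, which is the whole contrast being made here.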
 
IMO that's where the GPT3 will shine. It functions on tokens, complete bits of memory (pixels, shapes, shadings, etc), rather than binary data strings.
Don't understand the distinction you are trying to make here. Everything is always Binary Data in a Computer.
 
Is machine language always binary?
Everything in a computer (to be precise, in any typical contemporary computer) is binary, at a certain level. "1s and 0s" is an abstraction, an idea we use to represent a way of distinguishing between two values. In RAM, that means higher and lower voltage. On the hard drive, that means distinct magnetic states, and so on. Using Boolean logic and a base 2 number system, a combination of 1s and 0s can represent any number, and other things (such as letters, images, sounds, etc) can be represented as numbers.
But that's not what people mean when they say "binary code." That has a specific meaning to programmers: "binary" code is code that is not in text form. Source code exists as text; it looks like a highly formalized system of English and mathematical symbols. But the CPU doesn't understand English or mathematical notation; it understands numbers. So the compiler translates source code into a stream of numbers that represent CPU instructions that have the same underlying meaning as the source code. This is properly known as "machine code," but a lot of people call it "binary".
https://softwareengineering.stackexchange.com/questions/236415/is-machine-language-always-binary

GPT3 is language-based code (tokens). No numbers. All you do is ask it to "make a Google website" and it makes a Google website for you, and it will write the code for you to boot.
This very advancement actually imposes a restriction on the system: you cannot ask it a complicated compound question, because of the processing limitations.
Do you see the difference?

This is why the human brain alone has 250 trillion synaptic connections. It allows us to consider abstract concepts.
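The Stack Exchange answer quoted above can be demonstrated concretely: to the machine, even source text is nothing but numbers, and the "1s and 0s" are just one way of printing those numbers. The string below is an arbitrary example:

```python
# Every character of "source code" is stored as a number; the same
# values can be shown as decimal or as the "1s and 0s" abstraction.
text = "if x:"
as_numbers = [ord(ch) for ch in text]          # 'i' is stored as 105
as_binary = [format(n, '08b') for n in as_numbers]

print(as_numbers)  # [105, 102, 32, 120, 58]
print(as_binary)   # ['01101001', '01100110', '00100000', '01111000', '00111010']
```

The same trick works in reverse: hand the CPU the right stream of numbers and it executes them as instructions, which is all "machine code" is.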
 
So does GPT3 run on some new kind of Hardware? What exactly is a Token in the context of GPT3 operations?
 
AFAIK a token is an identifier. Humans also use tokens in constructing an internal picture or a sentence, an expectation which is then compared to incoming data; Anil Seth calls that a process of "controlled hallucination".

If we think of this in terms of GPT3, all we need to do is offer a verbal command: "make a chair that looks like an avocado".

GPT3 searches for the tokens necessary to fill in the necessary constituent parts, i.e. what does an avocado look like, what does a chair look like. Then it "compiles" these shapes into objects.
The result:
[Image: a chair shaped like an avocado, generated from the prompt]
Is this not what furniture designers do?

In short, this is no longer a simple mathematical binary process of 0s and 1s, but a language: an orchestrated set of representative "tokens" sufficient for the construction of an object with a function. A compilation of a chair that looks like an avocado; a visualization of the words "make a chair that looks like an avocado".

That is how I see this entirely new mode of computation, which in part mimics how humans think.
 
The brain has nothing in common with a digital computer or a Turing machine: it does not execute instructions, and it is not, it seems, algorithmic.
 
You are correct. The brain does not work in a linear fashion like an old digital computer. But neither does GPT3.

Both the Human brain and the GPT3 brain are compilation, integration, and orchestration machines.
 

I disagree.

GPT-3 is software and so runs on a digital computer; it is algorithmic.

There's no evidence that the human brain is a symbol manipulator (which is what a digital computer can be described as).
 
It is more than that. It is not linear, but parallel like a human brain.
As for there being no evidence that the brain is a symbol manipulator: watch Anil Seth. He explains how the brain manipulates symbolic data through controlled hallucinations.
 
This is just about all the Computer can do:

1) Add, Sub, Mult ...
2) AND, OR, XOR ...
3) ShiftL, ShiftR ...
4) Move Data CPU to RAM, Move Data RAM to CPU ...
5) Jump , Jump Conditionally ...

There is no inherent Token processing operation. Any kind of Token concept will be implemented as Binary Data.
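That point can be illustrated with a toy tokenizer. The vocabulary and its IDs below are invented, and real GPT-3 uses byte-pair encoding rather than whole words, but the principle is the same: tokens become integers, and integers become bits:

```python
# A toy tokenizer: words map to integer IDs, and those IDs are
# ultimately binary data like everything else in the machine.
vocab = {"make": 0, "a": 1, "chair": 2, "that": 3,
         "looks": 4, "like": 5, "an": 6, "avocado": 7}

def tokenize(sentence):
    return [vocab[word] for word in sentence.split()]

ids = tokenize("make a chair that looks like an avocado")
bits = [format(i, '03b') for i in ids]

print(ids)   # [0, 1, 2, 3, 4, 5, 6, 7]
print(bits)  # ['000', '001', '010', '011', '100', '101', '110', '111']
```

So "token processing" is a software abstraction layered on the same add/shift/move instructions listed above, not a new kind of hardware operation.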
 

No, it is not "more than that". It may be a simulation of some kind of system, but it is software and it is algorithmic, like all software.

There's no evidence that the human brain can be simulated by an algorithmic system.
 
We may have to move this conversation to the AI thread.

So, in the context of the OP: how does the human brain work, and does it have FW or Choice from among available options?
 
There is no inherent Token processing operation. Any kind of Token concept will be implemented as Binary Data.
Yes, GPT3 employs "tokens" (it gathers lexical and optical data) before it converts everything into binary data, just like humans do. That's its strength.

But we should discuss this in the AI threads.

For now, I'm interested in how the human brain works, the "easy problem".
Once this is established, we can use it as a fundamental baseline for processing compound data sets and making choices from available possibilities, and compare it to the new species of AI.
 
This may be a good starting point;

Our brains have a basic algorithm that enables our intelligence
Date: November 21, 2016
Source: Medical College of Georgia at Augusta University
Summary: A new Theory of Connectivity represents a fundamental principle for how our billions of neurons assemble and align not just to acquire knowledge, but to generalize and draw conclusions from it.
"Intelligence is really about dealing with uncertainty and infinite possibilities," Tsien said. It appears to be enabled when a group of similar neurons forms a variety of cliques to handle each basic, like recognizing food, shelter, friends and foes.
Groups of cliques then cluster into functional connectivity motifs, or FCMs, to handle every possibility in each of these basics like extrapolating that rice is part of an important food group that might be a good side dish at your meaningful Thanksgiving gathering. The more complex the thought, the more cliques join in. That means, for example, we can not only recognize an office chair, but an office when we see one, and know that the chair is where we sit in that office.
I believe Tononi calls this "integration" and Hameroff calls it "orchestration".
"You know an office is an office whether it's at your house or the White House," Tsien said of the ability to conceptualize knowledge, one of many things that distinguishes us from computers.
Tsien first published his theory in a 1,000-word essay in October 2015 in the journal Trends in Neuroscience. Now he and his colleagues have documented the algorithm at work in seven different brain regions involved with those basics like food and fear in mice and hamsters. Their documentation is published in the journal Frontiers in Systems Neuroscience.
"For it to be a universal principle, it needs to be operating in many neural circuits, so we selected seven different brain regions and, surprisingly, we indeed saw this principle operating in all these regions," he said.
Intricate organization seems plausible, even essential, in a human brain, which has about 86 billion neurons and where each neuron can have tens of thousands of synapses, putting potential connections and communications between neurons into the trillions. On top of the seemingly endless connections is the reality of the infinite things each of us can presumably experience and learn.
Neuroscientists as well as computer experts have long been curious about how the brain is able to not only hold specific information, like a computer, but -- unlike even the most sophisticated technology -- to also categorize and generalize the information into abstract knowledge and concepts.
"Many people have long speculated that there has to be a basic design principle from which intelligence originates and the brain evolves, like how the double helix of DNA and genetic codes are universal for every organism," Tsien said. "We present evidence that the brain may operate on an amazingly simple mathematical logic."
"In my view, Joe Tsien proposes an interesting idea that proposes a simple organizational principle of the brain, and that is supported by intriguing and suggestive evidence," said Dr. Thomas C. Südhof, Avram Goldstein Professor in the Stanford University School of Medicine, neuroscientist studying synapse formation and function and a winner of the 2013 Nobel Prize in Physiology or Medicine.
https://www.sciencedaily.com/releases/2016/11/161121165921.htm
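The "amazingly simple mathematical logic" referred to here is described in Tsien's papers as the power-of-two rule N = 2^i − 1: for i distinct kinds of input, the neuronal cliques cover every nonempty combination of them. That formula is not in the quoted summary itself, and the input labels below are invented, but the counting is easy to sketch:

```python
from itertools import combinations

# Tsien's power-of-two rule: i kinds of input call for N = 2**i - 1
# neuronal cliques, one per nonempty combination of inputs.
def clique_combinations(inputs):
    cliques = []
    for size in range(1, len(inputs) + 1):
        cliques.extend(combinations(inputs, size))
    return cliques

inputs = ["food", "shelter", "foe"]
cliques = clique_combinations(inputs)
assert len(cliques) == 2 ** len(inputs) - 1   # 7 cliques for 3 inputs
```

Each single-input clique handles one basic, and the mixed cliques are what let the circuit respond to combinations, which is the generalization the article describes.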
 

Except our brains do not execute algorithms; they are not symbol manipulators, nor is there any proof that a symbol manipulator can replicate brain functions.

This is covered in books such as What Computers Still Can't Do, among others.
 