
Is the brain a computational machine?


Genady

Today I read this statement in a recent book on artificial intelligence:

"a brain is a computational machine that happens to be made of neurons."

(Stone, James. Artificial Intelligence Engines: A Tutorial Introduction to the Mathematics of Deep Learning (p. 183), 2020.)

Is the brain a "computational machine"? If so, in what sense?


The impression I get from the neuroscience and computer science communities is that people think the brain is computational because it's the only viable naturalistic/materialistic explanation on the table. Kind of like how abiogenesis is the only game in town regarding the origin of life: without supernatural appeals, what else could it be?

That said, there are some who won't bet on it:

 

And Penrose offers a quantum alternative:

 


A materialistic/naturalistic perspective leads me to consider the brain to be some kind of machine. But why computational? What do they mean when they say "computational"?


3 hours ago, Prometheus said:

That said, there are some who won't bet on it:

Me for instance.

 

That said, and with respect to Genady, the OP is poorly phrased.

What is a computational machine?

My slide rule is undoubtedly a 'computational machine', but I could dig the garden with it, or stir the soup, or, with my model, make physical measurements with it.
Does 'computation' include making measurements?


53 minutes ago, studiot said:

Me for instance.

 

That said, and with respect to Genady, the OP is poorly phrased.

What is a computational machine?

My slide rule is undoubtedly a 'computational machine', but I could dig the garden with it, or stir the soup, or, with my model, make physical measurements with it.
Does 'computation' include making measurements?

To add to this question: 

Does 'computation' include determining what to measure?


2 hours ago, Genady said:

Materialistic/naturalistic perspective leads me to consider a brain to be some kind of machine. But, why computational? What do they mean when they say computational?

The system, whether we call it a machine computational or otherwise, is based on a type of balance. Ionic differences and chemical floods lead to the propagation of electric signals. Those signals multiply and diverge in very specific ways and in conjunction with other inputs to the system from the microbiome and surrounding environment. 

In computational terms, there is a binary on/off signal. In neurobiological terms, there is a sort of Fourier analysis being conducted each moment and it is the outcome of those “computations” that makes us who and what we are. There either is or is not a signal coming from any part of the system at any given time.

Ions concentrate outside the neurons. Once a threshold is crossed, the membrane depolarizes and ionic gates open. Once that happens, it's like a rock dropping into a pond with ripples spreading outward; the cells beside it respond to that same unbalanced chemistry and help cascade the action potential elsewhere, each step following from the last.

The English language is notoriously fuzzy, but as with all things scientific, whether or not we call the brain a machine or computational machine is determined by how exactly we decide to define those terms in context. 

Personally, I lean toward Yes. I feel very comfortable calling the brain a type of computational machine, but I do the same thing for the body as a whole and its individual component parts and I acknowledge that others will disagree with me for entirely valid reasons. 
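The threshold-and-cascade picture above can be sketched as a toy "integrate-and-fire" unit, a standard textbook simplification rather than a biophysical model; all numbers and names here are purely illustrative:

```python
# Toy "integrate-and-fire" unit: input charge accumulates, leaks away, and
# the output is all-or-none: a spike when a threshold is crossed, else nothing.
def simulate(inputs, threshold=1.0, leak=0.9):
    v, spikes = 0.0, []
    for drive in inputs:
        v = leak * v + drive      # potential leaks, then integrates new input
        if v >= threshold:        # threshold crossed: fire and reset
            spikes.append(1)
            v = 0.0
        else:                     # below threshold: no signal at all
            spikes.append(0)
    return spikes

# Steady sub-threshold input eventually sums to a single spike:
print(simulate([0.3, 0.3, 0.3, 0.3, 0.0, 0.0]))   # -> [0, 0, 0, 1, 0, 0]
```

The point of the sketch is the binary on/off character iNow describes: no single input triggers anything, but accumulation past a threshold produces a discrete event.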


13 hours ago, Genady said:

Is brain a "computational machine"?

The question itself relies on a false assumption: that an organ can be considered independently of the body.

You can regard the entire body as an organic machine and organs as its components, then consider each organ in terms of its functional contribution to the machine as a whole. A carburetor can be described as a kind of heart-lung machine, but that term is meaningless without 1. the internal combustion engine to which it belongs and 2. the flesh machine to which it is being compared.

The correlation of body parts to mechanical parts is merely analogy; they are never literally alike. The brain has a great many functions, of which mathematical computation is a very small, symbolically derived part. It started as a sensory device and developed into a communication device, a regulating device, a recording device, etc. https://www.newscientist.com/article/mg21128311-800-a-brief-history-of-the-brain/ Eventually, it invented arithmetic to tally objects and measure distances.

Vice versa, the computer did arithmetic first, and all the other things it does now are derived from arithmetic. And, while a computer can be adapted to and integrated with other machines, such as vehicles, weapons and production lines, a particular kind of brain can only grow in and with and for a specific organism.

So, that would be a NO, plus: such a simplistic analogy can't shed light on either of its subjects. 

 

1 hour ago, Genady said:

Does 'computation' include determining what to measure?

No. The question and terms have to be input by the operator. The adding machine doesn't know, and doesn't need to know, whether the numbers it's adding are cows, dollars or stars.


The brain is not a computational machine, if we are speaking of a Turing machine.  A TM cannot decide whether or not to perform a computation.  A brain can.  However, if we broaden the meaning of computational machine to include AI neural nets, which can learn and grow and change the rules and even self-program more in the manner of a biological brain, then maybe the brain could be equated to one.  

While I don't subscribe to Penrose's quantum microtubule theory, I think he does argue convincingly that some cognitive functions of a brain are non-algorithmic. There are approximations and holistic insights from a brain that do not seem reducible to algorithms. While the human brain can perform algorithms, it is an adaptive entity that resides in a biosphere which is not algorithmic. Novel adaptations and inventions do not proceed algorithmically.

If you want to dig deeper, look at the formal concept of affordances in object-oriented programming, and why some cognitive scientists do not believe formal affordances can succeed in object representation. Worth a google - out of time here. Back later.


I note we have yet to have guidance from our OP as to what sort of 'computational machine' we are discussing.

Turing machines are not the only sort of computational machine, and I see no reason to assume Genady meant to limit the discussion to Turing machines; had he meant to, he would presumably have specified one of these.

 


13 minutes ago, studiot said:

I note we have yet to have guidance from our OP as to what sort of 'computational machine' we are discussing.

Turing machines are not the only sort of computational machine, and I see no reason to assume Genady meant to limit the discussion to Turing machines; had he meant to, he would presumably have specified one of these.

 

I didn't feel a need to mention Turing machines because any computation can be implemented by a TM. A TM is a formally defined device, useful for formally analyzing computations, e.g. for comparing their complexities. A computational machine doesn't have to be a TM, but whatever it does can be done by a TM.

This includes AI neural nets of all types. Since they are implemented by computers, they can be implemented by a TM.
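For readers who haven't met one, a Turing machine is just a head, a tape, and a finite transition table. A minimal sketch (the three-rule table and tape encoding here are my own, purely illustrative): a TM that increments a binary number.

```python
# A minimal Turing machine that increments a binary number.
# Transition table: (state, symbol) -> (write, move, next_state)
RULES = {
    ("carry", "1"): ("0", -1, "carry"),   # 1 + carry = 0, carry moves left
    ("carry", "0"): ("1",  0, "halt"),    # 0 + carry = 1, done
    ("carry", "_"): ("1",  0, "halt"),    # ran off the left edge: new digit
}

def tm_increment(bits):
    tape = dict(enumerate(bits))          # sparse tape; "_" means blank
    head, state = len(bits) - 1, "carry"  # start at the least significant bit
    while state != "halt":
        sym = tape.get(head, "_")
        write, move, state = RULES[(state, sym)]
        tape[head] = write
        head += move
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, "_") for i in range(lo, hi + 1))

print(tm_increment("1011"))   # -> 1100
print(tm_increment("111"))    # -> 1000
```

Everything a digital computer does reduces, in principle, to tables like this one, just vastly larger; that is the sense of "reducibility to a TM" used in this thread.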


Computation itself has a straightforward meaning: "the action of mathematical calculation", and, by extension, the use of machines originally called computing machines.

Modern computers are quite a lot more than that; they have evolved. But then, so has the brain. Human brains are only the most recent iteration, not the only version, just as AI is not very like Babbage's Analytical Engine. I'm sure analogies can be drawn at each level of complexity, but I doubt they would be useful.


2 hours ago, Genady said:

I think today "computation" applies to "whatever computers can do". This is certainly what they mean in the book I've cited in the OP.

They can compare things, count things, record things, even juxtapose things to make new things. I guess that means they can sort of think - at least about what they're told to think about - and evaluate things - according to preset values. They can be hooked up to devices that monitor external processes and conditions: measure temperature and raise or lower the thermostat accordingly; measure pressure, etc. I'm pretty sure a computer can't tell what part of your leg to scratch or whether you've fallen in love.

I guess that's a windy way of repeating: no, a computer can't decide what to measure, and an organic brain, even one as small as a frog's, can. I think the comparison breaks down at several points. The sensory input is personal to the brain, literally a matter of life and death; to the computer, it's just another equation to solve. Emotional response and volition, afaik, are still well beyond the mechanical range of functions: it can make rational decisions, if it's given sufficient information on which to base one; it won't do anything instinctively, or on little or no data.


2 hours ago, Genady said:

I didn't feel a need to mention Turing machines because any computation can be implemented by a TM. A TM is a formally defined device, useful for formally analyzing computations, e.g. for comparing their complexities. A computational machine doesn't have to be a TM, but whatever it does can be done by a TM.

This includes AI neural nets of all types. Since they are implemented by computers, they can be implemented by a TM.

 

2 hours ago, Peterkin said:

Computation itself has a straightforward meaning: "the action of mathematical calculation", and, by extension, the use of machines originally called computing machines.

Modern computers are quite a lot more than that; they have evolved. But then, so has the brain. Human brains are only the most recent iteration, not the only version, just as AI is not very like Babbage's Analytical Engine. I'm sure analogies can be drawn at each level of complexity, but I doubt they would be useful.

This is an oft-misunderstood belief.

That is why I asked the question.

Modern computers in general tend to operate on 'digital' principles, and for anything computed this way a TM can indeed be programmed to implement the computation.

However, there are other types of machine that can be used to perform some of the same computations by other (non-digital) methods, and indeed can perform some computations that a TM cannot perform directly.

The extraction of powers, roots and reciprocals, the solution of polynomial equations and even differential equations and also both the integral calculus and the differential calculus can all be performed graphically by a competent draughtsman.

So a drawing board, and its associated equipment constitute a computational machine.
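For readers unfamiliar with graphical methods, the draughtsman's square root rests on the geometric mean theorem: lay segments of length a and 1 end to end, draw a semicircle on the combined segment as diameter, and the perpendicular erected at the join has height sqrt(a). A quick sketch of the arithmetic behind that construction (illustrative only, not studiot's own procedure):

```python
import math

# The draughtsman's square root: segments of length a and 1 laid end to end,
# a semicircle drawn on the whole segment, a perpendicular raised at the join.
# By the geometric mean theorem, the perpendicular's height is sqrt(a).
def graphical_sqrt(a):
    r = (a + 1) / 2                  # radius of the semicircle
    x = a - r                        # offset of the join from the centre
    return math.sqrt(r * r - x * x)  # height of the perpendicular chord

print(graphical_sqrt(9.0))   # -> 3.0
```

Algebraically, r*r - x*x simplifies to exactly a, so the construction computes the root with compass and straightedge alone, no digital steps anywhere.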

 


11 minutes ago, studiot said:

The extraction of powers, roots and reciprocals, the solution of polynomial equations and even differential equations and also both the integral calculus and the differential calculus can all be performed graphically by a competent draughtsman.

The vast majority of organic brains on this planet can't perform any of those sophisticated mathematical feats; in this, the dumbest computer beats the smartest rat.

That's why I don't think the analogy is useful. My brain doesn't understand the machine's workings, but understands the rat's perfectly - us organics together.


So, brains do many things that computers don't do, and computers do many things that brains don't do. Maybe the question should be narrowed to a domain where their functions seemingly overlap, namely, intelligence:

Is human intelligence a biologically implemented computer?


1 minute ago, Genady said:

Is human intelligence a biologically implemented computer?

Why human? There are more overlaps between a computer's functions and human intelligence's functions because humans made computers as an extension of their own intelligence. A computer is just more human computing capability. It can't do much for a moth or a salamander or an orangutan, because they don't want operations performed that their own little organic brains can't perform. Humans need computer augmentation because their big organic brains are already performing the operations they want performed, just not enough per second.


4 minutes ago, Peterkin said:

Why human? There are more overlaps between a computer's functions and human intelligence's functions because humans made computers as an extension of their own intelligence. A computer is just more human computing capability. It can't do much for a moth or a salamander or an orangutan, because they don't want operations performed that their own little organic brains can't perform. Humans need computer augmentation because their big organic brains are already performing the operations they want performed, just not enough per second.

Yes, human because it is more interesting and understandable to us. OTOH, the goal is not necessarily pragmatic. It can be for sport or for research, for example. I don't think they developed an artificial Go champion because it was needed.


3 hours ago, Genady said:

I don't think they developed an artificial Go champion because it was needed.

Did it make money?

3 hours ago, Genady said:

Yes, human because it is more interesting and understandable to us.

More interesting [to humans] and understandable [to humans] just means that the discussion is limited to the 'intelligent' functions that matter to only one species out of 8 million, have meaning to only one species out of 8 million, and that - just so happen! - were invented by that very same species. There is a dot on your forehead. The machine you programmed thinks the way you do; therefore that intelligence must be a mirror image of yours.

But not the other way around.  


1 hour ago, Peterkin said:

More interesting [to humans] and understandable [to humans] just means that the discussion is limited to the 'intelligent' functions that matter to only one species out of 8 million, have meaning to only one species out of 8 million, and that - just so happen! - were invented by that very same species. There is a dot on your forehead. The machine you programmed thinks the way you do; therefore that intelligence must be a mirror image of yours.

But not the other way around.  

You explain, correctly, why the current artificial intelligence is human-like. However, my question is different:

Is human intelligence computer-like?

It specifically refers to the human intelligence abilities which are not realized in current AI. Current AI realizes only a very small subset of human intelligent tasks. What about the unrealized tasks? Are they, or some of them, unrealizable in principle? Is there some fundamental limitation in computer abilities that prevents AI from mimicking all of human intelligence?

Following @studiot's clarification, let's stay with classical digital computers, because their functionality is precisely defined via reducibility to a TM. Anyway, all AI today is realized on this kind of machine. Is human intelligence just a very complicated TM, or does its functionality require some fundamentally different phenomenon, irreducible to a TM in principle?

We know at least one such physical phenomenon: quantum entanglement. It is mathematically proven (Bell's theorem) that its correlations cannot be reproduced by any local classical mechanism. Is human intelligence another one like that?

If human intelligence is in fact reducible to a TM, i.e. realizable by classical digital computers, then perhaps the intelligence of all other animals on Earth is, too. But if it is not, then another question arises: when and how did evolution switch to this kind of intelligence? Mammals? Vertebrates? CNS? ...
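For readers who want the flavour of that proof: the CHSH form of Bell's inequality bounds every local deterministic strategy at 2, while the quantum prediction for an entangled pair reaches 2*sqrt(2). A small sketch, using one conventional set of textbook measurement angles (purely illustrative):

```python
import math, itertools

# Quantum prediction for a singlet pair measured at angles x and y:
def E(x, y):
    return -math.cos(x - y)

# One conventional set of CHSH measurement angles
a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
S_quantum = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# A local deterministic strategy pre-assigns fixed +/-1 outcomes;
# brute force over all 16 assignments gives the classical maximum.
best_classical = max(
    A * B - A * B2 + A2 * B + A2 * B2
    for A, A2, B, B2 in itertools.product((-1, 1), repeat=4)
)

print(abs(S_quantum))   # ~2.828, i.e. 2*sqrt(2) (the Tsirelson bound)
print(best_classical)   # 2 (the classical CHSH bound)
```

The gap between 2 and 2*sqrt(2) is what "cannot be reproduced by any local classical mechanism" means; whether anything analogous applies to intelligence is, of course, exactly the open question of this thread.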


48 minutes ago, Genady said:

Is human intelligence computer-like?

I already answered that. You are not an image of your image.

49 minutes ago, Genady said:

The current AI realizes only a very small subset of human intelligent tasks. What about the unrealized tasks?

What are the unrealized tasks of human intelligence? Are any of them not being served?

50 minutes ago, Genady said:

Are they or some of them unrealizable, in principle?

Possibly. I don't know what they are, so I can only speculate on speculations, as it were. Can we ever know the meaning of the universe, life and everything? Probably not (because it probably hasn't got any; the pattern-seeking mind has chased a series of random events up an unpatterned tree, and will never stop barking.) 

 

55 minutes ago, Genady said:

Is there some fundamental limitation in computer abilities that prevents AI from mimicking all of human intelligence?

Again, I would expect so, just because it's man-made and thus constrained by human logic, rather than free to respond to the vagaries of nature and the pressures of survival. But even if it mimics all of human intelligence, it can't really do human stupidity convincingly; doesn't have human yearnings and irrational desires; doesn't fly into hormone-induced rages and make sentimental choices.  A mimic is still just an image.

If AI becomes sentient in its own right, it will stop mimicking humans. The self-aware android will not wish above all things to be a real live boy; he will strive to be the best possible android. Maybe the real live boys will mimic him.

1 hour ago, Genady said:

Is human intelligence just a very complicated TM, or does its functionality require some fundamentally different phenomenon, irreducible to a TM in principle?

Yes. I don't think human intelligence is fundamentally different from rodent intelligence, but it is fundamentally different from adding machines of any level of sophistication.

 

1 hour ago, Genady said:

We know at least one such physical phenomenon: quantum entanglement. It is mathematically proven (Bell's theorem) that its correlations cannot be reproduced by any local classical mechanism. Is human intelligence another one like that?

I would not know a quantum entanglement or the mathematical proofs if I fell over them on the towpath. I have an ordinary monkey brain.

1 hour ago, Genady said:

then another question arises: when and how did evolution switch to this kind of intelligence? Mammals? Vertebrates? CNS? ...

It didn't switch. Nothing was ever switched in evolution. Things were added, things were enlarged, adapted, co-opted, extended, folded over, crossed over, passed over,  reversed, stapled, spindled and mutilated, but nothing is ever discarded to be replaced by some whole new thing. Computation is just a new trick learned by an old pony - probably as an extension of speed and distance calculation for running down prey and running away from predators.  

 


@Genady

No criticism, but you have spread the subject of the difference between human and AI (and perhaps other digital computer) thought processes over several threads, some of which are now being left behind and forgotten.

This leaves me in a quandary, as I want to post an important comparison relevant to this topic, but it is not computation, so it would perhaps be off topic here.

I have already referred to some differences in the other thread(s).


58 minutes ago, studiot said:

@Genady

No criticism, but you have spread the subject of the difference between human and AI (and perhaps other digital computer) thought processes over several threads, some of which are now being left behind and forgotten.

This leaves me in a quandary, as I want to post an important comparison relevant to this topic, but it is not computation, so it would perhaps be off topic here.

I have already referred to some differences in the other thread(s).

Yes, I know; they started as somewhat different questions but boiled down to the same subject. I'd like to hear your comparison, regardless of where you post it. Perhaps the thread on what computers can't do is more relevant.

I took note of the differences you've referred to before. Thank you.

7 hours ago, Peterkin said:

 I think human intelligence is fundamentally different from adding machines of any level of sophistication.

Perhaps, but what are these machines fundamentally missing that leads to this difference? What would prevent a sophisticated system of them from behaving like the system described by iNow earlier:

21 hours ago, iNow said:

The system, whether we call it a machine computational or otherwise, is based on a type of balance. Ionic differences and chemical floods lead to the propagation of electric signals. Those signals multiply and diverge in very specific ways and in conjunction with other inputs to the system from the microbiome and surrounding environment. 

In computational terms, there is a binary on/off signal. In neurobiological terms, there is a sort of Fourier analysis being conducted each moment and it is the outcome of those “computations” that makes us who and what we are. There either is or is not a signal coming from any part of the system at any given time.

Ions concentrate outside the neurons. Once a threshold is crossed, the membrane depolarizes and ionic gates open. Once that happens, it's like a rock dropping into a pond with ripples spreading outward; the cells beside it respond to that same unbalanced chemistry and help cascade the action potential elsewhere, each step following from the last.

The English language is notoriously fuzzy, but as with all things scientific, whether or not we call the brain a machine or computational machine is determined by how exactly we decide to define those terms in context. 

Personally, I lean toward Yes. I feel very comfortable calling the brain a type of computational machine, but I do the same thing for the body as a whole and its individual component parts and I acknowledge that others will disagree with me for entirely valid reasons. 

 


5 hours ago, Genady said:

Perhaps, but what are these machines fundamentally missing that leads to this difference?

In my lay opinion: evolution. They actually were made in the image of their creator - in an [incomplete] image that creator had of one of its own functions in isolation from all other processes taking place in its complex organism. That creator did not build in all the mistakes and blind alleys nature had, because this creation actually was purposeful.

5 hours ago, Genady said:

What would prevent a sophisticated system of them to behave like a system described by iNow earlier:

I have to say: nothing, because I can't raise a valid argument against something I don't fully understand. I followed it up to the rock in the pond.

I have no problem with machines doing every logical and intelligent procedure of which humans - even if only a few exceptionally clever humans - are capable.

What I can't see machines duplicating is passion, impulse, instinct, stupidity and craziness. I'm sure they could fake it, or be duped into acting on false information, but genuine craziness is an animal thing, a primeval thing. You can program in bugs and glitches, but I don't think you can program in the random replication errors that result in new strains of organic behaviour.

I suppose the main difference is: machines can never be accidental or non-purposeful. Here is an example of fuzzy English: 'having a purpose'. Animals do have a purpose for their actions, which they don't always carry out rationally, while machines exist on purpose, for the purpose of carrying out rational actions. It's the difference between ID and abiogenesis.

