AI sentience


Prometheus


On 4/25/2019 at 2:42 PM, FreeWill said:

Why do you suppose that an ant(colony) cannot be aware of a problem?

It can recognize when it needs to eat and knows how to get food. It can adapt and actively solve the problem of nutrition. It is aware when an enemy closes in on the queen, and it will react to it! It is, on some level, sentient and intelligent enough to respond.

Maybe that's why I use sentience and intelligence as interconnected concepts.

Because a single ant can't survive/respond/react on its own.

17 hours ago, wtf said:

If I study everything known about a given scientific topic, and then you tell me in addition that it's "emergent", what do I know then that I didn't know before you told me that?

Nothing, why would I?

17 hours ago, wtf said:

ps -- David Chalmers on emergence.

Quote

Emergence is a tricky concept. It's easy to slide it down a slippery slope, and turn it into something implausible and easily dismissable. But it's not easy to delineate the interesting middle ground in between. Two unsatisfactory definitions of emergence, at either end of the spectrum:

Which end are you on?


2 hours ago, dimreepr said:

Because a single ant can't survive/respond/react on its own.

Of course it can. Note the level of impact needed for a complex response. A single ant can go around a leaf that has fallen on its track and even mark out the new route. Note its ability to respond, together with every other ant from the hill, to a bigger danger. If you observe an ant outside its natural habitat, you will not be able to notice these functions, even though the ant "contains" them.

An ant is a simple intelligence with a lower level of sentience than humans, but that does not exclude the existence of that lower level of intelligence, awareness and sentience.

It handles everything in its best (estimated) place.


16 hours ago, FreeWill said:

Of course it can. Note the level of impact needed for a complex response. A single ant can go around a leaf that has fallen on its track and even mark out the new route. Note its ability to respond, together with every other ant from the hill, to a bigger danger. If you observe an ant outside its natural habitat, you will not be able to notice these functions, even though the ant "contains" them.

An ant is a simple intelligence with a lower level of sentience than humans, but that does not exclude the existence of that lower level of intelligence, awareness and sentience.

It handles everything in its best (estimated) place.

An ant is sentient because it feels, not because of its thinking, or lack thereof.


6 minutes ago, dimreepr said:

What's your point?

Sensation is an animal's, including humans', detection of external or internal stimulation (e.g., eyes detecting light waves, ears detecting sound waves).

An ant can do those things, and so it is sentient.

Note that an AI can do those things as well, through cameras, voice recorders, and software processing the information, so we could count it as sentient at a lower level.


1 hour ago, FreeWill said:

Sensation is an animal's, including humans', detection of external or internal stimulation (e.g., eyes detecting light waves, ears detecting sound waves).

An ant can do those things, and so it is sentient.

An ant is sentient because it feels, not because of its thinking, or lack thereof.

1 hour ago, FreeWill said:

Note that an AI can do those things as well, through cameras, voice recorders, and software processing the information, so we could count it as sentient at a lower level.

A computer doesn't see or hear; it just processes data.


16 minutes ago, dimreepr said:

Then what are my eyes for?

To enable the brain to sense outside reality.

Note that the brain would still be able to see without eyes, by direct brain stimulation for example.

Eyes are not necessary for seeing, but you cannot comprehend or see anything without the brain, even if you have perfectly functioning eyes.


42 minutes ago, FreeWill said:

To enable the brain to sense outside reality.

Note that the brain would still be able to see without eyes, by direct brain stimulation for example.

Eyes are not necessary for seeing, but you cannot comprehend or see anything without the brain, even if you have perfectly functioning eyes.

Again, what's your point? A computer doesn't see or hear; it just processes data. If I close my eyes my brain still processes; if you switch the camera off, the computer has nothing to process.


5 hours ago, dimreepr said:

Again, what's your point? A computer doesn't see or hear; it just processes data. If I close my eyes my brain still processes; if you switch the camera off, the computer has nothing to process.

If the data were not processed, you would not be able to see a picture. Can you elaborate on why you do not get the analogy? What is unclear?

Basically, you are saying that data previously recorded in the computer cannot be processed if the camera is off. I don't think that is true.

My point is that different levels of sentience exist, whether in AIs or in biological entities.


On 4/27/2019 at 5:20 PM, FreeWill said:

 

If the data were not processed, you would not be able to see a picture. Can you elaborate on why you do not get the analogy? What is unclear?

Basically, you are saying that data previously recorded in the computer cannot be processed if the camera is off. I don't think that is true.

My point is that different levels of sentience exist, whether in AIs or in biological entities.

My point is that ATM AI is not sentient; that's not to say it can't happen, but we are a very long way from a universal AI, the minimum requirement IMO; otherwise you may as well call a toaster sentient and the word loses all meaning.


1 hour ago, dimreepr said:

My point is that ATM AI is not sentient

Please give the reasoning for your point.

I think our current AI is partially sentient. It can see and hear, process the data and give a response. It cannot feel yet.

1 hour ago, dimreepr said:

we are a very long way from a universal AI,

20-25 years, I would guess, depending on how fast our digital recognition of physical reality evolves.

I think it also requires that humanity's cognitive capabilities evolve.

AI cognition is in symbiosis with human cognitive functions, i.e. every bit of information in a computer originates from the past and present work and experience of individual humans.

1 hour ago, dimreepr said:

you may as well call a toaster sentient and the word loses all meaning

This is not impossible, and the word may finally gain meaning.

It depends on how intelligent that toaster becomes. Imagine form and material recognition, with robotic arms executing the whole mechanical process, using the perfect time and temperature based on the recognized matter (the kind of bread bought). I think this is limited sentience and intelligence. Connect the toaster through the internet to a Global Network of Information (a hypothetical platform that collects, organizes and processes all data) and it could be absolutely aware of everything the toaster does.

It would not be very difficult for an advanced universal AI to monitor humanity...


54 minutes ago, FreeWill said:

Please give the reasoning for your point.

I think our current AI is partially sentient. It can see and hear, process the data and give a response. It cannot feel yet.

A computer just does as it's told; it's barely autonomous, and a car thinks (processes) about nothing else (much like a toaster).

59 minutes ago, FreeWill said:

20-25 years, I would guess, depending on how fast our digital recognition of physical reality evolves.

We've been 20 years away from a fusion power plant for the last 50 years.

 

1 hour ago, FreeWill said:

This is not impossible, and the word may finally gain meaning.

It depends on how intelligent that toaster becomes. Imagine form and material recognition, with robotic arms executing the whole mechanical process, using the perfect time and temperature based on the recognized matter (the kind of bread bought). I think this is limited sentience and intelligence. Connect the toaster through the internet to a Global Network of Information (a hypothetical platform that collects, organizes and processes all data) and it could be absolutely aware of everything the toaster does.

The essential meaning of sentience is "a sense of self"; the fact that a toaster can make toast is no basis for a sense of self, and the meaning is lost, because one could then argue that a fire is sentient since it too can make toast.


7 hours ago, dimreepr said:

A computer just does as it's told

You need to catch up on how modern computers operate. It's the basis of this entire discussion. They're not just doing as they're told anymore. They're (not all, but many, and at least those being discussed here) building their own neural networks through experience.


On 4/26/2019 at 6:41 AM, dimreepr said:

 

Which end are you on?

I'll stop talking about emergence. My job now is to read more. I'm perfectly well aware that a lot of smart people find emergence meaningful. I don't. Hydrogen's not wet and oxygen's not wet, but water is wet. Ok. I don't find that interesting or meaningful. Atoms don't have chair-ness in them but my chair's made of atoms. What of it. 

But when it comes to consciousness, it's even worse. If you tell me that consciousness is an emergent property, that tells me even LESS than telling me wetness emerges from hydrogen and oxygen. Because with consciousness, you can't even tell me WHAT it emerges from. Does consciousness emerge from brain goo? Or from computations that may be implemented in any suitable substrate? Nobody knows. So calling consciousness emergent is doubly meaningless. First because emergence in general is meaningless (in my opinion) and secondly because nobody can say with authority what consciousness even emerges from. 

Having now gotten in one more remark about emergence I will go do some more reading on the topic. Perhaps I'll learn something.

 

7 hours ago, iNow said:

You need to catch up on how modern computers operate. It's the basis of this entire discussion. They're not just doing as they're told anymore. They're (not all, but many, and at least those being discussed here) building their own neural networks through experience.

This is not true. The most sophisticated deep-learning AI in existence -- let's say AlphaZero, to be specific -- is a physical implementation of a Turing machine running on perfectly conventional hardware. It has no more computing power than any Turing machine. We could in theory implement AlphaZero using a human being equipped with nothing more than a pencil and a sufficiently long strip of paper on which a sequence of cells have been marked. This is a fact of computer science. I was going to write a long explanation of this point but perhaps I'll leave it at this and try to answer any specific questions. 
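To make that concrete, here is a minimal sketch of a Turing machine in Python. It's a toy of my own devising (nothing to do with AlphaZero's actual code): a finite rule table plus a tape, which is all that the pencil-and-paper human above would be working with. The standard computer-science result is that any conventional program can in principle be reduced to a (vastly larger) rule table of exactly this kind.

```python
# Minimal Turing machine: a finite table of rules plus a tape.
# This toy machine increments a binary number; the point is only
# that the whole mechanism is a lookup table and a read/write head.

def run_turing_machine(rules, tape, state="start"):
    """rules maps (state, symbol) -> (new_state, new_symbol, move)."""
    cells = dict(enumerate(tape))  # sparse tape; blank cells read "_"
    head = 0
    while state != "halt":
        symbol = cells.get(head, "_")
        state, cells[head], move = rules[(state, symbol)]
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells) if cells[i] != "_")

# Rule table: scan right to the end of the number, then carry left.
rules = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("halt", "1", "R"),
    ("carry", "_"): ("halt", "1", "R"),
}

print(run_turing_machine(rules, "1011"))  # prints 1100, i.e. 11 + 1 = 12
```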

Deep learning and in general machine learning (ML) techniques are very clever ways to organize a Turing machine. They don't do anything new. They are based on analyzing a corpus of data, assigning weights to nodes, and making decisions based on the weightings of the nodes. It's perfectly conventional programming done with conventional programming languages and running on conventional hardware. These programs are deterministic. Even if they are equipped with "randomness" via a pseudo-random number generator or even a physically-based RNG, it's well-known that introducing randomness into computations does not increase their computational power. A Turing machine can be programmed to simply execute ALL possible logic paths at any branch point to simulate the element of randomness. 
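A quick sketch of the "randomness" point, using nothing but Python's standard library: seed a generator and you get the identical sequence on every run, so the program as a whole remains deterministic.

```python
import random

# Two generators with the same seed produce the same "random" sequence.
rng_a = random.Random(42)
rng_b = random.Random(42)

print([rng_a.randint(0, 9) for _ in range(5)])
print([rng_b.randint(0, 9) for _ in range(5)])  # identical list, every run
```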

It's true that ML algorithms do "build their own neural network" by continually refining the weight functions by which they evaluate their data. But they are programmed to do so, and the programming is perfectly conventional.
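To illustrate (with a toy perceptron of my own, orders of magnitude simpler than anything like AlphaZero): "building a network from data" is an ordinary loop that nudges some numbers according to a rule the programmer wrote.

```python
# A single artificial neuron trained by the classic perceptron rule.
# Run it twice and you get identical results: the "learning" is a
# fixed, deterministic weight update written by the programmer.

def train_neuron(samples, epochs=10, lr=0.1):
    """samples is a list of ((x1, x2), label) pairs with 0/1 labels."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            out = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
            err = label - out
            w1 += lr * err * x1   # the entire "learning" step:
            w2 += lr * err * x2   # adjust weights in proportion
            b += lr * err         # to the error, nothing more
    return w1, w2, b

# Learn the logical AND function from four labelled examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print(train_neuron(data))  # same weights on every run
```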

In short, contemporary AIs don't go beyond Turing machines. They're simply very clever applications of Turing machines. And not all that new; abstract neurons were conceived in the 1940s.

AIs do exactly what they're told. We know of no other way to implement computation.


Dim said computers only do what they’re told. That’s patently untrue since they do things beyond their base programming every day. 

Regardless... We use software for that, not hardware. Perhaps we told them initially to explore and learn as they go, but that too is coded, not hardware.

Btw, wetness is a property of a surface, not of water. Water is a molecule and is not itself wet. Weird, but true.

I hope a clearer understanding is emerging for us all through your continued reminders that you find the term "emergent property" lacking.


6 hours ago, wtf said:

with consciousness, you can't even tell me WHAT it emerges from.

It emerges from physical reality (space-time, energy and matter).

Think about the solar system as it formed, some 4.6 billion years ago. Consider the physical properties and estimate the level of biophysical consciousness that could have been present. I think it would be close to zero.

Then observe today's solar system and the level of biophysical consciousness present. It is a lot.

You see the difference between the two states, and you can realize that, over time, consciousness has emerged from space, energy and matter.


20 hours ago, iNow said:

You need to catch up on how modern computers operate. It's the basis of this entire discussion.

I agree and I'm aware that these are exciting times in computing.

12 hours ago, iNow said:

Dim said computers only do what they’re told. That’s patently untrue since they do things beyond their base programming every day. 

Except that is what the base programming intended. That may lead to sentience, but as of now there's no evidence to suggest it has.

 

 

13 hours ago, wtf said:

I'll stop talking about emergence. My job now is to read more. I'm perfectly well aware that a lot of smart people find emergence meaningful. I don't. Hydrogen's not wet and oxygen's not wet, but water is wet. Ok. I don't find that interesting or meaningful. Atoms don't have chair-ness in them but my chair's made of atoms. What of it. 

But when it comes to consciousness, it's even worse. If you tell me that consciousness is an emergent property, that tells me even LESS than telling me wetness emerges from hydrogen and oxygen. Because with consciousness, you can't even tell me WHAT it emerges from. Does consciousness emerge from brain goo? Or from computations that may be implemented in any suitable substrate? Nobody knows. So calling consciousness emergent is doubly meaningless. First because emergence in general is meaningless (in my opinion) and secondly because nobody can say with authority what consciousness even emerges from. 

Having now gotten in one more remark about emergence I will go do some more reading on the topic. Perhaps I'll learn something.

It basically means we don't know anything other than that it's more than the sum of its parts.

20 hours ago, FreeWill said:

Isn't the sense of self just self-awareness?

My dog, by all conventional thinking, is not self-aware, but I have no doubt that she has a sense of self; perhaps if I add preservation to "a sense of self" it'll become clear.


22 minutes ago, iNow said:

Conventional? I think this may be a minority position, but welcome correction. 

Did the tests change? OK, I'll go down a level:

Quote

My worm, by all conventional thinking, is not self-aware

 


3 hours ago, dimreepr said:

My dog, by all conventional thinking, is not self-aware, but I have no doubt that she has a sense of self; perhaps if I add preservation to "a sense of self" it'll become clear.

Dogs have the intelligence level of a 2-3 year old child.

A dog is aware of itself.

It is also aware that it is not every other dog or living creature, and that it is not the environment.

It knows its exact space (try to touch a guard dog).

I think dogs are self-aware at some level.

A dog is also aware of what is taught to it and can use that knowledge when needed, without command.

A dog can even show you what it knows, even if you do not ask it.


3 minutes ago, FreeWill said:

Dogs have the intelligence level of a 2-3 year old child.

A dog is aware of itself.

It is also aware that it is not every other dog or living creature, and that it is not the environment.

It knows its exact space (try to touch a guard dog).

I think dogs are self-aware at some level.

What about my worm?

