AI sentience


Prometheus


What are people's opinions on this: can AI become sentient?

Taking the wikipedia definition:

 Sentience is the capacity to feel, perceive or experience subjectively

Can a fundamentally quantitative system really experience subjectivity?

Personally, given that sentience has evolved at least once on Earth, I don't see why it can't manifest from a different substrate. But that's similar reasoning to saying: given that I'm alive at least once, I don't see why I can't live again...


3 minutes ago, Prometheus said:

What are people's opinions on this: can AI become sentient?

Taking the wikipedia definition:

 Sentience is the capacity to feel, perceive or experience subjectively

Can a fundamentally quantitative system really experience subjectivity?

Personally, given that sentience has evolved at least once on Earth, I don't see why it can't manifest from a different substrate. But that's similar reasoning to saying: given that I'm alive at least once, I don't see why I can't live again...

Assuming the equivalent environmental sensors are there, and the program contains all the necessary information that makes a human, say, then I think so. My reasoning is that we are ultimately just information.


7 minutes ago, Prometheus said:

What are people's opinions on this: can AI become sentient?

Taking the wikipedia definition:

 Sentience is the capacity to feel, perceive or experience subjectively

Can a fundamentally quantitative system really experience subjectivity?

Quite possibly. I would assume the sentience would be quite different from ours, since an AI would have none of the drives that we have, such as sex, food and physical comfort, among others.

10 minutes ago, Prometheus said:

Personally, given that sentience has evolved at least once on Earth, I don't see why it can't manifest from a different substrate. But that's similar reasoning to saying: given that I'm alive at least once, I don't see why I can't live again...

Wow, that was a pretty hard turn off of the subject!


14 hours ago, StringJunky said:

My reasoning is that we are ultimately just information.

That's a good way of putting it.

 

14 hours ago, Bufofrog said:

Quite possibly. I would assume the sentience would be quite different from ours, since an AI would have none of the drives that we have, such as sex, food and physical comfort, among others.

Wow, that was a pretty hard turn off of the subject!

Maybe I should have said that I can think of no reason to preclude the possibility based on what I know. But that's not much, so I was wondering whether anyone knows of limits that might make it impossible; something along the lines of: because computers are strictly digital, they can never simulate the brain, which is a mixture of analog and digital.


2 hours ago, Prometheus said:

Maybe I should have said that I can think of no reason to preclude the possibility based on what I know. But that's not much, so I was wondering whether anyone knows of limits that might make it impossible; something along the lines of: because computers are strictly digital, they can never simulate the brain, which is a mixture of analog and digital.

As long as we can digitize to a resolution whose error is, for practical purposes, unnoticeable, then yes; rather like digital photography, which has reached and surpassed the 20 MP boundary, whereby at normal picture sizes analogue and digital prints are indistinguishable. The key word is 'indistinguishable' when asking whether AI can be like some aspect of a biological organism. If they are not distinguishable, they are the same. If it walks like a duck...
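
To make the resolution point concrete, here is a minimal sketch (the sine-wave "signal" and the bit depths are invented for the example; nothing here is specific to brains):

```python
import numpy as np

# A finely sampled sine wave stands in for an 'analogue' signal.
t = np.linspace(0.0, 1.0, 10_000)
analogue = np.sin(2 * np.pi * 5 * t)

for bits in (4, 8, 16, 24):
    levels = 2 ** bits
    # Quantize to the nearest of 'levels' evenly spaced values in [-1, 1].
    digital = np.round((analogue + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1
    error = np.max(np.abs(analogue - digital))
    print(f"{bits:2d}-bit: max quantization error = {error:.1e}")
```

Each extra bit roughly halves the worst-case error, so past some bit depth the digital copy is, for any practical observer, indistinguishable from the analogue original.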

Edited by StringJunky

I fully concur with SJ here. Functional equivalence at some future date is nearly certain, hence assuming actual equivalence is perfectly valid IMO.

We’d need to change our definitions if we wish them to remain segregated, because as of now it’s a distinction without a difference. 

Edited by iNow

13 minutes ago, iNow said:

I fully concur with SJ here. Functional equivalence at some future date is nearly certain, hence assuming actual equivalence is perfectly valid IMO.

We’d need to change our definitions if we wish them to remain segregated, because as of now it’s a distinction without a difference. 


Segregation seems a dangerous path.


20 hours ago, Prometheus said:

What are people's opinions on this: can AI become sentient?

Taking the wikipedia definition:

 Sentience is the capacity to feel, perceive or experience subjectively

Can a fundamentally quantitative system really experience subjectivity?

I think the answer to both questions is yes; however, I do not think human-developed AI is headed in that direction. I do not think intelligence, as understood by humans, is a key component of something being sentient. As such, I do not think AI is currently moving towards sentience. I don't believe the ability to feel and perceive existence is correlated with intelligence. Very intelligent people are not more sentient than those with below-average levels of intelligence.

Creating something sentient and creating something super highly intelligent are separate feats which don't have to be connected to one another. Perhaps they can be, but it seems to me like a more complicated way of doing it.


1 hour ago, iNow said:

I fully concur with SJ here. Functional equivalence at some future date is nearly certain, hence assuming actual equivalence is perfectly valid IMO.

We’d need to change our definitions if we wish them to remain segregated, because as of now it’s a distinction without a difference. 

Yes, 'functional equivalence' is a good way to put it.

21 minutes ago, Ten oz said:

I think the answer to both questions is yes; however, I do not think human-developed AI is headed in that direction. I do not think intelligence, as understood by humans, is a key component of something being sentient. As such, I do not think AI is currently moving towards sentience. I don't believe the ability to feel and perceive existence is correlated with intelligence. Very intelligent people are not more sentient than those with below-average levels of intelligence.

Creating something sentient and creating something super highly intelligent are separate feats which don't have to be connected to one another. Perhaps they can be, but it seems to me like a more complicated way of doing it.

I don't think anybody has brought intelligence into it... unless you are being rhetorical. If an AI contains all the operating information of a sentient being, then it is sentient, because it contains the necessary information. Sentience emerges from the mechanical processes contained within. The sentient-like behaviour arises with sufficient complexity, such that the two types are indistinguishable. I don't know that you are sentient, only that you behave like someone who has it.

Edited by StringJunky

9 minutes ago, StringJunky said:

I don't think anybody has brought intelligence into it... unless you are being rhetorical. If an AI contains all the operating information of a sentient being, then it is sentient, because it contains the necessary information. Sentience emerges from the mechanical processes contained within.

AI stands for Artificial Intelligence. 


2 minutes ago, Ten oz said:

AI stands for Artificial Intelligence. 

But the focus is on sentience. AI is just an umbrella term for processes that are autonomous. AI doesn't necessarily mean clever.

Edited by StringJunky

Just now, StringJunky said:

But the focus is on sentience. AI is just an umbrella term for processes that are autonomous.

...and my first sentence stated that I think the answer to the question posed in the OP is yes.


We don't even know what consciousness is, and the simplest brains are orders of magnitude more complex than a computer or the internet.

Never mind that we can't really even define intelligence either; how are we going to create AI? It would be akin to teaching a stone to fly.

If we ever have machine intelligence, there will be nothing "artificial" about it.

Edited by cladking

On 4/4/2019 at 5:48 PM, Prometheus said:

Personally, given that sentience has evolved at least once on Earth, I don't see why it can't manifest from a different substrate. But that's similar reasoning to saying: given that I'm alive at least once, I don't see why I can't live again...

As long as the information that is you is stored, you should always exist, or at least remain in a state of potential to be animate.

20 minutes ago, cladking said:

We don't even know what consciousness is, and the simplest brains are orders of magnitude more complex than a computer or the internet.

Never mind that we can't really even define intelligence either; how are we going to create AI? It would be akin to teaching a stone to fly.

If we ever have machine intelligence, there will be nothing "artificial" about it.

It's an emergent, behavioural property of the brain. We don't need to know what it is, only that it can be replicated and be useful.


On 4/5/2019 at 2:20 PM, Ten oz said:

Creating something sentient and creating something super highly intelligent are separate feats which don't have to be connected to one another. Perhaps they can be, but it seems to me like a more complicated way of doing it.

If we take something like self-driving cars: they have an extensive sensory network and the goal of moving from A to B without hitting anything else. That's quite similar to basic organisms responding to sensory inputs to perform goal-orientated tasks. If such a system were to become conscious, it might be similar to those creatures.
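
As a toy sketch of that sensor-to-goal loop (everything here, from the one-dimensional "road" to the numbers, is invented purely for illustration; a real driving stack is vastly more elaborate):

```python
# Toy 1-D 'vehicle': repeatedly sense the world, then act toward the goal.

def sense(position, goal, obstacles):
    """Return the heading toward the goal and the gap to the nearest obstacle ahead."""
    heading = 1 if goal > position else -1
    ahead = [o for o in obstacles if (o - position) * heading > 0]
    gap = min(abs(o - position) for o in ahead) if ahead else float("inf")
    return heading, gap

def act(heading, gap, safe_gap=2):
    """Advance toward the goal, but stop while an obstacle is too close."""
    return heading if gap > safe_gap else 0

position, goal = 0, 10
obstacles = [15]  # past the goal here, so the route happens to be clear
for step in range(20):
    heading, gap = sense(position, goal, obstacles)
    position += act(heading, gap)
    if position == goal:
        print(f"arrived at {position} after {step + 1} steps")
        break
```

Even this trivial loop has the same shape as the organism analogy: sensory input in, goal-directed action out, with nothing we would call experience anywhere inside it.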


2 hours ago, Prometheus said:

If we take something like self-driving cars: they have an extensive sensory network and the goal of moving from A to B without hitting anything else. That's quite similar to basic organisms responding to sensory inputs to perform goal-orientated tasks. If such a system were to become conscious, it might be similar to those creatures.

I agree. I believe it is possible for AI to become sentient. 

What consciousness is and exactly how it works is still unknown. Are there degrees of consciousness (awakened, enlightened, etc.)? Are there types (subconscious, unconscious, etc.)? Do all living things experience consciousness the same way humans do? None of this is totally understood. So how sentience may apply to AI is difficult to categorize.

In my opinion (seriously, just an opinion; I am aware it could be off), I think consciousness as experienced in humans is just an illusion. The sensor inputs we receive aren't processed by our consciousness. Numerous decisions are made in the brain and projected into our consciousness as fully developed thoughts. A superficial example of this would be waking up in the morning and knowing you want eggs rather than cereal. The idea arrives complete. We do not consciously analyse how our stomachs feel or what our nutritional needs are and then conclude that eggs would be best. We just wake up and want eggs. Often in my case I wind up not even having eggs and am stuck eating cereal anyway, lol. Also, the experience of sensor inputs can be manipulated in real time by our minds: something which may feel debilitating as I am casually walking around the house may entirely go away if there were an emergency. We are not consciously throttling those inputs ourselves via decision making; it is done by our minds for us, utilizing chemistry.

I think consciousness in humans as we experience it exists to provide us with a personality. We have selected for personality as well as intelligence. Between two people, one with normal intelligence but a great personality and the other with great intelligence and a normal personality, I think the person with the great personality would be considered more desirable to potential mates. Insufficient levels of either (personality or intelligence) are bad, but it seems to me that when it comes to attracting a mate, above-normal levels of intelligence don't seem to help, whereas additional levels of personality do. I think the sense of control over ourselves and our minds that we consciously have is needed to have a personality, as our true motivators are dry and fairly standard across the species.

I realize I failed to define personality, but that was for brevity, as my views on consciousness overall are not the topic of this thread. I outlined it to set up the question: how would we know if an AI were sentient? What would need to happen for us to identify an AI as sentient?


On 4/7/2019 at 1:09 PM, Ten oz said:

I outlined it to set up the question: how would we know if an AI were sentient? What would need to happen for us to identify an AI as sentient?

Tricky. We infer consciousness in other humans because we have direct experience of our own, and others' behaviour is consistent with ours. It's similar for animals, although it gets harder to imagine the more different the animal is from us. Given that in some instances AI is deliberately programmed to mimic human responses, it will always be open to the criticism that it merely mimics, rather than recreates, consciousness.

I read somewhere that one possibility would be to 'raise' an artificial intelligence in isolation and then see if it displays behaviour consistent with consciousness. The problem with that is that it assumes AI consciousness will be similar enough to something we know that its behaviours are interpretable.


I was looking at the previous posts, and this is the hypothesis I have come to: if there are many competing AIs, and if it is made so that sentience is rewarded, then maybe, given a lot of time, it could happen.

P.S. I am not able to prove it; I am just Aristotle-ing it.


1 minute ago, peterwlocke said:

I was looking at the previous posts, and this is the hypothesis I have come to: if there are many competing AIs, and if it is made so that sentience is rewarded, then maybe, given a lot of time, it could happen.

P.S. I am not able to prove it; I am just Aristotle-ing it.

Is the essence of your suggestion that, since AIs "feed on data", natural selection in a competitive environment would favour efficient data processing, and that might well correlate with sentience?
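
Something like this toy selection loop, perhaps (purely illustrative: the single "efficiency" number standing in for an agent, and the premise that efficiency correlates with sentience, are assumptions, not results):

```python
import random

# Toy natural selection: each 'agent' is reduced to one number standing in
# for processing efficiency. The fittest half survives and reproduces with
# small random mutations, so efficiency drifts upward over generations.

def fitness(agent):
    return agent  # by assumption, fitter = more efficient data processing

population = [random.random() for _ in range(20)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                                 # selection
    offspring = [a + random.gauss(0, 0.05) for a in survivors]  # mutation
    population = survivors + offspring

print(f"best 'efficiency' after 100 generations: {max(population):.2f}")
```

Whether anything in such a loop would ever amount to sentience is exactly the open question; the loop only shows how a rewarded trait spreads.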

