
AI sentience


Prometheus


9 minutes ago, wtf said:

I respectfully decline to play.

And yet you continue posting on page after page, day after day, week after week. 

I humbly submit that there’s nothing respectful about the approach you’ve chosen. 


6 hours ago, iNow said:

And yet you continue posting on page after page, day after day, week after week. 

I humbly submit that there’s nothing respectful about the approach you’ve chosen. 

If I decline to respond to a mention, I'm being disrespectful.

If I do respond, then I'm continuing to post page after page.

No win with you, is there? 

I prefer not to say any more about emergence because I truly have nothing else to say. I'm satisfied knowing that David Chalmers considers emergence meaningful and that, frankly, I don't. I'm perfectly ok with that. He's a smart guy and a clear writer; I'll take another run at his papers down the road.

What exactly was the purpose of your post? To tell me that it upsets you if I post and upsets you if I don't? 

Ok. Well ... I hope you find a way to deal with that. Clearly nothing I could do would make the slightest difference.


57 minutes ago, wtf said:

No win with you, is there? 

It’s not about winning and losing.

It’s about asking you what other approach might take us closer to the truth since the idea of emergence doesn’t seem to get us there. 

Your continued evasion of, and vitriolic attacks against, this rather straightforward question are, I must say, rather telling. 

57 minutes ago, wtf said:

What exactly was the purpose of your post? To tell me that it upsets you if I post and upsets you if I don't? 

 Ok. Well ... I hope you find a way to deal with that. Clearly nothing I could do would make the slightest difference.

Just stop


FreeWill;

This response is very late and too long, but then I am very slow, and you asked a lot of questions. Because my understanding of consciousness is a little different, I ask that you read through the entire post before formulating a response.
 

On 5/9/2019 at 12:54 AM, FreeWill said:

I do not have a problem accessing wiki on my own. I can even pick up one of my many textbooks or rely on my 20+ years of experience in the field of medicine and check any questionable point related to hormones. But thanks for the link. I have read it. I asked for it to see where you gained your understanding. 

So you were just exercising your rights? I understand that you may need to verify what I state, but please note that I have MS and tire easily -- I am also not proficient at providing links -- so if you are interested, please look up the information I reference, then ask for a link if necessary. As far as gaining my understanding -- and that is what I call it, my Understanding -- I learned from Philosophy; Psychology, specifically Jung, Freud, and Blanco; Environmental studies; Ecology; History; Biology; various Religions; etc., then rolled that information together, comparing commonalities and analyzing the results, for about 50 years. That is how I did it, and I'm not done.


There are a lot of well-educated people in this forum. I also have textbooks, but mine reference law, and I am afraid that neither yours nor mine will answer many questions on consciousness. Yours will likely work against you, because every person I have talked to who has trained in the medical field cannot separate the idea of consciousness from the idea of the brain in their thinking. 


Years ago, I realized that ALL survival instincts work through feeling and/or emotion. This information compared well with Freud's interpretation of the "drives" in the Id aspect of mind. Since all survival instincts work through hormones internally and pheromones externally, we finally have a solid physical connection between mind (the Id) and all multicellular life. Therefore a (brainless) blade of grass is conscious (sentient) of the need to maintain itself and continue its species.
 

Quote

Hormones impact a lot of things, sure, like controlling (regulating) the internal environment of the body, but that is different from controlling the entire body. 

On the other hand, without the homeostasis (the self-balancing) that hormones provide, we would all die. Ecosystems also self-balance, and pheromones and chemistry play an important role in balancing life in ecosystems. The pertinent word here is "self": life maintains and promotes itself.
 

Quote

Actually what we are looking to answer is about AI sentience,

Which means that we have to know what "sentience" actually is. Sentience is a lower level of consciousness, as established in Philosophy and Biology. Both consciousness and sentience require subjective experience in order to BE sentience/consciousness, and subjective experience requires a "self" in order to be subjective. All I have done is to show that survival instincts are evidence of a "self" that must be protected -- must survive -- and that all life possesses this "self"/subjectivity. It is also clear that this is reactionary and works through chemistry and the unconscious aspect of mind. Thinking and/or a brain is not required.
 

Quote

and you provide a lot of unrelated and inaccurate information. 

It is all related. You just have to stop thinking about the brain and start thinking about life and consciousness. You may not like my interpretations, but I resent the accusation that the information is "inaccurate". I work hard to be accurate and often verify the information before posting.


Do you realize that if AI is actually sentient, it would be a kind of new species? That it will have a subjective "self" that it will try to protect? This is where the sci-fi movies become horror flicks, when AI starts to see humans as unnecessary and redundant. This is what sentience signifies -- self-maintaining and self-promoting.
 

Quote

I do not see that rainforests can protect themselves from deforestation with pheromones.

But if we stop chopping up the forests, in a few centuries they will rebuild themselves, and pheromones will play an important role in that work. There have been many studies on this: Nature rebuilding after floods, volcanoes, earthquakes, etc., and even bacteria that have learned to neutralize areas that were destroyed by toxic waste.

 

Quote

I cannot see the named homeostasis between Humanity and Nature. 

Regarding the homeostasis between "Humanity and Nature": consider that if we continue to be pests and destroy the forests, what will happen? The Greenhouse effect? Global warming? Changes in climate causing droughts and floods, oceans rising, food shortages, eventually leading to pestilence and disease? When it is all over, thousands of years from now, there will be fewer pesky humans, and the rain forests will rebuild in some form. Like homeostasis within a body, life WILL balance and promote itself.
 

Quote

Well, I can be aware that I am hungry, horny, fleeing, or fighting, and I can be absolutely aware of when and how the body is preparing and reacting in those scenarios. 

Yes. The feeling that makes you react is automatic and analogue; however, you become consciously aware of it after you experience it and digitize that information into knowledge. Once you have the information, you can even anticipate it and often control it -- because you have a brain and can learn. Consider that if someone throws something at your head, you will probably duck -- that is self-protection (survival instincts) -- but if you are a ball player used to catching a ball, you may well react by catching what was thrown -- that is a reflexive learned instinct. Either way, you react to protect yourself.
 

Quote

 

I wonder, when you said that hormones control the body, haven't you considered that, for example, in your underlined points, the brain can easily override what the hormones suggest? 

 

Of course, though it is not always easy. Consider that the ability to commit suicide is evidence that we can override our survival instincts if we choose. But people surviving horrors that they would prefer not to live through is evidence that dying is not always an option.


I am not making an argument for or against free will. I have learned that it is best to think of the differences between the conscious and unconscious as influences, rather than controls. These two aspects of mind routinely feed information back and forth.
 

Quote

I am horny, but I would never force myself on anyone; a soldier, even fearing the fight, would not leave the battlefield,

And yet, there is a well-documented history of rape following war. Fighting men kill, and killing raises their testosterone levels. When the fighting is over, they tend to grab the first person they can find and force themselves on that person, creating life -- also driven by an abundance of testosterone. This is only one example of how homeostasis works within a species. Killing does not cause rape, but it seems to encourage it.
 

Quote

I am hungry, but I would not take your sandwich, because with the brain you can control and override what the hormones suggest and how the body acts. (Note that no court in the world would accept the excuse: my hormones control my body and forced me to do something improper.) 

The above is not true. Hunger can be a defense against a charge of stealing food if it borders on starvation. Most US courts will accept a plea of self-defense, even when you kill, if it was necessary to protect your actual self, your spouse, your children, or your home. These are covered under survival instincts. Courts will generally not accept a plea of self-defense if you kill to protect your friend, your parents, your siblings, etc., which are not covered under survival instincts.
 

Quote

You have difficulty accepting AI sentience because you have a false picture of the hormone system and the brain, and because of that, you have a false picture of unconsciousness and consciousness as well. 

No. I have difficulty accepting AI sentience because I have not seen evidence of "self" in AI. Hormones are just evidence of survival instincts and "self" in life, nothing more. What do you actually know about the conscious and unconscious aspects of mind? Not the brain, but mind.
 

Quote

You cannot possibly know what we can achieve in the future. We already know a great deal about the body and the unconscious part of its functions, so I cannot exclude that we will understand it further. 

Well, I certainly agree with this. Of course we will learn more, but will we ever learn everything? No.  
 

Quote

Can you elaborate on what you mean by the conscious aspect of mind being digital? Do you mean that instead of pictures, senses, and thoughts put together by different cells from different areas of my brain, the thoughts, memories, and knowledge I have are just numbers? How and why? What would 0 mean, how would it work in this scenario, and how would I be aware of it? 

You are talking about binary; when I say digital, I may be thinking about "discrete data" (information). I have tried a half dozen different ways to explain this over the last weeks and have finally settled on a mirror analogy. When you look in a mirror, what you see is yourself, what is behind you, and what is around you, but are you in the mirror? No. This is what the brain does for us: it gives us a reflection of, and information about, ourselves, our experiences in memory, and the causal world that surrounds us, and it does so in digital or data form (thoughts). This reflection is what we call our consciousness, or the rational aspect of mind, and it is produced and processed by the brain.


But does the mirror reflect everything mental? No. Although it can give us clues, it does not directly reflect emotion, nor does it reflect feelings (moods), which is why they are so hard to explain or verbalize. It can reflect awareness, allowing us to be self-aware, but it is not truly the "self", so it does not actually experience awareness -- it only reflects what we are aware of. All of these things are analogue and do not reflect in the digital or data form presented by the brain. These things make up what we call the unconscious aspect of mind, because they are not known consciously -- because they cannot be reflected. 


This also means that they do not require a brain in order to be real. So when we say that a blade of grass is not conscious, what we are really saying is that it does not possess a brain, or mirror, so it has no knowledge of its consciousness. But it does possess the knowledge necessary to maintain itself, probably through DNA, and it does possess feeling, as evidenced by survival instincts, so it does possess consciousness in some form (sentience). It can die and lose consciousness.
 

Quote

 

Who made this recognition you claim? Can you share a link about it? I do not find any reference to it in my neurology book...

 

I made it, although I cannot be the only person, and you would not find a reference in a neurology book, as neurology does not study mind; it studies the brain and CNS. A good understanding of analytic Psychology might help. I have not yet found any legitimate source that disputes my Understanding, and have found a great deal that supports it.


I am not sure how to explain this in a way that you will understand, as I have been over it repeatedly. Let's reverse this so we don't go any further off topic. AI is basically a computer that processes information, right? So what empowers the processing? What makes the activity or motion? Some form of electricity. 


Well, the conscious rational aspect of mind works like the computer. What empowers the processing, activity, or motion? The unconscious aspect of mind -- awareness, feeling, and emotion -- is what empowers it. The unconscious is to a body/brain what electricity is to AI.


So I think we are looking in the wrong place for consciousness (sentience). The "self", the ability to self promote, and homeostasis, all source from the unconscious aspect of mind as evidenced by survival instincts. Evolution also seems to indicate that the unconscious came before the conscious, so I don't see AI as being conscious. Of course, I could be wrong. But I have also seen no evidence of sentience. As I stated in my first post in this thread: I don't expect it, but neither can I say it is impossible.


Are we certain that we want AI to be conscious? Hawking was not convinced that it was such a good idea, and I find that I agree with him.

Quote

The development of full artificial intelligence could spell the end of the human race. Once humans develop artificial intelligence, it will take off on its own and redesign itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn't compete and would be superseded.
— Stephen Hawking

When he says "full artificial intelligence", what he is saying is that it would have a "self" that would cause motivation, ambition, will, and self promotion. It would have emotion, feeling, and awareness, which means that it would have an unconscious aspect of mind. This means that it could theoretically be capable of having psychotic episodes, becoming schizophrenic, or insane. Wouldn't that be a kick in the pants?


Gee

 

Quote

And as Dimreepr well said: 

Dimreepr says a lot of things, but does not like to read. If he did, then he would not have to state two or three more times what I stated in my first post in this thread regarding AI sentience: I don't expect it, but neither can I state that it is impossible.

 


Hello Gees, 

Thanks for the response. 

Please note that less is sometimes more.

It is difficult to analyse and respond to a wall of words filled with inaccuracies and misconceptions. 

I do not understand why objective sentience, which I think can be achieved by an advanced AI, would exclude self-awareness rather than promote it...


8 hours ago, Gees said:

Therefore a (brainless) blade of grass is conscious (sentient) of the need to maintain itself and continue its species.

To infer a sentient self from 'self-preservation' is like asking what rains in 'it rains'.

Say you find a raw diamond in a field and put it on a scale; you find it weighs 20 grams. But then somebody reacts and says you must clean it first: the diamond itself might weigh less, e.g. 18 grams. Does that mean that a diamond has a 'self'? You lean heavily on the spell of our daily language use. 


3 hours ago, Gees said:

I believe that "objective sentience" is an oxymoron.

Why would it be an oxymoron?

I know you think sentience requires subjective unconsciousness, but just because you think like that, it does not mean objective sentience of AIs or humans is prohibited or unachievable. 


11 hours ago, Gees said:

I have difficulty accepting AI sentience because I have not seen evidence of "self" in AI.

That is a ridiculous argument. That would be like someone at the time when only black and white television existed saying that the concept of "colour television" is hard to accept because they have only seen B&W TV.

It is also a straw man because no one has claimed that any form of current AI is sentient and has a "self".

11 hours ago, Gees said:

You are talking about binary; when I say digital, I may be thinking about "discrete data" (information).

You often claim to be a "philosopher" and yet you are willing to make up your own meanings for words. Clarity of expression is at the heart of philosophy. You need to try harder.

11 hours ago, Gees said:

But does the mirror reflect everything mental? No. Although it can give us clues, it does not directly reflect emotion, nor does it reflect feelings (moods), which is why they are so hard to explain or verbalize. It can reflect awareness, allowing us to be self-aware, but it is not truly the "self", so it does not actually experience awareness -- it only reflects what we are aware of. All of these things are analogue and do not reflect in the digital or data form presented by the brain. These things make up what we call the unconscious aspect of mind, because they are not known consciously -- because they cannot be reflected. 


This also means that they do not require a brain in order to be real.

This is even worse.

One can draw an analogy between the fact that there is an image in the mirror (and not "reality") and the way that our brain constructs our perception of external "reality".

But it is just an analogy. You cannot use it to draw conclusions about the behaviour or nature of the brain. Because the brain is not a mirror. 

Basically, you are confusing the image in the mirror for the real thing by trying to make this argument.

11 hours ago, Gees said:

Well, the conscious rational aspect of mind works like the computer. What empowers the processing, activity, or motion? The unconscious aspect of mind -- awareness, feeling, and emotion -- is what empowers it. The unconscious is to a body/brain what electricity is to AI.

Or maybe the unconscious part of the mind is just a different sort of processing going on in the brain.

The analogy with electricity is wrong. Electricity provides the energy to power the computer. The thing that provides the energy to power the brain is biochemistry (and ultimately the food we eat).

One could draw a dubious analogy between the conscious mind and application software, on the one hand, and the unconscious mind and the operating system on the other (most people are not aware of what the operating system is doing behind the scenes, in the same way they are unaware of the unconscious processing the mind does). In both cases, the application software and the operating system are just programs being run by the computer.

This is not a great analogy (I am not a fan of analogies) but it is better than the "electricity" one.

 


10 hours ago, Gees said:

Dimreepr says a lot of things, but does not like to read. If he did, then he would not have to state two or three more times what I stated in my first post in this thread regarding AI sentience: I don't expect it, but neither can I state that it is impossible.

I don't like to read what you write, which has cost me in the past; I assumed other self-proclaimed philosophers write too much as a way to dodge reality.

Quote

I stated in my first post in this thread regarding AI sentience: I don't expect it, but neither can I state that it is impossible.

Then why all the subsequent words?


1 hour ago, Strange said:

That is a ridiculous argument. That would be like someone at the time when only black and white television existed saying that the concept of "colour television" is hard to accept because they have only seen B&W TV.

It is also a straw man because no one has claimed that any form of current AI is sentient and has a "self".

You often claim to be a "philosopher" and yet you are willing to make up your own meanings for words. Clarity of expression is at the heart of philosophy. You need to try harder.

This is even worse.

One can draw an analogy between the fact that there is an image in the mirror (and not "reality") and the way that our brain constructs our perception of external "reality".

But it is just an analogy. You cannot use it to draw conclusions about the behaviour or nature of the brain. Because the brain is not a mirror. 

Basically, you are confusing the image in the mirror for the real thing by trying to make this argument.

Or maybe the unconscious part of the mind is just a different sort of processing going on in the brain.

The analogy with electricity is wrong. Electricity provides the energy to power the computer. The thing that provides the energy to power the brain is biochemistry (and ultimately the food we eat).

One could draw a dubious analogy between the conscious mind and application software, on the one hand, and the unconscious mind and the operating system on the other (most people are not aware of what the operating system is doing behind the scenes, in the same way they are unaware of the unconscious processing the mind does). In both cases, the application software and the operating system are just programs being run by the computer.

This is not a great analogy (I am not a fan of analogies) but it is better than the "electricity" one.

 

Yes, electricity is the medium upon which information is conveyed and they both have that in common; only the generation of that electricity is different.


5 minutes ago, StringJunky said:

Yes, electricity is the medium upon which information is conveyed and they both have that in common; only the generation of that electricity is different.

Well yes, but so is its conveyance; not to mention all the other methods of communication.

 

I hate to have any sort of agreement with Gees, but that does limit the potential of our current iteration of computers to jump the gap.  


18 minutes ago, StringJunky said:

Yes, electricity is the medium upon which information is conveyed and they both have that in common; only the generation of that electricity is different.

Well, information in neurons is conveyed by chemical signals rather than electric currents. But, yes, the communication is another analogy that could be made (I don't think that is what Gees intended though.)

9 minutes ago, dimreepr said:

I hate to have any sort of agreement with gees but that does limit the potential of our current iteration of computers, to jump the gap.  

I don't see why. Computation is not defined by the communication mechanism but by the operations performed. As far as we know the brain and a computer are limited to the same set of operations and the same limits of computability. So nothing we currently know says that strong AI is impossible. Apart from a belief in some magical "extra" that defines the human mind.
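The claim that computation is defined by the operations performed, not by the signalling medium, can be illustrated with a toy sketch (my own illustration, not from the thread): every boolean function can be built from one primitive operation, and nothing about that construction depends on whether the primitive is realised electrically, chemically, or mechanically.

```python
# Toy illustration of substrate-independence: all gates below are derived
# from a single primitive (NAND). How NAND is physically realised is
# irrelevant to what the composed system computes.

def nand(a: bool, b: bool) -> bool:
    return not (a and b)

# Every other gate built purely from NAND (functional completeness):
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

# Check the derived gates against Python's built-in logic for every input.
for a in (False, True):
    for b in (False, True):
        assert not_(a) == (not a)
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor(a, b) == (a != b)

print("all gates derived from NAND behave identically to the built-ins")
```

The same derivations would hold if `nand` were implemented with relays or chemical switches, which is the sense in which the operations, not the medium, define the computation.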

 


3 minutes ago, Strange said:

Well, information in neurons is conveyed by chemical signals rather than electric currents. But, yes, the communication is another analogy that could be made (I don't think that is what Gees intended though.)

There is a continuous transfer of electrons in either, and it's actually electrochemical in organic substrates. By 'information', I mean the pattern of the impulses.


1 minute ago, StringJunky said:

There is a continuous transfer of electrons in either and it's actually electrochemical in organic substrates. By 'information', I mean the pattern of the impulses.

Agree completely :) (Didn't mean to sound as if I was disagreeing.)


6 minutes ago, Strange said:

I don't see why. Computation is not defined by the communication mechanism but by the operations performed. As far as we know the brain and a computer are limited to the same set of operations and the same limits of computability. So nothing we currently know says that strong AI is impossible. Apart from a belief in some magical "extra" that defines the human mind.

Fair point, but I did say "our current iteration of computers".


Just now, dimreepr said:

Fair point, but I did say "our current iteration of computers".

But (as far as we know) there is no other type of computation that can do things that current computers can't do.

This is often used to argue that either:

a) Computers must be able to do everything the human mind can (sentience, creativity, emotions, etc); or

b) The brain can't be just a computer; there must be "something else" involved.

The question is how can we ever know. If one day there is an AI that claims to be sentient, says it loves beautiful music, falls in love, cries in a romantic movie, writes poetry, etc. then some people will say it is just pretending to experience those things because that is what it is programmed to do. Others will say that is no different from humans.

So, even if we were to create a "real" AI, it won't settle the argument!


4 minutes ago, Strange said:

The question is how can we ever know. If one day there is an AI that claims to be sentient, says it loves beautiful music, falls in love, cries in a romantic movie, writes poetry, etc. then some people will say it is just pretending to experience those things because that is what it is programmed to do. Others will say that is no different from humans.

So, even if we were to create a "real" AI, it won't settle the argument!

An AI that objects to being turned off is evidence that may sway the argument, but if they "take our jobs" it probably won't.


14 minutes ago, Strange said:

But (as far as we know) there is no other type of computation that can do things that current computers can't do.

This is often used to argue that either:

a) Computers must be able to do everything the human mind can (sentience, creativity, emotions, etc); or

b) The brain can't be just a computer; there must be "something else" involved.

The question is how can we ever know. If one day there is an AI that claims to be sentient, says it loves beautiful music, falls in love, cries in a romantic movie, writes poetry, etc. then some people will say it is just pretending to experience those things because that is what it is programmed to do. Others will say that is no different from humans.

So, even if we were to create a "real" AI, it won't settle the argument!

One could argue: How do I know you are sentient? You could be emulating the behaviour of a sentient being. If we are the sum of our behaviours, then so is an AI, if it performs as we do. I know we have agreed on this before.


2 minutes ago, StringJunky said:

One could argue: How do I know you are sentient? You could be emulating the behaviour of a sentient being. If we are the sum of our behaviours, then so is an AI, if it performs as we do. I know we have agreed on this before.

Indeed but the real question is, do I care...


4 minutes ago, StringJunky said:

One could argue: How do I know you are sentient? You could be emulating the behaviour of a sentient being. If we are the sum of our behaviours, then so is an AI, if it performs as we do. I know we have agreed on this before.

Exactly. The Turing test was suggested as a solution to this, but it seems to be insufficient. 

2 minutes ago, dimreepr said:

Indeed but the real question is, do I care...

Just realised that there are 10 pages of this thread, most of which I have not read. I don't want to rehash old arguments (I'm sure others are quite capable of doing that!) so maybe I should drop out again now ...

12 minutes ago, dimreepr said:

An AI that objects to being turned off is evidence that may sway the argument, but if they "take our jobs" it probably won't.

An AI that demands to see a priest before you turn it off might be a good indication!


FreeWill;

 

On 5/19/2019 at 7:30 AM, FreeWill said:

Why would it be an oxymoron?

I know you think sentience requires subjective unconsciousness, but just because you think like that, it does not mean objective sentience of AIs or humans is prohibited or unachievable. 

Sentience is the ability to feel, perceive, or experience subjectively -- according to Wiki. Subjectivity is an inner perspective and experience of information. Objectivity is an outward perspective and experience of information. Subjectivity and objectivity are opposites. You can look this up if you don't believe me.

So when you say "objective sentience", what you are really saying is something like, "I was in the front yard all morning making beds and cleaning bedrooms." This makes no sense, as beds and bedrooms are not IN the front yard. An oxymoron is when you put opposing words together that make no sense or are impossible.

I never stated that AI sentience was prohibited or unachievable. I said it didn't seem likely and I didn't like the idea.

Feedback is important to me, so if you have more questions, I will be happy to address them. You already helped me once when you showed me that my use of the word "digital" was confusing people, so I studied until I found "discrete data", which I hope conveys my meaning better. Thank you for your time.

Gee

 

Strange;

 

On 5/19/2019 at 8:15 AM, Strange said:

That is a ridiculous argument. That would be like someone at the time when only black and white television existed saying that the concept of "colour television" is hard to accept because they have only seen B&W TV.

Lots of "concepts" are interesting; some are potentially real and others are just imagination. Are you criticizing me for asking for evidence? Isn't this a Science center?

 

Quote

It is also a straw man because no one has claimed that any form of current AI is sentient and has a "self".

Have you read the title of this thread?

 

Quote

You often claim to be a "philosopher" and yet you are willing to make up your own meanings for words. Clarity of expression is at the heart of philosophy. You need to try harder.

Maybe I worked too long in law; lawyers are notorious for making up their own meanings and words -- as are philosophers -- think Vitalism. No, it is not a power drink or a vitamin. :D

Would it help if I explained that ten plus years ago, when I had my last major attack of MS, I lost some cognitive skills, at least half of my vocabulary, and the ability to read? It took two years to learn to read again, then I spent the next five years walking around with a dictionary or thesaurus, because I could not think of, or find, words that I know damned well I know. Luckily, things that I already understood remained in my understanding, and only new technologies that I had never studied became difficult to learn. I had never experienced difficulty learning anything that I wanted to -- prior to that attack. 

Because I did not lose my understanding of things, it is easy to see that your insistence that I am not a real philosopher is in fact a personal attack. Do you understand that it is a personal attack?

 

Quote

 

This is even worse.

One can draw an analogy between the fact that there is an image in the mirror (and not "reality") and the way that our brain constructs our perception of external "reality".

But it is just an analogy. You cannot use it to draw conclusions about the behaviour or nature of the brain. Because the brain is not a mirror. 

Basically, you are confusing the image in the mirror for the real thing by trying to make this argument.

 

Actually, I was trying to clarify our understanding of consciousness with this analogy. I am certain that you are aware of the illusion theories of consciousness, which imply that everything we know is just illusion because it is not direct experience. Illusion theories are referring to the mirror. Most people believe that their consciousness is the rational conscious aspect of mind, where we think our thoughts and plan our days. These people are also referring to the mirror. So anyone who thinks that consciousness is thought is actually supporting the illusion theories, as thought is not direct experience; it is knowledge of, or a reflection of, direct experience.

We do have direct experience, but it works through the unconscious aspect of mind, which in turn is interpreted into, or reflected into, the conscious aspect of mind. In the unconscious we feel, perceive, and sense things directly, which is why all life can feel, sense, or perceive: it is conscious. The unconscious is actually our consciousness. Self also sources through the unconscious, which is only one reason why it is such a confusing subject.

So this is the reason that I don't support the illusion theories: I think they are based on a false premise, that consciousness is thought. I suspect that AI sentience is also based on that same premise. There are people who believe that if there is enough thought moving fast enough, it will become analogue and consciousness will "emerge". This theory is, I believe, based on "complexity" and the idea that the unconscious aspect of mind sources from the conscious aspect. Maybe it can work that way, but I haven't seen it, so I am not buying it.

 

Quote

Or maybe the unconscious part of the mind is just a different sort of processing going on in the brain.

Why does it have to be in the brain? For myself, I don't see a big difference between chemicals floating around in the brain and chemicals floating around in an ecosystem.


 

Quote

 

The analogy with electricity is wrong. Electricity provides energy to power the computer, The thing that provides the energy to power the brain is biochemistry (and ultimately the food we eat).

One could draw a dubious analogy between the conscious mind and application software, on the one hand, and the unconscious mind and the operating system on the other (most people are not aware of what the operating system is doing behind the scenes, in the same way they are unaware of the unconscious processing the mind does). In both cases, the application software and the operating system, they are just programs being run by the computer.

This is not a great analogy (I am not a fan of analogies) but it is better than the "electricity" one.

 

Fine. Consciousness is the application software. The unconscious is the operating system. What empowers the system? I think we are back to electricity.

Do you remember Watergate? The reporters who broke the story said that they "followed the money" to learn the truth. Why did they follow the money? Because money is power; it is causal and traceable. When I study consciousness, I am looking for the power, and I have traced it to the unconscious, which I am studying now.

 

On 5/19/2019 at 9:45 AM, Strange said:

Well, information in neurons is conveyed by chemical signals rather than electric currents. But, yes, the communication is another analogy that could be made (I don't think that is what Gees intended though.)

I don't know why you would assume that my intention is otherwise. I have often stated that consciousness is essentially communication.

After watching a video where Feynman tried to explain magnets, I brought up that idea in a thread. The silly twits in the thread thought that I was comparing myself to Feynman and got angry. I was not. What I was doing was noting the commonalities between the problem of understanding consciousness and the problem of understanding magnets, as I see similarities.

Feynman explained that electricity was and was not the source of magnetism, then went on to state that we would have to be students in his class to understand that. That was the first time, and probably the only time, that I wished I had turned to Science so that I could have been a student in his class. An understanding of bonding and physics will be necessary to actually understand consciousness. imo

Gee


4 hours ago, Gees said:

Sentience is the ability to feel, perceive, or experience subjectively -- according to Wiki. Subjectivity is an inner perspective and experience of information.

I think it is not as clear as Wiki makes it seem:

In modern Western philosophy, sentience is the ability to experience sensations

Sensation is an animal's, including humans', detection of external or internal stimulation. (e.g., eyes detecting light waves, ears detecting sound waves).

I think gathered information is based on reality, even if the individual has a subjective perception.

An AI can have billions of IoT devices providing data, while registering and analyzing billions of people's perceptions of a scenario, which opens the possibility of a factual recognition of that scenario.

4 hours ago, Gees said:

Objectivity is an outward perspective and experience of information.

I do not really understand what you want to say with this, but here is a thought about objectivity.

Wiki:

Subjectivity:

  • Some information, idea, situation, or physical thing considered true only from the perspective of a subject or subjects.

Objectivity in science is an attempt to uncover truths about the natural world by eliminating personal biases, emotions, and false beliefs.

Objectivity is a philosophical concept of being true independently from individual subjectivity caused by perception, emotions, or imagination. 

An AI can have all of the information about a scenario to analyze, including the AI's own perception as well as the sensations and perceptions of the humans involved. Since the same information comes from multiple sources, objectivity can be an option.

 

Edited by FreeWill
