
AI sentience


Prometheus


25 minutes ago, dimreepr said:

What about my worm?

It is much more difficult to measure the responses, but the body reacts to pain (electric shock) and actively acts on its environment with the capabilities it has.

Sadly, we cannot ask them about it because of the lack of communication routes and shared intelligence (a bit like the gap between an advanced universal AI and humanity).

15 minutes ago, dimreepr said:

Not really, or at all...

We cannot ask them about their exact degree of self-awareness because they do not hear and cannot speak. (For sensing, they have only a few light-sensitive cells over the body and tactile hairs.)

We have 500 million years of evolutionary difference between worms and humans.

I think the difference between biological humanity and an advanced AI that has existed for 500 million years in a technological singularity will be similar to the difference between humans and worms (physically too).

Edited by FreeWill

On 4/30/2019 at 2:06 PM, dimreepr said:

My dog, by all conventional thinking, is not self-aware, but I have no doubt that she has a sense of self; perhaps if I add preservation to "a sense of self" it'll become clear.

Here is a test of self-awareness in animals, including ants (Wikipedia):

https://en.wikipedia.org/wiki/Mirror_test

Insects

  • In a Belgian study from 2015, 23 out of 24 adult ants scratched at small blue dots painted on their clypeus (part of their "face") when they were able to see the dot in a mirror. According to the purported results, the ants were individually tested and were from three species, Myrmica sabuleti, Myrmica rubra and Myrmica ruginodis. None of the ants scratched the clypeus when they had no mirror to see the dot. None tried to scratch the blue dot on the mirror. When they had a mirror and a brown dot similar to their own color, only one of thirty ants scratched the brown dot; researchers said she was darker than average so the dot was visible. They also reacted to the mirror itself. Even without dots, 30 out of 30 ants touched the mirror with legs, antennae and mouths, while 0 of 30 ants touched a clear glass divider, with ants on the other side. Ants a few days old did not react to the dots. These three species have limited eyesight, with 109–169 facets per eye, and the authors suggest doing tests on ants with more facets (some have 3,000) and on bees.[3][29][30]
Edited by FreeWill

16 minutes ago, FreeWill said:

Here is a test of self-awareness in animals, including ants (Wikipedia):

https://en.wikipedia.org/wiki/Mirror_test

Insects

  • In a Belgian study from 2015, 23 out of 24 adult ants scratched at small blue dots painted on their clypeus (part of their "face") when they were able to see the dot in a mirror. According to the purported results, the ants were individually tested and were from three species, Myrmica sabuleti, Myrmica rubra and Myrmica ruginodis. None of the ants scratched the clypeus when they had no mirror to see the dot. None tried to scratch the blue dot on the mirror. When they had a mirror and a brown dot similar to their own color, only one of thirty ants scratched the brown dot; researchers said she was darker than average so the dot was visible. They also reacted to the mirror itself. Even without dots, 30 out of 30 ants touched the mirror with legs, antennae and mouths, while 0 of 30 ants touched a clear glass divider, with ants on the other side. Ants a few days old did not react to the dots. These three species have limited eyesight, with 109–169 facets per eye, and the authors suggest doing tests on ants with more facets (some have 3,000) and on bees.[3][29][30]

 

AGAIN, what is your point?

Edited by dimreepr

I suspect he’s challenging your assumptions about which life forms are and are not self-aware. It’s premature at best to root any arguments in such questionable premises. 


14 minutes ago, iNow said:

I suspect he’s challenging your assumptions about which life forms are and are not self-aware. It’s premature at best to root any arguments in such questionable premises. 

Which premise? Life forms or self-awareness?


54 minutes ago, dimreepr said:

Which premise? Life forms or self-awareness?

Your question doesn't make sense in context. You asserted humans are self-aware, but dogs are not. When challenged, you moved the arbitrary threshold and instead asked about worms. 

The short answer is we don't really know. Maybe worms are aware. Freewill shared evidence suggesting bugs and ants are. 

Either way, you made an assertion that has been shown to be lacking. You seem to be suggesting that assertion applies equally to computers and AI, even though the original assertion (or "premise") is itself quite questionable.

The point you keep asking about is that we don't yet know enough to make such firm assertions in the way you've done throughout the thread.

/tryingtohelp


Prometheus:

I have been offline for a while now, but am actually going to respond in this thread. I read the entire thread and there is so much disinformation and misunderstanding about consciousness in it that I felt I should at least try to shed some light.

 

On 4/4/2019 at 12:48 PM, Prometheus said:

What's people's opinions on this: can AI become sentient?

Taking the wikipedia definition:

 Sentience is the capacity to feel, perceive or experience subjectively

Can a fundamentally quantitative system really experience subjectivity?

Personally, given sentience has evolved at least once on Earth, I don't see why it can't manifest from a different substrate. But that's similar reasoning to: given I'm alive at least once, I don't see why I can't live again...

I have no idea if AI will become sentient. I don't expect it to, but can not be sure that it is impossible either.

First I think it would be a good idea to define some terms. Sentience is a feeling, and feeling is experienced subjectively. People will state that you can put a sensor on some equipment and that makes it sentient -- it does not. Although the equipment may be able to sense things, that does not imply a subjective experience. The real question of whether or not AI is sentient is whether or not it can have subjective experience.

Also it is important to not confuse sentient (feeling) with sapient (thought) -- they are not the same thing. AI is sapient, or can be.

All life, ALL life, is sentient. It is one of the tests we use to determine life. I got that from a working neurologist who was a moderator in another science forum and who was also working on AI projects. How do we know that all life is sentient? Because all life has survival instincts, and survival instincts all work through feeling/emotion. We now call survival instincts "self-preservation" because someone thought it was a better explanation, as life will do whatever it can, including adapting, to preserve not only its individual self but also its species.

So now we are down to consciousness. Consciousness is information, yes, but it has different forms. Consciousness is not one simple pure thing. Most people associate it with thought, and that is true, but only a partial explanation. Consciousness is what we think, what we know, what we remember, what we are aware of, what we feel, and our emotions. There are six basic components to consciousness that together make up the various mental aspects that we experience. 

Of these components, three, thought, knowledge, and memory, are digital. They make up the rational conscious aspect of mind and require a brain in order to be known because the brain digitalizes consciousness into thoughts. The brain actually produces thought. AI is a representation of this digitalized, logical, rational aspect of mind.

The other three components, awareness, feeling, and emotion, are analogue. They make up the unconscious aspect of mind and are not known, but are experienced. This is why the unconscious aspect of mind is un-conscious, because we can not know the information unless it is digitalized by the brain. When we have a fever, the body has been working on and aware of the problem for hours or even days before we know about it. When we have an instinctive reaction, we don't know about it and do not even try to control it until we experience it and the information is then digitalized into knowledge of that experience by the brain.

A daffodil has knowledge of how to grow, maintain itself, and reproduce, probably in its DNA, but it does not consciously know this. It reacts to the sun and to water by analogue feeling and awareness of a need, not with knowledge or thought -- because it does not have a brain.

So where does subjective experience come into this? Well it appears to show up in the analogue aspects of consciousness because all life is sentient -- no brain required. So can a representation of a brain, digital thought, actually cause analogue experience? I don't see how. According to evolution, the exact opposite is how it originally worked, analogue to digital.

 

Intrigued:

On 4/8/2019 at 12:19 PM, Intrigued said:

The day they develop religion and start sacrificing calculators.

You know, this is actually kind of true. We study other species and have found some primates and elephants that seem to be developing some kind of death rituals when one of their species dies. It is an indication of their intelligence that they are beginning to understand a type of spirituality, which is actually an awareness of the unconscious. So if AI is already intelligent and develops an analogue unconscious, what would happen?

 

wtf:

On 4/23/2019 at 10:24 PM, wtf said:

My seeming rage? Is there a language issue? I'm poking on the concept of emergence. I think it's quite murky. There are philosophers who agree with me. I guess I can't engage in further discussion till I get clarity here. If I'm upsetting you somehow I'll stop replying to you.

I don't care if you call consciousness emergent. I'm trying to get you to explain to me what that adds to anyone's understanding of consciousness. If you can't tell me that, then perhaps emergence isn't a useful concept.

I also have a problem with the emergence idea, but maybe I can give you a little information. 

When people talk about consciousness, they generally mean the rational conscious aspect of mind -- thought. Thought comes from the brain, so you could call it emergence because the brain produces digital consciousness (thought) out of analogue consciousness (experience).

All life possesses analogue consciousness, feeling and awareness, so that is the actual source point. I suspect at this point in my studies that we are going to have to learn a great deal more about bonding before we can answer the questions of life and beginning consciousness. We are going to have to accept that the physical and the metaphysical join to create life. How bonding works is the immediate question. imo

Gee


1 hour ago, Gees said:

survival instincts all work through feeling/emotion.

Hello Gees, 

What do you mean by this? Isn't survival instinct a genetically predetermined reaction, refined by experience and knowledge, which can be triggered by sensation?

 

1 hour ago, Gees said:

Of these components, three, thought, knowledge, and memory, are digital. They make up the rational conscious aspect of mind and require a brain in order to be known because the brain digitalizes consciousness into thoughts. 

What do you mean by this? My knowledge, memory, and thoughts are digital? I think they CAN be digitalized, but fundamentally they are physical. The brain perceives and organizes information and, based on that, gives a reaction. Different levels of knowledge, sentience, and awareness give different levels of consciousness, but they are still physical.

 

1 hour ago, Gees said:

The other three components, awareness, feeling, and emotion, are analogue. They make up the unconscious aspect of mind and are not known, but are experienced.

What do you mean by this? Why would it be unconscious if I love or hate someone? I can be (consciously) aware of my abilities. I think I can know about my feelings, emotions, etc.

 

1 hour ago, Gees said:

A daffodil has knowledge of how to grow, maintain itself, and reproduce, probably in its DNA, but it does not consciously know this.

How do you know that it is absolutely unaware of its physical properties?

Consciousness has different levels. How can you determine the minimum level of sentience, and thereby of consciousness? (Bacteria can sense the environment...)

 

1 hour ago, Gees said:

the brain produces digital consciousness (thought)

Why do you call thoughts digital consciousness? Can you elaborate on this? It is unclear what you mean by it.

Edited by FreeWill

FreeWill:

 

34 minutes ago, FreeWill said:

Hello Gees, 

What do you mean by this? Isn't survival instinct a genetically predetermined reaction, refined by experience and knowledge, which can be triggered by sensation?

I have no doubt that survival instincts are genetically predetermined. They work through chemistry, mostly hormones and pheromones.

If you go to Wiki and look up hormones, about halfway down the page there is a listing of different things that hormones control in bodies. You will find that survival instincts are contained within that list. All multicellular species have hormones and pheromones. Survival instincts, whether we are talking about the need to sleep, eat, protect our young, feel lusty, be afraid of a big bear, or any other survival instinct, are all activated by or through some kind of feeling/emotion. This isn't terribly surprising, since hormones affect emotion and emotion affects hormones -- it is circular.

 

Quote

What do you mean by this? My knowledge, memory, and thoughts are digital? I think they CAN be digitalized, but fundamentally they are physical. The brain perceives and organizes information and, based on that, gives a reaction. Different levels of knowledge, sentience, and awareness give different levels of consciousness, but they are still physical.

Yes. Thoughts are digital. Imagine a bucket of sand (digital) and a bucket of water (analogue). It might take a while, but you can actually count the grains of sand; you cannot count the pieces of water. Thoughts are individual, like the sand.
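If it helps, here is a loose programming analogy only, not a claim about how brains or minds actually work: "digital" here means countable, discrete steps, the way sampling and rounding turns a smooth signal into a finite list of values you could count like grains of sand. (The function names and numbers below are invented for the illustration.)

```python
# A loose analogy only, not a model of the brain or of consciousness:
# "digitalizing" here just means turning a continuous (analogue) quantity
# into a finite list of countable, discrete (digital) steps.
import math

def sample_and_quantize(signal, step=0.25, n_samples=8):
    """Sample a continuous function at a few points and round each value to a fixed step."""
    samples = [signal(i / n_samples) for i in range(n_samples)]
    return [round(s / step) * step for s in samples]

def analogue(t):
    # A smooth, continuously varying signal -- like the water in the bucket.
    return math.sin(2 * math.pi * t)

# The result is a finite, countable list of values -- like the grains of sand.
print(sample_and_quantize(analogue))
```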

Maybe this will help to explain the difference. Have you heard that memory is not reliable? Actually it is pretty reliable, but emotional memory is not. This is because emotion is analogue, fluid, so it can actually change your memory to something that never happened, or exaggerate your memory, or even erase your memory. This has been well documented and is one of the primary reasons for debriefing. It is important to record the memory of an emotional situation as soon as possible, preferably within 24 hours, before your emotions have time to change the memory.

 

Quote

What do you mean by this? Why would it be unconscious if I love or hate someone? I can be (consciously) aware of my abilities. I think I can know about my feelings, emotions, etc.

This is harder to explain, and I probably won't be able to do it fully, but will give you some examples. 

The unconscious aspect of mind is ruled by emotion -- not rational thought. This is where prejudice comes from, when we make a biased decision and don't realize why we did it. This is also what Psychology studies, when we have behaviors that do not match up with the circumstance that produced them. This is also where instincts, intuition, imagination, dreams, and a lot of other feelings and thoughts originate, and these are not things that we planned or thought out. The conscious mind is something that we direct, the unconscious is not. The unconscious is where we are always playing catch up.

Another point is that emotion is often difficult to verbalize and to put into thought. We often express emotion through art, music, dance, or poetry because, being analogue, it is easier to communicate that way.

 

Quote

 

How do you know that it is absolutely unaware of its physical properties?

Consciousness has different levels. How can you determine the minimum level of sentience, and thereby of consciousness? (Bacteria can sense the environment...)

 

Because daffodils don't have a brain. Thought comes from the brain. Do not confuse awareness with thought, as they are two very different things. There is no testing that I know of that claims a daffodil is self-aware. Actually, I know of no testing showing that any species without a brain is self-aware.

I did not determine these levels and suspect that Philosophy and Biology had a hand in that determination.

Gee


Hi Gee, thanks for the input. I've been a spectator on this thread, because it's a subject I need to think about carefully and I haven't had the time recently, and everyone else is contributing so much I've been able to watch and learn. But here are some quick, and likely muddled, thoughts on your contribution.

 

3 hours ago, Gees said:

Sentience is a feeling, and feeling is experienced subjectively. People will state that you can put a sensor on some equipment and that makes it sentient -- it does not.

Perhaps the salient feature for sentience is that sensors are somehow linked to give a representation of the external world. A new sensory input will need to fit into that existing representation. But this is no different in humans: an eye in isolation doesn't 'see' anything, it needs a brain to parse the information.
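As a purely illustrative sketch of what I mean by sensors feeding a shared representation (the class and names below are invented, not any real system): an isolated reading means little on its own; it only gains significance once it is fitted into a model the system already holds.

```python
# A purely illustrative sketch: a reading only "means" something once it is
# integrated into an existing representation of the world. All names invented.
from dataclasses import dataclass, field

@dataclass
class WorldModel:
    state: dict = field(default_factory=dict)

    def integrate(self, sensor: str, reading: float) -> None:
        # A new input is reconciled with what the model already holds:
        # here, a simple running average rather than a raw overwrite.
        old = self.state.get(sensor)
        self.state[sensor] = reading if old is None else (old + reading) / 2

model = WorldModel()
model.integrate("temperature", 21.0)   # a lone reading, now placed in context
model.integrate("temperature", 23.0)   # the next reading has to fit the existing picture
print(model.state)                     # {'temperature': 22.0}
```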

 

3 hours ago, Gees said:

and survival instincts all work through feeling/emotion.

I'm not sure that's true. If a flame touches my skin the reflex arc doesn't go to my brain, it's initiated by the spine (IIRC) - then there is a delayed (milliseconds) feeling as the brain catches up with what's happening in the body and I feel pain.

 

3 hours ago, Gees said:

Of these components, three, thought, knowledge, and memory, are digital. They make up the rational conscious aspect of mind and require a brain in order to be known because the brain digitalizes consciousness into thoughts. The brain actually produces thought. AI is a representation of this digitalized, logical, rational aspect of mind.

The other three components, awareness, feeling, and emotion, are analogue. They make up the unconscious aspect of mind and are not known, but are experienced. This is why the unconscious aspect of mind is un-conscious, because we can not know the information unless it is digitalized by the brain. When we have a fever, the body has been working on and aware of the problem for hours or even days before we know about it. When we have an instinctive reaction, we don't know about it and do not even try to control it until we experience it and the information is then digitalized into knowledge of that experience by the brain.

 

My understanding was that all cognitive processes contain both analogue and digital neurophysiology. Do neuroscientists now believe that memory, for instance, is an entirely digital process? Do you have any references I could look up regarding this?


35 minutes ago, Prometheus said:

My understanding was that all cognitive processes contain both analogue and digital neurophysiology. Do neuroscientists now believe that memory, for instance, is an entirely digital process? Do you have any references I could look up regarding this?

Does this sound like a digital process?

Quote

Since the early neurological work of Karl Lashley and Wilder Penfield in the 1950s and 1960s, it has become clear that long-term memories are not stored in just one part of the brain, but are widely distributed throughout the cortex. After consolidation, long-term memories are stored throughout the brain as groups of neurons that are primed to fire together in the same pattern that created the original experience, and each component of a memory is stored in the brain area that initiated it (e.g. groups of neurons in the visual cortex store a sight, neurons in the amygdala store the associated emotion, etc). Indeed, it seems that they may even be encoded redundantly, several times, in various parts of the cortex, so that, if one engram (or memory trace) is wiped out, there are duplicates, or alternative pathways, elsewhere, through which the memory may still be retrieved.

Therefore, contrary to the popular notion, memories are not stored in our brains like books on library shelves, but must be actively reconstructed from elements scattered throughout various areas of the brain by the encoding process. Memory storage is therefore an ongoing process of reclassification resulting from continuous changes in our neural pathways, and parallel processing of information in our brains.

http://www.human-memory.net/processes_storage.html
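As a toy illustration of the "reconstructed from scattered, redundant components" idea in that quote (just bookkeeping, not a neural model; the region names and the stored "memory" below are all invented):

```python
# A toy illustration only, not a neural model: each component of a "memory"
# lives in a different store, some components are duplicated, and recall means
# reconstructing the whole from whatever scattered pieces survive.

stores = {
    "visual_cortex": {"beach_day": "sight of waves"},
    "auditory_cortex": {"beach_day": "sound of gulls"},
    "amygdala": {"beach_day": "feeling of calm"},
    "backup_cortex": {"beach_day": "sight of waves"},  # redundant duplicate of the visual trace
}

def recall(memory_id, damaged=()):
    """Reconstruct a memory from every intact store; damaged regions simply drop out."""
    pieces = []
    for region, traces in stores.items():
        if region in damaged:
            continue
        if memory_id in traces and traces[memory_id] not in pieces:
            pieces.append(traces[memory_id])
    return pieces

print(recall("beach_day"))                              # all components gathered
print(recall("beach_day", damaged=("visual_cortex",)))  # the redundant copy still supplies the sight
```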

 

Edited by StringJunky

1 hour ago, Gees said:

I have no doubt that survival instincts are genetically predetermined. They work through chemistry, mostly hormones and pheromones.

No, they work through the nervous and musculoskeletal systems, which are supported and regulated by the hormonal system.

1 hour ago, Gees said:

listing of different things that hormones control in bodies. You will find that survival instincts are contained within that list.

Can you provide the link, please?

I know that hormones regulate a lot of chemistry-related processes in the body, which can have a role in surviving a scenario, but they do not control the body; they regulate cell functions.

1 hour ago, Gees said:

Yes. Thoughts are digital. Imagine a bucket of sand (digital) and a bucket of water (analogue). It might take a while, but you can actually count the grains of sand; you cannot count the pieces of water. Thoughts are individual, like the sand.

I can count every silicon atom in every grain of sand and every hydrogen and oxygen atom in the bucket of water. I can also know how many H2O molecules are in the bucket.

Thoughts have an individual place in spacetime when they occur, but they are plastic, so they change with time and experience, thinking, learning...

I can be aware of my past understandings (thoughts), and of how they have been changing over time.

Thoughts can be common as well, for example when we come to understand shared scientific or philosophical insights.

 

Edited by FreeWill

20 hours ago, iNow said:

Your question doesn't make sense in context. You asserted humans are self-aware, but dogs are not. When challenged, you moved the arbitrary threshold and instead asked about worms. 

The short answer is we don't really know. Maybe worms are aware. Freewill shared evidence suggesting bugs and ants are. 

Either way, you made an assertion that has been shown to be lacking. You seem to be suggesting that assertion applies equally to computers and AI, even though the original assertion (or "premise") is itself quite questionable.

The point you keep asking about is that we don't yet know enough to make such firm assertions in the way you've done throughout the thread.

/tryingtohelp

Ok, yes, I got sidetracked, but in the context of the OP, whether an ant is or isn't self-aware is of little importance when applied to a rock or a machine. When an AI objects to being turned off, I'll accept that as evidence that the machine can dream.

Edited by dimreepr

On 4/29/2019 at 4:58 PM, iNow said:

Dim said computers only do what they’re told. That’s patently untrue since they do things beyond their base programming every day. 

...

I hope a clearer understanding is emerging for us all through your continued reminding us that you find the term emergent property lacking. 

Computers (hardware + software) can only do exactly what they're told. If they're told to search a corpus of data, build a neural net with weighted nodes based on the data, and produce outputs based on the weightings, then that's exactly what they do. It is not magic; it's deterministic coding. [And even with randomness, nondeterministic automata don't compute anything new.]
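To make the determinism point concrete, here is a minimal, purely illustrative sketch of such a pipeline (the tiny corpus, the function names, and the counting-based "weights" are all invented for the example). Run it on the same data with the same code and it produces the same output every time; nothing happens beyond what the program says.

```python
# A minimal, purely illustrative sketch (not any real system): the "training"
# and the output below are completely determined by the code plus the data.
corpus = [
    ("hello", "greeting"),
    ("goodbye", "farewell"),
    ("hello there", "greeting"),
]

def learn_weights(examples):
    """Count how often each word co-occurs with each label -- a crude 'weighted node' per word."""
    weights = {}
    for text, label in examples:
        for word in text.split():
            weights.setdefault(word, {}).setdefault(label, 0)
            weights[word][label] += 1
    return weights

def respond(weights, text):
    """Score each label by summing word weights; the answer follows mechanically from the weights."""
    scores = {}
    for word in text.split():
        for label, w in weights.get(word, {}).items():
            scores[label] = scores.get(label, 0) + w
    return max(scores, key=scores.get) if scores else "unknown"

weights = learn_weights(corpus)
print(respond(weights, "hello"))  # always "greeting"; rerunning never produces anything new
```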

I'm not sure if the emergence remark is sarcastic or not. I'm not talking about emergence anymore. I've heard about it and read about it for years and my opinion is, rightly or wrongly, the product of some thought.

On 4/30/2019 at 5:06 AM, dimreepr said:

It basically means we don't know anything other than it's more than the sum of its parts.

Yes but that's true of EVERYTHING. My chair is more than a bunch of atoms. My liver is more than a bunch of molecules. The Mormon Tabernacle Choir is more than a bunch of singers. So yes, emergence is true. I don't deny that. It's just useless because it tells us nothing. Practically everything is more than the sum of its parts. 

Edited by wtf

If you add 2 cm to 3 cm you get 5 cm; that is nothing more than the sum of its parts, because it is exactly that, although it can also be the sum of different parts, such as 1 cm and 4 cm, and I get your Feynman point about energy applied to emergence. I cannot tell how useful a concept emergence is, but I find addition a useful operation, much more useful than the sophisms I just demonstrated. There surely is mathematics to express emergent phenomena that does not require questioning the usefulness of addition.


8 hours ago, wtf said:

Yes but that's true of EVERYTHING. My chair is more than a bunch of atoms. My liver is more than a bunch of molecules. The Mormon Tabernacle Choir is more than a bunch of singers. So yes, emergence is true. I don't deny that. It's just useless because it tells us nothing. Practically everything is more than the sum of its parts. 

Does it matter? It's a word that describes something we know almost nothing about; I'd call it a wogawomphtamoph, which has a much better phonemic ring.


Prometheus:

On 5/2/2019 at 7:49 AM, Prometheus said:

Hi Gee, thanks for the input. I've been a spectator on this thread, because it's a subject I need to think about carefully and I haven't had the time recently, and everyone else is contributing so much I've been able to watch and learn. But here are some quick, and likely muddled, thoughts on your contribution.

Hi. You have mentioned the brain in two of the following quotes and neurology in the other. If you truly want to understand what I am talking about, the first thing to do is understand that I am not talking about the brain. I study consciousness and mind, the brain being only ancillary to that study. Yes, I know that most people think that the brain and mind are the same thing, but there is little evidence to support that idea. Most of the evidence that we have associates the brain with the rational conscious aspect of mind, the Ego. The Ego is very small when compared to the massive unconscious aspect of mind, the Superego, and we still don't know what the parameters of mind actually are. Some theories suggest that it is the entire Universe. I haven't gone that far, but you can be assured that I am not just talking about the brain.

 

Quote

Perhaps the salient feature for sentience is that sensors are somehow linked to give a representation of the external world. A new sensory input will need to fit into that existing representation. But this is no different in humans: an eye in isolation doesn't 'see' anything, it needs a brain to parse the information.

If the "salient feature" is as you state, then there is a real probability that surveillance equipment is conscious, and a possibility that my garage door opener is conscious. You would suggest that they have subjective experience, are conscious, and therefore have some sort of mind. Do you really believe that?

 

Quote

I'm not sure that's true. If a flame touches my skin the reflex arc doesn't go to my brain, it's initiated by the spine (IIRC) - then there is a delayed (milliseconds) feeling as the brain catches up with what's happening in the body and I feel pain.

And then what happens when you feel pain? Do you think, "Well, that is not pleasant," and continue to burn? Or do you get away from the flame to protect yourself and then apply first aid? This is self-preservation -- survival instincts. Even a plant will grow its roots toward water and turn its leaves and grow toward the sun to maintain itself. Trees that live along a river have been known to grow their roots into the soil to hold on to life when erosion tries to take them down; they will even grow extra branches over the solid earth to try to preserve their balance -- their life.

 

Quote

My understanding was that all cognitive processes contain both analogue and digital neurophysiology. Do neuroscientists now believe that memory, for instance, is an entirely digital process? Do you have any references I could look up regarding this?

First, understand that I am not talking about the brain or "cognitive processes"; I am talking about the components of consciousness. How the brain processes them is a whole different question.

Do you remember that link you provided when we last talked that proved that the brain is analogue? You were trying to disprove my assertion that the brain is digital. The problem is that I would never state anything so wrong-headed. What I stated is that the brain digitalizes consciousness; it takes analogue consciousness and turns it into thought -- digital consciousness. It would have to be able to process both in order to do that. Now you can dispute that idea, but in doing so, what you are saying is that thought does not come from the brain. Or maybe you are saying that the tree mentioned above actually thinks about preserving itself.

Thoughts are digital; knowledge is essentially thought that is true; memory is generally stored thought or knowledge. If you want examples of entirely digital stored thought/knowledge (memory), then just take away awareness -- a book without a reader, or DNA outside of a body.

 

If it would help you to understand me better, you could look up "fluid and crystallized intelligence" in Wiki. It is not the same work that I do because it is focused on intelligence, but it is comparable. Fluid intelligence has much in common with what I call analogue consciousness, and crystallized intelligence is comparable to what I call digital consciousness.

Gee


On 5/2/2019 at 10:23 PM, Hrvoje1 said:

If you add 2 cm to 3 cm you get 5 cm; that is nothing more than the sum of its parts, because it is exactly that, although it can also be the sum of different parts, such as 1 cm and 4 cm, and I get your Feynman point about energy applied to emergence. I cannot tell how useful a concept emergence is, but I find addition a useful operation, much more useful than the sophisms I just demonstrated. There surely is mathematics to express emergent phenomena that does not require questioning the usefulness of addition.

Was this for me? I sometimes don't read posts unless I get a little red mention rectangle when I come to the site. 

If there were mathematics to describe how consciousness comes out of brain goo, we wouldn't need the name emergence. Just like when we add hydrogen to oxygen in a lab, apply heat, and end up with water. We don't call that emergence, we call it chemistry. A chemist can describe the exact mechanism by which it happens. Likewise when we take pieces of wood and create a chair, we don't call it emergence, we call it carpentry and we know exactly how the process works. It seems that emergence is just a placeholder for processes that we don't understand. And after we call these processes emergent, we don't understand them any more than we did before. 

I am not sure I followed the example of addition. I didn't question the usefulness of addition. But addition of like quantities (centimeters in your example) is not an example of emergence. Consciousness somehow arising from brain goo is called an example of emergence, but that's a long way from addition of small integers.

Edited by wtf

3 hours ago, Gees said:

I study consciousness and mind, the brain being only ancillary to that study. Yes, I know that most people think that the brain and mind are the same thing, but there is little evidence to support that idea.

3 hours ago, Gees said:

First, understand that I am not talking about the brain or "cognitive processes"; I am talking about the components of consciousness. How the brain processes them is a whole different question.

Wait, what now? :doh: It sure ain't up my arse... ;)

4 hours ago, Gees said:

If the "salient feature" is as you state, then there is a real probability that surveillance equipment is conscious, and a possibility that my garage door opener is conscious. You would suggest that they have subjective experience, are conscious, and therefore have some sort of mind. Do you really believe that?

One day, maybe; for the really clever garage door opener. 

When we, and I include you, don't understand consciousness, we can't dismiss the possibility.

 

3 hours ago, wtf said:

Was this for me? I sometimes don't read posts unless I get a little red mention rectangle when I come to the site. If there were mathematics to describe how consciousness comes out of brain goo, we wouldn't need the name emergence. Just like when we add hydrogen to oxygen in a lab, apply heat, and end up with water. We don't call that emergence, we call it chemistry. A chemist can describe the exact mechanism by which it happens. Likewise when we take pieces of wood and create a chair, we don't call it emergence, we call it carpentry and we know exactly how the process works. It seems that emergence is just a placeholder for processes that we don't understand. And after we call these processes emergent, we don't understand them any more than we did before. 

I've changed my mind, emergence is the perfect word; my consciousness emerged from the slime. 

Unless you consider evolution an unsatisfactory explanation.


On 5/3/2019 at 6:23 AM, Hrvoje1 said:

If you add 2 cm to 3 cm you get 5 cm; that is nothing more than the sum of its parts, because it is exactly that, although it can also be the sum of different parts, such as 1 cm and 4 cm, and I get your Feynman point about energy applied to emergence. I cannot tell how useful a concept emergence is, but I find addition a useful operation, much more useful than the sophisms I just demonstrated. There surely is mathematics to express emergent phenomena that does not require questioning the usefulness of addition.

Analogously, emergence is 3+2=6


3 hours ago, StringJunky said:

Analogously, emergence is 3+2=6

In other words, false? 

 

4 hours ago, dimreepr said:

I've changed my mind, emergence is the perfect word; my consciousness emerged from the slime. 

 

Of course it did. As a chair emerges from atoms, my pancreas emerges from electrons, and consciousness arguably (but not yet conclusively) emerges from brain goo.

And once you've told me that, in each case I know nothing I didn't already know before you told me.

Edited by wtf
