Bernd.Brincken

Evidence for Strong AI


Machine learning employs three primary approaches to learning: unsupervised learning, in which structure in data sets (such as visual, audio or text) is sought without any additional information; supervised learning, in which a label is associated with each training instance (e.g. a cat picture is labelled 'cat'); and reinforcement learning, in which a score is used to optimise an algorithm (often used in gaming and anywhere a metric can be applied to the state the algorithm observes at any instant).

A robot may then learn to walk by the experience of continually falling down via reinforcement learning. No words are needed, only a sense of balance.
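As an aside, the trial-and-error loop described above can be sketched in a few lines of tabular Q-learning. The task here is hypothetical (a one-dimensional track rather than a walking robot), but the principle is the same: no labels and no words, only a numeric score fed back after each action.

```python
# Minimal sketch of tabular Q-learning: the agent learns, purely from a
# scalar reward, to step right along a 1-D track until it reaches the goal.
import random

N_STATES = 5          # positions 0..4; position 4 is the goal
ACTIONS = [-1, +1]    # step left or step right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def choose(s):
    # epsilon-greedy: mostly exploit the best-known action, sometimes explore
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(s, a)])

for episode in range(500):
    s = 0
    while s != N_STATES - 1:
        a = choose(s)
        s2 = max(0, min(N_STATES - 1, s + a))
        r = 1.0 if s2 == N_STATES - 1 else -0.1   # a score, nothing more
        best_next = max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = s2

# After training, the greedy policy steps right (+1) from every state.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)}
print(policy)
```

The same loop scales up (with function approximation instead of a table) to the falling-and-standing robot example: the reward signal replaces any linguistic instruction.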

Edited by Prometheus

30 minutes ago, Bernd.Brincken said:

For training you need some kind of medium or language to transfer knowledge from trainer to trainee.
As described in 'interpersonal communication' (WP), many signals are not expressed in words, and may not even be expressible in language.
Experience does not need this.
A (human / AI) 'intelligent' system can interpret such a signal from experience of the same situation.

Training can use exactly the same techniques as "experience".

Training and experience are the same thing.

2 hours ago, Strange said:

Training can use exactly the same techniques as "experience".

Training and experience are the same thing.

Mixing the terms in this way does not make the case for AI any easier 😉

There are non-language interpersonal signals in human relations which rely on shared experiences:
joy, grief, fear, success, embarrassment, mobbing, humor, desperation, optimism, etc.
How do you imagine an AI gathering (or being trained on) these experiences, in order to be able to understand humans?

2 hours ago, Prometheus said:

Machine learning employs .. unsupervised techniques .. supervised learning .. and reinforcement learning ..

A robot may then learn to walk by the experience of continually falling down via reinforcement learning. No words are needed, only a sense of balance.

Yes, this works.
But what is the relation to human-human (or human-AI) interpersonal communication?
Walking either works or it doesn't. As long as I fall, I have to try again, sensibly with modified movements.
In interpersonal communication, and further into society, there is no direct, binary success feedback of this kind.

Edited by Bernd.Brincken

1 hour ago, Bernd.Brincken said:

There are non-language interpersonal signals in human relations which rely on shared experiences:
joy, grief, fear, success, embarrassment, mobbing, humor, desperation, optimism, etc.
How do you imagine an AI gathering (or being trained on) these experiences, in order to be able to understand humans?

Some of these may be instinctive in humans. (Although I don't think that is certain.) In which case, you build the same knowledge into the AI.

Others are learned by experience. And an AI could do the same.

1 hour ago, Bernd.Brincken said:

Yes, this works.
But what is the relation to human-human (or human-AI) interpersonal communication?
Walking either works or it doesn't. As long as I fall, I have to try again, sensibly with modified movements.
In interpersonal communication, and further into society, there is no direct, binary success feedback of this kind.

Walking is not a binary thing, either. The AI robot's first attempt, after falling over once, might be to proceed by falling, dragging itself forward with its arms, standing up and then falling over again. It might progress from that to crawling. Then eventually staggering. Then walking in the most efficient way.

Similar stages of progression could take place in communication skills.

All of your counter arguments seem to consist of: "this thing [which is well known and demonstrated in practical systems] could never happen".

I think you should go and study some examples of machine learning (for communication, interaction, mechanical skills, etc) before dismissing it as implausible. An argument from ignorance or incredulity is never convincing. As I said in my first post.

On 2/10/2020 at 2:02 PM, dimreepr said:

Strong AI (AGI) isn't trying to-be-human.

That depends on what data are delivered to the AI, and how. If an AI is to understand human words, it must be able to see and hear. For an algorithm, "elephant" is just a word, a sequence of characters. The algorithm can read the Wikipedia page about elephants, and every book about the animal, without ever truly understanding it. It will have all the textual information about the animal, like a chatbot repeating the same sentences over and over again: all the information except images, sounds, etc. A human deprived of data from the other senses is unable to imagine. When a human is taught, words are correlated with images, sounds, touch, smell, etc.; together they form the full information about the subject. Try to explain colors and sounds to somebody who has been unable to see or hear since birth. That is rather like an AI without eyes and ears. What does distance mean to somebody who cannot touch, see or hear? Is something far or near? Without eyes and ears it is impossible to explain. "The website or server is far away, therefore it takes a long time to load..." Is that how to explain "distance" to an algorithm?

If a programmer implements an AI using techniques that simulate the human brain, then the AI will identify itself as human..

If I ask you "who are you?", you will answer that you are human. Are you sure? How can you know that? Maybe you are just an AI?

6 hours ago, Sensei said:

If I ask you "who are you?", you will answer that you are human. Are you sure? How can you know that? Maybe you are just an AI?

That might be true, but I'm not trying to be human. 😉

13 hours ago, Sensei said:

That depends on what data are delivered to the AI, and how. If an AI is to understand human words, it must be able to see and hear. For an algorithm, "elephant" is just a word, a sequence of characters. The algorithm can read the Wikipedia page about elephants, and every book about the animal, without ever truly understanding it. It will have all the textual information about the animal, like a chatbot repeating the same sentences over and over again: all the information except images, sounds, etc. A human deprived of data from the other senses is unable to imagine. When a human is taught, words are correlated with images, sounds, touch, smell, etc.; together they form the full information about the subject. Try to explain colors and sounds to somebody who has been unable to see or hear since birth. That is rather like an AI without eyes and ears. What does distance mean to somebody who cannot touch, see or hear? Is something far or near? Without eyes and ears it is impossible to explain. "The website or server is far away, therefore it takes a long time to load..." Is that how to explain "distance" to an algorithm?

Not a perfect correlation, but an AI might gain a sense of distance in terms of its own processing delays and ping times (roughly via t = d/c).

It could match that against a database to gain a sense of the larger world.
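To make the ping-time idea concrete, here is a small hypothetical sketch. It assumes signals in optical fibre travel at roughly two-thirds of the speed of light, and that the round-trip time covers the path twice, so the RTT gives an upper bound on the one-way distance:

```python
# Hypothetical sketch: bounding distance from a network round-trip time (RTT).
# Assumption: signal speed in fibre is approximately (2/3) * c, and the RTT
# covers the path twice, so d <= v * rtt / 2.
C_KM_PER_S = 299_792.458          # speed of light in vacuum, km/s
V_FIBRE = C_KM_PER_S * 2 / 3      # approximate signal speed in optical fibre

def max_distance_km(rtt_ms: float) -> float:
    """Upper bound on one-way distance implied by an RTT given in milliseconds."""
    rtt_s = rtt_ms / 1000.0
    return V_FIBRE * rtt_s / 2

# e.g. a 100 ms ping cannot come from a server farther than ~10,000 km away
print(round(max_distance_km(100)))  # → 9993
```

It is only an upper bound, since routing detours and queueing delay inflate real RTTs, but matched against a database of known hosts it would give exactly the kind of rough world-scale sense described above.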

