AI as a threat


paragaster

An AI does not possess sensitivity or a sense of responsibility towards humanity. Delegating important, life-changing decisions to an AI is a recipe for disaster.

We humans are connected with each other by a single cord: we have failings, and we experience.

None of this is possible for an AI. It does not "experience" the world. It only has "knowledge".

Any decision taken without experience is a disaster.

 

I want a full discussion of the merits and demerits of AI. Please avoid links.


At the moment, AI is limited to specific applications, for example driving a car, diagnosing disease, and playing Go. Each application must be developed to be sensitive to human concerns. There may be some similarity among these applications, but there will certainly be differences. Does your "full discussion" mean exploring the nuances of each application with regard to its sensitivity to human concerns? That would be difficult, since only the developers would know such details. Otherwise, please explain what you mean by a full discussion.


They are likely to have failings of their own. They can always learn things incorrectly or miss something. Honestly, I am more concerned that we will oversell their capabilities than anything else.

 

Your senses provide your brain with an awareness of the larger world. I don't see a fundamental difference between that and the information we give an AI access to.

 

I think we will need to teach them as we do our own children. You wouldn't expect a baby to have much sense of responsibility or sensitivity either.

Edited by Endy0816

Some AI experts believe AI will be able to do anything a human can do, and perhaps more. Some limits we can bound, for example how many words a person knows, but others are unknowable, for example what the most beautiful possible sculpture is. I'm still unsure what you want.


Our experiences influence us more than the knowledge we receive. For example: witnessing a murder has more impact than merely learning about one on TV or social media.

Will an AI, say in the form of a robot, be influenced by a murder, or merely record it and replay it for its owner?

We work beyond our five senses. We will always be in a better position to make decisions than an AI.

Discuss.


Google has already implemented a type of dream state for its neural networks as it aids in learning.

 

In any case, this discussion seems to be based mostly on the sci-fi stereotype of the highly analytical artificial intelligence that has no understanding of human feelings.

 

It's important to remember that this archetypal character is not based on any real artificial intelligence, and there is no particular reason to expect that an advanced AI would behave remotely in the way typically depicted in fiction along these lines.

 

Our most advanced AIs these days are trained, rather than programmed. How they behave is heavily shaped by their experiences in training. They are, effectively, a bunch of circuitry and code built to be able to learn, and what they do depends on what they are taught. One of the reasons Google has been such a front runner in this area is that it has the resources to build large amounts of optimized hardware for the purpose and, just as importantly, access to truly massive amounts of data for training.

 

Artificial neural networks can be trained to do image recognition, voice recognition, and advanced translation; in one of Google's projects, a network learned to recognize cats entirely unprompted after watching hours of YouTube videos.
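As a toy illustration of the "trained, rather than programmed" point: the sketch below is a classic single-layer perceptron, written in plain Python purely as a hypothetical example (it is not any of Google's actual systems). It learns the logical AND function from labelled examples alone; the AND rule itself is never written into the code, only the learning rule is.

```python
# A perceptron learns AND purely from examples.
# Training data: the four input pairs and their AND labels.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]  # weights, learned from data
b = 0.0         # bias, learned from data
lr = 0.1        # learning rate

def predict(x):
    # Fire (output 1) if the weighted sum of inputs exceeds zero.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Perceptron learning rule: nudge the weights after every mistake.
for _ in range(20):
    for x, target in examples:
        error = target - predict(x)
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in examples])  # [0, 0, 0, 1]
```

Change the labels and the same code learns a different function, which is the essential difference from conventional programming. (A single perceptron famously cannot learn XOR; that requires a hidden layer, which is where multi-layer networks like Google's come in.)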

 

This is where we are right now with AI. Expect that, if we do eventually obtain the necessary processing power and algorithmic optimization to pull it off, building a truly advanced AI will be a bit more like training a pet or teaching a child than most science fiction would have led you to believe.

 

(With the caveat that it will not be precisely identical and there is a fair degree of math underlying exactly how these systems operate)


Our experiences influence us more than the knowledge we receive. For example: witnessing a murder has more impact than merely learning about one on TV or social media.

 

 

But most people never experience a murder, so that hardly seems relevant.

 

 

We work beyond our five senses.

 

In what way do we work beyond our senses? (There are many more than 5, by the way.)

 

 

 

We will always be in a better position to make decisions than an AI.

 

How do you know that?


... humans took millions of years to evolve.

 

So? What has that got to do with whether AI can experience?

 

 

We are biological specimens, not engineering cogs.

 

Humans and computers are both made of stuff. Why is biological stuff special regarding the emergence of consciousness?


It is unlikely that an AI could suddenly take over and start making terrible decisions. Simply being able to cut power to the main processor would be sufficient to turn it off. And no rational human would hand the decision to, say, launch nukes to an AI anyway. I think AIs will have a limit to how truly intelligent they can be. Maybe they will be able to learn, analyze things, and make decisions, but they would still make mistakes, just like humans. And if you teach them to have emotion, then you had better give them full rights under the law too, if they can actually experience emotion, because that would be going into seriously dangerous territory. An emotional AI would be much deadlier than a non-emotional AI.

