
In the age of Artificial Intelligence, what should be free?


Alex_Krycek

Recommended Posts

20 hours ago, swansont said:

AI, which is not actually intelligence, is not in a position to provide new information. If you want to do these things, the information that AI would use is already out there.

The info is already out there, but AI makes it easier for ANY evil person to succeed.  It takes an evil GENIUS to figure it out without AI.  Evil and genius are not common qualities in a single person.


16 minutes ago, Airbrush said:

The info is already out there, but AI makes it easier for ANY evil person to succeed.  It takes an evil GENIUS to figure it out without AI.  Evil and genius are not common qualities in a single person.

Prior to AI, assassinations and other heinous crimes happened, and they weren't carried out by geniuses. Do you have evidence that AI has facilitated any such events? From what I’ve seen, AI would produce a plausible-sounding set of instructions riddled with critical factual errors, because that’s what AI currently does.


IT companies own the artificial intelligence they have created. If someone asks Google's or Bing's search engine or AI "how to hide a body" or "where to find zoophilia?", a trail is left in the IT infrastructure that can be used against the person searching for that kind of information.

Undoubtedly, administrators of public search engines are alerted when someone searches for such keywords.


31 minutes ago, Sensei said:

IT companies own the artificial intelligence they have created.

Owning AI LLMs looks like a model that won't prevail. Meta had its system leaked, and that appears to be the one that's going to dominate in the near term as knowledgeable users expand its capabilities. Open source seems to be the future, and Google, MS et al. will have to find new ways to monetize in that environment. The Google engineer who wrote the article says no company can hope to outcompete a decentralized setup that is free for anyone to use. You know a lot more about this stuff than I do, but I suppose it will end up as something like an Android-type ecosystem. Meta may well be the gatekeeper, seeing as early adopters are running away from the closed, private, paid-for systems, making it the de facto public system. The article is lost for the moment, so no link yet.


AI source code is one thing; the database of what an AI has learned is another. A regular John Doe (mentioned by Airbrush) would not be able to teach his own AI at home everything that, for example, ChatGPT or other AIs from global IT companies have learned in their implementations. It would require writing a custom web crawler, an enormous amount of storage, powerful computing infrastructure (a "server room"), a fast fiber Internet connection, many millions of dollars to run it all, etc.

A hacker/programmer would not need AI assistance at all to commit such crimes.

If an evil hacker created his own artificial intelligence without the limits Airbrush described, it would certainly be available only on the dark web, i.e. an ordinary John Doe still would not have access to it (invitation-only, expensive, paid in cryptocurrency, etc.). And if John Doe could access the dark web, he would rather hire someone to do the whole "dirty job" for him.

Note: Muslim terrorists have everything handed to them on a plate - instructions for how to do almost anything are publicly available, even on Wikipedia, and most of them are unable to do anything with it. An artificial intelligence instructing them how to make a bomb would be about as useful to them as the English-language textbooks containing the same information, which they already have access to but are still unable to act on.
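
To make the code-versus-weights distinction concrete: running inference on weights someone else has already trained takes only a few lines of code, while producing those weights is where the crawlers, storage and compute budget go. A minimal sketch, assuming the Hugging Face transformers library and using a small open-weights model ("gpt2") purely as a placeholder:

# The *code* to run a language model is short; the *weights* it downloads
# are the product of a crawl-and-train pipeline costing millions of dollars.
# Sketch only; "gpt2" is an arbitrary small open-weights placeholder.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)      # fetches tokenizer files
model = AutoModelForCausalLM.from_pretrained(model_name)   # fetches pretrained weights

inputs = tokenizer("Open weights change the economics of AI because", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))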

 


18 minutes ago, Sensei said:

AI source code is one thing; the database of what an AI has learned is another. A regular John Doe (mentioned by Airbrush) would not be able to teach his own AI at home everything that, for example, ChatGPT or other AIs from global IT companies have learned in their implementations. It would require writing a custom web crawler, an enormous amount of storage, powerful computing infrastructure (a "server room"), a fast fiber Internet connection, many millions of dollars to run it all, etc.

A hacker/programmer would not need AI assistance at all to commit such crimes.

If an evil hacker created his own artificial intelligence without the limits Airbrush described, it would certainly be available only on the dark web, i.e. an ordinary John Doe still would not have access to it (invitation-only, expensive, paid in cryptocurrency, etc.). And if John Doe could access the dark web, he would rather hire someone to do the whole "dirty job" for him.

Note: Muslim terrorists have everything handed to them on a plate - instructions for how to do almost anything are publicly available, even on Wikipedia, and most of them are unable to do anything with it. An artificial intelligence instructing them how to make a bomb would be about as useful to them as the English-language textbooks containing the same information, which they already have access to but are still unable to act on.

 

Found something similar to what I read before:

Quote

Google has been warned by one of its engineers that the company is not in a position to win the artificial intelligence race and could lose out to commonly available AI technology.

A document from a Google engineer leaked online said the company had done “a lot of looking over our shoulders at OpenAI”, referring to the developer of the ChatGPT chatbot.

However, the worker, identified by Bloomberg as a senior software engineer, wrote that neither company was in a winning position.

“The uncomfortable truth is, we aren’t positioned to win this arms race and neither is OpenAI. While we’ve been squabbling, a third faction has been quietly eating our lunch,” the engineer wrote.

The engineer went on to state that the “third faction” posing a competitive threat to Google and OpenAI was the open-source community.

Open-source technology developers are not proprietorial and release their work for anyone to use, improve or adapt as they see fit. Historical examples of open-source work include the Linux operating system and LibreOffice, an alternative to Microsoft Office.

The Google engineer said open-source AI developers were “already lapping us”, citing examples including tools based on a large language model developed by Mark Zuckerberg’s Meta, which was made available by the company on a “noncommercial” and case-by-case basis in February but leaked online shortly after.

Since Meta’s LLaMA model became widely available, the document added, the barrier to entry for working on AI models has dropped “from the total output of a major research organization to one person, an evening, and a beefy laptop”.

https://www.theguardian.com/technology/2023/may/05/google-engineer-open-source-technology-ai-openai-chatgpt
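
For context on the "one person, an evening, and a beefy laptop" line: the usual technique is parameter-efficient fine-tuning (e.g. LoRA), which freezes the pretrained weights and trains only small adapter matrices on top of them. A hedged sketch of what that looks like, assuming the transformers and peft libraries and again using "gpt2" as a placeholder rather than any particular leaked model:

# LoRA adapters update only a small fraction of parameters on top of
# frozen pretrained weights, which is what makes laptop-scale
# fine-tuning of open models feasible. Illustrative sketch only.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")   # placeholder open-weights model

lora_config = LoraConfig(
    r=8,                        # low-rank adapter dimension
    lora_alpha=16,              # adapter scaling factor
    target_modules=["c_attn"],  # GPT-2's fused attention projection layer
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all parameters
# A standard training loop (or transformers.Trainer) then fits only the
# adapter weights; the original pretrained weights stay untouched.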

 

 

