Proactive Investors - A group of over 1,100 tech executives and researchers, including Elon Musk, has warned of the dangers of artificial intelligence (AI) and called for a six-month pause on the development of systems like ChatGPT and Google (NASDAQ:GOOGL)'s Bard.
Citing concerns over a “dangerous” arms race, a letter was published by the non-profit campaign group Future of Life Institute, with other signatories including Apple (NASDAQ:AAPL) co-founder Steve Wozniak, top AI professors Stuart Russell, Yoshua Bengio and Gary Marcus, and founders of AI startups Stability AI, Character.ai, Unanimous AI and Attain.ai.
The letter urged all AI labs to immediately pause the training of AI systems more powerful than GPT-4, and said that if such a pause cannot be enacted quickly, governments should step in and impose a moratorium.
It also called for the creation of shared safety protocols audited by independent experts to ensure that AI systems with human-competitive intelligence are safe beyond a reasonable doubt.
The intervention comes as governments around the world formulate policy responses to the rapidly evolving field of AI.
The letter said people should ask themselves: "Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones? Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete and replace us? Should we risk loss of control of our civilization?
"Such decisions must not be delegated to unelected tech leaders. Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable. This confidence must be well justified and increase with the magnitude of a system's potential effects."
Last month, OpenAI, the developer of ChatGPT, said: "At some point, it may be important to get independent review before starting to train future systems, and for the most advanced efforts to agree to limit the rate of growth of compute used for creating new models."
The Future of Life letter responded: "We agree. That point is now."