An open letter calls for the development of superintelligence to stop

By EngineAI Team | Published on October 27, 2025 | Updated on December 19, 2025
Prominent figures from the tech and political worlds have signed a statement from the Future of Life Institute urging governments to prohibit the development of superintelligence until it has been shown to be controllable and has won public approval.

The specifics: The letter cites concerns including "human economic obsolescence," "losses of freedom, civil liberties, dignity, and control," and "potential human extinction." Current OpenAI employee Leo Gao was among the signatories, but leadership from OpenAI, Google, Anthropic, xAI, and Meta was notably absent. According to data released by the organization, only 5% of Americans favor unchecked advancement, while 64% want ASI work halted until it is proven safe. Other highlighted signatories include the "godfathers of AI" Yoshua Bengio and Geoffrey Hinton, Apple co-founder Steve Wozniak, and Virgin's Richard Branson.

Public pushback against AI acceleration is not new, but it appears to be growing louder. Still, with every frontier lab conspicuously absent and little clarity around what a "stop" to progress would mean, or even how to define ASI, this effort may end up garnering more attention than action.