There are fears that the world might end in an AI apocalypse, much like the one depicted in the Terminator film franchise. The warnings that AI might destroy us aren’t coming from fringe scientists or conspiracy theorists but from eminent figures like Stephen Hawking, Elon Musk, and Bill Gates.
Bill Gates thinks AI will become too intelligent to remain under our control. Stephen Hawking shared this view, though he didn’t think AI would suddenly go berserk overnight. Rather, he believed machines would destroy us by becoming too competent at what they do: our conflict with AI will begin the moment their goals are no longer aligned with ours.
Elon Musk has compared the proliferation of AI to “summoning the demon” and called it the biggest existential threat to humanity. To prevent an AI apocalypse, he has proposed that governments begin regulating AI development before for-profit companies do “something very foolish.”