Stephen Hawking’s Last Paper Says AI Could End Human Race


On several occasions, Stephen Hawking expressed his views on issues like global warming, politics, and philosophy. However, the topics that always grabbed the most attention were his predictions about the future of artificial intelligence and the rise of superhuman beings.

Superhumans

In his final writings, Hawking discusses the emergence of a superhuman race that will result from extensive gene editing. The richer segment of society will be able to edit their own genes and those of their offspring with the help of advanced biotechnology methods like CRISPR.

This, in turn, will give rise to stronger, more intelligent humans with greater immunity to disease. According to Hawking, these rich, “genetically re-engineered” humans will eventually become a threat to ordinary humans. The poorer, unmodified humans who can no longer compete will either die out or become unimportant.

Artificial Intelligence

Stephen Hawking’s take on artificial intelligence is even scarier, as he predicted a future where “AI could develop a will of its own, a will that is in conflict with ours.” In fact, several other scientists also believe that AI could gain consciousness in the future.

Google has also worked on a project called AutoML, an AI system that created its own “child” AI capable of surpassing its predecessor’s performance.

Even though Hawking’s own machine for communicating with others was based on AI, he feared that the development and widespread adoption of AI-based technologies could be a threat to the entire human race.

He stated in his final paper that AI will overtake human intelligence within the next 100 years, so humans need to ensure that the artificially sentient beings we develop share our goals.

In addition to this, Stephen Hawking’s last paper discusses the possibility of a major environmental calamity or a nuclear war that could cripple the Earth.
