AI could 'replace humans altogether': Professor Stephen Hawking warns that robots will soon be a 'new form of life' that can outperform us

  • Professor Stephen Hawking was speaking during an interview with Wired
  • He said that 'someone will design AI that improves and replicates itself'
  • He added that 'this will be a new form of life that outperforms humans'
  • During the interview he also urged more people to take an interest in science 

Professor Stephen Hawking has issued a chilling warning about the imminent rise of artificial intelligence.

During a new interview, Professor Hawking warned that AI will soon reach a level where it will be a 'new form of life that will outperform humans.'

Professor Hawking even went so far as to say that AI may replace humans altogether, although he didn't specify a timeline for his predictions.


Professor Stephen Hawking has warned that mankind will destroy the Earth, turning it into a blazing fireball, within the next 600 years. The renowned physicist believes soaring population sizes and increasing demands for energy will lead to the catastrophe (stock image)


A ROBOT TAKEOVER? 

A recent report by PwC found that nearly four in 10 US jobs — 38 per cent — are at risk of being replaced by robots and artificial intelligence by the early 2030s.

The analysis revealed that 61 per cent of financial services jobs are at risk of a robot takeover.

This is compared to 30 per cent of jobs in the UK, 35 per cent in Germany and 21 per cent in Japan.


Professor Hawking made the chilling comments during a recent interview with Wired.

He said: 'I fear that AI may replace humans altogether.

'If people design computer viruses, someone will design AI that improves and replicates itself.

'This will be a new form of life that outperforms humans.'

During the interview, Professor Hawking also urged more people to take an interest in science, claiming that there would be 'serious consequences' if this didn't happen.

He added that a new space programme should be developed, 'with a view to eventually colonising suitable planets for human habitation.'

He said: 'I believe we have reached the point of no return.

'Our Earth is becoming too small for us, global population is increasing at an alarming rate and we are in danger of self-destructing.'

This isn't the first time that Professor Hawking has expressed fears about the rise of AI.

Professor Hawking said: 'I fear that AI may replace humans altogether. If people design computer viruses, someone will design AI that improves and replicates itself. This will be a new form of life that outperforms humans.' Pictured is a scene from The Terminator, in which robots threaten to take over Earth


In October last year, Professor Hawking warned that artificial intelligence could develop a will of its own that is in conflict with that of humanity.

It could herald dangers like powerful autonomous weapons and ways for the few to oppress the many, he said, as he called for more research in the area.

He was speaking in Cambridge at the launch of The Leverhulme Centre for the Future of Intelligence, which will explore the implications of the rapid development of artificial intelligence.

He said: 'I believe there is no deep difference between what can be achieved by a biological brain and what can be achieved by a computer. 

SHOULD KILLER ROBOTS BE BANNED?

A report by Human Rights Watch and the Harvard Law School International Human Rights Clinic calls for humans to remain in control over all weapons systems at a time of rapid technological advances.

It says that requiring humans to remain in control of critical functions during combat, including the selection of targets, saves lives and ensures that fighters comply with international law.

'Machines have long served as instruments of war, but historically humans have directed how they are used,' said Bonnie Docherty, senior arms division researcher at Human Rights Watch, in a statement.

'Now there is a real threat that humans would relinquish their control and delegate life-and-death decisions to machines.'

Some have argued in favor of robots on the battlefield, saying their use could save lives.

But last year, more than 1,000 technology and robotics experts — including scientist Stephen Hawking, Tesla Motors CEO Elon Musk and Apple co-founder Steve Wozniak — warned that such weapons could be developed within years, not decades.

In an open letter, they argued that if any major military power pushes ahead with development of autonomous weapons, 'a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow.'

According to the London-based organization Campaign to Stop Killer Robots, the United States, China, Israel, South Korea, Russia, and Britain are moving toward systems that would give machines greater combat autonomy. 


'It therefore follows that computers can, in theory, emulate human intelligence - and exceed it.'

Artificial intelligence is progressing rapidly and there are 'enormous' levels of investment, Professor Hawking said.

He said the potential benefits were great and the technological revolution could help undo some of the damage done to the natural world by industrialisation.

'In short, success in creating AI could be the biggest event in the history of our civilisation,' said Professor Hawking.

'But it could also be the last unless we learn how to avoid the risks.'