Let's make sure he WON'T be back! Cambridge to open 'Terminator centre' to study threat to humans from artificial intelligence
- Centre will examine the possibility that there might be a ‘Pandora’s box' moment with technology
- The founders say technologies already have the 'potential to threaten our own existence'
A centre for 'terminator studies', where leading academics will study the threat that robots pose to humanity, is set to open at Cambridge University.
Its purpose will be to study the four greatest threats to the human species - artificial intelligence, climate change, nuclear war and rogue biotechnology.
The Centre for the Study of Existential Risk (CSER) will be co-launched by Lord Rees, the astronomer royal and one of the world's top cosmologists.
Rees's 2003 book Our Final Century had warned that the destructiveness of humanity meant that the species could wipe itself out by 2100.
The idea that machines might one day take over humanity has featured in many science fiction books and films, including the Terminator, in which Arnold Schwarzenegger stars as a homicidal robot.
In 1965, Irving John ‘Jack’ Good wrote a paper for New Scientist called Speculations concerning the first ultra-intelligent machine.
Good, a Cambridge-trained mathematician, Bletchley Park cryptographer, pioneering computer scientist and friend of Alan Turing, wrote that in the near future an ultra-intelligent machine would be built.
This machine, he continued, would be the 'last invention' that mankind would ever make, leading to an 'intelligence explosion.'
For Good, who went on to advise Stanley Kubrick on 2001: a Space Odyssey, the 'survival of man' depended on the construction of this ultra-intelligent machine.
The Centre for the Study of Existential Risk (CSER) will be opened at Cambridge and will examine the threat of technology to humankind
Huw Price, Bertrand Russell Professor of Philosophy and another of the centre's three founders, said such an 'ultra-intelligent machine, or artificial general intelligence (AGI)' could have very serious consequences.
He said: 'Nature didn’t anticipate us, and we in our turn shouldn’t take AGI for granted.
'We need to take seriously the possibility that there might be a ‘Pandora’s box’ moment with AGI that, if missed, could be disastrous.

'I don’t mean that we can predict this with certainty; no one is presently in a position to do that, but that’s the point.

'With so much at stake, we need to do a better job of understanding the risks of potentially catastrophic technologies.'
He added: 'The basic philosophy is that we should be taking seriously the fact that we are getting to the point where our technologies have the potential to threaten our own existence – in a way that they simply haven’t up to now, in human history.
'What better place than Cambridge, one of the oldest of the world’s great scientific universities, to give these issues the prominence and academic respectability that they deserve?
'Cambridge recently celebrated its 800th anniversary – our aim is to reduce the risk that we might not be around to celebrate its millennium.'
Did someone from Cambridge University go over to the recent Singularity Summit and not fall asleep this time?
- Phil Payne , Sheffield, 27/11/2012 12:48