The destructiveness of humanity means our species could wipe itself out by 2100.

"It tends to be regarded as a flaky concern, but given that we don't know how serious the risks are, that we don't know the time scale, dismissing the concerns is dangerous."
"I've watched the Terminator films, which play on our darkest fears about robots," said Mr Barbato. "Clearly, if a type of killer cyborg evolved, it might easily lead to a breakdown of morals and consciousness, the degradation of life and the disintegration of human civilisation."
The Centre for the Study of Existential Risk looks at the "four greatest threats to humanity," including rogue biotechnology.
"Take gorillas for example – the reason they are going extinct is not because humans are actively hostile towards them, but because we control the environments in ways that suit us, but are detrimental to their survival."
"The seriousness of these risks is difficult to assess, but that in itself seems a cause for concern, given how much is at stake."
Spanner isn’t quite Skynet — the self-aware artificial intelligence system in the Terminator movies — but it does show how far we’ve come at building connected systems and databases. “When there are outages, things just sort of flip — client machines access other servers in the system,” Google software engineer Andrew Fikes told Wired. “It’s a much easier service story. … The system responds — and not a human.”
In 1965, Irving John ‘Jack’ Good wrote a paper for New Scientist called ‘Speculations concerning the first ultra-intelligent machine’.