Most people follow a summary dismissal heuristic: given the surface characteristics of a message, they quickly judge whether it is worth considering or dismiss it with an “oh, that’s just silly!” I like to call it the silliness heuristic: we ignore “silly” things except when we are in a playful mood. What things are silly? One can divide silliness into epistemic, practical, social, and political silliness.
I think robots are going to have a huge impact on the world, just like computers did, or cars did, or asphalt, or electricity. That scale of impact—enormous impact—but I don’t think the impact is going to be because they become evil and take over. I think it’s going to be just because everything we do changes. Some things get easier, some things will get harder—not many things—and society will change. That’s a lot more scary, in some ways.
So who’s going to protect us from the real-life rise of the machines? Step forward a little-known body called the Centre for the Study of Existential Risk (CSER). CSER is based at the University of Cambridge, and is a multidisciplinary group of individuals – mainly scientists – whose mission, as defined on their website, is “the study and mitigation of risks that could lead to human extinction”. CSER was set up with funding from Skype co-founder Jaan Tallinn, which arguably makes him the real-life John Connor, making a lone stand against the real-life Skynets.
One need not dig into subtext to find the central message of the movies. It is John Connor’s message to himself, given to his father, passed on to his mother, and then repeated to himself, and by extension to us: ‘The future’s not set. There’s no fate but what we make for ourselves.’ And it’s the meaning and grounding of this thought that I’m going to explore.
Inside the secret network behind mass surveillance, endless war, and Skynet: “Google has ramped up its sales force in the Washington area in the past year to adapt its technology products to the needs of the military, civilian agencies and the intelligence community.”
The document cites Zaidan as an example to demonstrate the powers of SKYNET, a program that analyzes location and communication data (or “metadata”) from bulk call records in order to detect suspicious patterns.
In the Terminator movies, Skynet is a self-aware military computer system that launches a nuclear war to exterminate the human race, and then systematically kills the survivors.
According to the presentation, the NSA uses its version of SKYNET to identify people it believes move like the couriers used by Al Qaeda’s senior leadership. The program assessed Zaidan as a likely match, which raises troubling questions about the U.S. government’s method of identifying terrorist targets based on metadata.
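To make that description concrete, here is a deliberately toy sketch of what “detecting suspicious patterns in bulk call metadata” can look like. Everything in it is an assumption made for illustration (the feature names, the synthetic data, and the random-forest model); it is not a reconstruction of the actual SKYNET pipeline. The point is only that the output is a pattern-similarity score, not evidence about a person.

```python
# Toy, illustrative sketch only: score synthetic call-record metadata for
# "courier-like" movement patterns. All features and data are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def make_population(n, courier_like):
    # Hypothetical per-person features derived from call records:
    #   distinct cell towers seen, overnight inter-city travel rate,
    #   SIM swaps, and the share of calls that are incoming.
    base = rng.normal([20, 0.05, 0.2, 0.5], [5, 0.03, 0.3, 0.1], size=(n, 4))
    if courier_like:
        base = base + [30, 0.25, 1.5, 0.2]  # shift the toy "positive" class
    return np.clip(base, 0, None)

# Synthetic population: 500 ordinary records and 50 "courier-like" ones.
X = np.vstack([make_population(500, False), make_population(50, True)])
y = np.array([0] * 500 + [1] * 50)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Score a new, unlabeled record. The number that comes out is a similarity
# score to the training examples, nothing more.
new_person = np.array([[45.0, 0.3, 2.0, 0.7]])
print("courier-likeness score:", model.predict_proba(new_person)[0, 1])
```

Even in this toy setup, with only a handful of “positive” examples against a large background population, a high score says only that someone’s metadata resembles the training examples, which is precisely why the reporting above treats a metadata-based match as a troubling basis for targeting.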
There were a lot of SF movies produced in the mid-eighties, but few retain the currency of the Terminator and its humanity-annihilating AI, Skynet. Everyone seems to thrum when that chord is plucked – even the NSA named one of its illegal mass surveillance programs SKYNET.
As I've written here before, science fiction is terrible at predicting the future, but it's great at predicting the present. SF writers imagine all the futures they can, and these futures are processed by a huge, dynamic system consisting of editors, booksellers, and readers. The futures that attain popular and commercial success tell us what fears and aspirations for technology and society are bubbling in our collective imaginations.
According to a recent YouGov poll, 13 percent of IT decision makers believe technology could destroy the Earth, with the leading causes of our demise including the prevention of evolution (74 percent), military and warfare (66 percent), artificial intelligence (44 percent), and environmental issues (38 percent).