
Shelly Palmer


Google = Skynet... Yikes!

Posted: 01/27/2012 6:00 pm

The Shelly Palmer School of Connected Living has one primary thesis: "Technology is good." I believe that all technological progress is good and that the story of the evolution of mankind is inextricably linked to the story of the evolution of our technology. We are tool builders, and we are tool users. It is, in large measure, what separates us from virtually every other species in the known universe.

I also acknowledge that "Technology is good" is an optimistic point of view. I am, by nature, an optimist. I believe in lifelong learning, and I aspire daily to the joy of striving to realize things that exist in our imaginations. It may be one of our higher callings; it is certainly one of mine.

So, I am usually one of the guys who says things like, "Guns don't kill people, people kill people," which is my way of acknowledging that firearms are simply tools that help us throw rocks faster and more accurately, and, if you need to throw a rock, it's probably better to throw it faster and more accurately.

This argument can be extended to less emotional subjects like the Sony Betamax case or the more recent (though seemingly ancient) Grokster case, both of which ended up with the court deciding, and I'm paraphrasing, "Technology good... people bad."

"If God intended us to fly, he'd have given us wings." Yep. I totally agree. God (please use your politically correct deity, this article is not about science vs. religion) gave us brains that saw birds and imagined what it would be like to fly. The same deity gave us thumbs, manual dexterity and the ability to create tools that enabled us to have wings. We fly because we are genetically gifted to do so. (You can decide how those genetic gifts were bestowed, like I said, it is not the point of this writing.)

The point is that technology is woven into the fabric of our lives and that, in every case, in every civilization (past and present), it defines how we interact, how we live, and how we work... it literally defines everything about us, including the epochs and ages of our past.

The reason for my huge pro-technology buildup is that I am about to write something so out of character, so remarkably against one of my strongest personal axioms, that I have to talk myself into writing it...

Google is about to go too far.

On March 1, 2012, Google will consolidate the privacy policies for 60 of its products, creating the single most significant database of the Information Age. The aggregation of these data will empower Google to correlate and contextualize our thoughts, aspirations, actions, physical locations and the timelines for the basic processes of the doing of life.

I don't think any single thought about the aggregation of data or the use of technology has ever made me as uncomfortable as this announcement. On its best day, with every ounce of technology the U.S. Government could muster, it could not know a fraction as much about any of us as Google does now. But now is not what I'm worried about. I'm not even worried about this decade. At the current rate of technological change, taking into consideration the amount of information we are creating about ourselves, and adding in the computational power that will be available in about a decade, Google will equal Skynet circa 2022.

This is a guess, of course; it could be sooner -- but it won't be later. What do I mean by Skynet? First of all, get your Terminator lore together, then just imagine a database that could automatically determine what you are most likely going to have for dinner after your bowling league Tuesday night, where you are going to have it, who it will be with, whether you are feeling good or have a cold, if you and your wife are fighting, how your day was at work, what you are thinking about buying, who is helping you with your decisions about it, what chronic illnesses you are dealing with, what meds you are on, etc., etc., etc. And this isn't even the scary stuff.

What scares me is the advance of analytical tools and the existence of yet-uninvented ways to manipulate data for good and, inadvertently, for bad. I'm not worried about bad people doing bad things. That is the nature of our world and, generally, it is easy to identify bad people who do bad things. I'm worried about the good intentions that pave the road to hell. I can't speculate about how our near-term-future, data-dependent culture will be negatively affected by the law of unintended consequences. That's because so many of the vocations and avocations that will be impacted have also yet to be invented. I just know that there are at least as many ways for things to go wrong as there are for things to go right.

The sky is not falling, and this is not a sensationalistic exercise in FUD-mongering (Fear, Uncertainty & Doubt). It is an admonition that the time has come for learned colleagues to start a Socratic discourse about which parts of the Genie need to stay in the bottle and which parts can be let out. Imperfect metaphor? I don't think so.

This is a very complex problem and we are going to need very simple ways to describe it. Skynet can't win -- at least not in the world I want to live in. Let's get ahead of this while it's still just the subject of the occasional rhetorical blog post -- because, no matter what anyone tells you, the world of big data is never going away.

 

Follow Shelly Palmer on Twitter: www.twitter.com/@shellypalmer

 
 
Comments
Rob Huggins
09:36 AM on 01/30/2012
I don't work directly with it, but I know that the data-analysis departments of the companies I've worked for are highly interested in neural nets. Instead of the computer going through an internal, logical survey of yes, no, and multiple-choice questions to make decisions about data, the neural net becomes something adaptable, beyond the knowledge of the programmers. It works by allowing pathways to form between the input and the output, always measuring against a success/failure scale to trim, grow, and improve the net. No programming of the analysis is needed, only programming of the method of improvement for the net; then evolution at trillions of decisions a second kicks in.

I don't know if you've caught this, but that is basically how a brain works, and it is the precursor to true artificial intelligence. So, yeah, eventually computers will be more than the sum of their parts.
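
To make the commenter's description concrete, here is a minimal sketch in Python of the idea as I read it: nobody programs the analysis itself, only the method of improvement, and the network's pathways are grown or trimmed against a success/failure score. Everything in the example (the tiny 2-2-1 network, the XOR task, the mutate-and-keep loop) is an illustrative assumption, not something taken from the post or the comment.

    import math
    import random

    random.seed(0)

    # Training data: XOR, a problem no single linear rule can solve.
    DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

    def sigmoid(x):
        x = max(-60.0, min(60.0, x))  # clamp to avoid overflow
        return 1.0 / (1.0 + math.exp(-x))

    def forward(w, x):
        # A tiny 2-2-1 network: 9 weights, including biases.
        h1 = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
        h2 = sigmoid(w[3] * x[0] + w[4] * x[1] + w[5])
        return sigmoid(w[6] * h1 + w[7] * h2 + w[8])

    def success(w):
        # The "success/failure scale": higher is better (negative squared error).
        return -sum((forward(w, x) - y) ** 2 for x, y in DATA)

    weights = [random.uniform(-1, 1) for _ in range(9)]
    best = success(weights)

    # The analysis is never programmed; only the method of improvement is:
    # randomly perturb one pathway and keep the change if the score improves.
    for _ in range(20000):
        i = random.randrange(len(weights))
        old = weights[i]
        weights[i] += random.gauss(0, 0.5)
        new = success(weights)
        if new > best:
            best = new           # keep the improvement ("grow" the pathway)
        else:
            weights[i] = old     # discard the failed mutation ("trim" it)

    for x, y in DATA:
        print(x, "->", round(forward(weights, x), 2), "target", y)

A real system would use far larger networks and far more sophisticated training than this toy hill-climbing loop, but the point of the sketch is the one the comment makes: the programmer specifies only the rule for keeping or discarding changes, and the decision-making behavior emerges on its own.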