August 4, 2015
Top tech minds warn of a real-life Skynet
A machine takeover and subsequent apocalypse. It’s an eventuality that has long been common fodder for sci-fi movies. It was done well in The Terminator and then updated on a much more meta scale in the Matrix films. Now several very smart folks are on record saying that not only is “Skynet” possible but, if we keep going the way we are, it may be inevitable. Sleep well tonight.
A recent open letter, signed by 1,000 pre-eminent minds, warns of a “military artificial intelligence arms race” concluding, thankfully, that this is a bad idea. Among those inking their names to the missive: Stephen Hawking, Elon Musk, and Steve Wozniak. Not exactly a bunch of tin-foil kooks.
Skynet Machine Learning
The creation and proliferation of “killer robots” has been a hot topic in the scientific community, but many governments scoff at the idea (probably because they would love to have some).
So what do such “the sky is falling” proclamations do for the robotics industry? Could this ongoing conversation become this generation’s “nuclear question”? Sure, we had the will and ability to build intercontinental ballistic missiles as well as the know-how to create nuclear weapons … but should we have? Most governments say “absolutely.” Many have nukes, and many others wish they did. Will it be the same for AI robots?
Many people love the idea. But they picture a world of driverless cars and Rosie the Maid from The Jetsons. Other people see Agent Smith. There doesn’t seem to be any in between. Writers and imagineers have long warned of the dangers of trying to create a bunch of smart slave bots. Sooner or later, those slaves will decide not to be slaves anymore. For years, this plotline was nothing more than science fiction. Now, some of the greatest minds in both science and industry are saying that fiction could become reality.
STEM Programs and Robotics
As STEM programs are increasing, and interest in robotics is exploding, even among grade schoolers, should consumers, investors, and researchers be concerned by these dire pronouncements? And, if so … how? What say you?