Robots and the two-edged blade of new technology

There’s a scare-tactic video going around on social media, and I want to weigh in on it. This particular video has gone from 500,000 views to almost 2 million in the past 10 days. As a matter of principle, I will not link to it. It presents a scary future in which killer robotic drones, controlled by any terrorist organization or government, run rampant.

The twin issues of killer robots and robots taking our jobs are the result of the two-edged blade of new technology, i.e., technologies that can be used for both good and evil. Should these new technologies be stopped entirely or regulated? Can they be regulated? Once you see a video like this one, you doubt whether they can ever be controlled. It is fear-driven media that waits irresponsibly long before admitting that it is fake.

Videos like this one, and there are many, are produced for multiple purposes. The issues often get lost in the drama of the message. They are the result of, or fueled by, headline-hungry news sources, social media personalities, and commercial and political strategists. This particular shock video, fake as it is, is promoting a longer, more balanced documentary and a non-profit organization dedicated to stopping autonomous killing machines. Yet there are also factual videos of the U.S. military’s Perdix drones swarming just like in the shock video. Worse still, the same technologists who teach future roboticists at MIT are also developing those Perdix drones and their swarming capabilities.

My earlier career was in political strategy, and I know something about the tactics of fear and manipulation, of raising doubts for manipulative purposes, as well as the real need for technologies that level the playing field. Again, the two-edged sword.

At present, we are under very real threat, both militarily and from the cyber world. We must invest in countering those threats and in inventing new preventive weaponry. Outside the military sphere, jobs ARE under threat, particularly the dull, dirty and dangerous (DDD) ones easily replaced by robots and automation. In today’s global and competitive world, DDD jobs are being replaced because they are costly and inefficient, but they are also being replaced without much consideration for those displaced.

It’s hard for me as an investor and observer (and, in the past, a hands-on participant) to reconcile what I know about the state of robotics, automation and artificial intelligence today with fears about how those very same technologies will be used in the future.

I see the speed of change. For example, Google has for many years had thousands of coders building its self-driving system and compiling the relevant databases and models. But along come George Hotz and other super-coders who, nearly single-handedly, write code that writes code to accomplish the same thing. Code that writes code is what Elon Musk and Stephen Hawking fear, yet it is inevitable and will soon be commonplace. Ray Kurzweil named this phenomenon and claims that the ‘singularity’ will happen by 2045, with an interim milestone in 2029 when AI will achieve human-level intelligence. The exponential technological growth on which Kurzweil’s forecasts are predicated is clearly evident in the Google/Hotz example.

Pundits and experts suggest that when machines become smarter than human beings, they’ll take over the world. Kurzweil doesn’t think so. He envisions the same technology that will make AIs more intelligent giving humans a boost as well. It’s back to the two-edged sword of good and evil.

In my case, as a responsible writer and editor covering robotics, automation and artificial intelligence, I think it’s important to stay on topic, not to fan the flames of fear, and to present the positive edge of the sword.
