Robot wars

Robotics and AI are, now more than ever, thriving industries – though care must be taken to ensure that our increasing technological prowess does not take us down a dark path

Boston Dynamics’ Spot the robot dog

It was in 1921 that Czech writer Karel Capek coined the term ‘robot’ in his play Rossum’s Universal Robots. It is perhaps from these early beginnings that we have developed a general wariness towards technology and a fear of things created in our own image: in the play – in what has since become a common trope in science fiction – the machines rise up against their masters and bring humankind to the brink of extinction.

We marvel at robots and AI that can mimic lifelike behaviour, but also find them just a little bit disconcerting. Capek’s play, set around the year 2000, was a vision of the future that did not come to pass, though industrial robots were in widespread use by then, having first appeared in the early 1960s. It is only in recent years that we have been edging closer to more nuanced applications of robotics and its associated field of AI that lean towards the kind of mimicry Capek envisioned.

Holding up a mirror to the face of human existence is fine, so long as it’s not the side of it that engages in war or ethnic cleansing

In June 2020, Boston Dynamics, then under the ownership of SoftBank, offered up its first commercial product, Spot, a four-legged inspection robot capable of navigating terrain with ‘unprecedented mobility, allowing you to automate routine inspection tasks and data capture’, according to the promotional material. A majority stake in the company was then purchased by car manufacturer Hyundai in December that year – evidence of the jostling for position in an industry with the potential to influence a great many markets.

Over a year later, in August 2021, when introducing the Tesla Bot as part of the company’s AI Day, Elon Musk said: “Tesla is arguably the world’s biggest robotic company because our cars are like semi-sentient robots on wheels.” He went on to make the case that the work his company has done to give its cars the ability to understand and navigate the world is transferable to a humanoid form, joking to a reception of nervous laughter that the Tesla Bot will be designed so that “you can run away from it” and “most likely overpower it.” The joke introduced just the slightest hint of apprehension about stepping into a future that could be lifted straight from those well-trodden sci-fi storylines. If life does imitate art, then I would hope that artificial life does not imitate us.

A trip to the uncanny valley
For just as we are aware of our capacity for benevolence, we are aware of our shortcomings. Perhaps we are fearful that we will unwittingly teach these to an AI. In building robots we are holding a mirror up to ourselves and to the world. Therefore, when we see four-legged robots moving around as imitation dogs, or Boston Dynamics’ Atlas, a humanoid bipedal robot, performing parkour in a pre-programmed sequence, or even observe the ‘muscles’ of a robot arm flexing, we get a sense of the ‘uncanny valley’ – that feeling of something eerily similar to us, but not necessarily frighteningly so. Because holding up a mirror to the face of human existence is fine, so long as it’s not the side of it that engages in war or ethnic cleansing.

When Spot is mounted with a tactical assault rifle, or when the parkour-performing Atlas robots are suited up with Kevlar, then perhaps we might start worrying. Because then it is less uncanny and more the stuff of science-fiction nightmare.

Prof Stuart Russell, the founder of the Center for Human-Compatible Artificial Intelligence at the University of California, Berkeley, told The Guardian: “The use of AI in military applications – such as small anti-personnel weapons – is of particular concern” because “those are the ones that are very easily scalable, meaning you could put a million of them in a single truck and you could open the back and off they go and wipe out a whole city.”

Perhaps any anxiety about the machines taking over is premature, but a lean towards adapting robots for military use does set alarm bells ringing. Boston Dynamics has stated in its ethical principles that it is firmly against weaponising robots, but it is not the only robotics company to have built a robot dog. In October 2021, Ghost Robotics unveiled its version with a Special Purpose Unmanned Rifle (SPUR) affixed to the top of it, making the unnerving case that these dogs are not necessarily man’s best friend. The module, designed specifically for these robots, comes courtesy of a company called Sword International and has an effective range of 1,200 metres. Of course, this does not prove that we are moving towards a future of autonomous killing machines, but it is a worrying development.

The Biden administration has proposed R&D spending for the Department of Defense to the tune of $112bn, according to figures in the Pentagon’s fiscal year 2022 budget request – the largest such request on record. A total of $874m would go towards the development of artificial intelligence, to keep pace with adversaries.

And this is where I believe the crux of the problem lies. Government defence agencies are engaged in a never-ending game of one-upmanship in order to maintain the edge in any future engagement. Meanwhile, many envision robot labour transforming the economy and rendering physical labour a choice rather than a necessity. I’m certain that those enthusiastically enlisting for this future hope that it ultimately wins out against our innate predisposition for self-destruction.