The humanoid robot "Atlas" was developed by the American robotics company Boston Dynamics for the US Defense Advanced Research Projects Agency (DARPA) robotics challenge. Boston Dynamics says: "We use sensor-based controls and computation to unlock the capabilities of complex mechanisms." Organizations worldwide, from DARPA (the research unit of the US Defense Department, which developed the Internet) and the US Army, Navy and Marine Corps to Sony Corporation, turn to Boston Dynamics for advice and for help creating the most advanced robots on Earth.
Machines are slowly starting to replace humans on the battlefield. It’s believed fully
autonomous weapons may be ready in 20-30 years. And Switzerland is moving ahead
with its own research into mobile robot technology.
At a recent meeting in Geneva on the threat of
killer robot technology (lethal autonomous weapons systems or LAWS), a Swiss
foreign ministry disarmament expert said the international community was only
starting to grasp the possible repercussions of the robotic revolution on the
future battlefield.
Laurent Masmejean added that it was crucial to further examine which
applications of autonomous capabilities would be desirable, legal and
acceptable, and which raised concerns.
Swiss representatives at the conference stressed that their ongoing defence
research programmes did not include work on fully autonomous weapons.
“But we’ll see how different research fields, if considered together, could help
make progress in technology towards more autonomy in systems - lethal or not,”
said Quentin Ladetto, director of defence future technologies at armasuisse,
which oversees technology procurement at the Swiss defence ministry.
Switzerland currently pursues small research programmes for unmanned air and
ground vehicles, conducted with numerous academic and industrial partners,
including the Federal Institute of Technology in Zurich (ETHZ) and US firms
Black-I Robotics and iRobot.
In its 2012-2016 research strategy, the Swiss defence ministry underlines the
importance of unmanned vehicles, especially for reconnaissance, surveillance and
communication, an importance it says is likely to increase.
These unmanned air and ground vehicle programmes aim at “the autonomy of
unmanned platforms, and creating the conditions for the approval and widespread
use of mobile robots for security policy tasks in Switzerland”.
“If we have a closer look at research in robotics and artificial intelligence,
the trend and focus is clearly on autonomy. Even if the ultimate research goals
for academic institutions and universities respectively are not of a military
nature, the different building blocks that are being developed today and their
integration tomorrow could lead to advanced autonomous capabilities,” said
Ladetto.
But some experts have their doubts. Stuart Casey-Maslen,
head of research at the Geneva Academy of International Humanitarian Law and
Human Rights, said he would be “astonished” if Switzerland wasn’t researching
LAWS.
“More autonomy is clearly the future for the military and security forces. The
question is whether, and if so how and where, the technology is restrained for
legal or ethical reasons. I'm particularly concerned about use of lethal
autonomous systems for law enforcement, which will not be considered within the
CCW [Convention on Conventional Weapons] context,” he said.
Joe Farha, a military weapons specialist with the Manchester-based Omega
Research Foundation, agreed that any civilian research into autonomy was likely
to “spill over” into military uses or law enforcement.
“It is possible that more and more will be asked of unmanned systems so that a
gradual advance in capability may lead to greater degrees of autonomy by
default,” he declared, adding that unmanned ground or air systems could have
dual usage as a weapons platform, “if the right technology is applied”.
The military of the future is likely to be increasingly unmanned. In recent
years the US has spent $6 billion annually on unmanned systems for war. The US
Defense Advanced Research Projects Agency (DARPA) has been developing military
robotics and funding projects like the LS3 rough terrain robot created by the US
firm Boston Dynamics.
And US policy documents reflect plans to increase
the autonomy of weapons systems. In its Unmanned Systems Integrated Roadmap
FY2011-2036 the US Department of Defense wrote that it “envisions unmanned
systems seamlessly operating with manned systems while gradually reducing the
degree of human control and decision-making required for the unmanned portion of
the force structure.”
Ever more autonomous defence technology already exists. The US Navy’s MK 15
Phalanx Close-In Weapons System, and its land-based version, the Counter Rocket,
Artillery and Mortar System (C-RAM), have been operational for years.
Elsewhere, Israel has deployed its Iron Dome automatic weapons defence system,
and Germany has designed the NBS Mantis system to be used to protect forward
operating bases in Afghanistan.
Talking about C-RAM, robotic warfare expert Peter W Singer said humans are part
of the decision-making but mainly in the initial programming of the robot.
“During the actual operation of the machine, the operator only exercises veto
power, and a decision to override a robot’s decision must be made in only half a
second,” he said. The danger is that this can lead to “automation bias” – the
tendency to trust an automated system.
Other precursors to fully autonomous weapons include South Korean and Israeli
sentry robots that can detect people and, if a human operator gives the command,
fire their weapons.
“As nations develop the technological capability, many may choose to go down the
path toward full autonomy, because of the benefits these weapons could provide:
rapid response time, reduced risk to their own soldiers, fewer costs, and
insulation from the effects of human emotions involved in decisions to use
force,” said Mary Wareham, an arms expert at Human Rights Watch.
Unmanned aircraft are also moving beyond existing drones to have greater
autonomy. The US Navy’s X-47B plane can take off from and land on an aircraft
carrier and refuel on its own, while Britain has unveiled the “autonomous
and stealthy unmanned” Taranis aircraft. Both reportedly have weapons bays that
could be adapted.
"Meaningful human control"
For NGOs there is no way of checking what countries mean by “meaningful human
control” of more autonomous systems. Efforts to curtail such weapons have so far
come to nothing and a pre-emptive ban is the only solution, they say.
“I have tried to engage with weapons manufacturers but they are frankly not
interested in international humanitarian law (IHL),” said Juergen Altmann,
co-founder of the International Committee for Robot Arms Control.
But Masmejean said the current legal basis was “quite solid” to handle future
changes.
“Whether autonomous weapons are involved or not, all use of force or use of
weapon systems must be in accordance with the entire framework of international
law, and in particular IHL, most notably the principles of distinction,
proportionality and precaution,” he said.
But he added that the international community should carefully examine the legal
reviews of new weapons, the means and methods of warfare, and possible
responsibility gaps.
Altmann is not convinced, however.
“They may be interested in the idea of limiting the effects of war if it’s for
better precision which might lead to better products. But they rather want to
innovate and the trend is towards unmanned vehicles which can react faster,
which is exactly the direction we are trying to prevent.”
“There needs to be a political decision by society and states to block this and
not let military and technological trends evolve freely.”
Legged Squad Support System (LS3) is a four-legged robot designed to travel 20 miles over rough terrain carrying a 400 lb load. The video shows field testing at Twentynine Palms, California. Testing includes rough-terrain mobility, endurance, visually guided following, GPS-guided maneuver, and overall reliability. LS3 is being developed by Boston Dynamics with funding from DARPA and the US Marine Corps.