We’re running out of time to stop killer robot weapons

The Guardian – By Bonnie Docherty – Wed Apr 11, 2018

Photo: Mock killer robot in central London. ‘Countries that recognise the dangers cannot wait another five years to prevent such weapons from becoming a reality.’

The fully autonomous AI weapons now being developed could disastrously transform warfare. The UN must act fast.

It’s five years this month since the launch of the Campaign to Stop Killer Robots, a global coalition of non-governmental groups calling for a ban on fully autonomous weapons. This month also marks the fifth time that countries have convened at the United Nations in Geneva to address the problems these weapons would pose if they were developed and put into use.

The countries meeting in Geneva this week are party to a major disarmament treaty called the Convention on Certain Conventional Weapons. While some diplomatic progress has been made under that treaty’s auspices since 2013, the pace needs to pick up dramatically. Countries that recognise the dangers of fully autonomous weapons cannot wait another five years if they are to prevent the weapons from becoming a reality.

Fully autonomous weapons, which would select and engage targets without meaningful human control, do not yet exist, but scientists have warned they soon could. Precursors have already been developed or deployed as autonomy has become increasingly common on the battlefield. Hi-tech military powers, including China, Israel, Russia, South Korea, the UK and the US, have invested heavily in the development of autonomous weapons. So far there is no specific international law to halt this trend.

Experts have sounded the alarm, emphasising that fully autonomous weapons raise a host of concerns. For many people, allowing machines that cannot appreciate the value of human life to make life-and-death decisions crosses a moral red line.

Legally, the so-called “killer robots” would lack human judgment, meaning that it would be very challenging to ensure that their decisions complied with international humanitarian and human rights law. For example, a robot could not be preprogrammed to assess the proportionality of using force in every situation, and it would find it difficult to judge accurately whether civilian harm outweighed military advantage in each particular instance.

Fully autonomous weapons also raise the question: who would be responsible for attacks that violate these laws if a human did not make the decision to fire on a specific target? In fact, it would be legally difficult and potentially unfair to hold anyone responsible for unforeseeable harm to civilians.

Photo: Mock killer robot in London.

There are also security concerns. Without any legal restraints on fully autonomous weapons, militaries could engage in an arms race, vying to develop deadly technology that may lower the need to deploy soldiers – while possibly lowering the threshold to armed conflict.

The Campaign to Stop Killer Robots, which Human Rights Watch co-founded and coordinates, argues that new international laws are needed to preempt the development, production and use of fully autonomous weapons. Many roboticists, faith leaders, Nobel peace laureates and others have reached the same conclusion, as is evident from their open letters, publications and UN statements: the world needs to prevent the creation of these weapons because once they appear in arsenals, it will be too late.

At the UN meeting going on now, one of two week-long sessions that will take place this year, countries are striving to craft a working definition of the weapons in question and to recommend options to address the concerns they raise. The countries have offered several possible ways to proceed. The momentum for a preemptive prohibition is clearly growing. As of Monday, the African Group and Austria have joined 22 other countries voicing explicit support for a ban. Other countries have aligned themselves with a French/German proposal for a political declaration, a set of nonbinding guidelines that would be an interim solution at best. Still others have explicitly expressed opposition to a preemptive prohibition and a preference for relying on existing international law.

Despite this divergence of opinion, the discussion on the first day had a significant common thread. Almost all countries that spoke talked about the need for some degree of human control over the use of force. The widespread recognition that humans must have control over life-and-death decisions is heartening. If countries agree that such control needs to be truly meaningful, a requirement for human control and a prohibition on weapons that operate without such control are two sides of the same coin.

To stay ahead of the technology, countries should negotiate and adopt a new, legally binding ban by the end of 2019.


PS We don’t need robots to kill, and we don’t need robots to take our jobs in the workplace. We need to stop developing robots and instead work to develop humans on this earth to rise to our best potential, putting humans to work, not machines. Why do we need machines (robots) to fight each other? America needs to put its people back to work by making things, and stop all the killing.

Posted by Teri Perticone

