
Last updated at 14:59 GMT, Friday, 15 November 2013

UN discusses 'killer robots'



Countries are meeting in Geneva to decide whether to consider banning 'killer robots' which can make decisions by themselves about when to kill people. Human rights groups claim weapons like these raise serious moral questions about how we conduct war.


Imogen Foulkes

Image: Drones like this one are controlled by people on the ground




Drones have already raised questions about 21st century warfare – but while they have no pilots, they are controlled by humans on the ground. Lethal autonomous weapons, or 'killer robots', are programmed in advance; on the battlefield it could be the robot, not the human, which decides who to kill.

The United States, Britain and Israel are all developing lethal autonomous weapons, although all three countries say they don’t plan to take humans out of the decision-making loop.

Supporters of the new technology say it could save lives, by reducing the number of soldiers on the battlefield, but human rights groups question the ethics of allowing machines to take decisions over life and death.

Now the 50 countries which have ratified the Convention on Conventional Weapons – the countries which have already approved a ban on blinding laser weapons – will consider whether to begin talks on banning killer robots.


Vocabulary



drones

aircraft which are controlled by people on the ground


warfare

the activity of fighting a war


lethal

causing death


autonomous

independent, able to make its own decisions

out of the decision-making loop

not part of the process of making decisions


ethics

set of beliefs or principles that tell people what is right and wrong


ratified

made (an agreement) become official


blinding

causing blindness (not being able to see)
