Should lethal autonomous weapons - or killer robots - be banned?
It sounds like something straight out of a sci-fi movie...
"Killer robots" - or lethal autonomous weapons - are being discussed by weapons experts and human rights groups at a meeting at the UN in Geneva, Switzerland.
Some believe they should be banned, but others argue they could save lives by reducing the number of human fighters.
Two experts in the field give Newsbeat their differing viewpoints.
Professor Noel Sharkey
Professor Sharkey is Emeritus Professor of Artificial Intelligence and Robotics at the University of Sheffield and also co-founder and chair of the International Committee for Robot Arms Control (ICRAC).
He wants lethal autonomous weapons banned.
"Killer Robots are weapons that can select targets and attack them without human operators," he says.
"They are being developed as fighter jets, tanks, submarines and ships that operate on their own.
"There is even talk about using swarms of them to attack enemies. But we have to ask: do we want to delegate the kill decision to machines? Do we want to fully automate killing? Because that is where this is heading.
"Supporters of these weapons ask what if they could create fewer human casualties and what if they could find targets more accurately than humans? At the moment this is pure speculation. It is like asking, 'What if mice could fight all of our wars?'
"Nations who want to develop killer robots are thinking about the short-term advantages of saving their own soldiers' lives.
"But what happens when there is global proliferation and everyone has them?
"This is really dangerous territory, where there are arms races between rivals, and terrorists get their hands on these advanced technologies.
"How autonomous weapons will interact with each other is mathematically unpredictable. They will make it easier to go to war.
"They are getting faster and faster and so they could trigger unintended conflicts or escalate tense situations without humans realizing it.
"It is the young of today that will suffer most. For the sake of humanity, we must stand up and stop killer robots now before it is too late."
Professor Ronald C Arkin
Professor Ronald C Arkin is from the Georgia Institute of Technology and is a leading US roboticist and roboethicist.
He says he doesn't agree with war but thinks using robots in combat is more ethical than using human soldiers.
"Let me unequivocally state: the status quo with respect to innocent civilian casualties is utterly and wholly unacceptable.
"I would hope that Lethal Autonomous Weapon Systems (LAWS) would never need to be used, as I am against killing in all its manifold forms.
"But if humanity persists in entering into warfare, we must protect innocent non-combatants far better than we currently do. Technology can, must, and should be used toward that end.
"I believe judicious design and use of LAWS can lead to the potential saving of non-combatant life. It should not be simply about winning wars.
"We must deploy this humanitarian technology where war crimes and fatal human errors occur, leading to non-combatant deaths.
"It is not my belief that LAWS will ever be able to be perfectly ethical in the battlefield, but I am convinced that they can ultimately perform more ethically than human soldiers.
"Consider not just their making a decision when to fire, but rather deciding when NOT to fire.
"They could assume far more risk on behalf of non-combatants than human fighters are capable of, taking a 'First, do no harm' rather than a 'shoot first and ask questions later' stance.
"It may be possible to save non-combatant life using LAWS and these efforts should not be prematurely terminated by a pre-emptive ban.
"Until that is achieved, I support a moratorium on their development and deployment. We must reduce civilian casualties if we are foolish enough to continue to engage in war.
"We can't simply accept the status quo with respect to non-combatant deaths.
"Don't turn your back on those innocents trapped in war. It is a truly hard challenge but the potential saving of human life demands such an effort."