Look at it from a military viewpoint. Soldiers, airmen, marines, and navy personnel are expensive. Correction: their experience is expensive. Their training is costly, but their battle experience is invaluable. That is what the Japanese did not understand in WWII. We were killing all of Japan's experienced pilots; they were killing off their inexperienced ones with suicide attacks. In the end, Japan did not have the air power to prevent our bombing of their cities, which directly contributed to them losing the war.

Any military commander would rather lose equipment than experienced men and women. That is the reason we use drones like the Predator. Sure, the enemy might shoot one down, but with the pilot safe in the US we don't lose the experience gained from the drone's destruction (i.e., which weapon system was used and its location). The same drone pilot could be flying a new drone within the hour, attacking the enemy using that knowledge.

From a military viewpoint, autonomous weapon systems are an improvement on that concept. There are problems with using them, though. The main one is that they are autonomous but not intelligent, so the parameters under which they use deadly force (make someone dead) need to be precisely defined. Since war is anything but precisely defined, there will be mistakes. Civilians will be killed in error. Collateral damage, if you will. You might not want to admit this to yourself, but collateral damage is always acceptable and often planned for during military or covert operations. The nation denies it or blames it on someone else, pays recompense, and gets back to the mission. The Pentagon will have no compunction about using autonomous weapons, even with the occasional collateral damage.
Like I said: their deployment is inevitable.
Because it would be fun.
Lethal Autonomous Weapons Systems, or LAWS, are basically robots that can choose their own targets without human intervention. They are being developed by countries such as the USA, the UK, and Germany. These robots have no sense of self-preservation, making them lethal and effective killing machines. They could be a threat to everyone if they fell into the wrong hands, such as Al-Qaeda or ISIS. They should not be developed.
Having autonomous weapons could lead to consequences like those seen in X-Men: Days of Future Past or Avengers: Age of Ultron. They would expand their list of targets beyond the intended ones; the Sentinels' intended target was mutants, but their list grew to include humans carrying the mutant gene and human allies of mutants. In conclusion, self-governing weapons are extremely likely to backfire; it's too risky.
In case of malfunction (or rebellion), there needs to be a human override system. Otherwise, what happens if these autonomous machines turn on us?
And we should aim to develop autonomous weapon systems that incapacitate and apprehend rather than kill. That is certainly within our technological capabilities. Obviously there will still be some deaths, since nonlethal weapons like tasers do sometimes kill, but hopefully the technology will become precise enough over time to minimize them.