It is human beings that both create the problem and are the problem. Their pursuit of power and territory will reach out for any weapon at their disposal.

There is a desensitization to what it actually means to have a war, to have a battle. When I talk to my colleagues who build quadcopter control systems for a living, they say, you know, if we had a Manhattan-style project, within 18 months to two years we could deploy these in the tens of millions.

I believe there's some way to go before anyone would believe we have a fully automated weapon carrier that we could deploy with confidence of no technical risk and reduced moral concern. The conduct of war might be relegated to a machine.

It really started with the drones. The drones were used initially only for surveillance. If you have the man linked through the umbilical cord to the machine, then the man is still bound by the conventions of war. There is no anonymity when a person presses the button.

We simply cannot build a robot or an AI system that has moral agency. There's an ethical red line between humans being ultimately responsible for pulling the trigger and robots.

Good ideas are not the preserve of good people. If something doesn't happen within the next two years in terms of essentially drawing all the main parties into a serious negotiation, it may be too late.