Let me throw a bit of law into the mix. We have referred to the key words, perhaps: depersonalization, automation, autonomization, deception, anonymity, maybe even machine-to-machine warfare. And I'm thinking back, because Vaso mentioned it's a hundred and fifty years since the first Geneva Convention. This whole codification of international humanitarian law that we've witnessed through the last one and a half centuries was the response to something: it was the industrialization of warfare, basically, the industrial production of weapons, of very powerful weapons, the mass armies that started to appear with the Napoleonic Wars and throughout the 19th century, of poorly trained soldiers, let's face it, with very powerful weapons, causing a lot of damage and a lot of suffering, mainly at that time among the armed forces, that the medical services couldn't deal with, and so on. So it was a response, and the term has been coined that basically humanitarian law, a very telling name, the law of armed conflict, humanitarian law, was about the humanization of war, the response to this kind of technical industrialization in the late 19th century. Now we're facing a new trend of new technologies, on a very broad scale, invading, so to speak, the battle space. Are we facing, are we witnessing, a new dehumanization, so to speak, of warfare that requires a new response? Or can the response of the 19th century, humanitarian law as we know it, can it respond to this trend? Is it adequate? Can it keep up?

Just before getting into the question of whether the law is sufficient, I think it's interesting that you raised the question whether
we're going, whether we're following, a trend of dehumanization of warfare. Because actually there are some people who claim that, thanks to this new technology, war will be more humane. I'm not saying I share this view, but there are some people who claim that technology is there to help existing conventional armed forces and conventional types of weapons to better implement IHL and human rights provisions. And this leads me to another question. I'm also wondering whether we as civilians, and I consider myself a civilian, to complete the picture, I'm a Swiss militia officer, a volunteer, because I thought that, in order to be credible when talking to my target audience, which is the military, it was a good idea to enroll, and serve in the military justice, so I have a small kind of activity within the armed forces. But I'm wondering whether we civilians have not also contributed to this. Because if we look around in Europe, the trend is to reduce the numbers of the armed forces. In Switzerland we have just had a vote on this, a referendum; we decided to keep conscription. In other countries, like Austria and Germany, they would like to go back to conscription, because there's an awareness that with such limited numbers of human personnel available, it's very difficult to face the current threats, the existing risks. I've just flown back from Hungary this morning, and in Hungary, for instance, there is also concern in this regard, because conscription has been abolished, and now, with the Ukraine and Crimea situation, there's much concern about the borders not being protected.
So in one way we've pushed the military to look for alternatives to human power, and this is where technology comes in, like during the industrial revolution. At the same time, if I think about the Swiss armed forces, it's interesting, because we have the notion of citizen soldiers, and we've noticed in the field that when the citizens, you know, go back to reservist duty on Sunday evening, and then on Monday they see that the radio system is not working, what do they do? They go back to their smartphones; they communicate via WhatsApp, via email. And so this intermingling shows that civilians are used to a certain kind of technology that they also want to use within the military. So, you know, there's a stronger exchange, and there are new concerns. So, to go back, I think it's also part of a new society: we don't want so much loss among human beings, we don't want to see corpses being brought back, you know, in coffins, and sometimes it's also easier to see machines fighting something which is as dirty as war.

Thank you, Roberta. Laurent, the ICRC point of view. Generally, do you feel that humanitarian law, of which you are the guardian,
is sufficient as a response to the new technologies we are likely to face?

I think it's important not to underestimate the adaptability of international humanitarian law. I mean, you spoke just before of the fact that it's a 19th-century law; I would say at least 20th century for all the main principles. And when I say we should not underestimate its adaptability, it's because the foundation of international humanitarian law is general principles, which are expressed precisely in general terms and which can be applied to very different types of warfare, as we were just discussing with the other panelists and Bill before. There are no specific rules on air warfare; air warfare is regulated by the general principles of IHL, despite the fact that it has been present for a hundred years. That shows that general principles are able to regulate new technologies, because at the time, a hundred years ago, it was the new technology. Of course, new technologies must respect existing law; that's the first question, and that's also somehow an answer to this question, can the law keep up? The law is here, and it's the technology that has to keep up with the law. If the technology does not comply with the law, the technology is not allowed to be used in armed conflict, full stop. That's how the law stands, and that's, I think, pretty much undisputed. So what does it mean to comply with the law?
It's with the general principles: the prohibition of superfluous injury and unnecessary suffering, the obligation of discrimination between civilians and combatants, proportionality and precaution, all those basic principles. And when, maybe, we continue the discussion, we might see a number of challenges with regard to new technologies, how they can or cannot, the challenges they pose to respecting those principles. But let's be clear again: if the technology does not master the ability to respect the principles, I mean, if the weapon, when it's used, cannot respect the principles, then it simply cannot be used. Now, of course, and coming back to your last question, whether the law is sufficient: that's always the question one has to look at when a new technology is developed, whether the law as it exists is sufficiently clear, sufficiently precise. Does it bring the protection that we want, that the law was supposed to bring, to both civilians and combatants? That's certainly something on which we have to reflect, we as the ICRC, but first of all states, when they study and develop such technologies: whether, in view of the specific characteristics of the technology, in view of its foreseeable humanitarian impact, the law is sufficient or not. We don't have a general view on whether the current law is sufficiently detailed for all forms of new technology; new technologies evolve every day, so any general position we took would be outdated today or tomorrow. But certainly we think, for example for cyber, that there might possibly at some point be a need to reflect on the necessity to develop the law. That will be for states to do, but that is, in our view, certainly one of the technologies which raises the most questions, certainly many more than drones, for example, in terms of how to apply existing law to it. That's probably also the case for autonomous weapons, and especially for one aspect. Because there is the technical aspect, which you expressed, that you mentioned
before: whether, through autonomy, we will be able, I mean whether technology will be able, to distinguish between combatants and civilians, between combatants and wounded combatants, and between fighting combatants and combatants who want to surrender. That raises a whole lot of questions. If the technology cannot do it, then autonomous weapons cannot be used; that's easy. Now, even if the technology would allow that, and allow the principle of proportionality etc. to be respected, which is certainly not in the foreseeable future, then remains the ethical or moral question: is it appropriate to take the human out of the decision to kill somebody? And that's a question that is not answered for the time being, and on which we as the ICRC don't have an answer yet either. As I said, first, it's an ethical question, but it's also a legal question, in the sense that you have in the law what is called the Martens Clause, which says that, for situations which are not regulated by the law, civilians and combatants remain under the protection of the dictates of the public conscience. And this has also been used in the past with regard to new developments. It's certainly something which was referred to in the preamble of the Anti-Personnel Mine Ban Convention, although that's not a new technology, but also, we saw, in the debate which led to the prohibition of blinding laser weapons, which have been forbidden. That's the only example of a new technology which has been forbidden before being deployed on the battlefield. So that's certainly something which still needs to be discussed and debated. So: new technology has to comply with existing law; it's not existing law that has to adapt to the new technology.

Marco, how do you see things? Can the law keep up?

You ask the wrong person, because I'm the person who always believes, perhaps because of my civil law background, that rules, you know, the Code Napoléon is more or less still the Code civil today, and the world has totally changed.
It is precisely because the rules are general and abstract that new phenomena have to be dealt with under the existing law, until we are able to change the law, and hopefully in a good direction. Now, I'm slightly skeptical whether it's easy to find agreement on new rules, so let us live with the existing rules. I think an important basic point is that only humans are bound by legal rules. If you believe that, then somehow technology is not so important, because it's always humans who produce the technology, who decide, perhaps two years before it is used, how the technology will actually be targeted. Therefore, even the moral question: is it immoral that, well, I find it immoral that a human kills another human, but this is my very personal opinion. Is it inhuman that a machine kills a human? Well, there too, we have to be conscious that if we were asking these questions in the Middle Ages, or with the samurai in Japan, that would have been a genuine question. Well, I mean, today, yes, hopefully there are always humans who remain in control. Someone on a warship in the Indian Ocean sends a missile, a Tomahawk missile, and he doesn't know the persons who will be killed in Baghdad. Hopefully he knows that it's a military objective, and someone has made the proportionality calculation, and then these people are killed, and he doesn't know who these people are. So the idea that, ideally, and don't believe that it was better in the Middle Ages, please study history, the idea that ideally it's a human being who knows the other human being and then kills him, that's simply not IHL as it is today. So I think, I fully agree:
there's a question of technology. We have to know whether the technology is able to respect the rules, but it's not really the technology; it's whether the human beings who use the technology, by using this technology, are able to comply with their obligations under the law. And, for instance, for autonomous weapons, we know that this is not yet the case, but I could imagine, perhaps, but this is not my problem, that's the problem of the technologists, that perhaps one day it's possible. And the skepticism towards technology: I don't fully share this skepticism. Simply look at where there are more victims, where there is more suffering: where high technology is used, or where it is not? Is Syria, is South Sudan, is the Democratic Republic of the Congo a problem of too much technology, or of not enough technology? I think, well, unfortunately, perhaps because of my past, that only humans can be inhuman, and unfortunately humans do a lot of bad things, but obviously they can also have compassion. Compassion is something behind the law; it is not a rule of the law. So we need to know whether the technology is able to comply, or whether the human beings are able to comply with the rules even when using this new technology. Then there are, and we will certainly come back to this, some technical questions, technical legal questions, like how, I don't know, to evaluate the military advantage. Even without technology, it's a question of principle, in my view: for instance, an autonomous weapon could not evaluate, well, it could evaluate the risk for civilians, but not the military advantage, because the military advantage is constantly developing, according to the plans of the commander, according to the situation in the hostilities. And the answer is simply that this means there must be an input into the autonomous weapon; whether this is possible,
I don't know. And I agree that cyber is probably the most critical one, because it's on a very different level. And there, I think we can deal with most of the problems under the existing law, while there are some issues where either the law is over-inclusive or under-inclusive. So, for instance, deleting data: if you consider that this is destruction, and therefore an attack, then in my view the concept of attack becomes over-inclusive, and this is also dangerous. So there, I think, there's a real problem. And I agree with you on drones: there's really no new problem, at least in law, with drones. How they are used, yes, transparency for instance, but not the technology.

Thank you. So that means there is an important distinction to be made between the permissibility, the acceptability, of the technology as such, and the acceptability or legality of the way it's being used. Obviously, any weapon can be used unlawfully, from a wooden club to an autonomous weapon system, but there are only a few weapon systems that can never be used lawfully, those that have been completely prohibited. But still, I sense that there is some kind of bottom-line agreement that the existing law is relevant, it does apply, and if the new technology cannot be used in compliance with it, then to that extent that technology would be unlawful. Can we then just turn the logic 180 degrees and basically say, well, okay, that means that if it can be used in compliance with existing law, then everything's okay?

No. Well, one of the problems I have, this being a panel of lawyers, is the belief in the law and that people will actually abide by it. And that's my big problem, really, when you talk about new technologies. You imagine that if you have an indiscriminate weapon, oh, the law is okay, nobody will use it, it can't handle proportionality, it can't conform to the principle of distinction, therefore all will be very good and we won't use it. Just like we didn't use air power in any kind of bad way at all
throughout history, when we know we have no special laws for air power. I don't think that worked out too well, actually, to be honest. Today, yes, I know, but look at the phase we had to go through to get there. So, if you're talking about a treaty ban, the one thing that that does is it sharpens the focus on a particular technology and creates a stigma for its use. So that would be one way of doing it. But could I just address the cyber issue for a moment? Because I don't want to get caught up too much in autonomy here, because we could argue, we'd get into too big an argument. But what I want to ask the lawyers is this: there are things about cyber that I've heard the military talking about in the United States, and one of them is that, because forensics is impossible, you might want to use probability: who is most likely to have used it, and then attack them on that basis. I'd like to know how that fits with the law. And I'll give you the other one, which is that in cyber warfare nobody shows their colors. In military terms you always have to show your colors; I believe, well, submarines are a bit different. And also, who's responsible for the separation of civilian and military infrastructure?

Bill, where do you think, what are the legal problems here, essentially? Is it a bit too simple to say that if these technologies can respect existing law then we're fine? Aren't there aspects where the law doesn't quite fit these new technologies, or where we have to change the way we perform our obligations under international law because of the specific characteristics of these technologies? Bill, could you speak to that?
I agree with what's been said, that the existing law applies just as much to new technologies as it applies to old technologies. And the basis of that is the terms in which all states are legally required to review new weapons, and they're required to review new weapons by reference, explicitly, to existing international law norms. And there is a pretty standard set of legal criteria that you apply, consisting of the superfluous injury and unnecessary suffering principle, which has already been mentioned; the indiscriminate weapons prohibition, that's already been mentioned; the environmental protection rules, which haven't been mentioned; and any ad hoc weapons-specific rules that clearly apply to the particular weapon type that you're considering. The new technologies do sort of change that a bit, in the sense that if you're talking about drone-type technology, if you're talking about cyber-type weapon systems, if you're talking about highly automated or autonomous weapon systems, it seems to me that you've also got to consider whether those weapon systems facilitate, or at least enable, the existing law relating to targeting to be applied. That's not something you need to consider when you're reviewing a rifle, because self-evidently the existing law applies to the user of the rifle. But the minute you're creating a technology which masks the existence, as it were, of a user, in terms of a human user, when it comes to the decision to attack, then I think what one needs to satisfy oneself of is that that particular technology is capable of being used consistently with the targeting principles. I won't go through each of those principles and the extent to which they may cause challenges in relation to particular technologies, because we'd be here all night, and you don't want that any more than I do. Then I ask myself, right, well, okay, let's start thinking about particular technologies. What about performance-enhancing technologies? Do they require legal review?
Well, it seems to me, yes, they do, because they would count as a method of warfare, which is specifically mentioned under Article 36. But what are the tests? How would you evaluate such a weapon system, given that there is, it seems to me, no ad hoc law, and given that the other customary principles I mentioned manifestly don't seem to apply? I think, in practice, what you're going to be doing is applying notions of domestic law. I think you're going to be asking whether the health and welfare of the individual, who by definition is going to be a member of your own armed forces, is adequately safeguarded by the way in which that technology is being applied to him or her, and then maybe there are ethical as well as legal worries to consider. What about nanotechnology? Well, I don't think that nanotechnology as such is going to be the subject of a weapons review. What I think in practice is going to happen is that a weapons reviewer, in a state thinking of acquiring a new weapon, is going to be confronted with an element of that weapon having been constructed using nanotechnology, and is going to have to apply the standard rules, and is going to need to satisfy itself that this particular weapon, including the nanotechnological bit, isn't going to cause superfluous injury, isn't going to be indiscriminate, isn't going to breach ad hoc weapons-related law, and isn't going to have the prohibited effects on the natural environment. What about the obligation to review new cyber methods? It seems to me that the law applies to them in just the same way as it applies to any other weapon. Can it be used discriminately? You're not going to be so worried about the effect on an adversary, because that's going to be the weapon that's deployed, as opposed to the cyber capability that you're employing to deliver that injuring effect. And then, what about the obligation of individual states
to review adequately? The question is whether their obligation to review new weapons is a satisfactory safeguard against unsatisfactory new technologies being brought into the battle space. Are we being adequately protected by this weapons review process? And the worry there is, firstly, that so many states are not known to have a satisfactory system for legally reviewing new weapons, and that, I think, is something the international community would do well to address, and urgently. And then the second issue is confidentiality. States are going to be very disinclined to tell anybody about the opportunities that they perceive new weapons they're thinking of acquiring are going to afford them, and about the risks that they think those new weapons may suffer in relation to technologies in the hands of others. So, therefore, by definition, I think this is an activity, this reviewing of weapons, if you like, that states are going to wish to continue to undertake themselves, however imperfectly, and in confidence. But, on the other hand, can the law keep up?
That's the question that's underpinning much of this discussion. Are future technologies adequately constrained, adequately limited, by the rules that I've mentioned, superfluous injury and all the rest? Mostly, I think the answer is yes. But absolute autonomy, human performance enhancers, human performance degraders, genetic weaponry: that sort of technology, I think, is a considerable worry. Partly because we don't somehow know what is going on in the research field, if you like, underneath our personal radars; partly because we have a suspicion that, in order adequately to address that sort of evolving capability, there is a need for new rules. But, on the other hand, we're not terribly sure what new rules, because we don't really know what the new technology is going to look like, and we have a suspicion that only when we know what the new technology is going to look like will we be in a position to draft new rules that make sense. So is a blanket ban the answer? I suspect the answer is no, and I really think that a blanket ban on certain new technologies would be ignored, particularly by states that identify those technologies as affording them an opportunity. I think it's better, in my view, to promote wider engagement between states at the official level, whether through the Conventional Weapons Convention process or the UN or whichever medium is thought to be most appropriate, to facilitate legal discussion about emerging technologies, with a view to developing a common sort of view about what the challenges are, how to interpret the application of law to these technologies, and so on. Maybe that's a bit of an idealistic proposal from somebody who isn't normally noted for being an idealist, but it seems to me that some sort of discussion between states would start off on the basis of mutual suspicion, but the idea would be to erode that degree of suspicion and to get to a position where states feel the confidence to talk to one another, and perhaps,
through that medium, we get some sort of handle on what's going on under the bedclothes, if you'll excuse the analogy. And with that slightly better understanding of what's happening, we might feel a little less worried about the future, or maybe more worried, I don't know, than we are at the moment. But at least we would feel that the discussions were beginning the process of addressing the potential problems.

I see we are already kind of drawn into the big question of the way forward, how to actually respond to the introduction and development of these technologies.

I just wanted to answer the point that we, the lawyers, believe that the law will be respected. Yes, that's another issue: we have to have implementation mechanisms, enforcement mechanisms, and states do not accept efficient enforcement mechanisms. But under this argument you should also prohibit arms exports, because indeed, Switzerland has just liberalized its arms exports to adapt them to the EU standards, which are even more liberal, and there we know that these weapons are not used in conformity with IHL.
So I would simply say, let us not discriminate against new technologies, which, in historical experience, nevertheless permit, for instance in the field of targeting, more precise targeting. I mean, think about the aerial bombardments during the Second World War and the Vietnam War, and compare them with today's aerial bombardments. There is, however, an issue we have to speak about, which is that once a technology exists, and the state and their military consider that it is a very useful technology, there is a risk that state practice, which influences the law, will change, and suddenly something which we would today say is unlawful is considered lawful. This is why I think the review process perhaps has to be rethought for these technologies, so that it constantly accompanies a development. Because today, for instance, you cannot yet evaluate autonomous weapons, because they don't exist; but the review process should constantly accompany a development, and at each stage somehow check: okay, what does this mean in terms of the possibility to respect the law? And the third issue, and here we could nearly spend the evening: there are some technical issues where indeed we have to agree on a new interpretation of the existing law. To give you an example which will not be controversial, I could give some controversial ones: under the existing law, you can only commit a war crime during a war; in peacetime you cannot commit a war crime. Now, suppose the last human being programs a weapon which will then be used in wartime by a commander, and the commander trusts that the weapon was correctly programmed to attack only combatants and not civilians, and that the weapon is able to recognize those who are wounded and those who surrender, and all these things which apparently are not yet possible today, and the programmer deliberately mis-programs it: he cannot commit a war crime. I assume it's not a lady, it's a man; therefore
I say "he". But there are solutions for that; we don't need a new treaty, but we have to agree on an interpretation. Or, for instance: you have to interrupt an attack when it becomes apparent that the attack is unlawful. I think this rule can perfectly apply to drones, but even to autonomous weapons. But you cannot say "when it becomes apparent to the machine that it is unlawful", because the machine is constructed, by hypothesis, such that it doesn't become apparent to the machine; and you cannot either say that it becomes apparent, for an autonomous weapon, to the human being who constructed the machine, because it doesn't become apparent to him. Therefore, and I think this is a reasonable interpretation of the existing rule, this means that the machine has to be constructed in a way that it becomes apparent to the machine. So the obligation, which is an obligation of the human being, to interrupt the attack when it becomes apparent to the human being, has to be interpreted as meaning that if human beings use machines, they must make sure that the machines are able, that it becomes apparent to them that an attack is unlawful, just as it would to a human being.

Noel, did you want to respond to that, a little bit, to all of that?

I agree with you. I mean, one thing you said that I agreed with, anyway, was the idea that states eventually accept a change. Because the big thing is, there are great fiscal cuts at the moment, we know that militaries are being cut all over the place, and so the big, big idea is to have force multiplication by using machinery, and you put massive investment into that. Supposing you put massive billions, all your investment, into autonomous weapons, and at the end of it they can't discriminate and they can't do proportionality. What do you do then? You get attacked, and you've got nothing else to defend yourself with. What do you do?
So that was your point about states changing the law, and I think that's one of the real problems. But also, I think it's not wise to think about this in terms of a single weapon. You know, you have a weapons review; what you need is a sort of war review. Because once you start getting more and more into automation, you start getting into this whole idea of a factory of war. Will we just automate war completely, with no one on our side being killed? Then you might be totally discriminate, but your enemies might be very, very many; you might be discriminate in many, many countries, only killing their combatants, but is the war legal in the first place, and what's the nature of war? It could be so easy to go to war. So those are the sorts of questions that you need to ask. I mean, just being discriminate or being proportionate is insufficient to stop the expansion of warfare and the illegality of wars.

But Laurent, do you see this danger of a creeping change in state practice, in order to reinterpret, or, I wouldn't like to say, twist a little, in order to accommodate the new technology? Do you see this danger as being a real one, that over the long term this could actually lead to a weakening of the protections that have been developed in the last century or so?

Probably that would need a historical analysis, to see how the practice of states has indeed changed the law or not. But I wouldn't say the risk does not exist, or dismiss the risk; I think it's certainly something important. But I think we have to look at it both ways. As Marco was mentioning, new technologies have helped make weapons more precise, and one could wonder whether the standards in terms of precision of weapons have not evolved as well, in the positive sense, compared to a few decades or a century ago, and whether what would today be considered indiscriminate might possibly not have been considered
so at the time, despite the fact that it's the same rule. So it's through state practice, interpretation. So I think it's important, but in both aspects: both the risk of having a more relaxed interpretation, but also, possibly, with the development of new technologies, I mean, there is anyway an obligation to use, whenever it is feasible, the most discriminate means and methods of warfare, those which will avoid or minimize civilian casualties. So that's anyway already in the existing law. So anything which can be developed in that regard should, at least theoretically, bring a diminution of civilian casualties, if such technologies exist. Now, to come back to another point that you, Nils, had mentioned before: I didn't want to give the impression, in my first presentation, I mean the first time I talked, that the ICRC thinks there is no need to develop the law. I think at this stage, with the technology we are talking about, we don't yet have sufficient knowledge, in terms of the technical features of those systems and their humanitarian consequences, to have a position on whether or not the law would need to be developed. But throughout the history of IHL, the ICRC has been at the forefront in terms of developing the law for specific weapons, etc.
And we don't say we will not do that in the future for these emerging technologies; just that, for the time being, not enough is known about them for us to take a position in that regard.

We've talked about the law and its possibly sufficient or insufficient precision and adequacy to regulate these things, but one issue that comes up very often with various new technologies is responsibility. With cyber, there is the attributability of cyber attacks, which, as you've mentioned, is very difficult; but the question also arises with regard to autonomous systems. If you have a system that basically decides autonomously what it will do, who is responsible if it causes damage, if it causes harm, if it does something that is unlawful? Roberta, would you like to speak to this briefly?

Ah, the tough questions. And I'm not sure it might not be a "she" planning these systems, by the way; we can be as nasty as men sometimes, unfortunately. Responsibility is one of the big ones, and I think it is also the reason why Professor Sharkey is so concerned about automated and autonomous weapon systems, why Human Rights Watch has been so concerned about the issue of targeted killings, and why governments are very concerned about the use of these new technologies. There is a big endeavour to look at the problem comprehensively, looking not just at the legal questions but also at the ethical questions, because I think the law is strictly related to ethics and to moral questions. And just a little remark.
I don't think lawyers believe that everybody will stick to the law. I think laws are there to set limits, so that if someone breaches them there is a way to seek redress, and this is why we have redress mechanisms, international tribunals, and different legal paradigms. So, to answer your question: if I look at the concerns of governments, or of organizations like NATO, you have different legal regimes. You have issues related to insurance, protection of the environment from the use of such machines, the problem of product liability in case of malfunctioning, and the issue of individual criminal responsibility, which is not to be confused with command responsibility. And for governments and states, one of the major issues is state responsibility. Can we attribute responsibility to states for the malfunctioning of a machine, of a drone, of an autonomous system? Can we attribute responsibility if a member of the government or the armed forces does something unlawful? This is where I think scientists and engineers really need to talk, often, to lawyers, because another problem is that we work in compartments: sometimes lawyers are not aware of the issues and the possibilities that exist in technology, and vice versa. This is maybe another particular aspect of new technologies: because of their technicality and difficulty, it is even more necessary that lawyers, those looking at the ethical aspects, and engineers look at this problem together. So, to come back to state responsibility.
There has been much concern here, but fortunately the articles on the attribution of responsibility to states for internationally wrongful acts have been adopted, I mean the articles that were proposed by the International Law Commission. One of the key principles is that any activity of a state's armed forces can be attributed to the state, regardless of whether the member of the armed forces was performing the act in an official or an unofficial capacity. And because we've seen that somewhere in the chain there is always a human being operating, deploying, or programming the machine, there will always be the possibility to come back to a person acting on behalf of the state. If it's a member of the state's armed forces, it's even easier; if it's not a member of the armed forces, we have rules establishing that the acts of state agents also lead to attribution to the state. We also have various limitations on the use of defence arguments like force majeure. So, for instance, a state could not claim not to be responsible for the malfunctioning of a system in a case of recklessness or negligence, because, as we saw in the earlier presentation, there is a review process that needs to be undertaken: weapons need to be tested. So a state can't come with the argument, "Oh, sorry, we didn't foresee this kind of possibility, this kind of effect of the weapon." Again, thankfully, we have Article 36 of Additional Protocol I. Another important aspect of state responsibility is that we're not looking at who is guilty.
It's about to whom to attribute responsibility. And also, in this case, even if a state could come up with a good argument in defence of the reason why it had to breach the rules, this has nothing to do with compensation. So if there are victims, the state representing the victims can ask for compensation. This is something that should calm those concerns a little, and again, the law should be looked at comprehensively: we have criminal law for individual perpetrators, and we have civil claims that can be filed, so there are different possibilities.

Thank you, Roberta. I think that's a very important observation. Even though some of these new technologies may lead to difficulties in attributing, for example, criminal responsibility, because you may have a robot deciding something and it will be difficult to track down a person who actually intended that kind of unlawful result, it does not mean that no one is responsible. The state employing and using the system will be legally responsible internationally for the harm caused by that system, and also liable to pay compensation for it. How to enforce international law in the end is difficult, but that's a wider subject, as we know, and not linked to new technologies in the first place. Also, the difficulty of attributing certain operations in cyberspace to a certain state is not a legal problem that you could reasonably resolve with presumptions and the like; it is a technical problem that has to be solved at that level, and it shouldn't be identified as a failure of the law if it proves difficult.

One thing I'd like to do before I open the discussion to the audience, though please keep the statements short, because I think that would also generate discussion, is to benefit from the numerous guests that we have
here, and from their expertise, opinions, and views as to the way forward. How should we deal with this? We've had the issue of the ban, and I'm sure, Noel, you would like to speak to that as well, but there is also the question of whether, instead of a ban on certain technologies, we should initiate a dialogue.

My dream, my wish for this, would be the following. Article 36, which keeps getting mentioned here, embodies the idea that you review your weapons, but of course no one knows how anyone reviews their weapons. There are no real rules about exactly how to review them, and one nation may be perfectly capable of doing massive testing of autonomous weapons while another may not. So my dream would be that the way forward is for Article 36 weapons reviews to be totally transparent to everyone else. That's what I would like to see; that would be a great way forward for me. I'd like to know exactly how the weapons were tested, in the same way that we take medicines and we run drug trials. And those drug trials, well, drug companies are a bit slimy, so you can't rely on them too much, but it's still a lot better than the weapons reviews: once there is at least a pretence of transparency, you can attack it and say, we want more, we want to know about this issue, and we can question it.

I think it would be excellent if all weapons reviews were transparent. Now, maybe there is a midway, and I don't know if it's achievable, but it is at least a less unrealistic dream, because dreaming that states would reveal all their technical expertise on new technologies is probably not foreseeable. The midway is that at least when states carry out a review and conclude negatively, that the weapon is not allowed, this conclusion and the reasons for it are made public. After all, if that is the conclusion, the state will not be able to use that technology; that is its own conclusion. It would also inform other states' opinions and, to come back to a point made before, it might prevent state practice from leading to the establishment of a more relaxed standard, by establishing that no, this is forbidden. And probably, I don't know whether one could derive an obligation to do this, but at least a responsibility, based on Common Article 1 of the Geneva Conventions, which requires states to ensure respect for the law. The same goes for states that develop such technologies, not only when they conclude that a technology does not pass the test of the law, but also when they put constraints on it, because the law says you have to review whether a weapon can be used in all circumstances or only in some. The review may well conclude that a weapon can be used in some circumstances and not in others, and the fact that it cannot be used in those circumstances should also be made public. That is all the more important because the weapon will possibly go on being developed and used by that state, hopefully only under the circumstances it had considered lawful. But the state might also sell that technology to other states, and those other states need to be aware that the state which developed the technology considered that in some circumstances it cannot be used.
So I would say I have at least half a dream, but maybe a more realistic one than Noel's. Certainly, the fact that weapons reviews are central to the development of new technologies is something I think most of the panel agrees on, and certainly we at the ICRC agree. Just one other point: we mentioned compliance, and I agree with Nils that this is another debate, but also that it is extremely important. As we have mentioned several times, it is also something we are working a lot on with the Swiss government: monitoring compliance with IHL. But as Nils has said, that would open up an entirely different debate, so I'll stop here.

You just said dreams don't have to be realistic. I like to be more realistic than a dreamer, because I've learned that with too many ideals it is very difficult to keep up the struggle. Also, in terms of IHL, I notice that sometimes we expect soldiers to be empathetic towards the enemy when they are not even empathetic towards their own comrades.
So what I'd like to say is that maybe we should try to re-establish the correct scenario for the use of armed forces and to distinguish law enforcement operations from pure armed conflict scenarios, so that it would also be a little easier for the lawyers to decide which rules of engagement apply and how to train combatants, so that combatants don't have to operate as policemen, because that creates a lot of confusion in their heads. In this sense, I think a lot of governments, including within NATO, are undertaking major efforts. There is a current project, I don't know if you're aware of it: an international forum called MCDC, the Multinational Capability Development Campaign, which is open to all kinds of governments, institutions, and academics. It was launched by the US many years ago, and every two years it takes up a new research topic. Last year one of the partners of this forum, NATO, launched a project to study the implications of the use of autonomous systems in gaining operational access, and some of us are involved in it. We have lawyers.
We have scientists, and we have people looking at the ethical aspects, and the aim is to come up with policy guidelines. Obviously, it would be up to states to decide whether to follow these guidelines, but I think the effort is worth it, so we are actually doing something about it. I think it's very important to encourage this kind of discourse, and my hope is that when we have different projects on different important research topics, we don't just think about our own little agendas, but try to exchange views, cooperate, and think about it globally, so that not everybody is conducting their own study in their own little corner. That's why I'm very glad we're here to exchange views.

Thank you very much, Roberta. So we've had the idea of sharing information about weapons reviews, and of sharing some information about written weapons review policy guidance. Other ways forward? Briefly, Marco.

My dream is that of the UN Charter: the prohibition of the use of force and respect for human rights. But we are dealing here with humanitarian law, which applies to armed conflicts, and there we do indeed have to be realistic. Technology has advantages in armed conflicts and even outside them; at least my car, perhaps because it's a German car, makes fewer mistakes than I do. So you cannot say a machine makes more mistakes than a soldier, because soldiers are not very reliable people; we are all human beings. Obviously, technology must remain controlled by human beings, but this doesn't mean that a human being has to be behind it all the time. States must take IHL into account when they develop new technologies. I think it is justified that civil society and NGOs ask legal and ethical questions, but simply don't be biased against technology: ask the same questions about missiles, about aircraft, about rifles. That, I think, is important. Transparency, indeed.
Transparency is my dream, because unfortunately it is not a rule of humanitarian law, while human rights law has very important rules on transparency, and human rights law is also applicable in armed conflict. This may help, but it is something new: states should be more transparent, and I think public opinion can have an impact not only on the weapons review but also on how a weapon is used, and on why it didn't work or why it worked. Finally, while I am against a ban, perhaps I am even more dangerous for these new technologies than those who call for a ban, because I simply ask that these new technologies only be used if IHL is respected by the human beings who use them.

I agree with Marco completely; I think the answer is legal compliance. Where bans are concerned, I would be very, very skeptical about a call for a ban on a technology that doesn't also call for a ban on the weapon systems that the technology in question is being developed to address. It's very easy to call for a ban on an autonomous or automated technology that is being used as the last line of defence against a mass inbound rocket attack, for instance. If you are not also going to ban mass inbound rocket attacks, what state that is seriously threatened by that sort of capability is going to go along with you and say, "Oh yes, we will ban the only means we have of addressing this threat"? Not many; certainly not many sensible states.

I think the problem with making weapons reviews publicly available is that, if you do that, there will be even fewer weapons reviews undertaken than at present, and heaven only knows there are too few undertaken in practice at present, I believe. Why? Because a properly undertaken weapons review will tell you exactly how to counter the weapon concerned, and no state has an interest in doing that. I'm sorry, it is a dream, and I'm sorry,
I am no dreamer. On the other hand, I would then address myself to the states that take the chance and don't actually undertake weapons reviews. Increasingly in this world, I think you are taking an enormous chance, because sooner or later lawyers are going to latch on to the fact that, if a situation arises where proper tests, had they been done to inform a weapons review, would have identified the thing that went wrong that caused their client to be killed, injured, or to have their home destroyed, or something of that nature, you can expect some sort of legal consequence. Exactly what form it might take is outside the scope of this discussion; I simply indicate to you that you are taking a risk, and as time goes by, in my judgment, an ever greater one. I think it's time states woke up to this obligation, which over a hundred and seventy of them have signed up to.

And then, yes, I have advocated consultation; I have advocated a process of growing confidence amongst states, and I have suggested that the CCW process might be a venue for that. It will be up to states to decide what the correct venue would be, what format it might take, and how they might go about developing this sort of confidence. I don't think it will be easy, and I don't think it would happen overnight, but I do think it would provide states with the opportunity to discuss emerging technologies with one another, along with their concerns and their thoughts. I am not suggesting it would be the venue for new law or treaties and things of that nature, most certainly not. But what it might well be is a process that could lead to some understandings, and perhaps to an approach to dealing with new technology, and an understanding of how the existing law applies to it, which otherwise might become rather fragmented. And I am not sure that it is in the interests of the global community that understandings in this area should be fragmented. Thank you.