Good afternoon, good evening, or good morning, depending on where you are. Welcome to the session on Digital Currency Validation for Protection and Resilience. My name is Vijay Mauree, and I lead the Digital Currency Global Initiative Secretariat in the Standardization Bureau at ITU. This is the last session of the Security Thematic Track, and also of the DC3 Conference, which is organized by the ITU in collaboration with the Future of Digital Currency Initiative of Stanford University as part of the activities of the Digital Currency Global Initiative. I'm joined today by our panelists; let me introduce them: Mr. Jacques Francoeur, Chief Scientist at CNAW and also team leader of the Security and Assurance Working Group of the Digital Currency Global Initiative, and Mr. Ed Scheidt, who is the convener of ISO TC68/SC2 Working Group 17 on Security of Digital Currencies. This session will discuss a process for common, multi-level assurance that can be applied to all digital currency types for threat analysis, security protection design, and assurance. Basically, this session follows the session held this morning, where we discussed the digital currency ontology notions, and the session just before this one, where we talked about the different security threats to digital currencies; now we are going to discuss a multi-level assurance process that can be applied for threat analysis and security protection design as well. The presentation today will consist of seven parts, and after each part we'll have a discussion: the audience can submit questions in the Q&A during the presentations, and we'll take them at the end of each part, and Mr. Ed Scheidt from ISO will intervene after each part to give his view on what ISO is doing on these different aspects of the security evaluation model. So the presentation will be led by Mr.
Jacques Francoeur, and I'm now going to invite him to take the floor for his presentation. Just as a reminder, Jacques: for each part of your presentation you have 12 minutes, and after the 12 minutes we'll have the discussion with Ed, and then you will resume with the next part afterwards. Okay, understood. You have the floor now. Can you hear me? Can you see me? Can you see the slides? Yes, I can see you and I can also see the slides, so if you can go into presentation mode. Well, thank you, VJ, for that introduction. Welcome, everyone, and thank you so much for your precious time to listen to what we have to say. My name is Jacques Francoeur. As VJ indicated, we have a very special guest on this session, Ed Scheidt, who will add his valuable insight on this issue of validation and the topic of assurance. It's interesting to take a few minutes to get to know Ed and see some of the things he's been involved in. Here we have the very special Kryptos sculpture, located at CIA headquarters in Langley: four parts, 1,800 letters. The first three parts have been solved, but the fourth has not, being considered one of the most famous unsolved mysteries in the world. So, Ed, I just want to take a moment to tell this story; it's quite interesting. In '88 the artist explored potential themes and decided to incorporate some encrypted messages into the art. Sanborn, the artist, was new to encryption, so he enlisted Ed's help; Ed was in the process of retiring, and at the time Webster referred to him as the "Deep Throat of Codes." After some serious discussions, Scheidt taught the artist various encryption methods, and Sanborn himself chose the actual text of the message on the sculpture. Three parts have been solved, as I mentioned; the fourth, the characters at the very bottom there, remains uncracked. Ed knows the answer, along with the artist, and probably somebody else at the CIA. Then in '98, Stein took 400 hours, over his lunch periods, to
decipher the first three parts using pencil and paper, and a year later a California computer scientist announced he had solved the first three sections using a Pentium II. The fourth code remains seemingly insurmountable, and the cracking of the sculpture started a race to break the codes, maybe even the fourth, given that the first three were shown to be breakable. But Ed, there's a rumor that Stein and Gillogly weren't even the first ones to do it, that the NSA had beaten them in breaking the first three codes before. Can you clarify that on the record now? All right, thank you. Well, that's Ed. So we have a very special person with us, who has now brought his brilliance to the domain of digital currency security, and we're very lucky to have you, Ed. I'm just quickly going to outline the Security and Assurance Working Group. We have Mitch Cohen from eCurrency focused on building a CBDC target model, threats, countermeasures, the kind of thing that we do, but you'll notice here that we start with a model of the thing that we need to protect. We have the pure cryptocurrency version of that as a target model, being led by Jason Lau of Crypto.com, and we also have Paul Lloyd of Hewlett Packard Enterprise, who is focused on quantum-safe cryptographic processes that will feed the needs of all digital currency type systems, so that's truly a horizontal through all the different types of currencies that will need that kind of support. Myself, as the team lead, I lead the validation work stream, and here Ed is the liaison with the working group, so again we are very fortunate to have that. So what is the Security and Assurance Working Group, and in particular the validation work stream, trying to achieve? We're trying to develop a flexible, objective, framework-agnostic, multi-assurance, formal-method-based process. That's a lot of words, but each one of them is critical, and I'll get to how we meet flexibility and objectivity. By agnostic I mean you can pick any framework you want, as
opposed to being somehow locked into one framework versus another. Now, validation is first of all a process, and a platform to do it in, because what is essential is that this process be easy to carry out; doing the work that needs to be done using current approaches would make it very difficult to achieve these new outcomes. So what is the process, and what is the platform that will enable it? It will enable the automation, or at least the facilitation, of common-basis evaluation and benchmarking of different performance characteristics, whether resilience, security, privacy, or even performance characteristics like scalability, so that you can evaluate and compare these across digital currency type systems. So there's a lot of desire, as you see here, for a sense of normalization: we want to be able to compare across different systems with confidence. Let me take a moment to explain the process that DCGI is following in its validation model. It brings together a global domain of experts who propose suitable specifications as input into this standardization process. Now, DCGI is like a focus group; it's an early-stage standardization process. Then there are the more formal processes where the standards are actually produced (standards at ITU are called Recommendations), but those occur in Study Groups, which is an entirely different process that we won't go into right now. We bring together all these experts, we work by consensus, and we output whatever is created into this validation engine. What we're trying to do is ensure, by design and up front, that whatever we produce is common, consistent, and comparable: in this case specifications, measurements and metrics, templates, and evaluations, so that we have a reliable and repeatable process, so that we can compare things and have confidence in the outcome of the comparison, and so that you can compare two things on a common basis. And then
we feed that back into the consensus, and it continues in the feedback loop. So we want common specifications and requirements, consistent measurement methods and metrics, and comparable models, templates, and evaluations, all with the goal of standardizing the DC type. You'll see later a reference to other work from the other working groups that feeds into the ability to standardize DC types, with the objective of standardizing the models, the measurements, the evaluation, and even a tool that will ensure repeatability, with high confidence, of performance or protection metrics, so that we can assess and benchmark across ecosystems. That is a bit of a restatement, but basically we want a world that takes a lot of knowledge, dispersed and fragmented, and brings it into a standardized format that can help everyone achieve a better outcome. One thing that was critical in DCGI, from my point of view at least, is that we all attempted to understand the DC type first, and then consider various things in relation to a DC type that we all agree we are talking about. This is not normally achieved, because it's more difficult: the Policy and Governance Working Group is doing some privacy work and some policy work, architecture is over here, security is over here, but the key is interoperability of thinking between the various aspects of the same thing. Usually we would not have that luxury of a common understanding of what we're talking about from different perspectives, and as a result it's very difficult to cross-correlate the outcomes of these separate, siloed efforts. So starting with that cross-visibility is a key beginning perspective.
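The "common, consistent, comparable" pipeline described here can be pictured as a small data model. The sketch below is purely illustrative (all class names, fields, and the scoring scheme are my own, not DCGI artifacts): common specifications feed consistent measurements, which feed comparable evaluations that can then be benchmarked across digital currency types on the same basis.

```python
from dataclasses import dataclass, field

@dataclass
class Specification:
    """A common requirement agreed by consensus (e.g. 'keys held in an HSM')."""
    spec_id: str
    text: str

@dataclass
class Measurement:
    """A consistent metric and method for checking one specification."""
    spec_id: str
    metric: str   # what is measured
    method: str   # how it is measured, so results are repeatable
    score: float  # normalized 0.0-1.0 for comparability

@dataclass
class Evaluation:
    """A comparable evaluation of one digital-currency-type system."""
    system: str
    measurements: list[Measurement] = field(default_factory=list)

    def score(self) -> float:
        # A toy aggregate: the mean of normalized measurement scores.
        if not self.measurements:
            return 0.0
        return sum(m.score for m in self.measurements) / len(self.measurements)

def benchmark(evals: list[Evaluation]) -> list[tuple[str, float]]:
    """Rank systems evaluated against the same specifications, best first."""
    return sorted(((e.system, e.score()) for e in evals), key=lambda t: -t[1])
```

Because every system is scored against the same specifications with the same measurement methods, the ranking that `benchmark` produces is a common-basis comparison rather than an apples-to-oranges one, which is the point of the feedback loop described above.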
Now, this is work out of the Architecture Working Group, but it is very important because it feeds into the Security and Assurance Working Group. In the Architecture Working Group there are these ontology notions that basically say that all digital currency types share five notions. These decompose into distinctions, and it grows very quickly, but the objective of these notions is to take, let's say, the input language of policy about digital currencies (you see here "DC type policy") and pass it through some kind of communication normalizer that doesn't carry the negative baggage attached to certain terms, where different input choices will specify architectural or technological characteristics. This work is underway and going very nicely, and I'll mention why it's important in a minute: we're doing validation, and therefore we're doing validation from the system design onward; we're not doing validation on notions, so we pick up where the Architecture Working Group provides its notions. Here are the five notions of digital currencies, which decompose into their primary distinctions. We've discovered in doing this work that you first start with a supply of units; the unit gets value; value is owned; value is exchanged; and the record of the exchange of ownership is updated. It's not my intent to walk you through all of this, but these distinctions further decompose, and decompose again. Why am I mentioning this work output? Because it will be the basis for building targets. This ontology reduces matters to their simplest form, from which we can now build cryptocurrency or CBDC targets, articulated, constrained, and built in a very careful way, so that's going to be very useful. The work stream then needs to build a notion of one digital currency type system, the thing that we have to validate, so we need to create a value-process and enabling value-asset
representation of what the digital currency type system is doing. We then need to identify and map vulnerabilities to potential threats so that we can easily conduct an impact analysis. We need to be able to map what we are going to do about it (security policies) and how we are going to mitigate those threat vectors, considering them specifically as they relate to the value assets being threatened, so we can do some protection by design. We need to be able to measure different perspectives of the digital currency system and its security controls, such as the efficacy of those controls, and we need to be able to measure them with increasing precision so that we have a common basis for evaluation and benchmarking. So why do we need standards and standardization, and why do we need DCGI and its function? We are in the middle of a technological innovation wave, just the latest one: cryptocurrencies, NFTs, even earlier-stage innovations. Innovation cannot be stopped, so standardization steps in and says, hey, we have to make sure we create some standards. We will validate those standards, as we're doing now; they'll eventually be sent to Study Group 17, further refined, and eventually issued as standards, and then organizations can comply. Usually this may take a decade or so, but time is money, and if we're trying to reduce the time, increase the confidence, and reduce the cost so that it can be more profitable, then we're trying to shorten that time by having commonality built in, by design, up front. So I believe this is where we can have our first VJ moment; over to you, VJ, for comments. Okay, thank you, Jacques. I don't see any questions from the audience so far in the Q&A, but Ed, maybe you would like to add something on what's happening at your level, and any comments on what Jacques just mentioned, before we move to the next part. Yes, Jacques is describing a validation process, and it's important to start the
thinking along this line: to take in the business case, the authority, and the process in the thinking to move forward. As you know, I'm representing ISO, and ISO has a process called the Common Criteria that was developed years ago. The development now has some commonality, but at the same time there are differences: the subject matter of digital currency obviously would be new. The Common Criteria was made for defense, and was then more broadly developed and created as a standard for use in different commercial as well as government applications. So as we proceed today, we'll be adding more depth to what ISO has, as well as what could be applicable to what Jacques is putting forward. Ed, if I can just bounce off one of the things you're saying: the Common Criteria is very well architected from the point of view of assurance levels, but you mentioned that it came from military systems, so at some point in the conversation we're going to ask whether digital currency systems and their criticality should be treated the way the military treats its critical systems. It's a key point that we'll have to discuss, in the sense of not only the application of security but how much security, right? The assurance level; we'll get into that a little later. Okay, thank you. Next topic, I think. Yes, please continue. Thank you. Now, that's the point I was making, and I'll show some evidence in a moment that that's certainly not happening. I'm sort of heartbroken, because I thought people would treat digital currencies as a use case that can't go through the usual evolution of technological innovation, where security is considered last and function first; that's the normal way things are developed, but I would have thought that for this we need to be careful, and so obviously that's
what's going to be proposed. Now let's take a look at the need for assurance. A lot of the work that you'll see here is based on and articulated around a notion of value and the movement of value. We're going to talk about, yes, financial inclusion, but at the end of the day, value creation, hopefully at the edges, and value consumption, hopefully at the edges; i.e., the intent is that you don't have to move to centralized areas, and a lot of individuals don't have the luxury of moving: they need to either create or consume where they are. The digital currency world will be the lubricant for that creation and consumption, and if that lubricant is not trustworthy, then we'll have challenges. Here I say value exchange for currencies with loss: I would ask, is this just like a credit card, where losses will be built in, indemnified, warrantied, and we'll just call it the cost of doing business? And if that's the case, will that destroy the trust needed for the fourth industrial revolution to happen? So basically there's a lot of potential in front of us, some would say a lot of disruption, in relation to value creation, value consumption, and this cryptocurrency world. There is this promise of financial inclusion, where you're transferring funds for consumption, transferring that value, and on the way back transacting that value, so there's compensation flowing. If that can be permitted to occur wherever you are, frictionlessly, then where there is value there will be an attempt to steal value, and at this point, in relation to digital currencies, it's literally the Wild West. That's just the normal evolution of innovation, moving much faster than our ability to even understand, per se, what's going on, to secure it, to legislate it, to regulate it. Any innovation goes through this Wild West period, and depending on how wild it is, more or less confidence will be lost. Let me
show you an example. As of today, January 24th, there have been 54 hacking events of cryptocurrency exchanges, for approximately 2.4 billion. You see one here: Mt. Gox hacked, 661 million in stolen funds; that's a good day's work. So you see, over a period of 10 years, 2.4 billion lost over 54 hacks. Now if we go to DeFi and look at breaches, we'll see, as I mentioned, 54 exchange hacks valued at 2.4 billion versus 75 DeFi breaches worth 1.7 billion, so more hacks, worth more, in 20 percent of the time. We're looking at an exploding business with exponential damages occurring: the DeFi breaches happened in one-fifth the time. You can get that information here. So that's an example of the fact that we have already, in essence, passed that point and have to dig ourselves out of a hole with this notion of security inclusion, if it is going to enable financial inclusion. Does that mean security by design? Not necessarily. So if we come down to the building blocks of the need for assurance, we have this notion that I've created value and I want to purchase value. The individual, the identity, is obviously core to being able to get remunerated for value creation, so you have here the traditional identification attributes: identify yourself, or prove control over the digital currency. How will you exercise your rights? At the end of the day, again from a point of view of value, the "I" is: what value did you create, or what value did you purchase? Then there's the whole notion of the "what", and the authenticity of the "what", through various cryptographic techniques which Ed would be very familiar with, whether the immutability of the blockchain today or other trust-creating mechanisms that cryptography can enable. That's more on the information side; obviously asymmetric cryptography for identification as well, but it's organized around "I own", "I've created", so you can see that I want to get paid for the value
created, I'm willing to pay for value that somebody else created, and what is that exactly? Once I buy it, how can I prove it's still valuable? And then, as I mentioned, digital currency types at their core are fundamentally about ownership of value, and with the confidence that you own value, you can transact away, even in the DeFi case where there's no centralization and no intermediaries, and still have assurance of these different business characteristics: I need trust, I need clear liability lines, and knowledge of any warranties that are managing or indemnifying some of the liability. These business attributes don't really change, and the "I" and the "what" exist so that you can own with confidence; any weakness here weakens your proof of ownership and therefore hampers your ability to transact away. So at this point it's another pause around the notion of digital currencies, maybe even the word liability and how that's managed, but opening it up, VJ, to you. Okay, Jacques, thank you. I don't see any questions, but I have a question for you before I give the floor to Ed. Maybe you could elaborate a bit: what do you mean by assurance? Is there a definition of assurance for digital currency; what exactly is it? Well, there wouldn't be one specific to digital currency, because assurance is assurance, independent of what you're trying to gain confidence in. It's another word for: how sure am I that I am protected? It's a degree of confidence in the answer. You measure something, you get an outcome, and then you may ask, well, how did you get that outcome? "Oh, I guessed it": that's obviously very low assurance. Or: the outcome was tested, verified, and retested, and here's the outcome; then the certainty of that outcome is very high. So certainty and confidence are synonymous. Okay, thank you. And if we consider what you are going to recommend here, what is the thinking: is it something along
the lines of what we have at the level of the EU, maybe eIDAS for example, where you have these different levels, or do you have your own levels that you're going to present? Are you referring to assurance levels? Yes. The whole topic of assurance levels is a hot-potato topic. Ed mentioned the Common Criteria, and in the military use case the funds are available for that level of verification and certification. So how do we bring forth that notion, or do we, or can we? We need a little more certainty: if people are going to put their value in this form, what is the level of confidence and certainty that they will not lose it, especially as a result of somebody's mistake? So the question, maybe to Ed again: it seems like digital currencies are no different from any other financial process, and in financial processes, in credit card processing, there's a cost of doing business. Is that something that is simply going to have to be true for digital currencies? There will be, over time, losses by digital currency type, and people will put their money in the most secure version of the currency. So I'll give it back to you, VJ. Yes, so maybe let's hear from Ed: what's the thinking at the level of ISO on this? Well, there has been a lot of discussion on the subject. The point here is that we're focusing on digital currency as the business case, as the reference that we need to add security to. As Jacques was just saying, assurance is a level of guarantee that the security methodology that has been chosen will perform according to some authority. We have digital currencies and their relationship to central authorities, and it can be visualized that the central authorities are also going to pull in the policy and the other factors that define, or can define, what assurance is. Our validation is needed as such.
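As a concrete reference point for the assurance levels being discussed, ISO/IEC 15408 (the Common Criteria) defines seven Evaluation Assurance Levels, each reflecting more rigorous evidence of correctness. The sketch below lists them; the `assurance_gained` function and its evidence labels are my own toy illustration of the idea that more rigorous verification yields higher confidence, not part of any standard.

```python
from enum import IntEnum

class EAL(IntEnum):
    """Common Criteria (ISO/IEC 15408) Evaluation Assurance Levels."""
    EAL1 = 1  # functionally tested
    EAL2 = 2  # structurally tested
    EAL3 = 3  # methodically tested and checked
    EAL4 = 4  # methodically designed, tested and reviewed
    EAL5 = 5  # semiformally designed and tested
    EAL6 = 6  # semiformally verified design and tested
    EAL7 = 7  # formally verified design and tested

def assurance_gained(evidence: set[str]) -> EAL:
    """Toy mapping: the more rigorous the verification evidence,
    the higher the confidence in the outcome. Labels and thresholds
    here are invented for the example."""
    if "formal_proof" in evidence:
        return EAL.EAL7
    if "semiformal_design" in evidence:
        return EAL.EAL5
    if "independent_review" in evidence:
        return EAL.EAL4
    if "structural_testing" in evidence:
        return EAL.EAL2
    return EAL.EAL1  # "I guessed it": very low assurance
```

This captures the distinction Jacques drew a moment ago: a guessed outcome sits at the bottom of the scale, while an outcome that was tested, verified, and retested sits much higher.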
So you have a method that includes the decisions by governments and the decisions by the citizens, since they are going to be included as part of digital currency usage. The point that Jacques is raising now is also: what value, not necessarily monetary, do people put on their currency or their use of currencies? Now we are moving to currencies represented in a digital format; does this raise the bar with regard to security, since Jacques mentioned a whole lot of cyber attacks and things of that nature which could be directly related to digital formats? So we are working toward a digital currency being defined that must take into account these risks, these potential flaws, and have the ability to add new technology or innovations from a security perspective, based on the business case or what is desired to take place. So Ed, you mentioned the word authority a moment ago, and back to my Wild West use case: I think it's the authorities that are missing. I mean, standards development organizations are not authorities; I hope you agree with that, in the sense that they produce "shoulds", right? Standards produce things that you should do; authorities change the should to shall. Right now it's almost like there are no grown-ups in the room: we don't have anyone mandating "shalls" in terms of security level, because we're just starting to figure out what it should be, let alone shall be. Good point. However, we do have regulatory authority: the regulations and the guidance and the requirements, the "shalls", that come with banking and with the existing systems, which must also be included in the decisions. And while you were talking, I was thinking about these new startups in the business: how do they value security? Probably the same way as any startup values security, which is on a secondary basis, or only when something happens to a particular
digital currency type: some of it's been stolen, and there's a run out of that digital currency type to somewhere else, right? Confidence, plus the friction to convert being very low, means you can get some very rapid swings in the value of a currency based on whether people perceive their value to be safe or not. So thank you; let me continue. Yes, go ahead. Yes, Jacques, just to let you know, we are a bit behind time. Okay, right, thank you. So, the state of global protection; here I'm going to describe the big problem. If we look at 2018, which is obviously pre-pandemic, this was the size: 86 trillion dollars of global wealth, and 4.5 billion people internet-connected out of 7.7 billion, for roughly 60 percent penetration. Clearly that's a huge, juicy target: that amount of economic creation, and individuals who can be exploited to get at that value. And we all know that cyber risk is right up there among the global risks. Here's a measure of how we are protected; in fact there's an updated version of this report, so I'll have to see how things have improved, but back then about a third of the world thought they were protected, and the other two-thirds were under-protected, with 45 percent having no protection. You're seeing the dark blue here: no protection, under-protected, and the advanced countries who think they're protected, spending some 300 billion dollars on thinking they're protected. When you look at the expenditures, the roughly one-third who think they're protected spend about 80 percent of the budget, or 240 billion dollars, on protecting themselves; the rest of the world gets 20 percent, or 60 billion, to protect themselves. So that's a snapshot of the state of global protection in 2018, with two-thirds of the world rapidly connecting to the one-third in this highly distributed, decentralized, everything-connected-to-everything world. So it's a major problem
where the business and the innovation are moving ahead: people are connecting into the collective, and it's happening whether you're secured or not. The next thing to realize is that if you look at the demand for security skills and expertise versus time, we all know there's a massive shortage of skills relative to their availability, and in my opinion, as of now, we are in a state of unsustainable divergence, meaning the problem will never be reversed such that at some point we could get ahead of the curve. So you may ask yourself, what do I do to bend the curve? It can't be insignificant; these are bend-the-curve things: okay, we share some threat intelligence, we do things by design, we do better education, and that's real training, by the way, not just "don't click on that link". Security automation, yes, that would help to bend the curve. But we have this fiduciary awakening going on; I thought it was kind of over a few years ago, but it's been going on for quite a while: the executive gets hacked and says "holy something", then "oh, I need a CISO, I need 10 or 20 people", and all of a sudden they're awakening and creating increased demand, because they've been somewhat asleep at the wheel. So all these things, including security training of employees not to click on the link, and more, will have to come into play in order to get ahead of this curve, and what is being presented today can bend both the demand curve and the supply curve, hopefully in a way where at some point we can get ahead of the problem. The only problem is that, on average, with the number of unskilled people entering the security space, the average skill level is going down, and we all know that threat severity is not linear but grows exponentially, so the real picture is a little worse. But that's the state of global protection, and this is where I can pause. The point is: can we ever get ahead of the curve,
because the innovators are innovating and putting these systems online, they're available, and the two-thirds of the world who still remain unconnected will get connected. They'll use these services, and financial inclusion will occur, but at what loss, what damages, reputational or to the business? So what can we do to get ahead of the curve? Any other questions at this point, Ed or VJ? It's all yours. Yeah, maybe, Ed, some thoughts from your side on the questions raised by Jacques here. Well, Jacques is bringing up the point of what could be taking place in the international community regarding security, and what we are asking here is: is there a balance that can be reached? Because we also have the growth of the business case, the growth of wanting to move to a digital representation for this type of banking capability. Now, we have tools that are part of the supply, as Jacques has identified; it's how we can use these tools, and the idea, the effort that we are talking about today, is to put these tools into a process that can facilitate things. The process can give us an advantage on time to delivery, where if we rely on the education level alone, we may not have time to deliver. So understanding the various factors, to what extent security is needed and to what extent we have tools to address those needs, becomes important, and is part of the discussion today. So Ed, this is a global problem, and between ITU and ISO, from a standards development organization perspective, we cover, I don't know what percent of the landscape, but clearly the only way we can get ahead of the curve is if we think unified, right up front, just like we're trying to do with this common, consistent, comparable approach. Are we, at the very early stages, normalizing and leveraging each other, so that the outcomes of ISO and ITU are aligned, not overlapping? These
sessions ensure that we complement each other, but I'm realizing that the play that can get ahead of the curve has to be a united front, because even the ISO and ITU silos are producing their own standard specifications. A truly unified, common, consistent output, appropriate to both organizations as represented by the two of us, would be a necessary but insufficient requirement; I don't know what the other sufficient things are, but if we don't cooperate and come out unified, then we're not getting ahead of the curve, I guess is what I'm saying. So hopefully our liaison will be very productive. Any other comments? No, I think you can continue. Okay: some operational challenges. You have this word assurance, and it's something different from security, isn't it? You're going to be putting in a certain amount of security, and we're talking about what your confidence level is in the efficacy of those controls in mitigating threats. So you still need information about what you need to do, you need information about what has actually been done, and there's a whole communication cycle going on continually. If we don't solve some of the operational challenges, then we will not be able to keep up with the innovation. This comes back to the frictions involved in conducting security on a daily basis, in a world of security control information overflow. Basically, there's this dream: the CISO is sleeping and dreaming of nirvana, that when they wake up tomorrow morning and the executive asks them a question, "are we...?" (the CISO doesn't understand why it always feels hot when they get a question like that; they're in the kitchen), the CISO can whisk off and look inward at the actual control implementation state. In this dream world there's a crystal ball: just get up at 8:55, look into the crystal ball, and get the answer five minutes later. But
But that visibility is into the actual control implementation state, which is protocol-specific and environment-specific, and it's a different world from the one the executive, the chief information security officer, has to communicate back to. Usually you have to respond based on the state of controls you just saw in your crystal ball, but the response also has to be framed by the question: are we compliant with this, are we compliant with that? So there's a huge amount of friction, causing billions of dollars of expenditure by every organization, simply in going up and down between the actual implementation world and the objective framework world. There's no direct connection there, at least until now, and translating the response into the context of the question and the state is a very painful process. We can't expect organizations to apply security, let alone verify its assurance level because their use case and business case warrant that level of assurance; otherwise they're going to be in the news.

The operational problem is really an information overflow problem. It's odd for me to say that, because ISO and ITU produce part of what I'm calling the problem, which doesn't sound very good. But there is an awful lot of organizations, whether standards bodies, the regulators and authorities we spoke about earlier, certification schemes, or industry associations, in the US and globally, and there's a huge amount of documentation about what is, in the end, a finite body of security one needs to know. You're left struggling, asking where it all fits. You have this bowl of spaghetti and you're largely left on your own to figure it out. Getting to the material doesn't seem that difficult, but what does it all mean? Is it complete? Is it correct? When so many different documents explain the same thing, correctness itself becomes an issue. At the end of the day you want to find out how to do something, and unfortunately the number one tool today is the spreadsheet: frameworks are created in spreadsheets, gap analyses are conducted in them, and while regulators don't literally send you frameworks in them, your first step is to put them into Excel. But Excel itself is a tomb. There are fundamental operational reasons why we cannot use any framework in any spreadsheet, and it's because a spreadsheet is a one-to-many construct. It says: here is my security control framework, a fixed reference, and I am going to tell you, through a series of hundreds of controls, what you should do. That is the view of the world as defined by the author of the framework. They'll be very nice to you and give you some mappings, saying this control is equivalent or related to some other control, but they only give you the identifier; you have to go find the other document, open it, and look it up yourself. So you have a one-way external mapping, and if the industry is stuck in this tool, that's a problem. We need a construct that is not one-to-many but many-to-many. Here's the example of the Cyber Risk Institute framework, defined on the left with all its controls, and it is very nice of them to give you the mappings as a CSV. Sorry Jacques, I'd have to ask you to wrap up this part.
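As a purely illustrative sketch of the one-to-many versus many-to-many distinction just described: all class, method, and control names below are hypothetical (the DCGI platform's actual data model is not shown in this session), but the idea is that each mapping is stored symmetrically, so any framework can serve as the reference anchor on demand.

```python
# Hypothetical sketch: a many-to-many control mapping graph, versus a
# spreadsheet's fixed one-way mapping column. Control identifiers below
# are illustrative examples, not authoritative mappings.
from collections import defaultdict

class ControlGraph:
    def __init__(self):
        # control id -> human-readable description
        self.descriptions = {}
        # undirected adjacency: each mapping stored in both directions,
        # which a one-to-many spreadsheet cannot express
        self.links = defaultdict(set)

    def add_control(self, control_id, description):
        self.descriptions[control_id] = description

    def add_mapping(self, a, b):
        self.links[a].add(b)
        self.links[b].add(a)

    def equivalents(self, control_id):
        """Controls related to control_id, with descriptions side by side."""
        return {c: self.descriptions.get(c, "?") for c in self.links[control_id]}

g = ControlGraph()
g.add_control("CRI-DM-1", "Protect data in motion, at rest, and in use")
g.add_control("ISO27002-8.24", "Use of cryptography")
g.add_control("NIST-SC-28", "Protection of information at rest")
g.add_mapping("CRI-DM-1", "ISO27002-8.24")
g.add_mapping("CRI-DM-1", "NIST-SC-28")

# Any control can now be the anchor: ask from either side of the mapping.
print(g.equivalents("NIST-SC-28"))
```

The design choice is simply an undirected graph instead of a fixed-reference table: the "battle of the frameworks" dissolves when no single framework is privileged as the anchor.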
Okay, sure. The point is that there is a platform at DCGI that allows you to select any reference framework and any mapped-to framework, show the descriptions side by side, which cannot be done in a spreadsheet, show the mapped-to framework's controls, and then show how a given control validates any number of other controls. Clearly there is a need for this friction to be removed from the system so that one can actually do security. This work was awarded a best paper award, third place, and in it we say the tool will be made available to the world, which I still have to actually do. This is the pause point on operational challenges. VJ, back to you.

Okay, thank you, Jacques. Ed, very quickly, any thoughts?

Well, it's not only ISO; look at the financial community itself. We also have to consider that the consumer is going to be the consuming party in this effort, since the digital currency will eventually go through the process but wind up on the consumer's phone or PC. What is being discussed now is ways of presenting security to that consumer, or to an organization as such. In other words, it has to be simplified, so that consumers don't have to know how the response was derived but are able to relate to the response in their own environment. Just another thought.

Thank you, Ed. Jacques, over to you for the next part.

Thank you. Evolution by consensus. In order for us to make progress, this is the time to ask: do we need to rethink security? Yes. We need a tool that lets me travel and navigate without friction to anywhere, to any framework, changing the reference anchor on demand, and so on; a tool that makes it easy to get to the information I need to see. The other problem is that the information you see is in written form. So you take this new model and ask yourself the following question: is security a delivered property, and is protection a received property? A security team can be performing what they perceive to be perfectly good security, with a system operating exactly according to its design, and yet delivering very little protection per se. If you make the assumption that delivered security and received protection are not the same, it leads to the ability to model things a little differently. Here's a quick example. A security control contains a description of the security value it wants you to impart on some business value: "ensure the integrity and confidentiality of data in motion, at rest, and in use." Even though this reads like a single control statement, the set of things you must do to achieve it is huge. If you refuse to give it the benefit of the doubt and instead assume the two sides are not the same, then security is delivered by an asset, where an asset can be any people, process, or technology, and we do not know whether it is effective in mitigating the threat being exercised on the target, the business asset, which is itself people, process, and technology generating business value.

The work you are about to see already has a fairly long history. There is a unified security model; it is incorporated into the ITU focus group work on protection and assurance for digital currency, so it is, in a sense, in the security manual; its thinking is embedded in the non-normative annexes of two ITU-T X-series recommendations; there is that best paper award; and there is a platform based on the model that we are now using to do validation. I am going to go quickly through the new model, and I'd ask you to take a moment to put aside your existing notions of security, because security evolved by consensus over time. It was not created from a clean slate, and at the end of the day a clean slate is what I started from.
So go to a universe far, far away where there is nothing, and then a value creation force is born, the internet of value. Where there is value, there is risk; it's just human nature. So you have an influence between two forces: value creation is threatened by value risk, and risk threatens creation; the relationship is bidirectional. Because there is risk of loss to value, the owner of the value will invest in its preservation, so a value preservation force is born, and its relationship to value is that it protects it. Then whoever owns the value and is investing in its preservation will also want to make sure that the preservation is in fact effective in mitigating the loss, so the preservation force is validated by an assurance force. At this point in the universe there are only four things going on: the good, creation of value; the bad, theft of value; preservation of value, where one dollar of value preserved is a dollar of value created; and an assurance force that makes sure value really is preserved. So at this point we have a four-force model.

Now, one morning all of those forces explode and give birth to their actors. Instead of four forces, we get billions of value asset actors and value process actors, billions of threat vectors, those nasty little guys, billions of internal controls and security assets that deliver security mechanisms, and the assurance force gives birth to all the external controls in the universe. We move from four forces to billions of actors, all imprinted with their DNA: a threat vector's life is to attack value assets; that is wired into its DNA, and it is all it does. Now, if you are an organization, you have full control over your value assets and how they enable value processes, and full control over how you preserve your value, which is your security budget, but you have essentially no control over the external threat vectors or external compliance. You have to manage against those, while these are the variables you can fully manage.

Those six actors, derived from the four forces, engage in only five relationships. A value asset in the universe only knows how to enable value processes. Threat vectors only know how to threaten value asset vulnerabilities. Internal controls are policy statements about how you will protect a value asset, given what it is doing and how it is being threatened. Then you have actors who work out how to actually meet every requirement in an internal control; "confidentiality of data," for example, implies a lot of mechanisms on a lot of assets that you need to put in place to fulfill it. And at the end of the day, how does all that get demonstrated to the external control actor? That is the model, and that is the security control expression. This model has the property that it can describe any scenario of security in the universe, and therefore all of security is reduced to six actors and five relationships. What we have done is replace a world of many words, where you have to decompose a control and figure out what the target is and what is being asked, with an expression that syntactically isolates "threatens," "enables," and so on, so you know exactly which relationship is in play. So we go from a state of spaghetti to a state where we have removed a lot of the unnecessary wording and created relational structures that are consistent left and right, and as we create and record these relational structures, they become knowledge that remains persistent in those expressions. This is the next pause point. VJ, over to you.
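The six-actor, five-relationship model described above could be sketched as typed relations. This is an assumption-laden illustration: the actor and relationship names come from the talk, but the exact pairing of which actors engage in which relationship is my reading of it, and the encoding is not the DCGI platform's actual schema.

```python
# Hypothetical encoding of the six actors and five relationships.
# Anything outside ALLOWED is rejected, which is what turns free-form
# control prose into a consistent relational structure.
from enum import Enum

class Actor(Enum):
    VALUE_ASSET = "value asset"            # people/process/technology creating value
    VALUE_PROCESS = "value process"
    THREAT_VECTOR = "threat vector"
    INTERNAL_CONTROL = "internal control"  # policy statement about protection
    SECURITY_ASSET = "security asset"      # delivers security mechanisms
    EXTERNAL_CONTROL = "external control"  # assurance / compliance side

class Rel(Enum):
    ENABLES = "enables"
    THREATENS = "threatens"
    PROTECTS = "protects"
    FULFILLS = "fulfills"
    DEMONSTRATES_TO = "demonstrates to"

# The (subject, relation, object) triples this sketch treats as legal.
ALLOWED = {
    (Actor.VALUE_ASSET, Rel.ENABLES, Actor.VALUE_PROCESS),
    (Actor.THREAT_VECTOR, Rel.THREATENS, Actor.VALUE_ASSET),
    (Actor.INTERNAL_CONTROL, Rel.PROTECTS, Actor.VALUE_ASSET),
    (Actor.SECURITY_ASSET, Rel.FULFILLS, Actor.INTERNAL_CONTROL),
    (Actor.SECURITY_ASSET, Rel.DEMONSTRATES_TO, Actor.EXTERNAL_CONTROL),
}

def expression(subject, rel, obj):
    """Build one security-control expression, rejecting anything outside
    the five relationships."""
    if (subject, rel, obj) not in ALLOWED:
        raise ValueError(f"{subject.value} {rel.value} {obj.value} "
                         "is not one of the five relationships")
    return (subject, rel, obj)

# e.g. a database (value asset) enabling a payments value process:
expression(Actor.VALUE_ASSET, Rel.ENABLES, Actor.VALUE_PROCESS)
```

The point of the restriction is the same as in the talk: once every statement must be one of a few typed relations, "a lot of words" collapses into structures that can be stored, compared, and queried.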
So the point here is a new model started from scratch, versus a lot of old established models that still exist in ISO and in ITU, and maybe that raises the question of whether it is time to retrofit some of those models that were conceived twenty years ago. Ed, can we have your thoughts?

Well, we're condensing a really broad subject, but, reinforcing what was pointed out a few minutes ago, I think the experience from ISO and the experience and developments from ITU can address the subject of digital currency as it relates to central authorities. Given the potential for collaboration, the committee might consider both the aspects identified in the approach Jacques has just been talking about and what already exists, and perhaps another direction would be to see the new approach through. Go ahead.

Yeah, I was just saying, I think that's a good suggestion. We already have a liaison, but if DCGI and ISO can collaborate on this, we might come up with something completely new that could be useful here.

I agree. The new approach, as I'm articulating it, is the step function we need to cross. We're in a world of having to select frameworks one, two, three that are in scope because they're given to us, mandated to us. The new approach needs to eliminate the dependence created when a single framework is adopted, and provide a tool that lets you see all security control frameworks as simply part of the security body of knowledge, to use any of them in any way you see fit, to create the one you need, customized and optimized to your target. The new approach is not having to select which framework to take, though again, if you're mandated, you don't have a choice. I call the current state the battle of the frameworks: we're still in an era where industry associations are battling it out, "pick me, pick me," and the point is, let's not pick one; let's pick them all.

All right. Jacques, just for your information, we have fifteen minutes left, so over the next ten minutes please complete the rest of your presentation, and then we'll do a final pause and conclude the session.

Thank you, VJ. I'm going to jump quickly ahead. We're talking assurance, and that means a need for formality. You asked about the definition of assurance; you won't get it through things being loose. It lives in the formality: how do we know, for sure, that every vulnerability of every value asset in every value process in one digital system has been treated? That sureness is a level. This has nothing to do with the kind of treatment; it can be a very poor treatment, which is another topic. But I need formality in understanding what the vulnerabilities are and whether I have gone through them systematically to address them, so that if I get hit by a bus tomorrow, the next person can come in, continue, and know exactly the scope of vulnerabilities in the system. The work is no longer dependent on the person doing it, which is another very strong upside, because in security people leave all the time; if you can institutionalize the security knowledge they hold about the organization, it matters much less when they leave. So, how do we know? Remember the model: it holds for one vulnerability of one value asset performing one value process, which gives you a sense of criticality and data sensitivity in the digital system. How can it be threatened, what should I do about it, how am I going to actually do it, and how am I going to prove it? That is always the same for every vulnerability.
So I can treat this one vulnerability, and when I aggregate across all vulnerabilities of a value asset I get what I would call a value asset attack surface profile. If I then iterate over all value assets in one value process, I end up with a value process attack surface, and if I then iterate over all value processes, I have a digital system attack surface. This nested iterative process lets you serve up, in reverse if you will, every vulnerability of every value asset in every value process making up one digital system.

The tool lets you pull up a digital system, a digital currency system, with its value processes stored in the value process repository of that digital system. Value processes can be decomposed until they are small enough that you can identify the enabling value assets, and you store all the enabling value assets for a value process in that process's value asset attack surface. You are essentially recording as you iterate: you went through every value asset of every value process and said, okay, it's a structured database, the text is stored in plain form so I should encrypt it, I should put access control and two-factor authentication on that database, and I should monitor its behavior. There are lots of things you can do, but only after you have knowledge of the thing being attacked can you look systematically at how each vulnerability can be exploited. Then, when the deep subject-matter threat experts have looked at every vulnerability in every value asset of every value process, they store the applicable threats in another framework. At this point you know your target quite precisely, you know how it can be exploited, and we can start the treatment process.
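The nested iteration just described, from vulnerability to value asset to value process to digital system, could be sketched roughly as follows. The class and field names are assumptions for illustration, not the actual DCGI repository schema.

```python
# Rough sketch of the nested attack-surface aggregation described above.
# All names and example data here are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class ValueAsset:
    name: str
    vulnerabilities: list = field(default_factory=list)  # treated one by one

    def attack_surface(self):
        # aggregate across all vulnerabilities of this one asset
        return [(self.name, v) for v in self.vulnerabilities]

@dataclass
class ValueProcess:
    name: str
    assets: list = field(default_factory=list)  # enabling value assets

    def attack_surface(self):
        # iterate over all value assets in this value process
        return [entry for a in self.assets for entry in a.attack_surface()]

@dataclass
class DigitalSystem:
    name: str
    processes: list = field(default_factory=list)

    def attack_surface(self):
        # iterate over all value processes: the whole system's surface
        return [entry for p in self.processes for entry in p.attack_surface()]

db = ValueAsset("ledger database", ["plaintext storage", "weak authentication"])
payments = ValueProcess("retail payment", [db])
system = DigitalSystem("pilot CBDC", [payments])

# every vulnerability of every value asset in every value process
print(system.attack_surface())
# [('ledger database', 'plaintext storage'), ('ledger database', 'weak authentication')]
```

Because the enumeration is exhaustive by construction, the scope of vulnerabilities is recorded in the structure itself rather than in one practitioner's head, which is the "hit by a bus" resilience the talk mentions.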
This is the staging point. That finishes this particular point: the need for formality, and how that formality gets translated into a tool that exercises it. VJ, all yours.

Jacques, like I said, maybe complete all your slides and then we'll come back to questions.

Okay, let's do that; we're really into the summary. If we take the Common Criteria that Ed mentioned earlier, which comes from ISO, you get increasing levels of assurance, and what I'm trying to point out is what changes between them: the formality and robustness of the verification method, functionally tested, structurally tested, and so on. It is the formality of the testing procedure, and even the method of testing is documented; as you go up the levels, it is simply a lot more effort in verification. The tool I mentioned provides that capability by letting you increase precision and measurement, and precision and measurement give you assurance, not insurance. What is different in the proposed method is that we know a lot about the target. In fact, we start by defining the thing we are protecting as the target, and then we do the threat analysis, which is not the usual order; we are trying to shift away from what I would call notional security, external frameworks written with no knowledge of your target, to precision security, which is based on knowing exactly your value asset types. On that note, this is our progress as it relates to our goal. I would ask any individual with any interest, whether on the policy side, the architecture side, or the validation side, to please help us. This is work in progress, and it will only get better through multiple contributions. That's it for me, Vijay.

Okay, thank you very much, Jacques, for the presentation. Maybe now we can take some comments from Ed on the Common Criteria part, the slide where you showed this. Any insights from your side, Ed, or at the level of ISO, on applying this to digital currencies?

Well, in short, it turns out that I did one of these evaluations many years ago, so it's not recent, and it wasn't digital currency, it was a communication system, and the kinds of questions that came up were very similar to what we've presented today: what authority is going to accept the model, the type of target, and all the necessary capabilities that would exist. Then you apply the security levels, and as you see here, in the case of Common Criteria there are seven levels, but there is also a cost in time that goes with each of those levels. So you put that into the business model for security: what can realistically be done with what is available, and how much is automated as opposed to something you have to create. The terminologies are similar.
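For reference, the seven Common Criteria evaluation assurance levels (EALs) from ISO/IEC 15408 that Ed refers to can be summarized as a small lookup table. The level names below are the standard ones; the note about rising cost is an informal characterization from the discussion, not normative text.

```python
# Standard Common Criteria (ISO/IEC 15408) assurance level names.
EAL = {
    1: "functionally tested",
    2: "structurally tested",
    3: "methodically tested and checked",
    4: "methodically designed, tested, and reviewed",
    5: "semiformally designed and tested",
    6: "semiformally verified design and tested",
    7: "formally verified design and tested",
}

def describe(level: int) -> str:
    """Name for an EAL; evaluation effort, cost, and time rise with level."""
    return f"EAL{level}: {EAL[level]}"

print(describe(1))  # EAL1: functionally tested
print(describe(7))  # EAL7: formally verified design and tested
```

The progression itself illustrates Jacques's point: what changes from level to level is not the kind of security functionality but the formality and robustness of the verification method.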
Do you think that, well, in the military world you're given the money to be this assured; the buyer pays for that level of assurance. We don't have that in the digital currency world: nobody is going to pay this money except the company itself, and maybe not even the consumer on the buy side of the currency could sustain the cost. There's a complexity component; it's not easy to do, it's very costly, and since we're at a very early stage in the innovation cycle it's all about functionality. I don't think anyone is close to mandating any assurance level. Remember our past conversations, when you said we'd get everyone to agree to assurance level zero? Everyone can agree at level zero, or maybe level one; we can agree at the lowest level of assurance, can we not? We can get agreement on that, and then the difficulty is whether we can get agreement on level two, when reality, costs, and market dynamics sink in. But you can see that in the CBDC world, where it is the central bank's currency, they would take more due care. Sorry, go ahead.

Right, it is a question of due care: what is the due-care responsibility of that cryptocurrency operator? There are a lot of factors to consider here. We can be talking about cryptocurrencies, we could also be talking about fiat, and about what relationship between these two entities could be acceptable; for that relationship, a distributed cost model could emerge. But there is a cost, and we are still at the discussion stage on the question of money, the question of the digital representation of money, and then the application of money comes into play. As we were just discussing, the EAL levels could be assigned to that application, but they could also be assigned to the money itself. So with regard to the CBDC, we are still defining how it should be viewed in the business case.

Okay. Jacques and Ed, let's take a couple of questions from the Q&A. There's a question for you, Jacques, on the implementation aspect: which of the EAL levels apply on the client side, for example in an app, and which on the blockchain, server, or cloud side?

To my knowledge, I'm not aware of much Common Criteria activity going on in any digital currency system at the moment; I don't believe anyone has submitted an actual target of evaluation as such.

And another question: how do you see the assurance level for CBDCs in the face of quantum technology?

Quantum is definitely a game changer; the question is when it will happen. We have a specialist, Paul Lloyd, in our working group on this, but clearly any design must take that aspect into consideration, because it is not far away. As soon as the capability is available, a lot of the assumed protections evaporate, and that weakness will of course be exploited at the speed of light, so any amount of time in that exposed state could be devastating to a company. Certainly for a ten-year horizon you need to start thinking about swapping out some of the algorithms appropriately, and there is a lot of activity in that area. With quantum, all of a sudden the algorithms we thought were giving us protection don't. From the EAL perspective, which is somewhat separate, the new algorithms that come out will be tested to some level; the algorithms being produced by NIST and others will be heavily and formally tested, and to what level is a good question, but if you're evaluating an algorithm or a protocol, you're probably aiming fairly high, because a zero-day on something like that would be devastating.

Okay, I think we've come to the end of the session; we're well past half past five here. Just to quickly summarize: we've seen the issues around how you would protect your digital currency systems; Jacques presented the new model he is developing and how we can think about assurance levels for digital currencies; and this is work still in progress at the level of the Digital Currency Global Initiative. Since ISO is also working on this, a stronger collaboration going forward, beyond the liaison we already have, perhaps with joint meetings, could be considered, so that both sides can share how we move forward in this area, and hopefully we'll have something to share at the second edition of the DC3 Conference later this year or next year.

With these thoughts, ladies and gentlemen, we've come to the end of today's session. I would like to thank both our panelists for their participation and for sharing their insights on the topic. We've also come to the end of the DC3 Conference, and I would like to thank everyone who attended the different sessions over the past three days. With that, thank you for your participation; I declare the session closed, as well as the conference. Thank you, have a nice day everyone, and stay safe. Bye-bye.