All right, thank you. So my name is Sergey; the topic is up here, and as Arkady — he's my supervisor — has already mentioned, there was a small mistake introduced by me. First, to give you an understanding of what is actually studied here, we need a little brush-up on public-key cryptography, for those of you who might not know what it is. I'm not trying to dig into the mathematical foundations, just how it looks from the user's perspective; for those of you who already know this, that's okay.

So first we need two persons, Alice and Bob, who want to communicate securely over an insecure channel observed by an attacker. Bob creates two keys — that's the basic concept, there is a duality between these keys: a public key, which is green, and a secret key, which is red. He keeps his secret key to himself and sends the public key to Alice. Alice then applies the encryption function using this public key and sends an encrypted ciphertext C to Bob. Bob applies the decryption function using his secret key and obtains the plaintext message M. The trick is that only Bob can perform this decryption, because only he knows the secret key.

This game can work in the other direction as well, when Bob wants to authenticate a message sent to Alice, so that Alice is assured it was actually Bob who sent it. He applies the decryption function to a hash value — a digest — of the message using his secret key and sends the result alongside the message. Alice then, using the encryption function with Bob's public key, verifies the signature, and if it matches, she is assured that it was actually Bob who sent the message and that it hasn't been altered during transmission. I hope this is clear, because it's vital for understanding what follows. All right. But a couple of questions arise.
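To make this asymmetry concrete, here is a toy sketch of both flows — encrypt/decrypt and sign/verify — using textbook RSA with the classic tiny example key. This is purely illustrative and not from the talk: real systems use large random keys, padding, and a proper hash for the digest.

```python
# Toy textbook RSA with tiny fixed primes: illustrative only, not secure.
p, q = 61, 53
n = p * q          # 3233, the shared modulus, part of both keys
e = 17             # public exponent: (e, n) is Bob's "green" public key
d = 2753           # secret exponent: (d, n) is Bob's "red" secret key

def encrypt(m):    # Alice: c = m^e mod n, using Bob's public key
    return pow(m, e, n)

def decrypt(c):    # Bob: m = c^d mod n, using his secret key
    return pow(c, d, n)

# Signing runs the keys the other way round: Bob applies his secret key to a
# digest of the message; Alice applies his public key to the signature and
# compares the result with her own digest of the message.
def sign(digest):
    return pow(digest, d, n)

def verify(digest, signature):
    return pow(signature, e, n) == digest

m = 65
c = encrypt(m)
assert decrypt(c) == m       # only the holder of d recovers M
assert verify(m, sign(m))    # only the holder of d can produce the signature
```

For brevity the "digest" here is the message itself; as the talk says, in practice one signs a cryptographic hash of the message.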
So how does Alice actually know that this public key belongs to Bob? If Bob gives it to Alice in person, that's fine, but that's not always feasible in the electronic world. So here we have the concept of a certificate authority. An analogy might be: a person applying for a driving licence comes to an agent to get it, and the agent asks for a secondary ID, like a passport, to verify that this is actually the person to whom that secondary ID was issued. Here, Bob as a user comes to a trusted certificate authority, which verifies that yes, this public key actually belongs to this person, and certifies the public key by signing it with its own private key. The structure we obtain is called a certificate: Bob's public key, signed by some trusted party called a certificate authority.

Further, this is where certification networks arise. We may treat such a network as a directed graph, where nodes are users, CAs, or any entities that can sign, and directed edges are acts of certification — signatures on certificates. That's how it looks in a trivial example; in the real world, if we're talking about the PGP web of trust, the strongly connected component contains more than 50,000 certificates, so these are quite huge networks. There are two main types of structures here. The basic one is hierarchical, which is what is usually seen and used on the Internet: when you browse to a website, your browser needs to validate the certificate of that website to assure you that you're actually visiting, say, your bank's online banking website.
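As a minimal sketch of this graph view — with names and edges invented purely for illustration — a certification network is just a set of directed edges, where an edge (signer, subject) is one signature on the subject's certificate:

```python
from collections import defaultdict

# A certification network as a directed graph.  Each edge (signer, subject)
# is one act of certification: signer vouches for subject's public key.
# All names and edges here are invented for the example.
edges = [
    ("CA",    "Bob"),      # hierarchical part: a CA certifies end users
    ("CA",    "Carol"),
    ("Bob",   "Dave"),     # web-of-trust part: users certify each other
    ("Carol", "Dave"),
    ("Dave",  "Bob"),
]

signs = defaultdict(set)       # signer  -> subjects it certified
signers_of = defaultdict(set)  # subject -> who vouches for it
for signer, subject in edges:
    signs[signer].add(subject)
    signers_of[subject].add(signer)

assert signers_of["Dave"] == {"Bob", "Carol"}   # Dave has two introducers
```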
The point is that nobody else has interfered with your communication, trying to steal your password by luring you into some malicious website. The PGP web of trust, in contrast, looks in its general form like an arbitrary directed graph, where all users can act as certificate authorities and each user assigns their own private trust values to any subset of the other users.

Another concept here is a PKI, the actual system in which all these technologies work together. It raises a lot of questions, but we are interested in these two: it disseminates certificates among the users of the PKI — that is, it maintains the structure of the certification network — and it provides the service of certification, where Alice trusts some certificate authority, or some other user, who has verified that Bob's public key actually belongs to him. This is vital. One of the examples I've given you: when you browse to your online banking website, you may notice the green padlock icon, which says that your browser has successfully verified that you're actually at the website you're after.

Okay, so now, trust metrics. Trust arises here in two forms. The first is the actual trustworthiness that the user assigns to a CA, a certificate authority, or to any other user if we're talking about the OpenPGP web of trust. For instance, in the X.509 standard this is a binary choice, trusted or not trusted; in PGP there are three choices. The actual goal of the system is validity: the level of confidence the user has in a particular certificate. It is the result of an algorithm that takes as parameters the certification network (its structure), the root user for whom the validity is being computed, and that user's private trust values for all the other users in the network.
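To make the shape of such an algorithm concrete, here is a minimal sketch — my own illustration, with a binary codomain — of the simplest rule of this kind: a certificate is valid for the root exactly when some certification path whose vertices are all trusted leads from the root to it.

```python
from collections import deque

def valid_set(edges, root, trusted):
    """All vertices valid for `root`: reachable via trusted vertices only.

    `trusted` is the root's private trust assignment; the root carries
    ultimate trust, so it should be included in `trusted` itself.
    """
    out = {}
    for a, b in edges:
        out.setdefault(a, set()).add(b)
    valid = {root}                    # the root is always valid for herself
    queue = deque([root])
    while queue:
        y = queue.popleft()
        if y not in trusted:          # an untrusted Y may itself be valid,
            continue                  # but its signatures confer no validity
        for x in out.get(y, ()):
            if x not in valid:
                valid.add(x)
                queue.append(x)
    return valid

# A -> B -> C -> D, plus A -> E.  C is valid but untrusted, so D stays invalid.
edges = {("A", "B"), ("B", "C"), ("C", "D"), ("A", "E")}
assert valid_set(edges, "A", trusted={"A", "B", "E"}) == {"A", "B", "C", "E"}
```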
So this is what we call a trust metric. One of the main problems here is to study the properties of the set of validating certification paths. Suppose A certifies — sorry, validates — X, and we have, say, three edge-independent and one vertex-independent paths; the configuration of this set of paths is vital, because we may derive nice properties like attack resistance from the properties of this set of certification paths. These four are just examples of criteria by which we may measure this.

First, we may want to classify all trust metrics simply by their codomain. The simplest case is binary, which is where X.509 certificates live; then we have a finite set of values, which is where OpenPGP resides; and in the most general case we have real values in the segment [0, 1].

Now I'd like to present the simplest metric, which is actually used in practice: the X.509 one. It rests on a certain assumption: in the real world there are root certificate authorities, as in a hierarchical structure, which sign some certificates. Here we merge all these root certificate authorities into one root vertex A and assign ultimate trust to that vertex. This metric basically says that if there is a certification path from the root user to X — the vertex we want to validate — with all the vertices along the path trusted, then X is valid. The simplest possible rule.

Then we create a set of axioms — or rather, we are not so much creating them as disaggregating that definition into simple rules. They are quite obvious, nothing really deep here, but they still make sense. First, the root vertex is always valid: A is always valid for herself. Second, if there is nobody in the certification graph who signed X, then X should not be valid, which is also sensible. Third, if some Y signed X, but Y is either not valid or not trusted, then the validity of X shouldn't change, which is quite natural as well, because we don't want these
kinds of vertices — not valid or not trusted — to confer validity on the certificates they sign. And this is the most powerful axiom, the one about the circumstances in which validity should actually change: if we add an edge to X from a Y that is both valid and trusted, then X becomes valid. Pretty obviously, these four axioms uniquely define that simple trust metric.

Further, we want to generalize this class of metrics, since one of the weaknesses of this metric is the following: if an attacker manages to capture one of the intermediate vertices, he can fool A into validating forged certificates. So now we require a metric to validate X only if at least K trusted, valid introducers — vertices that sign X — exist. We modify the definition accordingly: we leave A the authority to sign X herself, thus validating it, but for all other vertices we need at least K such introducers, each reachable from A by a certification path with all the intermediate vertices valid and trusted. A pretty simple generalization of the trivial case.

In this setting we may think of two variants of the set of certification paths. Both of them successfully validate X, and in both all the vertices are trusted, but if you think about the difference between these two graphs in terms of attack resistance, the left one might be a little better. Here, these two vertices are in the same position in both graphs, so if an attacker manages to capture both of them, he may still obtain validation for a forged identity; but in one case he has to capture four vertices, versus two in the other. These are just basic examples of what may happen here.

In terms of axioms, we leave three axioms the same, and we add another axiom, ultimate trust, which is connected to the first one: it says that if the root vertex signs something herself, it becomes valid. And we introduce the parameter K into the impact axiom, the most significant one. Okay, further, we may want to generalize this in terms of the size of the
codomain of validity and trust. So we have OpenPGP, which incorporates three possible values. I won't present the definition and axioms for this, since they are pretty much the same; what we basically do is add that third level of validity and trust into the impact axiom, which makes it a little more awkward, but we still need to incorporate it. So we arrive at a broader class of metrics, which we call hierarchical, obtained by adding these extra layers of validity and trust into the impact axiom.

Why are we actually interested in this pretty narrow, stand-alone class of metrics, when an infinite number of other metrics may exist? First, because these are the simplest ones, and to the best of our knowledge there are no other works giving an axiomatic treatment of these types of objects, so we started with the simplest case. Second, two of them are actually used in practice — the simple one and OpenPGP — and they are hierarchically related.

All right, so this was the descriptive approach, where we provided a pretty obvious but still systematic foundation for this class of metrics. Now we want to look at a couple of properties derived from classic social choice theory. Independence of irrelevant alternatives turns into independence of irrelevant vertices and edges, which basically says that if we remove any outgoing edge of X, or any other disconnected vertex Y, this should not change the validity of X — which is pretty natural.

The second property is incentive compatibility. Here we treat the act of signing someone else's certificate — creating an edge — as an act of voting, if you like. This is pretty similar to the previous property, but in addition we state that X can perform a certain set of actions without bearing any costs: creating a forged identity that nobody signs costs nothing, and creating edges from himself and from these forged identities costs nothing. So we may want our metric to be immune to this
type of trivial attack. The property states that if X performs any of these actions — where the goal of X is to improve, to increase, his validity for A — none of these actions should change it.

Now, directions of the research. First, model possible attacks and investigate the attack resistance of metrics. There are other, more theoretical metrics, largely residing in the continuous class of metrics; we may want to study them from an axiomatic perspective as well. Then, study the class of metrics which are incentive compatible and have the independence of irrelevant vertices and edges property — this actually seems to be quite a broad class, but if we think of metrics derived from something like PageRank, and if we consider cardinal values rather than ordinal ones, it may not be the case; it hasn't been proved — I don't know, maybe true, maybe not. Some complexity issues arise here as well: if we impose a limitation on the length of the certification paths, some problems become NP-complete.

And on the last slide I would like to present another perspective from which we can study this object. If we think about the evolution of a PKI and the behaviour of users — how they sign keys — we may state a problem which pretty much resembles the thing Matthew was talking about in his talk. The premise consists of two items. First, when a user enters the PKI, he follows certain goals: to be able to validate some number of certificates, which he wants to maximize, or to maximize the number of users that can validate his certificate — this corresponds to outdegree and indegree. But to properly maintain the security properties of the system, he needs to actually do some work to validate users when he signs them; he needs to act as a certificate authority, otherwise he will validate forged certificates. Given all this, we have the structure of the certification network, a specific trust metric which is used by all the users, and the utility
function of the user — for instance, the number of users that will be able to validate his certificate once he's in that PKI — and a restricted budget: he can sign up to n certificates. So which certificates should he sign to maximize that? This actually resembles the problem of identifying the most influential vertices: somehow find and rank users by their influence, and then choose the subset which maximizes this utility. As Matthew said, this has some tricky complexity issues as well. So that's another direction of the research, one which hasn't been touched by me yet at all. That's all, thank you.

So, if we're talking about the simple metric, it just tries to find a path and checks it — at least one path — so it's pretty easy. For other metrics, which may require at least K independent certification paths, the problem of checking whether a set of independent paths of limited length is maximum is proven to be NP-complete. If we remove the limitation on the length, then using flow techniques we find all the paths to the target vertex just by crawling the graph.
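That closing remark about flow techniques can be sketched as follows: by Menger's theorem, the number of vertex-disjoint certification paths from the root to a target equals the max-flow after splitting each intermediate vertex into an "in" and an "out" copy joined by a capacity-one edge. This is my own illustrative implementation (plain BFS augmenting paths), not code from the talk.

```python
from collections import deque

def disjoint_paths(edges, src, dst):
    """Count vertex-disjoint paths src -> dst via unit-capacity max-flow."""
    cap = {}                                     # residual capacity graph

    def add(u, v, c):
        cap.setdefault(u, {})[v] = cap.get(u, {}).get(v, 0) + c
        cap.setdefault(v, {}).setdefault(u, 0)   # residual back-edge

    verts = {src, dst} | {w for e in edges for w in e}
    big = len(edges) + 1                         # effectively unbounded
    for v in verts:
        # split v: all flow through v crosses (v,'in') -> (v,'out');
        # capacity 1 enforces vertex-disjointness, src/dst stay uncapped
        add((v, "in"), (v, "out"), big if v in (src, dst) else 1)
    for a, b in edges:
        add((a, "out"), (b, "in"), 1)            # one signature, one unit edge

    s, t, flow = (src, "in"), (dst, "out"), 0
    while True:
        parent, queue = {s: None}, deque([s])    # BFS for an augmenting path
        while queue and t not in parent:
            u = queue.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow                          # no augmenting path remains
        v = t
        while parent[v] is not None:             # push one unit along the path
            u = parent[v]
            cap[u][v] -= 1
            cap[v][u] += 1
            v = u
        flow += 1

# A->B->X and A->C->X are vertex-disjoint; D->X adds nothing (D unreachable).
edges = {("A", "B"), ("B", "X"), ("A", "C"), ("C", "X"), ("D", "X")}
assert disjoint_paths(edges, "A", "X") == 2
```

With the length limit dropped, this is exactly the polynomial-time crawl mentioned above; re-imposing a path-length bound is what pushes the problem into NP-completeness.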