Backup recording is on. All right, yeah, I'll be taking this one, Sean, but thank you for getting us going. Welcome to the Identity Special Interest Group on August 10th. I'll go ahead and share my screen so we can follow along here. Can you all see this? Yep, looks great. Perfect. So today we have two main things on the agenda: some working group status updates, and then a presentation from Nick Steele on credential migration for wallets and credential providers. Thank you for joining us, Nick, and looking forward to your presentation in just a little bit. Of course. Okay, let's jump right in with some announcements. I believe there's a Hyperledger In-depth webinar with Infosys on August 24, so go ahead and register here if that's something that interests you. And then, I can jump in on these next bullets: the Q3 editorial campaign is Hyperledger identity, so if you have news, please let us know; we'd love to signal boost it, PR at hyperledger.org. Next one: Animo, who are fantastic contributors to Hyperledger Aries and maintainers of Aries Framework JavaScript, have recently announced an effort to make Aries Framework JavaScript a global framework; there's more at that link. Basically, they're going to be raising money from funders and finding teams who want to build out required infrastructure for Aries, and that's really exciting and something we love to see. Lissi recently announced the messaging feature for their agent and wallet; you can find out more there. The Lissi team is writing a blog post that they'll post to the Hyperledger blog shortly. And then lastly, the Hyperledger Indy community is going to hold the second Hyperledger Indy summit on September 7, 2023. The first one, honestly, was the best four hours I've spent on Indy since I left Evernym years ago.
The event's going to gather the entire community together to talk about the current state of Indy, the future state of Indy, and what needs to get done. This is being driven by the maintainers and the contributors; it's literally not a Hyperledger event, but we think it's super important, and we welcome everybody from this call to join. Back to you, Tim. Awesome, thank you, Sean. We'll go ahead and jump into some working group updates, then. It looks like the Hyperledger Indy Contributors working group met on the 1st of August; was anyone able to attend that session who could give us a summary? Yeah, so on that call I was also going to mention the second Indy ecosystem summit on September 7 that Sean just talked about, so I'll send out the link to register; if anybody on this call would like to attend, I definitely encourage that. We also talked about the did:indy Indy VDR work being merged into main, deprecating the Indy SDK, and then the draft of the Indy quarterly report. So a somewhat short meeting, but we got through some good topics. Absolutely, thanks. The Aries working group met just yesterday, it looks like, on the ninth; was anyone able to attend that session? Yeah, I did attend that session. They spent most of the time, as I remember, talking about the EU: the eIDAS guidelines, I think, were finally published, and they were talking about possible ways to support the protocols and specifications that Europe had chosen, which were far different from what Aries and DIDComm were opinionated about. Okay, sounds good, thank you, Charles. Looks like the Bifold user group has not met super recently. The Aries Cloud Agent Python user group met on the eighth; was anyone able to attend that session? All right, looks like they were discussing AnonCreds Rust and ACA-Py. The Aries Cloud Agent Python maintainers met on the first; was anyone able to attend that session?
All right, looks like they are getting some kind of PR together and then had a discussion. Aries Framework JavaScript met on the third; was anyone able to attend that meeting? Okay, looks like they were mainly discussing unqualified DID migration and did:peer:3 support, some of those things. AnonCreds met on the 31st of last month; was anyone able to attend that AnonCreds specification working group meeting? Okay, looks like they were discussing CL signatures. Trust over IP: the Governance Stack working group met on the 27th of July; was anyone able to attend that ToIP meeting? All right, looks like they had a presentation from Michael Parisi, and if that sounds interesting to you, you can find it at this link. The ToIP Concepts and Terminology group met on the 31st of July; did anyone make that session? Okay, they were looking at TEv2, amongst other things. DIDComm: I could not find notes, but it looks like the DIF DIDComm user group met on the seventh; was anyone able to attend that session? Okay, they've been looking at how to better market DIDComm recently, and it looks like they're working on some efforts to put a book together. I don't think they've met since then, I believe. Unless anyone else has any other updates, that brings us to the end of our working group update section. So I'll give it just a minute here, and then I believe you can take it away, Nick, if you'd like. Cool, thanks for having me again. I'm going to turn my camera on and share my slides in one second. I'll have these slides available afterwards and share them with the rest of the group. So I'm going to go ahead and cut over now. All right, can everyone see the deck? Yep. Cool, right on. Today I'm going to talk about credential migration, which is still pretty early days. I'm going to turn my video on so you can see my face. Hello.
Credential migration is some early work, and it's actually a bit of a misnomer; we've started to call it credential import/export, or now I guess it's being abbreviated to "crimp" somewhere, or "shrimp," but if you have any better abbreviations, let me know. So import/export is a better term for this, and we'll discuss why as we go through. But to give you some context for why we're talking about this, why this topic is being developed, and why I'm talking about it specifically, let me do the first thing first and tell you about myself. So hi, I'm Nick Steele. I am a security person. I've done a myriad of things, starting mostly on the security R&D side almost eight or nine years ago with Duo Labs. Then Duo got acquired by Cisco, and during this time I spent a lot of time working with the FIDO Alliance and W3C, mostly on the WebAuthn standard, which I think folks are probably familiar with at this point. It deals with essentially passwordless credentials and passkeys, and we'll talk about that term. It started as a successor to U2F and UAF, really merging a lot of the benefits of MFA, U2F, and these hardware authenticators, and basically applying those benefits to single-factor, first-factor login, which is not really the right term for this, but we want to just have a single credential that we use to log in, and WebAuthn was really hoping to make that possible. So I started mostly on the R&D and security engineering side, and recently I've been a staff technical product manager over at 1Password.
It's kind of funny how I ended up at 1Password. I found this great slide that I made, I think in 2016 when I was still at Duo, where I basically said 1Password was going to get blown away by WebAuthn, along with LastPass and all these others. There's this great stat from Dashlane saying everyone has all these accounts, and it's actually a great stat for why we need this credential import/export format: by 2020, users were going to have over 207 accounts, with credentials for each one of them. My thinking at the time was, oh, if we have WebAuthn we won't need Dashlane, 1Password, and all these other companies. But actually, it's at least become clear to me that 1Password, Dashlane, Bitwarden, NordPass, Passbolt, and a lot of other companies out there are really well positioned to manage user credentials, especially for handling WebAuthn and acting as a WebAuthn authenticator. 1Password and a lot of these other companies, these third-party credential providers as they're called, are, I think, in a more enfranchising position in the industry. One of the big problems with WebAuthn credentials and other types of credentials right now, like passkeys, is that there's very easy user lock-in. Especially if users are using Apple and Google platforms to manage their credentials, they tend to be pretty restricted to those ecosystems and to the systems that can use them. So if I have an Apple iCloud Keychain account with all my passwords and my passkeys and any other credentials, SSL certificates, it's really locked into that ecosystem.
But these third-party providers like Dashlane and Bitwarden are able to allow access to these credentials across different platforms and through different modalities. It really opens the user up to have more freedom and sovereignty over their credentials and their identities, which I think is going to become more important here, and which I'm sure you folks are well versed in. So I was wrong back in 2017: these types of authenticators are not going away. They're probably going to become more prevalent as folks have more and more credentials, as Dashlane said back in 2015. But the big reason we're talking about this, and by "we" I mean Dashlane, 1Password, and all these other credential providers, is that passkeys have been a huge focus for us, and "passkey" has a lot of different definitions. It's more of a marketing term, and I think this is where a lot of the confusion lies: there are really two big public definitions that float around. The marketing one is that passkeys are a replacement for passwords that provide faster, easier, more secure sign-ins. But the real technical side, I'd say, is that a passkey is synonymous with the idea of a discoverable credential: a credential that is managed by a client and authenticator and then revealed to the relying party. So you don't need things like usernames and additional referential information to be stored alongside your passwords or these types of credentials by the RP. I can go to a site and have my passkey be available through my authenticator and client without the RP having to do a lot of additional steps to look up keys and whatnot.
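The discoverable-credential behavior Nick describes maps to a single field in the WebAuthn creation options: `residentKey`. As a quick illustration, here is a plain Python dict mirroring the shape of `PublicKeyCredentialCreationOptions`; all the concrete values (RP id, user handle, challenge) are made-up placeholders, not anything from the talk.

```python
# Illustrative sketch of WebAuthn PublicKeyCredentialCreationOptions,
# written as a plain Python dict. All values are placeholder assumptions.
creation_options = {
    "rp": {"id": "example.com", "name": "Example"},
    "user": {"id": b"user-handle", "name": "alice", "displayName": "Alice"},
    "challenge": b"random-server-challenge",   # would be random bytes in practice
    "pubKeyCredParams": [{"type": "public-key", "alg": -7}],  # -7 = ES256
    "authenticatorSelection": {
        # "required" asks the authenticator to store the whole credential,
        # so the RP never has to look up key handles by username: the
        # client/authenticator can surface ("discover") it on its own.
        "residentKey": "required",
        "userVerification": "preferred",
    },
}
assert creation_options["authenticatorSelection"]["residentKey"] == "required"
```

With `residentKey` set to `"required"`, the resulting credential is what the marketing term "passkey" usually refers to.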
They are supposed to be a replacement for passwords. A lot of folks are using them for MFA now; in my mind that's fine, and you'll get spicier takes from other people. But they should be a replacement for passwords, and they are supposed to provide a faster and easier way to sign in, with the benefit of being more secure for both the user and the relying party, the website using them. A lot of the conversation around passkeys, because FIDO is driving it with marketing material along with Apple, Google, and Microsoft, started in FIDO, and FIDO has a triannual plenary. The first one this year was in Dublin, and that's where myself and Rew Islam from Dashlane presented our view of what migration, now credential import/export, should look like. We got some pretty good feedback, and I'm here to tell you where we're currently at and a bit of where we were coming from. So at the Dublin plenary we presented some work that came out of about a month of talking back and forth. Rew and I are buds, and we were like, hey, it would be cool if we had a better, more secure way to migrate credentials between our two applications. Dashlane had an internal hackathon and put together a demo I'll show you; we had a hackathon too, so the stars really aligned and we were able to put together a really cool demo. That got us on the path of saying, hey, we should have a more standard way to deal with this, because both we and many other credential providers, including Apple and Google, struggle with this immensely.
It's really hard to transport credentials across these platform boundaries, so we wanted to present this at the FIDO plenary. It says BoF because it was a FIDO birds-of-a-feather session; we wanted to present this work to try to gain some consensus and thought around how we could do this with other folks. So the current state of import and export, or migration, for most companies is to output credentials into a CSV. If you've ever moved from one credential provider to another, you'll have seen that it outputs a CSV, and as a security person, this is probably very concerning. The CSV is also non-standard, which makes it worse, and importing the CSV is a bad experience for the user, because sometimes credentials are actually lost in transit; they just can't be parsed properly by the other side. We need pretty specific unmarshaling code for the different types of CSV we see: we need to know if it's coming from LastPass or Dashlane or Bitwarden or one of these other companies, because each one's CSV is going to be a different format. Google and Apple, same deal: their CSVs are different formats, and if we don't parse them correctly, we may just drop the credential. So unless the user tells us where the credentials came from, we don't know, and in a lot of cases we also don't know if they've been properly imported or exported; we can just try to make a best guess. It adds a lot of development time, and everyone really struggles with this side of it. So we proposed a method where the credentials don't go into the CSV format; in fact, they stay encrypted through the whole import/export process. We want to provide a normative data format for these credentials as they go through export. We also get attribution, which is really great.
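The lossy-CSV problem described above is easy to demonstrate in a few lines. In this toy sketch, the same credential exported by two hypothetical providers uses different header names, so an importer that only knows one scheme silently returns empty fields; all header names and data here are invented for illustration, not taken from any real provider's format.

```python
# Toy illustration of why non-standard CSV exports lose data: the same
# credential uses different headers per (hypothetical) provider, so a
# naive importer built for one scheme silently drops fields.
import csv
import io

provider_a = "name,url,username,password\nGitHub,https://github.com,alice,hunter2\n"
provider_b = "Title,Login,Password,URL,OTPAuth\nGitHub,alice,hunter2,https://github.com,otpauth://...\n"

def naive_import(text, field_map):
    # Map each internal field to the header name the importer expects.
    rows = csv.DictReader(io.StringIO(text))
    return [{k: row.get(v) for k, v in field_map.items()} for row in rows]

# An importer built for provider A's headers...
a_map = {"site": "name", "user": "username", "secret": "password"}
ok = naive_import(provider_a, a_map)
broken = naive_import(provider_b, a_map)  # wrong headers: fields come back None

assert ok[0]["user"] == "alice"
assert broken[0]["user"] is None          # credential silently mangled
```

This is why the proposal keeps credentials encrypted end to end and defines one normative data format instead of per-provider unmarshaling code.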
We want to be able to say with confidence where credentials come from, and this is especially important in the enterprise case. We should be able to say, you know, these credentials are coming from this provider, do you want to proceed? And we can get the user to actually give us some intent signaling here; they could potentially use a biometric, or at least have some say in the import/export process, as opposed to just handling a CSV that may be potentially malicious in some cases. This is really important for enterprises, because enterprise import/export is a really onerous task and kind of impossible in a lot of cases. The other thing we don't really have listed here that we're trying to accomplish with the standard is to do it in a format that works for one or more credentials, with no guarantees about the network over which they're being moved. It could be completely local, it could be semi-local, or there could be a segregated network component: say I'm moving credentials from my laptop, which is in my office on a WLAN; I put my migration file, and I'll show you what that could look like, on a USB stick, and then I take it to my data center. Maybe my data center is on a completely different network, and I'm actually unable to tell whether the machines are online or offline, but we want to be able to support these scenarios.
So this was an early example, and there are a lot of different modalities we're thinking about right now, but essentially we want something like this in some cases. After talking with Apple and Google, I think they're more interested in using the OS, which is fine, but this would be a way for us to do it without any additional steps. What you can see here is that on the import and export apps, we create this box and drag the box over; the box is then filled with credentials and sent back to the importer, the new application we're going to use, which decrypts and stores the new credentials. So this is one modality, and I'll step you through the crypto of what's happening here, then some new modalities we're thinking about, and where we're at in the spec. Let's go to the next slide. So really, what's happening here, and we'll step through it bit by bit, if we look at this top-right quadrant: we have a public key that the importing app supplies to the exporting app. When we create that empty box and drag it in, the box is essentially just a public key. The exporting app then uses that public key and its own private key to derive a symmetric key, which we use to encrypt the data. We encrypt with this key and bundle the encrypted data with some metadata that includes the exporting app's public key. When we pass that back, the original importing app uses its private key and the exporting app's public key to derive the symmetric key again, decrypts the data, and stores the credential data using that credential data format that we're hopefully going to develop here as well.
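As a concrete sketch of the Diffie-Hellman flow just described, the following Python (using the `cryptography` package) walks through the same steps: importer shares a public key, exporter derives a symmetric key and encrypts, importer re-derives and decrypts. The field names, the HKDF `info` label, and the X25519/AES-GCM choices are illustrative assumptions, not anything from a published draft.

```python
# Sketch of the DH-based import/export flow described above. Primitive
# choices (X25519, HKDF-SHA256, AES-GCM) and labels are assumptions.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_key(private_key, peer_public_key):
    # Both sides derive the same symmetric key from the DH shared secret.
    shared = private_key.exchange(peer_public_key)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"credential-export-v0").derive(shared)

# 1. The importing app creates a key pair; its public key is the "empty box".
importer_priv = X25519PrivateKey.generate()
importer_pub = importer_priv.public_key()

# 2. The exporting app derives a symmetric key and encrypts the credentials,
#    bundling the ciphertext with metadata that includes its own public key.
exporter_priv = X25519PrivateKey.generate()
key = derive_key(exporter_priv, importer_pub)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b'{"credentials": []}', None)
bundle = {"ciphertext": ciphertext, "nonce": nonce,
          "exporter_public_key": exporter_priv.public_key()}

# 3. The importing app re-derives the same key and decrypts.
key2 = derive_key(importer_priv, bundle["exporter_public_key"])
plaintext = AESGCM(key2).decrypt(bundle["nonce"], bundle["ciphertext"], None)
assert plaintext == b'{"credentials": []}'
```

Because AES-GCM is authenticated encryption, any tampering with the bundle in transit makes the final decrypt raise rather than yield garbage.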
So that was the original thought, and it's still a good way of doing it. But one could also do this using Hybrid Public Key Encryption (HPKE), which is a newer standard developed by an old colleague of mine from Cisco, Richard Barnes, along with some other folks; it's currently an informational RFC in the IETF. We could potentially do key encapsulation here, where we encapsulate a key to the recipient's public key, and they decapsulate it and are able to decrypt with the encapsulated key. So this is also a method we really do want to account for, and one I've been talking with a lot more folks about recently, and we'll talk about where things are at. I don't want this to be too prescriptive, but we should be using modern cryptographic methods to make this easier. Another thing we want to account for, in the case of enterprise migration, or even some consumer migration steps, is the potential presence of an authorizing party. This can work with either the HPKE or the Diffie-Hellman setup: an authorizing party could essentially act as a credential authority throughout most of this. So in the case where I'm at Example Corp and we're migrating from one password provider to another, we could provide our own sort of authority to handle the public and private keys necessary for both of these applications to migrate between each other, and also to get approval and authorization from the enterprise to do so. So those are the three things we were really looking at in terms of architecture.
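To make the key-encapsulation idea concrete, here is a minimal DHKEM-style sketch in the spirit of HPKE (RFC 9180). This is emphatically not a conformant HPKE implementation (a real one should use a vetted library), but it shows the shape of encap/decap: the sender's ephemeral public key is the "encapsulated key" that travels alongside the ciphertext. All labels are assumptions.

```python
# Minimal key-encapsulation sketch in the spirit of HPKE's DHKEM.
# NOT conformant RFC 9180; for illustration of the encap/decap shape only.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey, X25519PublicKey)
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def _kdf(shared: bytes) -> bytes:
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"export-kem-v0").derive(shared)

def encap(recipient_pub: X25519PublicKey):
    # Sender (exporter) makes an ephemeral key pair; the ephemeral public
    # key is sent to the recipient as the encapsulated key ("enc").
    eph = X25519PrivateKey.generate()
    key = _kdf(eph.exchange(recipient_pub))
    return key, eph.public_key()

def decap(enc: X25519PublicKey, recipient_priv: X25519PrivateKey) -> bytes:
    # Recipient (importer) recovers the same symmetric key from "enc".
    return _kdf(recipient_priv.exchange(enc))

recipient = X25519PrivateKey.generate()      # the importing app's key pair
key, enc = encap(recipient.public_key())     # exporter encapsulates a key
assert decap(enc, recipient) == key          # importer recovers it
```

One appeal of the HPKE framing is that the same encapsulated key can drive encryption of a whole stream of credential records, which fits the one-to-many scenarios mentioned later.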
The important overlap here is that we're using modern public-key encryption schemes: things like Diffie-Hellman, which is tried and true, and these hybrid public key encryption schemes, which could potentially be used in one-to-many scenarios, something we might also want to account for. And we're able to provide user intent and provider attribution through these public keys. The keys could potentially be provided through an X.509 certificate, so in the case where we had that authorizing party, they could either issue the certificates or verify the certificates being used. That gives us better ideas around attribution and identity, not only of the user but of the application. What we don't really describe here is the migration format itself, but that's in really early development and is going to help provide things like data integrity, where we can hopefully know for sure that certain credentials have been unpacked, or whether they can be unpacked, know where they exist and can be used, or have ways to remediate their usage. This work should also be the foundation for further work with regard to authenticators, especially third-party authenticators like ourselves at 1Password, where we can deal with things like referential integrity and credentialing: we could either import a credential, or, if we can't import it, reference a credential from another wallet and potentially ask for signatures against it. That would probably look similar to an OAuth flow, but that's all pretty far out; by doing a lot of this work now, we can lay a path for it to be successful later on.
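The attribution point above can be sketched as a signature over the export bundle: the exporter signs with a long-lived provider key, which in the enterprise case could chain to an X.509 certificate issued by the authorizing party. Everything here (the Ed25519 choice, the bundle layout) is an illustrative assumption rather than the draft's actual design.

```python
# Illustrative sketch of provider attribution: the exporter signs the
# encrypted bundle with a provider signing key, and the importer verifies
# before accepting the import. Key choice and bundle layout are assumptions.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

provider_key = Ed25519PrivateKey.generate()   # the exporting provider's key
bundle = b"nonce || ciphertext || exporter-public-key"  # stand-in payload

signature = provider_key.sign(bundle)         # attached as export metadata

# The importer verifies against the provider's (certified) public key, so it
# can say with confidence which provider the credentials came from.
try:
    provider_key.public_key().verify(signature, bundle)
    attributed = True
except InvalidSignature:
    attributed = False
assert attributed
```

With an authorizing party in the loop, the importer would verify the certificate chain on the provider key as well, tying attribution to an enterprise-controlled root.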
So outside of that referential foundation, the real reason this work is important, now and in the long term, is that it raises the bar dramatically for users' security during migration and import/export of credentials. In a lot of cases, when you import or export credentials, you're dealing with CSVs, and if you're not, you have to have a trusted middle party to migrate the credentials for you. In the case where you're online and you say, hey Dashlane, could you share credentials with X, or, 1Password, could you move your credentials over to Y, you usually have to have some other root of trust in there to handle that interim movement. This work would allow the user not only to apply their own threat model and account for their own security, but also, in the case of bulk movement of credentials, you don't have the CSV anymore, and that's going to be a huge lift. Multiple authenticators and credentials can lead to bad UX for users, but this has been a heavy ask, if not quite a requirement, since day one of WebAuthn: our best recourse against things like account loss and the need for account recovery was to say you should enroll many credentials, enroll as many WebAuthn credentials with a site as you can. But that also requires many authenticators a lot of the time, because you don't want all your credentials in the same bucket. Now, passkeys help with this because they are syncable; they can cross many devices, but a lot of the time, as I mentioned before, those credentials are locked into certain what are called identity fabrics.
They're locked into my Apple iCloud Keychain account, or my Microsoft Windows Hello account, or my Google federated account, and there's no good crossover. This work can help facilitate that crossover between these different fabrics and also allow for better use of multiple authenticators, so we can have multiple credentials, which you should, and I think it's actually going to be easier for the user. Having a flow for handling import/export could lead to situations where, if I create a credential with one authenticator, I don't necessarily need to go create a bunch of other credentials with my different authenticators; I can potentially import that same credential into different authenticators and have it available across different fabrics and different devices, if that makes sense. So this format for the transmission of credentials is also going to be super important, especially as modern identity standards rely more on user-bound and device-bound credentials. Being able to move credentials that are associated with, say, mDLs or verifiable credentials is going to happen, and that transmission is going to need to happen in a secure manner. So defining this work now, for those other standards and for ourselves over in the W3C WebAuthn working group and the FIDO Alliance, is going to be important. It should provide a more seamless transition in the long term, and I think that's pretty far out, between credentials as they're used by users, devices, and workloads, units of computation. As user and computer credentials become more similar, there's going to be more crossover, and having not only a format for these credentials but a way to securely import and export them is going to be really necessary.
Generally, there's been a lot of support so far for all this work, especially from Apple and Google, but also from the other credential providers I mentioned before. We wrote this with Dashlane, and we've been working a lot with Bitwarden, NordPass, Keeper, Passbolt, and other providers to really make this work; everyone's pretty jazzed about it, I'd say. The current roadmap for putting this together: in October, FIDO has another plenary, and we hope to talk about it then, and then in November the IETF is meeting, IETF 118 I believe, in Prague. If this work doesn't get picked up in FIDO, it's going to get picked up in the IETF; it's currently being drafted by myself as an Internet-Draft, and there's discussion of it moving into FIDO. I am not picky: as long as this continues development, it can live on the moon for all I care. We're also working with a few other providers, who I guess I can't mention by name, to align on a PoC between a few different folks. Getting early feedback from folks here is also appreciated, but definitely not expected; we're still at a very early draft, and the document is pretty non-normative at this point, but if anyone feels compelled to do so, feel free to reach out. You can reach me at nick.steele at AgileBits. You can also check out the GitHub repos for cred-migration, or I guess now crimp and crimp-data, over in our org, which is the credential provider special interest group. We're having a lot of these discussions within the credential provider SIG, which meets every four weeks now in the FIDO Alliance and which I co-chair with Dashlane and Apple. If you're also a credential provider, you're free to join.
If this develops, I imagine there'll be more public forums for discussion of it, but if not, feel free to reach out and I can invite you to all the channels and Slacks and Discourse. So that's kind of where we're at, and I'm happy to take any questions, slash comments, on the topic now. One is: we have had reports of some password keepers getting hacked, I think very recently, where they lost... I mean, it's basically a concentration of all your credentials in one place for an individual, and it's almost like a honeypot, because it has so much information stored away. Was it LastPass or something like that that got hacked? And they lost a lot of... So these are some of the dangers with this sort of thing. Not correct? Did that not happen, or am I imagining things? So, LastPass did experience a breach; you can read a lot of articles about how it happened, and this is not the first time they've experienced a breach. What I'll say is that most credential providers have different security models and different threat models, and credential managers as a whole are a better way to create and manage credentials, like online credentials, for the user. They create credentials with higher randomness that are harder to crack, and they change your threat boundary. In the case of generating passkeys: if I generate a passkey and store it on an authenticator, or store it in a credential manager, an attacker would now really need access to my physical device in order to get access to that passkey. Now, they could potentially, with a fair amount of effort...
...get the keys they need to crack into a credential manager account, but it definitely raises the amount of resources needed to do so, and each credential manager is a bit different. 1Password and Dashlane, I believe a bit more than others, have published white papers that explain how our cryptography works and how our applications work; we're pretty open about it. I can't really go into depth on why I think LastPass experienced a vulnerability or an issue here, but I do think that, on the whole, using a credential manager is better than not using one at all. Yes, I mean, that's evident, but like I said, if it leads to a concentration of things in one place, attackers gain more by attacking something like that. But anyway, let's leave that aside, because that is a specific breach for a specific password manager. Now let's move on to your suggestions. The most important one, which you talked about here, is basically the protocol itself: the key exchange and the encryption and so on, right? That's what you focused on in this presentation: the standard Diffie-Hellman, and then of course the hybrid key thing you talked about. It looks like somebody else has a question, but I would like to continue this exchange about the whole process. He seems to have asked a question, so I'll withdraw for now, but I'll come back once you've answered his question. Well, I was actually moving to your question. With this key exchange, the way it's written now, do you envision this transfer working between, say, a credential provider and self-custody of passkeys? Yes, 100%. This shouldn't... if you are able to...
Yeah, we want to standardize it. If you can generate your own export request, you can have your own passkey provider, you can have your own credential manager, you can make one from scratch. What we want to standardize is the ability for you to say, hey, I just wrote my own credential manager that I trust, and I want to move from, say, 1Password to that new manager. If you can create the export request and parse the import response, then you're good. We want to make this accessible for everyone and really enfranchise all credential providers and managers. Okay, so to the earlier question: it would make it easier to go to a hosted provider or to my own local whatever? Yeah, and in the case of local providers, I believe Passbolt is kind of a more local provider. We envision a more equitable way for these transitions to work, where we don't need to support specific managers and providers; if you want to do import/export, we're all just going to start talking the same language. Thanks. Continuing from what we just talked about: if it's a local provider, basically there's going to be some kind of a vault where that data is stored, and if I can put that vault onto a commonly accessible backplane, then I can use it from another device. Is that correct, even if it's locally hosted, so to say, meaning it's not in a central location like with a hosted provider? So, I will say that this is out of scope for the discussion of credential import/export. For this protocol, we make no assumptions about the credential manager or the security properties it has.
As I said earlier, the credential managers and credential providers that are commonly used all have slightly different security properties and threat models. But they all generally publish a white paper or some public documentation about how they work and how they store credentials. That would probably be the best way to figure out how this works behind the scenes; I encourage you to look at the 1Password white paper that we publish online, and I'm happy to provide that link later. But I'm not making any assumptions, and the storage of the vault is out of scope when I'm talking about credential transport.

Sounds reasonable. Of course, reading a white paper like that and figuring out the security properties of the protocol — especially whether the actual implementation follows that white paper — is a fairly sophisticated ask of any user. It's not going to be a common user who does this; it's going to be security researchers or cryptanalysts or somebody else who does those kinds of things. But yes, in the sense that the method itself is open like that, that is a very good thing. I'm sorry, am I not being heard properly? Is that why you're leaning forward? No, no, I agree. It's just that a similar argument can be made for taking the COVID-19 vaccine. Okay, let's not go to the vaccine thing. No, no, it's the same case. A more friendly example would be any other system you use, like your Google Chrome browser. You have to have some amount of trust in the...
Yeah, obviously, even the working of a simple thing like a toilet in your own house requires some notion of the Bernoulli principle to know why it works. Okay, so that's... But the argument being made is, I'm not saying you have to trust the science, you can read the... No, what I'm saying is, when I put forth an argument like "I have a white paper, you can read it and see," that's what prompted this counter. Yeah, for sure — not a generic statement about the technical imperviousness of any solution to normal folks.

I agree. In most cases there's going to be some amount of trust necessary that these systems operate the way they say they do. But what we really want as the roots and anchors of trust in this protocol is that when the credentials are in transit, they are encrypted in a format that is safe for storage, and also that there won't be a possibility of, say, downgrading the encryption of those keys. To go back to the example — I'll share my screen again — an importing app here could potentially provide a type of public key that results in the derivation of a weak symmetric key. If the data is encrypted under that, an attacker could potentially exploit that weakness and decrypt the data without necessarily needing the private key outright, and that could be dangerous. But generally, trusting certain curves used for Diffie-Hellman or for HPKE — that's where the roots of trust lie: are we trusting that the cryptography and the curves being used are sound?
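The downgrade concern described above is usually countered by validating the offered parameters before any key derivation happens. A minimal sketch, assuming a hypothetical allowlist of cipher suites and a stand-in structural key check (real code would do full point validation for the chosen curve):

```python
# Sketch: reject weak or unexpected key-agreement parameters up front,
# so an importer cannot force derivation of a weak symmetric key.
# Suite names and field names are illustrative assumptions.
ALLOWED_SUITES = {"X25519-HKDF-SHA256", "P-256-HKDF-SHA256"}

def validate_import_offer(offer):
    suite = offer.get("suite")
    if suite not in ALLOWED_SUITES:
        raise ValueError(f"refusing downgraded suite: {suite!r}")
    pk = offer.get("importer_pk", b"")
    # Trivial length check stands in for real public-key validation
    # (on-curve check, non-identity point, canonical encoding).
    if len(pk) != 32:
        raise ValueError("public key has unexpected length")
    return True

assert validate_import_offer(
    {"suite": "X25519-HKDF-SHA256", "importer_pk": b"\x01" * 32})
try:
    validate_import_offer(
        {"suite": "EXPORT-GRADE-DH-512", "importer_pk": b"\x01" * 32})
except ValueError as e:
    print("rejected:", e)
```

Pinning the suite list on both sides is what keeps the trust anchored in the curves themselves rather than in whatever the peer proposes.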
Yeah, that's a basic assumption in every exchange like that, right? You just want the protocol itself to be secure, if it is done properly, quote unquote. That's an assumption most of us are willing to make, because we use TLS without being aware of the minutiae of how it works, and we think that our providers are mostly trustable. Right. Tim Spring has his hand raised. I don't have my hand raised — sorry for the confusion. Oh, maybe I do. Yeah, it's me. I have my hand to blame, because when I put it over your rectangle, it comes up with the raised hand. Anyway, back to it.

Two things. One is that an interoperability area or ecosystem is, of course, a source of vulnerability, even if both the source and the sink systems are pretty secure. So obviously I understand why this needs to be very foolproof, because otherwise it would be the weakest point in the attack surface. The other thing, which you did mention and which I guess we didn't go into, was the format of the actual exchange itself. Are you working on that? Meaning, are there standards for this sort of thing?

Yeah, so there are going to be two separate drafts: one for the protocol itself and one for the data format. There's not a lot of preexisting work that I've found that's really great for transmission of sensitive material. I've been taking a lot of inspiration from Brent Zundel and Oliver — I'm terrible with last names — over at Spruce ID. Yeah.
They've been working on a lot of stuff with regard to JOSE formats, and I think a lot of that really coincides with what we're thinking about now. But, as I said, it's very early days there. I have a meeting starting tomorrow to kick it off in earnest, but I've been mostly focused on the protocol side of the house and getting folks aligned there. Okay, because I think that's obviously another key item, the format. Absolutely.

And I think the format is really going to be important. It's important to have a format that can accomplish not only the movement of credentials, but also the referencing of credentials — pointing to credentials that may not be able to move — and handling credentials that could just be raw data. Maybe it's bytes provided for a PDF, and the PDF is being used for some format; heck, maybe someone has shimmed byte data into a PDF file, which I've seen before. There are a lot of different formats this data can take. Being able to have a standard envelope in which to package those discrete credentials and the data associated with them is really important here. And then also, if I move a passkey that is, say, ECDSA — sorry, elliptic curve P-256 — over to some credential provider that is not capable of using elliptic curve signatures, that other manager needs a way to say, hey, I can't use this. So it can do one of two things: it can go talk to a credential manager that can use it, like the credential manager that just exported it, or it can go to the relying party — the website — and say, hey, I have... Somebody just sent me a lot of balloons. The new version of macOS, Sonoma, added a reaction layer.
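The "standard envelope" idea above — packaging heterogeneous credential items, including ones the receiving manager cannot use — can be sketched like this. The structure and field names are hypothetical; the point is that an unusable passkey becomes an explicit reference back to its exporter rather than a silently broken import.

```python
# Sketch of an envelope wrapping heterogeneous credential items, with a
# triage step for items the importer cannot use. Hypothetical fields.
from dataclasses import dataclass

@dataclass
class EnvelopeItem:
    kind: str        # "passkey", "password", "note", "raw"
    fmt: str         # e.g. "ES256", "utf-8", "application/pdf"
    payload: bytes
    origin: str = "" # exporting manager, kept for later remediation

# Pretend this importer supports only RSA signatures, not elliptic curve.
SUPPORTED_SIGNATURE_FORMATS = {"RS256"}

def triage(item: EnvelopeItem):
    """Decide whether to import an item or keep a reference to its origin."""
    if item.kind == "passkey" and item.fmt not in SUPPORTED_SIGNATURE_FORMATS:
        # Keep a pointer back to the exporter instead of mis-importing.
        return ("reference", item.origin)
    return ("import", None)

decision, ref = triage(EnvelopeItem("passkey", "ES256", b"...", "OldManager"))
assert decision == "reference" and ref == "OldManager"
```

Carrying the `origin` field is what makes the fallback paths in the talk possible: the importer can later route the user back to the old manager or to the relying party.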
So the other thing they can do is go to a website and say, hey, I have this old credential I can't use; can we figure out how to make a new one? In a lot of cases I don't think they'll be able to do that outright without the help of the old authenticator, but this opens up paths for remediation, which is really going to be necessary here. Because, as I mentioned earlier, there are a lot of cases where, if I import from one provider to another, I'll end up with five or six credentials that have been improperly imported — either because they're not in the proper format, or it's a credit card that just doesn't unpack properly, so the other provider thinks it's a note and it gets put in the wrong category. There's lots of that sort of thing.

Yes, I'm very familiar with this phenomenon, having worked around these kinds of problems for a long time. In a very basic sense it's almost like casting, right? You have a type and you want to cast it to some other type, and you either lose information along the way because the representation doesn't support it, or... It's a basic problem in computer programming, I believe. Oh yeah, GPT will solve this, right? We'll just pass the credential through the AI, and maybe it'll inject some hallucinations, and you'll be left with a credential that is a merge of you and somebody else or something.
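One common defensive pattern against the "everything becomes a note" failure mode described above is to route anything that fails to unpack into an explicit needs-review bucket, preserving the raw data, rather than silently casting it to the wrong type. A minimal sketch with hypothetical kinds:

```python
# Sketch: route unrecognized items to a needs-review bucket instead of
# miscategorizing them, so remediation stays possible. Hypothetical kinds.
def unpack(item):
    recognized = {"password", "passkey", "credit_card"}
    buckets = {"imported": [], "needs_review": []}
    if item.get("kind") in recognized and "data" in item:
        buckets["imported"].append(item)
    else:
        # Preserve the raw item and the reason, never guess a category.
        buckets["needs_review"].append(
            {"raw": item, "reason": "unrecognized kind"})
    return buckets

good = unpack({"kind": "credit_card", "data": "4111"})
bad = unpack({"kind": "loyalty_card", "data": "???"})
assert len(good["imported"]) == 1
assert len(bad["needs_review"]) == 1
```

This is the lossless analogue of the "casting" problem raised in the discussion: when the target representation can't hold the source type, keep the original bytes instead of a lossy cast.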
So what I'm saying is, I am pretty sure many of these problems have been looked at in other fields, and I'm basically urging you to take a look at some of those areas. Like, for example, DIDComm — a very simple thing. I mean, it's not simple, but it's communication between two parties through a third medium which is not secure, and they had to figure out a lot of these kinds of problems. So maybe you can be helped by looking at some of these things. I am also the chair of the interoperability working group in the ITU, which has to do with transporting digital assets across different, let's say, ecosystems — like a cross-border currency exchange. Not to belabor the point, but your problem has similarities to other problems, and maybe you can take a look at some of those for help.

Oh yeah. As I said, I've been talking with Brent and Oliver, and I'm definitely interested in DIDComm as well. There's a lot of recent and existing thought in the space that I'm definitely interested in incorporating. Yeah, so I would definitely help if I can. That'd be great. And I think we have reached the end of our time. It was a wonderful talk; thank you for handling this important topic. Thank you for having me, and I'm glad I was able to speak to you and the rest of the Hyperledger folks about it. So thanks for having me. Yeah, thank you, Nick.