Hello and welcome to another round of OAuth Happy Hour. I'm Aaron Parecki, here again with Micah Silverman, back from the break. Welcome back. If you're joining us on Twitch or YouTube today, feel free to drop questions in the chat at any point. We'll happily take questions from all of you who are watching. Otherwise, we'll just be chatting about what's new in the OAuth world, what some of the new developments are, and what's going on there. It should be a lot of fun, so glad you could join us. Yep, good to be back. So, yeah, where should we start? Should we start with some updates from the OAuth world? Yeah, maybe catch us up. What's been happening since the beginning of the year? Yeah, the OAuth group has had a couple of interim meetings since the beginning of the year, or rather, we're about to have a whole series of them, and we did have one last week. These are official IETF meetings for the group, documented on the record, where things get brought up to the group, either current specs that are in progress or new specs being introduced. Rather than try to cram everything into the IETF week, which just happened (IETF 110 took place virtually; it would have been in Prague in person), the OAuth group has decided to meet outside of that week and is running these as weekly meetings for the next six or seven weeks. So this past Monday was about DPoP; that was the topic of that session. We've talked about this spec a few times on here, I think, but it is still in progress, still working through some things, which is how it goes.
So there were some new things to talk about. I thought a new version had been published shortly before, but no, it hadn't been published yet; there were some changes being discussed on the call. Unfortunately I had to leave and couldn't stay through the end of the call on Monday; I had another thing to run off to. But yeah, process-wise, it no longer has an individual's name attached to it. So it's been adopted by the working group, right? Is that the part of the flow we're in at this point? Yeah, exactly. A lot of these will start out as what's called an individual draft, where it just has a person's name in the URL. Basically, anytime you see one of those, take it with a grain of salt, because literally anybody can publish those and they don't actually mean much at all. It's just a way to get a document shared, published, date-stamped on the record officially, all that kind of stuff, but it doesn't have any official standing until it's adopted by a working group. Once it's adopted by a working group, the URL changes to the name of the working group. For example, it's hard to see in this one because the author's name is the same length as the word IETF, but here is the individual draft from Daniel Fett. It went through several revisions as an individual draft, then it was adopted by the working group in April 2020, it was renamed to draft-ietf, and new versions were published there. Yeah, and so, we have talked about this a little bit before, but what I'm curious to know is: are there any reference implementations at this point? If I wanted to go grab Keycloak or something, could I drop in a DPoP implementation to kind of wrap my hands around it? That is a good question.
I'm actually not sure who has a current implementation at this point, but you know, that is a good thing for me to make a note of, because that is exactly the kind of thing I should be putting on oauth.net under the DPoP page, which is here. And you might notice that it is a little bit slim; not much content there. Some of the pages are a lot better. Let's see if this one has one... nope. Some of these pages have a lot more information on them; some of them are stubs. So anyway, yeah, that is good. I tend to understand things better by being able to sift through code and interactions in concert with things like specs. Yeah, definitely. You know, maybe that's something I could work on too as a side project. It's pretty fascinating. I wanted to kind of speak back my understanding of it and have you validate whether I have it right or not. The idea is that in the current OAuth world we have these bearer tokens, and we have this kind of problem where if a bearer token leaks, anybody can use it; whatever permissions that bearer token has, whoever has it can now make use of them. And the resource server doesn't really care. It is just validating the token, and as long as the token gets the thumbs up, it's going to respond. With DPoP you have a new header, the DPoP header, and you have a JWT that's been signed with a private key that's generated by the application, and then the resource server would use the public key to validate that JWT. So the "PoP" is proof of possession. Now, even if that access token is leaked, there's no way for some rando to fake a DPoP header, because they don't have the right key to do it. Am I on track there? Yeah, definitely. That's the general idea.
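To make that concrete, here is a rough sketch of what a DPoP-protected resource request carries. This is a sketch under assumptions: the header names follow the DPoP draft, but the token value and the proof value are placeholders, not a real access token or a real signed JWT.

```python
# Sketch of the two headers a DPoP-aware client sends to a resource server.
# NOTE: "proof_jwt" is a placeholder for a real proof, which would be a JWT
# signed with the application's private key.

def dpop_request_headers(access_token: str, proof_jwt: str) -> dict:
    """Build request headers for a DPoP-bound access token."""
    return {
        # Per the draft, the token uses the "DPoP" scheme rather than "Bearer".
        "Authorization": f"DPoP {access_token}",
        # The proof rides in its own header alongside the token.
        "DPoP": proof_jwt,
    }

headers = dpop_request_headers("2YotnFZFEjr1zCsicMWpAA", "<signed-proof-jwt>")
```

The resource server then has to check both values: that the token itself is valid, and that the proof was signed by the key the token was issued to.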
Most of the time in OAuth, all tokens are considered bearer tokens. Even if it's a JSON Web Token, it's still a bearer token, because holding it is all that's needed to use it. And yeah, it doesn't matter if you've encrypted your JSON Web Token or signed the data or whatever; it's still a bearer token, because if it gets handed off to somebody else, they can use it just as well. So yeah, DPoP is the idea of proving possession, proving that you are the intended possessor of that token, and it's usually using some form of signature verification in that model. The key difference of DPoP compared to other attempts at this is that it's doing it at the application layer rather than at the transport layer. Another way to do it is through the TLS connection, where the client makes a request with a TLS client certificate, and the server looks at the fingerprint of that cert and associates the token with that fingerprint, so you have to keep using that same certificate whenever you use the token. From the client's perspective, you configure your HTTP client with the certificate and then you don't change any of your application-layer code, right? There's no code you write to do it; it's all done in your HTTPS stack. DPoP, on the other hand, the idea is you write this into your application layer, so it's actually changing what HTTP headers are sent and things like that. There's some complexity with client-side certs that we've just never been able to make easy, because it's just never caught on. I was doing client-side certs for banks in the late '90s, and maybe that's a domain where it's still used, but it just never caught on the way, say, server-side HTTPS did, and in the last few years
that's become even easier. So there must be, and I don't know deeply what the challenge is, but there must be some sort of challenge that has kept it from being made easy for developers or companies, you know, small to medium-sized companies, to use, because there are a lot of applications for client-side certs and it's just never really taken off. Yeah, it's really interesting. I do know that the user interface in browsers for client-side certs was never good. You would get the little pop-up saying "choose your certificate," and people were like, "What? I don't understand. What do you mean?" That sure was a bad interface. There are definitely ways to improve that interface; it could be done well and could kind of just remove that as the problem, and browsers never evolved it. I think it's a bit of a chicken-and-egg situation, because the browser vendors have no reason to evolve it if nobody's using the feature, and nobody uses the feature because the user experience is so terrible. So someone has to go first: okay, well, we're going to use this even though it's bad now, because then hopefully browsers will make it better. But browsers have no reason to do that work until it's going to be actually useful to people.
Yeah, but even outside of the browser context, this is still definitely doable today, because you can configure your HTTP client, in your code, to use a client certificate. You know, every curl wrapper has a hook to add the certificate to the request when you make the request, so that's doable. There's no user interface concern there, and it's actually not even that bad from the developer's perspective either. But the other challenge with that model is on the server side. If you're using a gateway, or any architecture more complicated than just a single server, you are probably going to be terminating your TLS at some layer before the request hits your application server, which means the check of whether the access token matches the certificate has to happen at the thing that's terminating the TLS. The certificate may not actually make it back to your application, which is the part that knows about access tokens. So it makes that architecture a bit harder for deploying the server side of it as well, and I think that's another reason mutual TLS hasn't necessarily caught on. And then it gets to be even more of a problem in the case where you are using, say, Okta as your authorization server but you are building the resource server, because both of them have to be able to do that validation, right? Okta is the one issuing the token bound to the certificate, but it will also be verifying that the token request matches that certificate, and then your APIs have to validate it too, which is a lot more complicated than just checking the signature of the JSON Web Token or doing a token introspection request, because you can't just forward the certificate along. Yeah, it's a challenge.
So I think that's the reason DPoP is being talked about: it changes where that verification is done, both from the client's perspective and from the server's perspective. With DPoP, your browser client can do it, because you can write JavaScript code that creates the signature and manages the keys. You can also do that from your server-side code. And then on your resource server, on the API side, you can validate it in your application, because it's just an HTTP header that gets, you know, forwarded through all the gateways that you might be behind. Right, yeah, so we'll see how it goes. I was just going to say, I do have one question currently about DPoP, which is: how does the resource server know what public key it should be using? Is it a JWKS kind of interaction? And is a JWT a requirement, first of all? It seemed like it was, from my brief reading. And if it is, how does the server go about it? Because if I've intercepted an access token, I could go create a new JWT, sign it with my own private key, and send along a DPoP header. Yeah, so the idea is that when the client makes the request for the access token, and this is where the authorization flow has already happened before we get to this diagram, right, the logging in and all that stuff, this token request has grant_type=authorization_code, the authorization code from the query string, all those parameters get put in here. It also sends a DPoP proof, which is basically a signature over this request. So at this point it's got a key, it signs this request, and then the authorization server says, great.
Here's an access token. And the authorization server keeps track of the public key that it saw in this request. So at this point, the authorization server knows this access token was issued to this particular key. When the access token is used at the resource server, the request also contains a DPoP proof, again using the same key, and the resource server has to be able to say: okay, first, is the token valid, and then, does this proof match the key from the original token request? Coordinating between those two makes it a little more complicated; it's not the same as the introspection request, and it's considered out of scope of the spec, because these are often, you know, part of the same piece of software. But yeah, in the case of, of course, Okta, that's going to require some coordination between your API code and Okta's authorization server, right? I don't know exactly what that would end up looking like, but that's how it works. So if the client comes and brings a different key in this request, it would fail, because the key in this proof does not match the key used in that one. Right. That's cool.
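The two checks just described can be sketched as a toy key-binding store. Assumptions: all names are invented, the proof's signature is taken as already verified, and "thumbprint" stands in for any stable identifier of the public key, such as a JWK thumbprint.

```python
# Toy model of DPoP key binding. The authorization server records which key
# an access token was issued to; the resource server later refuses the token
# unless the accompanying proof was made with that same key.

token_bindings: dict = {}  # access token -> public key thumbprint

def issue_token(access_token: str, proof_key_thumbprint: str) -> None:
    """Authorization server side: bind the token to the client's key."""
    token_bindings[access_token] = proof_key_thumbprint

def check_binding(access_token: str, proof_key_thumbprint: str) -> bool:
    """Resource server side: accept only a proof made with the bound key."""
    return token_bindings.get(access_token) == proof_key_thumbprint

issue_token("token-abc", "thumb-1")
print(check_binding("token-abc", "thumb-1"))  # True: same key as issuance
print(check_binding("token-abc", "thumb-2"))  # False: a different key fails
```

The "coordination" point in the conversation is exactly that `token_bindings` has to be visible, in some form, to both parties when the authorization server and the resource server are run by different people.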
Yeah, it makes sense. This is a good question, though: doesn't DPoP increase the resource demand at the endpoint? Wouldn't it bottleneck easily and leave you vulnerable to flood attacks? I don't know that it would necessarily balloon so much that you could DoS the server by sending these proofs; one of the goals is that it's supposed to be quick to verify. But it does definitely add more complexity at the resource server. So here is... let's see, is this where it's explained? No, the protected resource access. When the client goes to make the request, this is basically what it ends up looking like: here's my access token, and then here is the DPoP proof, proving that I control the same key that was used when the access token was issued, and you have to be able to verify this at the resource server. So here's the explanation of how to do it. I'm trying to see if there's a quick description of it... no, I'm not seeing a quick description of it. The question would be: is it possible to verify that without adding a network request? And I believe it is possible, because this is still a JSON Web Token, right? You have to know what key to expect to find in the proof, which you would know somehow from the access token; that's the sort of hand-wavy implementation-detail bit. But if you know what key to expect, you can verify this and verify that it matches that key. So I'm guessing one way to do it would be to put some identifier of the key inside the token, if you're also using JSON Web Tokens for access tokens, which is not a requirement. Reading the chat: "signed JWT swapping... can't create message integrity... nope, only contains proof." That's actually a good point.
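That "identifier of the key inside the token" idea is essentially the JWK thumbprint from RFC 7638: hash a canonical JSON form of the key's required members, and for JWT-formatted access tokens the DPoP draft carries it in a confirmation (`cnf`) claim. Here is a standard-library sketch, where the key values are made-up placeholders rather than a real curve point:

```python
import base64
import hashlib
import json

def jwk_thumbprint(jwk: dict) -> str:
    """RFC 7638 JWK thumbprint: SHA-256 over the canonical JSON of the
    key's required members, base64url-encoded without padding."""
    required = {
        "EC": ("crv", "kty", "x", "y"),
        "RSA": ("e", "kty", "n"),
        "oct": ("k", "kty"),
    }[jwk["kty"]]
    # Canonical form: only required members, lexicographic order, no spaces.
    canonical = json.dumps(
        {name: jwk[name] for name in required},
        sort_keys=True, separators=(",", ":"),
    )
    digest = hashlib.sha256(canonical.encode("utf-8")).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

# Placeholder EC public key (values are not a real point on the curve).
key = {"kty": "EC", "crv": "P-256", "x": "fake-x", "y": "fake-y", "kid": "app-1"}
print(jwk_thumbprint(key))  # a 43-character base64url string
```

Because only the required members are hashed, extra members such as `kid` don't change the thumbprint, which is what makes it a stable identifier for "the key this token was bound to."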
The proof that's created... let's see, is there a description of what goes into the proof? Yeah, here's what's signed. You create a JSON Web Token, and it's going to include a unique ID; it's going to include the HTTP method, so POST or GET; the URI for the request, interestingly without query and fragment parts; and then the time it was created. And that's it. So it's not signing the contents of the request; it's just signing, sort of, the metadata. And this was actually kind of part of the discussion this week: there is an HTTP message signing spec being worked on also, which gives you the ability to sign as much of the HTTP request as you want, including the body if it's a POST request, or any arbitrary headers as well. That's a very different problem, and actually a lot more complicated to implement, because you have to deal with canonicalization of the request and all that kind of stuff. So this is intentionally, that's what it was saying down here, intentionally a smaller-scope problem. It's an intentional design decision to keep it simple to use. But it does mean that replay attacks are possible with this, actually, because if someone can intercept the JSON Web Token being used as the proof, they could use it again to make a request, as long as that request matches the same stuff that's being signed. So you can't steal a DPoP proof JWT and use it at a different endpoint, but you could use it at the same endpoint, and because the body is not signed, you could swap the body out, right? So yeah, it's not perfect, but hopefully the idea is that it's better than not doing it, right?
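Those four pieces of metadata can be sketched in code. This builds only the header and claims portions of the proof: the claim names (`jti`, `htm`, `htu`, `iat`) and the `dpop+jwt` type come from the draft, while the example URL is invented and the actual signature step is elided.

```python
import base64
import json
import time
import uuid

def b64url(data: bytes) -> str:
    """Base64url without padding, as used in JWTs."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")

def dpop_proof_unsigned(method: str, url: str, public_jwk: dict) -> str:
    """Build the header and claims of a DPoP proof as described above.
    The signing step is elided: a real proof appends a signature over
    these two parts, made with the application's private key."""
    header = {"typ": "dpop+jwt", "alg": "ES256", "jwk": public_jwk}
    claims = {
        "jti": str(uuid.uuid4()),  # unique ID, lets a server spot replays
        "htm": method,             # the HTTP method is signed...
        "htu": url,                # ...and the URI (no query or fragment)
        "iat": int(time.time()),   # when the proof was created
        # Note what is NOT here: the request body. That's the gap that
        # allows swapping the body on a replayed proof, as discussed above.
    }
    return b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(claims).encode())

proof = dpop_proof_unsigned("POST", "https://api.example.com/orders", {"kty": "EC"})
```

Keeping the signed surface this small is the trade-off being described: far simpler than canonicalizing a whole HTTP message, at the cost of same-endpoint replay.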
Yeah, and there's an aspect of it that feels a little bit like what happens with PKCE, in that it's, you know, your client application that's signing the JWT, and that's not something that can be easily stolen, kind of like the code verifier in PKCE. It's generated by the application, so you should have some level of safety there. Yeah. So let's see: "proof is more like signing the access level, meaning each URI's already been authorized by the auth server." Well, it's not really any sort of access control thing or authorization aspect at all. It's really just meant to prove that the access token is being used by the same thing that the access token was originally issued to. And PKCE is a good example of that too, because PKCE is basically the same idea: what PKCE is doing is making sure that the thing the access token is being sent to is the same thing that started the PKCE flow, so you don't accidentally send a token to a different instance of the app, for example. But then after the flow, after the access token is issued, PKCE is out of the picture. So this is like having a way to keep that going throughout the rest of the lifetime of the access token, making sure that the only thing using the token is the same one it was sent to at the beginning. But, I don't know, this is obviously not a perfect solution, and the only way it actually gives you any real level of assurance is if the key that's used to make this proof can't be extracted. So if you are in JavaScript, or, actually, no, a desktop app too: if you are generating a key in your application and then storing the private key in that application-layer code, it can be stolen, right?
If somebody has access to steal the access token, they probably have the same physical access that would let them steal the key, if it's being stored in memory or in that application storage. So the only way to store it where it can't be stolen is in some sort of hardware security module, right? If the key is generated in a way that it can't be extracted, then it can't be stolen. So the application can be tricked into signing stuff, but the key can't be stolen out of the application. So in general, what all these things are trying to do is stop an attack which is actually extremely common: your browser is an application running on your computer, and it's storing data somewhere on your hard drive. If someone has access to your hard drive, they can actually just basically clone the entire browser, right? You can grab the cookies out of the browser, because they're on the drive somewhere, and if you pick up all of someone's Google session cookies, you can drop them into your own browser, and now you're just logged in as them. You've bypassed all the MFA and everything, because MFA is only securing the actual issuing of those credentials.
It's not securing the actual access. So once MFA is done, you've issued a session cookie or an access token or whatever it is, and if you can pick that up and send it somewhere else, you're already logged in. So this is about getting at the data the browser has written to disk, rather than trying to attack the browser directly with a browser extension or something. Yeah, because browsers are reasonably secure within themselves. They do a lot of work around cross-site scripting; actually, there are a lot of browser APIs now to prevent even cross-site scripting attacks from working, and there's definitely a lot of stuff in there for preventing things like hardware access from the browser or cross-tab memory sharing. None of that's a problem anymore, right? We don't worry about those kinds of things. But it's all, at the end of the day, written to your hard drive, and the OS doesn't necessarily have those same levels of protection. On iOS you do, right? iOS basically has no way for apps to talk to each other, so that's good. But on a desktop that's not the case, and the way this tends to happen is someone will trick you into downloading an executable file through some sort of social engineering attack, and that file can then just pick up your browser data and send it off somewhere and take over your entire Google account. Which has absolutely happened, many times, usually involving running crypto schemes on large YouTube channels. Yeah, which happens again and again. Yes, I have seen several YouTube channels of people I know being taken over through, usually, that scheme, because once you have a big enough YouTube channel, you're going to figure out that you need two-factor auth and good security practices on your Google account, but none of that matters if they can get access to your actual browser. Right. I think it's a vector of attack.
Maybe it's not so much now, with a little bit of time that's passed, but it's a vector of attack that I think was kind of neglected for a while. We spend so much time inside the browser protecting tabs from themselves and from applications, you know, cross-site scripting and HTTP-only cookies, and it's kind of genius, in an evil sort of way, to just bypass all of that altogether. Yep, exactly. So that's the general problem all these things are trying to solve, DPoP, mTLS, all that stuff: trying to make it so that even if you do have access to the disk and can pick up someone's cookies or access tokens from outside the browser, you still can't use them, because you need something that can't be lifted, which is a private key in a hardware security module. And those are thankfully getting built into more and more computers and phones now; with Face ID and Touch ID, all those keys can't really be extracted. So being able to use those to create these signed proofs would then actually solve that problem. The question is just: what's the easiest way to do it, for developers as well as for people running servers? And that is the big unknown right now, so DPoP is one attempt at solving that. "With DPoP or mTLS, can we have large timeouts or no expiration?" The DPoP proof or mTLS proof isn't tied to the token lifetime, so the token can still have its own lifetime in the same way it normally does, because it's going to be either a structured token like a JSON Web Token or just a random string stored in a database somewhere, and that can still have its own policies around how long it lives and how it gets refreshed and all that. DPoP or mTLS is just the idea that you have a key that you have to use every time you're interacting with the system, both the authorization server and the resource server. Could you then make access tokens that don't expire? I don't think it changes any properties of whether that's a good idea.
I think all the same considerations around long-lived or non-expiring tokens still apply. It does theoretically reduce the risk of a token being used if it's stolen; that's the whole point of it. But that's not the only reason you don't want to do non-expiring tokens. Right, yeah. "How well is this integrated into existing tools?" I would say it's not at all yet; it's all still very much in progress. Yeah, there may not even be a reference implementation for its current state yet, but that would be a cool thing to play with anyway. That seems like a relatively high bar to entry right out of the gate, though. And that is, I mean, at least part of the problem we have already with access tokens and browser applications, which is that the dirty little secret is there are very few reliable, strong ways to store those, to store any kind of data in the browser, at least currently. Are there concurrent efforts to allow the browser access to hardware security? Yeah, that's part of what WebAuthn is about. I haven't seen any APIs for storing data protected by a key like that, but there is an API, the Web Crypto API, which is a very low-level API, and hopefully it's going to get better. No, it's this one. So what this lets you do is, from JavaScript, you can generate a key, but the key is stored by the browser and it can't be grabbed from JavaScript. It's kept track of by the browser, but JavaScript can't access the private key. It can generate a key, it gets an object handle to it, and then it has a way to run sign; there's a sign method, like, please sign this for me. There is an export function, but I'm pretty sure generateKey has an attribute so you can say: make it not extractable.
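That "handle that signs but never reveals the key" pattern can be sketched outside the browser too. This is only an analogy: it uses an HMAC secret held in a closure so that it runs with just the Python standard library, whereas the real Web Crypto and DPoP setup uses an asymmetric key pair kept by the browser or a hardware security module.

```python
import hashlib
import hmac
import os
from types import SimpleNamespace

def generate_signing_handle() -> SimpleNamespace:
    """Return a handle that can sign and verify but never exposes its key,
    mimicking a Web Crypto key created with extractable: false.
    (HMAC is a stand-in; real DPoP uses asymmetric signing like ES256.)"""
    secret = os.urandom(32)  # lives only inside the closures below

    def sign(message: bytes) -> bytes:
        return hmac.new(secret, message, hashlib.sha256).digest()

    def verify(message: bytes, signature: bytes) -> bool:
        return hmac.compare_digest(sign(message), signature)

    # Callers get sign/verify operations, never the key material itself.
    return SimpleNamespace(sign=sign, verify=verify)

handle = generate_signing_handle()
sig = handle.sign(b"POST https://api.example.com/orders")
print(handle.verify(b"POST https://api.example.com/orders", sig))  # True
```

The security property discussed above comes from exactly this shape: code (or an attacker injecting code) can ask the handle to sign things, but cannot read the secret back out.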
So: generate a key, hold on to it, don't let me access the private key, but let me run sign on it, so you can sign a message with it. Now, whether that integrates with the computer's hardware security module is a different story, because ultimately, if that key doesn't integrate with the hardware security module, it's still stored on disk somewhere and can be extracted. But what this does, at the very least, is make it so that cross-site scripting attacks can't steal the key, which is a much more common attack than someone getting disk access to your machine. Yeah, I mean, you kind of have bigger problems than just the browser if somebody has disk access to your machine. Well, yes and no, because the browser is extremely powerful, usually has a lot of long-lived session cookies and things like that, and has already gone through MFA. So if someone has disk access to your machine, the browser is probably one of the most highly valuable targets, because what else are you going to do, just steal random files? Those don't do anything. And it's not like everybody has their Bitcoin private keys on their computer either, right? Yeah, that's fair. But that SubtleCrypto API seems like it would be directly useful for DPoP, because that's presumably what you would use to generate a key and then be able to sign stuff. Yeah, definitely. And I did just find two libraries, a Go library for DPoP and a TypeScript library, so I'm curious to see what this one actually does under the hood. But this is how you use it, right? You generate a key pair, and then you can generate the proof JSON Web Token. The idea is that it shouldn't be that much code to actually use it. Once you've got a library that understands how to create that signature, it should be as simple as calling, you know, "give me the JSON Web Token with this key, and here's what you're signing." You're signing
You're signing The URL and the HTTP method So this one does not use that API because it says storage of the key is not part of this So that would be a feature request for this library is to use This Library or API to generate the key so that it doesn't have to be stored at the application layer Yeah, well, let's talk about what else is coming up. I see We inside of March. There's an OAuth 2.1 meeting and then another one that has Some other interesting topics Yeah, so Monday is OAuth 2.1 and that is the ongoing effort to consolidate and simplify the specs so that they are there's just less stuff to read and It's a huge effort as we are seeing but we are making progress. We're gonna have Hopefully some updates to share with done a couple of changes based on some feedback from people in the group so far There's been some really good feedback of just a lot of its just suggestions on how to rephrase Sections and move things around to make it more clear or suggest dropping some parts that are redundant or whatever it is So that's been something I've been working on quite a bit and Yeah, it's been a new draft published I want to say Not that long ago February 19. Oh, no sooner Monday. Yeah. Oh, right and Yeah, this is the new There's a couple of more suggestions one of which is to move to rearrange This which talks about authorization code and client credentials to rephrase it so that this is like how you get an access token and then like this is the token request section and you would put off code client credentials and Refresh tokens in that same section because those are all token requests But a lot of this stuff just like readability, right? It's not changing the spec So yeah, what we're gonna be talking about on Monday, though is stuff that is more in the actual like implementation differences or normative requirements kind of thing so Like how we address Sorry good. 
Yeah, I was going to ask what drives what you're going to talk about, because I know there's a lot of activity on the email list in between meetings, but then somebody's got to come up with an agenda when the working group is actually meeting, to say, okay, this is what we're going to focus on. So I know it's OAuth 2.1 generally, but what drives what you're actually going to talk about at the meeting? I have to make that agenda after this happy hour, actually. The three of us editors talked about it on Monday, though. Or, it wasn't Monday, I think it was Tuesday; I don't know, this week has been weird. But yeah, we each have our own lists, and we kind of came up with a list of: here are the things we think are good talking points for the meeting, and here are the things we think don't need to be discussed, because we all agree or it's very straightforward and we're just going to make the change later. The main things to bring up to the group are the ones where, if the three editors don't agree on something, we need to get more input, either to have someone change our minds, or to decide not to do the thing, or whatever it is. But if the three of us agree, well, we all have different backgrounds, so it's probably a good idea to do it. And some of the things are just, yes, this paragraph would fit a lot better if it was in this other section; we don't need to bring that to the group for discussion, right? Yeah. So that's 2.1, and then the Monday after that is two drafts: Client Intermediary Metadata and Multi-Subject JSON Web Token. These are two new ones being brought into the group.
You will notice they're individual drafts, not adopted yet. So the idea will be, again, to see if there's enough interest in the group to actually adopt these as working group items. So what is this notion of client intermediary metadata? I see it's under your name. Yeah, this one is one I've been working on, and I've been working with the Financial Data Exchange group. That's another one of these groups that is adopting OAuth for a specific vertical, which is the financial industry, and it's primarily US-based companies. A lot of work has already been done in the OpenID Foundation's Financial-grade API working group, so a lot of what FDX is doing is just saying, we're going to take these other documents and adopt them as well. But it's basically a collection of a bunch of US financial companies that are trying to work together on standardizing the use of OAuth in the financial space, to avoid things like screen scraping and credential sharing. Because it is frankly embarrassing that when you go connect your bank account to various apps, like budgeting apps or QuickBooks or TurboTax or whatever, you really should not be putting your banking login credentials into those applications. It's embarrassing that it still works that way. So yes, we're trying to actually solve that, and there are thankfully a lot of people who agree that problem should be solved. So the next question is how to solve it, and what parts of OAuth do we need, what parts are a good idea for that. Obviously OAuth is a framework, and there are a lot of optional pieces, so it's kind of like a bunch of building blocks.
You get to assemble them however you want, and this draft is basically filling in a piece that didn't exist in OAuth that they need.

In normal OAuth, if you want to go access data from, say, the Twitter API, you, the developer of some application, go and make an account on Twitter, sign up as a developer, agree to their terms of service, and then you can get credentials for your application. That's fine, but in the financial world it's a lot more complicated than that.

So say, for example, QuickBooks wants to download transactions from US Bank. Well, Intuit is the company that writes QuickBooks. They don't want to go to US Bank, make a developer account, and get credentials for QuickBooks Online, because there are so many different banks, and they'd have to do that for every single bank, which is just too much work. And then every other piece of software would have to do the same thing, right? There's too much of it, and the banks don't want to deal with all of those requests either.

So what ends up happening in practice, and this already is a thing, is that there are these aggregator companies that sit in the middle. They create the relationships with the banks, and then they sort of wrap the API and provide that data to end-user applications like Mint or QuickBooks. What it means is that there's a whole bunch of these end-user apps, a few aggregators, and a whole bunch of banks, and there are a lot fewer cross-connections, right? It's not every app talking to every bank.
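To picture the missing piece, here's a rough Python sketch of the kind of client registration metadata such a draft could enable. This is illustrative only: the field names ("intermediaries" and friends) are hypothetical stand-ins invented for this example, not the draft's actual metadata names. The idea is simply that a client's registration can name the other parties in the chain, so the authorization server can surface them to the user.

```python
# Hypothetical client registration record. The "intermediaries" field name
# is invented for illustration; the real draft defines its own metadata.
registration = {
    "client_name": "ExampleBudgetApp",
    "redirect_uris": ["https://app.example.com/callback"],
    "intermediaries": [
        {"name": "ExampleAggregator", "uri": "https://aggregator.example.net"},
    ],
}

def consent_text(reg):
    """Build the kind of consent-screen message this metadata would enable."""
    shared_with = ", ".join(i["name"] for i in reg["intermediaries"])
    return (f"{reg['client_name']} is requesting access to your account. "
            f"This data will also be shared with: {shared_with}.")

print(consent_text(registration))
# → ExampleBudgetApp is requesting access to your account. This data will
#   also be shared with: ExampleAggregator.
```

The point of registering the chain up front, rather than per request, is that the authorization server can render the full list of recipients on every consent screen without trusting the client to self-report it at authorization time.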
It's a bunch over here talking to a few, talking to a bunch over there.

The trick is that, because this is the financial industry, there are a lot of regulations around data sharing agreements and things like that, and users have to be informed. When you see that OAuth screen, you want it to say, QuickBooks is trying to access your data from US Bank, right? You want to end up with a screen like that. But you also have to say, and this data will be shared with X, Y, and Z, because of all the different companies involved in that chain. And there isn't any mechanism in OAuth to specify the other companies in the chain, or the fact that those should be displayed on the consent screen. That's what this draft is doing. Basically, it's giving a way to associate the data of all the intermediaries that are going to be getting access to the data when an application is registered. Once that building block is there, then they can take OAuth, this draft, a few other drafts, put some limitations on them as described in the OpenID financial API group, and they get a profile of OAuth that is useful for that world.

Well, I mean, one of the companies you're talking about, the intermediary company, like Plaid. Right now they do a lot of the scraping, you know, the give-us-your-OTP kind of work. It would be really nice to see this evolve, because like you said, it's kind of embarrassing at this point that the worst offender seems to be the place that holds, among our most sensitive information, the most sensitive. The banks are so far behind, relatively speaking.

It's pretty bad. Like, Google's security is way up there, and they are one of the best at protecting consumers, having a good OAuth experience, and being really strict about it. And they don't hold on to your actual money, so I would actually much rather trust Google with my bank account credentials than my own bank's login at this
point. But yeah, hopefully with this we can improve that, and the US will finally catch up to what is already happening elsewhere. A lot of banks in the rest of the world are using OAuth now, but there's still a long way to go here.

Yeah. Oh, cool. And you know, I started digging into this, the JWT one, the related... yeah, multi-subject, that was the word I was looking for, oops. I was kind of interested by it, and what stood out to me is this notion that you still have, like, a primary subject, but then there are related subjects that get embedded. It's cool, but I was left with, and you may or may not know this because this isn't your spec, but what is the use case they're looking to solve here? Because it's not just that you have a bunch of subjects, it's that these other subjects are related to the primary one in some way.

Mm-hmm. Well, I mean, there's this use cases section, so... I haven't read this yet, actually. This is good prep for Monday. A primary subject with a related secondary subject that has authority over the primary subject: child and parent, pet and owner. The secondary user logs in and gets asked for permission to access resources for the primary subject. The authorization server issues a JSON Web Token with the primary subject in the enclosing JSON Web Token and the secondary subject in the nested one.

So I guess the idea would be, if you're using JSON Web Token access tokens, and you're using the subject of the JWT to identify which resources to look up, but you want to know that it was not that person who actually authorized it, right? In typical OAuth, the same person who owns the resources is the one authorizing access to them, so the subject of the JWT happens to coincide.
It's both the person who logged in and the person who owns the resources. So this is for the case where those are different people: the person logging in is not the person who owns the resources.

Yeah, and the other one, the second use case, seems relevant too. The example is a married couple or a partner relationship. I mean, that's pretty common, where, you know, I need to go get a prescription filled for my wife or something.

Yeah, and you don't want to have to log in as her to do that, because that's misrepresenting what's actually happening. So this would give you a way to actually represent, I guess, what's really happening.

Yeah. I am curious about this, because it seems like there are multiple ways to solve it, and this is just one. So it looks like it's adding a new claim, related subject, that has a description of the relationship, and then a JSON Web Token inside it that describes that subject. This is one way to do it. You could say it's "authority over", so that covers the pet case, or whatever. But you could imagine that you don't have to put a whole JSON Web Token under there; you could just put new claims under there, right? That's a different way of doing it, with different properties. Those are the kinds of questions to ask: you know, why does it work this way? What's the intent?

Is this the first time it's being taken up at the working group?

I believe it is. Let's see.
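To make the claim structure we were just discussing concrete, here's a rough Python sketch of the nested-claims idea. The claim names are illustrative stand-ins, not necessarily the draft's actual names, and in the draft the related subject would itself be a nested JWT rather than a plain object as shown here.

```python
# Illustrative sketch: the enclosing token's "sub" is the primary subject
# (the resource owner); a related-subject claim carries the secondary
# subject who actually authenticated, plus the relationship. Claim names
# below are invented for the example, not taken from the draft.
enclosing_claims = {
    "iss": "https://as.example.com",
    "sub": "patient-123",             # primary subject: owns the resources
    "related_subject": {
        "sub": "spouse-456",          # secondary subject: who logged in
        "relationship": "partner",    # e.g. partner, parent, pet owner
    },
}

def actor_and_owner(claims):
    """Return (who authenticated, whose resources are being accessed)."""
    owner = claims["sub"]
    actor = claims.get("related_subject", {}).get("sub", owner)
    return actor, owner

print(actor_and_owner(enclosing_claims))
# → ('spouse-456', 'patient-123')
```

A resource server reading a token like this can keep looking up resources by `sub`, while its audit log and access decisions can distinguish the authenticated party, which is exactly the gap in plain OAuth described above.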
This was published in March, but version zero was March of last year. So, oh, interesting, and it was in a different working group then, and it got moved here. And then apparently the author changed companies between September and March. One of our friends from Auth0, it looks like. I actually didn't realize that he's at Auth0.

Yeah. Well, so that's that week, and then four more in April. These two are in progress and have been in progress for a while; we've definitely talked about those. And then this one is new. It's not actually a spec, it's a use cases document about how browsers might be better involved in identity and authentication, because right now they're not really involved. They're just shells that can be used to build authentication flows. There are people from browser vendors who are actually interested in improving that space and building more actual browser UI around authentication use cases, so we're trying to collect those use cases.

And then this one. I'm pretty sure that acronym was not chosen accidentally.

Yes. Token Mediating and session. Yeah, see, "session" should have been capitalized; they just left it out for the funny acronym. Token Mediating and session Information Backend For Frontend. Oh yeah, and they capitalized the F. Nice. Good one, guys.

But this is a brand new one, first published last month, and it's a new idea for how to coordinate between single-page apps and a server-side back end they do have, and it's describing some interactions there. So we're taking a look at it, because people are gonna have some strong feelings about that,
I'm sure. So this is the time to chime in.

Yeah. And the goal with the Security BCP is to continue to revise it while we're in between 2.0 and 2.1?

Yeah, basically. 2.1 is supposed to include everything in the Security BCP, but the Security BCP is still useful as a reference for OAuth 2.0. So we still want to finish that one, get it done, and then everything in it gets put into 2.1.

Yeah. The TMI BFF one I think is particularly interesting, because we have talked a lot about, you know, if you have an architecture where you have middleware, you're much better off keeping your tokens there than having them arrive at the browser at all. If you have a pure SPA, maybe you don't have much of a choice, right? But if you have some sort of middleware... and I wonder, you may not know yet, but is the intention to kind of address that, to codify that in some part of a standard?

Actually, it's the opposite. The browser-based apps spec that I started, which is a little stale right now, I know we need to go back and revise it, but that one describes the pattern you're talking about: do your OAuth flow from the server side, keep the token server side, and do a session cookie to the browser, so the access tokens never make it to the browser. This is exactly the opposite. This is: you do the OAuth flow from the server, but then you explicitly give the tokens to the browser, so the browser can make API requests without going through its back end.
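As a minimal sketch of that pattern, here's roughly what the server side looks like. The function and field names are simplified for illustration; the actual draft defines the real endpoint and response format.

```python
import secrets
import time

# Simplified sketch of the TMI BFF idea: the back end completes the OAuth
# flow and keeps the tokens against a session, but exposes a browser-facing
# endpoint that hands out only the short-lived access token, so the front
# end can call APIs directly. Names here are illustrative, not the draft's.
SESSIONS = {}  # session id (from an HttpOnly cookie) -> tokens, server-side only

def complete_oauth_flow():
    """Stand-in for the real authorization code exchange with the AS."""
    return {
        "access_token": secrets.token_urlsafe(16),
        "refresh_token": secrets.token_urlsafe(16),  # never leaves the server
        "expires_at": time.time() + 600,
    }

def login():
    """Server-side login: run the OAuth flow, store tokens, return a session id."""
    sid = secrets.token_urlsafe(16)
    SESSIONS[sid] = complete_oauth_flow()
    return sid

def bff_token(sid):
    """What the browser-facing token endpoint returns: the access token and
    its remaining lifetime, but never the refresh token."""
    tokens = SESSIONS[sid]
    return {
        "access_token": tokens["access_token"],
        "expires_in": int(tokens["expires_at"] - time.time()),
    }

sid = login()
print(sorted(bff_token(sid).keys()))
# → ['access_token', 'expires_in']
```

The design trade-off is that the refresh token and client credentials stay server-side, while the browser still gets a bearer token it can use directly, which is weaker than the pure keep-everything-server-side pattern but avoids proxying every API call through the back end.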
So it's establishing that pattern, and yeah, we'll see what people say about it. They brought it up on the mailing list when they first published it, but we'll see what the thoughts are around it. At the very least, it seems like I might have to add a reference to this in the browser-based apps spec, as, like, a fourth option.

Right. Well, it's a lot of interesting stuff coming up. Since we've had a bit of a gap, it may be useful to let people know how they can get involved if they're interested. The site that we've been looking at is events.oauth.net.

Yeah, and this is where we post these meetings, but sometimes other things end up on here too, like the workshops or the IETF meetings. Where is it... all the happy hours are posted here. Or am I thinking of the identity workshop? That's another place. It's a different group, but there's some interesting stuff happening there as well. And the security workshop, which is, again, not IETF, but a community-organized event talking about OAuth security stuff, and a lot of good discussions happen there as well.

Back in the before times, when we could actually travel around... Yeah, yeah, I actually went to Singapore and Montreal and Prague all in the same year.

So this is the place to go to find these, and they are publicly available to join. You can absolutely join, show up, and talk about things. There's no apply-to-be-a-member button in the IETF. You can just sign up for the mailing list if you want and write messages, or come to the meetings.
It's very informal that way.

Yeah. The IETF homepage for the OAuth group, there it is, the one I talked about. This is the other place to find the meetings and such, but I find it a little harder to navigate. And these are the lists of documents the group is working on, which is a lot of in-progress work.

All right. And so we'll be back here in two weeks? Yes. We're doing every other week... actually, I think we might even be doing every week during all this. There's a lot of activity. Right, that would make sense. Yeah, just because there are interim meetings every week for the next six weeks. So we will be here every week, except for Oktane week, when Oktane will be happening. That's April 6th, so whatever that Thursday is, we won't be here.

All right, well, we'll put it up on that events page anyway. It's good to get rolling again. Good to be back. Yep. And I guess we're at the top of the hour, so let's just wrap it up here, and see you back here same time next week. Hey.