Welcome to another round of the OAuth Happy Hour. Always a good time here. I'm Aaron, and we have Vittorio here as well. Hi. It's been a while, so I'm sure there's a lot to catch up on; a lot of interesting things have happened in the world of OAuth, and I think we have a few other fun updates to share as well. As always, if you have questions, feel free to drop them in the chat and we're happy to take them, but we're also going to be sharing what's new in the OAuth working group, what the recent updates to the OAuth specs are (the interim meetings are back and rolling again), and what's new in the world of OAuth in the news. There was a massive Facebook outage on Monday, which has some interesting implications for OAuth as well. So it should be a fun hour for everybody who's here. With that, let's start with the easy stuff and get it out of the way first: updates from the OAuth group. We've had kind of a slower summer, as you know, with vacations and people taking time off, which is fine, but things are back; it's October and we are back in business. This month there is a series of interim meetings, where the group meets (what would have been a face-to-face is still virtual) and discusses a particular topic in depth: giving updates on a particular draft, talking about what's new in it, and then hopefully discussing some of the open issues and moving things forward in those drafts. Right. Mailing lists are not always the best way of gathering consensus or getting the temperature of the room, and having this high-bandwidth time, at least twice a year, is not a bad thing at all.
It definitely helps having some actual face time, even if it's still virtual; that's very different from mailing-list interactions. So the first of the series was yesterday, covering a new individual draft, HTTP signatures for OAuth, which is one of Justin Richer's drafts. This draft applies HTTP Message Signatures to OAuth. Oops, those are his slides; I meant to show the draft. Applying HTTP signatures to OAuth is something that has not really been done before and is still very experimental, in part because the HTTP Message Signatures draft itself is still very experimental. Yeah, maybe we should zoom out a bit for people who aren't following closely what happens at the IETF, including myself. I knew this was happening by osmosis, but I actually looked it up once I saw it coming up. The idea is that Justin and Annabelle from AWS are working with the HTTP working group, so at a lower level than OAuth, on a mechanism for performing and validating signatures over HTTP messages. There is this generic, very powerful mechanism: you select the parts of the message to cover, the kind of key you want to use, the algorithm, and so on; attach the signature to your message; and send it out. And, correct me if I'm wrong, Aaron, but I think Justin in the past had another draft which was meant to perform proof of possession with OAuth. Let's say it was a mechanism for showing, when you use a token, that you know a particular key, and that would have eliminated a number of the challenges you have when you use a bearer token. And the meeting we were having, was it yesterday or today? It was yesterday.
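To make that "select parts of the message, pick a key and an algorithm, attach the signature" idea more concrete, here is a rough Python sketch of the signature-base construction in the spirit of the HTTP Message Signatures draft. The derived component names (`@method`, `@path`, `@authority`) follow the draft's style, but the formatting is simplified and HMAC is used only for brevity (the draft also supports asymmetric algorithms), so treat this as an illustration rather than an implementation of the spec:

```python
import base64
import hashlib
import hmac


def signature_base(method: str, path: str, authority: str,
                   covered: list, headers: dict) -> str:
    # Derived components (names starting with "@") come from the request
    # line; everything else is looked up in the message headers.
    derived = {"@method": method, "@path": path, "@authority": authority}
    lines = []
    for name in covered:
        value = derived.get(name, headers.get(name, ""))
        lines.append(f'"{name}": {value}')
    return "\n".join(lines)


def sign(base: str, key: bytes) -> str:
    # HMAC-SHA256 keeps the sketch dependency-free; a real deployment
    # would more likely use an asymmetric algorithm.
    mac = hmac.new(key, base.encode(), hashlib.sha256).digest()
    return base64.b64encode(mac).decode()
```

A verifier would rebuild the same base from the message it received and compare signatures; if any covered component was tampered with in transit, the bases no longer match.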
Oh wow. And it was about whether that old draft, which is expired by the way, for doing proof of possession with OAuth should be abandoned, and instead a new one started which tries to achieve the same goals but uses HTTP Message Signatures, which, as Aaron mentioned, are still being discussed and are not yet a standard. So: drop the old proof-of-possession draft and work on a new one which uses HTTP signatures as, let's call it, the mechanism for performing the proof of possession. Is that fair? That is fair, yeah, exactly. And I guess, backing up one level further, the idea of proof of possession, you did summarize it, but it's the idea of having some sort of key that you do not actually send in your request; you instead sign something, so that when you protect that key properly, the server receiving the message can be sure it is actually that client sending the message, and not something else that has stolen an access token. The big problem this whole realm of proof of possession is trying to solve is stolen access tokens: if using a token requires signing something with a key, and the key can't be extracted, then stealing the access token by itself no longer gets an attacker anything. And frankly, this has not been solved well so far; there's no super great solution. There's really only one that is an RFC, and I would not say it is widely deployed: mutual TLS, where the client has a TLS certificate that it uses when it establishes the TLS connection, and the access token can be associated with that certificate. That's really the only one that exists, other than the handful of experiments that have all either expired or just disappeared.
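The mutual-TLS association described here is the certificate-bound access token from RFC 8705: the authorization server puts a thumbprint of the client certificate into the token's `cnf` (confirmation) claim, and the resource server compares it against the certificate actually presented on the mTLS connection. A minimal sketch of that check, with a fake byte string standing in for a real DER-encoded certificate:

```python
import base64
import hashlib


def cert_thumbprint(cert_der: bytes) -> str:
    # Base64url-encoded SHA-256 thumbprint of a DER-encoded certificate,
    # the value carried in the cnf/"x5t#S256" claim per RFC 8705.
    digest = hashlib.sha256(cert_der).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()


def token_bound_to_cert(token_claims: dict, presented_cert_der: bytes) -> bool:
    # A certificate-bound access token carries the client cert's
    # thumbprint; the resource server checks it against the cert used
    # on the TLS connection. A stolen token fails this check unless the
    # thief also has the client's private key.
    expected = token_claims.get("cnf", {}).get("x5t#S256")
    return expected is not None and expected == cert_thumbprint(presented_cert_der)
```

This is why simply exfiltrating the bearer string stops being enough: the attacker would also need to complete the TLS handshake with the bound certificate's private key.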
So, shameless plug: the very first episode of the Identity, Unlocked podcast was with Brian Campbell, and we spent the entire episode exploring the space of sender constraint, which is the collective name for all of this. Brian was very nice and patient and walked us through mTLS, which you just described; a mention of token binding, a basically dead attempt to achieve the same thing by tying the token to the TLS layer itself rather than to an X.509 certificate; and DPoP, which is very nice and fashionable right now, a lightweight way of doing proof of possession with dynamic key generation. In general it's an interesting space, and a fast-moving one, and this idea of using HTTP signatures is largely playing in the same space, which, if we go by the early reactions on the mailing list, seems to be concerning for a number of people in the working group. Yeah, definitely. The HTTP Message Signatures draft has nothing to do with OAuth by itself; it is really just a spec about signing an HTTP message, which also means it has lots of applications outside of OAuth. That's a really important aspect here: that spec is not meant to be tied to OAuth, and the concept of signing HTTP messages is useful in a wide variety of things.
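To give a feel for how DPoP does its "lightweight proof of possession with dynamic key generation," here is a sketch of the proof JWT a client would attach to each request. The claim names (`htm`, `htu`, `jti`, `iat`) and the `dpop+jwt` header type are the ones in the DPoP draft, but the actual ES256 signing step is omitted to keep the sketch dependency-free, so this is illustrative only, not a usable DPoP implementation:

```python
import base64
import json
import time
import uuid


def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def dpop_proof_payload(method: str, url: str) -> dict:
    # The claims a DPoP proof carries: which request it covers, a unique
    # ID so the server can reject replays, and an issued-at timestamp.
    return {
        "htm": method,           # HTTP method the proof is bound to
        "htu": url,              # target URL
        "jti": str(uuid.uuid4()),  # unique per proof
        "iat": int(time.time()),
    }


def unsigned_dpop_proof(method: str, url: str, public_jwk: dict) -> str:
    # The header advertises the client's (here hypothetical) public key;
    # a real proof is then signed with the matching private key, which
    # the client generates on the fly and never sends anywhere.
    header = {"typ": "dpop+jwt", "alg": "ES256", "jwk": public_jwk}
    return (b64url(json.dumps(header).encode()) + "."
            + b64url(json.dumps(dpop_proof_payload(method, url)).encode()))
```

Because the key pair is generated dynamically by the client, the server learns the public key from the proof itself and can bind the issued access token to it, with no certificate infrastructure needed.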
The OAuth group in particular would like a form of proof of possession that is easy to write code for and easy to deploy, and HTTP signatures, if they already existed, would be a good option. That's what this draft was trying to do: say, okay, let's assume this thing does go forward and does become a standard; what would it look like to use it in OAuth? That's the draft we were talking about yesterday, the application of HTTP signatures in OAuth, so all these layers. But yes, the initial response on the mailing list was not super positive. Mainly, the concerns sounded like it's too experimental to adopt at the moment, and also that signing things and canonicalizing things is hard; those are the two main points I heard. Yeah, there are a few, and there's also the aspect of confusion. Of all these methods, you're right that mTLS is the only one that is officially an RFC, and there's a bit more impetus behind it because FAPI, the financial-grade API profiles, require it, so the financial institutions that want to comply have supported it, no matter how hard, and it's seeing a bit more traction. But apart from those very bright lines, as in, you've got to do it or you won't get certified, there is a lot of confusion. Okay, let's pretend DPoP actually becomes a standard: how do I decide when to use DPoP versus doing these things with HTTP signatures? Justin made some arguments which I believe have solid roots in security considerations, but we know that some of that nuance is lost when you give guidance to people, and we tend to always move toward simplicity. For example, people tried to argue that you don't need PKCE everywhere, and then we found the place where it turns out you do need it; for a while people pushed back, but ultimately people said, you know what, let's
just make it simple, let's just require it everywhere. So here we might see a similar dynamic: if the difference between DPoP and HTTP signatures is not easy to explain to the non-OAuth-cognizant developer, then that might be another reason why it makes sense to wait until HTTP signatures are enshrined in the RFC Olympus, or however you say that in English. Yeah, it's definitely a tough one. Personally, my own feeling is that I would love to see HTTP signatures exist as a thing on their own, because that means they would very likely be built into HTTP libraries like curl, where it would then be super easy as a developer to just say, hey, also sign this message, here's my key file, please go sign this message, and you wouldn't really have to do anything. Ideally there would also be the corresponding code on the server side to validate the signature, and it would all live at the layer of the HTTP client itself, meaning I don't have to write application-layer code to make it work. I would love to see that, but we are not there yet, because the spec is brand new and there's no telling how long it would take for a client library like curl to adopt it, even once the RFC comes out. So it is a challenge. Yeah, I hear you. As you were talking I was reminded of when JavaScript had no crypto whatsoever and you had to build everything from scratch, using basic math and integer arithmetic to build the crypto; then, once browsers started shipping crypto APIs, so many things got so much easier, and HTTP signatures might play a similar role. I'd be really interested to see how the market reacts, because it's good to have something we are so confident in, so polished, that you can actually call it a standard, but we know the market will pick and choose the things it wants to use
the most. I also see potential synergies with other things like WebAuthn, places where the browser is already dealing with crypto, or with access to the secure elements of the machine, and flows in which the user is involved, because you might need a user to unlock access to your key material. So I think this is a technology which can be incredibly powerful and incredibly empowering, but I would like to see what people use it for first: let the standard get enshrined in libraries, then see what people do when they are using it, and then we can build on top of that even further, with more confidence that what we are doing is actually what the market wants, as opposed to what we imagine. Yeah, I think that's a good summary of that session. The minutes are posted online if anybody wants to see more of the discussion, and Justin's slides are online as well, so you can dig into the details. The best place to find that is the actual meeting page on the IETF site, so I will drop that link into the chat. And Justin did a great job: he did a little retrospective and explained how signatures work in a very concise way. Both decks are great, so if you want to get up to speed, they're really worth a look. Yeah, definitely. But let's move on to our next topic. Well, let's just quickly summarize the work that's coming up in the next interim meetings, which I'm sure we'll talk about on another happy hour, without getting into the details of all of these, because there are way too many of them. The new drafts that have been updated are these ones up at the top here, and they're all going to be the topics of the next couple of meetings: a new draft of OAuth 2.1; authorization server issuer identification, which we've mentioned a couple of times; DPoP, which we talked about as the other alternative to
HTTP signatures; and then rich authorization requests got an update, and so did JWT response for token introspection, a mouthful. But look at all this new work happening; that's really great to see. The sessions currently scheduled are OAuth 2.1 and rich authorization requests, I'm super excited about that one, and DPoP has one at the end of the month, so it's possible we'll get another meeting or two out of this as well that just aren't on the calendar yet. I'm ready to bet that, just like last time, the 2.1 session will run long and we'll have to book a second one, because the surface you cover in that one is just so large, and you're taking the opportunity to revisit some things which are uncontroversial but where even the language is being reworked, so I think there will be a lot of healthy discussion about it. Yeah, definitely. So that's the updates from the world of OAuth; let's talk a little bit about Facebook. I think it's an interesting one because I didn't actually see much discussion of the OAuth implications of the Facebook outage. On Monday this week, Facebook had a massive outage where they basically deleted themselves from the internet. It was an accident, and they were just gone for a long time, several hours. Six hours. Six hours of just not being on the internet. It's one thing when some random website goes down, fine, but Facebook is not just a random website, and it is a lot more than cat photos: it is also used as a way to sign in to other things. A lot of people rely on the Sign in with Facebook button, and that's what caught my attention: if Facebook is down, then all of the Sign in with Facebook buttons are also down, and that can have a lot of implications outside of Facebook, because you can use your Sign in with Facebook
to get into all sorts of different things. And the result here was that you just couldn't click that button; the button would go nowhere and you were stuck. It's tricky, because this is always the question: is it worth relying on an external provider for logging into your app, or are you better off building or running your own provider, or just using a password? That's a constant struggle for developers, which is the right way to go, and then you see something like this happen and you think, well, maybe I should have just rolled it myself. Right. Well, of course, a couple of things here. First, I'd say it was cringeworthy, because those are people like us, working in identity, and those are engineers who were struggling for six hours. I heard that Telegram gained seventy million new users in six hours. Seventy million! That's more than the population of my native Italy; it's mind-boggling. Anyway, they deserve our sympathy, and it was pretty funny to see some of the memes, like the Twitter founder tweeting "welcome everyone, and I mean literally everyone," literally like that. But there were also other things I found infuriating: people who know just enough to be dangerous chiming in with very categorical, let's say dogmatic, extreme views, like "oh, it's stupid to rely on an external provider you have no control over." Everything has pros and cons. Whenever you outsource a critical function to an external element in your architecture, you are going to be tied to the SLA of that thing, and this is not just true of the identity provider; it's true of everything. A recent example is platform authenticators. You can sign up to a website, and if the website detects
that you are on a device that has a WebAuthn platform authenticator, like Face ID or Touch ID or the various others around, then it can let you come right in, super easy, super simple: just look at the thing and boom, you're in. But if that's the only mechanism you set up, then if you migrate to a new device, or you lose your device, or you reset it, you are going to be dead in the water; that's just the nature of these things. Which is why, normally, for this kind of stuff we have alternatives. No website, apart from the craziest ones, will allow you to sign up only with a platform authenticator. With roaming security keys they might, because you can move those around; but with a platform authenticator you almost always need to provide another factor, so that if something happens, and you need to expect that something will happen, you have something to fall back to. That's one consideration. The other consideration is that signing in is not always a fungible operation. For us professionals, very often we abstract it away and say, okay, here is an app, you want users to access their records on your app, so something needs to happen for those records to come back online, and I could send you to Facebook, I could send you to GitHub, I could send you to Amazon, whatever; it doesn't matter, as long as we get there. But very often that's not what happens, especially given that here we are in OAuth land: very often you sign in with a particular provider because you also want to call the APIs of that provider. If you are on Google and you want to use Google Calendar, or you are using OneDrive because your files are on OneDrive, the sign-in is not just the sign-in; it's also unlocking your ability to call that stuff. So for sign-in you can do remediations, like, okay, I'm treating the social provider like an authentication factor, and as
such I also have my fallback, using an authenticator app, for example, or an OTP, so that when that provider is not there, it doesn't take me down with it. But there will be times when you will be unable to do that, because you are leveraging other things that are unlocked by that particular provider. So, over to you, Aaron; I don't want to suck all the oxygen out of here, but what do you think? No, it makes a lot of sense, and you're absolutely right that having at least a recovery method, if not an actual alternative way to sign in, is something a lot of people will do as they build these things, because they're not going to rely on only one factor. And yeah, treating a social login as a factor is probably a good approach: you have your Sign in with Facebook path, and that might disappear, because the user might have their account shut down, or delete their account, for example, and if they want to get back into the account they used Facebook to get into, you need something else. If you've got their email address, you can at the very least always send them a magic link by email; they can click it, get back into their account that way, and set up a new authentication mechanism. Definitely something to think about as you're building. I don't think it's by any means a bad idea to rely on external providers for this, because it does simplify things quite a lot, but being prepared, not just for outages, but even for the user voluntarily deleting their account, is important. Very important. Oh, absolutely, and it's not just voluntary. I normally stay away from controversial topics, but in the last two years we saw a number of things people could say on social media that would get them banned, sometimes temporarily, sometimes forever. So you don't need to wait for a once-in-a-century Facebook downtime to have a big problem; you just need to poke the AI in the wrong way and the AI is
going to put you in AI jail, and anything attached to that particular account becomes inaccessible. Which is why, and I've observed this over the years, if you remember the enthusiasm we saw for OpenID 2.0 when it appeared many years ago: then some of those things started happening, and developers realized that having a direct relationship with the user, or having a remediation path, is useful. So it's very common for people that want to accept social providers to also offer other mechanisms. And sometimes there are other things you can do, like having multiple social providers that accrue to the same account, once again thinking of those as factors, so you can use account one or account two, but you unlock the same record in the application you are signing in to. There are many tricks one can play, but I think the bottom line is: relying only on social providers, unless you are targeting that particular ecosystem and you need those APIs (if you are on Android and you need to do something with the store, you must add a Google account), apart from those cases, having some degree of redundancy is a good idea. Yep. And it's not just for the outage case, which probably won't happen again for five years; Facebook is not going to disappear themselves from the internet again, it's just not going to happen. There'll be some new problem that comes up, and we'll see what happens, but it's not just those cases you have to worry about; it's all of them together. And the finsta! They might actually have issues. You know that thing that happened where the senator asked Facebook, "but are you committing to end finsta?" Oh yeah. So if they don't end finsta, I don't know what's gonna happen in five years. Yeah, we'll see how that all plays out. Fascinating. So yeah, any other questions
about the Facebook situation? Otherwise we will move on. Do feel free to drop questions in the chat at any point and we will get to them when it makes the most sense. Okay, let's take a little break to talk about my tweet yesterday, because I think it was hilarious, and approximately three other people will also think it's hilarious. I found this AI tool that generates text given an input: you give it a couple of sentences as a seed, and then it goes and creates similar-sounding sentences on the topic it has guessed, and you can tell it what kind of sentences you want, whether a product description, a Twitter bio, things like that. So I was trying stuff out, and it made some pretty fantastic sentences about OAuth, and I just love these. I think you should do a dramatic reading of them. Okay, let's see if I can. "At the end of the day, OAuth is just a way to communicate between services and users, and as such, it's easy to grasp." And now a controversial truth. Truth! Go ahead. "OAuth solved a universal problem, signing into a website, and in doing so created a new problem: how does a site know it's really you?" Sounds nice: flowery words, nice sentences. Any clues from our lovely audience here about what is wrong with these sentences? Because basically every single point in here is incorrect, but I just loved what it did, and I was so inspired by this AI that I actually went and created a brand new Twitter account just to be able to tweet these sentences, and there's a lot more coming, let me tell you. So I highly recommend following this account: it is wtf_oauth. Just go follow it and you will get a nice daily dose of humor. I have a bunch of tweets scheduled, so it'll be fantastic. I'm going to drop it in the chat so you can go and follow it. But yes, even the bio: "OAuth isn't just a way to log
into your favorite sites, it's a way to log into the internet itself." But, you know, I'm paranoid; it's professional bias. After two decades of doing this I tend to be paranoid, as all of my colleagues know, and I guarantee that a lot of people will read that and say, "why WTF? This makes perfect sense." Because the challenge with AI is usually garbage in, garbage out, and just as AI can amplify biases and inequalities and the like, I think here we'll observe, and of course I followed that thing, I think I was follower number two of the entire account, but I'm convinced that we will see, over and over again, the urban legends about OAuth surface in there, because the AI is like a sieve that shakes the sand, and the stuff that remains is the things that recur, which are typically the misunderstandings, like people saying "oh yeah, OAuth is used for signing in, right?" And for that reason some people will actually see their bias confirmed and say, "oh yeah, great, I'm getting my daily wisdom from this thing." So, if I were you, and you can do whatever, but if I were you I'd add a bigger disclaimer, like "this thing is not real, the cake is not real, nothing is real," something so you make sure people don't learn from it. Probably a good idea, because it may be too subtle for most people who are going to be looking at this randomly in their timeline, because Twitter is also going to start bumping these tweets into random people's timelines once more people start following it and liking stuff. And as I mentioned, some of us will see one of those, find it hilarious, and retweet it, and then some people in our followers list will not find it hilarious. I don't think that many people think that, but some will think, "oh, he tweeted it, so it must be
right," and so now I would make myself a culprit of spreading misinformation about OAuth. Yep, and we wouldn't want that; there's already enough of that out there. Well, if it starts conversations, it might not be one hundred percent bad. A lot of the problems we have here are the classic Dunning-Kruger effect, where most people are in this uncanny valley in the middle, where they know enough to be dangerous, and depending on personal inclination they can either keep a beginner's mindset and always question, or they might feel, "okay, I put in the work, now I know everything." And instead, asking the questions and even discussing the basics sometimes leads to clarifications that people didn't know they needed, so I'm optimistic that this thing will be a good thing. Yeah, okay, well, that was a fun little diversion. Definitely go and follow this account, because who knows what will happen with it, and you'll get in early on this very, very niche joke. Well, if we don't find some amusement from time to time, our field is not particularly shiny, so we need to make lemonade out of what we are handed. I'm glad at least one other person thinks it's funny. Yeah, excellent, thank you, Siren. Okay, so, back to business. What else? Oh wait, no, before we get back to business, I did want to ask you a question: what is the explicit flow? I know what the implicit flow is. Well, we don't want YouTube to ban us, so we can't get too explicit. The thing is that in the early days of OAuth there were a number of terms that people brought up; some were temporary terms that even appeared in some specs and then were killed, but others were just people
wrapping their heads around these brand new things. Remember the two-legged flow and the three-legged flow? People actually used those terms, even among ourselves, when we were discussing this stuff. Thankfully they died out, because if you were to classify today's flows by legs, we'd have a full zoo of beasts; there are far too many legs now. I think the explicit flow was just a symmetry thing: people heard about implicit, and since back then there were basically only two or three flows, they figured the other one must be explicit, and they started using it. I don't think I've ever seen it in a spec, but yesterday I heard the term and said, wait a minute, what is this thing, did I miss it? I started searching for it and found some instances here and there, but I think people used it as a synonym for the code flow, as opposed to the implicit flow. I thought it was funny, so I tried to tweet something that remained safe for work, and we got a good laugh here and there. Oh, and then Annabelle had a fantastic response. Yes, Annabelle piled on, fantastic. That's really good; I laughed at that as well. Last night I was just out of energy, cracking myself up between that explicit flow and the AI text generator. Well, and the week before I actually had a tweet go semi-viral, the one with the ID token and access token. Oh yes. It's one of the few posts where I got more reactions on Twitter than on LinkedIn; usually on LinkedIn I have way more reactions than on Twitter, and this time it was reversed. Apparently it resonated with a number of people. Now I'm trying to find it... no, I don't see it, I was gonna try to... oh, there it is, this one. Yes. And it's true, we have this discussion on an almost daily basis; today I had to do this
again, because the thing is that there are so many things appended to that, and I myself had to go on a journey to convince myself that it's truly true in every case. So let's talk about that, because I feel like I also had to go through this journey every now and then: but what if you only need it for this, then isn't it okay? And it's a tough one, because it seems like, if you're building an app and you've just got the user to sign in, and you've got an ID token, and you think, cool, now I know who they are, then why not just send that to an API, have the API do the same validation on the same thing, and return some content? It seems simpler. Yeah, it's the naive approach, which kind of makes sense, because for the non-initiated: access tokens don't have to be, but very often are, in JWT format, and you look at the two and say, well, there aren't a lot of differences; and especially if you're not doing the delegation stuff, so you don't have scopes, sometimes there are no visible differences between the two at all. But the two main reasons we normally give, which I think are the most effective ones, are these. One is the matter of the audience: the ID token has as its audience the client. So if you make it possible to use that token with the API, that would mean the API has the same audience, the same identifier, as the client. Not only does that give you a bigger blast radius, because now there are more opportunities to obtain a token that works for calling the API, but you also have no chance of running any logic at the authorization server that is specific to the API, so you couldn't actually place the claims that are needed for that particular API. And there is also the fact that the token is tied to the act of sending the ID token to the client: there are mechanisms that are meant to protect against reuse of an ID token;
If someone steals the ID token and tries to use it again against your client, chances are that won't work, because there was the nonce that was issued together with the request to the authorization server, and once you got the token back, the nonce was compared against it. So basically the ID token comes with a mechanism for making it single-use. But this single use doesn't translate when you make an extra hop, when you send it elsewhere. We have other mechanisms, which we mentioned earlier, sender-constraining mechanisms for protecting access tokens in the leg between the client and the API, but those mechanisms don't apply to the ID token: there are no flows that help you tie the ID token to the channel between the client and the API, because it's not meant to be used that way. Is that a fair summary? I think that's a fantastic summary. One of the blog posts I ran into as I was thinking about this last week is this one from Dominick Baier, which I should drop in the chat. It's a small rant about the name "ID token", but I thought it was a very good summary of things, and a much better way to think about it. We often talk about ID tokens as being about the user, but it's really more about the authentication event than it is about the user, and I think as soon as you start to think about it that way, it becomes clear why it's not at all appropriate to use it as an access token. Right, the ID token is capturing an authentication event: the user logged into this app, at this time, when the client had this nonce; it happened at this moment in time; you should only trust the statement for this long. All these things are in the ID token. And it may also include some information about the user, like their name or their email, but it is mainly about the fact that they have authenticated into the app.
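The nonce check just described, which makes the ID token effectively single-use at the client, can be sketched like this. It is an illustrative sketch only: a real client would use a JWT library and also verify the signature, issuer, audience, and expiry, and the class and method names here are made up.

```python
# Hypothetical sketch of the OIDC nonce check that makes an ID token
# single-use at the client. A real implementation would also validate the
# token's signature, issuer, audience, and expiry via a JWT library.
import secrets

class SigninSession:
    def __init__(self):
        self.pending_nonces: set[str] = set()

    def start_signin(self) -> str:
        """Generate a fresh nonce to send in the authorization request."""
        nonce = secrets.token_urlsafe(16)
        self.pending_nonces.add(nonce)
        return nonce

    def validate_id_token(self, claims: dict) -> bool:
        """Accept the ID token only if its nonce matches a pending request,
        then burn the nonce so a replayed token is rejected."""
        nonce = claims.get("nonce")
        if nonce in self.pending_nonces:
            self.pending_nonces.remove(nonce)
            return True
        return False

session = SigninSession()
n = session.start_signin()
token_claims = {"sub": "user-1", "nonce": n}
assert session.validate_id_token(token_claims)       # first use: accepted
assert not session.validate_id_token(token_claims)   # replay: rejected
```

Note that this protection lives entirely at the client that initiated the request; an API receiving the same token one hop later has no pending nonce to compare against, which is exactly why the single-use property "doesn't translate" downstream.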
So formally you are correct, in the intent. Technically correct, the best kind of correct! It is the best kind of correct in a court of law, and in theory, but as soon as you step outside, unfortunately, the behavioral economics start having an effect and making everything a bit dirtier and less clear-cut. From a formal perspective that works, and that was clearly captured in the intent of the authors of the spec, in which in almost all cases the ID token is bare: it contains only that stuff, and if you want user information in there, you have to actually ask for it using the profile scope, and you also need to use the correct response type. And here the funny thing is, I believe this is the part of any spec I know of with the lowest level of compliance, because in almost all the implementations I work with, as soon as you ask for profile, and in some cases even if you don't ask, the resulting token will contain the profile information, no matter what response type and response mode you use. And I think the reason is more historical, more about the jobs to be done. You know those classic memes where there's a sidewalk that makes a corner, and there's grass, and you can see that clearly people cut the corner and always walk through the grass? That's, I think, a good visual metaphor for what might be happening here. People were used to things like SAML and WS-Federation and similar, where it was already a big deal that, in order to cross boundaries between organizations, instead of assuming you can get identity information using LDAP, as you can when you are within the same boundaries, you now had to somehow package that into something portable, something easier to move than a Kerberos ticket, that can actually go over the public internet. And it was all this thing about claims, about the fact that you don't have to pre-provision the user on the other side. So we worked hard to convince people that federation was a good thing, and now, if you had followed exactly this flow, you would have come back to the point where you do need to call an API to retrieve the identity information: whether you get the token from the back channel or you hit the userinfo endpoint, in both cases you are calling an API on the side. And in the process you now also need a secret, because in order to get an access token for a web app you need a secret. So a lot of people just couldn't find the motivation to even move toward OpenID Connect without tokens that contain at least some skeletal profile of the user, because "I'm doing federation" was just that ingrained, even if we know it's an abuse of the term. So you are absolutely right, and Dominick is right: the ID token's intent, as the framers saw it, was "this is proof that a successful authentication occurred, let me tell you how it occurred," and the user information is more of a, I'd say, stowaway. But for a lot of people, that stowaway is the reason they are doing all of this. I worked on products which had a v1 that followed the spec verbatim, and they got zero or very little traction until the product was modified accordingly. So I see both sides of the issue. Yeah, that is a tough one, and I think what happens when someone goes and applies a spec and actually creates a real thing out of it is a better indication of what the spec should have been than what may have been written down at the beginning, because ultimately specs are not useful in themselves until you apply them to something. Yes, so much yes. In fact, I don't want to be cancelled, so I will not go into the details, but so many things are happening today in the world of identity which are, to me, the counterpart of the Silmarillion from
Tolkien, or of writing entire grammars of the Klingon language. Like, sure, the fact that you can think about it doesn't mean it exists in the world. It might, and it might turn out that Klingon is actually easier to learn than English, and then this fictional stuff becomes something we'll start using for business communication. It's possible. But to me, standards are different. To me, standards are: a lot of people have been trying, in the real world, to solve a particular problem, and their approaches are similar enough that it makes sense to try to describe them, so that now we can do what we are trying to do, solve the problems we're trying to solve, in an interoperable fashion. To me, imagining something that isn't there, and for which there is not even clear demand, is potentially a huge waste of time and resources. It might always turn out to be a fantastic innovation instead, but if I have to bet, and I'm not a betting man, but if I were to bet, I'd rather bet on stuff for which there is clearly a need. Yeah, makes a lot of sense. We have a little bit of time left, and there's a good question here that's relevant to our current discussion, so let's talk about this. The message is in three parts. "I'm trying to build", I don't know why it says No Name, "I'm trying to build an internal OpenID Connect provider on top of OAuth 2.1. I can't figure out which response modes are compatible with OAuth 2.1, since only the authorization code grant is supported. Can you tell me which ones are compatible?" I feel like we were just having this conversation. No Name, did you slip these questions in? No? Okay, the stars are aligning. Yes, we were just having this conversation, because in the OAuth 2.1 draft that I published two days ago, I forgot to do exactly this, which was to add a note about this exact point. So, in OAuth 2.1, well, OAuth 2.1 and OAuth 2 don't define response modes, only response types; response modes are all in OpenID Connect. For response types, OAuth 2 defines code and token: response_type=code being the authorization code grant, which PKCE adds on to, and response_type=token being the implicit flow. The OAuth 2.0 Security Best Current Practice says don't use response_type=token, scratching out the implicit flow, and what we did in OAuth 2.1 was to just not include it, so it's not even there to begin with; it's gone. OAuth does not talk about response modes and it does not define any other response types, but there is an extension mechanism in OAuth 2 and OAuth 2.1, which is that you can define your own response types, and you can also extend the flows in other ways. So it is intended that other specs, like OpenID Connect, can still define new response types and their own parameters, like response modes. What that basically means is that if you just look at OAuth 2.1, you're not going to find response_type=token, and then if you read OpenID Connect, it can define new things. I would still not recommend that you use any token response types in OpenID Connect at all; that's still a bad idea, but it's not necessarily prohibited, because it's just not in OAuth 2.1; it would all be done as an extension in OpenID Connect. Now, this is the sort of thing that needs some little note in the 2.1 spec to make this part clear, so that this question doesn't keep coming up. But the trick is that it doesn't really make sense to forward-reference, because OpenID Connect is on top of OAuth, and that makes it kind of weird for OAuth to say "here are some notes about one particular extension that has extended the spec" without mentioning all of the extensions of OAuth, of which there are several. But this is obviously a very common one, and a question that comes up a lot. As far as what we actually mean: do not use response_type=token at all, under any circumstances. Getting an access token back in the redirect is a bad idea.
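For contrast with the removed response_type=token, here is a sketch of what 2.1 does keep: the authorization code grant with PKCE. The endpoint, client ID, and redirect URI below are made-up placeholders; the code_challenge derivation follows the S256 method from RFC 7636.

```python
# Sketch of an OAuth 2.1-style authorization request: response_type=code
# plus PKCE (RFC 7636, S256 method). Endpoint and client values below are
# placeholders, not real registrations.
import base64
import hashlib
import secrets
from urllib.parse import urlencode

def make_pkce_pair() -> tuple[str, str]:
    """Return (code_verifier, code_challenge) using the S256 method."""
    verifier = secrets.token_urlsafe(32)          # high-entropy random string
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
    return verifier, challenge

verifier, challenge = make_pkce_pair()
params = {
    "response_type": "code",                      # the only grant kept in 2.1
    "client_id": "client-abc123",
    "redirect_uri": "https://app.example.com/callback",
    "scope": "openid profile",
    "state": secrets.token_urlsafe(16),
    "code_challenge": challenge,
    "code_challenge_method": "S256",
}
authorization_url = "https://as.example.com/authorize?" + urlencode(params)
# Only the short-lived authorization code comes back in the redirect; the
# client later redeems it at the token endpoint, sending `verifier` so the
# server can recompute and compare the challenge.
```

The point of the shape above is that no token ever travels through the front channel: the redirect carries only a one-time code, bound to the verifier that never left the client.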
That's true even in OpenID Connect. But if you want to get an ID token back in the redirect, which is something that OpenID Connect made up, they made up ID tokens and they made up the mechanism to return them in the redirect, that's in OpenID Connect, and it's not disallowed by OAuth, because it's just not in OAuth. And that's the kind of weird situation we're in with these. And I would like to add that the reasons for which it's a bad idea to return a token in the redirect don't apply to the ID token case, let's say, given what we said earlier about intended usage. As long as you don't use the ID token like an access token, then returning it from the authorization endpoint, which I think is a better phrasing than "in the redirect", is different. The traditional implicit flow was introduced mostly for single-page apps, as a mechanism for delivering the access token to JavaScript so that JavaScript could perform its own calls, and in order to deliver it there, it was typically returned in a fragment. And that has all sorts of challenges: not only is the access token in the fragment, it might end up in the history, it might end up in the headers that reveal the referrer of a redirect. So, a bad idea across the board. In the OpenID case, you basically return the ID token, which, if you are doing this in the context of a SPA, stops there; it doesn't go anywhere else, and even if someone steals it, apart from the problem of the PII in its contents, this token cannot be reused, for the reasons we said earlier, because it is tied to the response. As for cases in which it is sent onward: the most common case in which I have to have this discussion, where people say "oh, but what about implicit?", is when this flow is used for server-side sign-in, where we have an app that runs on your server and the sign-in is based on redirects, same deal as SAML. In that particular case the ID token is sent to the client, which runs on the server, I know it's not very intuitive, through a form post, because this token can be big, and all sorts of reasons. But the good news is that not only is it non-reusable, it also will not end up in the URL, so it will not end up in the history. So basically all the security considerations that make response_type=token a bad idea in every flavor of OAuth 2.1, OpenID Connect, and future stuff based on them do not apply to that particular case. We have a PR problem, because people think implicit flow equals access token in the URL, when the implicit flow just means a token from the authorization endpoint; but then how you pull it out, and what kind of token it is, matters. We keep having this discussion, which is why I'm begging the authors of 2.1, Aaron, Dick, and Torsten, to add this very thing, and they graciously agreed to put it in the non-normative parts of the spec, and that's fine by me; as long as it's there and I can send a link to people, I'll be happy. Yeah, so there are two different implications of this, right? If you're building an app and deciding which flows to use, you can obviously choose from the whole menu of what's supported. If you're writing a server and you want to be compliant with the specs and give people the right options, that's where you have to make a decision. I would not create a new OAuth or OpenID Connect server that ever responds to response_type=token, meaning sending access tokens from the authorization endpoint; I would not build that at all. Now, whether or not you want to let clients get ID tokens from the authorization endpoint is a question of how much of the OpenID Connect spec you want to follow, because it is part of that spec, and if you want to let your clients do anything they're going to find when reading docs and using libraries, then you would have to build that feature as well, and let clients get ID tokens from the authorization endpoint.
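The form-post delivery mentioned above can be sketched as the auto-submitting HTML page the authorization server returns. This is an illustrative rendering of OIDC's form_post response mode, with made-up values, showing why the ID token never appears in a URL, history entry, or Referer header.

```python
# Sketch of the OIDC form_post response mode: the authorization server
# answers the browser with an HTML page whose form auto-submits the ID token
# to the client via HTTP POST, so the token never appears in any URL.
# All values below are placeholders.
from html import escape

def form_post_page(redirect_uri: str, id_token: str, state: str) -> str:
    """Render the auto-submitting form the server sends to the browser."""
    return f"""<!DOCTYPE html>
<html>
<body onload="document.forms[0].submit()">
  <form method="post" action="{escape(redirect_uri)}">
    <input type="hidden" name="id_token" value="{escape(id_token)}"/>
    <input type="hidden" name="state" value="{escape(state)}"/>
  </form>
</body>
</html>"""

page = form_post_page(
    "https://app.example.com/signin-callback",   # client's registered URI
    "eyJhbGciOi...placeholder...",               # placeholder ID token
    "af0ifjsldkj",
)
# The token travels in the POST body: it is not part of any URL, so it does
# not land in browser history, access logs, or the Referer header.
```

This is the "not very intuitive" part described above: even though the client runs on a server, the token is relayed through the user's browser, just in a POST body rather than a fragment or query string.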
And maybe part of this PR problem is giving these things new names, and not calling anything "the implicit flow" anymore, because it's not really... is it even called the implicit flow in OpenID Connect? I don't think it actually is. I'm not sure; I think it's spread across the board. But I agree that giving it a new name would be good; for me, let's say, the "immediate mode", something like that. Yeah, the "shortcut mode", the "front-channel sign-in mode"; we can find as many synonyms as we want. The thing is, OpenID Connect is very, very stable, so before it gets amended, something big needs to happen, and I don't know if that route is possible. For me, what's important is that we clarify that sending an ID token through the front channel does not trigger the same concerns that led to the elimination of response_type=token, and there are actually a number of good reasons why, if you look at many, many public services, that's actually how they implement web sign-in. But we are out of time, so we can pick it up, if there is interest, in the next installment; it is a good topic. And next week is the OAuth 2.1 session at the IETF, which means this will be discussed, I'm sure, because it always comes up. I'll try to make sure it doesn't take up all of the time, because there's a lot more to discuss as well, but we will be able to summarize this more in the next OAuth happy hour, which I believe is on the calendar, so I will make sure it gets posted everywhere. So, to wrap up: if you want to make sure you don't miss one of these happy hours, check out the calendar at oktadev.events, and that is where we post these sessions as well as other events that our friends at OktaDev are going to be either giving talks at or hosting workshops for. I have a couple of workshops coming up in November, so take a look at that. There's a calendar feed you can subscribe to, and if you want to subscribe to only the happy hour events, feel free to click on the happy hour tag; there's an iCal feed for that as well, if you really want to get fancy. I'll make sure it gets updated with the next one as soon as we're done here. And yeah, thanks for coming, everybody, thanks for the questions, thanks for the good discussion. There's a lot to catch up on, and we're going to have a bunch of interim meetings, so there's going to be a ton more stuff to catch up on next time as well. All right, thank you all, and we will see you all next time. Bye. Thank you, bye.