Welcome to our second episode of the Workforce Identity Developer Podcast. Today, I'm joined by Matt Egan. Matt is the Director of Technical Strategy here at Okta. Matt, welcome to the show. Hey, Emily. It's great to be here today. So what does a Director of Technical Strategy do? So I work in the corporate organization here. The title Director of Technical Strategy is a little ambiguous, but if you start to place me in the organization, it adds up a little more. I sit in the corporate strategy organization owned by Monty Gray. Monty is an Executive Vice President, reports to Todd, and holds the CorpDev role. That includes mergers and acquisitions, Okta Ventures, and the technical aspect of our go-to-market partnerships where we have joint marketing in place. And then we have a corporate strategy function within that. There, I report to Stephen Lee, and we have purview over the technical liaison between product and engineering, plus any technical due diligence that's required to serve those purposes. So that puts me in contact with a lot of people who rely on, derive value from, and interact with Okta's various integrations. Absolutely. And the developers who are working on things like the Okta Integration Network are surely going to be using the technologies that result from some of the decisions you helped make. Yeah, a lot of the time. I would say the main goal of my role when I joined was to reach the developer audience at companies where there were logical integrations to take place. So an ISV could understand, hey, what's the value prop? If I build this, what's in it for a customer at the end? I was able to articulate that to the developer persona, so they could get justification to build these things. Yeah. 
Those ISVs, or independent software vendors, are the folks who build applications that they then publish on our integration network. So do you have some history on how that integration network got to where it is today? Yeah, a little bit. I can't claim all-up responsibility for it. But if you go back in history and look at what Okta was, and I have, having worked at Okta for about six years, I was actually an early customer of Okta's as well. In the 2013-2014 timeframe, I was deeply involved in evaluating and deploying Okta for a company. One of the key value props as a customer was understanding how much easier it was to deploy new single sign-on integrations and adopt new applications, given the capability I had to bring that identity component in simply. So it started out with that broad catalog of single sign-on applications, and that grew over time. When I joined in 2017, we were starting to deploy more advanced API-based integrations where there was a little bit more cohesion. Some of these things weren't standards-based anymore. We were jumping out of normalized frameworks like SAML and getting into how somebody interacts with an Okta proprietary API to analyze a system log or to sync a directory with a proprietary source. Those aspects started to take shape in 2017, and around that same timeframe we started to understand there was an ecosystem there with that interaction, and we convinced everybody to rename the Okta Application Network, the OAN, to the Okta Integration Network, the OIN. 
And so that's kind of the genesis. Through 2017 it went on to where we are today, with some of these exciting announcements of API service integrations, which are a full maturation of some of the early API-based integrations, where I worked with companies like Splunk to build SIEM integrations and with others like Palo Alto Networks (XSOAR) to do SOAR integrations for security orchestration. Those integrations have taken shape, and now we have a better opportunity to put some formality around how they're discovered and then configured by customers. Yeah, absolutely. And that context of it formerly being named the Application Network kind of helps you understand what we mean by integrations. They're really just those tools that are applicable outside of just your Okta tenant. So what is in it for a developer if they publish their app onto the OIN? What do they gain from that publication and that sharing if it's maybe not just a service that they're selling? Yeah, you can think about who the developers are that are going to interact with the public, market-facing aspect of the OIN, the integration network. I think that's kind of a unique developer persona that's going to get value out of it. It's probably driven by somebody with a business development or product management focus who is trying to highlight a joint integration and would want the exposure that comes along with being on the catalog, and then also the verification and the additional information published there, allowing customers to understand that this is Okta Verified. It's been reviewed. I know that it works. I know that I'm going to have some documentation. I know that if I call, I've got somebody I can ask questions of, to help diagnose and understand these things. So having that consistent documentation experience just greatly simplifies that. Oh, yeah. 
And from an application development perspective as well, having your users already know what they need to do to install that app, what they need to do to set it up, and what the questions it's asking about scopes mean is huge. You get to skip that training because they already know, or Okta can teach them in a way that's generic to all integrations. Absolutely. It helps customers solve problems that are consistent and exist across their landscape. Being able to speak in a common language that's consistent across the ecosystem is huge. So if a developer is figuring out whether their app is a fit for the OIN, they might be a vendor that's going to sell the product to Okta users, or they might just want their own business to integrate with external partners. They might even want to integrate across branches of their own business. For instance, might you develop an app to ease the integration process if your company has just acquired another company that's also an Okta user? Yeah, there are a variety of ways to think about the different audiences that are potentially going to get value out of the OIN. First and foremost, it's going to be those listings of applications where we're providing single sign-on or the lifecycle management aspect of identity. But as you get into some of those more diverse integrations, you do see some opportunities. The value that comes out of those is just having that consistent integration. There isn't really a monetary aspect to this. Nobody's selling anything. Maybe there's a joint highlighting of the integration that allows for that common area, but nobody's transacting across it. Nobody's making any money. But the ease of use that comes out of it, that simplification, is the benefit to Okta customers and the joint partners. 
Now, if I'm doing something internal for my own application, say a line-of-business application or something we built that's highly custom, the value of having something published in the OIN starts to diminish a little bit. There you're just taking advantage of the framework that the OIN provides: generic SCIM servers or SCIM clients, generic SAML configuration templates, or OAuth servers and OIDC endpoints. You don't have to go and build those components uniquely; you're just able to adopt those standards and consume them. So there are those two different personas. There's value in both, but each has a different focus. Something to mention is the framework and just that ease of installation. I mean, if I can get something out of my package manager, I'm so much more likely to use it than if I have to jump through a bunch of hoops and assess it myself to a greater degree. So how does a developer actually go about building an app for the OIN? Building for the OIN, they're going to have to assess their own skill set first. Okta does make a few different SDKs available for interacting with Okta endpoints. So whether you're in .NET, Java, or JavaScript, we generally maintain SDKs for the common platforms. Then you want to evaluate your end goal and understand what you're building. Am I trying to build identity? Am I trying to displace the identity component of my application? Or am I just trying to integrate with a customer's existing identity stack? Most people fall into that last category, where they just want to consume the identity that a customer has. And that's the ideal persona for listing in the OIN: the ISV that wants to have their application discoverable there. 
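To make the "generic SCIM servers or SCIM clients" idea above concrete, here's a minimal sketch of the SCIM 2.0 User resource that such a client would exchange, following the core schema from RFC 7643. The function name and field choices here are illustrative, not any particular SDK's API.

```python
def build_scim_user(user_name: str, given: str, family: str, email: str) -> dict:
    """Build the core-schema portion of a SCIM 2.0 User resource (RFC 7643)."""
    return {
        # The schemas array identifies which SCIM schema this payload uses.
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_name,
        "name": {"givenName": given, "familyName": family},
        "emails": [{"value": email, "primary": True, "type": "work"}],
        "active": True,
    }

# A generic SCIM client would POST a payload like this to the ISV's
# provisioning endpoint (conventionally something like /scim/v2/Users).
user = build_scim_user("kai@example.com", "Kai", "Ramos", "kai@example.com")
print(user["schemas"][0])
```

Because the payload shape is standardized, both sides of the integration can be generic: Okta's lifecycle management acts as the SCIM client, and the ISV only has to implement a compliant server once.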
And then if you're building something that, say, is going to consume log data from Okta, or synchronize directory data, or create users, those integrations look a little bit different. You'd want to understand the business outcome: what customer problem am I solving? We see some evolution around SaaS optimization; there's kind of a common theme there, and we're starting to document some of those use cases. Other ones that have thrived in the past have been centralized log management and security analytics platforms; a number of customers would adopt those. So we see some of those patterns. Understanding what your end goal is, and then identifying which APIs within the Okta framework you're going to interact with, would be the second goal, so you can develop a theory of integration. So you're going to build your code to use those Okta APIs, and you're probably going to build it using OAuth to talk to those APIs. And then once the developer's got their app built, do they basically just hit the submit button? Yeah, although that reminds me of the OAuth piece, which I didn't really touch on. Regardless of the coding language or which endpoints, if you're aiming to integrate with Okta's APIs, there's the OAuth support that was recently released and then turned into the API service integrations that we've just launched. So you're on a path on the OAuth spectrum. The beauty of OAuth is it allows you to move away from the static nature of the SSWS tokens. Those have been convenient, and that was the original way to interact with the Okta APIs. But if you move into OAuth, you can start to mint finely scoped tokens, where you're defining the specific scopes that you need to interact with specific APIs and specific methods and actions on those APIs. 
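The contrast between the two auth styles can be sketched in a few lines. This is a hedged sketch, not a working client: the org URL is a placeholder, and the OAuth request shape follows the standard OAuth 2.0 client_credentials grant with a signed JWT client assertion, which is the pattern Okta's service-to-service APIs use.

```python
ORG_URL = "https://example.okta.com"  # hypothetical Okta org URL

def ssws_header(api_token: str) -> dict:
    # Legacy style: one static, long-lived token, tied to the admin
    # who minted it, sent on every request.
    return {"Authorization": f"SSWS {api_token}"}

def oauth_token_request(scopes: list, client_assertion: str) -> dict:
    # OAuth style: ask the authorization server for a short-lived access
    # token limited to explicitly named scopes. The client proves its
    # identity with a signed JWT (client_assertion) rather than a secret.
    return {
        "url": f"{ORG_URL}/oauth2/v1/token",
        "data": {
            "grant_type": "client_credentials",
            "scope": " ".join(scopes),
            "client_assertion_type":
                "urn:ietf:params:oauth:client-assertion-type:jwt-bearer",
            "client_assertion": client_assertion,
        },
    }

req = oauth_token_request(["okta.users.read", "okta.logs.read"], "<signed-jwt>")
print(req["data"]["scope"])
```

The key difference is visible in the request itself: the OAuth call names exactly which data it wants, while the SSWS header carries the full power of whoever minted it.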
Those are well documented in the Okta API docs. So no longer is there a mystery of which role I need to assign to this user to be able to interact with this API. It's a very well-structured set of scopes that aligns to the Okta APIs and methods. So it's pretty easy to understand which scopes you need to request, and that also leads to better outcomes for the customers. Those SSWS tokens: the name stands for Saasure Secure Web Services, and Saasure was actually the old name that Okta used to go by. That tells you what an old technology they are. OAuth is a huge improvement in terms of the specificity of what you're granting. An SSWS token is linked to the specific user that minted it, so if anything changes about that user, that changes the token. Plus, the tokens tend to be pretty long-lived, compared to Okta letting you mint creds that only live as long as you need them to. So do you think it's ever okay to choose SSWS over OAuth? The number of cases where that's the right call is greatly diminishing. I actually struggle to think of any. SSWS tokens at this point have become, and I don't want to be alarmist, but they're kind of a disaster waiting to happen. So if you're a customer out there relying on SSWS tokens, or an integrator that issues one, I would definitely encourage getting aware: read up on the API service integrations and start to look at what it would take to implement them in your existing code. For customers, encourage your integrators; for integrators, really drive this, to help Okta help customers move their security posture forward. Why do you think people still use the tokens at all? I mean, the tokens are there; they've been around. You mentioned, in fact, I learned something today, that Saasure was actually part of the SSWS acronym. I wasn't aware of that before reading the notes today. So, you learn something every day. That was just the original way. 
And there's a lot of documentation that refers to it. There's a lot of inscribed tribal knowledge of "this is how I'm going to do it." So there's work to be done on the Okta side to make sure our documentation is all consistently aligned, that all of our examples are clearly leading people to the right outcome, and that we're appropriately caveating the liabilities of the SSWS token. That situation where it worked once, and you keep learning from the time it worked, doesn't necessarily make it a good idea. The way that I think of SSWS tokens: it's like you have a vendor that you want to share some of your information with. So you sit down at a table with that vendor, and somebody who has production access unlocks their laptop, hands it to the vendor, and says, here you go, do the thing. You don't want them to have the power to be you, and the token gives them the power to be you. It makes your security reasoning and addressing threats so much more challenging. Similarly, imagine that you're hiring a vendor to do one thing on your premises and you just hand them the master key to the building. You wouldn't do that. It feels icky to even think about doing it. And that's the kind of feeling you probably ought to get when you're asked to hand over an SSWS token, especially from a highly privileged account. Yeah, and with that, these things exist, and I think it just highlights the need for awareness. If you are doing these things, you should absolutely be auditing them and be aware of them. I think the worst thing would be to be unaware. 
And so reviewing the accounts that are active in your Okta org and looking at the SSWS tokens that have been minted: I would inventory those things and see which accounts they're associated with. Performing that type of audit could be really key in turning up potential liabilities, and then you have that hit list of which integrations you need to put on your roadmap to change over the coming years. It could be a bit of a journey, but just being aware gives you an opportunity to audit some of that usage, and there are products out there in the market that will help you keep on top of those things. So it's not like there isn't help, but you should definitely be aware of those integrations and the liabilities that exist. Yeah, there's definitely a path out. Many, many other organizations have had to climb their way out of this scary security place of using primarily static tokens to talk to things, but a lot of places have gotten there, and you can too. And the OAuth switch can be kind of scary. On the one hand, you get to choose exactly what the app is allowed to do. On the other hand, you have to articulate exactly what you want your app to be allowed to do, and figure out exactly what it's going to need to be able to do, if you're the one developing it and requesting those scopes. So do you have suggestions for developers who might be feeling a bit confused by all of the options that specifying those scopes adds, compared to just asking for the token and calling it good? It's going to put this awareness right in the customer's face when they're turning these integrations on: oh, this integration is going to interact with these types of data, with these levels of permission. You get to see the end-user experience when they're interacting with those consent screens. It's very clear. 
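The token inventory and audit pass described a moment ago can be sketched simply. This assumes token metadata in the rough shape Okta's API token listing returns (a name, an owning user ID, and a lastUpdated timestamp); the field names and the 90-day idle cutoff are illustrative assumptions, not an official policy.

```python
from datetime import datetime, timedelta, timezone

def stale_ssws_tokens(tokens: list, max_idle_days: int = 90) -> list:
    """Flag tokens whose lastUpdated is older than the idle cutoff."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_idle_days)
    return [
        t for t in tokens
        if datetime.fromisoformat(t["lastUpdated"]) < cutoff
    ]

# Hypothetical inventory pulled from an org's token listing.
inventory = [
    {"name": "legacy-splunk-pull", "userId": "00u1",
     "lastUpdated": "2020-01-01T00:00:00+00:00"},
    {"name": "fresh-integration", "userId": "00u2",
     "lastUpdated": datetime.now(timezone.utc).isoformat()},
]

for token in stale_ssws_tokens(inventory):
    print(token["name"], "->", token["userId"])  # the audit hit list
```

The output is exactly the "hit list" Matt describes: each stale token, tied back to the account that minted it, ready to be scheduled for migration to a scoped OAuth integration.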
So as a developer, that makes you really consider, one, when it comes to defining the scopes: am I actually interacting with these API endpoints? Because if I'm just going to go and ask for a widely scoped token, I should expect some friction from users. You want to think about that. And then two, when you're making these data considerations: am I interacting with the right sets of data for the right reasons, and am I being responsible? Those are some of the things that are going to be considered when you're doing this. And now you're also putting end users in this consent flow, so it's a little bit more deliberate when a customer is approving those. I've also seen vendors that have decided to publish two different sets of scopes, as two different applications: one that needed read and write, and one that was read-only. I believe it was Zylo, if I recall correctly, that had submitted those. Or maybe it was BetterCloud; there's a diverse number of integrations there. But they had split those into two. And there are a lot of examples where that would make sense: oh, this is read-only, we're just going to be polling the system log, versus this is going to be a SOAR integration where we're going to pull the system log, but we also might add users to groups or change user states, clear user sessions, et cetera, which are privileged operations you might not want to just turn on initially. Yeah, and having that conversation up front, of what is my app going to do, what do you want my app to do for you, is so much better than having the conversation of "wait, you did what?" after something unexpected happens. So, when we're reasoning about scopes: when we were prepping for the show, you were telling me about how scopes are these nested logical categories of data that logically map to the API endpoints. 
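The "two listings" pattern above, one read-only app and one privileged app, could be expressed as two scope sets. The scope names here follow Okta's okta.resource.action convention; treat the exact set membership as an illustration of the split rather than a definitive catalog.

```python
# Read-only listing: enough to poll the system log and read identity data.
READ_ONLY_SCOPES = {
    "okta.logs.read",
    "okta.users.read",
    "okta.groups.read",
}

# Privileged listing: everything above, plus write operations a
# SOAR-style integration might need (state changes, group membership,
# clearing sessions). Customers can opt into this tier deliberately.
PRIVILEGED_SCOPES = READ_ONLY_SCOPES | {
    "okta.users.manage",
    "okta.groups.manage",
    "okta.sessions.manage",
}

def scopes_for(mode: str) -> set:
    """Pick the scope set for a listing mode ('readonly' or 'soar')."""
    return PRIVILEGED_SCOPES if mode == "soar" else READ_ONLY_SCOPES

print(sorted(scopes_for("readonly")))
```

Splitting the listings this way means the consent screen for the read-only app never has to mention write permissions at all, which is exactly the friction reduction Matt describes.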
What do you think the scopes are that someone should start with if they're learning about how the system works in general? Just peruse the Okta API catalog first and start to understand the top-level structures: what Universal Directory has in its users and groups, and then adjacent subjects like policies and how they map to users and groups. So I think first would be familiarizing yourself with the APIs, and then when you look at the scopes, they just become logical, because there are going to be things like okta.users.read, which would be a scope that allows you to read the Users API. So browse the general Okta catalog, and after that, peruse some of the authorization documentation that talks about scopes. And in line with every one of the API examples, especially in the updated API docs that are out there, you actually see the scopes that are required, whether at the top level of an API or down at a specific method. When you dig around in the documentation about scopes, you start learning about some of Okta's internal representations, like this app user object that, to my understanding, is sort of the mapping between, as you'd expect, apps and users. What are we going to learn about the Okta internals as we get more familiar with what we're requesting? It's funny you bring that up. As I started working with the Okta system as an administrator, I was somebody who actually started reading through the APIs, and as you interact with the APIs at a little more of a base level, you really do start to understand the ins and outs. And so there's this construct of, oh, there are app users, and that's what drives Universal Directory, and that's where I would see essentially that graph of who a user is and what they have access to. 
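The noun-and-verb pattern described here, where scope names logically track the API resources, is regular enough to express as a tiny helper. This is illustrative only: it captures the general okta.resource.read / okta.resource.manage convention, not an authoritative mapping for every endpoint.

```python
# Read-style HTTP methods map to .read scopes; write-style methods map
# to the broader .manage scopes. Real endpoints may have finer-grained
# requirements, so always confirm against the API docs for the method.
READ_METHODS = {"GET", "HEAD"}

def required_scope(resource: str, method: str) -> str:
    """Guess the scope for an Okta API resource and HTTP method."""
    action = "read" if method.upper() in READ_METHODS else "manage"
    return f"okta.{resource}.{action}"

print(required_scope("users", "GET"))    # okta.users.read
print(required_scope("groups", "POST"))  # okta.groups.manage
```

Once you've internalized the pattern, reading an unfamiliar endpoint's docs mostly confirms what you already guessed, which is the point Matt makes about the structure being logical.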
It's actually a really insightful unit for up-leveling your general Okta knowledge and understanding, and it is very logical as you get into it: the structures are exactly what you would expect, grouped as classes with a logical noun-and-verb structure after that. Oh, that's really cool. I love it when you pull back the curtain and what you find behind it makes sense, and you kind of would have guessed it. Going back to building applications, building integrations, what's new in the world of the integration network? I don't want to take this over talking about everything Okta is shipping, but I think the API service integrations are definitely top of mind for me. After years of sowing seeds of SSWS integrations into the world, having a more official and matured integration path is huge. And can you speak to what guarantees a user gets about an app that makes it into the network? I know that we do quite a bit of verification, saying let's see if this thing really seems to work the way it claims to. What does that look like, and what should a user count on, once the app has been vetted by our team, if they're installing it from the OIN? Yeah, for most of the integrations that go on there, if they're listed as Okta Verified, we've gone through and interacted with the developers on the other side to build whatever joint documentation needs to be built. Usually we have a consistent structure there, but we make sure it's going to work. We try to gain access to developer tenants, or to customers who can confirm, yes, this configuration worked. So we'll go through, build the integration, and have the partner identify a few customers. We'll assign those applications to them to privately test that they work, then move forward with publishing and distributing those apps across the Okta global cell network. So those get distributed on a semi-regular basis after their testing. 
So I think it's that consistent documentation, and then with the verification functionality you've got a set of working instructions that have a known outcome. That's so key: the working instructions with a known outcome. Did it actually do the thing I thought it would? That's ultimately kind of the security question, too. And one thing that I love about the podcast format is that we've got some listeners who are here with us right now, in the early 2023 timeframe, and then down the road we'll have other listeners in future years, possibly even future decades. When you imagine those future listeners, is there anything you think they'd be surprised by, that maybe they just take for granted, like, wait, you didn't have that in 2023? What do you think their security landscape is going to look like? How far in the future? I guess you could constrain me a little bit. Let's say one year. What's going to be surprising in one year? I think in one year we should see some meaningful growth. I would like to see some meaningful growth and a realization within the security community that identity becomes this meaningful, more widely recognized security control plane, so that we're continuing to take advantage of the device, infrastructure, and network-level security investments that have been made, but focusing more and more on what identity can do. The further left in the sentence that you have identity, the higher-fidelity the outcomes can be. So I have some real conviction there. I'd like to see, within a year, that start to materialize a little more. Oh, yeah. It's that question of: the absolutely secure server is the one that you power off and lock in a box, but that's also the most useless server. The utility of a secure server is letting the right people do the right things. Hey, look, there's identity. 
How about in 10 years? Oh, man, in 10 years, what are we? Think sci-fi. Neuralink, and then, that's kind of funny. In 10 years, one of my colleagues was asking, hey, what's the identity problem in AI? And we had a very meandering conversation about the implications of identity in AI and how you would apply it, or try to solve it: what is the actual identity problem there? That was Keith Oster with a killer question, just kind of stumping the supposed expert. What does he have to say about this? It was a funny thing to think about. But yeah, I think in 10 years maybe we're having that conversation of, hey, what does it mean? What am I augmenting my decisions with? And how do I prove the veracity of that data, down to the source, to the algorithm, and then combine it with who was the asker and what was the intent of the question, if we think about those things. So, interesting. We do build trust out of identity knowledge, and that provenance of trust for information gets so convoluted when we start permuting the information like we're doing. Right now we're worried about identity for people and identity for agents that act on our behalf, and those agents are currently a piece of code that you could hypothetically read. What happens when you can't read that agent? It's like we're in this intermediate state. And that's, I think, some of the benefit of, maybe the callback here, if you're deploying code and the code has specific authorizations that are clear and easy to articulate, you're inching closer and closer, right? I think we're a long way from perfect solutions, but these are definitely steps in the right direction: to be more granular and to be more explicit. Oh yeah. 
I think OAuth is a perfect example of getting more granular and getting more explicit and specific as we go on. Yep. So thank you so much for joining us. For podcast listeners, there will be a forum thread to discuss the podcast that's linked in the show notes wherever you found it. And feel free to let us know what you think and if there's any other topics that you'd like to hear about. If you're using SSWS tokens, please ask yourself why and please consider a switch to OAuth if you can. Plus one.