Hello SaaS developers, you sell your impressive software to technologically mature enterprises, and they expect it to work seamlessly with all their other tools. In our previous enterprise-ready workshop on OpenID Connect, you learned how to solve part of this problem by creating users in your application whenever your customer's employees log in. But creating accounts for users when they log in is just one of your customer's many expectations. Your app is expected to know about users who haven't yet logged in, as well as delete accounts for employees who have been removed from your customer's identity provider. In this workshop, you'll learn to solve these problems and more using SCIM. By following these steps, you'll learn how to implement SCIM and architect it in a way where you can provision multiple organizations if needed. Although we'll be integrating with Okta in this example, remember that almost every identity provider has SCIM support, so be sure to review their SCIM implementation docs closely. What is SCIM, you ask? SCIM stands for System for Cross-domain Identity Management, and it is an open standard protocol that allows us to manage identities across systems, as well as handle common user lifecycle operations. So in this workshop, our goal is to build a SCIM server in a SaaS app using this standard protocol and, in the end, connect it to an identity provider such as Okta. I'm Simona, a developer advocate here at Okta, and I will show you how to implement SCIM from experience supporting SaaS developers as they submitted their SCIM applications to the Okta Integration Network. In this workshop, we'll cover what problems SCIM solves, why we love it, how to implement SCIM, and how to connect to an identity provider such as Okta, and then we'll conclude. I hope you take away not only how to implement SCIM, but also how to support provisioning from multiple organizations or tenants.
In addition, I hope you will recognize that you may interface with multiple separate SCIM clients. This is because your customers may be using different identity providers with different SCIM clients, and Okta is just one of them. Now let's go over what problems SCIM solves. SaaS developers often want to know when a user has joined a company so they can provision them with an account, or when a user leaves a company so they can deactivate their account and its tied resources. The solution to these problems is SCIM, as it provides near-instant updates to the downstream system whenever someone joins, moves inside of, or leaves a company. It also promotes interoperability via an open standard, which means allowing systems to exchange and make use of the info being shared. I once worked with a partner who used SCIM as a solution for role-based access to resources. Now, why would we even love SCIM? Well, it's a very well-designed standard, and therefore it can be implemented across systems that are compliant. It simplifies the management of identity, thereby reducing manual admin tasks. It enables user management by providing awareness of up-to-date user info, versus having no governance over user info and tied resources. Above all, it contributes to improved security, and who wouldn't want that in this day and age? Now that I've given you a little background on SCIM, let's get started on building the server. Before we implement SCIM, let's have a plan for how we're going to do that. We'll first review the create, read, update, delete functions and JSON that are the underlying mechanisms of SCIM. Next, we'll go over the prerequisites for setting up our environment. Then we'll make changes to the OpenID Connect application that accompanies this workshop. And finally, we'll get started on actually building the SCIM interface. Before we build our SCIM server, let's familiarize ourselves with the functions that support it.
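As a preview, here is the endpoint surface we'll end up with, written as a simple route table. This is a sketch for orientation only: in the app these become Express routes, and the `/scim/v2` prefix is the base URL format we'll adopt later, while `/Users` is the resource endpoint the SCIM spec defines.

```typescript
// Preview of the SCIM endpoint surface we'll build in this workshop.
// In the app these become Express routes mounted under /scim/v2.
const scimRoutes = [
  { method: "GET",    path: "/scim/v2/Users" },      // list users, with optional filter + pagination
  { method: "GET",    path: "/scim/v2/Users/:id" },  // get one user by ID
  { method: "POST",   path: "/scim/v2/Users" },      // create a user
  { method: "PUT",    path: "/scim/v2/Users/:id" },  // replace/update a user
  { method: "PATCH",  path: "/scim/v2/Users/:id" },  // partial update (e.g. soft delete)
  { method: "DELETE", path: "/scim/v2/Users/:id" },  // hard delete
];

console.log(scimRoutes.length); // 6
```

Keep this table in mind as we build each route in turn.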
SCIM is implemented as a RESTful API with create, read, update, and delete endpoints, and JSON. We'll build an endpoint to retrieve users using a GET request. We'll also incorporate a filter in the get-users request, and we'll have the ability to get a user by a specific ID. We'll create endpoints for creating users using POST, updating users using PUT, and deleting users by ID using DELETE. JSON is the format for requests and responses between the identity provider and the SCIM server. Now, have you seen these endpoints before? Are they familiar to you? Perhaps you build backend applications with these API endpoints. Let's build on that knowledge. Before we begin, please note that this implementation of SCIM is meant to be vendor agnostic, and we will be adding notes on how Okta implements this standard throughout the demo, at the end, and in the accompanying blog post to this workshop. In addition, we'll be referring to the latest version of SCIM, 2.0, to build the server. Now let's make sure you have the necessary prerequisites for this SCIM workshop. You'll need the base application associated with this workshop. For detailed instructions on how to use or configure the base application, refer to the tutorial I've linked in the video description. Once you have set up your environment, you should have the following completed: the repo cloned as I have, version 18 of Node, version 9 of npm, Git installed, and a GitHub account. Now let's verify that I have the correct versions. Let's check Node first. All right, I have an acceptable version of Node, and now npm. I also have an acceptable version of npm. Another thing we'll need is to check out the OIDC branch; that's the branch we'll build the SCIM server on. So let's git checkout the OIDC workshop branch, and we'll make a branch from here with git branch, calling the new branch scim-workshop-demo. I already have this branch.
So I'm just going to git checkout this branch instead, and we're ready to work from here. I've gone ahead and opened the OIDC app in my Visual Studio Code IDE. Before we begin, we'll need to make some changes to the user table, or user model, so we'll go ahead and do that. We're going to add an extra field attribute called active. This field attribute is not required by the SCIM standard, but it is required when we connect with Okta, and it will let us know whether or not the user is active. Another thing to note is that the ID we're using is a basic auto-increment to lessen the project's complexity, but we do recommend that you use a more advanced unique ID generator such as XID. I'll provide more info about that in the accompanying blog post to this workshop. Now that we have the necessary field attributes, I want to make some changes to the seed script file. You may have already seeded the database, but I want to demo creating orgs and users as needed. Specifically, I want to create an org to demo that our app can provision multiple separate orgs if needed, especially to accommodate your customers who may be using different IdPs with different SCIM clients. Let's do that. I'm going to instantiate an org here and we'll call it portal, with the domain portal.example. This API key here is important because it allows SCIM clients like Okta to authenticate to this protected resource, and we'll see that in action later. I'd also like to change the user names just for fun, and to hard-code some of the user attributes, so let's go ahead and do that. All right, so what I've done here is hard-coded externalId and active, setting active to true. So when the users are seeded into the database, I've already flipped the active flag to true. The externalId attribute is not required by the SCIM standard, but we'll need it to interact with Okta.
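The model change just described might look roughly like this in Prisma schema terms. This is a sketch, not the repo's exact schema: everything except the new `active` flag and the `externalId` field is an assumption about how the user model is laid out.

```prisma
// Sketch of the updated user model. Field names other than `active`
// and `externalId` are assumptions, not the workshop repo's exact schema.
model User {
  id     Int     @id @default(autoincrement()) // simple auto-increment, kept for demo simplicity
  orgId  Int
  name   String
  email  String  @unique
  active Boolean @default(false) // required by Okta, not by the SCIM spec

  // Unique identifier issued by the SCIM client (e.g. Okta):
  externalId String?
}
```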
In fact, it's a unique identifier issued by the provisioning client and must be stored by the SCIM server. If you interface with other IdPs who require this, you'll know that it is coming from the SCIM client and not your server, but you must store it in your database. Let's go ahead and seed the database. If you haven't already installed the dependencies, let's go ahead and do so now. And if you've already seeded the database, let's start fresh with the set of users that I've added here by resetting your database. We'll be using the command npx prisma migrate reset, so I'll run that right now just to make sure we start with a fresh database. And now we'll run init db so that we can add the users that we've just hard-coded here. All right, so we have Somnus and Trinity now in the database. To make sure of that, we have a neat way to see the database locally, provided by Prisma. So let's run npx prisma studio, and I will show you what the database looks like. All right, so we have Somnus and Trinity tied to org one, with external IDs 22 and 23, and they're both active. Exactly what we wanted. The externalId and active field attributes are standard attributes in the SCIM protocol, but they're not core attributes. The core attributes are id, userName, and meta; those are the only attributes that are required by the core User schema. externalId and active are optional attributes, and as Okta requires them, we're adding them here. So now we'll need to make more changes to the application. We'll need to add a scim.ts file, and we'll also need to make some changes to the main.ts file. So let's first add a SCIM file for our routes. Let's just call that scim.ts, and here we'll define our routes for our router. We'll import them into main, and then at the very bottom we'll instruct the app to use, or mount, the routes onto this SCIM URL. Let's just add it here. The SCIM protocol doesn't exactly define the route that we need.
In fact, it only requires v2, but I'm adding /scim/v2, as this is the URL format that Okta recommends. Now let's get started on building the SCIM server. We'll need to build each of the CRUD functions I mentioned earlier and format the responses in JSON. From there we'll test our functions with Postman to see that we're interacting with the server and receiving responses according to the spec. Let's first import Express and Prisma. Then let's create a variable to instantiate the Prisma client to be able to access user data, and then we'll add a variable for the org ID to tie these users to org one. We'll add a helpful interface here to define the type of data that we expect to receive from requests coming to our server. Now we'll go ahead and build our first CRUD endpoint, and that is to create a user. I'll go ahead and explain what the code is doing here. When a request to create a user comes in, it's processed and we check to see if the user exists. If the user already exists, we'll output an HTTP 409 status and a response saying that the user already exists. If the user doesn't exist, we'll go ahead and create the user in the database, send back an HTTP status of 201, and also send back a helpful user response in JSON; this adds the core schema, which we discussed in the beginning, and includes the user that was just created. Now that we have our first endpoint, let's go ahead and test it. We'll use Postman to make local requests to the server, and before we do that, we'll need to protect this endpoint with the API token that we created when we provisioned an org in the beginning. First let's install passport-http-bearer, which is a module that will help us. Okay, let's first exit out of this and install passport-http-bearer. We'll then import it into our main.ts file.
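Under the hood, the bearer check that passport-http-bearer will perform for us boils down to matching the presented token against an org's stored API key. Here's a pure sketch of that lookup; the `Org` shape and the key value are made up for illustration, and in the app the verify callback would run a Prisma query instead of scanning an array.

```typescript
// Sketch of the bearer-token check. In the real app, passport-http-bearer
// extracts the token from the "Authorization: Bearer <token>" header and
// the strategy's verify callback does a Prisma lookup like this.
interface Org {
  id: number;
  domain: string;
  apikey: string; // the API key we seeded when creating the org
}

function findOrgForToken(token: string, orgs: Org[]): Org | null {
  // Authenticate the SCIM client by matching its token to an org's key.
  return orgs.find((org) => org.apikey === token) ?? null;
}

const seededOrgs: Org[] = [
  { id: 1, domain: "portal.example", apikey: "131313" }, // key value is made up
];
console.log(findOrgForToken("131313", seededOrgs)?.domain); // portal.example
console.log(findOrgForToken("wrong-key", seededOrgs));      // null
```

A request with no match gets rejected, which is exactly the 401 Unauthorized we'll see in a moment when I send the wrong key from Postman.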
We'll also create a variable, and lastly we'll add the necessary code to use it. I will put it under our SCIM-related actions; I like that we'll be able to delineate which are our SCIM-related routes, and that should be good there. Now that we have bearer auth available, let's go ahead and add that to our SCIM routes. We'll add it to our create-user route like so. We'll still need to add two more things before we test with Postman. The next thing we're going to add is body-parser. Body-parser will allow us to accept the content type header that will be sent to our server as application/scim+json, and I'll show you what that looks like. But first let's install body-parser. All right, let's import it in our main.ts file. We'll also add it to our SCIM-related actions, and that's good. The last thing we'll need to add is Morgan, a logging middleware I'm using to help me see the requests come through our server. First we'll need to install Morgan, and then we'll import it. All right, I've gone ahead and opened Postman here, and I already have the requests that I want to test against the server. Let's first test the create-user request, and to do that we'll run the server. Looks like I have an error, and the error points to "cannot find name passport". I must not have imported passport, so let's add that and try again. All right, looks like it's happy. We'll send over a POST request. I get a response: unauthorized. I want to double-check that I have the correct API key, so let's verify that; it was in our seed script. Looks like I have the wrong API key, so let's add the correct one. I also noticed that I don't have Morgan running, so let's check on that, and that would be under main. So I've imported Morgan, but did I actually add it to my SCIM actions? I did not. I'm going to add Morgan here, and it looks like that's happy, so we'll send it again.
This error I'm very familiar with, and it's because I've not included body-parser somehow. Body-parser is needed so that we can accept application/scim+json, which, according to the spec, you need to do. So let's go back to main.ts and I'll go ahead and add that. Let's try again. It's happy with that. Try to add the user. Okay, that was a 201 Created. I now have a user, test user, and I'll go ahead and double-check that in Prisma Studio. I'd like to show you that my user came through with an external ID and email, and active set to true. I want to pause here and explain a few things. I added some console.logs so that you can see the request coming in through POST, and I see that we are responding that the user has been created. I also wanted to mention that the handy middleware Morgan let me know that a POST came through to my users endpoint and that I responded with a 201. Okay, let's move forward. We have one endpoint down and six more to go, so let's get to it. Now we're going to focus on the GET endpoint, and I'm going to add some code to retrieve users; let's talk about the code that I'm adding. According to the spec, we can return users and return them paginated. We've added that here, and I'll show you what that looks like in Postman. Filtering by userName is optional, and we've decided to add it here because Okta requires it when it creates a user: it will look up the user's userName to make sure they don't already exist before creating the user. This is just code to parse the query params and handle the filter if a userName has been sent, and then again that's going to give us back a response, and we'll get to see that here in Postman. All right, let's run the server and hit our get-users endpoint, passing in the necessary API key and content type in the headers.
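What that endpoint computes can be sketched as a pure function: take the users, apply the optional userName filter Okta sends, and wrap the page in SCIM's ListResponse envelope with 1-based indexing. The names here are illustrative, not the workshop repo's exact code, and the filter handling covers only the `userName eq "..."` form we need.

```typescript
// Sketch of the get-users logic: optional userName filter plus
// SCIM-style pagination (startIndex is 1-based per the spec).
interface User {
  id: number;
  email: string;
  active: boolean;
}

function listUsers(users: User[], startIndex = 1, count = 100, filter?: string) {
  // Okta sends: ?filter=userName eq "someone@example.com"
  const match = filter?.match(/^userName eq "([^"]*)"$/);
  const filtered = match ? users.filter((u) => u.email === match[1]) : users;
  const page = filtered.slice(startIndex - 1, startIndex - 1 + count);
  return {
    schemas: ["urn:ietf:params:scim:api:messages:2.0:ListResponse"],
    totalResults: filtered.length,
    startIndex,
    itemsPerPage: page.length,
    Resources: page.map((u) => ({
      schemas: ["urn:ietf:params:scim:schemas:core:2.0:User"],
      id: String(u.id),
      userName: u.email,
      active: u.active,
    })),
  };
}

const db: User[] = [
  { id: 1, email: "somnus@portal.example", active: true },
  { id: 2, email: "trinity@portal.example", active: true },
];
console.log(listUsers(db).totalResults); // 2
console.log(listUsers(db, 1, 100, 'userName eq "trinity@portal.example"').totalResults); // 1
```

In the real route, `filtered` would come from a Prisma query rather than an in-memory array, but the response envelope is the same.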
All right, so we get a response back: the pagination starts indexing at one, and we get back Somnus, Trinity, and the test user. All good. Morgan even tells me that I'm getting a GET request from Postman and that I'm responding with a 200 and a list of users. Good to know. Now let's test get users with a userName filter. I've added trinity at portal.example. Let's send that over, and I get a 200 response with Trinity's info. Let's try building the get-specific-user endpoint. I'm just going to add it here. You can pause the video at any time if you'd like to challenge yourself to build the endpoint on your own; I'm just going to continue along here. This endpoint will process a user ID that you provide; if the user exists it'll return a 200, and if not it'll return a 404. So let's get the first user, and sure enough we get Somnus, and that's correct. All right, let's now move right along to our PUT endpoint. Again, at any point you can pause this video to try writing the endpoints yourself. I'll just go ahead and continue. This time I've added a PUT endpoint, which checks whether there are any changes to the user's info, such as email and name, and updates them, so let's go ahead and test that out. I'm going to change the user test user to, let's say, new test user, so we'll change the username to that, and let's see that change happen. Again, I've console.logged the request coming in; we get a 200 response back, and if I check our Prisma database, I see that the user has been updated to new test user. Let's move on to delete now. All right, our delete section is pretty straightforward. We are finding a user by ID and deleting them, so let's go ahead and delete user three by passing in their ID, and we'll get a 204.
A 204 with no response body is valid according to the spec, and if I take a look at our Prisma database, I should not see the test user anymore, so that worked. I want to add a soft-delete option. This is not required by some identity providers, but it is required by Okta, and in the SCIM protocol it's referred to as a partial update. What we're going to do here is a soft delete wherein we set the active attribute to false. I'm going to deactivate a user by sending a false value in the body, and I'm going to change user two, Trinity, so let's see what that does for us. Okay, it says that it went through with a 204, and if I refresh, I see in the database that Trinity's active flag is set to false. This is good if we want to keep users for audit purposes, and Okta wants to be able to reprovision users if necessary: say they come back after being deprovisioned, we can reprovision them again. I know that was a lot of work; congratulations on making it this far. You now have a functional SCIM server, so with that, let's see what fun we can have with it, such as connecting to an identity provider like Okta. We're going to plan out how we're going to do this. The first thing we'll need to do is set up a local tunnel, which will allow us to expose our local server to the web and allow Okta to reach us at a public URL. We'll then need to create an Okta developer account so that we can create a SCIM application, which is the SCIM client that will be making requests to our server. And lastly, we'll test common user lifecycle actions such as import, provision, and deprovision. Let's get started. The next thing we need to do before connecting to Okta is to run a local tunnel; in fact, the service that we'll be using is called localtunnel.
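Before we wire up the tunnel, one quick sketch of the soft delete we just tested. Okta's deactivation arrives as a SCIM PatchOp body, and our handler essentially has to extract the new `active` value from it. The shapes here are simplified and the function name is illustrative, not the repo's actual code.

```typescript
// Sketch of handling the PATCH body a SCIM client sends to deactivate
// (or reactivate) a user: a PatchOp that replaces the `active` attribute.
interface PatchOp {
  op: string;
  value: { active?: boolean };
}
interface PatchBody {
  schemas: string[];
  Operations: PatchOp[];
}

function extractActiveFlag(body: PatchBody): boolean | null {
  for (const operation of body.Operations) {
    if (
      operation.op.toLowerCase() === "replace" &&
      typeof operation.value?.active === "boolean"
    ) {
      return operation.value.active;
    }
  }
  return null; // no change to `active` requested
}

const deactivate: PatchBody = {
  schemas: ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
  Operations: [{ op: "replace", value: { active: false } }],
};
console.log(extractActiveFlag(deactivate)); // false
```

In the route, a `false` result triggers the Prisma update that flips the user's active flag, which is exactly what we just watched happen to Trinity.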
We'll need localtunnel to expose our local server to the web so that Okta can reach us via a public URL. So with that, we'll go ahead and run our server. Something to note: this app is already configured for localtunnel, so all we need to do is call it. Okay, our server is running, and now we'll tell localtunnel what port our server is running on, and we should get back a public URL: fair-chicken-enjoy. I find the random strings that the localtunnel service sometimes gives funny, so I hope you get one too. Go ahead and keep this URL; we'll need it when we connect with Okta. Okay, now we need to create a developer account. If you go to developer.okta.com/signup, you should be able to choose from the different account options, one of them being the Okta Developer Edition, which is a free-forever account type here at Okta. I've already got my account, so I'm going to go ahead and create the SCIM application. We'll need to go to Browse App Catalog and search for the SCIM template app, specifically the SCIM 2.0 Test App with header auth. We'll add that; this is a good enough name. We'll bypass the SSO option for now, head on to the Provisioning tab, and enable our integration. Remember that localtunnel URL I asked you to save? We'll go ahead and paste it here. Okay, so I've added my base URL with /scim/v2 appended to the API integration, and added my bearer token as well, and I got a confirmation from Okta saying that I was able to authenticate successfully. I see that it sent a request to my server's GET endpoint, and we've established a connection. It took me a few tries to get localtunnel to work, so just as a note: this is good for development purposes only, and you may also use ngrok as another tunneling service, or maybe do a deployment to test. All right, let's save this and move forward. Now we'll need to enable some CRUD functions on the Okta side, so let's go ahead and enable
Create Users, Update User Attributes, and Deactivate Users, and let's go ahead and save that. As you know, we have two users, Somnus and Trinity, in our SCIM server, so let's sync them over to Okta so that we can have visibility of all users in our database and designate Okta as the source of truth about users assigned to this app or organization. We'll hit the Import Now button, and Okta gathers that there are two users in our downstream database. We'll confirm. Okay, I've had to add last names for Somnus and Trinity because Okta did not allow them without last names, so I went ahead and did that with our handy-dandy CRUD functions; I was able to update the users to have last names. We'll confirm the assignment, and if we look under Assignments, they're added to Okta. Now let's try adding a new user through Okta. We'll add a user, name this user Tom, last name Anderson, give him tom.anderson at portal.example.com, and save this user. We'll also need to add his username as the email and save that, and let's then assign him to the application. Tom Anderson, and there we go, he's provisioned. Let's see: we have a request from Okta to add a new user, and we'll double-check with Prisma. We'll do a refresh, and we have Tom now in the database, and he is active. Now let's say Mr.
Anderson no longer wants to work for Portal. We'll need a way to unassign him from the app, or deactivate him. We can do that by unassigning him from the app here, and when that happens, a PATCH request is sent to our server setting active to false. If I do a refresh, I can see that does the soft delete we coded for in the server. Okay, now let's say he returns because he realizes that Portal is in fact the best place to work, and he'll need to be reassigned to the app. We can do that by assigning him the app like so, and once this is done, another PATCH request is sent to reactivate him, which you can see here. Then we'll double-check in our database that his active flag is in fact set back to true. Now let's look at one last scenario. Let's say that not only does Mr. Anderson return, he would also like to be called Leo, so we'll need a way to update user info. Fear not, because our SCIM server can handle this. We'll go to his account, go to his profile, and change his name to Leo, and once I make the changes, our downstream app is notified, and we can see in our database that the changes have come through. Well, there you have it. I have demonstrated common user lifecycle management scenarios from our SCIM connection with Okta. You can certainly repurpose the server now to work with other SCIM-compliant identity providers as needed. Now let's conclude this workshop. If you've been following along, you now have an OpenID Connect app with SCIM provisioning. Your users can now authenticate securely with OIDC and make use of this application once they've been provisioned and assigned this resource from an identity provider such as Okta. If you want to continue building on your server, I have added a section called Where to Go from Here on the blog post accompanying this video, which I will link in the description, as well as a section on my tool recommendations for testing your SCIM server. To recap, I have shown you how to implement SCIM in a SaaS app and implemented it in a
way that can potentially support provisioning from multiple orgs. In learning how to implement the server, I hope you recognize that you may interface with multiple separate SCIM clients, as it is possible for your customers to be leveraging different IdPs with different SCIM clients. And lastly, please give me a thumbs up if this workshop was helpful to you, and please comment below if you have any questions; I'd be happy to answer. As all our proof-of-concept projects and SDKs are open source, we invite you to initiate a pull request if there's something you want us to improve upon. And as always, please like and subscribe to OktaDev for more helpful videos.