Hello! Welcome to this video on building a Node.js application using SQL Server end-to-end. In this tutorial, you'll learn the basics of creating a Node.js application with SQL Server by creating a simple calendar app. To give you an idea of what you'll be creating, your application will be a simple event calendar that uses Vue.js and Material CSS on the front end, secured with a login to allow new users to register for their own account, and the front end will communicate with your secure API to create and delete events. I'm excited to step you through this project, so let's get started! Before we get started on the code, we need to make sure you've got a couple of things that are required to complete this tutorial. First, you need to have Node.js installed, and you can check that in a couple of ways. One is to open up a command prompt or terminal, depending on what operating system you're using, and type in the command `node -v`. This should print the version of Node you have to the console. If you get an error message, or if the version of Node you have installed is below version 10, then you're going to want to install an updated version. You can do that by going to nodejs.org and clicking the big green download button. If you're on Mac or Linux, or if you're using the Windows Subsystem for Linux, I highly recommend that you install nvm, the Node Version Manager. This gives you a command-line option, and it makes it so much easier to upgrade Node and to install multiple versions of Node. For example, on my system, if I run `nvm ls`, I see that I've got version 12 and version 13 installed, and I can switch easily from one to the other, and I can also do upgrades. The other thing you need to have installed, or have access to, is SQL Server. If you have access to a development instance of SQL Server, that's great. If you need a local instance, there are a couple of ways you can do that.
One is, if you're on Windows and you're comfortable with installing something like SQL Server, you can go download SQL Server Developer Edition. I used to use this a lot when I was developing .NET applications on Windows, and it works fine; it's got all the features you could possibly need from SQL Server. But if you're on Mac or Linux, or if you don't want to mess with installing a complicated server application like SQL Server, then you can use Docker. If you don't already have Docker installed, you can go to docker.com and go through the options to find the Docker download. There's a free community edition that works fine, and that's what we'll be using in this tutorial. I'm going to step you through how to install SQL Server using Docker. The first step is to make sure that Docker is running; if you've got Docker, you should have some `docker` commands you can use. It appears that I do not have Docker running — I recently upgraded my system, so Docker needs to be installed too. Let me install it real quick, and then I'll get back into the video. Now I've got Docker installed, and I'm ready to continue. The first thing you need to do is pull down the latest Docker image for SQL Server, using the `docker pull` command. The full command is `docker pull microsoft/mssql-server-linux:2017-latest`. When you run this the first time, it should take a few minutes, depending on your download speed. It'll download the image, and you can run this again anytime; if there's an update to the image, it will download the latest version. Once you have the image downloaded, you need to create an instance of that image as a container, and you use the `docker run` command for that.
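For reference, here are the two Docker commands from this step, roughly as used in the video (the image name shown is the one the video uses; newer images are published under mcr.microsoft.com/mssql/server, so check Docker Hub if this tag has moved, and `P@55w0rd` is only a placeholder for your own sa password):

```sh
# download the SQL Server 2017 Linux image; safe to re-run anytime for updates
docker pull microsoft/mssql-server-linux:2017-latest

# create and start a container named "sqlserver" running in the background
docker run \
  -d \
  --name sqlserver \
  -e "ACCEPT_EULA=Y" \
  -e "SA_PASSWORD=P@55w0rd" \
  -e "MSSQL_PID=Developer" \
  -p 1433:1433 \
  microsoft/mssql-server-linux:2017-latest

# if the container already exists, just start it by name
docker start sqlserver
```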
The `docker run` command in this case is going to include `-d` to create a background daemon, so that the server keeps running in the background without having to stay attached to this terminal. We're specifying the name `sqlserver`, so that you can easily start and stop the container by name. Then there are some environment variables we have to pass: one is to accept the EULA, and the other is to set the password for the system admin (sa) account — you can change that to whatever you want it to be. We're setting the SQL product identifier to the Developer edition of SQL Server, and then we're mapping some ports. By default, SQL Server listens on port 1433, so we're mapping our local 1433 to the container's 1433. And finally, we're specifying `microsoft/mssql-server-linux:2017-latest` as the image to create this container from. I already have a container named `sqlserver`, so it's giving me an error. I could back up here, give this another name, and create a second SQL Server instance, but I'm not going to do that. I do need to start it, though. When you run `docker run`, it starts the server automatically after it creates the container, but since mine already exists I need to call `docker start sqlserver`, and now my SQL Server instance is running as well. The next thing to do is create our SQL Server database, create a table, and seed some data into it. To do that, I'm going to use a tool called Azure Data Studio. You can actually do this from inside Visual Studio Code if you're using that as your editor — there's an extension for it. I'm going to show you real quick where to look for that. If you search for Azure Data Studio, you should land on this page, where you can find instructions and an installer for your operating system, whether it's a zip file or an installer for Windows. So download this application.
If you don't already have it, install it, and then we'll get started on the next step. All right, if you install Azure Data Studio and launch it, you should see a window that looks similar to this. We're going to create a new connection to our SQL Server database. The server will be localhost, and we're going to use SQL Login authentication. For the user name, we can use the system admin account, and then the password that was passed to the Docker command when we initialized our container. We can check the box to remember the password. There are some other options, but we're just going to go ahead and connect with this. All right, now we're connected to the database, and we need to create a database for our application. You can use Cmd+N or Ctrl+N to create a new query window, and in here we're going to issue some commands. We're on the master database right now. We can say CREATE DATABASE... I was going to call it "cal" — actually, I've already got a database named calendar, so I'm going to drop that. Don't do this in production, right? We're able to do this because we're in development. The command I want you to run is CREATE DATABASE calendar; go press F5, and that executes the query. Now you can change your current database to calendar. Make sure that you are in the calendar database — to double check, say USE calendar and press F5, and that will switch your current connection over to the calendar database. You don't want to do the next steps on the master database. Now we need to create a table for our calendar events. We'll start with an id column — an identity column, which just automatically increments — and we're also going to specify that this is the primary key for the table and that this value cannot be null. We're going to add a userId column, varchar(50), that cannot be null; a title, not null; and a description. Description is a keyword.
So for SQL Server, we're going to wrap that in brackets, so that SQL Server knows we're using it as an identifier for a table column instead of as a keyword. Let's make this one varchar(1000), and it can be null. We need a startDate of type date that is not null, and then a startTime. I'm splitting these out separately just to make it a little easier to deal with the stored data, so that we don't have to extract the time part out of a date — we're just going to keep these separate in the database. That way you could have a date, like an all-day event, as a start date without a time associated with it, and not have to have a flag for that, or have the time be specified as 12 a.m. and just assume that means it's an all-day thing. Then we're going to have an endDate of type date that is nullable, and then an endTime. We're also going to want to go ahead and add an index for this table. It's overkill for a tutorial, but I want you to know that you can specify indexes when you create a new table. You can name it whatever you want, but I always like to follow a convention of IDX, then the name of the table, then the names of the columns included in that index. I just know ahead of time that we're going to do some queries based on the user ID, and SQL Server likes to have indexes for performance; that'll make things much better down the road. All right, we've got our full CREATE TABLE command here — press F5, or you can go up here and press Run, and now the table is created. If I try to create the table again, it tells me that there's already an object named events in the database. If you want, you can add a guard for that — say you have some DDL-type statements in a script file you may want to run in production, like a migration file.
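Putting the last two steps together, the database setup script looks roughly like this (the video doesn't state a length for the title column or an exact type for the time columns, so varchar(200) and TIME are assumptions; the index name just follows the IDX-table-columns convention described above):

```sql
-- create the database and switch the connection to it
CREATE DATABASE calendar;
GO
USE calendar;
GO

-- calendar events table; [description] is bracketed because it's a keyword
CREATE TABLE events (
   id INT IDENTITY(1, 1) PRIMARY KEY NOT NULL,
   userId VARCHAR(50) NOT NULL,
   title VARCHAR(200) NOT NULL,
   [description] VARCHAR(1000) NULL,
   startDate DATE NOT NULL,
   startTime TIME NULL,
   endDate DATE NULL,
   endTime TIME NULL
);

-- index to support the queries by user ID we know are coming
CREATE INDEX IDX_events_userId ON events (userId);
```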
You can do things like DROP TABLE IF EXISTS events and run that. Just know that if you have that in there, it's going to drop that table, so if there's any data in that table, it's gone. This is useful in development when you want to start over clean with a fresh copy of your database, with all the tables that you might create. All right, we're through setting up SQL Server — one more step. I want to go ahead and add a couple of rows in here, because we're going to want some data to play with later on when we're setting up the application. INSERT INTO events — userId, title, description, startDate, startTime, and so on; those are all our columns — and then our values. We'll use a user ID of 1234. This is an appointment, with a description of "stuff." The start date will be March 31st, and the start time will be 2:30 p.m. And the end will be null and null for the end date and time. Let's go ahead and add a second row: same user, 1234. We'll say it's an online conference, no description. This will start on April 1st, there's no start time, and it will end on April 2nd, with no end time. So now, if I run all of this, it will recreate the table for me and also insert two rows into this database. Now we've got the database set up and ready to go — let's dive into some Node.js code. All right, switch back over to your terminal or command line, and we're going to set up a directory for our project. Change to whatever directory you store your projects in; for this demo, I'm going to create something in my projects/demos folder. Make a directory with mkdir — call it whatever you like — and then change to that directory. The first thing you need to do when you're setting up a Node.js project is initialize that folder. Use the command `npm init -y` if you want to accept the defaults.
If you don't specify `-y`, then npm will prompt you to answer several questions about the name of the project, your contact information, the license, and so forth. You can set defaults for all of these using the npm command — if you go search for "npm init defaults," you should be able to find how to set the defaults for things like the version, the author, and the type of license you want to have for every project going forward. Alright, so what npm init basically does is create this package.json file for us, which gives us a starting point for adding the dependencies we need for the project. When you use Node.js, there are lots of great frameworks you can choose from to build web applications. Express is probably the most well-known and popular. In this tutorial, I'm going to use hapi. It's my personal favorite; it was originally created by Walmart engineers, and it's suitable for building APIs, services, and complete web applications. So at this prompt, we're going to run `npm install @hapi/hapi`. That's going to download some dependencies and get that first hapi instance set up. Just to make sure — in case you're watching this video sometime in the future — for compatibility, you may want to add `@19` onto the end to lock down a particular major version of hapi, in case there's a breaking change in a later update that doesn't work with the code in this video. So: `@hapi/hapi@19`. The other thing I like to do when I'm setting up a new Node.js application is to use ESLint. ESLint will alert me to common mistakes, JavaScript issues, and errors — things that could cause me pain, like if I don't initialize my variables. So ESLint is a great way to catch those. We'll also use npm install for it, but this is a developer dependency; it's not something we would put into production.
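The install commands from this step and the next, spelled out:

```sh
# install hapi, pinned to major version 19 for compatibility with this video
npm install @hapi/hapi@19

# install ESLint and a shared rule set as development-only dependencies
npm install --save-dev eslint eslint-config-reverentgeek
```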
So you want to use the `--save-dev` flag, and I want to install eslint and eslint-config-reverentgeek. That one is my own — you don't have to use it — but it's my own collection of ESLint rules that helps me format code consistently, among other things. So we'll wait for this to install. ESLint needs to know what kind of rules to use, so I'm going to create a file called .eslintrc.js, and into this file I'm going to put the set of rules that I want ESLint to use when linting my code. I've got my own config for that, and I can never remember what the settings are, so here's a tip: you can type `npm docs` plus any package name, and it will automatically open up the readme for that package. That way I can see that, hey, I need to create a config file and put this rule set in it — this is the one I want, the one for Node.js. Now I can switch over to Visual Studio Code, which is the editor I'm going to be using, and inside this .eslintrc.js I'll add my module.exports that tells ESLint what to do. All right, if you were curious about what I did to launch Visual Studio Code: you can add the `code` command to your path. If you press Cmd+Shift+P, that brings up the command palette, and if you type in the word "install," you'll see that one of the options is the shell command to install `code` in your PATH. Then, if you're working from the command line, you can type `code .` anywhere, and it will launch the current folder in your Visual Studio Code editor. Really handy. All right, so now we've got that. We want to create our first Node.js web server with hapi. I like to organize my code with everything for my server under a src folder, so we're going to create a new folder called src, and in there I'm going to create a new file named index.js.
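For reference, a minimal .eslintrc.js along the lines described above — the exact extends value is whatever the eslint-config-reverentgeek readme specifies (which is why the `npm docs eslint-config-reverentgeek` tip is handy), so treat this as a sketch:

```js
// .eslintrc.js — tell ESLint to use the shared Node.js rule set
module.exports = {
   extends: [ "reverentgeek/node" ]
};
```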
Now, you may be wondering why I'm using "use strict" at the top of index.js. You may already know that in the browser, if you're using modern JavaScript with imports and exports — module-type scripts — you don't have to add "use strict," because modules are pretty much guaranteed to run in strict mode. But for Node.js, that import/export module syntax isn't yet the default, and if you don't specify "use strict," you're not running in strict mode. If you're using a transpiler like Babel, it adds "use strict" to the top of all your files automatically, but in Node, today, it's something we need to do ourselves. So: we're going to require in our server module, which we'll create in just a moment, and we need a startServer function. We'll wrap this in a try/catch. We create a config object specifying the host as localhost and the port as 8080, then create an app: we await our server function, passing in our config, and then await app.start, logging "server running at..." And now we just need to write our catch block. Finally, we call startServer, our asynchronous function. Oh — this should be app.start, not server.start. You may have seen that; this is where ESLint helps keep me from making silly mistakes. The squiggly underline under app indicates that app is declared but its value is never used. Ah, right — I put in the wrong thing; it should be app.start — and now everything goes back to not being red. All right, so we've got our index. Now we need to create our server module, again starting with "use strict." We're going to require in the hapi module, and require in a routes module, which we'll create in just a moment. Now, if you're creating a hello world application...
You could put everything into one index.js file and make it work, but that's not going to serve you well going forward. So I like to organize my code, even at the beginning, into separate files: a server.js, which will make the server easy to test later; separate files for my routes; plugins, which are hapi's equivalent of the middleware you may have used with Express; and just different sections of code in their own files, because I know that as the project grows, I'm going to want those kinds of disciplines in the project. Unfortunately, there's not a whole lot of guidance out there on how to set up Node.js projects, and it's all a preference thing — there's no one set way of defining what Node.js projects are supposed to look like. That's one of the beauties of Node and JavaScript: you can do things the way you want, the way you like them. All right. So now we've got our routes module imported. We're going to create our app function, an asynchronous function that takes in a config. We'll destructure the host and the port off of the config object and create an instance of our hapi server using that host and port. Next, we're going to store the config on the server.app object — this will come in handy later in the project. I know ahead of time that we're going to want to find things like the SQL Server connection information, and we're going to put all of that on the config object later on. Then, from anywhere inside the application, at any time, we can use server.app.config to get to those configuration values, which is really handy. Then we register our routes, passing in the server, and finally return the server. And last but not least, we need to set our module.exports.
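Assembled so far, src/index.js and src/server.js look roughly like this — a sketch of what was typed above, which depends on @hapi/hapi being installed and on the ./routes module we create next:

```js
// src/index.js
"use strict";

const server = require( "./server" );

const startServer = async () => {
   try {
      // hard-coded for now; we'll replace this with a config module later
      const config = {
         host: "localhost",
         port: 8080
      };
      const app = await server( config );
      await app.start();
      console.log( `Server running at http://${ config.host }:${ config.port }...` );
   } catch ( err ) {
      console.error( "startup error:", err );
   }
};

startServer();
```

```js
// src/server.js
"use strict";

const Hapi = require( "@hapi/hapi" );
const routes = require( "./routes" );

const app = async config => {
   const { host, port } = config;

   // create an instance of the hapi server
   const server = Hapi.server( { host, port } );

   // store the config for later use, e.g. server.app.config.sql
   server.app.config = config;

   // register all of the application's routes
   await routes.register( server );

   return server;
};

module.exports = app;
```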
This is the magic of CommonJS that allows us to expose things to the outside world — to anything that consumes this module. It's what determines which objects, functions, or classes (if you want to use classes) are available when you import the module. All right, now we're still missing one thing, and that's our routes. So in my src folder I'm going to create a new folder called routes, because I know I'll be adding multiple collections of routes at some point, and in the routes folder, create an index.js. Again, "use strict." We're just going to set module.exports with one function called register, which takes in a server and uses server.route to register a route: method GET, path is the root. What this is doing is setting up a route that's listening for the default document — so when you go to localhost:8080 without specifying a route name or anything, this is what serves that up. Then we need to define a handler, which is an asynchronous function. hapi passes in a request object and a response toolkit that, for whatever reason, is just called h — I guess it's short for hapi — and it has lots of utility functions and all kinds of cool stuff. But we're not going to use either of those in this hello-world first instance; we're just going to return the text "My first hapi server!" and we're done. One thing before we run this: we want to go into package.json and change "main" to point to src/index.js, so that when you run node at the command line, it knows where to find the entry point for the application. Now you can start the application by going to the command line and typing `node .` — node and a period. And... well, I must have messed up something with the output. Let's switch over real quick — I forgot my dollar sign. You probably saw me do that. All right.
That missing dollar sign was part of a template literal — template literals in JavaScript. So let's press Ctrl+C to stop it and run it again. Now I get the text that I expected. You can copy and paste this URL into your browser — with the terminal settings I have, I can Cmd+click on a URL, which is really handy — and when you browse to the URL, you should see the text "My first hapi server!" Sweet success: your first Node.js hapi server. All right, let's go back and stop that server; you can press Ctrl+C. Now, before we get into writing code to interact with SQL Server, we need a good way to manage our application's configuration, such as the SQL Server connection information. Node.js applications typically use environment variables for configuration; however, managing actual environment variables can be quite a pain, especially on Windows. So there's a module we can use in Node.js called dotenv that exposes an .env configuration file to Node as if its entries were actual environment variables. To get started on this, we need to install dotenv as a dependency, and here I'm specifying version 8, because that's the latest version that I know will work with this particular project. Now we need to create a .env file. You could create this in Visual Studio Code; I like using the touch command on Mac — it's just convenient. So now we have an .env file that's empty, and we need to put some configuration information in it. It's a common thing to have a NODE_ENV variable — and you definitely want to set NODE_ENV equal to production when you deploy an application to production; that's going to give you a lot better performance. Then we need to specify our hapi server config: PORT equals 8080, HOST equals localhost. I'm also going to go ahead and specify a HOST_URL, just as a shortcut — I know I'll need it later.
And then, later on in this tutorial, we're going to add some security — authentication — to our app, and we're going to use cookie-based sessions to manage that security. So I need to specify, if I can type, a cookie encrypt password. This can be whatever you want — just a big long string of gibberish — but you want something that's secure, and it does need to be at least 32 characters long. So I'm just going to call it "my super awesome password string that is at least 32 characters"; change it to whatever you want. And now we need some SQL Server config. Specify our SQL user — in production, you're definitely going to have different credentials set up for your application; you do not ever use the system admin account in production, but for development we can get away with it. If you're particular, or if you're using a database server that's hosted somewhere else — like a development instance of SQL Server in your organization, for example — you probably don't have the sa account password anyway, so you're going to use whatever developer account you have. So specify that; then we need to specify the SQL database, calendar, and the SQL server name — in this case I'm using localhost, but if you're using a development server that's hosted somewhere else, you can put that in there. And then something you'll need if you ever want to deploy this application to Azure: you need to set SQL_ENCRYPT equal to true for Azure; for local development, set this to false. Finally, I'm going to go ahead and add some Okta config in here, because later on in the tutorial we're going to add authentication to the app so that people can log in and manage events of their own, and at some point we're going to need an Okta org URL, an Okta client ID, and an Okta client secret. We'll change these later.
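The finished .env file looks roughly like this — the password values and the Okta placeholders are yours to fill in, and SQL_PASSWORD has to match whatever sa password you passed to docker run:

```
# hapi server configuration
NODE_ENV=development
PORT=8080
HOST=localhost
HOST_URL=http://localhost:8080
COOKIE_ENCRYPT_PWD=superAwesomePasswordStringThatIsAtLeast32CharactersLong!

# SQL Server connection
SQL_USER=sa
SQL_PASSWORD=P@55w0rd
SQL_DATABASE=calendar
SQL_SERVER=localhost
# set to true if using SQL Azure; false for local development
SQL_ENCRYPT=false

# Okta configuration (placeholders; filled in later in the tutorial)
OKTA_ORG_URL={yourOktaOrgUrl}
OKTA_CLIENT_ID={yourClientId}
OKTA_CLIENT_SECRET={yourClientSecret}
```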
Another thing to point out: if you're using a source control system like git, you do not add this .env file to your source control — set it to be ignored — because each environment requires a custom .env: different SQL Server connection information, different port or host information. And you don't want to expose any of these secrets in your GitHub repository where other people could find them, especially if you're creating an open source project, right? You don't want to accidentally check in a file that says, hey, here's my SQL database connection information, or my client secret for authentication. Typically, what open source projects do is create a separate .env sample file that includes placeholders, and then in the readme they'll say: make a copy of this file, rename it to .env, and change all the values to be for your environment. All right, so now we're going to use this configuration information. We're going to create a new file under src called config.js, and we're going to pull in our dotenv module. One thing: you have to run the config function on dotenv. This reads in that .env file and adds all the values in it to the Node.js process, so that to Node it looks like these are environment variables set in the environment. Because environment variables are always uppercase, we're going to pull off all these values and return something that's a little more palatable for Node.js and JavaScript, instead of dealing with all those uppercase names. So we're going to pull off PORT, HOST, HOST_URL, the cookie encrypt password, SQL server, SQL database, and SQL user from process.env, which is Node.js's built-in way of exposing environment variables inside any Node.js application. We're also going to take our SQL_ENCRYPT value and convert it from a string to a boolean.
And now we'll export all this configuration, mapping these values to names that make more sense. For example, sql will be its own object within the object — this will make things easier down the road. I'm also going to specify some options for the version of the SQL Server client we're going to use for Node.js: it currently requires you to set enableArithAbort to either true or false — otherwise you get a warning message — and it should be true. One other thing I want to pass on as a good tip: a lot of times when you're deploying an application or changing environments, it's really easy to miss adding an environment variable that your application expects. If a required environment variable is not specified, it can cause issues and unexpected behavior in your application that might take a while to track down. So one pro tip: assert that all the values are at least present in your configuration. One way you can do that is to use the built-in assertion library that comes with Node. After you pull in all this information, just use assertions — assert(PORT) — and duplicate that for every required environment variable. In the case of this application, every environment variable is required for it to work as intended. So, as an exercise, add all those assertions to your own application; it could save you some headache down the road. Now we need to update our main index, replacing the configuration we hard-coded there with our new config module. Here we'll say const config = require("./config"), delete the hard-coded object, and now we can just pass the config into the server. Now we can get to the fun part. In this step, you're going to add a route to hapi that queries the database for a list of events and returns them as JSON.
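For reference, here's the assembled src/config.js — the friendlier mapping plus the assertion tip applied to every required value. It needs the dotenv package installed, and the exact property names are this sketch's choice:

```js
// src/config.js
"use strict";

const assert = require( "assert" );
const dotenv = require( "dotenv" );

// read the .env file and add its values to process.env
dotenv.config();

const {
   PORT,
   HOST,
   HOST_URL,
   COOKIE_ENCRYPT_PWD,
   SQL_SERVER,
   SQL_DATABASE,
   SQL_USER,
   SQL_PASSWORD,
   SQL_ENCRYPT
} = process.env;

// every one of these is required for the application to work as intended
assert( PORT, "PORT is required" );
assert( HOST, "HOST is required" );
assert( HOST_URL, "HOST_URL is required" );
assert( COOKIE_ENCRYPT_PWD, "COOKIE_ENCRYPT_PWD is required" );
assert( SQL_SERVER, "SQL_SERVER is required" );
assert( SQL_DATABASE, "SQL_DATABASE is required" );
assert( SQL_USER, "SQL_USER is required" );
assert( SQL_PASSWORD, "SQL_PASSWORD is required" );

// export a friendlier configuration object
module.exports = {
   port: PORT,
   host: HOST,
   url: HOST_URL,
   cookiePwd: COOKIE_ENCRYPT_PWD,
   sql: {
      server: SQL_SERVER,
      database: SQL_DATABASE,
      user: SQL_USER,
      password: SQL_PASSWORD,
      options: {
         // convert the string from .env to a boolean
         encrypt: SQL_ENCRYPT === "true",
         // required by the current mssql client to avoid a warning
         enableArithAbort: true
      }
   }
};
```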
You're going to create a SQL Server client plugin for hapi and organize your data access layer in a way that makes it easy to add new APIs in the future. First, we need to install some dependencies, the most important being the mssql package. So, back to the command line: we're going to run npm install mssql at version 6, and in this step we're also going to install fs-extra at version 9. fs is the built-in Node.js library for the file system — for reading and writing files — and I personally like the fs-extra library, which extends the built-in file system module and adds some nice utilities around things like reading and writing JSON files. Back in the project, we're going to add our data access layer. Under src, let's create a new folder named data, and under data, create a new file, index.js. Creating connections to SQL Server is a relatively expensive operation, and there's a practical limit to the number of connections you can establish. By default, the mssql package's connect function creates and returns a connection pool object, and a connection pool increases the performance and scalability of an application. In the lifecycle of a Node.js application — or any type of application that connects to a database like SQL Server — you want to create that connection pool one time, for the lifetime of the application, and reuse that same connection pool over and over again. So what we're going to do is set up a client that we can use from anywhere in our application to make those queries, and the client itself will handle creating an instance of the connection and keeping that connection alive for the lifetime of the application. In this index.js file, we're going to require in an events module that we'll create in just a moment, and we'll require in the mssql module. And we're going to create our client function, an asynchronous function that takes in the hapi server and our config, and we're going to initialize a pool variable.
This is how we're going to keep that connection pool around. We'll create a couple of internal functions on this client. One is closePool — so if the application is shutting down, or for whatever reason needs to close the connection pool, it can do so. And then we'll create a function called getConnection. It checks if the pool already exists and, if so, returns it; but if this is the first time a connection is being created, it creates a new connection pool, setting pool equal to await sql.connect, passing in our config. Then we attach an error handler: if the pool emits an error, we force a closePool. And finally, we return the connection pool. If there's an unhandled error of any type, we'll also log it to the console and set the pool to null, just to be safe. Finally, the client function we're defining here needs to return something, and that something is going to be an events object, where events exposes functions for doing queries and operations on the events table in the calendar database. We pass in our sql client and our getConnection function, because those are the two things that every group of data access queries and commands is going to need: the sql object and the getConnection function. And then, finally, we need to export the client. Just to reiterate: when using SQL Server with Node.js, one of the most critical things to get right is handling connection errors when they occur. Internally, this SQL data module has the two important functions we talked about, getConnection and closePool. getConnection returns the active connection pool, or creates one if necessary; and when any connection error occurs, closePool makes sure the previously active connection pool is disposed of, to prevent the module from trying to continue reusing it. All right — under the src/data folder, let's create a new file called utils.js.
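Put together, src/data/index.js looks roughly like this — a sketch that depends on the mssql package and the ./events module created shortly:

```js
// src/data/index.js
"use strict";

const events = require( "./events" );
const sql = require( "mssql" );

const client = async ( server, config ) => {
   // the one connection pool, kept for the lifetime of the application
   let pool = null;

   const closePool = async () => {
      try {
         await pool.close();
      } catch ( err ) {
         server.log( [ "error", "data" ], "closePool error" );
         server.log( [ "error", "data" ], err );
      } finally {
         pool = null;
      }
   };

   const getConnection = async () => {
      try {
         if ( pool ) {
            // reuse the existing connection pool
            return pool;
         }
         // first time through: create a new connection pool
         pool = await sql.connect( config );

         // if the pool errors, dispose of it so it isn't reused
         pool.on( "error", async err => {
            server.log( [ "error", "data" ], err );
            await closePool();
         } );
         return pool;
      } catch ( err ) {
         server.log( [ "error", "data" ], "getConnection error" );
         server.log( [ "error", "data" ], err );
         pool = null;
      }
   };

   return {
      events: await events.register( { sql, getConnection } )
   };
};

module.exports = client;
```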
In here, I want to add some code that's going to help us read in a set of query files. We can define SQL queries as external .sql files and read them in when the application loads. We're using the fs-extra module we installed, and we're also going to use the join function that's part of Node's built-in path module, which deals with file and folder paths. We'll create one function, loadSqlQueries, that takes a folder name. We use the process's current working directory, and by convention we look for that folder name under src/data. Then we get a list of all the files in that folder and filter the list down to only files ending with the .sql extension. Next, let's initialize a queries object, and then we'll loop over the files with for...of, which is a really handy newer syntax for iterating over arrays and other iterables. For each file, we create a query by reading in the file, joining the folder path with the file name and specifying utf8 as the encoding, and we remove the .sql extension from the file name. So we're creating a property on the queries object for each file: for instance, if we have a query file named getEvents.sql, then queries.getEvents will be equal to the contents of that getEvents.sql file. Finally we return queries, and we're finished with that. To back up and explain again: it is possible to embed SQL queries as strings in your JavaScript code, but I believe it's a better practice to keep your queries in separate .sql files and load them at startup using a utility like this. This utility module loads up the SQL files in a given folder and returns them as a single object. Next, we'll create a new folder under src/data called events. Let's create a new file, index.js, and let's also go ahead and create a new file called getEvents.sql.
In the index file, let's require in the utils module, and let's create a register function that's going to initialize all of this. We're expecting a sql object and a getConnection function to be passed in. The first thing to do is read in all those SQL files, so we'll say sqlQueries equals await utils.loadSqlQueries, passing in the folder name of events. Then let's create a getEvents function that takes a single userId. We get a connection from the getConnection function that was passed in, and we create a new request. On the request we define our query parameters and the query we're going to use. So we'll say request.input for the userId: it's a VarChar(50) type, and we use the userId that was passed in as the value. Then we return request.query using sqlQueries.getEvents. Again, by convention, when the utility reads in all the query files, it exposes a getEvents property because the SQL file is named getEvents.sql. And finally, our register function needs to return an object exposing getEvents. Again, ESLint to the rescue: I forgot my equals sign. Now, in this getEvents.sql file, let's add the query for getting a list of events. We'll say SELECT, and then all our columns. It's not strictly necessary to wrap all your column names in brackets, but it's a good discipline to get into the habit of, because we know description is a keyword and we definitely need to wrap that one in brackets. So we select those columns from the events table where userId is equal to whatever value we pass in as the userId parameter, and we order the events by startDate. You're going to notice that in this SQL file, and in the SQL files we create later in the tutorial, we're using a parameterized query, and that's to guard against SQL injection attacks.
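Here's a sketch of that register-and-bind flow. The sql and connection objects are stubs standing in for the mssql package (the real code awaits getConnection from the pool), so this just demonstrates how the parameter gets bound to the query rather than hitting a database; the query text mirrors the getEvents.sql described above.

```javascript
// Sketch of the src/data/events module. stubSql and stubConnection mimic
// just enough of the mssql API (input/query) to show parameter binding.
const sqlQueries = {
  getEvents: 'SELECT [id], [title], [description] FROM [events] WHERE [userId] = @userId ORDER BY [startDate];'
};

const stubSql = { VarChar: (n) => `varchar(${n})` };
const stubConnection = {
  request() {
    const inputs = {};
    return {
      input(name, type, value) { inputs[name] = { type, value }; },
      async query(text) { return { text, inputs, recordset: [] }; }
    };
  }
};

const register = async ({ sql, getConnection }) => {
  const getEvents = async (userId) => {
    const cnx = await getConnection();
    const request = cnx.request();
    request.input('userId', sql.VarChar(50), userId); // bound parameter, never concatenated
    return request.query(sqlQueries.getEvents);
  };
  return { getEvents };
};

(async () => {
  const events = await register({ sql: stubSql, getConnection: async () => stubConnection });
  const res = await events.getEvents('user1234');
  console.log(res.inputs.userId.value); // user1234
})();
```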
You never want to see code, whether it's Node.js or Java or .NET or any other language, where someone builds a query string by concatenating strings and values together. Always use parameterized queries. Next, we need to create a database client plugin to make it easy to run these SQL queries from other parts of our application: from our routes, we want to be able to get our SQL client and run a query to return results from a particular API endpoint. As I mentioned before, in other frameworks like Express this concept is usually referred to as middleware, but Hapi uses the term plugin. So under src, let's create a new folder called plugins, and in plugins, let's create a file named index.js. Lots of index.js files! One of the reasons index.js is used so much is that CommonJS recognizes index.js as the default file in a folder, so it's a shortcut. You could name the file anything you wanted, but then when you require a module that's in a folder, you would have to specify require('./plugins/whatever-that-file-is'). With an index.js file, you can use the shortcut require('./plugins'), and Node will automatically look for a file named index.js and read that in. It's just a convenience. In this plugins index, we're going to create our module, and we're also going to create a SQL plugin, so we'll require in that sql module. Now, you've seen me use this pattern quite a lot, creating and exporting a register function in all these top-level folders like routes, data, and plugins. Again, this is just a convention that I like to use, specifically with Hapi, because Hapi also has a register mechanism; I'm just repeating that pattern in my own code as a convention. There are lots of alternative ways to register plugins and routes and so forth that wouldn't require this.
But something I've found is that as a project grows, you'll always know there's one place to go in any of these folders, the data folder, the plugins folder, the routes folder: you go straight to that index.js and see where things are being registered when the application is initialized. So that's our main index. Let's create a new file called sql.js. We're going to require in our entire data folder, which is going to initialize everything, read in the SQL files, and do all that kind of stuff. To be a plugin in the Hapi ecosystem, you have to export a very specific-looking object. This object needs to have a name property, which we'll call sql; it needs a version property, which we'll just set to 1.0.0; and it needs a register function that takes in a server. When Hapi registers plugins, it passes the server instance to every plugin so that you can do cool things with the server. We're going to grab the config. Remember, we stashed the config for the entire application onto server.app, so now we can say server.app.config.sql, grabbing just the sql object off that config. Then we create our client, passing in the server and the config object, which is just the SQL connection information. And last, we need to expose the client to the rest of the application. This is a mechanism Hapi provides, the server.expose function: you specify the name of the property you want to expose and then the object or function you want to expose. What this does is, when Hapi is initializing and registering all the plugins you've defined, it provides a way for you to access those plugins from anywhere else in the application. In this case, we're exposing the database client so we can get to it easily from any of the routes, any time we need to. And finally, we need to update our server to register those plugins.
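Here's a sketch of that plugin shape. The fakeServer below stands in for a real Hapi server and mimics only the two features the plugin touches (server.app for config and server.expose for publishing the client); dataClient is a stub for the data-folder client described earlier.

```javascript
// Sketch of src/plugins/sql.js: the specific-looking object Hapi expects.
const dataClient = async (server, config) => ({ events: {}, config }); // stub client

const plugin = {
  name: 'sql',
  version: '1.0.0',
  register: async (server) => {
    const config = server.app.config.sql;          // config stashed on server.app earlier
    const client = await dataClient(server, config);
    server.expose('client', client);               // later reachable as server.plugins.sql.client
  }
};

// Minimal stand-in for a Hapi server, just for this demo.
const fakeServer = {
  app: { config: { sql: { server: 'localhost', database: 'calendar' } } },
  plugins: {},
  expose(name, value) {
    this.plugins[plugin.name] = { ...this.plugins[plugin.name], [name]: value };
  }
};

(async () => {
  await plugin.register(fakeServer);
  console.log(fakeServer.plugins.sql.client.config.database); // calendar
})();
```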
So we're going to add plugins by requiring our plugins folder, which reads the index.js file in that folder, and then after we create our instance of the Hapi server, after the config, we need to await plugins.register. Now our plugins will be properly registered. Our next step is to add an API route that will execute the getEvents query and return the results as JSON. You could add this route to the existing src/routes index; however, as an application grows, it's better to separate routes into modules that contain related resources. So we're going to create a new folder under src/routes (let me close some windows): a new folder named api, and in here a new file, of course, index.js. As we add more routes to the API, we can bundle them all into this index.js. We're going to require in our events API module, which we'll build in just a second, and then export a register function that takes in the server and runs await events.register. So let's also add an events.js file, and in here, let's export a register function that takes a server argument and uses that server to register a route. The method for this one is also GET, and the path will be /api/events. We were going to set a config for this route; actually, let's just say handler, we don't need a config yet. Our handler is an async method that takes in a request, and we can wrap all of this in a try statement; go ahead and finish out the try with a catch. Another tip while I'm here: you don't have to use the console to log errors. In production, you probably want a real error-logging mechanism. It's out of the scope of this tutorial, but you may want to look at something like hapi-pino, and there are lots of other loggers out there too, Winston for one; I can't think of any others off the top of my head.
Those would be better ways of logging errors than just writing them to the console, unless in production you have some mechanism for capturing those console errors in a way that you can get to them later. But for now, console.log will work just fine. In our try statement, we can now get our database plugin right off of the request. The request includes the server, and we know that from the server we can access plugins using the plugins property. Our plugin is registered under the name sql, because that was the plugin's name, and we exposed the client from that plugin when we registered it. So: request.server, then from the server you get to plugins, then sql, then the client object. Now we have our database object. For now, we're just going to hard-code in our user ID, user1234; that's what we inserted into the database. And now we can say const events = await db.events.getEvents, because that's the data access layer: as we were building it, we created an events object with that one function, getEvents, and we pass in our user ID. Then finally, we return response.recordset. When the response comes back from SQL Server, it includes, I don't remember all of the properties it has, but it can include things like the number of records and other metadata about the response. A SQL Server response can also include multiple record sets; you could have multiple queries from, say, a stored procedure. The mssql module that we're using supports all those scenarios. In this case, we just want the recordset property off of the response, and we return that. This automatically comes back as JSON, which is why, when we created the table in the first place, we used camel case for the column names.
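Here's a sketch of that handler's flow. The db object is a stub for the exposed SQL client (a real handler would reach it via request.server.plugins.sql.client), and the hard-coded user1234 matches the transcript's temporary placeholder.

```javascript
// Sketch of the GET /api/events handler: run the query, return only recordset.
const db = {
  events: {
    getEvents: async (userId) => ({
      recordset: [{ id: 1, title: 'appointment', userId }], // the rows we want
      rowsAffected: [1]                                      // mssql metadata we discard
    })
  }
};

const handler = async (request) => {
  try {
    // Real code: const db = request.server.plugins.sql.client;
    const userId = 'user1234'; // hard-coded for now; later from request.auth.credentials
    const res = await db.events.getEvents(userId);
    return res.recordset; // just the rows, not the surrounding mssql metadata
  } catch (err) {
    console.log(err); // a real app would use a proper logger here
    return err;
  }
};

(async () => {
  const rows = await handler({});
  console.log(rows[0].title); // appointment
})();
```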
That's camelCase: you start with a lowercase letter and capitalize the words in the middle. That's the convention you typically see with JSON data, and it's why we named the columns that way, so we didn't have to do any mapping to make the data look more like JavaScript or JSON. So that's all there is to it for the API route. Now we need to update our main routes index to register this new API. In routes/index.js we'll require in our api folder, which automatically loads its index.js, and then in our register function we'll await api.register, passing in the server. And I think we're ready to test this. Let's go to the command prompt and start the server; the server's running. We'll go over to the browser; we've still got our first Hapi page, but if we go to the route /api/events, boom, there's our JSON data: the two rows that we added to our SQL database early in the tutorial. We have an ID of 1, a title of appointment, a description, and so forth. So that gives us what we expect. Well, now that we've got this working, let's get some real users into the application. Manually building authentication and user profile management for any application is no trivial task, and getting it wrong can have disastrous results. You don't want to be rolling this stuff on your own; it'd be better to have a security expert at your side, and that's what Okta does. To complete this step, you're going to need a free Okta developer account, and for that you'll go to the Okta portal at developer.okta.com. You can click the sign-up button here at the top, or the create free account button. This account is always free, and it's good for up to a thousand active users per month; it really has everything you could need for building applications and adding authentication, login, password resets, all that kind of stuff. So sign up for an account, and after you sign up, you'll be at the Okta dashboard.
From here, you want to click on Applications, then click Add Application. We're building a web application, so click Web and then click Next. Now you can name your application; let's call this one node-sql-app. The base URIs and all these settings are already set up as we need them. I purposely chose to use port 8080 because I knew that's the default for an application you're adding in Okta. So you can leave all of these as defaults and click Done. Now on this page, under the General Settings, down at the bottom, there's a section labeled Client Credentials. These are the two values we need to put into our environment file, so click the little copy-to-clipboard icon, switch over to our application, and go to the .env file. We'll paste in the client ID, then copy the client secret and paste that in. And last, you need your org URL. To get that, click on Dashboard to go back to the dashboard, and you'll see your org URL right here in the top right. Copy this URL, whatever it is (it'll be different from mine, of course), and paste it into your config. One last thing you may want to do is enable self-service registration. That's what gives your customers the ability, when they go to use your application and don't already have a login, to click a link that says if you don't already have an account, you can sign up here. It's basically adding a sign-up link to the login form. To do that, go to the Users menu, then Registration, and on this page look for the self-service registration setting. If it's enabled, great; if it's not, click Edit, change it to enabled, and save. There are also other options on this form, like whether you want to add additional profile data to your login and registration. All right, that's done. So now we need to build our UI.
In these next steps, we're going to add a front end to our Node.js application using Embedded JavaScript templates (EJS) and Vue.js. First, we need to install some dependencies to support our authentication, to render those templates, and to serve static files. So let's go to the command line; we can stop the server now. We're going to npm install @hapi/bell, which is an OAuth authentication library for Hapi; @hapi/boom, which is an error-handling response library; and @hapi/cookie, which is a cookie-based session management plugin for Hapi. We're also going to install @hapi/inert, which is used for serving static files. The cookie package we need at version 11, and boom is going to be version 9. Let's see, what else: inert; then @hapi/vision, which is the template engine support that allows us to bring in whatever kind of template engine we want for server-side rendering; and then we need EJS, the Embedded JavaScript templates package, and that's at version 3. So let's install all of these. Now let's go back to the application, and we need to add an authentication plugin. Whoops, I went too far. Under our plugins folder, let's create a new file named auth.js. We're going to require in our bell module (and I can't type), and we need our cookie module. Let's also check whether we're running in a secure environment, meaning SSL/TLS, and we'll base that on our NODE_ENV environment variable: if we're running in production, we'll assume we're also running with SSL certificates; otherwise, in development, we're just using plain HTTP. Then module.exports: we're exporting a JavaScript object, which is our plugin definition. Just like our SQL plugin, we have a name property (we'll call this one auth), a version that defaults to 1.0.0, and a register function that takes in the Hapi server. We're going to await server.register with the plugins we need, the bell and cookie plugins.
Now we need to grab our configuration off of the server, server.app.config, and define a couple of what Hapi calls authentication strategies. The first strategy is for our cookie sessions. For session, we're using the cookie plugin, and the configuration for the cookie plugin includes the cookie name (we'll name it okta-auth), a path, a password, which is the cookie password defined in our environment file, and whether or not it's secure. We also need to set our redirect: if someone's not logged in, we redirect them to a route at /authorization-code/callback. That's one strategy. We need another strategy for the Okta-based login. Everything else in the application is going to use session cookies, and if the session cookie isn't there, it redirects to the Okta authentication strategy, which triggers an Okta login. In here, we'll call this strategy okta, which uses the bell plugin, and the configuration for this plugin sets the provider to okta. The config needs to specify the URI for where to go when the person needs to log in, and that's specified using the Okta URL in our config. We set the same cookie password (which, I just realized, I did not spell correctly up above), and again whether it's secure. The location is our config.url, which in our environment variables is set to localhost:8080. And we need our client ID, which comes from our environment variables under config.okta.clientId, and the client secret. I think that's everything. Then we set the default strategy for our website to session, so all routes configured to use authentication will default to session, and if the person using the website doesn't have a session and the route requires authentication, they'll be redirected to the okta strategy.
That will log them in, putting them into the flow of logging in to Okta, or registering with Okta if they don't already have an account. The last thing we want to do here is a really nice bonus tip for using Hapi. We're going to create an extension on Hapi, hooking into a request lifecycle event: onPreResponse. This means that on every request made to the Hapi server, before the response is sent back, we intercept it and add some of our own information to the response. The extension takes in a request and the Hapi response toolkit, and we check: is the request's response a view? We're filtering out calls to the API; we only want this to be triggered when a view is being rendered and sent to the client. If it's a view, then we check whether the person making the request is authenticated. If they are, we set some values: isAuthenticated is true, isAnonymous is false, the email address of the current person (which can be found under request.auth.artifacts.profile.email), and the first and last name, which are also available on the profile. If they're not authenticated, then we set isAuthenticated to false, isAnonymous to true, and email, firstName, and lastName to empty strings. So there's our auth object, and we want to decorate the response context with it. We'll then be able to use this context within our view templates to check whether the current person is authenticated or not, and if they are, we can quickly access their email, first name, and last name. We put that into the response, and then finally, after all this, we return h.continue, which is just part of the lifecycle of a response.
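The branching described above can be pulled out as a small pure function, sketched here. The request shape mirrors what the transcript says bell puts on request.auth (the profile field names are assumptions based on that description); the real extension would attach this object to the view context and return h.continue.

```javascript
// Sketch of the onPreResponse logic: build the auth context attached to every view.
const buildAuthContext = (request) => {
  if (request.auth.isAuthenticated) {
    const profile = request.auth.artifacts.profile; // per the transcript's description
    return {
      isAuthenticated: true,
      isAnonymous: false,
      email: profile.email,
      firstName: profile.firstName,
      lastName: profile.lastName
    };
  }
  // Anonymous visitor: everything empty, flags flipped.
  return { isAuthenticated: false, isAnonymous: true, email: '', firstName: '', lastName: '' };
};

const loggedIn = buildAuthContext({
  auth: {
    isAuthenticated: true,
    artifacts: { profile: { email: 'a@b.co', firstName: 'Ada', lastName: 'Lovelace' } }
  }
});
console.log(loggedIn.isAnonymous, loggedIn.email); // false a@b.co
```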
Now we need to update our plugins index.js to register the auth plugin, as well as the other plugins we've recently brought in, like the EJS templates. In here, we're going to require in ejs, inert for static files, vision for template engine support, and then bring in our own auth plugin. Now let's update our register function: instead of registering just the SQL plugin, we'll register an array of plugins: auth, inert, vision, and sql. Then we need to configure the server's view engine. We do that by calling server.views with our configuration. For engines, we're only using one, ejs. relativeTo is the base path for finding these templates, and we're going to use the built-in __dirname, a Node.js shortcut for getting the directory of the current module; __dirname here points to the plugins folder this module is running in. The path to our templates is going to be the templates folder. We also specify layout: true, so we'll have a layout template: the engine will look for a file specifically named layout.ejs to be the wrapper template that goes around every page. All right, let's add our server templates. Under the src folder, create a new folder named templates, and we'll create that layout file, layout.ejs. I'm going to copy in some code here and walk you through what it looks like. It's some HTML: we've got a doctype, html, and a head that includes a title. The title uses some EJS syntax so that when it renders on the server, it looks for data passed to this view that includes a title value and injects it into the title tag. We're also referencing a Material Icons font from Google.
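As a sketch, the view-engine configuration described above looks roughly like this (the relative template path is an assumption based on the folder layout the transcript describes):

```javascript
// Sketch of the server.views call made during plugin registration.
server.views({
  engines: { ejs: require('ejs') }, // one engine, mapped to the .ejs extension
  relativeTo: __dirname,            // the plugins folder this module lives in
  path: '../templates',             // where the .ejs templates live
  layout: true                      // wrap every page in layout.ejs
});
```

This is a config fragment, not a standalone script; it only runs inside a register function where server is in scope and @hapi/vision has been registered.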
And we're specifying an index.css file, which we'll add in a moment. We're including a navigation partial, which is a way of injecting another template into the current template, and then our content: for all the other templates that use this layout, this is where their content will go. Then we've got a script tag at the bottom that includes an index.js, which we'll also add in a little while. Next we need to add a new template for our home page, index.ejs, and again I'm just going to copy in a bunch of code to save some time. In this markup, we're using a container class, since we'll be using Materialize CSS as our CSS framework. If the current user is authenticated, we render this app div; if they're not authenticated, we render a header with a message and a login button. So depending on whether they're logged in or not, the page renders differently. And I jumped ahead, I copied in too much stuff, way too much stuff. Whoops. That's all we need for the index.ejs. Next, we need to add a template for our partials. Under templates, we need the navigation partial, so create a folder named partials, and in partials, create a file named navigation.ejs. Here, we create a nav tag, and again we use isAuthenticated to conditionally show a login or logout button. In both the home page and this navigation file, we're checking whether the current person is authenticated, and the way we're able to get that information is because we added it to the response in the onPreResponse event; we injected it into every view, so now it's available any time we need it. Now we need to update our routes to support the views and authentication. So open up your src/routes folder, and let's create a new route file for auth, a new file named auth.js.
In here, we're going to bring in our boom module, and we're going to define a login route. What I'm about to show you is an alternative way to define and register routes. In the past, we've just called server.route and specified the config inline. We can also create individual JavaScript objects for those routes and then register them all at once with one call. That's what we're about to do. For our login route, the method is GET and the path is /login, and then we specify some configuration for it. You know what, we don't really need separate options here; let's just create a handler, which takes in a request. If request.auth is not authenticated, we return a message. We should probably never see this message, but we're putting it in here for safekeeping, just in case. Next we define our callback route for the OAuth callback. This is also a GET, with the path /authorization-code/callback. The handler takes in a request and the response toolkit, and if the request is not authenticated, we throw a boom error. Boom has a number of default response types, and one of them is Boom.unauthorized. But if they are authenticated, then we need to set the session cookie: we use request.cookieAuth.set, set that to request.auth.credentials, and then return h.redirect back to the home page. For this route, we do set options: the auth strategy for this particular route is okta, and that's what triggers the flow; if the person is not logged in and there's no session, it redirects them to Okta to log in. We also need a logout route, which is a GET with the path /logout. If the person is authenticated, we clear the session cookie and then redirect them back to the home page, and then our options.
We're going to set the auth mode to try. There are a few auth modes you can choose from. try means Hapi will check whether any authentication information is available and populate the request.auth object either way. If auth were turned off for the route entirely, request.auth would not be populated with anything, even if the person making the current request is logged in. And then there's the mode called required: it checks whether someone's logged in, and if they're not, it initiates the strategy to start a login. All right, now we have our three routes defined and we need to register them, so we export a register function that takes in a server and registers all of these at once. Next, we need to update our home page route so that it renders our EJS view instead of returning a string. Under src/routes/index.js, instead of returning that string, we need to render our view, and we also need to register our auth routes: after we register the API, we'll await the auth routes' register and then the home page route. We change the handler to return h.view, using the index template, and let's set the title to home. That's how you pass data: you can pass a JavaScript object with a bunch of properties to a view, so the EJS template has that data to work with. Then under options, we want our auth mode to also be try. We also need a static route for our static files, like the index.css and the JavaScript we'll create in a little while. This is a GET, and the path is really anything: it's a catch-all. If a request doesn't match another route that's explicitly defined, like our login, home page, or logout routes, this catch-all route will handle it, and we're going to point it at a directory with the path of dist.
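To pull the three auth routes together, here's a sketch of their shapes. The paths and flow follow the transcript; the handlers are simplified stand-ins (a plain Error instead of Boom.unauthorized, and option details are assumptions), so treat this as an outline rather than the exact module.

```javascript
// Sketch of the three routes defined in src/routes/auth.js.
const routes = [
  {
    method: 'GET',
    path: '/login',
    handler: async (request) =>
      request.auth.isAuthenticated ? request.auth.credentials : 'Not logged in.',
    options: { auth: 'okta' } // no session? bounce through the Okta strategy
  },
  {
    method: 'GET',
    path: '/authorization-code/callback',
    handler: async (request, h) => {
      if (!request.auth.isAuthenticated) {
        throw new Error('unauthorized'); // the real route throws Boom.unauthorized()
      }
      request.cookieAuth.set(request.auth.credentials); // start the session cookie
      return h.redirect('/');
    },
    options: { auth: 'okta' }
  },
  {
    method: 'GET',
    path: '/logout',
    handler: async (request, h) => {
      if (request.auth.isAuthenticated) request.cookieAuth.clear(); // end the session
      return h.redirect('/');
    },
    options: { auth: { mode: 'try' } } // populate request.auth without forcing login
  }
];

console.log(routes.map((r) => r.path).join(' ')); // /login /authorization-code/callback /logout
```

In the real module, these objects are handed to the server in one call from the exported register function.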
The dist folder will be created for us automatically by our build process, which builds our front end and generates our JS and CSS files. Well, now that we have authentication added to the application, we need to update our API to query the database based on the currently logged-in user. At minimum, we also need to create API routes for creating, updating, and deleting events, along with their respective SQL queries. So let's go ahead and add some of those queries. I'm going to go to the data/events folder and create a new file named addEvent.sql, and I'm just going to copy in some code here to save time. This is just an INSERT statement taking in all the different columns we have, and we're passing in all the values as parameters, so this is a parameterized query. After the insert, we return the newly generated ID using the SCOPE_IDENTITY() SQL function, which returns the automatically generated ID. Now let's create an update query; we'll call this one updateEvent.sql. Again, I'll copy in some code, and it's very similar to the insert: this is UPDATE on the events table, setting the title, description, start date, and so forth from the parameters (I see a little typo there), where the ID matches the id parameter and the user ID matches too. After updating, it returns the newly updated row, just as a double check. Then let's create a delete query, and this one's really short: just DELETE FROM the events table WHERE id equals the id parameter AND userId equals the userId parameter. Now let's update our events index.js file to use those queries. Alongside getEvents (I'm just going to copy in some code to save time), we need an addEvent function, and that takes in a JavaScript object that's destructured into the userId, title, description, and so forth.
Just like getEvents, we'll get a connection. (It doesn't have to be written this way, but it keeps things consistent.) We create the request, set all these values as parameters, then run the query and return the results. These could be awaits, but request.query is promise-based, so it works with async/await either way. Then we need updateEvent; again, I'll rename the connection for consistency. updateEvent looks almost identical to the insert, except we're also including the ID of the event we want to update. Then a deleteEvent, and finally, we need to return those functions from here so the client has them. Now let's update the API route under events. Instead of hard-coding the user ID, we need to get the actual user ID from the request. One thing we need to do is set the options on this route so the auth mode is try, so that our authentication information is populated. Then, at the very top of our handler, we check whether the current user is authenticated; if not, we return an unauthorized error. We don't have boom here yet, so we need to require it in. Then we can get the current user ID, instead of hard-coding the value, off of request.auth.credentials.profile.id, and everything else stays the same. We also need routes for creating, updating, and deleting events. I'm going to add a couple here by pasting them in, and I'll walk through what they do. Here's a route to create a new event. Again, the auth mode is try; we check whether the request is authenticated and return unauthorized if not. Then we pull the startDate, startTime, and all the other values out of request.payload.
When you post a JSON object to an API endpoint, hapi makes it available through the request.payload object. So we're pulling all those values off of the payload, and then we're calling our data access layer using the addEvent function, passing in all those parameters. Coming back from that insert, we expect the ID of the newly created event. If there's an error, we console.log it and then return. There's a handy utility function on boom called boomify that turns any error into a hapi boom error, so we return the boomified error. Finally, let's add a delete route. This route uses the DELETE verb, and it shows how you specify a parameter as part of the path: under /events we have a placeholder for the ID, and we get to that ID using request.params.id. So instead of request.payload, which holds the data that was posted, we're using params.id, which gets the parameters off of the path of the URI. We get the user ID from our credentials, and we call deleteEvent using the ID of the event and the current user ID. If it was successful, meaning the rows affected equals one, we return a response code of 204. If we don't get a record back, we return not found. Again, if there's an error, we log it to the console and return a boomified version of that error. Now, you could add an update route here too; I'm going to leave that up to you as an exercise. It should be very similar to the create route, except instead of a POST verb it would be a PUT verb, and just like the delete, you could add an ID on the path, or you could make it part of the PUT body, either way. All right, now we're going to create the UI for our application. As I mentioned before, we'll use Materialize CSS and Vue, along with some other components that'll help us make the application better.
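The create and delete routes just described can be sketched roughly like this. This assumes the hapi route shape (`method`, `path`, `options.auth`, `handler(request, h)`); to keep the sketch dependency-free, the boom helpers from the tutorial are replaced with plain status codes, and `eventsDb` stands in for the data access module built above:

```javascript
// Sketch of the event routes; boom.unauthorized()/boom.boomify() from the
// tutorial are shown here as plain h.response(...).code(...) calls.
const makeEventRoutes = (eventsDb) => ([
  {
    method: 'POST',
    path: '/api/events',
    options: { auth: { mode: 'try' } },        // populate auth info if present
    handler: async (request, h) => {
      if (!request.auth.isAuthenticated) {
        return h.response('Unauthorized').code(401);   // boom.unauthorized() in the tutorial
      }
      const userId = request.auth.credentials.profile.id;
      // Posted JSON arrives on request.payload
      const { title, description, startDate, startTime, endDate, endTime } = request.payload;
      try {
        const result = await eventsDb.addEvent(userId, { title, description, startDate, startTime, endDate, endTime });
        return result;                         // includes the newly generated id
      } catch (err) {
        console.log(err);
        return h.response('Server error').code(500);   // boom.boomify(err) in the tutorial
      }
    }
  },
  {
    method: 'DELETE',
    path: '/api/events/{id}',                  // {id} is a path parameter placeholder
    options: { auth: { mode: 'try' } },
    handler: async (request, h) => {
      if (!request.auth.isAuthenticated) {
        return h.response('Unauthorized').code(401);
      }
      const userId = request.auth.credentials.profile.id;
      const id = request.params.id;            // path params, not payload
      const rowsAffected = await eventsDb.deleteEvent(id, userId);
      return rowsAffected === 1
        ? h.response().code(204)               // deleted: no content
        : h.response('Not found').code(404);   // no row matched that id/user
    }
  }
]);
```

The PUT update route left as an exercise would look like the POST handler with the event ID added, either on the path like the delete or in the payload.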
So let's move back over to the command line and install some dependencies: npm install axios (version 0.19) to do the actual communication from the client front end to the API; Luxon, which is part of the date/time UI we'll be creating; the Materialize CSS framework (version 1); the moment library (version 2); Vue; a Vue date-time component; and another package called weekstart. All right, now let's create a folder in the root of the application; I did it again and closed my windows. We'll call this folder client. In client, we're going to create a new file, index.js, and in here I'm going to copy in some code. Notice we're using the new module syntax for JavaScript: we import the date-time component, import Vue, the Materialize CSS framework, and the date-time CSS, and we import App from a new app component that we'll create. We register that date-time component with Vue, create a new Vue application, and mount it on the app div tag that's in the front end. Now, explaining everything Vue does and how to use it is out of the scope of this tutorial, but hopefully you'll be able to follow along and see how these things work. That's the code for our index. Next we need to create an App.vue component, and in here goes another big block of code. Unfortunately, it's out of the scope of this tutorial to go through it line by line, but hopefully you can follow what some of these things are doing. Vue includes a template where we can have these braces that act like tokens for the different pieces of data that we want to join with the template. These are client-side templates, as opposed to the server-side templates we have in EJS.
Vue is basically our template engine for client-side rendering. We're going to loop through the events that we get back from the API and populate a table on the front end. Then we have some forms on the front end, like the add-event form, which has a start date and a start time, and we're using the date-time components to show some nice UI elements, like a calendar widget and a time picker. When we run the application, you'll see how that makes the experience a little nicer. So this is all the HTML for the forms. We also have some HTML for a delete confirmation: if we're deleting an event, it pops up a message box asking for confirmation, and there's behavior based on whether the user clicks delete or cancel. Also as part of this component, we're defining the JavaScript that goes with it. As I mentioned before, we're using Axios to make the API calls from the front end, and we've got Materialize and moment imported. As for the Vue component itself, we have a number of computed properties; we have a data property that includes all the information for a single event; and we have methods defined, like addEvent, which uses Axios to post to our API, confirmDeleteEvent, and deleteEvent, which actually calls the delete API endpoint. Then we have some other functions, such as formatDate and formatTime, which format the events that come in. Finally, we have loadEvents, which calls the GET endpoint, returns all the events, and reformats them for display. Vue also has some lifecycle functions, and one is called mounted. We're using the mounted event to know that the component has finished rendering the initial view.
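To give a feel for those helpers, here's a standalone sketch of formatDate and formatTime. Note that the tutorial's component uses the moment library; this sketch uses only built-in JavaScript so it runs on its own, and the exact output formats are assumptions:

```javascript
// Sketch of the component's formatting helpers. The real component uses
// moment; these use only built-in APIs.
const formatDate = (value) => {
  // Render an ISO date like '2020-04-01' as 'Apr 1, 2020'.
  return new Date(value).toLocaleDateString('en-US', {
    year: 'numeric', month: 'short', day: 'numeric', timeZone: 'UTC'
  });
};

const formatTime = (value) => {
  // Render a 24-hour time like '15:00' as '3:00 PM'.
  const [hours, minutes] = value.split(':').map(Number);
  const suffix = hours >= 12 ? 'PM' : 'AM';
  const hour12 = hours % 12 === 0 ? 12 : hours % 12;
  return `${hour12}:${String(minutes).padStart(2, '0')} ${suffix}`;
};
```

loadEvents would map each event from the API through helpers like these before handing the list to the template.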
And now it's safe to go fetch all the events currently associated with the user and load them into the UI. Then we have a few customized style elements. So let's save this. Now, to generate the UI: as I said before, we're using the latest import/export module syntax and some other modern JavaScript features, and those might not be available in every browser. We're also pulling in modules like Materialize CSS. We need a way to build all this so that it's in a package we can deliver to a browser. So we're going to create a build process that runs all the steps needed to generate a final JavaScript file and a final CSS file. We need some dependencies for that build process, so let's go back to the command line and use npm install again, this time as development dependencies; these aren't needed in production, since we'll build everything we need at deployment time and not rebuild anything while running in production. First we'll use nodemon, which isn't strictly necessary for building our assets, but it's a really nice utility that will make your life as a developer much easier: nodemon monitors your project directory for any changes to files and then restarts your Node.js application, so you don't have to continually use Ctrl+C to stop the web application and restart it; nodemon does that for you automatically. We're also going to use npm-run-all, a nice little utility for chaining together a bunch of npm scripts, and parcel-bundler to bundle our Vue, CSS, and JavaScript. Parcel is similar to webpack, but it's essentially zero-config by default: it can understand a lot of different front-end projects and create an appropriate bundle for them, so it saves us some time.
We also need @vue/component-compiler-utils for the bundling process, and vue-template-compiler version 2. All right, now that that's done, let's update our package.json file to include some build scripts. Node.js with npm has a way of defining scripts, so we don't have to use something like gulp or grunt or any of the other task runners; we can use npm itself for a lot of those things. In the scripts section, we'll define a build script that uses parcel to build the client folder, with index.js as the entry point. We'll define a dev script, which uses nodemon, and we'll specify that it watch both the client folder and the src folder for any changes. The file types we want it to watch include JavaScript, EJS, SQL, Vue, and CSS; if we make a change to any of those types of files, nodemon will detect it and restart the application by running npm run devstart. What's devstart? We need to define that too: devstart is npm-run-all build start. Then we need a start script, and that is simply running node in the current directory. And that's it; you can run any script you've defined from your command line or terminal with npm run and the script's name. As I mentioned before, nodemon is a fantastic utility. The only thing left to do is come back here and run npm run dev, which should run nodemon and kick off our build process. Hmm, I don't think it ran our build process, so let's go back and look at this again. Oh, I forgot to include the exec parameter: exec tells nodemon what command to actually run instead of just running node. By default it does the latter, but if we want it to run a particular script, we need the exec parameter. Let's try that again: npm run dev. This time, it kicks off parcel.
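Putting those scripts together, the scripts section of package.json might look roughly like this (including the --exec fix); the exact script names, like devstart, are one reasonable arrangement rather than the only option:

```json
{
  "scripts": {
    "build": "parcel build client/index.js",
    "start": "node .",
    "devstart": "npm-run-all build start",
    "dev": "nodemon --watch client --watch src --ext js,ejs,sql,vue,css --exec \"npm run devstart\""
  }
}
```

Without the --exec flag, nodemon would just restart node on changes instead of rebuilding the front-end bundle first, which is exactly the problem hit above.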
Now we have localhost running. Oh no, an internal error. Let's switch over to the console and see what's going on. All right, there's an internal error: a reference to message is not defined. Ah, so our homepage template is expecting a message, and we're not passing one; we're only passing the title. Let's go to the templates folder under src; remember, the template is looking for a message in that paragraph. So we need to go to our route for the homepage, where we're passing just a title of Home, and also pass a message, something like Welcome. Now what should happen is that nodemon recognizes the change we just made to the file and automatically restarts the server. So if we go back to the browser and refresh the page, boom, we get our login. Something strange is going on, though: we're not getting any of the styling we would expect, and I think I know what the problem is. In our index.js, when we created the server route, we didn't specify the auth mode; we can just set auth to false here. The server restarts; let's try it again. Now we get the styling we'd expect, and we click on the login button. It prompts us to log in to our Okta account, and since we have self-registration turned on, there's a link down here that says if you don't have an account, you can sign up, which is really cool. Oh no, we got another error. What do we have happening this time? Cannot read property firstName of undefined. So I must have messed something up with the data; maybe I misspelled something. Let's see, in our auth plugin, instead of profile, I called it process. You probably saw me make that mistake. Let's try it again. It automatically restarted, and now we have our events.
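For reference, here's roughly what the fixed home route might look like; the view name and the strings are illustrative, not the tutorial's exact values:

```javascript
// Sketch of the home route after the fix: the EJS template reads both
// `title` and `message`, so the handler must supply both.
const homeRoute = {
  method: 'GET',
  path: '/',
  handler: (request, h) => h.view('index', {
    title: 'Home',
    message: 'Welcome!'   // previously missing, which caused the internal error
  })
};
```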
We have no events yet, because the two dummy events we added to the database earlier are no longer in there. So click on start date and we've got a nice little calendar popup, and for the time we've got a nice time picker; let's say 3 PM. We won't specify the end date, and we'll give the appointment a name. Of course, you could set that for every hour of the day as far as I'm concerned. Click submit, and that goes off, makes an API request to insert the data into the database, and returns and updates our list of events up here. If I refresh the page, we still get the one event in the list. Let's add one more event, kind of like our conference: we'll say it starts on April 1st and ends on April 2nd, call it Virtual Conference, and submit that. And now we've got our virtual conference. If we want to delete one of these, we click delete and it asks us to confirm; if we click OK, it calls the API to delete it from the database and refreshes our list of events. So I declare this application a success! I hope you liked learning about Node.js and SQL Server, a little bit about hapi, and how to authenticate your application with Okta. If you liked this content, please click the thumbs up and subscribe so that you can find more content from us in the future. Thank you so much, and I hope you have a great day.