So, we're going to start on time so we can end early, because I know everyone is tired. Let us begin. Cool. We're going to talk about our Skeletor scaffolding. I'll get into introductions and everything later, but first I'm going to show you what we're actually going to be building. Our goal for this presentation is to show you how we build a progressively decoupled application into your own workflow, using real-world examples and tools that Yoshi and I actually use in production. There's a little bit of a live demo here, so it's risky, but what's life without some risk in it? What we have here is a little application that dynamically shows attendees at a conference. And this application is React: this is my React inspector. If I refresh the page, there's a little flash of loading, and then it dynamically loads the user. It's just a card, simple, simple. Now I'll edit the card: the attendee has a title, a first name, and a bio in some language I don't know, which is many languages. I save it, and the card dynamically updates. So we're getting live information from Drupal. I removed the "DrupalCon" text, and therefore the demo worked. What we're doing here, specifically, is using a React component within Drupal and building out an architecture to meet some specific challenges. The first one: we're going to build it so that there's a React component inside of Drupal. This isn't going to be two separate decoupled environments. It's called, again, progressively decoupled: within Drupal, you'll have a Drupal page that serves a React application.
One of the challenges we want to meet is that we have a front-end team using React and a back-end team using Drupal, and they don't like each other. They want separate repositories and separate code bases, so that they can iterate and build at their own pace. On top of that, we also want to have confidence in our React components. With Drupal, we're used to unit testing and behavioral testing, and in React we want the same level of testing and security too. So given that our end goal is this little application of cards, and we want to meet these challenges, how are we actually going to do it? For this presentation, we'll cover that over four topics. We'll look at how we actually build a progressively decoupled site with our framework called Skeletor, which is open source and available, so you can go and read all my spelling mistakes and code. We'll see how we can do continuous integration in a progressively decoupled environment, so that as each team makes a change, it can pull in the most recent version of the other team's work. We'll look at unit testing React components within Drupal, so that as I build out the final Drupal build artifact, the unit tests run alongside it. And finally, we'll test for data consistency using what's called a JSON schema, which allows us to define the structure of our APIs. So that's the presentation. If that's not your cup of tea, cool. If it is, we hope you stay and enjoy the ride. I've talked a bit already: I'm Erin Marchak. I'm an associate director of Drupal practices at Myplanet, which is a web agency out of Toronto, Canada, so there's a little bit of time difference.
I've done a handful of other presentations and work within Drupal 8 over the past few years, and I'm really excited to help show and promote the progressively decoupled approach, because I believe it gives developers a lot of strength to pick and choose the tools that work best for their projects. And my lovely presentation partner: Hi, everybody. I'm Yuri Savenko, and I usually go by my nickname, Yoshi. I'm a solution architect at Myplanet, doing both Drupal and JavaScript, mostly JavaScript right now, and running a small JavaScript practice group in my company. Yeah, that's me. Cool. Our Twitter handles will be at the end of this. If you have any questions, there will be time at the end of the presentation, or if something wakes you up in the middle of the night and you're like, "they never answered that," you can always reach out to us. These slides are available now at the very, very tiny URL at the top: emarchak.com/skeletor-scaffolding. So you can follow along on your mobile if you want, or download them later. Cool. Out of the four topics, let's get started on the first one: how we actually build a progressively decoupled site with Skeletor. Skeletor is our scaffolding framework for building decoupled apps. So what the heck is a scaffolding framework, you're asking yourself. It's very similar to how Drupal Composer has built itself out, and we use a lot of the patterns and behaviors from there. It is essentially a Drupal install profile that allows you to create and build your own Drupal docroot using Drupal 8.
The reason we use scaffolding instead of something like a distribution or an installation profile is that, working with the clients that we do, we tend to find that a lot of the challenges we have on projects are unique and one-off. We usually need to override, extend, or manipulate code. So a scaffolding, a foundation you can build on top of, that you can edit, adjust, remove from, or add to each time, gives us the flexibility that we need. It's available on Packagist, so you can download and use it. And you're asking, what does this look like? This is my local environment where I'm running it. You can see I have Skeletor here, and what it does is scaffold out a docroot for you. Within there are some bits of magic that we like to use: a few hooks, and a few scripts that are specific to us, which I'll get into to show you how they help us progressively decouple our site. Within Skeletor Scaffold, two of the key components of building up this decoupled framework are REST UI and JSON Schema, and I'm going to go into those individually. They're both contributed modules, available on Drupal.org, and I like them a lot. What they do, respectively, is this: one provides an interface to manage and control the endpoints that are available and configurable through Drupal core, and the other provides us a schema to describe them, which I'll show you how to use later on. So REST UI, as I said, provides an interface for configuring Drupal 8's REST module. That module is in core, so it comes out of the box. You can see here on my site the REST resources page, under admin, config, services, REST. All it does is provide an interface for you to point and click your way to worldly success and fortune and fame. Within here, I can enable and disable a lot of the different endpoints that core provides by default.
And if I choose to add my own, you'll see them in here as well. All I've done for this example is enable the node endpoint, with the GET method, so I can read nodes. And again, to connect back to where we're heading: I have a little React application here for presenters, within a Drupal site, that consumes these endpoints. So from the REST resources page I enable my content endpoint in REST UI, then flip over to what's called Postman, which lets me read the nodes at an endpoint in a reasonable way. I've enabled the node type, with no configuration beyond core and enabling it through REST UI, and I get this lovely endpoint that describes my node and provides all sorts of different information. If you've come to any of my previous presentations, I've normally spoken about how to configure your own endpoints. If you're just trying things out, REST UI and the core REST module are fine to use at this point, but you can also choose to extend them if you want. So that's REST; there's the little slide I talked about. The second module I mentioned is JSON Schema. What it does is allow us to generate a schema that describes our endpoints. We can then connect that to OpenAPI, or use it to provide syntactical information about the endpoints. OpenAPI's goal is to create an open description format for API services that is vendor neutral, portable, and open; they believe that is critical to accelerating the vision of a truly connected world, in case you couldn't read that on the slide. The point is that it's an endpoint, agnostic to any application, that describes the interface for the endpoints.
So it's giving structural information to React, to Angular, to your phone application, whatever is consuming those endpoints, so they can understand what you're actually trying to send through the nodes. We've enabled JSON Schema, and you can see the other endpoint that I have pulling here. This is the actual endpoint that JSON Schema provides, which you can then connect up to OpenAPI. All it does is provide a schema, a description, and an object. So it's describing the payload for node entities of the attendee bundle. With those two modules, we've described and provided information regarding our attendees. We now have something a connected application can look at to understand what the payload is going to be for the nodes, for the attendees, and then actually get the payload itself. So great, job one done, awesome. All I had to do was enable two modules. Easy peasy, I can go along in my Drupal universe and keep on coding. Now, the other challenge was that we wanted a decoupled, or progressively decoupled, application, but I wanted to make sure I can integrate each change the React team makes during each build of my process. And I want it in one docroot. I don't want multiple hosts, and I don't want to deal with multiple server configurations. In the end, the React application is JavaScript; I just want to call JavaScript, download it, and, anyway, I get upset when I have to do a lot of work. My goal for this step is this: all I want to do is say composer require and then the package name. We call this Skeletor Scaffold, and it's a package on Drupal.org that allows us to attach NPM packages within our application. So what I'm going to do is use Composer to fetch my Node application, download it, and install it.
So all I have to do is run that. How does this actually work? If you go to Skeletor Scaffold, it's a separate project, just a Drupal module, that provides basic scaffolding. Again, it's open source; you can look at it and insult my naming conventions, that's fine. What it does is allow us to download NPM dependencies. So you're asking: how do I use Composer, which is a PHP dependency manager, to download Node packages, which belong to a JavaScript dependency manager? They're two separate languages. Hold on to your pants. What you can do is register your custom package within your composer.json. Drupal already does this: it registers its own custom repository. Looking at the core composer.json file, you can see that under repositories, by default, we register packages.drupal.org, which is a Composer registry. What I do next is register my own NPM package: the custom repo containing the code base that all of the Node developers are going to be handling. I register that within the repositories section of my composer.json. You can see here: I name it, I give it a version, I give it a custom type, which we call npm-package, and then I provide it Git information. We're going to be using the presentation branch here, and then I have the URL to the repository itself. The reason we're able to have a custom type (again, npm-package; if you're used to Composer, you'll see types like drupal-module or drupal-theme) is that we're creating our own. To create your own type, you need to include the composer-installers-extender package, which allows us to declare our own installer types. Later on in the composer.json, you can see we've registered npm-package. And that's it.
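Pieced together, the relevant composer.json sections look roughly like this. The package name, version, branch, and install path are illustrative placeholders, not the exact values from Skeletor; the repository types and keys are standard Composer and composer-installers-extender configuration:

```json
{
    "repositories": [
        {
            "type": "composer",
            "url": "https://packages.drupal.org/8"
        },
        {
            "type": "package",
            "package": {
                "name": "example/attendee-app",
                "version": "1.0.0",
                "type": "npm-package",
                "source": {
                    "type": "git",
                    "url": "https://github.com/example/attendee-app.git",
                    "reference": "presentation"
                }
            }
        }
    ],
    "require": {
        "oomphinc/composer-installers-extender": "^1.1",
        "example/attendee-app": "1.0.0"
    },
    "extra": {
        "installer-types": ["npm-package"],
        "installer-paths": {
            "web/modules/custom/npm_packages/{$name}": ["type:npm-package"]
        }
    }
}
```

The extender reads extra.installer-types to learn about the new type, and extra.installer-paths then decides where anything of type:npm-package lands in the docroot.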
I say "that's it": it allows us, when we see something typed as an npm-package, to download it. We've declared what it is, and we say: I want to put this in my docroot alongside my modules; I want to install it here. It's the same pattern that Drupal uses. You can see here: type drupal-library, install it where my libraries should be; type drupal-profile, install it where my profiles should be; type npm-package, install it where my NPM packages should be. And you can choose to put this anywhere you want. I have my NPM packages under my modules directory, and that's just where my Node code lives. Note that there's nothing edited or custom in the Node package itself. If you wanted to, you could include a composer.json in your Node package, but I've talked to the React guys and they're like, "don't touch my Node work, it's mine." So fine, I handle it here, and then I'm able to pull it in and download it. So I've downloaded the Node package, but how do I install it? Because again, I'm very lazy, and I don't want to do anything more than that one require. Shockingly, all we have to do is play around in composer.json and add what's called an additional script. If you've used Node before, you're probably familiar with post-install hooks; other dependency managers have them too. What we're able to do is chain commands that run after each process. So we've registered a special command (again, this is why we have our Skeletor, so we can extend work), a command that explicitly runs npm install. With that registered, through Composer I can run npm install and npm build, and I attach those to my post-install command. Composer has a handful of these hooks: post-install, post-update, and so on.
It's really, really extensible. All I have to do is run the Composer npm install command, and it goes through and runs npm install for you. Now, to create these little commands, we have to actually write them in PHP. Thank goodness, that's what I prefer to write. We have them, again available from Skeletor, in the NPM package handling code: you create your own class, extend it out, and register different commands there. If you want to look at this and play around with it, all this code is available online. So I'm able to run different commands, using the libraries available to me to find the Drupal root, crawl through it, fetch the NPM packages, and then run the command itself. From there, when I run composer install, it looks at my modules, sees that I have a custom package, my Node package, fetches it, downloads all of the dependencies, and then, via a post-install hook, runs npm install and npm build. It means that all I have to do is run composer install. And this is our Travis instance: it runs composer install by itself, as you can see there, and then you get all of the lovely output of npm install. Fantastic. It means that as a PHP developer, I don't need to learn anything new. Once this is set up, your team can just run composer install, and you're off to the races. In addition to installing everything through Node, it also allows us to run tests; the Node team can go through and run their tests. They pass, and then I can deploy this up to my hosting provider. So by using Composer, Composer install hooks, and the packages really elegantly, I'm able to just run composer install to fetch a Node package, build it, and roll it out. And you're like, that's great. That's amazing.
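To make the hook wiring concrete, the scripts section of composer.json looks roughly like this. The PHP class and method names are placeholders standing in for the real Skeletor handler, and the script names are illustrative:

```json
{
    "scripts": {
        "npm-install": "Example\\Skeletor\\NpmPackageHandler::install",
        "npm-build": "Example\\Skeletor\\NpmPackageHandler::build",
        "post-install-cmd": ["@npm-install", "@npm-build"],
        "post-update-cmd": ["@npm-install", "@npm-build"]
    }
}
```

With something like this in place, a plain composer install triggers the post-install chain automatically, and the npm steps can also be invoked on their own with composer run-script.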
How do we actually test the React packages, then? That's my point. Yeah. Thank you, Erin. So today I'm going to talk about two things regarding React: how we do unit tests in our teams, and how we connect React applications and Drupal even better by using JSON schemas, and what the use case is for that. I'm not sure I need to explain what React is, but can you raise your hand if you work with React? Quite a lot of people. Speaking about React, I'll particularly be speaking about a React Redux application, because we use Redux for our state management. We can go to the next slide. So when building tests, first of all we need to identify the moving parts in our applications and where problems may appear. We can identify three main sections in our React applications. First, components: users potentially interact with them and something may happen, and we want to make sure a component renders correctly every time. Second, actions. This is specific to Redux, but actions are probably one of the most important parts, where the business logic may live in a React application. Whenever an action has been dispatched, which means something happened, something changed, we want to make sure nothing is broken, so we cover that with tests. And third, we have reducers, which technically are pure functions that control our application state. We want to make sure that, in reaction to a particular dispatched action, our reducers update the application state correctly. Now, what recommendations and what tools are we using? We use Jest. There's a quote you can read; I just stole it from the Facebook page. But basically, it's a test runner, and we use it as our test runner for a couple of reasons. Its configuration is very simple.
They claim it's zero configuration, and it really is possible to run it with zero configuration: just install it and start writing tests. And it really delivers on what Facebook says: given a ready-made tool, developers end up writing more tests, which results in more stable and healthy code bases. So, simple configuration. Snapshot testing is a feature we use from time to time, and it's really nice. Jest can also speed up the process by parallelizing test runs; that's their claim, but we've experienced it, and it's pretty good. And it has built-in coverage reports. For clients, and even for yourself, it's important to see what the coverage is; we'll see the results shortly. We also use Enzyme, a JavaScript testing utility, and we picked it for a couple of reasons. First, its jQuery-like API: it's easy to write selectors for whatever React renders. Second, shallow rendering, which lets us build effective unit tests, testing exactly the components we want to test without worrying about side effects introduced by child components. I'll go through a couple of code samples. I simplified the component and this code a little to fit the screen. On the left, you can see a pretty simple component: the one you saw Erin present, the small tile that shows a presenter. And on the right, we have a test for it. This test just makes sure that the component renders properly: we mock some data on the input, render the component, pass some props into it, and make sure the expected content appears in the rendered output.
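As a sketch of that component test, with JSX and Enzyme stripped away so it needs no build step, the same shape in plain JavaScript looks like this. The component name, prop names, and markup here are illustrative, not our real code:

```javascript
// A pared-down sketch of the card test: no JSX or Enzyme here,
// just the same idea with a plain function "component".

// The "component": takes props, returns the rendered markup as a string.
function AttendeeCard({ title, first_name, last_name, bio }) {
  return [
    `<h2>${title}</h2>`,
    `<p class="name">${first_name} ${last_name}</p>`,
    `<p class="bio">${bio}</p>`,
  ].join('\n');
}

// The "test": mock some input data, render, and check that the
// expected content appears in the output, the same shape as an
// Enzyme test would have.
function testRendersAttendee() {
  const props = {
    title: 'Solution Architect',
    first_name: 'Yoshi',
    last_name: 'Savenko',
    bio: 'Does Drupal and JavaScript.',
  };
  const output = AttendeeCard(props);
  return (
    output.includes('Yoshi Savenko') &&
    output.includes('<h2>Solution Architect</h2>')
  );
}
```

An Enzyme version would shallow-render the real component and use its jQuery-like selectors instead of string matching, but the mock-render-assert rhythm is the same.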
Then there are actions. Here I've posted an example of a test for an action that makes an asynchronous request to the server. Going from top to bottom: this action creator first dispatches one action to inform the application that it's about to start fetching the data, this getData pending action. Typically you'd want to show some sort of throbber to indicate that a request is being sent. We use axios to send the actual request to the particular URL, get the response, and validate it; I'll come back to these two lines in the next section, on JSON schema. If our response validates successfully, we dispatch an action with the successful response; otherwise, we report an error. It's a pretty simple scenario. On the right, you can see the test file. What we do here is mock the response: we just say, "hey axios, if the test is going to send a request to this URL, don't actually do it; respond with this mocked response." This allows us to unit test the particular action creator and, at the end, compare: these are the actions we expect this action creator to dispatch, and we simply assert that those actions were dispatched. And here's an example test for a reducer. A reducer is just a pure function, so it's much easier to test, and I'll go through this step pretty quickly. The purpose of the reducer is to check whether some action has been dispatched and update its own portion of the state, and that's basically what we mimic in the unit test: we mock some action, dispatch it, and at the end we assemble by hand the data structure that should result and compare it with whatever is returned by the reducer. And here are the test results; you saw this in Erin's demo, and I just took a small portion of it. You can see our three example tests from the previous screen run successfully. This one is unsuccessful.
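Setting the slide output aside for a moment, the reducer test pattern just described can be sketched in plain JavaScript. The action type names and state shape here are illustrative, not our real code:

```javascript
// A minimal sketch of a Redux-style reducer and its test.

const initialState = { loading: false, attendee: null, error: null };

// The reducer: a pure function of (state, action) -> new state,
// responsible only for its own portion of the application state.
function attendeeReducer(state = initialState, action = {}) {
  switch (action.type) {
    case 'GET_DATA_PENDING':
      return { ...state, loading: true, error: null };
    case 'GET_DATA_FULFILLED':
      return { ...state, loading: false, attendee: action.payload };
    case 'GET_DATA_REJECTED':
      return { ...state, loading: false, error: action.payload };
    default:
      return state;
  }
}

// The "unit test": mock an action, run the reducer, and compare the
// result against the data structure we assemble by hand.
const reducerResult = attendeeReducer(initialState, {
  type: 'GET_DATA_FULFILLED',
  payload: { first_name: 'Yoshi' },
});
```

The same assemble-and-compare assertion is what the test file on the slide expresses with Jest's expect.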
And Jest will inform you what happened: in this case, the expected name that appears on the card was not rendered correctly. And this is what test coverage looks like; we have a couple of tests. Again, all of this can be started with Composer, and we run the coverage report as part of it; how strictly you enforce coverage is up to your discretion. To summarize this part: we unit tested the three main moving parts of our React application, which are components, actions, and reducers, and we saw the coverage report. Now we can move to the second part, which is JSON schema. Erin mentioned this already, and from the React point of view I consider it quite important, because React relies on the data structure Drupal returns in each request, and that's an area where problems may appear. On the React side, the front-end side, we do not control the data structure that is returned to us. So we need to find a way to report such a problem early, rather than discovering it in one particular component, or checking in every component that the object has a particular property. We can do that with JSON schema. A short quote on what JSON schema is: a vocabulary that allows you to annotate and validate JSON documents, which is a mantra for us. Let me quickly outline the problem and where it comes from. JavaScript is not a strongly typed language. If, say, Drupal returns a string instead of an integer, you may need additional validation and additional work; and if that happens after your React application has shipped, if Drupal returns some invalid data, you might not even notice that React isn't rendering on the front end, I mean in production. So we suggest JSON schemas. There are a couple of JavaScript implementations of JSON schema validators; I've listed seven.
There's a link at the bottom with a few more. So what are we using? We use Ajv, and we find it pretty good: it usually appears at the top of benchmarks, and it's pretty informative about what data and what errors it reports, if any are found. On the screen is a small example. Granted, on the left-hand side you probably won't see such a clean result from Drupal out of the box, or maybe you can. But this is a sample of the JSON that could be returned to React from the Drupal back end, and on the right you can see a JSON schema file. We're going to validate our response against this schema. What does that mean? We expect to see an object in the response with these properties: id, title, first name, last name, bio, and presenter. Each of these properties has a particular type, such as integer for the id, or boolean for presenter. And we expect id, first name, and last name to be required fields. So if, for example, the first name is missing, or the id was not returned, we should consider the response invalid. And here is the action creator you saw before. Basically, on these two lines, we take the schema, compile it with the validator, and validate our response, which appears in res.data, against the JSON schema. If something goes wrong, we can gracefully fail the application: in our case, we just dispatch a message and attach the validation errors to the payload of our action. If everything is OK, you just go ahead and continue rendering. To summarize, we've seen one example of how we use JSON schemas, and there are more uses, obviously. You can make your React application (this wasn't in the example) validate the data prior to submitting it back to Drupal.
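Those two validation lines can be sketched as follows. In production we use Ajv; here a toy validator that only understands the required and scalar type keywords stands in for it, so the example stays self-contained. The schema mirrors the attendee fields from the slide, but the exact field names are illustrative:

```javascript
// A toy sketch of JSON-schema-style validation. In production we use
// Ajv; this hand-rolled checker handles only "required" and scalar
// "type", which is enough to show the idea.

const attendeeSchema = {
  type: 'object',
  required: ['id', 'first_name', 'last_name'],
  properties: {
    id: { type: 'integer' },
    title: { type: 'string' },
    first_name: { type: 'string' },
    last_name: { type: 'string' },
    bio: { type: 'string' },
    presenter: { type: 'boolean' },
  },
};

function validate(schema, data) {
  const errors = [];
  // Every required property must be present.
  for (const key of schema.required || []) {
    if (!(key in data)) errors.push(`missing required property: ${key}`);
  }
  // Every present property must match its declared type.
  for (const [key, rule] of Object.entries(schema.properties || {})) {
    if (!(key in data)) continue;
    const value = data[key];
    const ok =
      rule.type === 'integer' ? Number.isInteger(value) :
      rule.type === 'boolean' ? typeof value === 'boolean' :
      typeof value === rule.type;
    if (!ok) errors.push(`${key}: expected ${rule.type}`);
  }
  return { valid: errors.length === 0, errors };
}

// A response with the id as a string fails fast, before any
// component ever tries to render it.
const badResponse = { id: '5', first_name: 'Yoshi', last_name: 'Savenko' };
const validationResult = validate(attendeeSchema, badResponse);
```

With Ajv the shape is similar: you compile the schema once, call the resulting function on res.data, and read validate.errors on failure; the errors can then be dispatched as the payload of an error action, as described above.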
So if you post something back to Drupal, you probably want to do the same thing, to make sure the data has been assembled in the proper format; you'd use a separate JSON schema file for that and validate the data against it. You can also make your React application report back to Drupal about a problem. For example, on the previous slide we submit the validation error as a payload; if it's reported to Drupal, then on the Drupal side you can decide what to do. Maybe something is broken on your production project, so Drupal updates the log and notifies some key people. "How dare you suggest my Drupal site is broken?" Did I say Drupal? I'm sorry, the React application is broken. You can also massage some data, if some data conversion is possible. This is a tricky area, but sometimes it's doable: say all you need to do is convert a string to an integer. You can do that, revalidate the updated payload you got from Drupal, and if validation passes, continue rendering your React application. And just a mention about JavaScript types: consider using PropTypes. It doesn't help you in production, but during development it will inform you that something arrived in an improper format; say that same id arrived as a string, you'll be told about the problem in the console log. Yeah. Here we go. We powered through a lot of that. In the end, we think it's really important, if you're building any form of decoupled site, whether fully decoupled or progressively decoupled, that the individual development teams work together, especially on validation of data. Because when data is being translated across systems, that's when it can go wrong, get manipulated, or become a risk of attack if you're doing anything secure.
So using Drupal to provide a schema, so that React can expect certain forms of validation, is really important. The other side of that is a question we'd want to consider: does it make sense for the server to send information to React describing its own data structure, when the server is the one rendering it? Or should React define the schema itself, declaring what it expects from the server, with Drupal knowing what it should get in return? Because essentially the schema is the Rosetta Stone of the translation between the endpoints: it's a way of communicating what's there, how it's formed, and how you can structure it properly. Another point for the future: I showed you how to download a custom repository using Composer. If you have many of those, it can get really overwhelming, and your Composer file can get really, really big. One option is to host your own private packages. In the same way that Drupal packages are not on the general Packagist, the PHP repository, you can provide your own packages for your team. So if you have private packages, private dependencies, or want to control your own structure and distribute through that, you can do it yourself. If you want different Node packages that PHP developers can pull in and build, you can do that quite easily by declaring them in your own private package repository. And finally, there's been a lot of experimenting with different Node environments, whether decoupled, fully decoupled, or progressively decoupled. One of the barriers we're finding is that people have to deal with so many new tools. Going progressively decoupled allows PHP developers to bring React and everything else into their own ecosystem.
If you're looking at introducing a new technology to your team, try to choose the way that is the least invasive. I know that Acquia and potentially Pantheon are playing around with offering their own Node hosting. So if you can introduce Node in a way that lets teams use their own hosting and their existing development tools, that's another way to lower the barrier, so you can start having these cool conversations about the really exciting synergy you can get within Drupal. Yep. So, as I promised, we started on time, and hopefully we're running a bit early. It is the last session, so if anyone has any questions or wants us to go over anything, because I know that was really fast for the end of DrupalCon, please come up and let us know. Can you come up to the mic, because it is recorded, please? Other than that, if you want to tweet me or Myplanet, and therefore reach Yoshi, because he doesn't like technology, that's a lie, you can reach me at @emarchak or Myplanet at @myplanet, and we can continue the conversation. First question: how long did it take you to develop this solution, to get to a product you can use as a framework for your projects? We've been using the Skeletor system, having decided that we'd prefer scaffolding over building on distributions, for over a year: spinning it up, and every time we find a tweak or an adjustment, bringing that into the next project, so it's evolving through time. A lot of it is based on the principles and patterns defined in the Drupal Composer repository, using those post-install hooks to manipulate files. We borrowed from a lot of different things, so we're really inspired by the work of the community there.
So I would say we're not there yet, it's still evolving, but it's something you can look at; hopefully you can fork it, steal the good ideas, and let us know which ideas you'd like to contribute back. Contribute, yeah. Yeah. In terms of the React components, it's pretty similar. It's really hard to track the evolution of the project, because we usually discover something during the production work we do and feed it back; in the case of React, it's usually reusable components. We'd like to speed up our process for the future, so we're creating a repo of reusable components that we'll use. Initially it doesn't look that hard, but eventually more and more time gets invested into it. And I think with these decoupled approaches, you see a lot of repeated code, a lot of structures that are redefined and reused throughout projects, especially React components. So looking for ways to avoid writing the same boilerplate code every time is really helpful, whether you're optimizing components or reusing installation scripts with Composer; focusing on making sure that developers are not just sitting there typing boilerplate is really, really important. Anything else? Cool, well, we ended a bit early, which means everyone can go out and check out the closing comments. So thank you very much. Oh, and finally, I'm sure you've heard this before, because I'm assuming this is not your first session: you can find this presentation and these slides at the URL on events.drupal.org, and please take the SurveyMonkey survey to give feedback on our presentation and DrupalCon Vienna, because we want them to know that we really like DrupalCon Europe.