OK, good morning, everyone. My name is Yas Naoi. Thank you for joining my session this early in the morning. I'm excited and happy to see you in person for the first time since 2019, in Seattle and Amsterdam.

So let me introduce myself. My name is Yas. I'm a chief architect working at DOCOMO Innovations in Palo Alto, which is the Silicon Valley office of NTT DOCOMO. DOCOMO is the largest mobile carrier in Japan; almost half of Japan's population uses DOCOMO smartphones, such as iPhone and Android. My expertise is cloud, DevOps, and agile software development. I'm leading a cloud project for DOCOMO in Silicon Valley, and I've been involved in the Drupal community for 17 years.

So here is our team. Our team members are globally distributed across California, India, and Japan. Our cloud team is designed to be completely virtual, so the team stayed sustainable through the past hard times.

So this is what we are developing. We have built Cloud Orchestrator as an open-source project based on Drupal; you can search for "Drupal Cloud Orchestrator". It is a one-stop portal capable of managing multiple clouds, such as AWS, Kubernetes, OpenStack, VMware, and Chaffon.

So I can show you some screens here. This is the Cloud Orchestrator login screen. It's a little hard to move the mouse from up here, but let me log in. This is the Cloud Orchestrator landing screen, and you can see the map, with icons in certain locations: Kubernetes here, OpenStack here, and AWS here. It's hard to see from the podium, but we can list our resources in the cloud using Cloud Orchestrator. Actually, I wanted to show you the Oregon region in the AWS data center here, but I cannot see the details from my podium. Anyway, we can manage AWS and Kubernetes through Cloud Orchestrator, and this is entirely Drupal-based. Back to PowerPoint. So here's today's agenda.
So first, I will talk about the development model. Second, what Behat is for BDD, which stands for Behavior-Driven Development. Third, templating for Behat test scenarios. And lastly, automated testing for BDD.

Before I dive into BDD, let me start with the software development model, to understand where design, coding, and testing sit. I will introduce the V-model for the SDLC, which stands for Software Development Life Cycle, because it makes it easy to explain the relationships among the design, coding, and test phases. Of course, it's not aligned with agile development, but the basic concepts in agile development are almost the same.

Software development starts with getting business requirements and system requirements from our customers. Then we can move on to software component design. Actually, thanks to the Drupal core framework, I believe we can design the component architecture with minimal effort. After that, we proceed with functional design, then coding and implementation. On the other side of the design phases, we have unit testing, integration testing, system testing, and acceptance testing. The design and testing phases correspond to each other like this.

So we have TDD, which stands for Test-Driven Development, at the bottom position. BDD targets system requirements at the system-testing level. ATDD, which stands for Acceptance Test-Driven Development, covers business requirements. So we have a few testing types like this: TDD, BDD, and ATDD. In Drupal, PHPUnit, formerly known as SimpleTest, covers unit testing and integration testing. We can achieve BDD with Behat and the Drupal Extension, which I will explain from now on. I think you can now see how TDD and BDD relate.

So let us dive into BDD. First of all, let me show some PHPUnit code like this. Do you understand what's going on in this code? On the other hand, this is BDD test scenario code. Same question as on the previous slide.
Can you figure out what this BDD test scenario code will do? We can readily understand the test scenario because it is written in natural English, not programming code. Actually, this is programming code, but it reads as natural English. The point is that once we write a test scenario in natural language, Behat runs the test scenario exactly as written. This is the beauty of BDD. The learning curve for writing a test scenario is not steep, so non-technical people, like our product managers or product owners, can also write test scenarios. One of our team members has a non-technical background; her major is actually digital art. She joined our Cloud Orchestrator development project and is helping us write BDD test scenarios.

So remember the V-model of the SDLC. We can convert business requirements and system requirements, such as user stories, into BDD test scenarios like this.

So what's Behat? The short answer is that we can test our website with a virtual browser using the behat command. Here is the long answer in detail. Behat is a test suite framework for BDD, which consists of several components. In Behat, we have feature files, which are sets of test scenarios written in so-called Gherkin syntax. We install some extensions and libraries to support Drupal and Drush, and browser testing via web drivers. So we use these extensions for Drupal website testing. The behat command invokes the web driver for the Chromium web browser, and ChromeDriver calls the native API of a headless browser, which means a virtual browser without any GUI. This headless browser is actually just a process on our operating system. The virtual browser then accesses the actual website, the AUT, the Application Under Test. In our case, we are developing Cloud Orchestrator, so we want to test our live Cloud Orchestrator site with the behat command and a headless Chromium browser. So how can we write a Behat BDD test scenario?
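The component stack just described gets wired together in Behat's configuration file. Here is a rough sketch of what such a behat.yml could look like, assuming the standard Mink and Drupal extensions; the URL and option values are placeholders, not our real settings:

```yaml
# behat.yml sketch: Mink extension plus Drupal extension (values assumed).
default:
  suites:
    default:
      contexts:
        - Drupal\DrupalExtension\Context\DrupalContext
        - Drupal\DrupalExtension\Context\MinkContext
  extensions:
    Behat\MinkExtension:
      # Site under test; a live Cloud Orchestrator URL in our case.
      base_url: https://example.com
      browser_name: chrome
    Drupal\DrupalExtension:
      # Blackbox mode: test the site purely through the browser.
      blackbox: ~
```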
So let me move on to Gherkin syntax. Gherkin is widely used for BDD in a variety of programming languages other than PHP with Behat, such as Python, Ruby, or Java. Starting from the keyword Feature at the top of the testing scenario, which we call a feature file, we describe the overview of the test scenario. Then the keyword Scenario begins the actual commands to access the website, like logging in as an anonymous user, with the keyword Given: Given I am not logged in. When I visit /clouds. Then I should be on the /user/login URL. And I should see the Log in header. The keyword Given represents the initial state and conditions. The keyword When makes Behat take an actual action in the virtual browser, and the keyword Then checks the expected results.

So now we understand how to write a test scenario in Gherkin syntax. It's so simple, and the rules are very limited. Then I soon had a question: can I use parameters? The test scenario looks static, and we want to pass variables into our test scenarios. The answer is yes. We can include parameters in a test scenario by starting with the keyword Scenario Outline, and pass variable parameters into the test scenario like this. The green words in angle brackets are placeholders, corresponding to the header columns of the table below. In this example, there are three rows in the table, so Behat will repeat this test scenario three times, changing the values of the placeholders automatically.

Then my next question came up: how does it work with natural language? In the past, I joined some Behat sessions at DrupalCon, but I always wondered how this natural language could execute PHP code or drive a virtual browser. So here's the answer. The Mink extension and Drupal Extension PHP libraries working with the behat command define certain programming logic in PHP like this. The trick is that the method's comment header includes the @ symbol with the keyword Given, followed by the matching sentence, as in this example.
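Reconstructed as a feature file, the scenario and the Scenario Outline described above would look roughly like this. The exact step wording depends on the installed step definitions, and the extra table rows are illustrative, so treat this as a sketch:

```gherkin
Feature: Anonymous access to cloud pages
  Anonymous users should be redirected to the login page.

  Scenario: Anonymous user is sent to the login page
    Given I am not logged in
    When I visit "/clouds"
    Then I should be on "/user/login"
    And I should see "Log in"

  Scenario Outline: The same check over several paths
    Given I am not logged in
    When I visit "<path>"
    Then I should see "<message>"

    Examples:
      | path      | message       |
      | /clouds   | Log in        |
      | /admin    | Access denied |
      | /node/add | Access denied |
```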
So Behat analyzes the comment header of the PHP code, and then Behat processes two files together. One is the test scenario, called a feature file, in Gherkin syntax, like the natural language I introduced. The other is PHP code with the program logic, like this. Here is an example of associating a parameter in the sentence with a variable in the code. We can include the parameter as a placeholder in the sentence, and the logic takes the parameter over as a variable in the method's arguments. We can even define the sentence using a regular expression.

So maybe you have a quick question: do we need to write PHP testing code for each condition, action, and assertion like this? Thanks to the Mink extension and Drupal Extension PHP libraries for Behat, we actually don't have to write lots of this kind of PHP code, because those extension libraries already define most of the major operations for web access, such as filling in a text field, checking a checkbox, clicking a link, and so on. We have more than 100, maybe 120 to 140, predefined operations for a website from the Mink extension and Drupal Extension. So in most cases, we don't have to write this kind of complex PHP testing code; we just write the test scenario in natural-language Gherkin syntax. We can use Behat out of the box.

However, I wasn't satisfied with the Scenario Outline scheme, because we need to put hard-coded parameters into the table in the Scenario Outline. What if we want to put credentials, such as a username and a password, or an API token, into our test scenarios? In that case, could we still publish our test scenarios as open source on drupal.org for our Cloud Orchestrator? So we decided to develop a pre-processor for the test scenarios based on Twig templates. We developed a pre-processor that applies Twig templating to feature files. Using our custom Drush command, double brackets are replaced with values from separate YAML files.
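A step definition of the kind described here might be sketched like this. The class and method names are invented for illustration; the annotation-plus-regex matching is the real Behat mechanism:

```php
<?php

use Behat\MinkExtension\Context\RawMinkContext;

/**
 * Sketch of a custom context class; the names here are invented.
 */
class CloudFeatureContext extends RawMinkContext {

  /**
   * Behat matches this sentence against the scenario step, captures the
   * quoted part with the regular expression, and passes it in as $path.
   *
   * @When /^I visit "([^"]*)"$/
   */
  public function iVisit($path) {
    $this->getSession()->visit($this->locatePath($path));
  }

  /**
   * @Then /^I should see the "([^"]*)" header$/
   */
  public function iShouldSeeTheHeader($text) {
    $this->assertSession()->elementTextContains('css', 'h1', $text);
  }

}
```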
So we developed the templating pre-processor. OK, it looks like we have prepared everything we need, so let me explain how we automated it. First, we built a Docker container as the BDD testing client. It downloads our Cloud Orchestrator source code, including the test scenario templates, from the drupal.org Git repository. Our team could publish our test scenarios without exposing our own credentials for our back-end systems, such as AWS or Kubernetes, because these are just Twig templates without any credentials or concrete, hard-coded parameters.

So we run the Docker container, and it runs the templating pre-processor via the Drush command. First, it reads the YAML files containing the parameter mappings, which we store on our local file system for things like credentials and secrets. This private file, a private parameters YAML file in this example, includes the credentials, because we separated the secrets and credentials out onto our local file system. It then replaces the double-bracket variables with the actual values defined in the YAML files and generates the actual feature files into the designated directory. Now we can run the behat command. In our case, we execute the run_behat.sh command, which also launches the headless Chromium virtual browser. Behat executes the BDD tests based on the synthesized feature files, and we get the output.

So now we want to automate the test process for daily automated testing. We are using a GitLab CI/CD pipeline. We had already automated our testing for TDD: thanks to the Drupal community, we can test our code on drupal.org every time we push code. But we wanted to have the same kind of automated testing system as drupal.org ourselves, because in the past, unfortunately, we faced a problem where the automated testing system was broken, and that stopped our development. So we built exactly the same kind of CI/CD pipeline as drupal.org, using a private GitLab repository in our project.
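The templating step just described can be imitated with a tiny stand-in script. This is not the project's actual Drush/Twig pre-processor; it is a minimal sketch using sed, with invented file names and values, just to show the double-bracket replacement:

```shell
# Stand-in for the Twig pre-processing step (the real project uses a
# custom Drush command with Twig; all names and values here are invented).
mkdir -p features

# A feature template with double-bracket placeholders, as published.
cat > cloud_login.feature.twig <<'EOF'
Scenario: Log in to the cloud service provider
  When I fill in "Username" with "{{ username }}"
  And I fill in "Password" with "{{ password }}"
EOF

# In the real setup these values come from a private YAML file that
# never leaves the local file system.
USERNAME='admin'
PASSWORD='s3cret'

# Replace the placeholders and emit the synthesized feature file.
sed -e "s/{{ username }}/$USERNAME/" \
    -e "s/{{ password }}/$PASSWORD/" \
    cloud_login.feature.twig > features/cloud_login.feature

cat features/cloud_login.feature
```

The behat run then picks up the synthesized files from the features directory, while only the .twig templates ever reach the public repository.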
So, by the way, here are the architecture basics of a GitLab CI/CD pipeline. We have the developers on our teams, and the GitLab CI/CD pipeline consists of two parts. One is the GitLab server, hosted by GitLab as software as a service. The other is the GitLab Runners. The GitLab server has the Git repository and the CI/CD pipeline functionality. The GitLab CI/CD pipeline system invokes GitLab Runners to process the tests. We can set up multiple GitLab Runners, as in this diagram; GitLab Runners are, in other words, workers. That way, we can run our tests in parallel, so we can schedule multiple test scenarios independently at the same time. Also, GitLab has its own container registry, like Docker Hub. When a developer pushes code to the GitLab repository, the GitLab CI/CD pipeline detects the repository state and the event, that is, that a branch was pushed, for example. The pipeline then pulls a container from the container registry as the testing environment, including the Behat test suite, deploys the container to each runner, and starts the tests in parallel. So that's the story of the CI/CD pipeline automation. And here is our actual deployment diagram, using Amazon EC2 for the GitLab Runners. The test results are also sent to our Slack channel.

So how can we build a CI/CD pipeline? It's very simple: describe the pipeline definition in one YAML file like this, then put that file in the root directory of your branch. That's it. So that's the story of our automation.

OK, so let me show some demos of Behat and the CI/CD pipeline. Let me try to show the console and log in to my testing environment. OK, so we specify the patch link from drupal.org, and then I just type the command and press Enter. What this command does is take the patch file from the drupal.org Git repository along with the latest code of our Cloud Orchestrator source, and then apply the patch to the latest head of our repository. It takes about one minute.
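For reference, a pipeline definition of the kind described would be a .gitlab-ci.yml along these lines. This is a sketch only; the job name, image, tags, and script commands are assumptions, not our actual file:

```yaml
# .gitlab-ci.yml sketch (job name, image, tags, and commands are invented).
stages:
  - test

behat-bdd:
  stage: test
  # Docker image with the Behat test client baked in.
  image: registry.example.com/cloud-orchestrator/behat-client:latest
  tags:
    - ec2-runner        # routes the job to the EC2-hosted runners
  script:
    - ./run_behat.sh    # pre-processes the templates, then runs behat
  artifacts:
    when: on_failure
    paths:
      - screenshots/    # screenshots captured on test failures
```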
Then it pushes to our private GitLab CI/CD pipeline. So now it is cloning the latest code from our drupal.org repository and then applying the patch. Let me just check the link. OK, it's hard to see from here, so let me go back to the PowerPoint slides. Sorry about that.

OK, so here are the lessons we learned. We forked the Drupal Extension and added our own custom actions for scenarios, such as login processes, taking a screenshot in case of test failures, tests for table rows, and assertions on success status messages. We implemented these because the functionality of the existing Behat extension libraries was not enough. Here are the known issues and solutions; if you are interested in handling these issues, let's discuss them offline.

In the last part of my presentation, I'll tell you where to find our solution. Basically, our scripts to automate Behat are included in our Cloud module on drupal.org, and our custom Drupal Extension is open in our GitHub repository. Also, visit our website to learn about Cloud Orchestrator, and subscribe to our YouTube channel. Please contact me if you want to use Cloud Orchestrator in your project or organization. Any development contributions are welcome at the Cloud project on drupal.org. So that's all. Thank you for listening to my presentation. Thank you. Any questions?

Oh, no, no. Yeah, we just picked Behat; we didn't have experience with the other projects. Yes, we want to do that, but we just finished the automation of our Behat testing with the desktop version. Yeah. And also, I didn't mention the web driver, the web browser. We switched from the Selenium driver to ChromeDriver, because our team had experienced some performance issues with the Selenium driver. It's very slow, and that reduced our productivity. So we use ChromeDriver; it's faster.
But, you know, as the next step, I know that the Selenium driver supports a wide variety of browsers, so not only ChromeDriver, but we also want to go back to the Selenium driver. Probably, well, this slide deck will be available on the internet, so you can download it.

I mean, so I said 120 or 140; those are actually web operations. So clicking a button, checking a checkbox; we have a lot of HTML web components, right? The text field. Most of the operations you can imagine are already predefined. And especially for Drupal: Drupal has the user, role, and permission concepts, and when we use the Drupal Extension, the extension library already defines those, for example automatically creating an anonymous user, or creating a temporary user, like SimpleTest or PHPUnit does. Drupal has these specific concepts, and we can utilize that library, so we don't have to define the PHP code or write it ourselves. If we didn't have the Drupal Extension library, we would need to write, for example, logging in to the Drupal site, or clicking here and there, or checking a checkbox or a drop-down list. Yeah, any other questions?

Yeah, yeah, yeah, so this one, right? If we don't have our custom template, we need to fill it out; you can see the blue words, which are static words, like "access denied", "add cloud service provider", "authenticated user", "cloud administrator", right? So this test scenario is basically static. Even though we can pass in parameters, the file itself is static. When we want to make our test code public on drupal.org, that's OK; it is a static test scenario. But what if we want to include credentials, like a username and password? We cannot do that. That's why we implemented our custom pre-processor with the double brackets. And this is a Twig template.
So Twig is a kind of, you know, front-end templating system, and we can put the secrets in. Yes, yes. So here again, the bottom green box is our private file, including the secrets. We give the Twig template to the pre-processor, and the pre-processor automatically generates the final feature file. Then our Behat run command takes these feature files from the specific directory. So the pre-processor is a kind of generator, a generator of feature files, synthesizing the private parameters and the Twig templates. Any other questions? OK. Thank you so much for joining my session.