I'm just going to get started here. Hello, and thank you for coming. You probably came here for the fusion of sensors and robots part, but there are actually two parts to this session. In the first part, Fukano-san will talk about the fusion of sensor and robot, which is a blueprint we have under the LF Edge Akraino umbrella. The second part is my part: I'll be talking about "Edge for Anybody", which you may have heard about in the keynote. It's an introduction to a new edge service enabling platform, a concept and blueprint for creating and supporting edge services such as the Robot Basic Architecture that Fukano-san is going to talk about. So I'll pass to Fukano-san, and I will talk when he's finished.

OK, let me start with the fusion of sensor and robot technology. Let me begin with a self-introduction. My name is Haruhisa Fukano. I work for Fujitsu on business and architecture planning in the edge computing area. I am a member of the Technical Steering Committee of the LF Edge Akraino project, and I am leading a robot blueprint project in Akraino. I would like to thank the members of our community, Professor Shimizu and Jeff. Professor Shimizu works at Ritsumeikan University and is an expert in robot hand R&D using electronic devices. Jeff Brower is CEO of Signalogic and an expert in signal and image processing and algorithms.

The objective of today's presentation is to introduce our activity on robot technology in the open community. Here is the outline of this presentation. First, I will explain what Industry 5.0 is. Second, I will explain the challenges that robots face in Industry 5.0, and also the solutions to those challenges. As solutions, I will explain the sensor-rich soft end effector system and lightweight automatic speech recognition.
I will also explain our activity in the LF Edge Akraino project to promote these solutions in the open community and to enhance them using the community's power: the CPS robot blueprint family, and the Robot Basic Architecture based on SSES blueprint. Finally, I will explain our future activities.

This slide shows the history of the industrial revolutions. As we know, Industry 1.0 started in the 1800s with mechanization using water and steam power. Industry 2.0 started in the 1900s with mass production using electrically powered assembly lines; the Ford car assembly line is a famous example in America. Industry 3.0 brought computer-automated production, such as robots. Today we are in Industry 4.0, and the cyber-physical system is the typical example: collecting data from the physical world, from robots and sensors, analyzing it in the cloud, and feeding the findings back to the real world.

And now we are at the beginning of Industry 5.0. Industry 5.0 has two key attributes. One is mass customization, and the second is human-robot collaboration. Mass customization is not only automation using robots but also customization according to consumer preference, like NIKEiD on the sneaker website. Human-robot collaboration will bring to robots the complex tasks that today only humans can do. As you can see from these key attributes, robots play an important role in realizing Industry 5.0.

So, can we use today's robots in Industry 5.0? I think we cannot. I will explain the reasons and the challenges in the next few slides. First, let me compare the requirements of today's robots and of robots in Industry 5.0. When today's robot handles an object, the object is standardized and uniform, with the same size and the same form, and the field conditions around the robot are always stable.
For example, the position of the object is always the same. Therefore the movement of the robot is always constant and the same; its program is simple, the same routine. Car factories and electronic device factories are the typical examples. On the other hand, objects in Industry 5.0 have diverse shapes, flexibility, and frictional properties, because mass customization will change the parts of the product according to consumer preference. In addition, please imagine a retail store: a retail store has many items of merchandise with diverse shapes, flexibility, and frictional properties. The field conditions are also ever-changing. The prominent example is agriculture: the ground is uneven, and the weather is always changing, with rain changing the frictional properties. So in such industries, the challenge will be autonomous handling control by the robot itself.

There are other challenges in Industry 5.0. Let me compare where and how today's robots and robots in Industry 5.0 work. Today's robot is isolated from humans for safety, behind fences like these, and once a human programs the robot, the robot plays the same routine, repeated again and again. On the other hand, a robot in Industry 5.0 will work with humans in the same field; the human gives orders to the robot, and the robot acts flexibly on those orders. So the challenge in Industry 5.0 will be rapid communication between the robot and the human.

We have solutions for these two challenges. For autonomous handling control by the robot, Ritsumeikan University develops and researches flexible robot handling for various objects in various environments, named SSES: the Sensor-rich Soft End Effector System. For the challenge of rapid communication with humans, Signalogic developed reliable, low-latency speech recognition.
From the next slide, I will explain these solutions. First, I will explain SSES for autonomous handling control. In this slide, I'd like to explain how SSES is trying to solve the challenges of various objects and various circumstances. There are two big approaches: one is enhancement of cognitive ability, and the second is a new mechanism. For the enhancement of cognitive ability, SSES developed sensor-rich technology for multi-dimensional data acquisition, like this multi-sensor module, and also uses AI/IoT technology with the force and contact information from the robot, plus IoT maintenance and inspection technology. The second approach is a new mechanism for grasping: a flexible manipulator using polymer materials, and advanced 3D printing technology to realize complex forms for the hand. You can watch a demo video on YouTube like this one, where the robot picks up washers and places them on a plate. In the video, you can see the robot autonomously determine the position of the object, grasp it, and control the grasp.

For the enhancement of cognitive ability, we have developed R-CPS as the mechanism for collecting and analyzing sensor data in SSES and providing feedback. This slide explains R-CPS. R-CPS stands for Reconstructible basic system for Cyber-Physical Systems. R-CPS enables you to construct and reconstruct various systems easily and at low cost. R-CPS has three main components: the MSM, the PDH, and the cloud. The MSM is the multi-sensor module; it has external interfaces as well as onboard sensors. Through the external interfaces, the MSM takes input from external sensors and outputs to actuators, such as robot motors and valves, to control them. It also sends data to the PDH, the IoT gateway, via Bluetooth, LPWA, or wired connections. The PDH is the physical data hub; it is the IoT gateway.
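The PDH's role of enriching MSM data on its way to the cloud could be sketched as follows. This is a minimal illustration in Python; every function and field name here is an assumption for the sake of the example, not the blueprint's actual schema.

```python
import json
import time

# Hypothetical sketch of the PDH (physical data hub) enrichment step:
# take a raw MSM sensor reading, add a timestamp, location information,
# and a unit, then format the record as JSON for the cloud side.
def enrich_reading(raw_value, sensor_id, location, unit):
    """Wrap a raw sensor value in the metadata the cloud expects."""
    record = {
        "sensor_id": sensor_id,
        "value": raw_value,
        "unit": unit,                 # e.g. "N" or "m/s", added by the gateway
        "location": location,         # added by the gateway, not the sensor
        "timestamp": time.time(),     # seconds since the epoch
    }
    return json.dumps(record)

# Example: a force reading from a multi-sensor module on the robot hand.
payload = enrich_reading(0.42, "msm-01/force", {"cell": "A3"}, "N")
```

A real PDH would also diagnose the incoming data and batch or retry uploads; this only shows the annotate-and-format step.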
The PDH has several functions: adding location information to the sensor data, adding timestamps, adding units (for example, meters per second), formatting the data sent to the cloud as JSON so the data can be used easily there, and diagnosing the data from the sensor devices. The cloud stores the data from the IoT gateway, analyzes it, and feeds the results back to the physical world. Fujitsu has developed basic processing and analysis software for the cloud, which I will explain later.

This section describes the Signalogic ASR that realizes rapid communication with humans. Signalogic developed a lightweight automatic speech recognition engine. This engine recognizes urgent safety voice commands like "stop" and "robot stop", and also operating commands like "change mode". The Signalogic ASR is lightweight: it has less than 50 watts of power consumption, needs no fan, and consumes only one Atom CPU core, Intel's small CPU. Even though it is small, it has enough vocabulary: a 20,000-word vocabulary on one motherboard. It also has a noise removal function to minimize robot background noise, such as the servo motors and the wheels. Signalogic is now working on a Roomba onboard ASR demonstration; as you can see from the pictures, this ASR can work on such a small motherboard. So the Signalogic ASR accurately recognizes voice commands in the field regardless of internet connectivity, because it runs on-premises at the edge, and it prioritizes human safety.

We are working in the open community to publish our solutions as an open software stack, to make it easy for people to implement advanced robots and to advance these solutions using the community's power. In the next few slides I would like to explain our activity in the open community. We are working on the LF Edge Akraino project. LF Edge has several projects, and Akraino is one of them.
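Before going further, the kind of fixed-vocabulary command handling described above, with safety commands like "stop" taking priority over operating commands, could be caricatured as follows. This is purely an illustrative sketch of the dispatch stage, not Signalogic's engine; the recognizer itself is out of scope, and every name here is an assumption.

```python
# Caricature of small-vocabulary voice-command dispatch on an edge device:
# a fixed set of safety and operating commands, with safety always first.
SAFETY_COMMANDS = {"stop", "robot stop"}
OPERATING_COMMANDS = {"change mode", "resume"}

def dispatch(transcript):
    """Map a recognized phrase to an action, prioritizing safety."""
    phrase = transcript.strip().lower()
    if phrase in SAFETY_COMMANDS:
        # Act immediately, locally, with no cloud round trip.
        return ("safety", "halt-motors")
    if phrase in OPERATING_COMMANDS:
        return ("operate", phrase.replace(" ", "-"))
    # Out-of-vocabulary speech is ignored rather than guessed at.
    return ("ignored", None)
```

Keeping the vocabulary small is part of what makes on-device recognition feasible within the power budget described above.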
The Akraino community creates edge computing use cases and the OSS stacks that realize them. Fujitsu, Ritsumeikan University, and Signalogic publish SSES, R-CPS, and the ASR as an OSS stack, the robot blueprint OSS stack, for everyone to use. We have already published the robot blueprint in the Akraino project; in this slide I will introduce the summary. There are many challenges and use cases in applying robots to many industries: for example, how to simplify teaching the robot, and high-level cooperation among several robots. These solutions will be combinations of elemental technologies, so various companies need to cooperate with each other. That is why we launched the CPS robot blueprint family on Akraino. Now we are focusing on one of the blueprints, Robot Basic Architecture based on SSES, as the OSS stack for robots.

This is our Robot Basic Architecture based on SSES blueprint. This blueprint provides an open software stack to collect data, analyze data, and feed back to the robot control. We already released the data collection part in Akraino Release 6. Now we are working on Akraino Release 7, which will come in the middle of December this year; in it we will add the data analysis function. With this blueprint we received Akraino's Blueprint of the Year 2022 award, and I'd like to thank the community members for their help with this award.

This is the detail of the Akraino blueprint Release 7 that we are now working on. We will enhance the data processing and data analysis functionality. Data processing and analysis depend on the use case, so we will release a basic data processing and analysis software library to realize a variety of use cases. We will also add the Signalogic ASR. As I said on the previous slide, data processing and analysis methods differ depending on the use case.
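To give a feel for what such a basic processing and analysis library covers, here is a minimal sketch, assuming NumPy. These are generic signal-processing steps for illustration only, not the blueprint's actual library; all names are assumptions.

```python
import numpy as np

def drop_standby(samples, threshold=0.05):
    """Discard near-zero samples captured while the robot is on standby."""
    samples = np.asarray(samples, dtype=float)
    return samples[np.abs(samples) > threshold]

def peak_index(samples):
    """Index of the largest absolute value in the signal (peak detection)."""
    return int(np.argmax(np.abs(samples)))

def lowpass_fft(samples, keep=8):
    """Remove noise by zeroing all but the lowest `keep` frequency bins."""
    spectrum = np.fft.rfft(samples)
    spectrum[keep:] = 0
    return np.fft.irfft(spectrum, n=len(samples))

# Example: a noisy force trace with a clear event in the middle.
t = np.linspace(0, 1, 200)
trace = np.exp(-((t - 0.5) ** 2) / 0.01) + 0.05 * np.sin(2 * np.pi * 60 * t)
smooth = lowpass_fft(trace)   # high-frequency vibration noise removed
```

Splitting data and numerical calculus would be built in the same spirit: small composable functions that each use case can assemble differently.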
We have various patterns of data; for example, we need to remove data recorded while the robot is on standby. So we provide a basic software function library that accelerates the implementation of data processing and analysis: for example, splitting data, detecting peaks, calculus, FFT, and noise removal. We will release the documentation on the Akraino wiki page, so please check it out.

In this slide I'd like to say a last word about our future activity. We will add the feedback path from the analysis results to the real world. However, there is a difficulty in the interfaces with robots and robot hands, because there are various kinds of hands depending on the use case: some hands are moved by torque motors, and some hands are moved by air pressure. There is a lot of variety, so we need to absorb the hardware differences. I think ROS 2 is a candidate solution. We are looking for blueprint collaborators, so please feel free to contact us if you are interested.

To end my presentation, I will sum up our main points. To solve the challenges of today's robots, we fuse robots and sensors, and we published this in the open community, LF Edge Akraino, to develop these solutions. We are looking for collaborators. Thank you for listening to my presentation today. I will pass to my colleague.

Hello, my name is Colin Peters. I would like to describe to you a platform that we have been working on as a concept for a few months now. The basic idea of the platform is to make it easier to design services at the edge, and including the edge, hopefully so easy that almost anyone in the world will be able to do it. Which is why this presentation is called "Edge for Anybody". So first let me briefly introduce myself. I already said my name. By the way, that picture was taken in Banff, in Canada.
Banff is not where I was born, but Canada is the right country; I wasn't born in Banff, but I am Canadian. I'm a software engineer. I've been working for Fujitsu now for about 25 years, just a little bit less than 25 years, mostly in embedded systems. A lot of it was in networking, sometimes in other areas. In the last few years I've done some work in IoT and gradually moved into this edge computing area. Of course, over the years I've used a lot of open source: Linux all the time, that kind of thing. But I've also made a couple of contributions. Way back in 1996, when I was in university, I came up with a way of building Windows executables using GCC, and that became MinGW, which some people may have heard of. I handed it off to other people and I'm no longer involved with it, but I checked, and there are still people downloading and using it, so I'm pretty proud of that. I'm also currently involved in Akraino as PTL on two projects. One is the Smart Data Transactions for Cyber-Physical Systems project, and the other is the Edge Service Enabling Platform, which is what this talk is about.

So, enough about me; let's talk about the edge. What I mean by the edge is anything that has computing power close to the end user. That can mean something like a personal computer, but usually when we talk about the edge we mean interaction with the physical world, not just through UIs but also through sensors and devices like cameras, plus ways for the system to affect the physical world, like the robots Fukano-san described. And of course today we also want to connect our systems together: we want to share data with other locations, analyze it with powerful servers in the cloud, and collect it in databases to get new insights. So the edge is usually more than just one or two computers; it's the edge connected to other systems.
There's been a lot of activity and interest in building services at the edge, for many reasons. Sometimes you want to handle events in real time and have feedback; you want to bring data into your computing systems from the physical world and process it. Today pretty much everyone carries around computing power in their pockets, always connected to communication networks, and it's also possible to put tiny computers, small, cheap, low-power computing devices and sensors, almost anywhere: in your house, on the street, in factories, in stores. So people naturally have a lot of ideas for things they want to do with this stuff. Which is great; the edge is great.

The problem is that there's almost too much stuff. It's complicated. Getting all the different moving parts to work together, from the edge up to the cloud, is really daunting. It takes expertise with many technologies, some of which are still rapidly evolving. No offense to any of the examples I've picked for this slide; they're all great technologies, but they all have learning curves, and of course there are way, way more technologies than could fit on the slide. So you might have an idea for a new service you want to put on the edge, and to get that service up and running you're probably going to have to learn about containers and Docker and VMs, maybe Kubernetes, overlay networks, database technologies, monitoring tools, orchestration, network functions; and if you have mobile technologies in there, you'll want to learn about 5G; and that's before you even get into the specific technologies for sensors, industrial control systems, robots, and whatever else you're connecting together at the edge. Or maybe you have an idea for a technology you want to supply, like a sensor or a piece of cloud infrastructure, that you think would be really good and useful at the edge for people creating new services.
But you need to put it together with the other pieces of an edge service, make sure it plays nice with everyone, and maybe have something to show people as a proof of concept, to say "this is how it works", and that's hard.

So we looked at the flow of a service through planning, design, deployment, and operation. At each stage there are interactions with the technologies that require specialized knowledge and expertise, as shown in the center of the diagram here, and at all of these points the people creating the service have to go and talk to the people creating the devices, the infrastructure, and the tools. The Edge Service Enabling Platform goes in the middle there, as shown. It contains an edge service catalog, which provides reusable and extendable design patterns for people who want to create edge services. These designs are created by combining system component definitions, which encapsulate the configuration of the components and their connections to other components, so that the deployment and operation of the design can be highly automated. The edge service creators use it as they plan, design, deploy, and operate their services; the platform makes it easy to find the service designs that match their needs, and to customize and extend those designs as necessary. On the other side, the device and infrastructure providers use the platform as a one-stop registry of all the parts needed for an edge service; they can find new use cases in the edge service catalog and include their components in the service designs as proofs of concept or demonstrations.

There are four principles that I want to go over that I think will help the Edge Service Enabling Platform address the complexity of creating edge services, and they're shown here: abstraction, interaction, automation, and community.
First, abstraction. Abstraction is one of the basic ways we address complexity: grouping parts together, or hiding unnecessary detail without losing meaning. In the Edge Service Enabling Platform we make use of abstraction by presenting edge service designs to the user at a level of complexity they can control according to their needs. For example, the picture on the left is a top-level representation of a fairly simple edge service; there are several parts shown, but it's not overwhelmingly complex, and you can look at the diagram and see the overall concept of the system. If designers want to examine the details of the system, they can open up each of the functional boxes, look inside at the specific software and hardware used in the system, and go even further to look at the way the components are connected to each other and the flow of control and data through the system.

The second concept we're using is interaction. In addition to easy-to-understand graphical interfaces, we want the system to understand the user's intent from natural-language interactions and to extract the necessary information from data sheets and specifications. We hope to have a system that can make smart inferences and suggestions while carrying on a conversation with the user. For example, the system could suggest a set of potential edge service designs based on a few sentences describing what the service creator wants the service to do. Another type of interaction: when the user modifies the service design, the system can pick up additional components that are required and suggest those components, their connections, and their placement.

Third is automation. Automation is a topic on which a lot of work has already been done, and we just want to build on what's already available and not reinvent the wheel. What the platform will do is support components in automating their own deployment, their testing, and operations like upgrades; eventually it should be possible for the platform to also support developers as they automate their components. The user interface for deploying and operating an edge service also needs to be simple. Of course, an expert can reach in for the knobs and levers they need to fiddle with, but by default things should just work. Finally, one of the basic features of an edge service is that it has a lot of technologies all working together, with a variety of tools involved, so the platform needs to coordinate these tools and provide a simple, coherent experience, making sure that all the parts of the service are working smoothly.

The last pillar of our system is the community itself. The Edge Service Enabling Platform is meant to operate as a community of component developers, service designers, and service providers, and if you refer back to the diagram shown earlier, you can see how the community of users contributes to the edge service catalog, each in their different ways. I believe open source is a great model for approaching this: it's a collaborative effort, and the platform can also be built by a community. To that end, we've started this Edge Service Enabling Platform as an Akraino blueprint. It has just moved into incubation, and you can check out our progress on the public wiki shown here. If you're interested in joining in, please contact us; we'd love to hear from you.

For a short video, there's a demonstration, just to give you a taste of how this platform is supposed to work. The video shows a service based on the Akraino blueprint that Fukano-san described earlier, so it's got the robots and the MSM multi-sensor device he was talking about, and what they do in the demo is add an additional analysis function to that service model. There are other aspects of the platform I haven't gotten into, like registering new components, adding components to existing service catalogs, and deploying services, but I picked this particular example to highlight the four pillars we described earlier. So please enjoy the video. Oh, no sound? Yeah, we're missing sound.

[Video narration:] An edge service used at a food plant. An employee will create a new edge service by adding a data analysis and feedback component to the catalog of the robot control edge service they are already using. The side menus on the platform give the user tips on what to do. The user selects "edit edge service catalogs" to edit the edge service catalogs. Here the platform provides a conceptual diagram from which the user can select the basic configuration and functionality of the edge service catalog; if you are not an expert, this conceptual diagram provides a clear picture of the requirements and structure of the service. The user selects "robotics" to search for the catalog of the robot control edge service that they are using. The platform can automatically extract the required conditions for the selected component when the user searches. The search results are displayed, and the user selects the catalog of the robot control edge service they already have in operation. The platform displays a conceptual diagram of the catalog selected by the user; users can also switch between modes and add components from conceptual diagrams. The platform anticipates what you want to do. In this demo, the user enters "I want to automate the tuning of food manufacturing robot operations to maintain the quality of manufacturing." The platform customizes suggestions and questions based on the user's input and asks the user to confirm: "I understand that you want to add a data analysis application for feedback generation to the robot control of this edge service." The platform then suggests additional components: "Adding multiple sensors and more analysis data types will help maintain and improve quality. Do you want to add these components?" The platform searches for data analysis applications and multi-sensors based on the user's answer, finds suggested components to recommend, creates a list, and displays the search results, including the recommended components. The user adds the recommended components, and the platform updates the conceptual diagram with the components selected by the user; the platform also automatically adds the additional components necessary for data analysis. The platform offers different levels in the editing mode depending on your expertise and the type of editing you are doing; in this demonstration, the user selects the component level to see the components making up the service. The user then saves the catalog as a new edge service catalog.

I'm just going to stop it there because we're out of time. Oops, to the next slide; there we go. One more note: my work has been funded by the Japanese Cabinet Office's Cross-ministerial Strategic Innovation Promotion Program, and we'd like to thank them for helping us with this work. And thank you for listening to our talks. This was my introduction to the Edge Service Enabling Platform; I hope it was interesting, and maybe some of you will join us in Akraino and work on getting this platform to its full potential.