Good evening and good morning, everybody. My name is Colin Peartree. Welcome to our second session of the Meet the Teams webinar series. I'm joined by my colleague Jackie Mori, who's the technical advisor for the Avatar XPRIZE, and we're here with four members of Team GITAI, who are out in Tokyo, Japan, actually a day ahead of us, which makes this a fun little session from the future, if you will, as it's already July 1st where they are. Before we get started with Team GITAI, I just want to give a few basic Zoom notes. As you know, you're all muted for this call, but you are able to chat with us using the chat function at the bottom of your Zoom screen. We also invite you to submit questions using the Q&A feature anytime throughout this call, and we will be hosting a brief Q&A session at the end of the webinar, so do come with questions and feel free to write in so that we can talk more with GITAI about the questions you have. As you know, GITAI is one of 77 qualified teams working on developing this avatar technology, and one of 14 teams that we currently have in Japan; Japan is a very well represented group in the Avatar XPRIZE. As you can see from the rest of the list, this is a really large global effort to bring avatar systems into existence, and we're really excited to be a part of this competition and to be bringing these avatars to life. Without further ado, I want to formally introduce Team GITAI to you, so I'm going to stop my share and turn things over to Mr. Toyotaka Kizuki, who is going to kick us off for Team GITAI. Toyotaka?

Hi everyone, I'm Toyotaka, CTO of GITAI. Nice to meet you all. This is our newest robot, the GITAI G1.

Good evening, good afternoon, good morning, everyone. My name is Yusuke Toguchi. I am director of business development at GITAI. We categorize ourselves as a space robotics company, but actually I'm the only one in the company with a space background. That's it, on to... yeah, go ahead.
Hi, my name is Ryohei Ueda. I'm the software lead at GITAI.

I'm Yuto Nakanishi. I joined the DARPA Robotics Challenge about seven years ago, so I'm really excited that I can join the XPRIZE. It's a very famous robotics competition.

And just to let you know, Ryohei and Yuto are former employees of SCHAFT. Actually, Yuto is a founder of SCHAFT.

Very cool. Ryohei, Yuto, thank you very much, it's nice to meet you all, and you too, Toyotaka. So everybody knows, we are going to be seeing some demonstrations from G1 a little bit later in this session. Yuto and Ryohei are going to play a critical role in that, so this is not the last you will see of them on this call. Now that we've met a few of our friends from GITAI, I wanted to talk more about how GITAI came to be. My understanding is that you're a relatively small space startup, but you are working on some really big projects. So can you tell us about your beginnings as a company, how it formed, and what you've been working on over the last several years?

Sure. Well, in July of 2016, GITAI was founded by our CEO, Sho Nakanishi. From his personal experience he wanted to provide a solution that could overcome the limitations of human transportation, and since he had a personal interest in robots, he thought an avatar would be that solution. When he was looking for applications, or our business domain, he found that providing avatars here on Earth is maybe a bit too costly and might not be a good business opportunity, and when he looked at other opportunities he found that space was the domain we should operate in, because there's already a big demand for transportation of people, for example to the ISS. One launch costs tens of millions of dollars, and having one person in space for even just one hour costs about 50,000 US dollars, so there's definitely a demand for a more affordable way.
So after Sho decided that space is where we're going to work, he decided we should concentrate on providing a robot that's dexterous and rigid enough to last long in space, a robot that can complete tasks and assist, or maybe even replace, people in space for certain tasks. That's how all of us gathered to realize our mission. And you mentioned our other projects: we already have a very good relationship with the Japanese space agency JAXA, we've had some contracts with them already, we've also signed a contract for collaboration with a space debris company, I won't say who, and we're also working very closely with a global automobile company that's trying to develop a lunar rover for people to go on. So yeah, we have a couple of good projects going on other than the XPRIZE.

You do, there are a number of projects going on, and it sounds like many of them are really related to space. I understand that that is actually kind of what's woven G1 into the picture for the Avatar XPRIZE. When you started off, how did your work change as you moved into the Avatar XPRIZE? You began very focused on a space program, and it sounds like most of your projects are oriented around that. When you entered the Avatar XPRIZE, did you need to shift in any way away from those projects to make sure you were on the right track for this competition?

Actually, there are various needs in the space industry, from fine manipulation of small objects to handling big objects. There's a very wide range of technical requirements in the space industry, and we're not trying to develop a robot that can only work in a specific domain. We want to scale our robotic system from low orbit to the Moon and beyond to Mars in the future, and to do that,
our robotic system needs to be general, not built for one specific purpose. The goal of the Avatar XPRIZE is to develop a very general and capable robot, so that really matched our needs. Does that make sense? So we're developing this robot for space applications, but we're always running various experiments to make our robots general and capable, and I think those experiments directly contribute to developing a capable avatar robot for the XPRIZE competition.

Yeah, that's really interesting. The general-purpose use of the avatar system is really important when we talk about the Avatar XPRIZE. We're going to be developing use cases that the entries will be tested against, but when you think about the possible applications of an avatar in the real world, there are a great many that could be done, and even when you think about how it's applied in space, there are a lot of different ways that an avatar might be used. The ISS, for example, is a highly complicated piece of machinery that's in space, so there are a lot of different things the robot is going to need to be capable of. General purpose makes perfect sense in this case as you try to develop something that is widely applicable.

So I have a question about the differences there. You want a general-purpose robot, but is the way the robot moves, the navigation system, which I assume is a wheeled navigation system right now, different when you send it to space?

What would you mean by navigation system?

Well, the movement, the mobility.

Mobility, yeah. Of course it's quite different. When we talk about the ISS, there's no gravity, but obviously this is a robot for running experiments on the ground. What we currently have in mind is a robot without a lower body, going from one bar to the next bar
using its hands, like a monkey. Arm mobility, not leg mobility.

Yeah, well, I'm imagining maybe having two more of these arms on the body, for stabilization and mobility.

Oh, interesting, so it would almost be a torso that can grab on and move, but also have two other arms that are able to continue completing tasks while it's stabilized by the other two. A unique quad-armed type of solution that I'm not sure I would think of right away, but it would allow the robot to move around in a zero-gravity situation, wouldn't it? So imagine something like a spider.

Yeah, almost, yeah.

So we are getting some questions already, but I want you guys to go ahead and finish your presentation, so some of those might get answered before we get to the question period. That's all right. I want to talk a little bit about why Japan is sort of unique in its way of developing robots. Robots are highly prevalent in Japanese culture and society. I'm curious what your thoughts are on why this is such an important part of Japanese society and culture, and how that impacts the work you do when you're creating robots.

Well, actually, that's a very good question, because in Japan the idea of having robots as part of our daily lives, or even as our friends, has been around for over 50 years. The first anime in Japan was called Atom, aka Astro Boy, and his design was actually based on Mighty Mouse, but he's a superhero who helps people. So when the idea of robots was introduced to Japan, the robot was our friend. This is a very interesting thing, but when I travel overseas to talk with other customers, I've found this cultural difference is very pronounced. For example, I heard that in France robots are regarded as more hostile things, not as something you would see helping you in everyday life, or as a friend. But in Japan,
since that idea of having robots as friends has been around for a long time, they're regarded as something very familiar in our everyday lives. On the other hand, the difficult thing about introducing robots in Japan is that, since the idea has been around so long and anime shows that robots can do just about anything, Japanese customers have very high expectations of what a robot can do. So when you say you have a robot like ours that's general purpose and that we hope can do anything, the customers expect that it does do everything, as if it completely replaces a person. But actually, the technological level of robot development is still far away from that, so we do sometimes face that difficulty of matching the expectations of the customer with the actual technological readiness of the robot.

Right, yeah, I can imagine that's challenging, and that idea of having high expectations for a robot goes beyond just Japanese culture. Even for myself, as I first started working on this competition, I had a similar expectation of the robot being able to do exactly the same things a human would be capable of, with the same level of dexterity and even efficiency at times. So when you're working with your own designs, and also working with a customer or a client who is expecting a certain level, what challenges does that present for your work, and what kinds of concessions or compromises do you need to make in your designs?

Well, actually, we haven't made any alterations or design changes to match a specific customer need. Talking to investors is a different thing from talking to the actual customer. The actual customers are basically familiar with what robots can do, and they're actually more surprised at
the level of dexterity or efficiency that our robot can actually achieve. It's just that we have a problem when we talk with our investors, because they expect to see something very sexy, the cutting edge of technology, something GITAI has that nobody else does. When we explain that our core competence is actually the integration of all this technology into a system that actually works, that's not what the investors are looking for; they want to see something new or sexy. So that's the difficulty that we face sometimes, but not with the actual customers or clients.

Sure, yeah, you bring up a really great point about integration. The Avatar XPRIZE is sort of built on that. The technologies that we actually talk about most of the time, whether it's virtual reality principles, robotics itself, the grippers, the visual systems, or the locomotion systems that are used, are not all necessarily novel technologies; many of them have been around for quite some time. So it's interesting that you have recognized such a crucial part of this competition, the integration side, as your core competency. Has that always been something that GITAI has been strong with as you've been developing your space programs and other projects?

Yes, because there are lots of technical elements inside this robot, and even if we have one very good technology, the performance of the overall robot will not rise dramatically. We've known that from our experience of developing robots. What we are careful about is that if we have even one bad element inside the overall system, that part becomes a bottleneck for the overall system. So eliminating the bottlenecks of the overall system is crucial to developing a capable robotic system, and this will not change depending on the project or work that we do.

Yeah, that's an important point, making sure there are no bottlenecks so that all things can operate
smoothly. With GITAI being a startup, does that offer you any ease of design, any ways that allow you to kind of make your own decisions as far as creating the best type of system? What advantages does being a startup offer you?

The advantage of developing a robot as a startup is that we can select any kind of technology freely. If we were in academia, we would have to pursue a certain research goal or future activity, but as a startup we can focus on the very problem at hand, on customers that really need the robot now, where we can hear actual needs from customers directly. Being able to focus on actual needs is the good point about developing robots as a startup company.

And if I could add to that: since we're not a spin-off from a certain research laboratory or anything like that, we're not tied to using a certain technology as part of our robot. We're free to use whatever we think is best for the total solution, so that gives us freedom to choose whatever we see as best practice for achieving our goal of realizing a robot system that gets the job done.

That's a great point. It's almost like you've eliminated two bottlenecks in that way: one of them being that you are not beholden to a certain type of technology that needs to be included in the system, and the other that you're able to integrate those in a way that allows for a very smooth operation of the entire system, making for perhaps a more practical and usable overall system. While we're still on the note of being a startup, pardon me, I'm going to ask a question from an attendee in the audience, just about what it was like starting this robotics company in Japan. Is there a lot of competition due to the high expectations of robotics? Was it easy to get
funded? And can you compare this to other places in Japan or other places in the world?

I joined at a later stage of the company, just a little over a year ago, so I'm not really familiar with how everything was in the beginning. But I know that our CEO was concentrated on hiring the best people that he could find. He didn't want to make any compromises; he just wanted the best, so it wasn't easy to find people that matched his criteria. But now that we have a full team with specialties in each aspect of the robot, I think we have a very strong team, especially on the technology side. And what was the latter half of your question?

Maybe just answer whether it is difficult to start a company given that expectations of robotics are so high in Japan.

Oh, that's another reason why Sho, our CEO, was concentrating on hiring the best. He didn't want anything in between; he wanted the absolute best, because of what we're trying to achieve. We do have lots of competition in Japan in terms of robotics, but we're concentrated only on space, and we provide a complete, total solution: not just the robot itself, but also the controls for it, even the infrastructure like communications. Our company develops everything that's involved in realizing a whole avatar system. That actually differentiates us from the other competitors, especially how we're concentrating only on space. We believe at this moment we're the only company in the world whose sole operation is concentrated on space robotics.

And how many people do you have on board to make this well-rounded team with all of these different technology expertises?

Right now our team is 12 people, ten full-time and two part-time employees. Other than Sho and myself, everyone else is an engineer, so the full-time engineers are just eight people.

So you're a pretty
small but nimble team, and you really clearly do have quite a well-rounded group of experts able to make that well-rounded system possible. I think the points that you just brought up, about having a full system, not just the robot that you create but also the capabilities within it, the communications as well as the control systems, are a really perfect way to move us into the demonstration of G1, if you're ready to go in that direction. I know that the audience, and I myself included, would love to see more about what G1 can do, so with that I'd like to turn things over so we can see more.

Yeah. To give you a good view of what we're showing, first we'll give you an overall view of the whole experiment, but we've also prepared two other views. One is of the operator himself. This is our control system, the manipulation system for the robot; we call it H1. This is the operator who will be controlling the robot. We also have another view of what the robot itself is seeing; this is the camera view of the robot. We'll be switching between these views to give a good idea of how our robot works.

Perfect. While the system is getting started, can you tell us about the things that are on the table, and what the task is?

What we're trying to do is lift a heavy object. This one is eight kilograms, about 18 pounds, and we want to show that this robot isn't just strong but is also very capable of handling small objects. To show that, we're doing a demo of opening a small bag. This is the kind of bag used on the ISS, a CTB, or cargo transfer bag. Astronauts are always opening these kinds of bags and taking things out, so it requires really fine manipulation. So we'll show that the robot is strong and at the same time very capable of handling small objects. And it's a
soft bag too, which makes it another difficult thing for robots to handle. Okay, let it roll.

So the robot has an eight-degree-of-freedom arm with a gripper at the end, and on the torso it has three degrees of freedom: one for going up and down, one for making a bow, and one for rotation.

Can the operator feel the weight of that eight-kilogram object?

Yes, we have two sensors at the end of the robot: a six-axis force sensor on the wrist and a three-axis force sensor on the fingertips of the gripper. Through the gripper, the operator can feel the force and even something like texture. So yes, we do have haptics on our robot too.

I have a question coming in now, while we're opening this bag, about the inverse kinematics of eight degrees of freedom. The question is, how do you solve the inverse kinematics?

Yeah, we are of course using inverse kinematics techniques, because the robot has eight degrees of freedom and the haptics device has six degrees of freedom. We're using numerical techniques such as Newton's method, along with collision checking and other methods.

It also looks like you've got at least six Intel RealSense sensors in the head of the robot. Why so many?

The RealSense sensors are there in order to see the 3D environment all at once. We also have this fisheye camera in order to see the environment all at once, because we don't want to move the head; that introduces a lot of latency from the hardware. So it's to see all around, 360 degrees.

And does the operator see any of that 3D environment that the RealSense sensors are sending back?

Not for now. The two fisheye cameras are for the operator; the RealSense sensors are for autonomy-type applications.

So the robot knows where it is.

Yeah, exactly.

Okay. And are you using ROS for your operating system?

Yeah, we do.

Okay, thank you. Those were some of the questions in the Q&A, I think.
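The team didn't go into the details of their solver on the call, so as a rough, hypothetical illustration of the kind of iterative numerical IK used for redundant arms, here is a minimal damped-least-squares sketch on a made-up planar 8-DOF arm. The link lengths, damping factor, and target point are all invented for the example; a real 3D arm with orientation constraints needs a 6-row Jacobian and joint-limit handling.

```python
import numpy as np

def fk(thetas, links):
    """Forward kinematics of a planar serial arm: end-effector (x, y)."""
    angles = np.cumsum(thetas)          # absolute angle of each link
    return np.array([np.sum(links * np.cos(angles)),
                     np.sum(links * np.sin(angles))])

def jacobian(thetas, links):
    """2xN Jacobian: joint i moves every link from i onward."""
    angles = np.cumsum(thetas)
    J = np.zeros((2, len(thetas)))
    for i in range(len(thetas)):
        J[0, i] = -np.sum(links[i:] * np.sin(angles[i:]))
        J[1, i] =  np.sum(links[i:] * np.cos(angles[i:]))
    return J

def solve_ik(target, thetas, links, damping=0.1, iters=500):
    """Damped least-squares (Levenberg-Marquardt style) IK iteration."""
    for _ in range(iters):
        err = target - fk(thetas, links)
        if np.linalg.norm(err) < 1e-8:
            break
        J = jacobian(thetas, links)
        # dq = J^T (J J^T + lambda^2 I)^-1 e : stays stable near singularities
        dq = J.T @ np.linalg.solve(J @ J.T + damping**2 * np.eye(2), err)
        thetas = thetas + dq
    return thetas

links = np.full(8, 0.3)                 # hypothetical 8-DOF arm, 0.3 m links
q = solve_ik(np.array([1.2, 0.8]), np.full(8, 0.1), links)
print(np.round(fk(q, links), 4))        # should land close to the target
```

The damping term is what makes this practical for teleoperation: near a singular pose a plain pseudo-inverse would command huge joint velocities, while the damped solve degrades gracefully.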
We're up to speed. So that was pretty impressive, what the robot just did.

Actually, we have a few more demos that we'd like to show. That was a simple combination of elements. What we have behind us is a mock-up of the ISS Kibo module, which mimics the environment and equipment that's actually in the ISS, and we'd like to show a demo of this robot conducting several tasks.

Okay, well, we'll hold the questions while you do the next few demos. While we're moving over toward this demonstration: you talked about these things mimicking what will be seen on the ISS. How closely does this resemble what an astronaut might find?

Two years ago we had a joint research contract with JAXA and we've been communicating a lot. The components that we've been using in this experimental setup are based on the actual panels that are installed in the ISS, like the switches. These are very common activities that the crew, the astronauts, do on board the ISS: flipping switches, turning knobs, pulling cables in and out, things like that. That was the first demo, turning the switches on and off. The second one is opening a MELFI; it's a kind of fridge that's used on the ISS. The environment in the ISS is made specifically for humans, not for robots, so it requires a certain dexterous capability from the robot. We do get the question a lot, are we concentrating on making humanoid robots, but actually the answer is no. The reason we have G1 designed this way is that we're concentrating mostly on getting tasks done in the International Space Station, which is designed for people, so as a consequence we have to have a dual-arm robot suited to the environment it will be used in.

In that sense, at least, yes. In the scenario in the demonstration that we're seeing now, how do the haptic systems that are in the grippers help
the operator?

Yeah, so currently we're providing feedback from the three-axis force sensors on the fingertips. Could you close up on it? Can you see it here?

Yeah, that's good, the close-up of the operator's hand. So the operator is feeling the haptic sense on the fingertips.

On the fingertips, yes.

Is that a hand device that you guys designed?

Yeah, we built everything; we designed it in-house.

For the next demonstration, the next demo is a task related to the MHU, the Mouse Habitat Unit. On the ISS there are lots of scientific experiments conducted on orbit, and this is one of the pieces of equipment that's actually used on the ISS. The reason we're demonstrating this is that it's an actual demand from JAXA. We heard that running this experiment requires lots of crew time; once this experiment starts, the majority of the crew's time is occupied by it, and if they can automate it in some way they'll be very happy. So that's why we're demonstrating how our robot is capable of actually doing some tasks on this mouse application.

When we see these blue ovals here saying left, forward, how are those being generated? Does the operator just say go and we see that, or is that a little autonomy? What is that?

For the control of the feet, the operator is pushing pedals.

Foot pedals, okay.

And we're showing that through the head-mounted display to the operator. What you're seeing on the top left is the actual screen that the operator is seeing.

Gotcha, so the ovals are to show what the feet are doing.

Yeah. And we've long been working on high-resolution transfer of the visual data. We have 2.4K-by-2.4K visual feedback, so the operator can perceive the depth of the actual environment in high resolution, which we believe contributes a lot to conducting various tasks. If you didn't have enough resolution in the robotic system, you wouldn't be able to see the screw holes
like this, or attach objects smoothly, but with high-resolution visual feedback we are able to conduct this kind of task that requires very high accuracy.

Great. I think we just answered one of the questions in the Q&A; they wanted to know the visual resolution.

Yes. And of course, the higher the resolution, the more communication traffic it causes, so we've also developed a way to transfer that information in less than 70 milliseconds with our internal latency. So the operator will have very little, what do you call it, VR sickness.

No simulation sickness. The latency will do that to you. And what is this task that he's doing, trying to take something out?

This is a mock-up of experimental equipment for comparing one-g and zero-g environments. This goes into that equipment, and when it rotates there's one g inside the equipment; when it doesn't rotate, there's zero g. It's actually a cage unit that holds the mice inside.

Oh yeah. In the JAXA experiment they provide an environment where one mouse experiences zero g and another experiences one g, and they see how differently the mice respond to the different gravity levels.

I don't suppose G1 is capable of handling a mouse, but it would be able to assist in that kind of operation overall.

Well, actually, you brought up a good point. We did show how it's capable of lifting a heavy object when it has power, but at the same time we'd like the robot to be able to handle soft objects, maybe even a mouse. We didn't prepare it today, but we recorded a demonstration once where we lifted this same dumbbell and at the same time picked up a potato chip.

Oh wow, that's right. A drastic difference in weight, in that sense.
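The figures mentioned above, 2.4K-by-2.4K visual feedback in under 70 milliseconds, help explain why GITAI built its own data-transfer pipeline. Some back-of-envelope arithmetic on the uncompressed stream shows why; note that the 30 fps frame rate, stereo pair, and 24-bit color below are assumptions for illustration, not numbers from the talk:

```python
# Raw (uncompressed) data rate of the described video feed.
width = height = 2400        # 2.4K x 2.4K per eye (from the talk)
bytes_per_pixel = 3          # assumed 24-bit RGB
eyes = 2                     # assumed stereo pair for depth perception
fps = 30                     # assumed frame rate

bytes_per_second = width * height * bytes_per_pixel * eyes * fps
gbits_per_second = bytes_per_second * 8 / 1e9
print(f"{gbits_per_second:.2f} Gbit/s uncompressed")  # 8.29 Gbit/s
```

Over eight gigabits per second of raw video is far beyond an ordinary network link, so heavy compression is unavoidable, and every millisecond the codec and network stack spend eats into the 70-millisecond latency budget they quote. That is why latency engineering matters as much as bandwidth here.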
Is the haptic system that's in the gripper able to detect something of that weight and that texture, or does it require something with heft?

As for the texture, we haven't been working on actually conveying that kind of feeling. If it's required in some kind of market, we will certainly welcome trying to do that, but for now it's mostly the operator who has to decide, through the visual system and the haptics, whether an object is fragile or very sturdy.

Yes. Well, we've done a demo using G1, but we're also working on something that doesn't have to be this size of dual-arm robot in the ISS. What we're working on is making a robot specifically for the purpose of these tasks; this one is a minimum setup for conducting the same experiments.

That's just the base, the one necessary arm that would be required to complete a lot of these tasks. I'm going to look at G1 again, because I know it's the robot you're mostly focusing on for the Avatar XPRIZE. How many iterations has it gone through to get to the stage it's at now: the look, the size, the way that it moves, the vision systems?

As a company, this is about our seventh-generation robot, but previously we developed robots of a smaller size than what's shown here, very much initial prototypes. From the sixth generation we've been making a human-size robot, and counting from the sixth generation this is the third generation of the human-size design. So since the XPRIZE started, this is the third-generation robot.

Well, you're talking about using this robot in the ISS, which is, to say the least, a delicate environment where things need to be done correctly. So can you talk about your design? How do you work safety into your considerations?

So, if we are
sending robots to the ISS, they cannot harm the environment and they cannot harm the crew. So what we're doing is applying control so that the robot acts very softly toward the environment and toward humans, so that it's very compliant against external force. We've been working on that kind of control for safety, too.

Yeah, that's a really crucial element. If the robot accidentally bumps into something, or a human is close to it, you want to have that ability to be soft. I know that the object itself is quite hard, but in that environment there are a lot of things that are critical, so that's important to consider. An easy question: how much does G1 weigh?

A single arm itself is about 13 to 14 kilograms, so the upper half of the body will be less than 50 kilograms.

Less than 100 pounds, I guess.

The lower part is heavy so that it doesn't fall over.

Sure, yeah, of course, that makes sense. And so it would be even more lightweight when it's baseless, say on the ISS, though you may add another arm into the equation so it has a little bit of stability and, as we mentioned, more mobility.

Weight is a very difficult hurdle that we have to face when we send something to space, because we're actually charged per kilo, and of course we don't have an endless capability of sending things up to space. That's a technical limitation that we always have to face when we try to send something, including this robot, to space.

Is the operator still on Earth, or are they somewhere else, in the space station?

When we're sending robots inside the ISS, we assume the operator will be on Earth. But for example, if we're sending robots to the Moon, there will be a satellite orbiting around the Moon, so the operator can be in space, controlling the robot on the Moon from lunar orbit. But of course, the farther we go, the
distance between the operator and the robot gets longer and longer, and of course we have to face a physical challenge with latency. When we start sending robots to the Moon, or even further, to Mars, we can't avoid that. So we'll definitely be providing a robot that's half autonomous, half remote-controlled, but we will always leave some room for a human to intervene in the operation of the robot itself.

Right, so there'd be some AI, predictive planning, that type of thing. Do you have plans soon to send G1 or any of your projects to the ISS? Do any of the projects you have with JAXA, for example, have that coming soon?

We have a couple of missions. The experiment with this MHU is something that we're working on with JAXA, but as for the other experiment, we're planning to launch our robotic arm to the ISS sometime next year. This is the experimental setup that we are planning to send to the ISS, and we're currently hoping to launch next May, May of 2021.

And do you have to go through a lot of safety checks on Earth before you can send this up to the space station? Because I know they're pretty risk-averse up there.

Yeah. We've already got a launch contract to the ISS, and we've already started the safety review with NASA. We've already passed the Phase 0 safety review and we're preparing to go to the next phase. So we're not just developing things that work on Earth; we're changing the materials, changing the circuits, so that we can prove it's safe for the space industry too.

Well, actually, you raise a very good question, because one of the very, very difficult things that we have to face when sending things up to space is the safety review, especially for the International Space Station, because people are there all the time. You have to make sure that it doesn't harm the environment, plus the people in that environment. The safety review is actually one of the biggest obstacles.
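On the earlier point about teleoperating from Earth, the speed of light puts a hard floor under the round-trip control latency, regardless of how good the network is. A quick calculation, using textbook distance values rather than figures from the talk:

```python
C_KM_S = 299_792.458                  # speed of light in vacuum, km/s

def round_trip_s(distance_km):
    """Minimum command-to-feedback delay over a distance, ignoring processing."""
    return 2 * distance_km / C_KM_S

MOON_KM = 384_400                     # average Earth-Moon distance
MARS_NEAR_KM = 54_600_000             # approximate closest Earth-Mars distance
MARS_FAR_KM = 401_000_000             # approximate farthest Earth-Mars distance

print(f"Moon round trip: {round_trip_s(MOON_KM):.1f} s")            # ~2.6 s
print(f"Mars (closest):  {round_trip_s(MARS_NEAR_KM) / 60:.1f} min")  # ~6 min
print(f"Mars (farthest): {round_trip_s(MARS_FAR_KM) / 60:.1f} min")   # ~45 min
```

A round trip of seconds to the Moon, and minutes to Mars, rules out the direct force-feedback loop shown in the demo, which is consistent with their description of the deep-space version as half autonomous with a human supervising rather than directly driving.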
hurdles that we have to clear when you're trying to send something up to space yeah and you can't really test it for zero g like uh uh if if this thing was floating around and it's going to bump into somebody there's it's kind of hard to test without the same uh situation here on earth exactly um there's some things that we can't test but um not exactly the same environment so we have to work on um feedback or experience from people who have been to this space station of course the gas room you plan to put a lot of velcro on it so they can go to some of the velcro pads to stabilize it when it's on the iss well we'll um secure it with um screws and things like that yeah well i can see your monkey kind of arm thing you know going to from the velcro to velcro unfortunately we haven't gone up to that um stage yet well we're just going to have a one single arm that's going to work on that's going to show some demonstrations of the capabilities that it can do just like you see here this is actually very close to the um actual configuration that we'll have on the other side so we'll conduct experiments of turning on and off switches by autonomous operation and at the same time we're going we're planning to do that with the remote control so and also we're kind of we're planning to conduct some uh in-space assembly tasks to construct a structure in space so that would require you being out in the vacuum of space so are all of your um all your robots and robotic arms capable of operating outside of the iss out in the back of your space as for this project uh we're only going to conduct the experiment inside the iss not outside got it and in our next experiment we're hoping to conduct in 2022 or 2023 we'll go outside in the space environment so you're working toward that that's pretty amazing yeah that's a wide operating conditions there i'm curious about the operation of g1 um for as far as being inside of that system being an operator maybe i'm wondering if either of you 
or maybe it's just uh ryohei who has spent the most time in that system but did it take a long time to get used to um is it pretty natural jumping into that cockpit to operate it actually the new cockpit that we've been developing is quite new we finished building that like a month ago so the operator hasn't been able to train a lot but it's kind of getting more and more intuitive and controlling the device now yeah i think it's because of the sensory feedback and the high resolution visual data that we are sending to the operator but of course you know if you want to achieve the level that our operator has right now it will take some time to get used to to the controls but uh we don't expect something like a year or something sure yeah so and as in anything you know learning a new control system in any sense is it will present its own challenges you need to get acclimated to a system but you feel like yours is something that's a little bit more natural um that somebody could easily or more easily uh jump into and so that you know that thing also what i'm noticing is that it's relatively simplistic i obviously understand that the actual operations of it requires some complex connections but the look of it just being two arms um with a seat in front of it and i believe some pedals down below um how portable is the system um portable we haven't thought of that our remote control device requires like two kinds of OS ones for the Ubuntu that's running ROS on one side and we're using Windows system to control the uh head mount display so two desktops and uh the weight of our haptic device is around 20 to 30 kilos so maybe two suitcase would be enough to carry the whole device so i mean all things considered that is actually fairly portable you know the weight is not you know too much of an encumbrance but um maybe something to think to continue to think about we don't want to build something that can only be used in our office we want to develop something that can 
actually go into the market so it has to be some if we have that kind of certain extent of uh yeah so if we we want to build something that we can actually take out anyway well we're we're running up on the uh on the hour mark for our conversation here so i'm going to begin to wrap things up but i have a question that i'm curious about um you know you have you've been doing a lot of work um separate from the avatar x prize kata has been around since you know before the inception this competition um but thinking just about the avatar x prize aside from perhaps winning the grand prize or some part of it how would you describe success for your team or for your system uh at the conclusion of the competition um well um one of the main reasons that we decided to participate is because the attention that we get so if we win the whole thing or even just staying in the final round you know we saw the example how um companies acquired attention during the lunar lunar exploits and that was very very tempting for us because you know we're very still a very small company very few people know about us but um the attention that we can get if we can at least go to the final round uh that's something that it's very uh we're looking forward to very much mm-hmm as a as a technical side we we don't want to develop a robot that can only pick one thing we want i i want to develop something that could be very general so uh to keep making experiments handling general objects x prize competition is a very uh good technique is kind of a good motivation for us to conduct various uh experiments so yeah let's be happy to be your motivation thank you yeah and we are uh obviously excited to see more about you know more from g1 more from guitar and uh to be able to see more of what comes out of uh it's a project for you guys this has been an incredible demo just to see all the capabilities and the amazing work you put into creating g1 and the other projects that that you've demonstrated today 
that we commend you for your efforts up to this point and wish you the best on months to come um any parting words uh from your team as we wrap up at the top of the hour hope everyone enjoyed our tour we're kind of really interested in seeing how other teams are going to attend the expert competition and put these soon in the next uh semi finals and uh we'll keep um developing a robot and uh introducing more videos of uh our capabilities so please keep an eye on for us check our web page on the website we look forward to that kudos today very nice presentation thank you thank you guys hope everyone enjoyed we will let's be looking forward to seeing more from your team um more videos and we'll be on the lookout for all the competing teams who are also watching be on the lookout for more from gtai um it's worth to be aware of some really interesting developments as we go forward so we are out of time today for this session on behalf of avatar x prize x prize at large and uh the rest of my team i'd like to thank toyotaka yuto ryote ryohei and yusuke for taking the time to be with us today uh for those of you who have joined us live we thank you for your time as well and if you're viewing this recording uh so hello from the past here uh it has been a pleasure speaking with you gtai and uh letting the audience see you closer look at you and g1 if you want to learn more visit gtai's website at gtai.tac it's g-i-t-a-i dot tap t-e-c-h you can check out more about what they're doing in the future you have questions for the avatar x prize feel free to email us at avatar at x prize dot org and um be on the lookout for more from us as well so as you know we're creating we've created this series to highlight our uh our competing teams the next meet the teams interview is expected to be in mid july and we'll be speaking with team santa ana from italy and uh we are really excited for that one as well so be on the lookout for more webinar information soon and mark your calendars 
So until then, we are wishing you well from Los Angeles and hoping that you're all staying healthy and well. We hope you enjoy the rest of your afternoons, evenings, and mornings, whether you're in Tokyo or elsewhere in the world, and we wish you well. Take care, everybody.