Good morning, everyone. You are joining the Meet the Teams webinar with Team Sant'Anna. This is part of the ANA Avatar XPRIZE, a $10 million competition focused on the creation of a robotic avatar system that will enable an operator to interact with a remote environment as if that person were truly there. We're meeting today with Team Sant'Anna from Pisa, Italy, one of 77 qualified teams in this competition working toward the realization of an avatar system. I want to turn things over now to the leader of Team Sant'Anna, Antonio Frisoli, who will give us an introduction to his team and the equipment they're working on. Antonio?

Hello, everyone; hello, colleagues. We are live from Scuola Sant'Anna, hosting the meeting from the PERCRO (Perceptual Robotics) Laboratory, and all the team members are here today. My name is Antonio Frisoli, and I'm the leader of this team, which gathers several researchers and professors from the Scuola together with two companies, Prensilia and Wearable Robotics. I'm a professor of robotics at Scuola Sant'Anna; we have a long tradition in the areas of exoskeletons, haptic interfaces, and teleoperation, so we believe we can bring an interesting contribution to this competition. I'd like to introduce the other members of the team, starting with Professor Marco Controzzi, from the artificial hands area. Marco, please introduce yourself. I cannot hear you; you are muted, I think.

Okay. Hello, everyone. My name is Marco Controzzi. I'm an assistant professor of robotics at the BioRobotics Institute of Scuola Sant'Anna, working in the artificial hands area. I represent the part of the system related to the end effectors, which are based on two artificial hands we have developed over the last years. We are basically focused on prosthetics.
That is, the replacement of missing body parts for people who have suffered amputation. But we are confident that these hands are also well suited to a teleoperation scenario, since they have a very extensive sensory system and strong grasping capabilities.

Thank you very much, Marco. I forgot to say, of course, that Scuola Sant'Anna is located in Pisa, in Tuscany, in central Italy. Next I'd like to introduce Francesco Porcini, who is behind the teleoperation architecture and the overall setup.

Hi, everyone. I'm Francesco Porcini, a PhD student at Scuola Superiore Sant'Anna; my expertise is the teleoperation architecture. Behind me you can see that architecture, composed of the master exoskeleton, ALEx, which is now being worn by Massimiliano, and, on the left side, our slave device, composed of two UR5 manipulators and the two hands Marco was talking about. We can give a demonstration of the movements of our setup. I'm stepping out of the camera frame so you can see the complete movement of the architecture. Massimiliano is moving the two hands with the movements of his own arms, and you can see the human-like motion of the UR5 manipulators on the right and on the left. He is also controlling the closing of the hands through the pressure of a handle on the arm of the exoskeleton. In this configuration our master device is a seven-degree-of-freedom exoskeleton, which also allows him to feel force feedback from touch in the remote environment. At the moment Massimiliano is showing just basic movements; this is our most basic setup, and we plan to extend it over the course of the competition. Massimiliano can show tasks like catching a ball and passing it from one hand to the other.
This task is particularly complicated in this setup because Massimiliano cannot see the ball directly: we are not using the avatar's camera, so he sees it only from the side, and the UR5 partially covers the ball. He's now completing the task. There, he's completed it.

That was actually pretty impressive, especially the amount of grip strength it looked like it had just on the fingertips. The grip of the Mia hand on the right-hand side was very strong during the transferral, and the transferral was very interesting because it happened really just at the tips of the fingers; I wasn't sure it would hold or not. It was a really interesting demonstration of the actual strength of the fingertips.

I can also say that this is not our best setup, because it does not control all the fingers independently but as one, just by pressing the handle. Later, Daniele will demonstrate how we can control the fingers one by one with our exoskeleton, and also how to feel the pressure finger by finger. So this is just a basic setup, our most basic architecture. Of course, this is only a demonstration of the basic functionality we have available now, on top of which we are building the avatar robot for the competition; it's a demonstration of some of the components.

Francesco, were you going to mention something? No, nothing; just that in this architecture we have only basic movements as commands, and that this task is particularly complicated by the fact that we cannot see the environment completely, because we are not using the camera.

Colin, I'd like to introduce the fourth member of our team.
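As an aside, the single-handle grip control shown in this first demo (handle squeeze pressure mapped to one closure command sent to all fingers together) could be sketched roughly as follows. The function name and both thresholds are illustrative assumptions, not the team's actual implementation.

```python
def handle_to_closure(pressure_n, p_contact=0.5, p_full=8.0):
    """Map handle squeeze pressure [N] to a single closure command in
    [0, 1] that drives all fingers at once, as in the basic setup shown.
    Below p_contact the hand stays open; at p_full it is fully closed.
    Both thresholds are illustrative, not calibrated values."""
    u = (pressure_n - p_contact) / (p_full - p_contact)
    return max(0.0, min(1.0, u))
```

A light squeeze then produces a partial closure (for example, `handle_to_closure(4.0)` gives roughly 0.47), while anything below the contact threshold leaves the hand open.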
Then we will see in detail all the demos that represent the contribution of the students of Scuola Sant'Anna. Our Scuola has a special status: students are deeply involved in laboratory activities, so we have had the opportunity to involve a lot of them in this competition. Andrea Boscolo Camilletto is connected from the first floor of the lab, together with Luca.

Can you hear me? Yes, we can hear you. Here I am. I'm a master's student in robotics here at Scuola Sant'Anna, and I'm leading a team of about 20 to 25 students working on this project. I'm here with Luca, and we are going to show you the mobile part of the robot, an omnidirectional wheeled base, which we think is the best solution for the tasks we are aiming at: it can rotate without translating and translate without rotating. So it's the best fit for the tasks we are going to tackle in the coming months. Here you can see the rotation-only motion, and maybe later we can show some more specific tasks we can achieve with it. Right now we are controlling it with a joystick; in a few months we are going to implement SLAM-based navigation and all the rest. I think that's enough as an introduction.

Great, thank you. It looks like that robotic base is moving really smoothly; well done on that build. That's something you built custom, is that correct? Yes, it's fully custom; we made it from scratch. We built all the parts and chose the motors, the gearboxes, the frame, and all the peripheral parts, so it's a brand-new robot. Congratulations, it looks really great and smooth. We're obviously looking forward to what you mentioned: fully integrating it with the complete telerobot system we saw on Daniele's screen, putting that onto the mobile base, and creating the mobile avatar.
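The decoupled translate-and-rotate control that makes the base omnidirectional can be sketched as a joystick-to-velocity mapping. The axis names, velocity limits, and deadzone value below are illustrative assumptions, not the students' actual code.

```python
import math

def joystick_to_twist(tx, ty, rz, v_max=1.0, w_max=1.5, deadzone=0.1):
    """Map normalized stick axes in [-1, 1] to a base velocity command
    (vx, vy in m/s; wz in rad/s). Translation and rotation are mapped
    independently, so the base can rotate without translating and
    translate without rotating. All limits are illustrative."""
    def shape(axis, limit):
        if abs(axis) < deadzone:  # reject stick drift near zero
            return 0.0
        # rescale so the output is continuous at the deadzone edge
        scaled = (abs(axis) - deadzone) / (1.0 - deadzone)
        return math.copysign(scaled * limit, axis)
    return shape(tx, v_max), shape(ty, v_max), shape(rz, w_max)
```

Full deflection of the rotation axis alone yields (0.0, 0.0, 1.5): pure rotation in place, which is exactly the rotation-only motion shown in the demo.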
Thank you for those introductions and those brief demonstrations; we're going to come back to them in a little bit and see a little more of each component. First I want to get some background from Antonio, and the rest of the team is welcome to chime in, about the story of Sant'Anna. My understanding is that the school comes from a long lineage of really strong robotics. Antonio, can you tell us more about Sant'Anna and its programs?

Yes. In Italy, the Sant'Anna school has a special university status; we say it is a school of excellence, because we select top-talent undergraduate students, and we have faculties in the applied sciences: engineering and medicine, and, for the social sciences and humanities, law, political science, and economics. In engineering the two strong areas are robotics and photonics and communications, and more broadly we also have computer science and the embedded systems side. We think this challenge has been able to involve many parts of the Scuola, and we also consider it an opportunity to work together toward a common demonstrator of our abilities, because it touches telematics for the remote communication, robotics of course, and all the real-time control and computer science parts. The Scuola also runs graduate and PhD programs, and a good number of PhD students, like Francesco, are involved in this competition.

Right, thank you. So you have students at different levels, graduate and undergraduate, and you're also working with faculty. Are there other crossovers from department to department, or is your team working on this avatar project in just one area of the school?
Probably I was not clear: yes, it already is. I am acting as coordinator, but, as I was saying, Marco represents a different area, the artificial hands, which comes from the BioRobotics Institute; Marco will tell a little of the story of the bionics and biorobotics research, in which we have a long tradition, and this project is a spin-off of that technology into a different domain. As I said, we also have other faculty in the areas of telecommunications and computer science, while I represent more the mechanical engineering and hardware part. So we are representing more or less four or five different laboratories in the Scuola, plus some companies that are involved as well.

In particular, our laboratory, PERCRO, the Perceptual Robotics laboratory, has a long tradition in the areas of telepresence, haptic interfaces (that is, haptic feedback), and virtual reality. The exoskeleton we are using comes from research in which we developed several upper-limb exoskeletons, and we like to use this technology to provide haptic feedback to the operator. Marco can probably add something about the bionics and biorobotics research.

Yes, I represent the BioRobotics Institute, as I said at the beginning of my introduction. At the BioRobotics Institute we basically develop robotics for humans, meaning robotics for people and prosthetic systems.
These are bionic systems that interface the human body with the machine, and I'm part of the artificial hands area; we have been developing hands for some 20 years. Here we also represent the spin-off company Prensilia, which has commercialized our prototypes since 2009. Today we are presenting two hands that are very new. The Azzurra hand, on the left side of our system, is a tendon-driven hand with intrinsic actuation, meaning that all the functional components are inside the hand itself, and its size is quite similar to that of the human hand. Being tendon-driven it is quite bio-inspired in this sense, and it is underactuated, meaning there are fewer actuators than articulated joints in the fingers. As you appreciated during the demo, Colin, the fingertips are modeled on human ones: they are compliant fingertips with an inner bone, which gives a very wide contact area on the object, so you can hold objects using very low forces at the fingertips. And there are sensors in the fingertips, dual-cantilever strain-gauge-based sensors, so you can sense the force along the normal and tangential axes. That is the Azzurra hand.

On the right side of the system we have the brand-new Mia hand, which looks different from the Azzurra hand because we worked a lot on its cosmesis. It is a waterproof hand: a glove covers all the functional components inside. It is lighter than the Azzurra hand and more powerful: it can grasp an object with a force of 70 newtons, which is a large force compared to the other robotic hands currently available. Still, even though its size is very compact, it can perform different grasp types: we have one motor that allows the independent flexion of the index finger, plus the abduction of the thumb, and this lets us switch between different grasps. For instance, this is the lateral grasp, useful for grasping very thin objects; here is how we switch to another grasp, the pinch grasp, which can be used for very small objects and precision manipulation; and there is the cylindrical grasp, the power grasp, which can hold larger objects. This hand also comes with sensors inside: tactile sensors in the index, middle finger, and thumb that sense the normal and tangential forces, which we will use in a later demo to show you the haptic system we are going to employ. That's mostly everything regarding the hands.

Great, thank you for that demonstration. It's really helpful to get a walkthrough of the three different types of grips built into that hand. The waterproofing is also an interesting feature I'm not too familiar with: all the components are sealed in a way that protects them from the elements, to ensure the hand keeps functioning at all times.

Basically, the electronics and mechanical components are inside, and the glove provides the protection from water: there is a sort of O-ring here that seals the whole hand against water and splashes from outside.

That's a really critical design consideration. You can imagine that an avatar might truly be in any environment; it could be exposed to harmful elements that would inhibit the functioning of the system, so having the hand, which is essential to completing a lot of tasks, protected allows it to keep functioning regardless of location. It's really interesting.

We discovered this requirement from prosthetic users: they would like to use
the hand in everyday scenarios: you want to wash the dishes, you want to wash your hands, you want to use the hand in all kinds of situations. So this is something we can carry over from the prosthetics field to the teleoperation field.

Yeah, that's a great source of inspiration for this type of design, because it's a truly human use case. Prosthetics, which are clearly about using a device in an everyday human way, are a good analog for moving into the avatar space, a multi-purpose robot driven by a human, so I think that's a really important perspective when considering how to construct your hands. My understanding is that you also have a robotic gripper; the last time we spoke, I saw a gripper attached to that right arm. Can you tell me more about it and when it might be used?

I think the idea now is to show the functionality of the haptic feedback. Daniele can show it, and we can probably get a close-up on the system; it is easier to understand how it works up close. The idea is that Daniele will show an example of the wearable haptics and fingertip sensors we'd like to exploit. Daniele, can you explain? Can you hear me?
Here we have a very sensitive force sensor, the blue one you see here. You can see on the monitor that even when I am barely touching the sensor, it gives a very stable pressure signal, and we are going to use this signal to provide feedback to the user about any tiny contact force that occurs when we manipulate objects, especially in fine manipulation. For that we are using this wearable haptic device, based on a voice-coil actuator: the moving part of the actuator comes into contact with my fingertip, and it has to be a linear, wide-bandwidth actuator in order to provide good haptic feedback. We believe this is critical in fine manipulation: we have to sense how we grasp objects and be sensitive to contact onsets and to slips of the object in the hand. I'm showing this setup on the desktop; we are going to integrate it into the final setup, using also the sensors already embedded in the Azzurra hand, the left hand you've seen before. If you have any questions, I'm here.

That's a great point about the sensitivity of the sensor; thank you for that demonstration. Those of you watching this webinar, feel free to type in your questions, and we'll be happy to read them live and put them to Team Sant'Anna for some live answers. You mentioned that one of these sensors is already in the Azzurra hand. Are there plans to integrate more of them, a full array of fingertip sensing in the robotic hands, or just on the gripper?

With the Azzurra hand we are going to implement more; right now it's on one finger, but we can distribute them further. And on the operator side we are going to use this wearable device, as you have seen the operator using right now; it should be distributed on different fingers, at least thumb, index, and middle, so at least three.

That's a really important integration, making sure that the operator is able to sense whatever the object is. You
mentioned having the right amount of pressure for an object. Let's take the example of that cup you just had: the user would presumably be able to feel that pressure in such a way that they wouldn't crush the cup, but would hold it with the right amount of tension. Is that correct?

Yes, precisely; the contact pressure especially is very important. I can show you, if you are able to see the graph on the screen as well. Here, the tiny pressure on the cup is enough for the signal to be read by the sensor and rendered to the user, so at this moment the actuator is pressing on my finger; if I remove the cup, you see the small change. Even those tiny forces are very important to render to the operator, especially the contact pressure, but then also the modulation of the pressure for more complex grasping.

Just out of curiosity, how similar would you say the sensation is for you? Obviously the tactile feel of a cup is going to be different from your own fingertips, but the actual pressure you feel back from the device, how similar would you say it is to actually holding a cup?
There is of course a huge set of cues and sensations, and we are providing just one of them, the pressure. But, that said, it is very similar: the activation here, thanks to the linearity of the sensor and of the actuator, is comparable to real grasping. Of course this is just one of the cues you could provide to the user, just the contact and the pressure; there are several others that it would be great to integrate into one device. There really is a huge set of different haptic cues that could be integrated, and we are exploring which ones are the most critical to feel in order to teleoperate and to provide informative haptic feedback: not only realistic, but informative, in order to accomplish manipulation tasks.

Of course, there are a lot of ways we sense that we are gripping or holding something, so it seems that what you have devised so far is enough to confirm to the user's brain that "I am gripping something": even though the tactile feel may not match, there is something indicating that you are holding and manipulating an object. Will you have more than one of those on the operator side?
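Before moving to the operator-side devices, the contact-rendering loop Daniele demonstrated (a fingertip pressure reading driving the voice-coil actuator) could be sketched as below. The thresholds and the linear law are illustrative assumptions; the team has not disclosed its actual mapping.

```python
def render_contact(pressure_n, p_min=0.02, p_max=5.0):
    """Map a fingertip pressure reading [N] to a normalized voice-coil
    drive in [0, 1]. A small threshold p_min rejects sensor noise, so
    only real contact is rendered to the operator's fingertip; above it
    the drive grows linearly and saturates at p_max. All values are
    illustrative, not the sensor's real calibration."""
    if pressure_n < p_min:
        return 0.0
    return min((pressure_n - p_min) / (p_max - p_min), 1.0)
```

Even a very light touch above the noise floor then produces a nonzero drive, which is what lets the operator feel the tiny contact transients Daniele showed on the monitor.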
Yeah, at the moment we are just using a handle that is sensorized in pressure, so we have no finger feedback yet with the exoskeleton you have seen before. We have several devices we can use: here, for example, we have an exoskeleton for the hand that can be worn like this. However, we are going to integrate those haptic devices into something more compliant than the rest of the exoskeleton. We already have different technologies for this: we are developing a new kind of hand tracker, and we are going to integrate it so that it matches the existing exoskeleton as well as possible. So there will be a new device to integrate all of this with the exoskeleton.

Just to share our vision: we want to emphasize that the quality of haptic perception is also relevant to the experience. Integrating all the technologies while providing an ergonomic, natural experience to the human is the challenge, trying to simplify the whole interface at the level of user experience.

That's a good balance to find: providing an experience that feels actual and real, enough to operate in a remote capacity, while also making it simple enough to be operable, is a really important thing to consider. As we've seen so far, a lot of components go into an avatar system, including the haptic technologies and all the basic electronic hardware that needs to be protected. With that said, avatar systems are a very complex integration of many technologies. What would you say, Antonio, are some of the most important components that will make a successful robotic system like this?

It depends on the side we evaluate from. As I was saying before, from the perspective of the user who is remotely transferred, embodied into the avatar, we think that
while vision technologies, you know, are at a very high level of development, it is of course important to also provide haptic and auditory feedback, which are a little behind; as for smell rendering, we consider that the last feature to integrate into our architecture. On the other side, from the point of view of people interacting with the robot, this is also a very relevant aspect, because this is a collaborative robot. As Marco was showing, for instance, the hands are very relevant: the robot should manipulate and interact with everyday human objects, so they should have a shape capable of that and be perceived as human-like by other people. There is also the idea of having sensors on the robot: we are thinking of integrating robotic skins. We have other relevant people in the group, like Professor Oddo, who is on our team and is developing robotic skins, so that in principle you could also touch the robot to transfer a feeling, or stop the robot if it is making an unwanted movement, transferring information to the remote user using the body itself as a channel.

For the mobility part, we are currently relying on a wheeled mobile base, and Luca and Andrea can show some more of its features in a moment. Of course this is a limitation, because we cannot move in some conditions, such as stairs or uneven terrain, but on the other hand it is very simple for the remote operator to control, and it is highly maneuverable, since the solution we are relying on has three degrees of freedom, so it can both translate and rotate. Having all of these possibilities will simplify the task, in particular when you have to orient the body of the avatar with respect to, say, a table, a person, or a part of the environment. And of course you should imagine that we will have a cover mounted on top of the base
and the whole upper-body trunk that is now displayed separately.

I have two questions for you on that. First of all, how are you controlling the wheeled base here? Do you want to integrate the SLAM navigation you talked about with the operator controls?

Yes. Our first approach will be that the operator can take full control, and in this case he might use pedals; we already have projects in which we used pedals of this sort to control the motion. On the other side, the operator can have some high-level functions: he can select, for instance, a location in the room, and the robot will reach that place autonomously. And of course the onboard sensors can be used to keep the safety level high, so that if you command a position where there is a risk of colliding with a human or some other object, the artificial vision sensors will prevent it.

That's great. I also have another question. You've got a lot of human qualities in your system when it all comes together with the hands, and you've also said you're going to put the upper-body trunk on the wheeled base. What plans do you have for a face, or something the operator can control to share with the recipient on the other end?

That's an interesting question. Our group of students has been working on the design aspects of the final avatar, so we now have a nice rendering of a possible human-like avatar, and at this stage we'd like the robot to have some human-like face shape, so we will have cameras integrated inside a specially designed head. Andrea can probably add something, because he has been part of that group's work; Andrea and Luca may have something on these design aspects.

Yeah, I think the face part is not our top priority right now, but we have a prototype on the computer: we drew a CAD model of the head. We will wait on that until we have a full prototype with the trunk and we have
full maneuverability and control of the robot. When everything is working as we want, I think we'll implement the head, but right now I don't think it's our top priority, and I don't think it should be. I don't know if Luca wants to add something about that.

Thank you. I would like to highlight that, yes, mobile robots are probably not similar to a human; it's not what we would expect from an avatar. But at this stage of technology it is probably a reliable solution, and we have all the movements for the user: from the user interface we can move the trunk in any direction and orient it, and these movements are independent, so we are satisfied with this choice.

Thank you. I'm going to switch the view back over to that robot. Could you show us a few more of the base's capabilities, and also maybe talk us through how its control might be integrated into the full operator system? Would you plan to use the same joystick mechanism, or is there something else you might integrate to ensure that the single operator system can also move the base?

As you can see, right now I'm using this joystick: with the left stick I can move the mobile robot right, left, forward, back, or in any direction, I can rotate it, and I can combine these two commands as I wish. On the operator interface you probably cannot use a joystick, because your hands are already doing something else, so we are going to use a pedalboard. I think the best solution would be a pedalboard with a two-axis pad for the translation and another pad for the rotation, maybe also to reorient the camera. So we could use four axes: three for moving the base, and another one, maybe, to reorient the vertical angle of the camera. I think we will go for this one.

Yeah, it's a great idea to have the pedals there, because we're used to being able to manipulate pedals in our automobiles and
other things; given that your hands are busy, it's a very good way to implement the navigation, so it's a great control device.

Yeah, I think that's our best solution, because right now the game controller is here only because it was the first thing we were able to use to make the robot move. In the following days we are going to try to control it with the pedalboard, and we think we will go all the way with that.

Thank you both for that demonstration. It sounds like you have two distinct components at this point: the mobile base, and the telerobot system we've already seen today, with the two arms and the two robotic hands connected to it, and those two components are coming together to give you a mobile avatar. I'm curious: in the demonstrations we've seen so far, you're sitting right next to the robotic system, with Daniele, I believe, very close to it. Have you tested it from a distance? How far away are you able to operate this system?

I don't know whether Francesco wants to answer this, because he is really the one working on it, developing algorithms that deal with time delay. We have already tested, not this architecture but a similar one, at long distance; well, not kilometers, but some distance, and some of the issues related to unreliable data transmission required special algorithms for preserving passivity in the energy exchange. Francesco can probably say something about that.

Yes; can you see me, can you hear me? Okay. As Antonio was saying, I'm working on the stability and the performance of the teleoperation architecture, also in terms of time delay. As you were saying, one of the main problems is the distance between the operator side and the slave robot. We actually tested our architecture with this same slave in the CENTAURO project, in which we also had quite a large distance between the two platforms, and it was stable even without
passivating the architecture. To guarantee stability we intend to implement the time-domain passivity approach, to handle all kinds of delays in the architecture. This is a priority for us, because the stability must be guaranteed in all situations. The time-domain passivity approach is one of the most common techniques used to guarantee stability, because of its simplicity: you don't need a model of the architecture, so it is simple to implement and very effective. But I'm actually working on the performance too, because, as is well known, this approach guarantees stability at the cost of degrading the performance of the manipulator, so we are now implementing new techniques to preserve the performance of the architecture while still guaranteeing stability.

So it starts with that stable base of connectivity between the operator and the robot system, and you have a clear path toward ensuring what we are seeing today.

It also depends on the channel over which we are communicating. Right now we are communicating over cables with the UDP protocol, so it is a stable, very fast communication, and we also measured the time delay that is due to the UR5 architecture itself; as you saw, the architecture is stable even in the presence of the time delay inside the controller of the UR5.

You can probably show that you can also interact with the robot, since you are there. Maybe someone else should, because I have a short cable; this is the maximum distance I can reach from the robot, I'm sorry. Maybe Marco can have some interaction with the hand; I can try it too. You can see him from this camera. As you can see, we also have some interaction with the user, and it is safe interaction, absolutely safe: the hand is just pressing on
The UR5 is shaking, but of course Marco doesn't feel pain, or at least he's not showing it. The other thing is that you can pass an object, so the robot can give you a cup of tea, or a bowl in this case, and it can be reciprocal, so you can easily imagine that in a final setup humans will be interacting easily with the robot. Yeah, safety is a really important consideration as well, and I saw, even while Francesco was speaking, Antonio, that you were there shaking hands with the robot, and obviously comfortably doing that. That's an important thing. We talk about human interaction being a really core component of the Avatar XPrize, and having those safety measures in place that allow hand-to-hand contact, especially at the handshake level, is really important to making a human-to-human connection, which is something we're striving for in the Avatar XPrize. Yeah, for instance, in the hand, as you can see here, the fingers are underactuated, so if you squeeze the robot's hand you can close it without damaging it, and you can gently conform the fingers to different shapes, and there is a safety feature in that. Yeah, having those safety features is very important. So I believe it's Daniella who's operating this; I'm going to switch back to that view. Tell us a little bit more about what that feels like on your end. There's no feedback system currently that enables you to feel that interaction, but that is something that's planned, is that right? You mean the master operator? It is Massimiliano, whom we didn't introduce, I'm sorry. No, no, no problem. He has been in the whole project doing the operator role, so he's a more experienced person, and Massimiliano is a professor in mechanical engineering in our lab. I don't know if you can try to talk, Massimiliano; I don't know whether we can give you audio, probably not, but someone can, I can relay the question to
Massimiliano. Okay, very good. You watch that you are moving the robot as well, and, yeah, I don't know, you probably didn't get the question. So Connie was asking how it feels, because of course right now you don't have the feeling at the level of the fingertips, you just have the handles, so can you say something about your user experience? Okay, well, of course, without the hand exoskeleton, with only the handle in this simplified version, I can't have the feedback on the fingers, so I'm adjusting everything from training and by looking at what actually happens, while with the arm I have the force feedback. In the final version, with the hand exoskeleton, I will be able to perceive the force that the slave is applying to the object, and then the object's constraints in the remote environment can help me make the right movement, can help guide my movement, so I won't have to rely only on the visualization but also on the physical sensation. Everything becomes more natural, and this is what we always do in normal life: we manipulate objects by touch, not only by looking at what we are doing with our hands. Yeah, thank you for that. I'm also noticing that your control mechanism is just a post that you're holding. Is that something that's receiving tension as you grip it, to make sure the hand closes? Is that how it functions? Did you get the question? Here, another microphone. So we can switch between different grasp postures, for instance by a long press of the button. So in theory we can use one handle and one input just to open and close, and by using a different strategy we can switch between different postures. Does that answer the question? Great. Yeah, we just had a question come in that's sort of related to that: using that bar-shaped hand controller, how many different hand configurations are possible?
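The one-input scheme just described, where a short press toggles the current grasp open or closed and a long press cycles through the programmed postures, can be sketched as a small state machine. The posture names and the long-press threshold below are illustrative assumptions, not the team's actual firmware:

```python
# Hypothetical sketch of the single-input hand controller described above:
# a short press opens/closes the current grasp, a long press cycles through
# the grasp postures programmed in the hand. Posture names and the 0.8 s
# threshold are illustrative assumptions, not the team's actual values.

LONG_PRESS_S = 0.8
POSTURES = ["cylindrical", "lateral", "pinch"]  # the three loaded grasps

class HandController:
    def __init__(self):
        self.posture_idx = 0   # start in the first programmed posture
        self.closed = False    # hand starts open

    def on_press(self, duration_s: float) -> str:
        """Handle one button press and report the resulting command."""
        if duration_s >= LONG_PRESS_S:
            # Long press: advance to the next programmed posture
            self.posture_idx = (self.posture_idx + 1) % len(POSTURES)
            return f"switch to {POSTURES[self.posture_idx]}"
        # Short press: toggle open/close of the current grasp
        self.closed = not self.closed
        action = "close" if self.closed else "open"
        return f"{action} {POSTURES[self.posture_idx]}"
```

For example, a 0.2 s press closes the current cylindrical grasp, and a subsequent 1.0 s press switches the hand to the lateral posture.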
Yeah, so if I got the question, with the bar-shaped controller it's just one input: with this single input you can open and close, and then, for instance, by a long press of the button, you can switch between different grasp postures. I see, so there's not only the handle but also a button that can switch between the postures programmed into the hand. So how many different postures are possible with that controller system? Yeah, we can have three different grasp postures, meaning that we can use a sort of sequence in which we switch between the different grasp postures, so we can even use multiple grasp configurations by switching among the different ones that are available. Mm-hmm, yeah, so you said the Mia hand has three; we saw the pinching, the cylindrical, and there's also kind of a tighter cylindrical. Lateral and pinch, and actually there are other grasp postures, for instance pointing down, pointing up, different grasp postures that can be programmed inside the hand; right now we just loaded three grasp postures. I see, yeah, so there are more that it's capable of, but right now you use three. Yeah. Great, thank you. Just a reminder that that question came from the audience; with a few minutes left in the webinar, if anybody else has questions for Team Santa Ana about their technology or their story, feel free to type them into the Q&A and they'd be happy to answer. Antonio, I do have a question for you as the team coordinator, or leader. Thinking about the next stages of this competition, and knowing that the Scuola Santa Ana is a place that has done a lot of robotics work in the past, when I think of the Avatar XPrize, what would you consider, how would you describe success for your team at the end of the competition in a couple of years? Sorry, your question is what we consider a measure of success for our team? Mm-hmm. Of course, for us, already in the past
the dream was a telepresence system where you can really feel embodied, like the movies show, so you feel truly embodied in this avatar. So first, we will be happy if we can achieve the full integration of this capability, and this would already be a very, very nice result for us. Then, as a second point, we would also like to exploit this to spread the concept of telepresence to a wider community. Probably now, during the COVID emergency, there has been more interest in these telepresence technologies, but until now their potential, to me, has been underestimated. So I think the second challenge for us is to spread the relevance of this technology to a wider community and public audience, and of course to make it available outside the laboratory as technology that can actually be used. Of course, this technology in its current configuration is still expensive, but after the proof of concept we will also know how to proceed in order to exploit this technology for more widespread use, that is, how to simplify it. One example is wearable haptics: using fingertip devices is a way to simplify the level of haptic feedback with a more reliable technology, while on artificial hands our group, led by Marco, has already taken many steps to simplify and to reach a generation of hands that can already be used by amputees, because they are reliable, cost-affordable, and so on. So cost affordability is a challenge we aim to meet as well. So you have a number of goals that you are working toward, within and beyond the Avatar XPrize. Those are all really excellent ways to apply this technology: driving down cost and making sure these systems are available to the people who need them down the road. We really appreciate your thoughts on that, and obviously the goals the school is driving toward align well with what is going on with the Avatar XPrize, so thank you very much for that response. Well, everyone, we have
just about run out of time for today's session. So, on behalf of the Avatar XPrize, I would like to thank all of Team Santa Ana for taking time today to join us and give us a closer look at their technologies and what their team has been working on. It's really been a pleasure speaking with Antonio and the full team at Santa Ana. So, Antonio, before we sign off, is there anything you would like to add? I just wish to thank you all for the initiative and also for today's call; it's a nice way to be live and share the experience with other people, so we say goodbye from the whole team. Great, thank you very much, Antonio, and to the rest of the team at Santa Ana. It's been great to see your tech demonstrations and to hear you talk about the work you've been putting into the build of this avatar system, from the mobile base all the way up to the full robot and the dexterous hands you've constructed. So thank you again. For everyone listening, this has been the third installment in the series of Meet the Teams webinar interviews. If you have questions about these interviews, you're welcome to email us at avatar@xprize.org. If you want to learn more about Team Santa Ana, you can visit their website at santannapisa.it, and you can visit avatar.xprize.org to learn more about our competition and to view the list of qualified teams. The next installment of this series is coming up at the end of July, on the 30th, when we'll be speaking to Team Avatar Quest, who are based on the West Coast here in the United States, so be on the lookout for more information soon and mark your calendars. Until then, we are wishing you well from Los Angeles; we hope that you have enjoyed today's session and that you're staying healthy and safe in this time. So, until we meet again, enjoy the rest of your afternoons, evenings, and mornings, and take care. Thank you.