Then we can continue with the last talk, or the last formal talk, of this summer school. And now we go even one step further; it's almost the final step that you can take with VPH technologies and with modeling, combining it with information, data, and data analysis. And that's towards using it, for example, for surgery planning, for doing the experiments. It's my great pleasure to introduce Miguel Ángel González Ballester, who is also working here at UPF. He's also an ICREA research professor here in Catalonia, and he's working together with us across the whole set of groups, where Oscar and I focus mostly on the cardiovascular application and Miguel Ángel does all the rest, and does it much better than us. So we are looking forward to hearing your talk. Thank you very much. So I always joke that there is a very strong group on brain analysis here at the university, and a very strong group on heart analysis here with Oscar and Bart, and I'm the person without the brain and without the heart who works on the rest. Which is not true, because actually I also do the neuro things and the cardiac things. When I looked a bit at the program of the summer school, which I think is great, and we had some really, really good speakers, I tried to think about how to focus my talk, because we've seen a lot of presentations on imaging, on image processing, on modeling for understanding the basic workings of physiology, and on modeling for patient-specific diagnosis and prediction. But I want to give a bit of the complementary view: okay, once you've done all this analysis and you want to treat the patient, what sort of technologies or what sort of approaches can we use? Also, instead of focusing the presentation on talking about my research, I structured it more like a lecture.
Okay, so we will go through some of the basic things that I think are fundamentally different from what we've seen up to now, and maybe focus on three main things. One is navigation technology: things that you can implement to help or guide the surgeon in the operating room, to reproduce the plan that he or she has made prior to the surgery. The other is intraoperative imaging, which can help you complement the information that you have during the intervention. And the other is surgical robotics. So there will be two main take-home messages in what you will see. One is the basic idea of computer-assisted surgery, its basic elements, etc. And the other is the need to think out of the box, which I think we've seen in the previous presentations as well. When you go into a lab, there is always a tendency to continue the work, or the lines of work, that are already in the lab, but I always try to expose students from the beginning to the application, to get them to talk to the surgeons or the doctors and come up with their own crazy ideas, before giving them a sort of route of what they should implement. So I think this is also very important: to think about what your view is, and to have some sort of, let's say, science-fiction type of ideas that maybe are not feasible nowadays, but will be in the future. Focusing on treatment: of course, not all treatments are based on surgery; there are a lot of treatments that don't involve any type of surgical intervention. But let's think a bit about what the future of surgery should be.
Maybe I will deviate a little bit, focusing first on the past of surgery, to give a brief overview of the whole process that has taken us to the current setup in operating rooms. Of course, anciently, 7,000 years ago, there were some very basic interventions that are documented, already done in prehistoric times, and then later in Egypt, and later on in Greece and Rome, etc. So there is a long history of documentation of surgical procedures. But until around the mid-19th century, nobody really thought about the antiseptic aspects of surgery: nobody thought about washing your hands or washing the instruments before then. So consider that the whole progress that has been going on in surgery is fairly recent, and there are things where nowadays we think, okay, this is crazy, these guys were doing things without any safety. So probably people a hundred years from now will look at the way we do surgery and think that we are crazy and doing some really silly things. Anyway, coming a bit more to practical things: the recent past of surgery has been hugely influenced by the advent of imaging technology. And since Philips is in the room, we should acknowledge that part of this is, of course, thanks to Philips and other manufacturers. But where should we go from now? If you look around a bit, just out on the internet, there are different views of where people think we should go from here. One is, of course, the integration of robotics: having more and more automated treatments, robots that operate on the patient, or, more than robots, tools that operate on the patient. Part of this is already there.
So we have some surgical robots that are on the market, and we will see some examples. Another idea is to incorporate information in the operating room: to have not only the patient lying there, locked in the room, but actually have access to a lot of external information as well, and to integrate all of this. Of course, these are also things that are happening, and this is the whole concept of the digital operating room. There's a big community, led by many manufacturers, trying to integrate all the devices and have connected operating rooms and these sorts of things. What else? Well, new ways of interacting with things; this is from a movie, and these are some prototypes that people built with the Kinect and things like that to access data. Or more robotic systems that operate on the patient; in particular, this robot is from a movie, but it's also a prototype that is actually used in experimental surgery and was developed by a university. What else? There is also the concept of, let's say, things that require less intervention. So right now the concept of surgery is about opening the patient and doing things, right? This is going to change, I think, and in fact there have been a lot of ideas about how to operate from the inside, or how to have devices that cure the patient without being so invasive. And this relates a lot to the concept of nanotechnology. And maybe some years from now, people will think that we are doing all this blood and all these cuts for nothing, right?
In any case, the situation we are in now is that we are dealing with surgeons, of course, and we have to provide tools to them, to be able to guide them or to help them perform the surgery. This also involves the need to be fault-tolerant, and to kind of monitor what's going on in the operating room and be able to integrate all of this. Anyway, the first block that I wanted to discuss was surgical navigation. The idea of surgical navigation is, of course, very similar to car navigation. You want to go somewhere; you have made a sort of route, a plan; and you want to have some external device or system that is able to guide you to the location, to the anatomical or pathological structure where you want to go. So what sort of technologies or devices have people thought of and implemented in routine use? From the past again, just briefly, to give the perspective: the first idea was the stereotactic frame concept that was used in neurosurgery. It was just a fixed frame that was attached to the skull of the patient; the frame was attached with some screws, and then it was used to guide electrodes into the brain of the patient. So what do we gain from this? We gain some sort of reference system. It defines a Cartesian space in which we can relate positions, and you can say, okay, so many millimeters in this direction, so many in this other. This is still used, in fact, and this is a modern version of the stereotactic frame. But one of the main limitations that led to further development, and in fact the most crucial flaw of the system, was that it assumed that the anatomy inside is more or less constant. So it was assuming that if you move three millimeters from here and five millimeters from there in this direction, you should find the brain ventricles, for example. Obviously, especially for pathological cases, this is not the case.
There is quite a lot of variability, and it was not until much later, when we had better imaging equipment, that we were able to integrate 3D imaging and 3D information, and more, let's say, meaningful coordinates to guide the operation. This is a picture of Hounsfield, who invented, as you know, the CT scanner. And this is why this whole field started to develop in the 90s, basically: towards integrating 3D imaging and having more computing power to guide the surgeon. And the first development that was, let's say, commonplace was the use of articulated arms, which are these sorts of things. You have a tool and a set of articulations, and if you have studied robotics, you may be familiar with the whole concept of having several articulations and solving the inverse problem to be able to guide to a particular location, etc. In this case, this was a sort of passive robot: it didn't have any autonomous movement of the articulations, but it was able to compute what exactly is the 3D location of the tip relative to the reference.
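To make the articulated-arm idea concrete, here is a minimal forward-kinematics sketch. It is a hypothetical two-link planar arm, not any particular clinical device: given the joint angles read from the arm's encoders and the known link lengths, chaining one homogeneous transform per link yields the tip position in the base reference frame.

```python
import numpy as np

def link_transform(theta, length):
    """Homogeneous 2D transform for one link: rotate by the joint angle,
    then translate along the rotated link axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, length * c],
                     [s,  c, length * s],
                     [0.0, 0.0, 1.0]])

def tip_position(joint_angles, link_lengths):
    """Forward kinematics: chain the per-link transforms and read off
    the tip coordinates in the base (reference) frame."""
    T = np.eye(3)
    for theta, L in zip(joint_angles, link_lengths):
        T = T @ link_transform(theta, L)
    return T[:2, 2]  # (x, y) of the tip

# A fully extended arm of two unit links reaches (2, 0);
# bending the first joint 90 degrees moves the tip to (0, 2).
print(tip_position([0.0, 0.0], [1.0, 1.0]))
print(tip_position([np.pi / 2, 0.0], [1.0, 1.0]))
```

A real passive arm does the same chaining in 3D with six or more joints; the "inverse problem" mentioned above is solving this chain backwards for the joint angles that reach a desired tip position.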
So if you had some 3D data of the patient, and through a calibration process you aligned this 3D data to the origin of this tool, and if the anatomy of the patient doesn't change while you are doing things to him or her, then this gives you a reference frame, basically. And this is still somehow in use. Okay, this is a more modern navigation system. What we have, normally, is some pre-operative images; they can be CT, MRI, whatever you have, that you have used to diagnose the patient and to plan the intervention. From these pre-operative images you can also have a pre-operative plan, and there is a whole field on how to design interfaces to plan interventions, trajectories, etc., in a way that is useful for the surgeon. And this can be shown on these displays. You then also have devices that are able to do tracking. The tracking, in this case, in this device, would be this camera here, which is basically like an eye that is looking at the scene and is able to track what's going on: what is the surgeon actually doing, where is the patient, whether there have been any variations in terms of the relative positions of the tools, the patient, etc. And we will very briefly explain the basic workings of optical tracking and electromagnetic tracking. What's very important is that we have many different coordinate systems. We have the coordinate system of the pre-operative image, and this can be many different coordinate systems, because we need to align all the pre-operative images, as you know, by image registration. But then we also need to align all the pre-operative data to the patient, because the patient is lying there on the bed. And this concept is also known as registration: patient registration. There are quite a lot of ways of doing it, but the basic idea is that you have your images somewhere and you put them on the patient, so you're able to work in the same 3D space.
So the camera is measuring 3D positions on the patient, and these 3D positions can be related to the image. That's a bit the idea. This is an illustration of a very early navigation system; it's from the 90s, and it was developed by the group where I worked for several years in Bern. It was the first system doing navigated spine surgery. You would have some pre-operative data of the patient, and this data was shown on this display. You would also have the camera, the tracking system, that was looking at the scene, so it was looking at all the instruments. And you can see that there are some small devices here, which are sensors that are seen by the camera, so they can be located and their 3D positions can be found. These sensors are attached to the anatomy, to the spine. And this is the tool, which also has some sensors, so the 3D positions can be related. The basic idea is that you can then map the tool onto the image. So as the surgeon is doing things, he is able to see what is at the tip of the tool, even if he cannot have a direct view of the anatomy inside. So that's the basic idea.
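As a sketch of the patient-registration step just described, here is a minimal paired-point rigid registration using the classic SVD (Arun/Kabsch) method, one of the standard ways to do it. Given a few fiducial points touched with a tracked pointer (tracker space) and the same landmarks identified in the pre-operative image (image space), it recovers the rotation and translation between the two spaces; a tracked tool tip can then be mapped into image coordinates. All point values here are made up for illustration.

```python
import numpy as np

def register_paired_points(pts_tracker, pts_image):
    """Least-squares rigid transform (Arun/Kabsch, via SVD) mapping
    tracker-space fiducials onto their image-space counterparts."""
    a = np.asarray(pts_tracker, float)
    b = np.asarray(pts_image, float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    H = (a - ca).T @ (b - cb)                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

def to_image_space(p_tracker, R, t):
    """Map a tracked point (e.g. a calibrated tool tip) into image coordinates."""
    return R @ np.asarray(p_tracker, float) + t
```

In practice you would use more fiducials than strictly necessary and check the residual (the fiducial registration error) before trusting the navigation display.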
If you cannot look inside directly, thanks to this you can still have access to the full 3D information and you can navigate the tip of the tool. Okay, in terms of technologies that have been developed for tracking, and this is something you will see if you work in this field or collaborate with people doing intraoperative navigation, there are a number of choices among the available technologies. You have, of course, articulated arms, which are an inexpensive option. Some early technologies that were tried were also based on ultrasound: 1D ultrasound probes, just to locate the positions of points. Electromagnetic tracking, which we will briefly discuss later. Optical tracking, of which there are a number of variants, as we will see now. Or things like structured light: patterns of light that you project on the patient, and then you are able to reconstruct the 3D shape based only on this projected light. So there are a number of technologies that are used to track the movements of the tools and the patient. In terms of optical tracking, which is probably the most common nowadays, what you have is a camera or a set of cameras; in this case, this device has two cameras in a certain configuration that is well calibrated and controlled. And you have tools to which you need to attach something that can be seen by the camera. There are two types. Active optical tracking, which is this configuration here: you have LEDs on the tool that emit light, and the light is seen by the camera, which allows the tool to be tracked. However, they need some power source, either a local battery or some cables, etc. The accuracy is better, though. Then you have passive tracking, in which you have some coated markers: reflective balls, where light is emitted by the camera and reflected back. These are less expensive, but also less accurate in terms of tracking accuracy. The problem with
optical tracking is, of course, that you need a direct line of sight. So if the surgeon gets in between the camera and the tool, or if you want to reach a certain location inside the body where the sensors cannot be seen, this poses quite a lot of problems and you are not able to track. In any case, as I said, this is the most common technology, and in some of the integrated operating rooms, of BrainLab in this case, or in some other surgeries, you can see the camera hanging from the ceiling, looking at the scene. It's a bit like a tennis-referee type of thing. One thing to mention, maybe, is that for every different tool that you want to use, they have a different configuration of the markers, so the tools are identifiable: if you look at the pattern of these balls, you are able to identify which tool it is. There is also the idea of tool calibration. Tool calibration means that you have a CAD model, a computational model, of what the tool is like. The camera only sees the balls, so you need to somehow calibrate and be able to tell where exactly the tip is relative to the balls, or what the shape of the tool is, basically. That is what is known as tool calibration. How you track the patient depends a lot on the type of intervention, of course. Just to illustrate: for orthopedic interventions, it's quite common to use a dynamic reference base. A dynamic reference base is the same type of sensor, in this case an active marker, so you have the LEDs here emitting, and it is rigidly attached to the anatomy. So if the patient moves, you can track the location of the patient. And this is in fact one of the main limitations of these systems nowadays, even though they are quite commonly used: for orthopedic interventions, for example, you are required to attach these markers rigidly to the bones that you want to track. So if you want to track the flexion of the knee, in this case, or you want to track
the movements, so that the whole plan moves together with the patient, then you need to invasively attach these DRBs to the bones. And the attachment is quite important. This is a practical thing, nothing really fundamental scientifically, but if you kick the marker during the intervention, which is something that happens quite often, and it's no longer rigidly attached, then the system will think that the whole anatomy has changed. So you will need to recalibrate and redo, or, let's say, update, the whole planning during the intervention. The other technology that I mentioned, apart from optical tracking, is electromagnetic tracking. This is based on generating a magnetic field through a field generator, and this magnetic field is shaped in such a way that your tools can be located in 3D. Without going into detail: it generates a field, and you can imagine that the sensor on the tool is able to locally read the value of this field, and through triangulation and so on, the 3D position can be located. Of course, the good thing about this is that it doesn't need any direct line of sight. So you can use it, for example, at the tip of endoscopes: you can put one of these sensors at the tip of the endoscope, and even if the endoscope is inside the anatomy of the patient and cannot be seen directly, the 3D position can be tracked anyway. The con of this technology is that it's sensitive to the presence of metallic or ferromagnetic objects, so of course you need to control very carefully the type of instruments that you use for these interventions. Okay, so that was the first part: this idea of tracking systems, or navigation systems, and how you monitor the intervention. The other concept that I wanted to bring up during this talk was a few types of intraoperative imaging that are used in computer-assisted surgery, and some example systems. One of the most common, of course, is X-ray fluoroscopy. As you know, fluoroscopy is continuous X-ray
exposure, so you have dynamic sequences. It's used a lot for orthopedic interventions, and a lot for catheter interventions in vessels, stenting, things like this. In this case, for use in navigation, it's very important to calibrate, or to take into account, the exact configuration of the X-ray machine: where the source is, where the sensor is, to be able to interpret the images. That is, to be able to, let's say, calibrate the dimensions of the images: if you measure three pixels, how many millimeters is that, basically. And there are a number of procedures that have been devised to calibrate this in very non-invasive ways. Of course, nowadays, with the use of advanced C-arms, you are able to have continuous 2D fluoroscopy, but also, by doing a full rotation, you are able to acquire cone-beam CT scans, which give you a full 3D image of very good quality. And you can relate this 3D image with the 2D projections. This can be done in a number of ways. For example, this is work done in our group by Chong Zhang: registering, or keeping updated all the time, the correspondence between the 3D dataset, which is a pre-operative CT but can also be an intraoperative cone-beam CT, and the continuous 2D projections from fluoroscopy, to be able to guide interventions on abdominal aortic aneurysms. Other modalities that are commonly used in navigated surgery: ultrasound, of course, because it is cheap, it is non-radiating, and it's available in most hospitals. This, for example, is work on guiding radiofrequency ablation of the liver with the help of the ultrasound view. You can imagine the basic idea here: you are monitoring the intervention and you are tracking the position of the tip of the probe. This is another example, and in fact it is from work that was going on in India when we were both working there. Was that like 15 years ago?
And in this case it is for neurosurgery, where the idea was to have a robot doing the surgery. You would have a pre-operative MRI where you are able to plan the intervention, to see and analyze where the tumor is, where you want to go, and exactly what you want to do with the patient, and then have intraoperative ultrasound scans. This is possible because there is a window close to the ear where you can get quite good ultrasound images during the intervention. And the whole process was defined as a registration problem, basically, because you would have many ultrasound images, continuously monitoring the intervention, but only one MRI volume. The idea was to fuse this information and keep the location of the tool updated relative to the MRI. To do this, we came up with some methods for registration. This would be an example of the pre-operative MRI and the ultrasound image, this is a 3D ultrasound, and this would be the result of the registration, with the gradient of the MRI overlaid on the ultrasound, just to check whether the registration was correct. Okay, what else?
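As an aside, the flavor of that kind of intensity-based alignment can be illustrated with a deliberately simple toy. This is not the actual MRI-ultrasound method, which used far more sophisticated similarity measures and 3D transforms; here we just align two 2D images by exhaustively searching integer translations that maximize the normalized cross-correlation of their gradient magnitudes, in the spirit of the gradient overlay used to check the registration.

```python
import numpy as np

def gradient_magnitude(img):
    """Edge strength of an image, a feature shared across modalities."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def ncc(a, b):
    """Normalized cross-correlation between two equally sized arrays."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def best_translation(fixed, moving, max_shift=5):
    """Exhaustive search over integer shifts; returns the (dy, dx) that
    best aligns the gradient magnitudes of the two images."""
    gf = gradient_magnitude(fixed)
    gm = gradient_magnitude(moving)
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(gm, dy, axis=0), dx, axis=1)
            score = ncc(gf, shifted)
            if score > best_score:
                best, best_score = (dy, dx), score
    return best
```

Real systems replace the brute-force search with an optimizer, the translation with a rigid or deformable 3D transform, and the similarity measure with something robust to the very different appearance of MRI and ultrasound.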
So apart from fluoroscopy and ultrasound, I also mentioned that there are quite a number of interventions that are CT-guided, mostly things related to needle biopsies. You would see the needle here, and you are able to check whether you are actually in the right location. The basic thing is to be able to track whether there is some deformation: whether the needle has bent, or whether the trajectory has been modified relative to the planning. And the same idea applies to MRI, in particular open MRI. As you know, or maybe you don't, there are open MRI magnets in which the magnetic field is generated by these two donuts here, and you have an area that is open, so the surgeon can stand here and operate, and then he can step back, get a new image, and go back in; it's quite flexible for interventions. Okay. And then there are a number of combined suites that have appeared lately. I think we mentioned one; you mentioned the one at KCL as well. For example, this is a combined operating room in which the patient can be moved inside a PET machine or a CT machine and the data can be updated. So basically, during the intervention, the patient can be cleaned, let's say, and then moved in for imaging, so we have updated information while we are doing the intervention. Okay, then moving to another type of imaging, and maybe showing some more examples of things that are more related to what we do here. This is an example for surgical microscopes. Surgical microscopes, of course, give optical imaging: you see things bigger, that's the idea of a microscope. And they are very much used in neurosurgery and also in ENT surgery, so ear, nose, and throat: tumors in the skull base, endonasal types of tumors, etc. This is a system that we built that incorporated the tracking camera attached to the microscope, and in fact this configuration was fundamentally much more accurate than other existing systems, because the other systems had the camera outside, so they needed to track the microscope device as well. In
any case, this was implemented, and we were able to overlay, through augmented reality, the position of the tumor, for example, inside the patient; this is the mouth, the nose, etc., and the access for the tumor removal here was endonasal. You're also able to overlay, through augmented reality, things like the CT slice at a particular depth relative to the position, etc. And this was tested on a number of cases in Bern. Other things that are coming up quite strongly are, of course, combinations of optical imaging with markers such as fluorescent probes that can be injected into the patient. They go naturally to certain locations that relate to the activity of the tumors, so they fluoresce, and of course they can provide complementary information to clearly delineate the bounds of the tumor, and thus update the margins of the removal area that you want to act upon. Similar to microscopy, there is endoscopy, as you know. In endoscopy, the camera is at the tip of a tool, so you are looking through a keyhole type of thing. There is rigid endoscopy and also flexible endoscopy. And the idea of this diagram is, again, to stress the importance of keeping track of all the reference systems, because we have the reference system of the pre-operative data, the reference system of the patient, the natural reference system of the camera, the marker that is attached to the endoscope, the position of the tool relative to the marker, etc. And I would say that this is in fact one of the things that always goes wrong: there is always something, either because the origins are switched or because of some, let's say, bad maintenance of all the transformations that are required; it always requires some debugging. In any case, there is then always, of course, the calibration of the camera itself. As you know, endoscope cameras have a bit of a fisheye type of optical view, so you need, of course, to calibrate the focal length, well, all the parameters of the camera. So you are
able to relate, let's say, what you see in the image with the real dimensions. This gets even more complicated if we deal with flexible endoscopes, and this is our case, because most of our applications use flexible endoscopes. In this case, the camera is at the tip of the endoscope, and it is this type of camera. There is this channel through which you can put things inside, so you can put tools in to remove small polyps and things like that; you can have lasers that burn tissue, or a number of things. As you can imagine, this is used, for example, for colonoscopies or for abdominal interventions. So there are a number of very complex processes that go on here. It's not only being able to track the tip of the camera; it's also being able to monitor the deformation of soft-tissue organs. And this is where a lot of modeling and simulation comes into play: to be able to predict the deformations that are happening, and compare them with the visual information that you are acquiring, and these things. So these are some examples. If you have calibrated your camera, you can then also generate virtual views. In this case, this would be a virtual view of the endoscope inserted, in a pig in this case, and you can compare the virtual view, or overlay it, with the optical view that you have in the endoscope. One thing that we recently developed here at the university, and this is work by Marta Guardiola, in collaboration with Oscar as well, is basically a new imaging device integrated in the endoscope. It's based on microwave imaging. We built a system with a number of antennas attached at the tip of the endoscope; these antennas emit and receive signal, and this is able to locally create an image. This image is related to the dielectric properties of the tissue, and in a sense it's a bit like doing a local non-invasive biopsy. So hopefully, once we get all the validation done, you are able to have not only the optical
view of what's in the colon, but also to characterize whether the tissue around is benign or has a certain probability of being cancerous, and then you move on and build up a map of this. So that's related to endoscopy and things that we do here at UPF. One of the main lines of work in our group is fetal medicine, in our group and also in the group of Bart: related to brain development in the fetus, to cardiac problems in the fetus, and also to interventions in the fetus. There are a number of pathologies where the babies have to be acted upon before birth; otherwise they die. For example, here we have a case of twin-to-twin transfusion, which we will discuss now. In this case we have rigid endoscopes, we may have some monitoring through ultrasound that is looking at the scene, and we want to track everything that is happening, to be able to guide the surgeon. One thing is that the whole environment is quite complex: it's a small room with a lot of people, because you need somebody to use the ultrasound machine, you need, of course, the surgeon who is using the endoscope, nurses, etc. And you want to integrate all this and provide the information at the right time and with the right accuracy. In particular, in this case of twin-to-twin transfusion, what happens is that we want to burn, as you will see here, there is a laser that is burning some vessels. This is because these are pregnancies of twins, of course, and there are connections between the vessels of the placenta, so blood flows from one baby to the other, and we need to do something to close these connections. Otherwise, what happens is that one of the babies gets too few nutrients and the other gets too many. This is bad, of course, for the one that gets too little, but also for the other one, because it completely deregulates the way the body works, and this can cause very severe problems, leading to death as well. So this is one of the big
projects that we have running, in collaboration with the fetal medicine unit here in Barcelona. Okay, so we talked a bit about computer-assisted surgery and navigation, we talked a bit about different intraoperative imaging modalities and how computer-assisted surgery has been used in some example projects, and I will close with some considerations about surgical robotics. You may be familiar with this device; I don't know if you have seen, or been exposed to, one of these interventions. This is the da Vinci robot. In fact, there is quite a lot of controversy as to whether this is a robot or not, because what happens is that the surgeon sits at a console, with a very sophisticated joystick type of thing and a visualization system that gives a 3D view of the surgery, but it's tele-manipulated. So it's not a robot in the conventional sense that you would think of. There are some cases in robotics that I want to mention, because they give a bit of an idea of the status of all the problems related to robotics in surgery. There was a system called Robodoc that was quite successful; it was built and sold in many places, and it was for hip surgery. The robot would attach to the femur and to the hip, and then it was used to perform the cutting of the femoral neck and to cut the planes, so you could place the implant in the hip in a certain way. What happened with this robot is that it was successful, but then there was a study showing that the stresses that you put on the muscles, by having these attachments on the femur, were actually damaging the leg. So it's not that the accuracy of the intervention was wrong or anything; it's just that because it's a bulky robot, and it has to be attached to the bones, it puts stresses on the surrounding tissue, and this was creating problems. And of course this meant that the company went bust, and this sort of robotics went down. This is another example of robotics. This is
related to a European project that I'm coordinating on cochlear surgery, and this is a robot developed by the University of Bern. The idea is to have a drilling robot: this robot is able to orient the drill in a certain orientation relative to the patient (here we have a skull, a cadaveric skull) and to drive the drill to a certain depth. In general it seems quite simple, but there is a whole amount of work on planning the intervention, because in cochlear surgery you want to reach the cochlea, and the cochlea is surrounded by structures such as the facial nerve and other at-risk structures that you don't want to touch, so it's very, very important to be very precise. One of the things is that building the robot itself was not very difficult; it's not something that took a lot of time. But actually getting the robot into a range of precision that is good enough, that took many years. And this is why we have, for example, a closed-loop system: you have the monitoring of all the sensors and everything that is built into the robot, but also a very, very, very precise camera that is monitoring the scene, and there is a kind of feedback loop between the tracking that the camera gives and the information from the sensors, to be fully sure that the intervention reaches exactly the position that was planned. Well, some considerations about MRI-compatible robots. Just to mention that there are some robots that need to comply, in the way they are built, with being placed close to the magnetic field of an MRI. You have here, for example, an MRI-compatible robot for prostate surgery; I think you don't need more explanation of this. Similarly, there are CT-mounted robots: devices that are designed for use inside the imaging machine, let's say. These are robots that can be attached somehow to the reference frame of the CT, which also provides a somewhat simpler configuration, because the accuracy of the intervention is relative to the
image, and it is quite accurate. That's another example.

And maybe to finish, there was a very interesting concept by a company in Israel, Mazor Robotics, which is to have a very, very compact robot. This is used, for example, for spine surgeries, and what they do is mount the robot on the bones of the patient. It has to be stable enough, of course, but instead of having to track the patient and everything, they mount it in a way that it moves with the patient, so all the locations are relative to the anatomy that you want to target, and you are not so much tracking all the deformations.

Okay, and then this is my last example, I think. These are some crazy concepts of people who are building robots, for example for cardiac interventions, in the shape of a snake that would get into your heart and act upon certain structures inside it.

Okay, so, summary and conclusions. We saw some digressions about the future of surgery and science fiction; the basic idea of computer-assisted surgery; surgical navigation, with historical notes about frames and articulated arms; the navigation workflow, so pre-operative planning, intra-operative tracking, etc.; some of the tracking technologies, optical and electromagnetic; some examples of navigation in the context of different modalities, so fluoroscopy, ultrasound, microscopy, endoscopy, etc.; and some considerations about surgical robots.

One of the key things is the whole idea of minimally invasive surgery: to have interventions that require small incisions and that can go through small holes, because you have a better understanding of what is below. You have a plan that you made beforehand, and you have some tracking that is able to provide the location of the tools relative to the anatomy without having to open the patient to really see it.

In terms of future directions, or current limitations, it is quite obvious that tracking technology nowadays is still too bulky, so you still need to attach a lot of markers to the
anatomy, and to anchor screws to the cranium to have some fiducials, things like this.

Then there is the use of predictive models, or maybe not so much models as the need for some way to update the information given by the pre-operative images, the pre-operative plan, etc., relative to what we are doing. So if we are removing tissue from a tumor or something like that, we want a plan that is updated relative to what we did, and not just the pre-operative plan that doesn't take that into account.

And then, while this type of technology is in general currently available in many hospitals, there is still a lack of strong, conclusive results proving the cost-effectiveness: proving the improvement you get in the treatment of patients when you use this technology relative to not using it, together with the analysis of cost. This is quite a big debate, because people say these systems are expensive; but if they save operating-room time, which is the highest cost in the hospital, as you may know, then it is actually cheaper to use them if you save a certain amount of time in the operating room.

And in terms of surgical robotics, it is just not mature enough. There is a lot of development, but as many people would agree, this is a bit the future; it will come, maybe not today. And that was me, that was the presentation. Thank you very much.

So I think this was very complementary to what we have been hearing today. And maybe, hearing your talk, one might wonder, especially for surgical robotics, which is now a major field of research: all this looks like mechanics, electronics and things like that. Is that the bottleneck, or is it more, for example, the image-processing side, or the models that we need in order to do soft-tissue simulation or planning? Where are the current bottlenecks, and where are the burning research questions now?

I mean, I try to focus
on the complementary aspects. Obviously, in our group we have image analysis, we have modeling and simulation, and we have navigation as well; the group involves Gemma, Jerome and me, and we focus on concrete clinical questions, so we need to target all of that to be able to reach a practical system. For robotics in particular, the tendency is a bit to develop mechatronic devices, which is the hip term nowadays: to integrate sensors and actuators into tools that surgeons are used to using. If you come in with whatever big robot, it is there, it is in the way, and the surgeon would hate it; it is a bit his enemy. Whereas if you have, say, a force sensor integrated into forceps, into tools that are more usable for the surgeon but that provide complementary information, then this may also be the way they start using these sorts of devices.

Maybe partially related to that: how is it with regulations and things like that? If you come with a device, what do you need to prove? Do you need to prove accuracy, or efficacy? What are the things you need to take into account?

There is of course a big continuum in terms of what you promise. You can offer a planning system and just say, okay, I am merely showing the information that is anyway in the image, in a different way; then you don't need so much regulation. Or you can build a planning system where you are telling the surgeon what to do, and then you need regulation. Or you can actually act on the surgery with a robot, in which case the regulations are very, very tricky, and that is of course one of the big hurdles for these systems.

Very, very interesting. I would like to know your opinion about the new kind of 3D visualization systems with the glasses, the Oculus Rift, etc., and how you see this being integrated with the robots, the surgical planning, and so on.

Well, you know, it is a bit that everything
old comes back. Now it is neural networks and mainframes, and all these things from the past are coming back; it is the same with music, every thirty years or so you get a revival.

For example, you have seen some things on augmented reality. This has been very much research work, but actually the technology of the commercial companies has overtaken it: now your mobile phone gives you augmented reality. And I think the same is going to happen with virtual reality, in a sense. Still, the use of glasses, Oculus Rifts and all that in a surgical environment has been tried many times, and it is not a way forward; you cannot ask the surgeon to wear these big glasses, at least in the operating room. Maybe for planning it is different. And it is a bit the same with virtual reality: I think we are at a stage in which it can really happen, finally. It comes back every five to ten years, but this time I think it is going to change quite a lot of things.

So all these interfaces, I think, are finally moving forward, because companies are pushing them for the general public, in mobile apps and things like this. And as a research community I think we need to be very careful in monitoring what the companies are offering as well, because I see, for example in cardiac image analysis, that companies are actually doing better than the research community. So something has happened; maybe we should wake up and think of new things to offer them. And I think it is a bit the same with the visualization tools.

Okay, thank you very much.

So, a last thing