Good afternoon, I'm Dr. Lewis Myers, and welcome to this month's edition of Health Care Today. We're going to be talking about a relatively new innovation in medicine and medical education: the use of simulators to teach medical students, nursing students, nurses, and physicians. With us today we have two guests who are on the front lines of bringing this technology to our workplaces and to our university. Mr. Jacob Lind has joined us from Rutland, Vermont. He's with the company Tacitly Operations. Mr. Lind studied biomedical engineering at LeTourneau University in Longview, Texas, and then returned to his native Vermont. His company is involved with teaching nurses and nursing students how to quickly get up to speed when they are on the units and the floors of the hospital. Dr. Daniel Ackle is here with us from the University of Vermont. Dr. Ackle is an emergency department physician and also the director of the simulation lab for emergency department residents and medical students at the University of Vermont. He attended the University of New England College of Osteopathic Medicine, where he got his medical degree, and did his residency training in Rhode Island. So welcome to you both. I should introduce this by saying that when I was in medical school, which was a long time ago, we had almost no simulators. Almost everything was hands-on or in big lecture halls; I don't remember any simulators, to be honest. So this is a whole new world to me, and I'm going to start with Mr. Lind down in Rutland. Tell us a little bit about your company and what your technology is doing. By the way, I should add that each of you has brought some video of how these work.

Yes. I work for Tacitly, and we're a software company that develops training for professionals using extended reality, particularly for the medical field right now. The idea is to make things more accessible by using digital learning tools.
Can you give us an example of how this works?

Sure. The particular product we're working on is called CodeCart XR. Code carts are used in code blue emergencies, when somebody is having a cardiac arrest or difficulty breathing. They're a tool you really hope you don't have to use, but when you do use it, you want to be able to make every second count. So we've created a digital representation of a code cart that can be interacted with. Code carts are sealed with a lockout tag certifying that the cart has all of its tools in the proper place and that nothing is expired. If you break that lockout tag, the cart has to be recertified before it can be used again, which is an expensive and difficult process. That means access for familiarizing yourself with the contents of a code cart is difficult. You can't just take one aside and open it to look at it, and arranging access to a physical training cart can be difficult in terms of timing or finding somebody to give you access to it. So the idea is that a digital recreation makes it extremely accessible, with zero cost to familiarize yourself with its contents.

And as you've told me, we of course have a lot of visiting nurses coming in from other places to work in Vermont, and vice versa; some of the Vermont nurses go other places for a while. Are code carts different from one another?

That's one of the difficulties this helps address: code carts are not standardized. As you pointed out, visiting nurses might be seeing a new hospital every three months or so. They're new to the hospital, so they're not familiar with the arrangement of the tools and medications inside this critical tool. Arranging easy access to gain that familiarity is super helpful.
It gets that nurse up to speed and confident, so that in the event a code cart does need to be opened and used, they're going to be familiar with it and that much more effective with it.

And I think I saw you demonstrating this earlier, before we started. This is one of the virtual reality headsets?

Yes, yes.

Do you want to hold that up and just show us?

Absolutely. Usually this works independently on its own, but we currently have it wired so we can record the demonstration from it. The idea is to provide a visual representation of the code cart that you can interact with in 3D space while also being able to stay present in the room you're in. We're very proud that we make use of hand tracking on this, so there's no need to learn controls or work with other devices; if you know how to use your hands, you can interact with the code cart.

Is there any audio?

Yes. For our demo there's not much at the moment beyond the ambient sounds of boxes and drawers opening and the glass of the medicines clinking. But we hope that in the future we can implement some of the ambient sounds that are often present in codes. A code is actually quite loud: there are monitors going off, there are people shouting for things, and you've got to be able to distill from that situation what's important. So being able to practice with a demo whose ambient noise is an intentional distraction gets you prepared for the real thing.

So you've started with something very specific, which is the code cart. Do you see it expanding into other areas of nursing education?

Absolutely.
The sky is sort of the limit. We started with the code cart because we saw such a need for it, and because it's a high-risk, low-occurrence event, but there are all sorts of similar things with that same accessibility difficulty. We think of being able to open many of the other sealed, sterile medical kits; there's often a particular order of operations for opening those and preparing them, laying them out for procedures. If we could offer that same accessibility to practice with those, it gives nurses the ability to say, okay, I know I'm going to be assisting in one of these procedures tomorrow or later this week, and I want to give myself a quick refresher. You grab a headset, and in under a minute you're able to go through those tools, open those kits, and understand what your role is going to be later on. Five minutes later you can put the headset away and go about your day, rather than having to worry about whether you can get down to the simulation department, whether somebody is available to show you this particular thing, and then having to coordinate multiple people. We want to make it as accessible as possible, and to free up the nurse trainers who are responsible for training, giving them the opportunity to focus on things beyond what can be done individually with the headsets.

Well, it sounds like a very useful way to proceed as an adjunct to all the other ways that people are learning, and we really appreciate you being here. I know we're going to get a chance to see some of the video for that, so if you'll stay with us. I'm also going to talk with Dr. Daniel Ackle, whom I've already introduced: director of simulation for the emergency department residents at the University of Vermont and, broadly speaking, for the medical students as well. I understand the University of Vermont has been somewhat at the forefront of simulator training.
Yes, the Larner College of Medicine really prides itself on being at the forefront of medical education, and part of that mission is the concept of active learning: when you are learning to be a physician, only so much can be learned in a lecture hall or from a textbook. So we want our students to get as much hands-on experience in these simulated scenarios as possible. Again, similar to what Jacob is focusing on, we like to focus on these so-called high-acuity, low-occurrence scenarios and all of the potential medical procedures that go along with them. The college has really invested in that concept of cutting-edge technology, and we're very fortunate to have the clinical simulation laboratory at our disposal. It's a state-of-the-art center of over 9,000 square feet, and when you walk in you'd think you were in a hospital room or an emergency department. Along with the space itself, the mannequins and the technology are constantly evolving. Through grants and funding through the school, we've been fortunate enough to stay at the forefront of that with our extremely high-fidelity mannequins.

Yeah, let's talk about the mannequins. I know many viewers may have taken CPR classes, or even ALS classes, Advanced Life Support, where you get to know Resusci Annie, which is a large plastic facsimile of a person. But you've gone way beyond that in these mannequins. What can this mannequin do?

Yeah, the 3G mannequin we use can do almost anything: you can change the size of its pupils, you can change its breath sounds, you can swell the tongue to simulate an anaphylaxis or allergic reaction scenario, for example. On some of them you can cut into the chest wall to simulate a chest tube, or perform a cricothyrotomy on the neck. The technology is just constantly evolving.

Can I ask what one of these costs?

To be honest, I don't know off the top of my head, but they're on the order of hundreds of thousands of dollars.

Wow.
Let me ask, in terms of medical school, and I know you work more perhaps with the residents: people know that first-year medical students go through gross anatomy, which is sort of a rite of passage, where you crowd around someone who has donated their body to science after their death, and students dissect and learn in that way. There have been some concerns in recent years, particularly since there's a lot of exposure to formaldehyde, so these young people are in contact with and breathing in formaldehyde for several months, and also about the fact that it's just not as realistic as living tissue. Is there a role for simulating gross anatomy, so that people are not spending so much time in that particular setting?

Yes, there are several companies in the medical device industry that are really committed to creating that lifelike tactile feedback through their products. The Resusci Annie that you mentioned hasn't really been around that long, but the quality of the latex, and what it allows in terms of procedures like cutting and inserting needles, has evolved.

Just the three-dimensional imaging of the body that can now be done through computer simulation is tremendous.

Correct, yes. The anatomy course at the College of Medicine does still utilize cadavers, but you're right about the tissue itself; it's not quite the same.

By the time students reach their third year of medical school and go into their surgery rotations, it's well known that much of that rotation involves holding retractors. It's an important job, but you're basically just holding the tissue open so that the surgeon can see what they need to see and do what they need to do. It's not particularly ennobling, or even educational. Is there a role for simulators to help teach third-year medical students, for example, who are doing their surgical rotations?
Yes. The clinical simulation lab at UVM's Larner College of Medicine is, again, completely committed to cutting-edge technology. They have simulation trainers for surgeons, surgeons in training, and medical students to practice laparoscopic skills, as well as task trainers for performing ultrasound in our ultrasound curriculum. We also have a fairly robust community of simulated patients who fulfill the role of actor, or standardized patient, and that's been a resource we've used quite a bit for learning physical exam skills and interviewing.

Of course, that's been around for many decades. One thing that has developed over the last 20 years is robotic surgery, where surgeons who are trained in this will actually not have their hands on the patient. They'll be sitting across the room looking into a screen and, almost like a video game, using remote controls to do the surgery, sending signals to the instruments that are actually doing the cutting. Tell us a little bit about that and how the work you do interacts with it.

Yes, I can't speak directly to those devices, but my understanding is that, again, there's a training module or training simulator, similar to what pilots go through and similar to what we're doing with our residents for the higher-risk procedures. We have a safe simulation space where it's okay to learn and make mistakes, and then, after you've gone through a certain number of scans or practice procedures and been through a checklist with a verified educator, you would be credentialed to perform those tasks on live patients.

You mentioned pilots. Of course, race car drivers and a number of other professions now use these high-tech simulators, which are almost as realistic as being in the cockpit of a plane or a Formula One car. Has the medical profession learned from these technologies? Are we borrowing from them?

I think so.
I think, yeah, even seeing what Jacob is doing.

Yeah, the virtual reality.

Virtual reality. There's a lot of synergy there across industries, across businesses and healthcare. It's great to see this sort of thing helping lead the way.

Let's talk about cost, because we're in a cost-efficiency environment now. It sounds like, from Jacob's perspective, it saves time and money by not having to break open code carts and by reducing extensive human training. What about what you're seeing? I know up front the costs are significant, as you mentioned, but do we recoup them?

We envision simulation as a cost-saving measure, especially as it pertains to these high-acuity, low-occurrence procedures. For example, we train across our network sites in New York and Vermont, and we've established a program to bring the simulation to these community hospitals. One of the scenarios is an emergency department delivery. In a scenario like that, if no one in the emergency room in the middle of the night has ever delivered a baby or been in that situation, it could be very high stakes. We feel this training will certainly lead to better patient outcomes, and in theory it would help save money from a risk management standpoint.

Finally, I want to ask about artificial intelligence. We had two University of Vermont medical students who were about to graduate on this show last May, and I asked them this as well. Artificial intelligence, I assume, is going to interact in some way with what you're doing. But the flip side of that question is the human touch: if simulation comes to predominate, do we risk losing the human contact and mentorship that is so important?
Yeah, I think with artificial intelligence, again, it's extremely impressive to see where things are heading. But, as you mentioned with our standardized patients, our communication skills as providers and caregivers are skills that can't be replicated by artificial intelligence. We still really value the oldest training methods, the ones they've been using at the College of Medicine for 40, 50 years, even longer. So we like to blend the two into a hybrid and create a robust curriculum that encompasses both cutting-edge technology and core, fundamental communication skills.

Jacob's nodding, so I think he agrees.

Yeah, the way we collectively view it is that these new tools free up the people who are doing the in-person training to focus their attention on that, and we see these tools as preparation for that in-person interaction. If you can free the trainer up to focus on their thing, then you can have multiple people with access to a VR trainer or an AI-simulated patient interaction, and it gives them a primer for when they get to the actual interaction. So in no way does that ever need to be replaced; it just gives people a jump start.

Well, this is a brave new world, and we've just begun to introduce it. I really want to thank our guests. I know they brought some videos, which the audience will be seeing, about each of the technologies they're working with. Jacob is now going to show us, using his virtual reality, what the program would look like for the nurses and nursing students who are learning how to use the crash cart.

Yeah, so this is our main product that we're developing right now. We have what we like to call a digital twin of a code cart here.
The intent is for the cart to have the full appearance of a hospital's actual cart, and the location of all of the medications and everything inside should be true to how that hospital has theirs arranged, because these are not standardized and they're different at each one. You can go in and just pick things up and interact with them, and you can also point at items to get their names and scan through things, because sometimes it can be a little difficult to read small text in the digital environment. The intent is that if you know how to use your hands, you can interact with anything in here. A nice feature is that you don't have to worry about cleanup: if something falls on the floor, we can just automatically reset. That's important, because part of education is that the more repetitions you get, the better the information sticks. With a physical cart you would have to reset everything, put everything back, and verify that everything is indeed in the correct position before you could start another repetition. With this, it's as simple as resetting the entire thing; everything's back in its place and you're ready to start another round of training.

I have to say, just being here watching you do this, I'm astounded at your dexterity. Some people, even younger people, might never have used virtual reality; that itself will be a bit of a learning curve.

Yeah, one of the things we pride ourselves on is that the hand tracking, as well as the pass-through, makes it a lot more intuitive. You're not put into a completely foreign environment: you still see the room around you, you still see the people around you, and you're not separated from the content by having to learn a control scheme. You've just got your hands, and you can interact as you like.

That's fantastic.
Now, we were also talking offline about how, as opposed to the several-hundred-thousand-dollar simulating devices up at the University of Vermont, this virtual reality may go for as little as a thousand dollars.

Yeah, for the actual hardware, yes. There's the cost of developing everything, but the physical hardware necessary to run this is in about that range.

So you would have to go into the institution and film this initially, right? Because, as you said, each crash cart may be different.

Yes, we envision an onboarding process where, if a hospital were interested in using this for training at their location, we would go and take reference pictures of everything to make sure it's accurate. Then we'd create that digital twin, get it back to them, and make sure everything's in its place, and then they've got a one-to-one representation of their cart with easy access.

That's fantastic. Jacob, thank you so much for showing this. Now, your company again is called Tacitly?

Tacitly Incorporated.

Tacitly Incorporated, out of Rutland, Vermont. So thank you so much. This is exciting stuff.

Thank you. Thank you as well.