Hey, thanks for joining us on Think Tech Hawaii. This is Security Matters Hawaii, and I am Andrew Lanning, your host for this episode. Today we're going to get into active shooter events, but we're going to take a little different look and talk about the responses, some of the problems with those types of events, and the ways that we can address or mitigate them and maybe save more lives, which is important to all of us. I have some great guests here today. Liam and Fiona are here. Thank you so much for coming. Appreciate it. Thank you for having us. Thanks for taking time. They were vacationing, but I snagged them for the studio. So I'll tell you what, since you're security folks and security-minded, I usually like to find out: what keeps you up at night? Okay. That's a good question. So for me, what keeps me up at night is just not wanting to read another catastrophic story in the paper. Aren't we burned out by that already? We know there are many events happening around university and school campuses, and the question is how we can get to the people who need help the quickest and how we can prevent these incidents and crimes from happening in the first place, constantly trying to tighten that time to get to those people, trying to find ways to do that and to ensure what we're doing actually does it. Yeah, yeah. It shortens the time to respond. Be part of the solution and not part of the problem. Exactly. Liam, what about yourself? Yeah. I think, Andrew, that every day we get up, we think today is going to be the day that we save a life. That's our goal. Whether it's the impact from a serious crime such as a shooting or aggravated assault, or whether it's a rape or sexual assault, we want to impact someone's life. So that's what we do every day. Awesome. Awesome. So let's get into your background.
Maybe we'll go into the history of how you got into technology, or as much as you want to give away anyway, and then, Liam, we'll get it from you. So we'll drive up to what got you here today, what got you to HALO SOS. So I've spent many years working in complex solutions, designing, building, and selling complex solutions for very large telcos. Those would be long sales cycles where you try to help them solve a problem that they have and make sure that the solution is a good fit for them. So my background is in sales and marketing. I've also worked with entrepreneurs over the years, helping them get their marketing message right. That's another part of my background. For me, this whole solution came up when I needed help really quickly, when I needed to get to my phone but couldn't, because I needed to keep my eyes on the problem. So the question was, how can I call for help when I can't get to my phone? That's where the idea came from. It was a particularly scary problem, a situation I couldn't get out of because I needed to keep my eyes on what was going on. So for me, that's where the whole solution began to evolve from. Wow. Amazing. So it's driven from experience. I like that. Yeah, yeah, definitely. You went to find a solution and you just built one. Yeah. You built one yourself. What's a bit of your history there? I'm the builder, I guess. I've always worked in mission-critical communications. The theme of my career has always been about protecting people, whether that's kids in theme parks from being abducted, or prison officers in maximum-security facilities, or patients and clinical staff in hospitals. It's always been about using technology to deliver effective results that ensure people stay safe. Everything we do is about making sure people are safe. Wow. Amazing. Well, technology's changed and keeps changing very quickly.
So tell me, we heard a little bit about what got HALO SOS started. How did you two decide to embark down the path of building the solution? I think when the incident happened to Fiona, that was when the idea germinated and began to form. We began to look at environments where we could have an impact, where the problem was the largest, and as it evolved, we realized that campus environments are essentially small cities. Some are almost large cities; some are bigger than small cities. So you have this mix of people, but in a very controlled environment. Our focus is on solutions where we control that environment to a degree and implement technology that assists campus police officers, assists the students, and assists the university as a whole. When we looked at existing solutions in the marketplace, we realized that from a practical point of view, the user experience didn't actually work. If you take the situation where you have a girl walking home at night and she feels under threat, currently she has to get access to her phone, unlock her phone, and find the app. It's simply not practical when you're nervous, your hands are shaking, you're trying to run or something else. You're trying to run. You're trying to run. Exactly. So we changed that user experience by making it as simple as possible. Now people don't even need access to their phone. They just say a pre-recorded keyword that's unique to them, and that generates alerts to campus police. And that launches it and gives their location, and obviously that's a duress signal. Correct. So I guess that was the first part. The first thing we did was change that user interface to make it more practical and more useful. The second bit that we kind of revolutionized was helping the campus police. It's all right to generate an alert, but campus police need to know exactly where you are. To respond quickly, yes, they must.
And if you keep moving, because obviously you're running from someone or something, then where you sent the alarm may not be where you still are. Correct. Well, the key thing is to get out of the way, get out of the zone, get out of the situation. So moving away from whatever is happening is really, really important, and finding the right way out of that situation is really key as well. In a situation where many people are being affected, we can have a tremendous impact on how to get them out of the red zone, the hot zone, whatever is going on at the time. We can identify when many, many people are in trouble. We have a solution that identifies where that's happening, because many people have raised an alarm, and we then allow them to find an egress, an escape route out of that zone. And by doing that, we also show others on campus, and you might have 30,000 people on campus or more, not to go into that zone. That's the danger zone. We don't want you going there. We want to keep you safe. So between the two scenarios: for the single person requiring help, whether it's indoors or outdoors, we can find them. Indoors, we can identify what floor they're on or potentially what room they're in. Awesome. Depending on the granularity that we go into. And then outdoors as well. So for somebody far away from security control, we can launch a drone to identify whether they need help, what kind of help they need, and what kind of situation they're in. We provide that advance intelligence back to security. That's awesome. Yeah. You often hear that responders are blind, right? Your first responders are showing up, and the information they were sent with, maybe there's a little bit of verbal stuff being told to them. We heard people say there were shots on the third floor. People said something, but they still don't know.
So that ability to get some situational awareness for what's truly happening on the site, and then even knowing how many people I'm looking for, makes sense to me. Do I need to get 100 people down the stairwell? Or do I have three people who could jump out of a window or something? I don't know. But knowing how many people I'm trying to help is also, I think, important. One of the things I think is important for us, Andrew, is to realize that first responders aren't necessarily the guys with the uniform on and the badge on the chest. For sure. It's individuals. It's individuals who are there at that moment in time. And we need to give them information relevant to that situation, information that has context. I think the difficulty at the moment in these mass killer or active killer situations is that the information that's driven out to people is, first, not timely. It's usually an SMS, and it takes a long time. But there's also no context around it. It says things like "shooter on campus." Yeah. That's, you know, eight thousand acres. Where? That's the first thing I'm thinking: where? And am I on the campus? Where am I in relation to it, right? And even hearing gunshots, you just don't know where they're coming from. And we see that in the reports. I've, I guess, fortunately or unfortunately, been able to talk with folks who were in the rooms and had been under fire at some of these events. And when the gunfire first began, they didn't perceive it as gunfire. It's just not what they were expecting to happen. So they don't know what's occurring. And the clock's ticking. Yeah. While they're, you know. And that ticking clock is really, really important for us. So it's about identifying the moment something happens and then getting people to that site as quickly as possible, squeezing down and reducing the time to respond.
Also, in that golden hour, the time it takes to get to somebody, that golden hour is where we can have an impact on so many people who actually bleed out or go into shock because we can't find them. We don't know where they are. If we can identify where they are in the building, we can get to them a lot quicker. Are they able to give feedback as well? Is it bi-directional, so you could see them? Are they able to say "I have been shot" so you could maybe know? Or do you see the ones moving, and the ones that aren't probably got shot? What does the future for that look like? From the user perspective, we have multiple ways of doing that. We open a voice channel so you can talk back. OK. But in certain situations, you may not want to. Yeah, you want to be silent. You want to be silent. You're hiding. So we have push-button approaches if you need to do that as well. From the campus police and responder point of view, they see messages saying "I'm OK," which disappear from the screen, and we then focus on the ones who haven't responded. So we know where to focus our attention. Yeah, we know where to focus. And can I get more intel? Could I perhaps open a voice, a sound, a listening channel? To me, that could be super valuable, right, because I could hear the person barking or screaming or, you know, ranting, and then it gives me more information about that location. If I'm in shouting distance of that phone, I'm pretty close by, or whatever it may be. So, interesting. Absolutely, yeah. I mean, your voice is really the most powerful weapon you have. It really is. So using that to call for help, to generate awareness, to provide intel, is vital to positive outcomes in these events. Yeah, and shortening that time frame. A lot of times these things are over so quickly, and a lot of damage has happened very quickly.
So the speed of response to me is also a very appealing feature. That immediacy: quite often there's not a lot we can do about the event itself, but it's the after-effect. The moment it happens, we need to know about it as soon as possible. But it's what could continue to happen: who could go into shock, who could bleed out, who could be found by that shooter or shooters on campus, as quickly as possible. Yeah. And the other thing I like about this particular solution is it doesn't just address the large incident. It addresses the singular incident, right? And I think that's often overlooked, because there's probably a lot more of that type of activity that someone needs help with, right? A potential rape or a potential assault or something. I mean, a lot of that goes unreported, right? We don't even know. If people had the opportunity to call it in at the time they feel threatened, I think we'd get better data on how much of that's really going on as well. Absolutely. That brings us nicely into a couple of other features that we have added to the solution. One of them is safe harbor reporting. For somebody who has been threatened in the past or who has suffered an assault, and they don't know where to go or don't feel comfortable about reporting, we allow them to report it in a safe harbor situation until they are ready to report it formally. But also, prior to that even happening, we have another feature that allows them to report something that is not necessarily happening to them, or that maybe is going to happen in the future. So they can call out something that they're concerned about or want somebody in authority to take control of. If they see a problem, they say it, and that then gets registered into the system.
In safe harbor reporting, for anybody who has been assaulted, bullied, attacked, whatever the situation is that they feel vulnerable to or have suffered, once they report it, we then use analytics to tie other people into that same situation. If there are any commonalities, we will connect them and allow them to take it further if that's what they want to do. Perfect. So here's what we're going to do. We're going to take a short break. We're talking with the founders of HALO SOS. We'll be back in about one minute. Thank you. Good afternoon. My name is Howard Wigg. I am the proud host of Code Green, a program on Think Tech Hawaii. We show at three o'clock in the afternoon every other Monday. My guests are specialists, both from here and the mainland, on energy efficiency, which means you do more with less electricity and you're generally safer and more comfortable while you're keeping dollars in your pocket. Hi everyone. I'm Andrea Gabrieli, the host for Young Talent's Making Way here on Think Tech Hawaii. We talk every Tuesday at 11 a.m. about things that matter to tech, to science, and to the people of Hawaii, with some extraordinary guests: the students of our schools who are participating in science fairs. So, Young Talent's Making Way, every Tuesday at 11 a.m., only on Think Tech Hawaii. Mahalo. Hey, welcome back to this episode of Security Matters Hawaii. I'm Andrew, the security guy. We're here in the Think Tech Hawaii studios with Fiona and Liam from HALO SOS, and we are talking about mitigating some of the problems not only of active shooter events but of crime in general. How do we get help to somebody quicker, and how do we do it almost hands-free? A way that they can get a response when they may be struggling with something and can't get to their phone: voice-activated technology that gives your location out and lets people know you need help.
And so we were also talking a little bit about a safe harbor feature that you built in that allows people to report an incident that they think could occur or may be occurring, or where they don't really want to go to the authorities yet. They're not sure, but it lets us start collecting data about some of these perhaps bad guys or bullies, or whoever does bad things to people. And we know it's sometimes a few bad guys doing a lot of bad stuff. It's not like everybody's causing trouble. So this is an interesting feature; let's delve into it a little bit more. What do you think the long-term development of something like that is? Build a database of bad boys, I'm thinking, or at least the campus scoundrel, I don't know what you call him. So, look, I think it's a very interesting point. Under the Clery Act, you see colleges having to report the various crimes. And truthfully, truthfully, the vast majority of these crimes are committed by a few individuals. It's not the whole campus. It's one or two individuals, the bad apples who give the whole campus a bad name. The difficulty is, particularly in sexual assaults, people may find it very difficult to report that incident, for a myriad of reasons. It's a very difficult thing. They feel responsible, things like that. And they're bullied or threatened, or told don't be a tattletale. There's a lot of stuff like that, right? Sure. Those types of personal issues mean people feel they can't respond. So what we built is a very intelligent database, essentially, and it runs artificial intelligence in the background. As you report the crime to the safe harbor element, it's not going to campus police, it's not going anywhere; it's sitting in a repository. Within that repository, we capture certain details. So for instance, one girl might report a crime that says, "Liam Darling attacked me Friday night." She doesn't have to do anything else with it. She puts it in there.
A couple of weeks later, another girl may report that this guy, Liam Darling, attacked her on Saturday night. What actually happens then is the algorithms are looking for matches, and they see this name, Liam Darling, Liam Darling, and then we revert back to both people who reported the crime and flag to them that they're not alone. So if you think of the Harvey Weinstein scenario, the reason he got away with it for so long was because nobody knew; nobody was talking. And he was doing it to everybody. Exactly. It took a small group of women coming together to put it together that this was what had been happening to them in the past. He's like a serial criminal, this guy. Yeah. And a lot of these guys are. They are. And they're expert at it. They're expert at hiding and expert at not being found, because of the techniques and the strategies that they use. So I like the idea of giving it back to the people who first reported and letting them talk about their experience, because it could be the guy you just couldn't get out of the room. You know, no means no, or whatever you're taught, and once I've said no and you don't leave the room, maybe people need to know about that, especially if that's becoming a persistent behavior. So this person who's maybe not even hurting anyone yet, he's just not getting the message very clearly. Maybe he needs a little counseling on what no means. Just for example. Well, I see a lot of good outcomes from that. It's kind of a pre-event intervention, before a major event. Not that people don't feel violated just by having to say, please leave me alone, I'm done, that's not what I intended, or whatever. That should be enough. But when it's not and someone's not getting it, you know, maybe this thing gets him to a counselor sooner and stops something really bad from happening. It becomes preventative.
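The matching Liam describes, where reports sit privately in a repository and an algorithm flags a repeated name back to everyone who reported it, could be sketched roughly like this. This is only an illustration: the class name, field names, and the simple normalize-and-exact-match rule are assumptions, not the actual HALO SOS implementation, which he says uses AI-driven analytics to find commonalities.

```python
from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class SafeHarborRepository:
    """Holds private reports; nothing is forwarded to campus police."""

    # normalized accused name -> ids of reporters who named that person
    _reports: dict = field(default_factory=lambda: defaultdict(list))

    def submit(self, reporter_id: str, accused_name: str) -> list:
        """Store a report. If the same accused name appears in earlier
        reports, return the earlier reporter ids so each party can be
        privately told: you are not alone."""
        key = accused_name.strip().lower()  # naive normalization
        earlier = list(self._reports[key])
        self._reports[key].append(reporter_id)
        return earlier


repo = SafeHarborRepository()
first = repo.submit("reporter-1", "Liam Darling")    # no match yet -> []
second = repo.submit("reporter-2", "liam darling")   # matches reporter-1
```

A real system would need fuzzy matching (nicknames, misspellings), plus matching on other captured details such as location or date, but the core idea of connecting independent reports only when a commonality appears is the same.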
If that's out on campus and everybody knows it's available and everybody's using it, it acts as a deterrent to that person. You know, the U.S. has released some funding for anonymous reporting. I don't know if it got passed, but I know it came out in the budget, to give state Departments of Education funding to implement something similar. So some anonymous reporting, because I guess people don't even want to be known. But I like that with yours, I've obviously opted in for the service, and when I've used the safe harbor service and get some information back, that's still protected between the service and myself. Correct. So it would seem to me that with that type of anonymity, there must be enough science showing there's value there, that people are more willing to report if there's a little bit of anonymity. You're not always ready to go that next step. Yeah. Interesting. Interesting. So let's talk a little bit about the drone. We have some different rules here in the U.S. for drones, which is, I think, problematic for this type of response. But obviously where you're going with that is to bring situational awareness. You're giving the geospatial information, I'm guessing, to the drone, and it's going there on site. Is this something you've already tested and are playing with? Yeah. It's really powerful. The drones, everybody loves the drones. Well, of course. Yeah. We have to talk about drones. Who doesn't love the drones? But they do serve a very serious purpose. Yes. All right. Okay. In any event, whether it is an aggravated assault or whether it is a shooter, your first responders are the people on the ground. The next people who come are campus police or emergency personnel. But you do not want them stepping into something unknown. Sure. You simply don't want to do that. Right. That's what you say: you never send your neighbor over to stop the robber in your house who has a gun. Right. Can you see what's going on? Yeah.
You get shot, right? So you've got to protect your people, because if you don't protect the professional first responders, they aren't going to be able to provide the necessary service to help the people there. So you've got to protect them and you've got to provide them information. And the way we do that is the drone actually autonomously flies to the event. Awesome. If it's outdoors, it flies to GPS coordinates, right? Mm-hmm. If it's indoors, what it actually does is fly a perimeter around the building, watching the exits. Exits, egress. We have high-definition cameras on it that stream back to campus police on their devices en route to the incident. That's awesome. So the officer is not walking into something that he shouldn't be walking into. We need to give as much information as possible to the first responders who are in the event and to the professional first responders. Yeah. And law enforcement especially always talks about how they don't have the information. They're coming in, and the guys are going the other way. So when you've got that drone, now me, I want you to shoot him with that green ink. You know, I want him marked so we see him for a week in case he does happen to run into the woods and get away. But I know we're probably a little further away from that. Well, you know, we love those, please. Yeah. My Navy training says let's weaponize that thing. I know. I'm sure there's more to it. But I think it's powerful just if it could even follow him as he egresses. Someone's running away, and now I've got eyes on that person getting away. We've got eyes on this. You know what I mean? It's super powerful. The first responders can still go in and handle what's happened. Now we've got a responding force that can go run this guy down. He may go to another building and hurt other people, or wherever, you know, we don't know.
And again, when we have eyes on him, Andrew, right, we can provide intel to individuals and vehicles within that red zone so they can egress the area. So keeping a focus on them is really, really important. It's about advance intel, really. I hope my friends at the University of Hawaii are watching us today. Garrett, you out there? So you've got a rollout on a campus in Australia, and I know you're also trying to get into the US; you're talking to folks here. So what's going on with that? And we're at Campus Safety in July in Virginia. Awesome, yeah. Robin Hattersley has a huge event up there. So that'll be really good. I'm looking forward to meeting some people there. Yeah, we're based in Australia at the moment. We have a rollout happening in Melbourne with the university there, which will be done, I think... Yeah, I think we should be done by September. Awesome. It's a very interesting campus. It's an urban campus, which means the drone doesn't really have a place there because of its compact nature. But it is a series of large indoor multi-story spaces, and they want a high level of granularity into those buildings. Good. Yeah. Which means that our technology is pretty much the only technology that can give you a room-level position of a person in a duress situation. That's amazing, because usually you need a lot of towers, basically, to know the exact location of a phone: cell towers triangulating the cellular signal. So, well... And they try some GPS as well. Yeah. So, GPS outdoors. Yeah. Now, we use very simple technology: Bluetooth positioning indoors. Oh, okay. Got you. Right. So it's very simple. I mean, the install is really good. But the power of that technology to deliver accurate, granular results is just amazing. That's awesome. Absolutely. Exciting. Will you be able to talk about that more when you're up with the campus safety folks in Virginia?
Absolutely. They'll be wanting to see that. And then, of course, here in the Security Matters Hawaii studio, we're going to want to get an update. So we'll... We do some remote. So maybe we can do a little remote broadcast at some point, maybe in Q4 or something, when you folks have kind of got that thing done down there. It's not lost on me that you've put a lot of effort into minimizing response time, because people really need help. Everybody sees the headlines for all the big stuff, the terrible stuff that happens, but people don't often think about the smaller things, which are actually what got this started. You know, that response of, I don't have time to pick up my phone and open an app. I just need to say the word and help is on the way. I had my phone in my hand. I could not take my eyes off. Yeah. I couldn't take them off to find the... Sure. That makes total sense to me. So that's brilliant. And it's interesting how voice has become this new interface, right? Everyone's talking. Yeah. I mean, I know we had some... I think it was Alexa, or somebody, that sent someone's voice recording the wrong way. It's not trivial. So we need some controls on that, obviously, and we've talked about those things. But in an emergency situation, of all things, being able to use your voice, as you said, is very powerful. And that's an interesting driver that I think will catch on. Because, you know, the phone's listening, the phone can hear. And so if I can ask for help and help comes, that's almost like help from the heavens. Is that where Halo SOS came from? Exactly. Like your angel. I don't know. We've got a few minutes or so left. We do like the idea of watching over. And again, somebody actually says a key phrase to activate it. Yes. Like you would say, hey Google. So you say a key phrase. It opens up the device. It's not necessarily listening all the time. So maybe that's... Oh, yeah. Good point.
But with Invoke as the product and Halo SOS as the business, definitely it's about calling in help. It's about watching over you. So there is that, yeah. Great. We want to watch over people. Thank you. Thank you so much. Thanks for coming in today. No, thanks. I'm out next week. I'm going to be off island, so there will not be an episode. But thanks for joining us today in the ThinkTech Studio with Liam and Fiona. We hope to hear more about Halo SOS in the future. We look forward to being back. Aloha. Thank you so much. Thanks, everybody.