Hello everyone, and welcome to the AI for Good: Harnessing the Power to Solve Problems panel at South by Southwest. My name is Lisa T from Intel Corporation, and I'm a solution owner in our analytics and artificial intelligence engineering team. I am thrilled to be here to share with you some of the work we've been doing under our Intel Inside, Safer Children Outside program and our collaborations with nonprofit and industry partners to drive the use of artificial intelligence to accelerate the ability to identify missing children and help rescue children from sexual exploitation. With that in mind, I'd love to turn it over to my panelists, who can each give a brief introduction and share some of the different elements they're working on in this space, and we look forward to an engaging conversation with you today. We'll be doing a few seeded questions for the first 40 minutes, then we'll leave 15 minutes at the end for audience questions, and we would love to connect with you afterwards to see how we can engage you all in coming to the front lines of fighting this with us. So, Mark, with that, I'd appreciate it if you could introduce yourself.

Hello everyone, thanks for coming. My name is Mark Gianturco. I'm the Chief Technology Officer for NCMEC, the National Center for Missing & Exploited Children. We're a private nonprofit, funded by private donors and the government. We were started in 1984, after every parent's nightmare happened: John Walsh, the host of America's Most Wanted, and his wife took their six-year-old son to a Florida shopping mall, and their child was kidnapped from the mall and found two weeks later, brutally murdered. From that, they wanted to create an institution that could help coordinate law enforcement and provide resources to help children get recovered. Since we started in 1984, we've assisted in the recovery of 220,000 missing children, and we're also involved in recovering exploited children.
We have five main programs: missing children, exploited children, a family advocacy group to help families going through the experience of trying to recover a missing child, programs where we educate law enforcement on how to prevent abductions and how to quickly recover children, and a training group with resources to help prevent abduction and other child safety issues. So with that, I'll go ahead and turn it over, and I look forward to talking with you all about some of the issues we're challenged with today and getting help through our various technology partners.

Hey everyone, my name is Federico Gomez Suarez. My day job is as a technical program manager at Microsoft, but on this panel I'm here in my capacity as a technical advisor for Thorn and as a project leader for one of its projects. So let me tell you about Thorn. Thorn is a nonprofit focused on driving technological innovation to fight and end child sexual exploitation. The way Thorn does that is by doing research to understand a lot more about the scope of this problem and by developing technology and solutions: solutions that can be used to empower the people in the field, the ones actually going out and fighting to save these children. I got engaged in this space because of Microsoft's Hack for Good program, which encourages all employees to volunteer our skills and knowledge to help nonprofits. That's how I got started, and that's how I learned about this problem. And when I understood the scope of it and realized that the work I could do as a volunteer could help, I decided to spend weekends and nights just really going in and trying to help. And something I learned through my volunteer work with Thorn is that they do not work in isolation.
They work with other organizations like the National Center for Missing & Exploited Children, and they also have the support of the tech community that you can see on this panel, because this problem is so significant, so novel, and so nuanced that we all need to work together to be able to actually find a solution.

Thank you, Federico. Alison?

Hi everyone, my name's Alison Yew. In my day job, I run social media globally for Cloudera, and on the quote-unquote side, I also run Cloudera Cares, which is our philanthropic arm at Cloudera. For those who don't know what Cloudera does, we use the power of open-source software, in particular Hadoop: we package different open-source projects to make them enterprise-ready and easier to use, and for enterprises, obviously, that leads to ROI, et cetera. But in my role with Cloudera Cares, I'm really trying to figure out how we apply our software to real-life problems. I think this panel, and who's sitting on it, really represents how tech companies and nonprofits can band together to solve big, massive issues such as missing and exploited children. During my time leading the Cloudera Cares program, I've helped us partner with Thorn and the National Center for Missing & Exploited Children, and we're really excited to be here and so proud of the work that we're doing and what we're going to do in the future.

Thank you. Hi, I'm Nick Edmonds. I'm a software engineer at Google on our abuse team, which protects all of Google's publicly facing products. More narrowly, my role, with a small group of other engineers on the team, is to use all those abuse tools to protect children online and to figure out ways of proactively finding new types of abuse and exploitation. More broadly, Google has access to a lot of content, and as this content spreads, it re-victimizes those kids every time it's seen and exposes new people to extremely damaging and terrible content.
So on my team, we've tried to flip the mission just a little bit: in this very narrow case, instead of making the world's information publicly accessible and useful, we try to stamp out specific content that is damaging and harmful and that victimizes children. That's the role of the child safety team at Google.

Thank you, Nick. Like many of you, I wasn't very knowledgeable in this area a couple of years back. I watched a documentary and got tuned in to the problem of domestic human trafficking, and it really keyed me into a new area where I thought, my goodness, in my day job we use tech every day to solve big problems; what can we do in this space? Mark, as the CTO of the National Center for Missing & Exploited Children, can you help us understand the role that you play in the ecosystem as the federally mandated reporting location, and the context of the scale of the problem that you're fighting? Let's call a spade a spade: child pornography on the internet.

Certainly. Something that's unique to us is that we're mandated by Congress as the clearinghouse for all national information on missing and exploited children. We have 22 programs, and some of them involve collecting information, providing analytics on that information, and making sure it gets distributed to the right people to take action on it. On one of our main sides, we work with missing children; on the other, we work with exploited children. On the exploited side, with both the ubiquity of the internet these days and the increased focus on reporting that we see through our technology partners, we've seen a sharp ramp-up in the number of reports of child sexual abuse content or exploitative situations that come through our automated CyberTipline. The CyberTipline started in 1998. We've gotten 16 million reports in that time, but 15 of those 16 million reports have happened in the last four years.
In rough numbers, four years ago we had a million reports, then two million reports, then four million reports; last year, we had eight million reports. So there's definitely a large, ongoing increase in the amount of exploitation that we have to respond to. The keys for us are to find ways to force-multiply, to be able to bring more resources to bear, and to identify the truly egregious cases that should be prioritized ahead of others. We have a few different mechanisms for doing that. One is internal: we're executing a technology strategy to use some cutting-edge tools to better analyze the information that we get. We're also partnering with several prominent technology firms, some of which you see on stage here. And finally, we're doing things like recruiting software developers and technologists through hackathons; we have a hackathon here, sponsored by Intel and Cloudera, where we've had 25 teams sign up. So there are a number of different mechanisms we're using to handle what is just ongoing increased volume.

So this is a problem that's pervasive across the United States and really around the world. Is it safe to assume your headcount, as a nonprofit, is not doubling every year along with the problem?

Yeah, exactly. We're very resource-constrained, so anything we can do to help bring more resources to bear is incredibly valuable to us.

So it's a great way to use artificial intelligence: to augment the capabilities of the subject matter experts so they can drive meaningful action in a coordinated national response. Absolutely.
As part of our relationship with Intel, we have a number of initiatives, one of which is being headed by an Intel chief data scientist who's right here, Bob Rogers. He's doing spectacular work to help us determine, among other things, how to get a pipeline down from 30 days to one day for some of the less critical cases, how to understand where to distribute information when it's not intuitively obvious where it should go or when there are counter-indications, and how to do some general analytics that they're researching now. For me, there are two main areas where we can become more effective: one is automation and the other is innovation. We're concentrating internally on some of the automation steps, and the innovation is being driven by companies like Intel.

So thank you very much for the work you all are doing on that. Nick, can you share with us, from the child safety side at Google, since you're the first line of defense in detecting this material, how the landscape has changed from your lens?

Yeah, so Mark touched on the CyberTipline, and there were some really important developments that allowed us to magnify our efforts there. The body of legislation around this has existed since the 1980s and was really hard to apply in the digital domain. By lobbying Congress and getting the approval and the legislative framework for us to collaborate and to report that data, we were able to act on the fact that real-life abuse leaves digital footprints, and sometimes, these days, it even starts online. The ability for us to go out, find this abuse, and pass it off to an agency that's empowered to do the legwork, to do the research, to reach out and actually effect change via law enforcement and other channels is crucial for us.
And as Mark also touched on, given the volume that we see at this point, there's an army of human beings behind the scenes on our side making sure that we're making the right decisions and doing the right things. We use AI and machine learning as much as we can, but there are also human beings in the loop every step of the way, because this is a serious problem with serious ramifications. And then when it gets to NCMEC, they also have the investigative resources to put the puzzle together. The ability of that CyberTipline to tie together the data that we start with and the outcomes that we need is really crucial.

Federico, from the Thorn side, can you help us understand the collaboration: where Thorn begins and NCMEC ends, and how it all comes together from the tool perspective? You may have seen the viral video of Ashton Kutcher's congressional testimony; Thorn is Ashton's nonprofit. So I think you can help us understand a little bit more.

Yeah. When I think about Thorn, and what I learned through my volunteer work with them, Thorn is all about innovating, using technology to find new ways to fight this problem. One of the tools that comes to mind is Spotlight. Spotlight is a tool that Thorn developed, and its focus is to make data related to online escort ads more easily available for investigations. Using this tool in the last year, Thorn reported that they were able to help identify around 2,000 children. When you think about it, that's an average of five children per day. It's through this development of tools, through this innovation, that Thorn can empower the people in the field, the investigators who are doing the work, to actually go and solve these problems better. So when I think about Thorn, I think about that innovation aspect.
And I also think a lot about the research that they're doing to really understand more about the scope of the problem.

Federico is also involved in the development of a tool that helps compare images of known missing children with other image sets. One of the interesting statistics that we have for urgent runaways, runaways that are likely to be in trouble, is that one in six of them is sexually trafficked. So we have data sources inside NCMEC for missing kids and for exploited kids, and having the capability to develop technologies to compare those images and find relationships that are available in that information, but have not yet been utilized, is incredibly valuable to us. Some of the work you do ties directly into that.

Yes, and that's another very clear scenario of how we can use AI to help. In that case, it's about using face recognition to be able to say: if we have an image of a victim, how can we help identify that person quicker? That's where AI comes in, and the advantage is that face recognition technology has been evolving so much in the last few years. But there are also special challenges. One of them is that the child may have been missing for a couple of years, so we need to account for that. A lot of the work that I do volunteering with Thorn is thinking about that problem: how do I match an image of a missing child with another image, and how can we apply the latest in AI so that a person who has to review images doesn't have to look at thousands of them, but far fewer? That's a key way I feel we can really contribute and reduce the amount of work involved for the people working in the field.

For those of you who think, like I did until a few years ago, that this is a problem happening overseas, a problem of people coming into our country:
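As an editorial aside on the face-matching workflow Federico describes, reducing thousands of candidate images to a short ranked list for a human reviewer, a minimal sketch might look like the following. This is only an illustration: it assumes face embeddings (fixed-length vectors) have already been produced by some pretrained face-recognition model, and the names and the 128-dimension size are invented for the example, not taken from any of the panelists' systems.

```python
import numpy as np

def rank_candidates(probe, gallery, top_k=5):
    """Rank gallery face embeddings by cosine similarity to a probe embedding.

    probe: 1-D vector for the missing child's photo (hypothetical 128-d).
    gallery: dict mapping an image id to its embedding vector.
    Returns the top_k (image_id, score) pairs, best match first, so a
    reviewer inspects a handful of images instead of thousands.
    """
    p = probe / np.linalg.norm(probe)
    scored = []
    for image_id, emb in gallery.items():
        # Normalized dot product = cosine similarity in [-1, 1]
        scored.append((image_id, float(np.dot(p, emb / np.linalg.norm(emb)))))
    scored.sort(key=lambda pair: -pair[1])
    return scored[:top_k]
```

In practice, the hard part Federico mentions, matching across years of aging, lives inside the embedding model itself; the ranking step above only turns its output into a short review queue.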
In California, we found that 72% of trafficking victims are US citizens. It is disproportionately affecting our foster youth population: these traffickers are looking for victims who can go missing without people looking for them. That's why it's so important that we band together to do the right thing by these children who don't have families looking out for them. So all the trends are mobile...

Do you mind if I break in here? Something you all should know about Lisa: she's been instrumental in developing the relationship between Intel and NCMEC to further technology solutions, and you can see from that last sentence that she really lives and breathes this. We're incredibly fortunate to have people like Lisa helping to extend our resources and put things in place to find kids and get them home.

Thank you. As a collaborative group, we really have a vision of technology being part of the solution to this problem. The trends of social, mobile, and cloud have created a playing field for some bad actors to hijack technology that was never intended for this purpose. I'd love to hear a little bit more about some of the ways that you are creating tools and capabilities. I think Cloudera is actually the platform for the Spotlight tool that Federico mentioned; could you share with us a little bit more about how Cloudera plays a part in creating these solutions?

Sure. Spotlight, which Thorn created, is actually built on Cloudera's technology. When we think about open source, a lot of people think it's a little less secure or not as cutting-edge, and you see the opposite come true, because you have a community of people really working together, regardless of company affiliation, to band together and build the best product and the best outcome that you can have. And as you can see in this one case, open-source technology is really helping with identifying missing people.
I think in the 2016 impact report that Thorn published, even though Federico already said there were over 2,000 children identified, there were over 6,000 people identified in total, so I want to make sure all those victims are accounted for, and you can really see the impact of how technology is able to help in this space. Last year we also announced our strategic partnership with Thorn. A lot of that was backed by our co-founder, Mike Olson, so we are very, very lucky at Cloudera to have co-founders and board members who are very passionate about giving back and making sure that what we do has a great impact on everyone else in our community. We also launched a hackathon at Grace Hopper: every year Cloudera is part of Open Source Day, which is a hackathon at the Grace Hopper Celebration, which, for those who don't know, is the largest women-in-tech conference in North America. So not only were we able to introduce this issue to younger women who are just entering their careers, we were also able to help mentor them. In that hackathon we were really looking at the problem of the dark web. I don't know if everyone is aware, but in the 80s and 90s, before the internet exploded, exploitation of children had almost been solved: we were looking at all the main channels and rooting it out. With the introduction of the internet came the dark web, which introduced a lot of different ways for these perpetrators to find and discover new ways to communicate with each other, to traffic these children, to sell them. Now we're getting into the problem of how we use our current technology to combat that. How do we go across company lines and work with what we normally would see as competitors? I think everyone here can agree that we've seen great leaps and bounds as the technology community has banded around this problem to try to solve it.
I'd like to give you the internal lens on that. My career up until joining NCMEC about a year ago was in early-stage, rapid-growth, entrepreneurial software technology companies, where there's a lot of focus on maintaining your intellectual property and making sure you keep everything that builds your value. Something that's been spectacular, as we're building a technology group at NCMEC with our technology partners and a technology task force, is seeing these companies, which all have incredibly important and valuable IP, just drop the barriers and work together to solve these problems with us. It's been really interesting seeing that interplay: seeing companies that are normally competitors cooperating on these problems is just fantastic.

And something I wanted to add from my experience as a volunteer: on my project I work together with engineers from Google and engineers from Intel, and we're all just working together trying to solve a problem. There's a demo we have back there, and it's a great story, because I started doing some work on the face recognition, and then engineers from Intel thought, hey, this is a great way to showcase it, and it was by combining ideas among all of us that we were able to have a really good story. I don't feel I would have had that experience were I not volunteering with a nonprofit. I have been getting great value just from being able to work across the industry with a nonprofit, and that's something I feel very proud of. And I know that Thorn has had more than 350 engineers volunteer at some point, and over 50 companies who have come and said, we want to help. It is this industry support that really empowers and allows the nonprofit to move forward.

What's easy about my job now is that at the end of the day I can ask one of two things: does this technology help save kids? Does this technology help reduce exploitation?
So for me, the metrics on whether or not we're accomplishing something are very easy, but having the capability to expand our resources through partnerships is key. A good example is the hackathon we have going now: 25 teams signed up to help with a specific missing-kids area, to help enrich data and help us find children. And it's an unusual approach in that it's going to be iterative: over several conferences, several iterations of fine-tuning the technologies. Looking at ways, and finding ways like you all have, for us to move the technology forward is critical. As you know, the dark web is being used more and more often as a mechanism for exploiters to trade content, and over the past few years we're also getting many more videos. So we need technologies that can identify when videos contain bad content; often the exploiters will hide the CP, the child pornography, within an innocuous section of video. So there are constantly things that we need help with in this cat-and-mouse game of being able to stop the mice and get good results.

Lisa touched on one of the challenges, which is sometimes depressing, and Mark brought it back in a positive fashion: the scale of this abuse. Every one of those nine million reports is an individual action, or group of actions, that may indicate abuse in the real world. Each one of those could lead to a child, and the challenge is really to find those behaviors and that content, but also to tie it back to real impact. That's one of the challenges we're working on at Google these days: figuring out how, in that giant stream of reports, we can point out that there's more likely to be a child behind this one, or that this person is more likely to be abusing children in the real world; to take those digital actions, which are terrible, and to prioritize them and find the greatest action points to really make an impact in our world and save kids.
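Mark mentions exploiters hiding bad material inside an otherwise innocuous video. One common line of defense, sketched here in a simplified, hypothetical form (real systems are far more robust to re-encoding, cropping, and other transformations), is to hash sampled frames and compare each against a set of perceptual hashes of previously identified material:

```python
def hamming(h1, h2):
    """Number of differing bits between two integer hashes."""
    return bin(h1 ^ h2).count("1")

def flag_frames(frame_hashes, known_bad, max_distance=4):
    """Return indices of frames whose perceptual hash is within
    max_distance bits of any known-bad hash.

    frame_hashes: per-frame hashes sampled from the video (assumed to be
    produced by some robust perceptual hash, not shown here).
    known_bad: iterable of hashes of previously identified material.
    """
    hits = []
    for i, h in enumerate(frame_hashes):
        if any(hamming(h, bad) <= max_distance for bad in known_bad):
            hits.append(i)  # frame i needs human review
    return hits
```

Even a single flagged frame can surface a video whose surrounding content is innocuous, which is exactly the hiding tactic described above.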
Federico, do you mind sharing a little bit about Microsoft's PhotoDNA project, and your journey through this timeframe? You've been working in this space for a considerable time now; how has it evolved over the years?

Absolutely. The way I got involved in this space was really through my love of technology. I've always loved computer vision, and there was an opportunity to work on a project at Microsoft called PhotoDNA. PhotoDNA is a technology that allows you to determine whether a piece of content could be known child sexual abuse material. I worked on that project as a developer, and the project took off; we saw that other companies were adopting it. I remember thinking back that, of the work I had done in the last couple of years, that was the one piece of work I was most proud of. So I decided, you know what, I want to dedicate some of my time to keep helping in this space, because I know a good idea, I know technology, can help. A couple of years after that project, Microsoft opened the PhotoDNA Cloud Service, a service that allows other companies to call an API to determine whether content could be known child sexual abuse material.
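PhotoDNA's actual algorithm is not public, so as a purely illustrative stand-in, here is a classic "difference hash," which captures the general idea behind robust image hashing: visually similar images yield hashes that differ in only a few bits, so known material can be recognized even after small edits, unlike a cryptographic hash, where any change scrambles everything. The grid input and sizes are assumptions for the sketch.

```python
def dhash(pixels, hash_size=8):
    """Difference hash of a grayscale grid of hash_size rows by
    hash_size + 1 columns (the image is assumed already downscaled).

    Each bit records whether a pixel is brighter than its right-hand
    neighbor, so the hash reflects image structure, not exact bytes.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(h1, h2):
    """Bit distance between two hashes; a small distance suggests the
    same underlying image."""
    return bin(h1 ^ h2).count("1")
```

A cloud service of the kind Federico describes can then put this sort of comparison behind an API, so individual companies never need to hold the database of known-material hashes themselves.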
And I met a group of people at Microsoft who were just as passionate as I am, and the Hack for Good community in the company started to grow, and that really inspired me to keep working. Then, two years ago, in the 2015 Microsoft hackathon, I thought, well, this National Center for Missing & Exploited Children, the work is with exploited children, but I want to think about missing children too. That was the start of the Child Finder service. After that, I connected with Thorn and was able to start working with them on that project. It's been a journey for me, but something I can tell you, having done volunteer work for many years, is that I've gotten so much back from it, I think much more than I have given. On the professional side, I learned a lot of new skills; it's really been great for me. And the fact that I get the opportunity to do work that helps others is something that keeps inspiring me. So if anyone has a cause that they're passionate about, I think it's something great to pursue. Do good to do well, right?

I want to jump on that and give a plug for NCMEC: we actually have three software engineering positions open now, so if there's any interest in joining, please follow up with me directly. Reach out to me, I'd love to talk with anyone.

Thank you. Any other comments about how AI is playing a role, how the tech community and things have evolved over the years?

Well, there are a couple of things I want to touch on there. The first is that there's a lot of reactive behavior in the tech community responding to this sort of abuse, but with the scale of the problem, there are also opportunities to reach out and really effect change before the abuse starts. So, and I'm going to butcher this, I apologize to the teams responsible.
In some 30-odd countries now, we run house ads. When users come to Google and seek child sexual abuse material, we have query classifiers that use artificial intelligence to figure out when we think users are looking for this sort of content and to steer them to NGOs and help in their local country that can address these problems before they escalate to hands-on abuse.

Thank you, Nick. Well, I think as a tech community we can all agree that we feel this call to action. This is such an important social issue; I think it defines us as humans that when we see something, we do something about it. The community is really reacting to our platforms being misapplied and misused; this was never their intention. And I think the next generation is going to come on and say: we are going to hack, we're going to innovate, we're going to do whatever we can to take back control and get ahead of these bad actors. Do you mind sharing a little bit with us, Alison, since you run Cloudera Cares, from that philanthropic, pro bono volunteerism perspective, how you see this evolving?

Sure. I think the one thing everyone should keep in mind is that even if you don't have a formal program at your company, there's always time to enact change, always time to make things different. When I joined the company, a little over three years ago now, we didn't really have this program. So we announced it; it was employee-led and driven. There were about five or six of us at the time who really thought, we're going to do this. We didn't care if we were just going to volunteer at a soup kitchen or go to the food bank; we wanted to give back, and we wanted our local offices to have that opportunity. And with that, it just kind of snowballed.
So we started with those, then we moved on to a few hackathons and partnerships, and eventually we got to the place where we're committing to donating our software, our IP, our professional-services hours, technology, everything. So there really is a lot of change that one person can drive: get a group of people together, start doing things, and you never know what can happen. I think Cloudera Cares is a shining example of that. We're now global; we're in Latin America, Europe, Asia, North America. So you can look at all the different places, and you can see that five people decided to enact change, and a few years later, we really have.

And that's really the amazing thing about this space: I've never seen a more open exchange of information and tools and techniques and abilities among industry partners and NGOs. If you have a product, an environment, content, or a situation where you think there's the potential for abuse, you'll find a team of people willing to donate not just their time and expertise but conceivably technology, resources, investigative resources. There are a lot of tools in this space that you don't find in any other problem area, and the ability to collaborate is really unprecedented.

Yeah, and I'm going to jump in on that too. As Mark's been saying, our hackathon is actually a virtual hackathon that runs for 30 days. We've seen many other companies jump in and volunteer not only their time but also, hey, here's a product that we already have; is there a way we can integrate it, if not into this hackathon, maybe into subsequent hackathons? We're really trying to figure out how we become a lot more efficient at solving these issues, and we're really hoping that extends to our hackathon teams as well. It's not too late to sign up, and we're really hoping that people are banding together and working for one cause rather than siloing into different teams.
So, we're really excited about what we've seen so far. Mark, can you talk a little bit about the role of volunteerism at NCMEC and how it helps you accomplish your mission?

Sure. I deal with all the technology there, but technology volunteering is only a small part of the overall volunteer landscape at NCMEC. We've formed different groups of volunteers to help us with different facets of our mission. There's a group called Team Adam, named for Adam Walsh, John Walsh's son who was killed. Team Adam is a network of generally retired law enforcement professionals around the country; when there's an urgent missing case, they'll activate a team in the particular area to help gather information, coordinate the law enforcement response, and hopefully recover the child. We also have a group of volunteers, Team HOPE, that helps work with families of missing children. And we have a Project ALERT team for long-term missing cases, which helps with what we call comprehensive case reviews: often, in a long-term missing case, over the period in which the individual has been missing, there have been significant changes in the technology you can bring to bear, not only information technology but also biometrics and other things. So we are cognizant both of the need for resources to help further our technology and of the day-to-day need to make us effective nationally as an organization.

Thank you. I first want to say thank you to the panelists. We wanted to open it up for a more interactive experience now, so if anyone has any questions, we would love to hear them.

Where do you sign up?

Oh, that's a really good question. For Thorn, for people who are interested, please visit wearethorn.org/SXSW, for South by Southwest, and you'll be able to sign up there. We welcome volunteers, and we have great projects to work on.
And we would love to have more people on our team. Anyone who's looking for the hackathon can find it at childfinder.hackerearth.com. And finally, for NCMEC, the easiest website to remember is missingkids.org. I'm on the leadership page; feel free to reach out to me directly, and I'd be happy to respond to any inquiries anyone might have.

Thank you for shining a spotlight on this issue. It is a local issue, too; we're seeing it in San Antonio, and I'm involved locally in civic engagement. I think that's where it starts: grassroots, bringing awareness to the issue. It is more than these kids being runaways, and I feel like a lot of times they're dismissed as runaways when the bigger issue is that there is child exploitation. So thank you for bringing attention to the issue.

Talking about the local landscape, we do have an office in Austin. So if you have time to volunteer, or know of anyone who's interested in becoming involved, there's an Austin, Texas office of the National Center; you can reach out to them. And there's a mandate right now for states, called Children Missing From Care, where foster homes and other organizations are required to report all of their missing kids. And you're right: before I joined NCMEC, I wasn't really aware of how pervasive the problem is. Once you're in this sphere and you're noticing everything happening, it really is all around us. So thank you for bringing it up.

Two and a half million people in America are sold against their will every day. It is absolutely our modern-day slavery, and we can all do something about it. Awareness doesn't necessarily equal change, but I do believe in the type of people who would show up to hear this panel, with all the other cool things you could be doing at South by Southwest right now. We really appreciate you helping to carry this message forward to a broader audience.

Can you hear me?
Yeah. My name is Isabel. I run a social startup in Peru, but I grew up in Vienna, Austria. And Austria, unfortunately, is also known, among other things, for these cases of men having kept their daughters, or sometimes just a random girl, in their home for years, sometimes for decades. I grew up knowing about one of these cases, Natascha Kampusch, et cetera. So my question is, how do you identify these kinds of cases? I don't know if in those cases these people watched pornography or produced child pornography with their victims, or just kept them in their home. How do you find these cases? Because I guess they're invisible.

So I can speak a little bit to that. We don't get a lot of feedback in this domain, but every month or two I'll get a call from an assistant district attorney or a DA or someone else, and you'll see these stories in the press now and then. And unfortunately, at least what we observe at Google, there's a sad and terrible social aspect to this. This may happen in isolation, but in a lot of cases there are communities that organize around this that are terrible. It's very hard to avoid leaving digital footprints these days, and it only takes slipping up once or twice, and those behaviors tend to recur. There may be people out there who are capable of evading this, but my experience has been that people tend not to be terribly sophisticated. So all it takes is one message on a user board or sending a photo to a friend, and that's the canary in the coal mine that we need to send a tip to NCMEC, to put our investigative teams into action and for them to follow up and put all the pieces together. And that's the point: it doesn't work without both us on the ESP side and the investigative resources. It's definitely teamwork.
What we do is facilitate gathering information for law enforcement. So we're not law enforcement; we help give law enforcement the tools they need to track things down. But I thought there was also an aspect to your question of children perhaps being abused in environments where it's very private, where there's no visibility. And the only two things I can think of that really help with that are education and prevention. So we do have resources, we provide training, and we also have a couple of major websites. One is KidSmartz, one is NetSmartz, to help give children information on how to prevent, or try to prevent, being in situations where that might be likely to occur. Unfortunately, there's no panacea, as you know. I'd love for us to become obsolete, but unfortunately I think it's necessary that we keep expanding the mission and becoming more effective and better equipped.

Hi, I'm Mike Cronin. I cover technology for the Austin Business Journal, and thank you very much for revealing this problem. I had no idea. What I would like to know, though, is would you be able to provide two or three specific examples of AI tools being used to solve this problem? I heard facial recognition, but the rest of your conversation has been: we're all working together and we're sharing information, but we don't know what you're doing.

Let me lead on this one. So I won't, and the reason you don't want to do that is you don't want to provide information that might help the people who are exploiting kids to know what to do and not do to avoid being caught. So my position, and NCMEC's position, is that we want to highlight partnerships and highlight advances, but really not give too many details. We can give details about cases and about successes we've had and areas in which new technology has helped, but we don't like talking about the specific technology facets that help us to make those connections.
I understand that, because as a journalist this is what we've been dealing with since 9/11. However, is it possible to share something so those of us who are laypeople can get an idea?

So there was a famous case. I'll give you an example of a comprehensive case review, which means that technology advances so we can make connections that we couldn't have in the past. There was a horrible case in the Northeast referred to as the barrel case, where at a national park a barrel was found with two bodies inside, and ten years later, this was in the 70s and 80s, another barrel was found with two more bodies inside. They were able to determine forensically at the time, so this was the mid-80s, that it was a woman and three children, and they believed it was a family unit, but they didn't really have many leads and weren't able to move forward. We did a comprehensive case review, I think it was 2013, where we looked again with techniques we have available now. One of the techniques that's available that wasn't in the past is isotope analysis: through DNA and bone fragments you can determine where people have lived, what they've eaten, how they've traveled. And they realized through this analysis that three of the four had been traveling and located together, and the fourth, one of the children, was not associated in the way a family group would be associated. From that they were able to focus the investigation. They have a person of interest now who is the father of one of the children that was found in the barrels and had a geographic coincidence with the one child who had not traveled with the others. So that's one example of a technique that just wasn't available in the past. There are also novel ways from DNA now to retrieve hair color, eye color, skin tone, racial background; that was used in a case in South Carolina under a comprehensive case review.
A fisherman found a skull in a net, and they had thought that the skull was the skull of a girl, a white girl. It turned out through a CCR a year or two ago that it was a black boy. I don't know that that case has been solved, but certainly the techniques that we're able to bring to bear over time are instrumental in closing some of the long-term cases that have been around quite a while.

Do we have time for one more? Oh, wonderful, we have time for a few more, so please.

Hey, obviously NCMEC is bound to the US, I'm assuming, right? But each of you are at companies that are international. So how are you taking the effort that you're doing here and bringing it to other parts of the globe? I'm in the UK right now, and I'd love to be able to turn my company's resources toward this.

So the one thing we do internationally is we get all of these cases through our cyber tip line. 97% of them are international versus 3% that are US domestic. And so we do coordinate with international law enforcement. One of the areas where we're getting help from Intel is determining where to distribute those international cases, but you're right, we're bounded by the US.

If I think about Thorn, the tools that we build, we build in technology, right? And it's all about making the technology available to the people who can use it. I don't know how far we've gone internationally, but I think that as we build technology, we want to be in a position to share it with people who can actually use it.

So NCMEC is our primary conduit for communicating with law enforcement internationally. And they do a fantastic job of sending the data that we report to them, as mandated by law, to the appropriate authorities, which is supremely helpful. But we also have outreach in a number of countries. In fact, in the UK, I won't name our public policy lead, since I'm not sure if she's still in that role.
But we work with Interpol, and there are Canadian federal law enforcement agencies we work with. We partner with groups across the globe that have information that's useful in this pursuit, and also partner with them to try to figure out what's useful. So in the US it's very easy, if we send a cyber tip to NCMEC and it goes to your local Internet Crimes Against Children office in Austin, for them to come back and request information from us. But overseas, that's not necessarily the case. So we have to produce actionable intelligence for these sorts of crimes that those local authorities can act on in isolation. And we communicate with those groups as best we can to try to make sure that when there's a crime in progress, we communicate enough information for them to stop it.

So with the hackathon, I think one of the main things is we're trying to figure out: are there patterns? Are there things that we really need to look at? Can we apply some of these algorithms to data sets not only within NCMEC but globally? NCMEC is a great partner globally, and that's one of the main reasons why I think we were brought in by Intel. Lisa actually got in contact with us saying, look, we know what you guys do with your technology; we want to enable NCMEC to do this internationally, right? We're seeing this huge glut of data that's just waiting to be processed and sent back to international agencies. And we're able to look at that in a holistic view: if we're not just helping one nonprofit, we're really helping more than that.

And one last point on that. Mark gave some stats as far as where the abuse is localized, but it's actually incredibly difficult to determine this sometimes.
So we did a project with NCMEC a while ago to try to provide information to localize the abuse, given that there are proxies and all sorts of other things that one has to look through to figure out where the actual abuser is.

While we're waiting for the next question: I would really appreciate it if, when you've learned something today that you find interesting, you share it on social media. Please use the hashtags #IntelAI and #AIForGood so that we can start making some of this trend and really raise awareness for everyone. Please go ahead.

So yeah, it sounds like NCMEC has a very large corpus of data, one that's exploding, and one that's obviously very sensitive, data coming through your tip line. How do you make that, or will you make that, available to startups that have interesting technology that will process it and distill it or draw insight from it?

That's a great question. So we're starting a tech innovation lab this year, and as part of that, Intel will be working with the data at NCMEC headquarters. There are some very stringent rules, obviously for great reasons, restricting the sharing of child pornography. So we have a large corpus that's maintained there that can't be shared, can't be copied, but is important for things like machine learning to be able to draw inferences and create good models to be able to recognize things. So the way we're planning on making it available is through partnerships inside our tech innovation lab. And it's worth noting that there are interchanges amongst industry partners as well for sharing this sort of content, not the content itself, but derived features that are suitable for learning and matching and other things like that, specifically imagery.

Does NCMEC get a lot of calls? Do you have a lot of incoming calls? Yes, we do.
In fact, we have a 24-7 hotline that we maintain, and I'm not gonna tell you the number because I don't remember the exact number, but they're busy day and night. It was originally all missing-child calls; we have a large number of exploitation calls now, and we have people who have questions around child safety. So that's one of the two main ways we get input from the public, and the other is the cyber tip line. We also have a website, or web form, and other mechanisms to communicate with the call center. We have one in DC and one in Florida, and they run all the time.

I will also point out to everyone, if you have an opportunity to be here on Tuesday from 9:30 to 10:30, Intel's executive Diane Bryant will be giving a keynote here, and she'll be sharing a little more detail about some of the work that we're doing at Intel Inside Safer Children Outside. So we would love to see you come and see that. And we also have a demo of the Child Finder tool that Thorn is developing in the back of the room. So if anybody's interested in learning maybe a little more about the AI side of things, we welcome you to join the demo right back there. Yeah, it's right back there.

And so, are there any other questions that we can answer for the audience? And I wanted to mention another hashtag that we have with Thorn, which is #DefendHappiness. And this hashtag is all about ending trafficking. If you look it up, that's another hashtag that you can use when you share on social media.

Thank you so much. We really appreciate the engaged audience we have today. So great to be here, and thank you. Enjoy yourselves, guys.