Live from Stanford University, it's theCUBE, covering the Global Women in Data Science Conference, brought to you by SiliconANGLE Media.

Welcome back to theCUBE's continuing coverage of the fourth annual Women in Data Science Conference, WIDS; use hashtag WIDS 2019 to join the conversation. I'm Lisa Martin, joined by one of the speakers on the career panel today at Stanford: Natalie Evans Harris, co-founder and head of Strategic Initiatives at BrightHive. Natalie, it's a pleasure to have you on the program.

So excited to be here, thank you.

So you have, which I can't believe, 20 years' experience advancing the public sector's strategic use of data.

Nearly 20 years: a 16-year career at the National Security Agency and 18 months with the Obama administration.

You clearly were a child prodigy.

Of course, I was born in 1992.

So tell me a little bit about how you got involved with WIDS. This is such an interesting movement, because that's exactly what it is. In such a short time period, they're expecting about 20,000 people watching the live stream today here from Stanford, but there are also 50-plus countries participating, with 150-plus regional events. You're here on the career panel. Tell me a little bit about what attracted you to WIDS and some of the advice and learnings that you're going to deliver this afternoon.

Sure, absolutely. So WIDS, the Women in Data Science program and conference, and what it's evolved into, are the exact type of community, collective-impact initiatives we want to see. When we think about where we want data science to grow, we need to have diversity in the space. There have already been studies showing that the majority of innovations and products that come out are built by white men, and from that lens, you often lose out on the African-American experience or other diverse racial and demographic experiences.
So you want communities like Women in Data Science to come together and show: we are a part of this community, we do have a voice and a seat at the table, and we can be a part of the conversation and the innovation. And that's what we want, right? To come together and see thousands of people talking, and to walk into a room of diverse ages and diverse experiences, it feels good, and it makes me hopeful about the future, because people are going to be the greatest challenge for data science in the future.

So let's talk about that, because a lot of the topics around data science relate to data privacy and ethics and cybersecurity. If we look at the amount of data that's generated every day, 2.5 quintillion bytes, there's a tremendous amount of potential for good; we think of machine learning in cancer research, for example. But we're also at this data revolution. I read a blog that you co-authored about a year ago called "It's Time to Talk About Data Ethics," and I found it so interesting, because how do we get control around this? We all know that there are so many great applications of data that we benefit from every day, but there's also been a lack of transparency on a growing scale. From your perspective, what's the human capital element, and how can it be influenced to really manage data in a responsible way?

I think we're recognizing that data can solve all of these really hard problems, and we're collecting these quintillions of bytes of data on a daily basis. So there's an acknowledgement that there are things humans just can't do, and AI and machine learning are great ways to increase access to that data, so we can use it to start to solve problems. What we also need to recognize is that no matter how good AI gets, there still need to be humans as part of that context, because the algorithms are only as strong as the people who developed them.
So we need data scientists, we need women with diverse experiences, we need people with diverse thoughts, because they're the ones who are going to create the algorithms that make the machine learning and the technology more powerful, more diverse, and more equal. So we need to see more growth in people's experiences and learnings. The thing I talk about when others ask me, and what I'll mention on the career panel, is that when you think about data science, it's not just about teaching the technical skills. There's an empathy that needs to be a part of it. There's the skill of being able to ask questions of the data in really interesting ways. When I worked at the National Security Agency and helped build the data science program there, every data scientist that came into the building, we of course taught them about working in our environment, but we also made every single one of them take a class on asking questions, the same class that we had our intelligence analysts take. In the same way the history and foreign-language experts needed to learn how to ask questions of the data, we needed our data scientists to learn that as well. That's how you start to look beyond just the ones and zeros and really think about not just the data, but the people who are impacted by the use of that data.

Well, that's really one of the things I find interesting about data science: how diverse it is, and I use that word specifically, because we talked about thought diversity, but it's not just the technical skills, as you mentioned; it's empathy, it's communication, it's collaboration. It's, like I said, such a diverse opportunity. One of the things I think I read about in your blog: we need to train people not just to analyze the data, but to be confident enough to raise their hand and ask questions. How do you also train people to handle data responsibly?
You mentioned this notion of a sort of Hippocratic oath for data scientists, like the one medical doctors take, and I thought that was really intriguing. Tell me a little bit more about that. How do you think data scientists in training, and those who are working now, can be trained and influenced to actually take something like that, in terms of really individualizing that responsibility for the ethical treatment of data?

So toward the end of my time at the White House, myself, DJ Patil, and a number of experts and thought leaders in the space of news and ethics and data science came together and had this conversation about the future of data ethics and what it looks like, especially with the rise of fake news and misinformation and all of these things. Born out of that conversation was the realization that if you believe people inherently want to do the good thing, the right thing, how do they do that? What does that look like? So I worked with Data for Democracy and Bloomberg to issue a study and just ask: data scientists, what keeps you up at night? What are the things that, as you build these algorithms and do this data sharing, keep you up at night? The things that came out of those conversations, the working groups, and the community of practice now were just what you're talking about. How do we communicate responsibly around this? What does it look like to know that we've done enough to protect the data, to secure the data, to use the data in the most appropriate ways? And when we see a problem, what do we do to communicate that problem and address it? Out of that community of practice and those principles really came the start of what an ethics oath, a Hippocratic oath for data science, could look like. It's a set of principles. It's not the answer, but it's a framework to help guide you toward your own definition of what ethical behavior looks like when you use data.
It also became a starting point for many companies to create their own manifestos and their own oaths, to say: as a company, these are the values that we are going to hold true to as we use data. And then they can create the environments that allow data scientists to communicate how they feel about what is happening around them and effect change.

It's a form of empowerment. Amazing, I love that. In the last 30 seconds, I just want to get your perspective: here we are, spring of 2019. Where are we as a society on data equaling trust?

Ooh, I love that we're having the conversation. We're at the point of recognizing that data is more than ones and zeros, and that it's become such an integral part of who people are. And so we need some rules to this game. We need to recognize that privacy is more than just virus protection, and that there is a trust that needs to be built between the individuals, the communities, and the companies that are using this data. What the answers are is what we're still figuring out. I argue that a large part of it is just human capital: making sure that you have a diverse set of voices, almost a brain trust, as a part of the conversation, so you're not just going to the same three people and asking what we should do, but you're growing in "each one, teach one" and building a community around collectively solving these problems.

Well, Natalie, it's been such a pleasure talking with you today. Thank you so much for spending some time with us on theCUBE, and have a great time on the career panel this afternoon at WIDS.

Thank you so much. This was a lot of fun.

Good, my pleasure. We want to thank you. You're watching theCUBE, from the fourth annual Women in Data Science Conference, live from Stanford University. I'm Lisa Martin. I'll be back with my next guest after a short break.