Live from Cambridge, Massachusetts, it's The Cube at the MIT Chief Data Officer and Information Quality Symposium with hosts Jeff Kelly and Paul Gillin.

Welcome back everybody, this is The Cube and we are live at the MIT CDOIQ Symposium here in Cambridge, Massachusetts. I'm Jeff Kelly, I'm here with my co-host, Paul Gillin, and we're here covering all things data, talking about the rise and evolution of the Chief Data Officer position. Data governance, of course, is a big topic of conversation as well, and included in governance, of course, are privacy considerations. Happy to be joined now by Jennifer Barrett Glasgow, who's the Chief Privacy Officer at Acxiom. Welcome to The Cube.

Thank you. Glad to be here.

So, for our audience members who maybe aren't that familiar with Acxiom, can you tell the audience a little bit about the company and your role as Chief Privacy Officer? I don't think we've ever had a Chief Privacy Officer on The Cube, Paul, so this is a first, and I'm looking forward to this conversation.

Yes, I will. Acxiom is a company that's been involved in all kinds of big data applications for three or four decades now, and obviously it's an evolving field that has resulted in new positions like the Chief Data Officer. But as a privacy officer, I worry about the things that we do with data. Separate from security, where you're actually trying to protect it from unauthorized use, I worry about: if it's authorized, what do you do with it? So Acxiom has been involved in big data for decades. We have two primary lines of business. One is hosting services, where we manage the information for our clients, host it, and help them make good big data decisions. We offer analytical support and also consulting to help them understand what they can do with big data. We also have a line of information products, both business and consumer, where we augment the information that our clients have with additional data.
And so the Privacy Office focuses both on how we can help our clients manage information in a responsible way, and also on the practices that we ourselves are engaged in and the laws that we have to follow, where there are laws, around our own data products.

So I think that second line of business is probably the one that most people know Acxiom by. So you're collecting data from various sources; could be transactions, credit card transactions, any kind of transactions, could be credit scores, things like that. What are the kinds of data that you collect and use to augment your customers' databases?

Good question, and one that is often misunderstood, because what we're looking for is not the details of your life. We're not a law enforcement agency; we're not involved in national security issues. We're looking for the kinds of things that you like to do, and therefore the kinds of products and services that you're interested in. The vast majority of our business revolves around marketing activities, and so a marketer wants to know not just what a customer has bought from them thus far, but also things like: do you have children at home, do you like to travel, what are some of your hobbies? Golfing, tennis, and so on. So we don't get transaction information, which is a common misunderstanding. We get aggregate or summary information that just indicates the kinds of things that you're involved in, to create a bit of a demographic view into your household, because often we're marketing to the household and not just to you.

Right, so you're really helping clients better present offers or marketing campaigns to their customers. Really better targeting, so that they're getting offers and information that's going to apply to their lives, versus more of a blanket approach.
Exactly, and it's both to their current customers as well as to people that they would like to have become customers.

Right. Okay, so let's talk a little bit about privacy in that situation. We've heard some of the horror stories out there about organizations such as Target that have done things where maybe they shouldn't have; not necessarily illegal, or necessarily anything against the rules or regulations per se, but questionable in terms of whether there are ethical concerns or privacy concerns. So how do you approach that question? We talked a little bit beforehand about how just because technology and data allow you to do something doesn't mean you necessarily should. How do you advise customers when they're thinking about some of these privacy and ethical concerns?

Well, as you accurately state, the gap between what we can technically do and what we should do, or what consumers would feel is appropriate, or not be shocked or horrified by, is growing. Technology is fast outpacing law, or even industry best practice. Even though industry, I think, is beginning to be more aggressive at closing that gap, certainly law is falling way behind. So companies have to ask themselves: what would the consumer think about this? How would they react? Are there risks that I'm posing or creating by this particular use of data that impact the consumer, not just me as the business using the data? We often refer to these as privacy impact assessments, and they are actually a risk analysis that the company should be undertaking to understand these risks and then attempt to mitigate them.

We have such powerful tools today, and marketers presumably are a major customer of yours.

Yes.

We have such powerful tools today to cross-reference information and to market very specifically, to a creepy level, you might say. What kind of guidance do you give your customers on how far to go with individualizing their marketing?
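The privacy impact assessment described above is, at its core, a structured risk analysis: score how risky a data use is to the consumer, then credit the mitigations. A minimal sketch of that idea, assuming a made-up scoring scheme (the field names and weights below are hypothetical illustrations, not Acxiom's methodology):

```python
# Toy "privacy impact assessment" scorer. Illustrative only:
# all fields and weights are hypothetical, not a real PIA framework.

from dataclasses import dataclass

@dataclass
class DataUse:
    """One proposed use of consumer data under review."""
    data_sensitivity: int   # 1 (low, e.g. hobbies) .. 5 (high, e.g. health)
    use_sensitivity: int    # 1 (aggregate analytics) .. 5 (individual targeting)
    consumer_visible: bool  # is the practice disclosed to the consumer?
    opt_out_offered: bool   # can the consumer decline?

def pia_risk_score(use: DataUse) -> int:
    """Combine data and use sensitivity, then credit mitigations.

    Risk grows with the product of the two sensitivities; transparency
    and choice each reduce it, mirroring the mitigation step the
    assessment is meant to drive.
    """
    score = use.data_sensitivity * use.use_sensitivity
    if use.consumer_visible:
        score -= 2
    if use.opt_out_offered:
        score -= 2
    return max(score, 0)

# A sensitive, undisclosed use scores far higher than the same use
# with disclosure and an opt-out in place.
risky = DataUse(data_sensitivity=4, use_sensitivity=5,
                consumer_visible=False, opt_out_offered=False)
mitigated = DataUse(data_sensitivity=4, use_sensitivity=5,
                    consumer_visible=True, opt_out_offered=True)
print(pia_risk_score(risky), pia_risk_score(mitigated))  # 20 16
```

The point of the exercise is the comparison, not the numbers: the assessment makes the risk visible before the use ships, so mitigations can be added while they're still cheap.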
As we move down this path towards a world where big data is pretty much ubiquitous, I think we're beginning to push the comfort limits of consumers in a lot of areas. And so we talk closely with our customers, and we build into our own data products a kind of classification system that asks: how sensitive is the data we're working with, and how sensitive is the use we're putting it to? The measurement needs to include both aspects. If the data is particularly sensitive, say something about my medical situation or something specifically related to my finances, then there need to be extra protections and precautions around not only how it's used but who uses it. We've built these in, and we advise clients to do the same kinds of things.

But it's not just the data you sell. It can also be the data that's inferred. And Jeff's example of Target, the famous New York Times story last year about Target identifying a customer who was pregnant before her family knew; that was a case of looking at buying patterns and relating them to larger-scale patterns that they had seen. Is that something where you think we're pushing the line as well?

I do. We are creating far more modeled data, as in the example you gave, than we ever had before, and our analytical tools can very accurately predict very sensitive things. So often the examples that are given are ones where we start out with non-sensitive information that we don't think needs as much protection, and we end up predicting something that could be very sensitive. When we cross that path, we have to say: aha, I really created new data. And now I have to classify that data appropriately and put the right kind of protections around it.

Do you worry about some sort of a privacy Chernobyl occurring? Some event that really galvanizes public opinion about personal data and perhaps prompts government attempts at regulation to clamp down on what marketers do.
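The classification idea above, including the rule that modeled data must be reclassified by what it predicts rather than by its inputs, can be sketched in a few lines. This is an illustrative toy, assuming hypothetical field names and tiers; it is not a real product's classification scheme:

```python
# Toy data-sensitivity classifier. All field names, tiers and controls
# are hypothetical illustrations of the idea described in the interview.

# Tiered classification: higher tier -> stricter handling controls.
SENSITIVITY = {
    "hobbies": 1,
    "household_demographics": 1,
    "income_band": 2,
    "financial_detail": 3,
    "medical_condition": 3,
}

CONTROLS = {1: "standard", 2: "restricted-use", 3: "restricted-use+access-logging"}

def classify(field: str) -> int:
    # Unknown fields default to the middle tier rather than the lowest.
    return SENSITIVITY.get(field, 2)

def classify_inferred(source_fields: list[str], predicted_field: str) -> int:
    """Modeled data takes the sensitivity of what it *predicts*, not just
    of its inputs: inferring a medical condition from shopping habits
    yields a tier-3 field even though every input was tier 1."""
    return max(classify(predicted_field),
               max(classify(f) for f in source_fields))

tier = classify_inferred(["hobbies", "household_demographics"],
                         "medical_condition")
print(tier, CONTROLS[tier])  # 3 restricted-use+access-logging
```

The Target pregnancy-prediction story is exactly this path: tier-1 purchase summaries in, a tier-3 inference out, and the protections have to follow the output.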
Do you foresee that as a possibility?

Well, I think it's something we need to be sensitive to. I don't foresee it as a near-term possibility, in particular because our federal government is not engaged in these kinds of issues right now. And so I think we've got a year or two of a window where companies are going to have to take on these concerns themselves and deal with them, because, quite frankly, market pushback can be worse than government pushback. Consumers are more aware of these kinds of things than they've ever been before. We are moving down a path toward a society of greater transparency, where people know more about what goes on behind the scenes or in the background. And so they're questioning these kinds of activities, whether they end up in government regulation or not. So many industry sectors are looking to create guidelines, in lieu of legislation, that will give good advice to companies about what they should and shouldn't be doing. We have new emerging guidelines in the area of behavioral advertising, where the sites you visit on the internet create a profile that is then used to target advertising; pretty strict guidelines, I might say. We are migrating those guidelines to the mobile space, because obviously the technology I'm working with there is different, so the way I would inform the consumer that this activity is going on, and the way they could exercise a choice to say "I feel like it's creepy and I don't want to participate," would be different.

So, just from the perspective of the consumer: we hear about services like Facebook, Twitter and others, and you may have heard the saying that if you're not paying for the product, you are the product. The consumer is really the product, in a way, because they're giving their data to these companies, who are then using it to drive their businesses. Facebook's revenue is largely driven by ad revenue.
From the consumer's perspective, is privacy in this day and age just not possible anymore? I mean, is that horse out of the barn?

I don't think it's out of the barn. There's a balance that we're looking for, and we're trying to decide what the right balance is. I think consumers have a right to have a say, particularly... let's talk about marketing, because I think the rules may be slightly different if we're talking about, say, marketing versus fraud prevention. As a society, we've decided that the use of data to prevent fraud is a societally valuable thing, and so we don't give consumers choice there, any more than we give them a choice to opt out of that late payment on their credit report. But if we're talking about marketing, historically we have said consumers should have some say. We have access to a lot of information, and some people feel like it's creepy; others thoroughly enjoy the benefits. So we don't want to minimize the benefits for those that want to enjoy them, but we want to respect those that want to bow out, so to speak.

But is that possible? That seems like a big challenge.

I think technology is helping us make it more possible. I just mentioned some of the self-regulatory initiatives that are in place today. One of the controversial areas is behavioral advertising, where a profile of what you've done on the internet creates a picture of who you are, and therefore we deduce that you like football and you love to travel abroad, because you go to sites that indicate that. A consumer might like to say, "I don't want you building that profile on me." And so in every ad that's been targeted with that kind of data, there's a little icon that you can click on, and it explains this practice. If you want it to stop, you click the opt-out button and it stops. So while technology may create other opportunities to collect data, it also creates opportunities for us to manage preferences.
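The opt-out mechanism described above reduces to a simple preference check that targeting systems consult before using a behavioral profile. A minimal sketch, assuming a hypothetical in-memory store (no real ad-industry API is being modeled here):

```python
# Toy opt-out preference store, loosely modeled on the
# behavioral-advertising choice mechanism described above.
# Names are hypothetical, not any real ad-industry API.

class PreferenceStore:
    """Tracks consumers who have clicked the opt-out for interest-based ads."""

    def __init__(self) -> None:
        self._opted_out: set[str] = set()

    def opt_out(self, consumer_id: str) -> None:
        # Triggered by the opt-out button behind the in-ad icon.
        self._opted_out.add(consumer_id)

    def may_target(self, consumer_id: str) -> bool:
        # Targeting with a behavioral profile is allowed only while the
        # consumer has not exercised their choice.
        return consumer_id not in self._opted_out

prefs = PreferenceStore()
print(prefs.may_target("user-123"))  # True
prefs.opt_out("user-123")
print(prefs.may_target("user-123"))  # False
```

In practice the real mechanism is cookie- and registry-based and spans ad networks, but the design point is the same: collection technology and choice technology can ride the same rails.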
You could really say it's in the industry's best interest to regulate itself so that somebody else doesn't come in and do it. I want to ask you about our proclivity for sharing; you mentioned that we're sharing more and more data, and you've been in this field a long time, so I'm sure you've seen these changes happen. People seem to be growing more and more comfortable over time with sharing more and more information about themselves. Do you see this trend continuing? Are we going to continue to just be more promiscuous, I guess, in what we let others know?

I don't know. I think we've turned a corner on it, more so than being on a trajectory going up. I think we've gone from a world where we didn't know much about anybody other than close family, friends or neighbors, to a world where we know a fair amount about a lot of people. And to some degree we're getting a little bit comfortable with that new level of visibility and transparency.

I'm looking at your LinkedIn profile right now.

Yeah. Well, or Google; you know, Google anybody. Most people should Google themselves, if they haven't lately, to see what's out there, or at least what's out there about someone with their name. But I don't see this just continuing to skyrocket. I think we've reached a new plateau, and I think we're beginning to realize that consumers should have a right to some level of privacy; the question is how do we give it to them? We also are recognizing that not everybody feels the same way about it. Some people don't mind having a lot known about them, and others want to be very private. And the challenge I think we've got, and we will have to address it (we're taking steps, but I don't think we're quite there), is how to recognize those differences and allow those different attitudes to be respected.
You mentioned Google, and I'm glad you did, because it brings up the issue of what happened in the EU a couple of months ago, with Google being forced, essentially, to take data out of its search results under certain conditions. Is that a dangerous precedent? Do you see that?

I don't see that as a precedent that we're going to see popping up all over the world. That particular court decision was based specifically on some European law; without getting into all the legal ramifications of it, it's not a blanket opportunity to just take any data out. It has to meet certain conditions, and so it's not inconsistent with longstanding European law. In the US, for example, we have a First Amendment that really kind of sets us on the other side of that debate.

So in your opinion, what role, if any, does the government have in regulating some of these uses of consumer data? Should they have a role? I know Congress is taking a look at this issue to some degree; they haven't gotten into it too deeply, but there's the potential for that to happen down the road. Do you think the government should play a role in this question?

Well, I think they have a role to play, and to some degree they already are playing it. While we don't have a big, broad privacy law like we see in a lot of other countries in Europe and in Asia, we do have a very long list of very specific laws in areas where we have seen practices that we feel have stepped out of bounds, and so we've passed laws. We regulate financial data. We regulate health data. We regulate telemarketing. We regulate email marketing. We regulate collection and use of data on children, online and in the mobile space. So what we've chosen to do in the US is pick off some specific areas where there either wasn't good guidance or industry was pushing the envelope a little too far, and we've passed those kinds of laws. I don't see us in the short term passing a big, broad law here in the US.
But what we are seeing is that the regulators, the Federal Trade Commission in particular, have very broad powers: under Section 5 of the FTC Act, the commission can police what are called unfair or deceptive trade practices. This means that if a company is misrepresenting itself, and this would include in its use of data, or is doing something that is unfair from a consumer perspective, then the FTC can step in today. They don't need a new law to exercise their powers. So we are seeing increased enforcement in this area across a wide variety of industries, and I think we will continue to see the FTC go after companies that they feel are being either unfair to the consumer or deceptive.

Gartner got a lot of coverage last year for its prediction that CMOs would spend more on technology than CIOs by 2017. Whether or not that's true, I'm certain you're seeing some anxiety, or maybe some excitement, among your marketing customers about this trend. What kinds of skills do you think marketers need to develop in the big data and analytics area to prepare for what's coming?

Well, I think they either need to develop these skills or they need to partner with their privacy and data management organizations, and many of them, we're seeing, actually are partnering. That is, they need to have some sensitivity to how the consumer is going to react to the marketing opportunity that's in front of them. In other words, is the consumer going to be surprised? Are some consumers, maybe even all or most consumers, going to be offended? We have social sensitivities in a wide variety of areas, whether it's ethnicity or other kinds of considerations, that people need to think about. So I think it's a little bit of a mindset change, more so than it is necessarily a skill-set change.

So really, you're talking about soft skills, though. You're talking about sensitivity.

Yeah, we are talking about soft skills.
And so with the data that is available to us today, we can do things that we really, as a society, would probably say are inappropriate. So the question becomes: what is appropriate? This is where I think each company needs to ask that question of itself. But industries need to ask that question too, because we have different data in the medical industry than we do in the financial industry, and different data again in the travel and entertainment industry. They need to ask themselves: what are the sensitivities in my sector, and what is either a best practice or a code of conduct that might help me stay out of the headlines?

I have to ask you this: as an expert in data privacy, what are your own standards? What are your own practices with your online profiles?

Well, sometimes people find this surprising, but I rarely opt out of anything. I like getting all kinds of things. I'm fairly open to sharing data. Now, I may question certain data fields. If I'm signing up to purchase something from a catalog site and they want my social security number, I might pause and go, well, they don't really need that. But for the most part, I transact online, I transact on my mobile phone. I'm what we might call a privacy liberal.

I guess you have to put faith in your customers' ability to handle that data responsibly.

I do, I do.

And what would you advise consumers out there when they're making similar decisions about their use of online services?

Well, I think online services are maybe not a lot different from offline services. If the offer is too good to be true, it may be something that you need to watch out for. And, as in the example I just gave, be reasonable: people will ask for information, but if the information they're asking for seems out of bounds for the engagement that you have with the company, that's probably a red flag.
And I think we also need to be conscious of scams, whether they're online or offline. So that's an area where you might do a little research into what the current scams are. The FTC and other consumer protection agencies, state AGs, as well as the Better Business Bureau can provide help there.

We only have time for one more question, so I'm curious: Chief Privacy Officer, as we mentioned, we haven't had too many on. What's your background? How does one become a Chief Privacy Officer? And do you think most organizations need one, or is this specific to the type of business that you're in?

No, it's certainly not specific to our business. It is now a pretty well-established profession. About 12 years ago, a group called the International Association of Privacy Professionals was created, and when we first had our meeting there were about 50 of us. Today there are 15,000 of us worldwide. I think most companies are realizing that having somebody who's looking out for these issues within the company, trying to educate the lines of business, including the marketing department, on both what the laws are and the sensitivity of these issues, is really important. I stumbled into it; back in those days there weren't very many of us. I have a degree in technology, and we were getting involved in data, our data products, and so I was asked to try to figure out how to use data responsibly. It's turned into a career of over 20 years.

Well, fantastic. This is going to continue to be a big issue, I'm sure, in this market. So, Jennifer Barrett Glasgow, thank you so much for coming on. I'm sure we'll have you on again, as I said, because this is an issue that's important to a lot of our audience, a lot of consumers out there, as well as the technology providers and others. So thanks again for coming on. Thanks for watching. We'll be right back after this short break.