Welcome to the Future of Democracy, a show about the trends, ideas and disruptions changing the face of our democracy. I'm your host, Sam Gill. And for those of you who might be joining for the first time, the basic idea of this show is that it's really the op-ed page of our democracy. It's where we take an issue, an idea, a topic, and we go a little deeper. We explore its ramifications, its contours. And in the wake of the election, which is still buffeting us as of this recording, the continuing role of technology in our democracy, which we've been discussing all summer and all fall on the show, has just been thrown into stark relief. To some extent, it seems that what you read online may dictate whether you think Joe Biden is the legitimate or illegitimate president-elect. And by the way, there is no evidence for any conclusion other than that he is the president-elect, which is my PSA for a second here. But questions continue to abound about the role of technology not only in our civic and political discourse, but in the structure of our economy. And a lot of this discussion tends to focus on the so-called FANG companies: Facebook, Amazon, Netflix, and Google. But the senior statesman in the sector continues to be Microsoft. The company has been around for decades longer than these newer entrants, and as of the third quarter it is the second largest company in the US by market cap. And so my guest today is well versed in all these topics. Mary Snapp is no stranger to these discussions. She leads diverse strategic community and external efforts at Microsoft. And before that, she was a lawyer with the company during the antitrust battles of the 90s. We had a chance to sit down for a wide-ranging conversation, and I hope you enjoy it. All right, well, Mary, thank you so much for joining us. It's great to be here. Thanks.
So I think the place I want to start is, you know, we're recording this in what is week two of a presidential election process that's not close to over. And certainly, I think for the first time that I can remember, the question of technology and the internet is right at the center of our discussion about the election. And that's because of concerns about the kind of information that's moving through the internet and whether it's reinforcing concerns about voter fraud, in part because the incumbent president himself is taking to social media in particular. And today, now, concerns about the very policies intended to curb misinformation getting in the way of this runoff election in Georgia, where you've got candidates that want to actively advertise and take advantage of the democratizing aspect of technology. So in some ways, there's no question in our democracy that feels disconnected from questions about technology, and you're someone who's helping to lead a company that's been at the forefront of these questions for decades. So what are you talking about at Microsoft about the role of information technology in our democracy, and what's new and what's not new for you? Yeah, that's a huge question. And I would say, you know, what aren't we talking about, really: everything from data and our Azure business all the way to thinking about social media and regulation. But before we really get into it, I just want to give a shout out, speaking of technology, to an article I saw recently about what are called the new chart throbs, the guys on television who show the maps and go in and out of the data, and they can count the votes and do the math, and the algorithms are all behind them. And that to me is a great example of the use of technology to keep us up to date and informed on all the things we can do.
But you know, there's just no question that social media has played a huge role in spreading good information and not-so-good information about the election itself. And I don't want to put you in the position in this conversation of speaking for technology as a sector, but I know Microsoft is one of the best capitalized companies in the world and you play in all these spaces. Particularly right now, because social media companies are trying to adjust their policies in response to what's happening, there is this kind of cynicism: you know, do you just care about profit, or do you care about the democracy? Take us a little bit inside. When you're talking as corporate leaders, as engineers, and you're confronting these challenges, whether it's deepfakes and synthetic media or misinformation, what's the nature of the discussion? What are you grappling with? Well, it's really interesting because it's a multidisciplinary approach. So, you know, you're talking to engineers, you're talking to lawyers, you're talking to people who are public policy folks, you're actually talking to ethicists. And I think it's really important for us to bring the voices of people in Microsoft Research and others into that conversation. I would say probably the biggest awakening for us was the Christchurch killing, about two years ago, a year and a half ago, when technology was used to preview it and to video it. And within like two days we had our lead lawyer in Australia talking to the heads of government, and we came back and tried to put together a coalition of tech companies to create some principles.
So as we think about what we do, we sort of start from this framework about what are the principles that we should employ as a company, and what are the principles where, with our competitors, we can agree on some of the uses of technology. I would also say that Microsoft, in part because of its history and in part because of the leadership of a guy like Brad Smith and a CEO like Satya, who grew up, frankly, in a different sort of regulated environment in India, has a little bit more of an acceptance of the role of regulation in managing some of these things. I mean, from our perspective, these are really important decisions that you're making about privacy, about ethical uses of artificial intelligence. And frankly, nobody elected us. Nobody voted for us. So it's important for us to talk about issues, to put a stake in the ground on them, but ultimately this stuff kind of needs to be legislated. And we need to make information broadly available both to citizens and our own employees. And I guess I'll say one last thing on this topic. Well, there are lots of things on this topic. But we can't underestimate, increasingly, the role that our own employees play in being activists on these issues and asking us to get involved, take public positions, and make ethical decisions about technology. And that's something we didn't see 10 years ago. I don't know that we even really saw it five years ago. But as a result of, I think, a kind of dissipation of trust in some parts of society and sectors in society, our employees are telling us that they don't want to work for us, frankly, unless they agree with some of the values. And that's important too.
Yeah, I think that is a really interesting trend, and I think it's taking over a lot of sectors, particularly in the knowledge economy: you've got a workforce that really wants to see its ethical commitments reflected in the company. And I want to come back to this question of responsibility and accountability, but just to play that out a bit, is this ephemeral, or do you think this is going to be a new relationship between a company and its workforce that's ultimately going to shape the business? You know, this is my personal point of view, but I would bet I could find others who agree with me: this is not ephemeral. This is a generation of workers who are in high demand, who are increasingly saying they're going to work from home too, so you've got that level of distance now. I think this is a long-term trend. I don't think it's ephemeral. And you can start back long before that, but remember a couple of years ago when Larry Fink from BlackRock wrote his end-of-year letter. He said, you know, consumers and employees are both demanding that companies have a social purpose, and that they make that purpose known, whether that's the brand of philanthropy in which they engage, whether it's the public issues they want to speak out on, whether it's questions related to, do we put artificial intelligence in products we provide to the Department of Defense. I mean, as technology hurtles forward, the ethical implications of it become more and more significant, and that goes to our core. So I don't think these issues are short-lived. I think they are long-term.
Well, that does bring me back to these questions of responsibility that you raised about regulation. To some extent, the idea that companies have some relationship to the public interest and to social purpose isn't wholly new; the language is new. It was the corporate sector in America that first created paid leave, it didn't come from government, even though at times we have to codify the restraints of managerial capitalism. We didn't need the Foreign Corrupt Practices Act to know that bribery was wrong; we needed it to police the edges of behavior that we knew was not appropriate. And the balance between corporate interest and public interest was something that we were all committed to. I guess when I listen to you, it strikes me that the challenges of figuring out one's responsibility are different in a technologically mediated era, and also that something about the speed and the scale, like Christchurch, brings home a kind of challenge, a kind of responsibility that maybe we didn't realize was going to be vested, at least partially, in the corporate sector. Do you see growing awareness about that? I think some people would say these are institutions now in some sort of technical sense. What's your sensibility about this? I definitely think that corporations need to have a responsibility, obviously, to their stakeholders, and that's not just the shareholders. I mean the stakeholders: that's consumers, that's other businesses that take advantage of our products and services, and it's our own employees. And that is very different from some class you might have taken on business capitalism 30 years ago. I just don't think that you can get away from that.
I also think about the speed; you've talked about the speed of technology. You know, it took almost 100 years for telephones, when they were first introduced, to reach this saturation point of like 80% of American households. I was just thinking about that this morning with this podcast: in less than 10 years, we're at about 35% saturation of people who listen to podcasts. So in another 10 years, maybe 15, probably shorter because it's going to accelerate, we'll have more people listening to podcasts at a faster rate than got telephones in the United States. And to some extent the infrastructure has been laid; we've got the technical lines. We don't have last-mile connectivity, which is another big issue, but we do have broader connectivity, so that infrastructure is laid. These technological advances are going to go faster and faster. And frankly, COVID has done nothing but accelerate all of the trends toward more reliance on technology. And I think the flip side of that, I'm kind of riffing now as you can tell, but the flip side is that it also is increasing some social isolation. And if you think about that, I think that builds into an increased vulnerability to disinformation, to deepfakes. And then you come back to what is the responsibility of technology to respond to that. You can talk about it in terms of trying to get people more civic information or more media literacy, but there's a real technological game going on between technology companies and those who are spreading disinformation with what we call synthetic media and manipulated synthetic media: not just how you stop it but how you prevent it. And that is going to accelerate; it already has accelerated as well.
So you think about this last election. We saw Microsoft taking responsibility: starting, I don't know, six or eight months ago, every couple of months a guy named Tom Burt, who leads our cybersecurity efforts, was posting blogs about cyber attacks, what's been detected, and the kinds of services that are being offered to campaigns and government officials. I really didn't think much, until a month or so before the election, about some of these bots that seek to infect computers for ransom, wondering whether ransomware would be part of the story we'd be telling in the 2020 elections. I mean, what would happen if Pennsylvania had been shut down because a hacker had gotten in and demanded ransom from the state before it undid the lock on the computer systems? So there's just more and more responsibility for tech companies to think about technological solutions. First and foremost, that's our greatest asset; that's where we know the most. And there's absolutely a role that we need to play, but it's ongoing. It is absolutely ongoing: the bad guys get better and better and better, and you just have to keep that research going. So one dimension of this is the question of how to have a productive dialogue as a society, and even if, to your point, the answer is that at some point you have to codify some sort of new regulatory regime into law, that's going to happen through negotiation and compromise and discussion. And certainly that feels elusive at the moment. You mentioned Christchurch; I think congressional hearings about issues of content moderation have been among the worst technology hearings of the last year with respect to the level of sophistication.
You've got a lot of frustration about, well, can't the major companies just solve this problem, quote unquote, if they wanted to? You see a lot of cynicism. But Microsoft, very much unlike some of the newer kids on the block, although it feels odd to call them that given their size, is not a stranger to having to figure out ways to have a productive dialogue with regulators. And one of the things I really want to talk to you about is the experience of the late 90s and the concerns about antitrust, and what they mean for today or not. Before we get into the specific structural economic question: do you think you've learned something in your career about how to make that conversation about responsibility more productive? I'm not kidding, I'm not sure I'd be here if I didn't learn something like literally, you know, by the week on that topic. Just a little bit of background, Sam: I was deputy general counsel during that time, and I was explicitly responsible in my own work portfolio for Microsoft Windows. So I was at the antitrust trials in the late 90s, and I was part of the team along with the antitrust lawyers; I supported the product development for complying with the consent decrees both in the United States and Europe. At least monthly I was going to DC to report to the technical committee and to quarterly hearings with the judge in that case, and with about the same frequency going to the EU to talk to their equivalent, not judges, but you know, the EU leaders. To be honest with you, I started out, look, I love Microsoft, I started out with my arms crossed and my jaw tight, thinking how could this happen. And that might have been a bit of our culture back then.
And I've come to the point where you just realize that you have to be able to talk from the beginning, work on issues, understand that democracy is compromise, that regulation is compromise. And I will say a kind of turning point for us was when Brad Smith came into the role of general counsel; everything moved from litigation to compliance, and Brad came into the role and said it's time, literally, it's time to make peace. And we were deployed, each of us who were senior leaders. I had four or five state attorneys general that I made quarterly trips to; I talked to them about what we were doing and asked them what was on their mind. And that was a real change for us. We continue to keep those relationships alive, and it's gone far beyond antitrust: we talk about issues of privacy, we talk about jobs and the economy, we talk about skills and employability, we talk about rural broadband. Now we're talking about race and social justice and issues of equity and broadband in urban areas. We're talking about how we can help support land grant institutions and historically Black colleges and universities. So the range of discussions is actually quite broad. And I personally have learned a lot along the way. In the late 1990s, I learned that you can't just win a case in the courts; you also have to win it in the court of public opinion. That was my late-1990s learning, and most of the last part of my career, both as a lawyer and working now on some of these other issues, has been much more related to how we connect to, again, stakeholders who are not our shareholders. And that's just an important part of how we think about our business these days.
And to me there's something about, and I know some of our audience will complain to me for saying this, but I've always liked that famous line from Brad Smith about how it's hard to make peace. Because what I take out of it is: we might have a first-order interest in winning an election, or in profit, or in advancing our family, but we should all have a second-order interest in the system continuing to operate so that we can do all those things and have those aspirations. And I've never been a big believer that you have to be the most virtuous person, but I have been a big believer that you have to be able to check your avarice, or your appetite, in accordance with that second-order interest. And so it is interesting to hear about your own, I mean, you used the word culture, which is not something that comes up a lot in these conversations. And I think there's a real question right now about not only what the right regulatory solutions are or aren't, but what it means to be committed to the idea that there's such a thing as responsibility, and as elusive as it is, we're going to have to kind of find it, and what we find may not be the thing that we would have done unimpeded. Yeah, and I did use the word culture, and it just came out very easily because it's something that we talk about all the time, actually. At Microsoft we sort of start by talking about culture, and when Satya came into the role five years ago, he deliberately set out to change the culture. And look, it wasn't just to make us a nicer, softer Microsoft. There was a lot of talk about culture in creating a more rigorous research and development process, a more rigorous work ethic, frankly, a focus on speed as opposed to process, to some extent, all in balance.
But at the same time, you know, that combination of Brad, Amy Hood, the CFO, and Satya, they come from a world view that is different from those who entered this business in the 1980s. We came from a time when we weren't even talking about the internet, for goodness' sake; we came from a time when you couldn't even type a sentence in email and correct your spelling. That's the time at which I came into the company. And the impact, the ways in which technology impacts the world, is just so different now. I mean, one of the things that we've started to talk about is, certainly it's important for us to have more computer scientists, and there aren't enough computer scientists going to school in the United States, but the computer scientists need to take history classes; computer scientists need to go to the philosophy department and take an ethics class. I mean, it's really hard to talk about privacy by design in your products if you don't have an understanding, for example, of what happened in Germany in the late 1930s, in terms of the amount of intrusion into lives that the government had. So you have to think in a similar way, I believe, if you're really going to be part of sectors in society moving us forward. I don't know, I could go on about the Microsoft stuff, but I think about, just in terms of democracy, the checks and balances. I actually went and read the Constitution again, like last week. I probably am one of, I mean, I think there are a number of people who did, but maybe not that many. I read it again. It was trending. And I realized that a lot of the things that we're saying, you know, we're just going to get rid of the electoral college.
We're going to change the makeup of the Senate. That's in the Constitution. It's actually not that easy to change. And it was deliberately put there to put restraints on one group or sector getting more power over another. And then over time, and now I'm going to be really careful about what I say, you think about what are the restraints right now on our own civility, and have those restraints been loosened. They've disappeared a little bit because of social media and a kind of lack of accountability of people on platforms. And so that's one of the questions of our time, I think. Well, let me move to the last issue I want to talk about, related now more topically to the antitrust and competition question, some 21 years on from the initial court order for Microsoft, but actually about power. So much of the concern of the founders was the concentration of power, and a particular kind of majoritarian tyranny, and avoiding that. And that animated social policy, political culture, public culture, and economic policy for a long time. What's been really interesting, in terms of a difference between the late 90s and today, is that the competition question about technology is explicitly much more referential to the late 19th century and questions about power and corporate power as a threat to democracy than it is about the sort of 70s and 80s consumer welfare doctrine: what's in the interest of the consumer, particularly focused on price, but other things as well. I don't want to ask you an uncomfortable question about your competitors, but let me ask a different question along the lines of the nuance that you've been inviting us to embrace in this conversation, which is that the way the questions are being posed now is: have technology companies become too big?
And I guess the question I would ask you is, let's take it a level down: what are the right specific questions that we should be asking about technology and the size of companies and power that will help us move this discussion forward? I'll take one of them, and that is the focus in how we gauge whether there's harm: is that focus on competition, or is that focus on consumer harm? That was a question that we really debated in the 90s, and it was a question that was debated with the oil companies and the railroads and all of those kinds of issues back in the day. In those early doctrines it was all about, as long as consumers are benefiting, if competition suffers a little, that's okay. It kind of came down to price: if prices went up, then consumers weren't benefiting, and if there were lower prices, that was kind of the end of the discussion. I think it's much more nuanced now. I think we're actually coming back to that question about competitor harm, but I think that question of consumer harm is going to be really important. I think there is a real convergence now between traditional antitrust law and privacy. Before, in the 90s, we were talking about power related to, in essence, intellectual property: our technology, copyrights, patents. Did we allow people to use things, did we allow people inside? We talked about APIs, and whether people have access to the things that let you into what we call the kernel of the system, all that wonky stuff. But now, really and honestly, we're talking about data: who has access to how much data, and will they share it, and is it interoperable, and how valuable is the data, and do consumers get value for their data, do they care to get value for their data?
And I think that question about data, how data is then used to get more followers and more advertising, that fundamental question about data is just really different. And the businesses that are impacted by tech are just much broader than they were before: everything from local media, which I know you really care about, to retail, to what you would traditionally think of as computers. So the questions are just much more woven together. It's almost like a carpet weaving of issues these days, compared to the focus on copyright and patent, which is what we talked a lot about back then. I feel like we're ending on the equivalent of a technology cliffhanger with that provocative response, but like the election, we must end at some point. So you can follow Mary on Twitter at Mary E Snapp, and also read many of her public contributions on behalf of Microsoft at blogs.microsoft.com. But Mary, thank you so much for joining us. You are most welcome, it was fun. All right, folks, thanks for tuning in. Just remember we are changing our schedule a bit, so always keep an eye on kf.org or at The Sam Gill on Twitter for info on new episodes. And as a reminder, this episode will be up on the website later; you can listen to or watch this episode or any episode on demand at kf.org slash fd show. You can subscribe to the Future of Democracy podcast on Apple, Google, Spotify, or wherever you get your podcasts. Email us anytime at fd show at kf.org, or if you have questions for me, just send me a note on Twitter at The Sam Gill. As always, we're going to serenade you out with the music of Miami singer-songwriter Nick County; you can follow him on Spotify. Until next week, stay safe. Thanks so much.