Hello everyone, welcome to this special CUBE Conversation here in the Palo Alto studios of theCUBE, part of our People First project with the Mayfield Fund in co-creation with theCUBE. I'm John Furrier, your host. Very special guest, Judy Estrin. She's the CEO of JLabs and author of the book Closing the Innovation Gap. She's also well known as an internet entrepreneur and pioneer; she worked on the initial TCP/IP protocol with Vint Cerf, went to UCLA and Stanford, a great history in computer science. You have computer systems in your blood, and now you're mentoring a lot of companies, you're an author, you do a lot of work, and you're lending your voice to some cutting-edge issues here in Silicon Valley and around the world. Thanks for joining me today for the conversation. Thank you, it's fun to be here. So first of all, I love the fact that you're here. You're a celebrity in computer industry circles. You were there at the beginning, when the computer systems of the internet were being connected. As they built out, it started the whole systems revolution in the 80s, and the rest is history. Now we have cloud computing, and now we're seeing a whole other step function of scale. And so you've kind of seen it all. You've seen all the waves, actually. Someone like me can say I've seen some of the waves, but you've seen all of them. The most compelling thing I think that's happening now is the convergence of social science and computer science; it's kind of our motto at SiliconANGLE. You recently wrote two posts on Medium that have been trending and going viral. I want to get your perspective on them, and they're interesting because they bring in a little bit of computer science: they're called Authoritarian Technology, with Attention as part one and Reclaiming Control as part two, and you go into great detail to lay out some big-picture computer industry discussions. What is it all about? What's the idea behind these stories?
So let me back up a little bit on that. As you said, and we can go into this if you want, I was very involved in a lot of the innovation that happened in the Valley in terms of microprocessors, the internet, networking, everything that laid the foundation for a lot of the things we see today. Incredible opportunities for my career, for the problems we solved. Over the last 10, 12 years, I began to see a shift, a shift in the culture and a shift in the way technology was impacting us. And it's not all good or bad; it's that it felt like we were out of balance and that we were becoming shorter and shorter term focused. And actually my book in 2008, Closing the Innovation Gap, the main message there is: let's not forget about the seeds you plant, because all of this comes from them. We're reaping the benefit of those seeds, but we're not planting new seeds, and we were becoming, in the Valley and in the nation, in the way we thought about things, more and more short-term focused. And technology was both causing some of that and both benefiting and suffering because of it. So that started with my book in 2008. And then in 2014, I think it was, I did a TEDx talk called Balancing Our Digital Diets. And I was even more concerned that we were out of whack in terms of the consequences of innovation, and I drew an analogy to our food systems, where so much innovation went into creating cheap calories and energy and things like high-fructose corn syrup that it took years to realize that, oh, there are some negative consequences of that innovation. And so that was kind of a warning that we weren't thinking enough about the consequences of, at that point, social media. That was before fake news. And I talked about tweets and how lies traveled faster than truth, not knowing how bad that situation was going to be.
And then leading up to the election and after the election, we all know and have all learned now about the impacts of these technologies on our democracy and, I believe, on our society and humanity. And I don't think it's just about our election system. I think it's about our psyches and how the technologies are impacting the way we think, the fear and anxiety levels of our kids and of us as adults. So I've been talking to people about it and advising, and I finally decided, as I was collaborating with people, that a lot of the awareness was in pockets: we talked about data privacy, or we talked about addiction. But these things are all interrelated. And so I wanted to, one, add my voice as a technologist, because for a lot of the people who are building the awareness and talking about it, if you are in government or a journalist or even a social scientist, it's really easy for people to say, yeah, you say that, but you don't understand; it's more complicated than that; you don't understand the technology. Well, I do understand the technology. So I felt I was adding my voice as a technologist, but I'm also just increasingly concerned about what we do about it, and that we take a more holistic view. So that's what the pieces are about, and the reason I broke it into two pieces is because they're too long for most people even the way they are. But the first is to build awareness of the problems, which we can dig into at a high level if you want. And then the second is to throw out ideas as we move towards discussing solutions. So let me take a breath, because you were going to jump in. No, it's just good. You're connecting the foundational technology, identifying impact, looking at pockets of awareness, and then looking at how it's all kind of coming together. When you talk like that, the first thing in my mind is: oh, subsystem, interrupt, bus, connection. So it's almost like an operating system.
And I think what you're pointing out about society in the article, the first one, Attention, was that they're all interrelated, and I think that's the key part. I think that's interesting because we run into people all the time when we do our CUBE broadcasts who have awareness here and don't know what's going on there. So there's context that's highly cohesive but there's no connection; they're decoupled but highly cohesive. That's kind of a systems architecture concept. So how do we create a robust technology-society system? I think that's a thread that we're seeing. This is what I gleaned out of the articles: you're kind of raising the flag a little bit toward the notion of a big-picture system, kind of a foundational view, but let's look at consequences and interrelationships and how we can kind of orchestrate and figure out solutions. And so what was the reaction? Expand on that concept, because this is where it was provocative to me. So I think there are two thought trains in what I just went down. One is that one of the problems we have, which has been created by technology and which technology is suffering from, again, it's both cause and effect, is not enough systems thinking. And this is not just about social media and not just about AI but about the last 20 years: we've increasingly trained, I think, our engineers and computer scientists in more transactional thinking. And as we move quicker and quicker to solve problems, we are not training our leaders or our technologists to think in terms of systems. And what do I mean by systems? Two things. One, you can break any problem into pieces, but those pieces are interconnected, we are interconnected, and if you don't keep those things in mind, then you will not design things in a way, I believe, that has longevity, or make the right type of decisions. The second is the law of consequences.
When you have a system, if you do something here, it's going to impact something there. And so that whole notion of thinking through consequences, I'm afraid that as we're training people to be more and more agile, to move more and more quickly, in technology and in society we're losing some of that systems thinking. And the trade-off is always, I mean, putting my old systems hat on, when we had systems conversations in the past it was trade-offs: we have overhead, so do we use more memory? How do we handle things? That's just what happens when you talk about consequences. But we don't have all those constraints anymore. I'm older than you, but we started at a time when we were limited, limited by memory, limited by processing, limited by bandwidth. And at different times as the industry emerged, the constraints were in different areas. Today, you don't have any of those constraints. And if you don't have any of those constraints, you don't get trained in thinking about trade-offs and thinking about consequences. So coming to what drove me to write this: one set of things are foundational issues. And what I mean by foundational is our relationship to technology. And the fact of the matter is, as a society, we put technology on a pedestal, and, not to take this to the extreme when we're talking about people, but overall, our relationship with technology is a bullying, controlling relationship. That's why I called it authoritarian. It's: upgrade your iPhone to the new version. Whether it's as a user giving up your authority to all these notifications and to your addiction, whether it's the control of the data, whether it's predictive AI algorithms that are reading your unconscious behaviors and telling you what you think, suggesting what you buy, putting things in front of you.
So there are all of these behaviors where our relationship with technology is not a balanced relationship. And when you have a culture where the companies that have that power are driving toward it, it's a culture of moving fast, growth only, don't think about the consequences. It's not just the unintended consequences; it's the consequences of intended use. So the business models, which we don't need to go into because I think a lot of other people talk about that, all end up in a situation which is unhealthy for us as people, for humanity, and for us as a society. So you take that part, and there's a parallel here, and we should learn from what happened with the Industrial Revolution. We want progress, but if we don't pay attention to the harmful byproducts and trade-offs of progress, it's why we have issues with climate, it's why we have plastic in our oceans. It's because you judge everything by progress as just growth and industrialization, without thinking about well-being or the consequences. Well, I believe we now face a similar challenge with digitization. It's not industrialization, but digitization, that has byproducts in a whole number of areas. And so what the article does is get into those specifics, whether it's data, or anxiety, or how we think, our cognitive abilities, our ability to solve problems; all of those things are byproducts of progress. And so we should debate what we're willing to give up. One last thing, and then I'll let you come in, which is that one of the problems with both of these is that humans value convenience. We get addicted to convenience. And if somebody gives us something that is going to make things more convenient, it's sure as hell hard to go backward. And that's one of the reasons for the combination of measuring our goodness as a country, or as globalization, by economic growth, and measuring our personal wellness by convenience. If something is more convenient, we're happier.
Take those two together and it makes a dangerous combination, because then our need for convenience gets manipulated for continued economic growth. And it doesn't necessarily end up in progress from a well-being perspective. It's an interesting point about digitization, because the digital revolution that's happening has consequences, and we're seeing them. And you point them out in your posts: Facebook and fake news. There's also the global landscape. There's the political overlay. There's the societal impact. There are not enough scholars who have been trained in the art of understanding the interrelationships of technology's impact. It used to be a nerd thing, and now my kids are growing up digital natives; technology is mainstream. So there it is in politics: the first hacked election. The first president who actually trolled his way to the presidency. I said that on theCUBE, that was my position. He actually was a successful troll; he got everyone trolled, including the media, and he got the attention. These are new dynamics. This is reality. So as you look forward and bring these ideas up, I want to get your thoughts on ideas for how to bring people together. You've been a CTO, at Cisco Systems; I know you've been on boards. This is a cross-pollination opportunity, to bring people together to think about this. How do you look at that? How do you view the next steps as an industry, as a society, and eventually as a global nation? Because cybersecurity and privacy are becoming polarized on a geographic basis too. You've got GDPR hardcore there in Europe, you've got Asia with the Chinese, and you've got America being America. It's kind of complicated. As a system architect, how do you look at this? What is the playing field? Where are the guardrails? What are your thoughts on this? Because it's a hard one. Right, so it is a hard one.
And it isn't easy to lay out a path that says it's solvable; climate doesn't have one right now either. But you have to believe we're going to figure it out, because we have to figure it out. So I think there are a lot of pieces that we need to start with, and then we need to adjust along the way. And one piece is, and let me back up: I don't believe we can leave this up to the industry to solve. The incentives and the value systems and the understanding of the issues, the industry is coming at it from an industry perspective. And you also can't leave it just to technologists, because technologists have a technology perspective. I don't believe you can just have governments solve it, for a variety of reasons. One is that it takes a spectrum of things. Two, legislation tends to be retroactive, not forward-looking, and you need to be really careful not to come up with regulation that actually reinforces the status quo as opposed to making something better. But we do need to figure out how to govern in a way that includes all of these things. So one- Just to interject for a second: it's clear from watching the Facebook hearing and watching Sundar Pichai in front of the House that our current elected officials actually don't even know how the internet works. So that's one challenge. So you have a shift. It's a reset. This is a big dynamic. And actually, if you think about the way legislation often gets made, one of the problems with our democracy right now, and I'm not going to put it in quotes, but I want to put it in quotes, is that the influence of money on our democracy means that so often the input to legislation comes from an industry. So whether it's, again, big tech, big pharma, big oil, that's the way the cycle works. In the places where we have had successful legislation, that industry input, and you need industry input, you just don't want industry to be the only input, is balanced with other input.
And so we need infrastructure in the world and in the country for policy ideas about technology. This needs to come from civil society, from the academy, from nonprofits. The same way we have the environmental sciences, we need government, not just industry, to fund that science. That's number one. And then we need ways to have conversations about influencing companies to do the right thing. Some of it is going to be through legislation, some of it is going to be through pressure. This in some ways is like tobacco, in some ways like food, in some ways like climate. And underlying any of this, for it to happen, we need people to understand and to speak up, because awareness matters, whether it's individuals, parents, or teachers. We need to give people the information to protect themselves, to push back on companies, and to rally and push back on government. Because if there's not an awareness, people are walking around saying, don't take away my service, don't make this less convenient, don't tax my soda, don't tell me what to eat. Don't tax my text messages. That's right. And I'm not saying taxes are the way, but what I'm focused on is: how do we build awareness? How do we get information out? How do we get companies like yours and others involved, so this becomes part of our messaging, part of understanding, so we can be talking about it? I think it's back to the glory days of the TCP/IP internet revolution. You send a packet from here to there; it's a step, take a first step. Personally, listening to you talk, I feel, and I've said this on theCUBE many times, people who know my rap know that I've been pounding on this, there's a counterculture in there somewhere. Counterculture is where the action happens. And I think, you know, the tax regulation and the current situation this generation has inherited, it is what it is. You're laying out essentially the current situation.
John Markoff wrote a great book, What the Dormouse Said, talking about how the 60s counterculture influenced the computer industry, from breaking in to get computer time for time-sharing to the hippie revolution. So the question I have for you, to put you on the spot, is: is there a counterculture in your mind coming? Digital hippies, in quotes? Because I feel it, I feel that they need to let the air out of the balloon before it pops. Something has to happen. And I think it has to be a counterculture. I just can't put my finger on it yet. Maybe it's a digital kind of revolution, something compelling that says, whoa, time out. Right, I think we need a couple of countercultures, in layers, because I think there is going to be, or is starting to be, a counterculture amongst technologists and the technology industry and entrepreneurs, some of whom, and it's still small, are saying: you know what, this chasing unicorns and fastest growth and scale, move fast and break things, well, we want to move fast, but we want to think about what we're breaking, because what we're breaking is really dangerous. You know, move fast and break things is fine, but if it's, oops, we broke democracy, that isn't something you can just say I'm sorry for. You have to think about it and adapt more quickly. So I think there are people who are saying, let's talk openly about the harm. Let's not just be tech optimists. Let's understand it. It's small, but it's beginning, and you're seeing it in AI, for instance, in the people who are saying, look, we're technologists, we want to be responsible. This is a powerful weapon or tool, and let's make sure we think about how we use it. Let me just say one thing, which is that I think we need another kind of counterculture, which I'm hoping is happening in a number of areas, which is societal, saying: you know, we have a slow food movement; maybe we just need a slow-down-a-little-bit movement.
So if you look at mindfulness, if you look at kids who are starting to say, you know what, I want to talk to someone in person, we need some of that counter-movement, where I'm hoping the pendulum starts to swing back in terms of people looking for real connectivity and not just numbers of connections. Yes, interesting. You know, everything has a symmetrical responsibility. I think about it as: for every fake-news payload and network effect, there is potentially an opposite reaction of a quality payload and network effect. It's interesting. And I don't know where it is, but I think that could be filled, certainly on the economic side, by new entrepreneurial thinking. Like one observation I'm making is, you know, remember the old bad boys of tech, and you're smiling, now it's bad gals too, which has grown, though still in lower numbers. So I think there's going to be a shift to the good folks, right? Well, she's a good entrepreneur, she's not just out there to make a quick buck, or, hey, mission-driven is a signal we're seeing. So you start to see a little bit more of a swing to, whoa, hey, let's recognize that it's not about, you know, the quick buck or... So yes, but between you and I, it's teeny compared to the other forces. So that's what those of us who believe that needs to happen need to continue to... What are those forces? Money-making? I think it's a combination of money, and how much money, and the draw of celebrity culture. The forces, the power that's in place, are so strong that it's hard to break through. Short-term thinking, not even being trained out of it. So it's like so many things in our culture where you have entrenched power, and then you see an uprising and you get hope, and that's where you need the hope. But we've seen it so often in so many movements, from race to gender, where you think, oh, that's solved. No, it's not solved. And then you come back at it and come back at it.
So I would argue that there are little bits of it, but it needs fuel, it needs continuity, and the reason I think we need some government regulation is that it needs help, because it's not going to happen naturally. Let me ask you a question. You know, some successes that I point out, Amazon Web Services, even Google, have that long-game kind of narrative. They're always kind of, we're misunderstood at first. I mean, remember Google was like, oh, my search is not doing too well, and then the rest is history. Amazon was laughed at, Amazon Web Services was laughed at. So people who play the long game seem to be winning in these transitions, and that's kind of what you're getting at. You think long term, the long game. If you think in terms of the long-term vision, you're going to look at consequences differently. How many people do you run into in the Valley who actually think like that? Okay, so we're talking about two different things. One is long-term thinking, and I do think that Apple, Google, Amazon have taken to long-term thinking. So they're a good example. But if you look at them, if you look at the big companies in terms of the way they approach the market and competition and their potential negative impacts on overall society, they're part of the power. They're not doing anything to change the systems so that they don't continue to benefit from that power. So this is why it's complicated. There are not good guys and bad guys; these people are doing this, and those are doing that. So overall, do I see more long-term thinking? Not really. I think that the incentives in the investment community, the incentives in the stock market, the incentives culturally are still very much around shorter-term thinking. Not that there isn't any, but. I would agree. I mean, it tends to be, hey, we're crushing it, we're winning, look at us, growth hack. I mean, just the language and semantics, you look at that. I think it's changing.
I think Facebook is the poster child of short-term thinking: growth hacks, move fast, break stuff, and look where they are. They can't actually sustain a brand outside of Facebook; they have to buy Instagram and these other companies to actually get that kind of growth. Certainly Facebook has dominated on financial performance, but they're kind of sitting in their situation. I think the brogrammer movement, I think it's kind of moving through the Y Combinator culture of, okay, let's get some entrepreneurship going, great, rah-rah; I think that's stabilizing. I think we're seeing, with cloud, real science and thinking about AI for good. So that's a positive sign. Well, I'm glad to hear that from you. You know, and I'll take- You're probably agreeing with it, yeah. No, no, no, I'll take that and feed it into my hope, because I hope- Yeah. Well, the Me Too movement is classic: look, we're not going to tolerate this anymore. I think transparency, and my final question to you before we get to some of the more entrepreneurial questions is, if you look at the role of community and data science and connectedness, one of the things about being connected is you've got the potential for collective intelligence. So if you look at data, as I said, on networks, what if there was a way to kind of hone that network to get to the truth faster? That's something that we've been working on here. And I think that's something that, you know, changes media, it changes the game; collective intelligence and the role of the community now become a stakeholder, potentially, in laying out some of these problems. And you're part of the Mayfield community, which we're co-creating this video with; the role of community is super important, people, the role of the person. Your thoughts on- So I think community is a word that takes on a lot of meanings. And the problem is when you mean it one way and use it the other way, the same as data-driven.
So I think there's one level, which is community and connectivity that has to do with collecting input from lots of sources. And when you talk about investigative journalism, or environmental situations, or all sorts of areas where you have the ability to collect information from lots of interested sources and analyze that information, that is one level of community and connectivity and networking, because of the people you know, which is great. There's another type: when people talk about community, they mean a sense of community in terms of what humans need and what that connectivity is. And most online networks don't give you that level. The online needs to be augmented by interpersonal understanding. And one of the problems I think with today's technology is that we're fitting humans into bits that technology can support, as opposed to recognizing what human needs we want to hold on to and saying there are some things that are not going to fit into somebody's data set. So for that first type of community, absolutely, I think there are lots of benefits of the crowd and the wisdom of the crowd. But if you're talking about humans connecting in person, you don't have the same type of real community; online tools can help, but we should never confuse what happens in our online world with real human touch. Judy, final question for you. I know we're pushing the time here; thank you for spending the time. First of all, it's been a great conversation. You've seen the venture capital movie from the beginning. You know all the original players, and you're seeing what it is now. Where has it come from? Where are we? What's the state of VC? Is there any hope for the future? Are they all adding value? How do you see that evolving, and where are we? You know, I think venture capital has gone through a lot of different phases. And like so many things, especially for those of us who are entrepreneurs, we like to lump them all together.
They're not all the same. There are some good ones out there. Yes, like Mayfield. And I do think, though, that something shifted in the lead-up to the dot-com bubble and later the bust. And what shifted is that venture capitalists before that time were company builders. They were the financiers, but they saw themselves as building companies with the entrepreneur. Because of the expansion leading up to 2000, the funds grew, and the people coming into the field became more like bankers; they took a more financial approach. As opposed to balancing finance and entrepreneurship, it felt like it moved more into, this is a private-equity play. And I think the dynamic with entrepreneurs and the methodology overall shifted. And I don't know that that's changed. Now again, not across the board. I think there are some firms, or partners within firms, who still very much want to build companies and partner with entrepreneurs. But I think the dynamic shifted. And if you view them as what they are, private-equity investors, then you don't expect something else. If people need money, go pick the ones that are the best partners. Pick your partner. If you want a banker, go here; if you want a builder, go there. Key distinction. Judy, thanks for sharing that insight. That was Judy Estrin, CEO of JLabs, author of Closing the Innovation Gap, a well-known entrepreneur, advisor, board member, and former CTO of Cisco. Again, a great guest. Thanks for coming on. I'm John Furrier with theCUBE Conversation, part of Mayfield's People First with theCUBE. Thanks for watching.