So, thank you all for coming. As 2023 comes to a close, we've seen our fair share of tech sector scandals this year, whether we're talking about the FTX trial, the ongoing US government probe into Binance, or the failure of Silicon Valley Bank. This year has kept tech journalists like me on our toes with yet another round of scandals in the industry. Today I'm delighted to be speaking with Erika Cheung, who is a molecular biologist by trade and has first-hand experience of what happens when a company prioritizes growth and a good story over the checks and balances needed to develop critical technologies. After blowing the whistle on Theranos' malpractice, Erika is now the co-founder and executive director of Ethics in Entrepreneurship, a nonprofit focused on supporting ethical practices in business and advising founders on how to balance profit and integrity. So, thank you for being here today. I think we can start with some of those scandals that I mentioned. If you look at the likes of FTX, Theranos, which you experienced personally, or the likes of Wirecard, do you think there's a through line, commonalities between these sorts of scandals that the tech industry has seen in recent years?

Yeah, I think generally, when you look at these fraud cases, it's really shocking how similar they are over time, and not just in the new wave that we're seeing with internet companies or tech companies, but even in older cases in the US, like Enron or WorldCom, or other scandals like Tyco, and other business scandals and corporate collapses that have happened. There's a book I often quote on this called The Seven Signs of Ethical Collapse, and it was really eye-opening when I read it, because it highlights the conditions that frequently encourage unethical activity and allow some of these major collapses to happen.
And some of them include things like pressure to maintain numbers: if you create a circumstance where people are required to hit certain metrics on unreasonable timelines, it tends to create an environment where they take shortcuts. Things like cultures of fear and silence, where people inside the organization feel like they can't say something. The notion of a cult-of-personality leader: people who tend to have this big, charismatic presence, and who also have a body of staff who are very young and very zealous about the leader they're working for. Conflicts of interest. Some of these leaders tend to be very philanthropic. And they honestly have, and tout, this idea of innovation like no other: that they are so innovative, they're in a category of their own, that the basic laws and rules don't apply to them. When I read this book, it was almost funny, because if you look at Theranos, you look at Wirecard, you look at FTX, they fit the bill. What was really shocking to me was realizing that we're always looking for the recipe for a company's success, but that's actually really hard to determine and very variable. In fact, the recipe for failure might be fairly consistent and something we can predict quite well. At least for me, having been on the inside of one of these big scandals, it meant realizing that we may be repeating history more than we think we are.

And to look at some of those red flags one by one, I think it would be good for us to go through them. One that you mentioned is this idea of the cult of personality. As a tech journalist, I speak to a lot of investors who say that they really look to the founder. They think the founder is really what makes or breaks a business, but that, of course, draws a lot of attention to these founders who are very confident, who are able to speak well on stage, who are able to present their business.
What do you think is the issue in the relationship between what investors are looking for and the founders themselves? Do you think investors put too much stock in these big personalities?

So I think there are obvious reasons why you want a founder with a good story and a good vision. We all understand this: it can motivate your employees, it can encourage consumers to buy your product or service, and generally, because you're frequently entering a space where it's not really certain where the company is going to go, you want to imbue that sense of hope, that trust that the people you're working with are going to be able to execute and pull something off. You need that story. You need that. I think the problems start when you depend too much on it, when you anchor on it rather than using it for its utility, and just assume that the founder is infallible, that they are somehow the core identity of the business itself. Because ultimately the success of the business is the organization; it's not that individual. Frequently, I think there's too much merging between the identity of the founder and the company itself, when really the success and the execution of these organizations comes from the whole. In terms of your responsibility as a founder and your responsibility as an investor, what matters is the success of the business: is it solving the problem it set out to solve, is it entering territory that is going to be profitable? The downside of over-emphasizing the founder, the founder is so important, the story is so important, who they are, is that it can put too much pressure on that founder, and it ties the identity of the business too closely to an individual when it really should be tied to an organization.
And to look a little more at the responsibility of investors in this space: we have heard in recent months, as there's been a downturn in tech, that people are moving away from the hyper-growth model, but that is something that has been encouraged by investors for a long time. Do you think this grow-grow-grow mentality has fed some of the corner-cutting and disregard for regulation within the tech industry?

I think when you have any type of bull market, and there's a lot of cash floating around, and people are trying to maximize their opportunities to get access to that cash, you start to see really strange behavior. You see people doing really bizarre things. With each hype cycle, we've seen a circumstance where it's great, it's wonderful, there's a lot of enthusiasm around what people are doing, there's a lot of excitement, but people are so blinded by the excitement of a new hype cycle that it's not clear what's actually happening behind closed doors in a lot of these businesses. So then you see a lag before all the cleanup, the frauds, and the realities of certain circumstances come to light, after lots of money was running around and people were doing whatever they could to get their hands on it. I think we saw this with the first wave of internet companies, with whatever revolutionary tech was being developed; I think we saw this largely in the crypto space; and I'd imagine the next round is going to be in AI. In all of these hype cycles, you're likely to see a lot of people claiming you need to grow fast, you need to get on this train, because they're following the hype of the massive amounts of cash going in rather than paying attention to the fundamentals of the business.
When you look at those hype cycles, though, at the moment the big one is AI, and in many aspects that is, or can be, revolutionary technology. I'm curious what you think about the idea that a lot of the ability to raise capital rests on potential future growth, on promises you're making about the potential of that technology over the next two, five, ten years. There's always a little bit of an aspect of selling a dream. Where do you think the line is between selling a dream versus hyping up a business that maybe isn't all there?

Yeah, I think there are some key things. In the AI space, and this is part of the reason I come and talk at these events, AI has huge potential to improve a lot of different industries, from logistics to healthcare to agriculture, all across the board. The reason we have to pay attention and put proper safeguards in place to ensure these scandals do not happen in this space, because it is a core technology that affects so many different industries, is that when you take these hits, it creates a huge chilling effect for a lot of the companies in that space. So, can you repeat the core question, Aggie?

Yes, sure. I just meant the idea that you're selling something with future potential.

You're selling future potential. So you have that core thing, and we can all see that, and we can all see that there are a lot of possibilities for where it can go. I think the line needs to be drawn where, again, you are a business and you're contracting with different stakeholders who have certain expectations of how they're engaging with you and whether they can trust you. Where you get into the territory of fraud, it's things like: where is the money? What are you doing with that money? Who are you doing it with, and can you explain that relationship, right?
You have a lot of issues with vendors and suppliers, where it's like, okay, this unit actually costs $2 to buy, but I'm working with a manufacturer to sell it to me for $20 and then we share the cut. Then there's also what's inside the box, right? We saw this in Theranos: what is the key product you're developing, and is it actually coming to fruition? You can sell the hope, you can sell the dream, but at the same time, when you articulate to people what it is you're building, who you're working with, and where your money is, there are very clear lines you can't cross if you want to stay out of the territory of fraud. It's the difference between exaggerating and having enthusiasm for a particular case versus getting into the territory of lies and manipulation and then, potentially, fraud. So I think it's clear: in the Theranos case, saying you have business contracts with entities that you don't, and adding that into your revenue, is a clear boundary where you are entering the territory of fraud. Saying you have a product with very specific specs that you don't actually have is, again, clear territory of fraud, something different regulators can pursue litigation against you for. So being realistic about where you are right now, communicating that to people, and recognizing that this doesn't detract from the success you can have in the future, is a very important communication skill you have to have as a founder.

And if we stick with the Theranos example for a second: if you look at, for instance, the idea that you're selling a proprietary technology, there is a need for a level of secrecy, and in a lot of tech, in order to have that edge on your competitors, there is a demand for secrecy within the organization, but potentially externally as well.
And you've talked about your time at Theranos and the issues around that culture of silence and an unwillingness to engage with it. How does a company balance making sure that only the people who need certain information have it, while also making sure there isn't that culture of silence within the organization?

So I would push back. I think generally, depending on the type of product you have, it's probably better to have a speak-up culture and a culture of candor in order to build really difficult things. The majority of these products are highly collaborative. You need multidisciplinary teams that can communicate with one another in order to develop the product effectively, in order to sell it effectively, in order to manufacture it, all of these different aspects. So when companies lock down internally, to the point where employees don't feel comfortable speaking to other employees or to executive management, that should be a concern. It shouldn't be the case that you have a culture so adversarial that people who work for you, who are ultimately on your team and who are spending, at a minimum, 40 hours a week committing their lives to your organization, aren't going to have some investment in the success of that organization. And even though you do have to protect trade secrets, that's often covered by an NDA, and if an internal employee goes external to reveal any of those things, there are mechanisms in place to control that. For me, it's more of a concern if you have a company where people are unwilling to say what's going on at the early stages, to keep certain fires from growing too big.
And in your case, as someone who had to come out and speak against the organization you were working for, something I find interesting is what that means for the person who stands up and says it. Within the tech industry, a lot of loyalty is expected from the people working in an organization, and a lot of the buy-in that's expected is obviously rewarded with stock options, with the potential of being on the ground floor of something major. But how does it feel, in your case or in general, to be the naysayer, to say, actually, something's not going right here, and I need to stand up as an employee and say so?

So I think whistleblowers are just such interesting characters. And when you dig into the backgrounds of their cases, they're a bit tragic. In my own experience of working at Theranos, I was 100% committed to the mission. I can tell you that there are very few times in life when you wish so badly that you're wrong: that what you are seeing is not actually what's happening, that the company you're working for really is developing a technology that works, that things will work out in the end, that you're not dealing with malevolent people. And for people who decide to speak up, it's a huge barrier to say something: the risk to your career, the risk to a promotion, the risk of social rejection, the risk that you take all these risks and nothing happens, and what that does to you. So when you see people who come forward and say something, they tend to be some of the most committed people in that organization.
The majority of the whistleblowers I know were actually really committed to the companies they were a part of, and they felt that if they didn't say something, ultimately the company would be hit so much harder, and they would feel that hit. That's what gave them the capacity to overcome all the normal barriers you experience when you're trying to report something that isn't quite what it seems. So it is a bit of a trade-off. I think the reason whistleblowers are such contentious figures, and make people uncomfortable and tense, is that there is that trade-off between doing the right thing and being loyal, and the expectations of loyalty can mean so many different things to so many different people. But really, in the end, in the Theranos case, if they had listened early and said, hey, we're not going to take this unfinished product that doesn't work very well, commercialize it early, and start testing on patients; we're going to pull the product, we're going to manage our relationships with our investors and say, we're not on schedule and there's nothing we can do about it, but we know this is not where it's supposed to be, maybe the company would have been in a far different place. Maybe it had enough cash that it could have acquired other companies. There were lots of other decisions that could have been made if they had paid attention to the fact that this was a huge threat to the organization. Deciding to commercialize early just doesn't make any sense. So at that point, speaking up is a breach of loyalty, but at the same time, it was in the best interest of the overall sustainability of the company in the long run. And I think founders have to be careful not to become too myopic about their own timelines versus the grand timeline of what can make the organization successful in the long run.
And do you think, in the case of Theranos, for instance, there could have been structures in place, for people like you who were saying that things weren't going right, a way for it to be communicated internally, rather than people having to go externally to members of the media to make it clear that there was a real issue at play?

A hundred percent, a hundred percent. Theranos is the epitome of what not to do; on so many different levels, it was terrible. But I mean, having management that you can turn to; again, creating a speak-up culture where people feel comfortable reporting minor issues early on, knowing it's not going to threaten their promotions or their relationships with their teams, and having some mechanism in place so that those reports are addressed with some level of care and consideration. Also having some sort of reporting line, although that's hard if you're a small startup; it doesn't quite make sense to set up a reporting line when everyone knows everyone anyway. That's why I think it really comes down to leadership, to the executive management having that commitment to say, hey, we want to address problems early. We don't want to wait until they blister up into the headline of a major news outlet. Thinking really critically about governance is really important, and it's often very neglected in startups, frankly. It's just not paid attention to, and when it is, it's more for vanity reasons than for actually asking, how do I make sure I hold myself and my organization accountable? Governance has largely been used to get as much fundraising in the door as possible by giving away board seats, and that's quite dangerous, right?
Good governance has been shown time and time again to produce more profitable businesses, and ultimately it helps the organization mitigate a lot of these different risks and threats to the reputation of the business.

If we look at the examples of, for instance, Theranos and FTX, and to an extent Wirecard as well, a lot of these cases happened in a funding environment very different from the one we're currently seeing. There was a lot more money, there were negative interest rates, and it was a lot easier for these companies to secure capital. I'm curious, taking a macro view of the tech industry, do you think a huge amount of capital can paper over the gaps when it comes to these sorts of issues, or do you think it will always, inevitably, come out in the wash?

So you're always going to have corporate scandals; you're always going to have fraud. We've had fraud cases for centuries and centuries, so that's going to happen. In terms of where it tends to be more prevalent in the macro environment, again, it's in these hype cycles, in these instances where there's an immense amount of cash floating around and a sense of fear of missing out, where people are trying to jump into different deals because they don't want to miss the next Amazon, or the next Facebook, or the next Google. You also have it when people start to get sloppy with due diligence; they start to bandwagon and say, oh, this person over here is really reputable, so because they're very reputable, I don't have to do my homework, because I trust that person.
I think in those instances you have to be a little concerned, and it tends to be in these times, when the macro environment is very bullish on particular arenas, that you should be a little more vigilant, because people are naturally going to be more austere, more risk-averse, when there isn't as much cash floating around. So maybe we'll start to see this correction; maybe we'll learn from the past. I'm really hoping that happens in the context of all the capital that's going into AI, because I do think it's a technology where, if you start to make mistakes, it's not the case that they're isolated to a particular region, or that they're small mistakes; mishaps in these companies can ricochet all around the world. I think that needs to be paid attention to and treated with a little more vigilance, and ideally we'll learn from these past mistakes to ensure that doesn't happen in that space.

Just finally, because we may not have much time left, I'm curious about the broader environment, so let's take AI as an example. Heaven forbid, a year from now we see a huge scandal: a company in AI got a huge amount of money and wasn't really performing as expected. In the case of Theranos, or in the case of FTX, we've seen how an individual bad actor can then impact the perception of an entire industry. Do you think it's a fair assessment that when there's a really bad example, where money has flowed into a fraudulent business, that can have negative impacts on other people trying to raise, trying to build businesses in that environment?

100%. These scandals so rarely affect only that singular business, the reputation of that singular business, and the shutdown of that organization.
You see it in the employees: a lot of them, because they came out of circumstances where they were part of a culture that was so draining, and that came with so much reputational damage, maybe leave the industry they were in, or just generally become very averse to it. You see it with regulators, who feel the need to step in, and what do they typically do? They use case studies like these one-off scandals to develop different regulations, and that can create a tense environment. And you see it in the industry itself. It was funny: I was on Twitter, and Stanford had come out with a new innovation, a finger-stick of blood to run all of these different blood tests. I was looking at their post on Twitter announcing this new technology, and this is almost a decade now since Theranos went under, and the whole Twitter comment section was, oh no, here we go again; let's have Tyler Shultz confirm this first. You've seen this huge chilling effect on point-of-care diagnostics, which is a legitimate field, which was a legitimate field prior to Theranos, and now it's so difficult for these companies to raise. Of course the grand vision of Theranos was hokey in so many different ways, but there were aspects of it that were very viable, and now a lot of legitimate companies are not receiving funding because of this one incident. You'll keep seeing this. I think the same thing happened with crypto: of course there's a lot of skepticism, a lot of retraction from the entire space, the blockchain space, NFTs, the whole web3 space; everyone pulled back in big ways. So I think it is something to be conscious of, because it doesn't just affect that one business. It has ripple effects for the whole ecosystem.

Erika, thank you so much.
This has been a really interesting talk. Thank you. Thank you.