Have you been to it? Yeah. Thank you, guys. OK. So before we start the topic, I mean, Aymerick came all the way from San Francisco, and we've been friends and doing investments together for a long time. Why don't we talk about our own backgrounds to let the audience know why we're entitled to talk about the topic? Yeah. All right. So first of all, thanks for having me. It's a pleasure to be here. It's a really, really cool setting. It's super exciting to see this community supporting startups here in Tokyo and attracting people from around the world. So me personally — first of all, Aymerick Renard — I'm French, but I've lived overseas my entire life. I grew up in the US, Scotland, England, Brazil, Norway, Singapore, and spent a lot of time in China in a prior position. And I've been doing corporate venture capital and business development for over 20 years. First for a major telecommunications company, both wireless and internet. Then for a consumer electronics supply chain company, then for SanDisk, and now Western Digital. My background historically, though, is in software engineering, systems, and management. And over the years, I've backed startups doing some innovative hardware — server appliances, one of the first-generation smartphones that had its own app store way before Symbian, the iPhone and others — and I've been involved in companies doing wearables, internet of things, drones, semiconductors, a bunch of those areas. So it's a really exciting time around the hardware renaissance that we're experiencing. And a little bit about what we're doing at Western Digital: we're investing in anything that pushes the frontiers of how data is generated, stored, and managed. That goes from components that improve the quality of data being gathered, or how fast it can be processed. It can be software. It can be systems. It can be cloud services as well. And we are touching on AI.
And as far as deep tech, we ourselves are a deep tech company in the sense that we design our own semiconductors. We have a lot of time and money spent on material science. We have thousands of patents. At one point, we had more patents than employees on the flash side, through SanDisk. So it's an area we're very excited about. Cool. So what you're saying is basically that you do not invest in Uber or Airbnb, because they're not deep tech, I guess. Yeah, exactly. Yeah, those can be great, but as far as where we're investing — and to a certain extent, actually, companies like Uber, when they start getting into self-driving vehicles, they're involving a lot of storage and a lot of data generation, right? The test mules for these self-driving vehicles are generating gigabytes of data per day. They need black boxes to record all that data and analyze it for testing purposes. And historically, you would see flash technology deployed on those, in addition to hard drives in some cases, but flash is a little bit easier as you're driving around. Yeah, cool. OK, so my name is Jerry Yang. I'm a general partner at Hardware Club. So it's kind of funny, because Aymerick was born in France but then pretty much grew up in the States, and his French now has an American accent. And on my side, I worked in the States for a couple of years and then moved to Paris, so in the reverse direction. So my English now actually has a French accent — it's kind of interesting how our paths crossed. And myself, I spent 12 years in semiconductors, doing design of chips: analog chips, Wi-Fi, 3G, 4G, and all the interconnections. So, you know, semiconductors is a very, very heavy industry that VCs have been shunning for more than a decade. But now VCs are back in the semiconductor business. They are doing new kinds of chip investments, which is kind of exciting right now.
And so both Aymerick and I are investors in a lot of hardware technologies, so this is going to be a very interesting talk going forward. So the topic today is really something like "from lean to deep tech." I'm not sure if it's already quite well known here in Japan, but last year, especially in the second half of 2016, that's been the talk of the town. We all know that from 2004 to 2015, there was a big boom of lean startups: everything where you start with an idea, you iterate, you roll out the MVP to your users, try to acquire users, and try to grow as fast as possible. Personally, I saw that kind of peak out in 2015, when you saw all those crazy valuations. I don't know what your opinion is on that one. Yeah, the valuations — it's interesting, because you're seeing some companies struggle for a while to live up to those valuations, and you've seen some companies that turned out to have pretty good exits afterwards struggle for years, including doing rounds where most of the equity got wiped out and early investors took a bath. In a way, one way of looking at deep tech is that it's essentially "long tech," because it takes quite some time to actually design systems and iterate through them. Sometimes you find dead ends, and that burns money. And if you're doing things with semiconductors, in fact, it's really, really expensive, even if you're doing things around FPGAs early on, and you're always looking at the milestones. In a way, it's also similar because deep tech isn't just IT stuff. It could be medical devices, genomics, drug discovery, and those face similar issues of needing to raise a fair amount of money to get to the point where they have proof points. And occasionally, you do have circumstances where it's so promising, and the potential of the team that is working on a technology is so massive, that someone swoops in and does a preemptive buy way before the company is essentially ready for it.
And they're not really buying it at that point for what the company has accomplished, which is the usual way of looking at a lot of M&A. Really, they're buying it for what they could do in the future, especially once they're inserted into the larger corporate players. So you can think of some of the automotive players — the larger companies buying the upstarts around self-driving technologies and vision processing and things like that, because they wanted to grab those before the teams got too mature, too expensive, and took on a life of their own. Or sometimes you see that with semiconductor makers branching out into new categories. So you've had pretty good examples of the major CPU players buying specialized chip makers around computer vision and other aspects, because they really felt they needed to be in that area, or buying FPGA makers as FPGAs become more common and are being used by a bunch of people like Microsoft and others to do some really interesting things on the processing side. So I think it's fair to define — if we define deep tech: back in the 90s, I remember the boom days, almost everything was deep tech. Things like Yahoo — by the way, I did not found Yahoo in 1994, I just happen to have the same name as Jerry Yang. But back in the late 90s, yes, there were a lot of dot-com companies that died after the bubble. But if you look at the leading companies, many of them were doing, like my previous companies, servers that went IPO in the late 90s. There were leading Wi-Fi companies building chips. The technology back then was heavy. Then we transitioned into a big boom of lean startups, where everything was lean: you have a lean team, you try to deploy things to a platform. And now we're seeing the re-booming of deep tech.
So we're seeing things like — if we define it, I think it's fair to say — artificial intelligence, machine learning, deep learning, robotics and drones (you yourself were an investor in 3D Robotics, one of the igniters of the drone boom), things like heavy new types of sensors, things that scale in a systematic way instead of going to consumers and trying to acquire them. So I think it's very fair to say that, whatever the reason is — it could be that the valuations got too high for those startups, or it could be just, OK, now the competition is too intense on the other side, so let's go back to technology — it seems very clear that we're going back to the narrative where your defensibility has to shift back to technology. There has to be an entry barrier, something like that. I don't know if you agree. Yeah, exactly. I mean, if you look at deep tech, some of the attributes are the complexity of the technology and of the problem it's attempting to solve. Natural barriers to entry because of that complexity, or sometimes a combination of the complexity of what you're trying to do plus some regulatory challenges, especially in the case of things around pharma, life sciences, and others. And when you see startups taking a lean startup methodology and trying to "disrupt," with air quotes, the pharmaceutical or life sciences regulatory environments, you get things like Theranos, where that is not ending well for the investors or for the company itself. And there are a bunch of meta-phenomena affecting why people are able to do more deep tech startups. Sometimes these things take a long time, and it's finally paying off now that they're on the right approach. You have a lot of people who have had successful careers at some of the large players in and around the Valley. They've been able to generate enough personal fortune to launch, set themselves up, and bootstrap themselves on some of these early technologies.
You've seen enough acquisitions to fuel the fire. VCs for a while started shying away from semiconductor companies because it was too challenging. There weren't exits. No one was investing in it. But then you started having a couple of billion-dollar acquisitions, or really quick multiple-hundred-million-dollar acquisitions, and all of a sudden that encourages people. They realize there's a market for it and it's worth putting money into it. And now you're seeing a lot more VCs. For a while it was just basically two or three Valley VCs plus a ton of corporates who were investing in semiconductor companies. But now a bunch of others have branched out, and you're seeing customized chips around artificial intelligence, things around self-driving. But even the self-driving boom is kind of interesting, because when I went to Carnegie Mellon in 1990 we already had a self-driving truck that would drive around Pittsburgh. Carnegie Mellon is sort of the AI hot spot. Exactly, yeah. The Robotics Institute has been doing phenomenal things around robotics and artificial intelligence, and you had a van that would drive around Pittsburgh by itself. The difference was the sensors took a massive roof rack — you still see a little bit of that today with small roof racks — and the entire van was just racks of hardcore servers and CPUs. That processing power is essentially available here now. And that's one of the reasons why we saw drones take off, no pun intended. All of a sudden you had massive computing power that could handle the avionics, the controls. You had the communication protocols. You had long-distance Wi-Fi, which we didn't have in the old days. You have phenomenal accelerometers, gyroscopes, and all these different tools and sensors that are available and inexpensive. And it's really a payoff from the smartphone component wars.
These things are done in such volumes that you have a lot of critical components that make it much easier to do some of these large holistic systems that are deeper. And for the same reason you see people doing space: the phenomenon around CubeSats and microsatellites — those are essentially cell phones in space with a couple of things around them. Drones are flying cell phones with propellers and specialized software. So, I mean, among the many topics we could cover, I think today we can just pick one, because in Japan people are talking about AI a lot. And I have my own views on AI startups — what startups can do and what startups cannot do. So let's just focus on one thing. I think there's a missing link in many of the pitches I heard from AI startups, which is the input data set. So, today when we talk about AI, it's actually more about deep learning, which is part of machine learning — which, by the way, was pioneered by a French guy called Yann LeCun, a professor at NYU. He's French, and he has a lot of students who did deep learning research, and many of them are French. And so deep learning is about feeding a data set into your algorithm and training the system, iterating until you eventually build a neural network that can take new inputs and produce an output. I think a lot of the startups that pitched me are missing the input data set. When I question them about where it comes from, that's actually the main problem. What do you think? Yeah, no, I totally agree. And that also makes me think of one of the other aspects of AI that got really over-hyped last year, and a little bit the year before that: essentially chatbots. People got carried away, like, hey, we can have conversational agents and you can chat. I'm like, that stuff's been around for years.
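To make that "feed data in, train, then answer new inputs" loop concrete, here is a toy sketch — purely illustrative, not any company's actual code — of the smallest possible case: a single artificial neuron whose weights are adjusted from labeled examples (the OR function) by gradient descent, until it produces the right output for each input.

```python
import math

def sigmoid(x):
    # Squash any real number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def train(samples, labels, epochs=2000, lr=0.5):
    # One neuron: two input weights plus a bias, all starting at zero
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y  # gradient of the logistic loss w.r.t. the pre-activation
            # Nudge each parameter to reduce the error on this example
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def predict(w, b, x1, x2):
    return sigmoid(w[0] * x1 + w[1] * x2 + b)

# The "input data set": four labeled examples of the OR function
data = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 1, 1, 1]
w, b = train(data, labels)
```

The point of the sketch is the speakers' argument in miniature: the algorithm itself is a few lines, but without the labeled `data` and `labels` there is nothing to train — which is exactly the missing input data set problem.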
I mean, even conceptually, you go back to 1987: Apple had the Knowledge Navigator video — you can still find it on YouTube; John Sculley keynoted it at a presentation — and you had the notion of an intelligent agent interacting with someone, and it demoed a lot of other technologies that weren't quite around then. You saw some of it with the Newton, with Apple Data Detectors, where you could write "fireside chat with Jerry in the afternoon at two o'clock" and it would find the right Jerry, make an assumption about the day, and do all this stuff — it was gathering a lot of information. You had General Magic, an Apple spinoff that pioneered the notion of intelligent agents that would go do work on your behalf with a scripting environment, and that also pioneered things around virtual machines. The idea of the cloud: you would send these agents out, they would actually do your bidding across different machines, interact with others. So you had the beginnings of web services and APIs, and that was very, very basic — pre-machine learning and all the artificial intelligence. And now you just have this phenomenal resource of data that's out there, and the compute power and the specializations, to turn this into extremely useful stuff and specialized applications. Yeah, I think for chatbots, that's still relatively easy, because it's basically based on rules — language rules. Exactly, yeah. And it's text-based, you don't need to do a lot of analysis, so that's why it's like... And that's trivial, that's been done before. Yeah. The really difficult one — for example, let's take Amazon Echo. When Amazon Echo was launched, people thought it was just about allowing me to buy things using my voice, without using my hands, so it was just Amazon trying to sell more things. But then over time, people realized, no, this is like a Trojan horse.
The main thing is, in the world of AI there's one problem that hasn't been solved, which is what we call the cocktail party effect, right? When you're at a cocktail party, you can understand the guy talking to you — you understand the content — but if you put a machine there with a microphone, it would pick up all the conversations and wouldn't be able to discern which conversation was directed toward it. So that has been one of the holy grails, at least on the audio side of AI: how to develop a system that can do that. And one of the problems is that you do not have a database big enough to train the system. But now people realize that by putting an Echo into American households — all my American friends, their children are not talking to their parents now, they're talking to Echo, because Echo always replies. It's turned into almost more of a toy for the kids than something useful, in some cases, yeah. Yeah, I mean, if you have children — when they're asking you all the questions, what's the longest river in the world? OK, what's the second longest? Oh no, when it gets to "what's the seventh longest," it gets really, really annoying. But Alexa is never tired; it will always answer the question. And during that process, Amazon is continuing to gather all kinds of voice data and continuing to train its voice recognition software. I think it will be the first one to really solve the cocktail party effect. It could, yeah. And it'll be interesting to see where they go with it, because right now, you know, we're seeing better results — at least personally, with our family — out of Siri, in terms of informational queries and things like that. Wow. But in the case of Siri, it's actually quite limited, because you're talking to the phone, so it's just one voice going into it. Yeah, correct. Yeah, absolutely, it's different.
And because of the way we've seen Apple do things, a lot of the information is processed on the device itself, or there are more protections around it. And that also gets into some differences between European startups and US ones in terms of how they treat things. So, for instance, there are some companies doing connected video cameras in the IoT space that are doing visual recognition — face recognition — on device, in the camera itself, rather than sending things out to the cloud. A couple of reasons for that. One is that the cloud takes a long time: you can do more efficient processing there because you have a larger dataset, but there's a lag, which is not good for security — if there's some unknown person, or someone you think might be a bad character, you want to know sooner rather than later. But there are also privacy aspects that are very nice: the fact that it's only in that device, captive in there, and not in some database that someone can hack into or some government can have access to, legally or otherwise. And that's a significant plus. But that takes work — that startup made the decision to have basically dozens of PhDs around optical processing, and instead of taking the easy solution of your classic kind of video processor chipset, they actually took a really, really beefy processor that normally would end up in high-end tablets and did the hard work of coding for that. At first people kind of overlooked that, but now it's a significant competitive advantage for them, as all these other companies in the space don't really have differentiating factors. And why do you want object recognition or facial recognition in a security camera? Because you don't want the motion-noise alerts — that's just garbage data. Motion detected, motion detected, motion detected. You get that a hundred times a day for a shadow or a bird or something like that. You ignore that.
But if it's something in your driveway — a car pulls up and it tells you a car pulled into your driveway and it's not yours — you may want to care about that. Someone walking towards your door: are they wearing a FedEx or UPS uniform, judging from the colors? Or is it someone you recognize — someone from your family, your cleaners at home — something you can choose to record or not? So all these things are important. And now it comes down to taking advantage of actionable information, contextualizing it, and deriving greater value. So I guess my advice for AI startups is: when you start an AI startup — OK, you can be the smartest person on earth, let's put it at an IQ of 240, very good at mathematics, very good at writing code, writing convolutional networks — always think about where you are going to get the data to train your machine. Because, for example, if we go back to the Echo narrative, Amazon is not going to give you that audio library. You can have a very, very good model, but then please join Amazon as an employee, because that's the reason they rolled out Echo — they want to monopolize that data. Same thing if you're doing self-driving software: you don't have the street video to train the system on what happens if a dog runs out. So if you want to do an AI startup, you shouldn't be starting with "I can do this part of the algorithm; all I need is the input, and I'll go find someone to give it to me." Nobody's going to give you that input. There are some public databases, like ImageNet, which is open-sourced by Stanford. But by now all the battlefields have shifted to getting the data, which is why, as a hardware investor, we're betting on full-stack AI startups. Starting from sensors: you build your own sensors for the kind of contextual environment data you want to get, and then you process all the way up.
And for the kind of startups we're betting on, there's no way they're going to give that data to pure software startups. Yeah, I mean, data has so much value. That's one of the reasons why you've seen people band together to do things around, for instance, collecting better next-generation maps: they need it for their own GPS purposes and in-vehicle systems, but they also really need it for self-driving. The more accurate data you can have out there, the better — and sometimes you can do it on your own, sometimes it takes alliances of big corporate giants cooperating to develop those data sets, depending on who they are, especially if they're trying to catch up. And there's also an interesting thing around deep tech: because of the reticence some people had about investing in chip companies a couple of years ago, I've seen a number of cases where companies had a chip, they had a value proposition for it, they were looking for buyers and investors, and people said, ah, not sure. So they said, all right, well, we'll do a use case. We'll show you what we can use this for — whether it be stuff around power management, or speed of response for things like visual identification. And some of these startups are actually commonly known now under the brands of their consumer products or enterprise products that use their own chips. They've gotten to the point where they're so successful just selling the end devices that they're going to keep those chips to themselves as core intellectual property, and they're raising money off the back of this new product category. And it's really kind of interesting. These are things where at first VCs said, ah, no one's going to want to buy that. And then the thing took off and they were proven wrong. So sometimes these things are overnight successes that take five, six, 10, 15 years. So we still have three minutes, and I want to talk a little bit about exits.
So what's interesting — I think I'm seeing two things. For example, Mark Cuban famously said earlier this year that the first trillion-dollar company is going to be an AI company, right? So he said a trillion-dollar market cap, you know, beyond Apple. But what I'm seeing is that actually, no — a lot of the new deep tech startups, especially the AI ones, are being picked up by acquirers very fast. Intel already acquired a couple of them. Google picked up DeepMind — which, by the way, is a European startup based in London — for $400 million. So they're doing that. So I don't really see Mark Cuban's trillion-dollar company growing up, because they're going to be picked up by the cash-rich companies. What do you see? Is it a good thing that the startups are being acquired by the big guys for $300 million, $400 million, instead of trying to grow into billion-dollar companies themselves? Yeah, well, for some of those, I think it's the right amount of money at the right time. Some of them had been struggling for a while. Sometimes, you know, for an engineer, it's not always just about the money now, but also the long-term incentive packages people get when acquired. But it's also maybe about seeing the impact your technology can have on the world. There's only so much you can do sometimes as a startup. And sometimes you look at what the corporate parent can do with that technology — the additional reach it can have, how much it can be iterated and progressed in conjunction with other corporate resources — and that really is a force multiplier. And for someone who is creating technology, all of a sudden having this massive distribution potential can be really exciting, because you can have even more people using it. And, you know, to jump back, here's a riff on the Mark Cuban thing.
Maybe the first trillion-dollar AI is going to be a financial services AI that runs rampant, messes up the stock markets, and jacks itself up. Or maybe — yeah, I think you're right. I tend to feel, because I also studied economics, that if you have something very powerful, it's going to end up in perfect competition unless you have a natural monopoly. And AI is also democratizing, right? So I think the one-trillion-dollar company might come through M&A. It's probably not that you do something great and become a one-trillion-dollar company yourself. And any kind of technology goes back and forth. You remember — so, to wrap it up — 10 years ago, people were using ImageNet but only getting error rates of around 20%. Now they're competing on a 6.7% versus a 6.5% error rate. So you see that happen. And it's not like five companies — it's like 20 companies competing on that kind of thing. And the margin of benefit you can get from competing on something that is quickly democratizing is going to go away. So as an AI startup, you have to rethink your long-term value — where that is — and how you view your defensibility. I don't know what you think. Yeah, I mean, we've got like 20 seconds. I don't think I can address it all, but maybe, also going back to the trillion dollars: AI now is permeating a lot of companies, right? You have a lot of the classic companies in Silicon Valley — from semiconductors, to systems, to smartphones — doing a lot under the covers with AI. They don't always talk about it much, because they see it as a competitive advantage in terms of what they're going to apply it towards. So I think from that perspective, it's going to be very exciting going forward. Cool. Well, I think our time is up. Thank you very much, Aymerick. Thank you. It's great to have you.