Hello, everyone. Thank you for joining us today, and welcome to the latest session in our series of workshops and interviews. My name is Pamela Robertson, program manager for the SPIE AR VR MR conference taking place in March of 2021. Today we'll be hearing the latest on laser beam scanning for near-eye display applications from key players in the industry, and I would very much like to thank all of our speakers for joining us. They're joining us from all over the globe: we have them in Finland, Taiwan, Japan, Germany, and the US. So yes, lots of time zones, and I so appreciate everybody accommodating different times and making it today. And thank you to everybody joining us in the audience; I'm just so glad you're here. Before we start, I'd just like to go over a little bit of housekeeping. Just so you know, this session is being recorded, and the link will be on the website probably early next week. So check back on the website, not only for the recording but for other upcoming events. We have sessions going on all the way from today up until the March workshops and interviews. Okay, so now I'd like to introduce the chair and creator of this important workshop. He has vast experience in consumer electronics from his positions at Dolby, Panavision, Microvision, and Toshiba, and he's currently director of strategic marketing development at STMicroelectronics. I'd like to welcome Bharath Rajagopalan. Oh, I'm sorry. Over to you, Bharath. Thank you. No worries. Thank you so much. Let me get my video up here and running. Can you hear me, Pam? Is it clear? Yeah. Great. Thank you so much, Pamela. Well, first of all, thanks to you, Pamela, and to SPIE for hosting this event and inviting us to discuss laser beam scanning solutions for near-eye display. We really appreciate it.
And like you, I want to second you in thanking our panelists and speakers from all over the world; their participation is much appreciated. I also want to thank Bernard and Kristoff for their guidance in this. It's been a really collaborative effort, and we really enjoy working with SPIE and are looking forward to this event as well as to the conference next year. And of course, thanks to all of you out there listening; without your participation this would not be possible. So with that, let me go ahead and share my screen and start the presentation. Is this coming through, Pamela? You're good. Great, wonderful. So thank you again. My name is Bharath Rajagopalan, and as Pamela mentioned, I'm the director of strategic marketing at STMicroelectronics. I'm really pleased to be here today with my speakers and colleagues to talk to you about laser beam scanning for near-eye display. So let me give you a quick synopsis of what we hope to cover in today's session. Today's session is only an hour and a half long, and this topic is so rich and dense we could spend days talking about it. But the idea today is just to give you a little snippet or snapshot of the seminar workshop that we have scheduled for next March. This is sort of a teaser event, an appetizer for the main course, which will be next year, and hopefully it will whet your appetite to engage more, learn more, and certainly come next year for the workshop. So this is all about laser beam scanning. And what we want to really focus on today are the key technologies and solutions that enable wearable AR devices, which is the holy grail that the industry has been discussing for many, many years now.
And as we come to a position where a lot of technologies are maturing, it is time right now to start putting together the various disparate components and to think about the holistic framework under which they can come together, in a system integration framework, as well as to look at all the manufacturing aspects to make this a reality. One of the things you'll find through the course of today's discussion is the need for and importance of monolithic design, holistic design, a systems view, because very often the focus has been on an individual component, an individual device, or an individual technology. What's really left out is looking at the entire system that has to come together in order to enable a lot of the key characteristics that drive AR glasses. And really, to make all that happen, it's also critical and important to have an ecosystem. I'll share with you later on the recent press release that a lot of you probably saw, to really make the point that above and beyond having technology, above and beyond having solutions, it's also important to have an ecosystem and a platform which can then be offered to the marketplace to help build, drive, grow, and catalyze the market. So with that, let's go into more detail. The goal, again, is what's called heads-up, hands-free, all-day wearable devices, all-day wearable glasses. That is the holy grail. At the end of the day, we want something that looks like what I'm wearing right now, that looks like any pair of glasses, but has all the really advanced technologies that allow for augmenting information on the physical world. And some of the key things needed to enable the utility of these kinds of devices and applications are the factors over here. This is not an exhaustive list by any means, but these are some of the key criteria that are important in order to facilitate all-day wearable glasses. Certainly, we want these to support indoor and outdoor use for maximum utility.
What that really means is the peak brightness coming to the eye needs to be in excess of 1,500 candelas per square meter, or nits. That allows you to have glasses that are less tinted, more transparent, rather than going to a sunglass type of device. Certainly, the form factor is absolutely crucial and critical. A lot of you probably saw the recent announcement from Facebook and Luxottica about their collaboration on smart glasses for the future, and that really highlights the importance of fashion and form. That's absolutely vital, and I think when you start designing these kinds of systems, you have to start thinking about the end and work your way backwards: from fashion back to hardware and software and technology. Power is obviously important, clearly. And really, to make this viable in cost, weight, and size, it has to be less than one watt total system power. That is the optics, photonics, electronics, the sensors; everything needs to be under that envelope. The weight is critical, of course. If you wear these glasses all day long and they're much above 70 grams, you start to put pressure on the bridge of your nose and start to get headaches, so it's really important to manage the weight as well. Latency: by latency, I'm talking about motion-to-photon latency. Particularly for immersive applications like XR and MR, you need to be under 4 milliseconds. AR is more forgiving; it's not as critical. But as we get into the holy grail of a fully immersive environment, latency becomes extremely important. Field of view is very interesting. Lots of people talk about field of view, and I think it's really important to talk about it in terms of applications.
We believe, and a lot of our customers and partners believe, that you have to separate augmented reality, by which I mean applications with simple symbology, text, graphics overlay, informatics if you will, from MR and XR, the fully immersive holographic-rendering types of applications. The field of view is very different. Certainly for simple AR, smart glasses if you will, a 30 to 40 degree diagonal field of view is more than enough; any larger is unusable for the application, quite frankly. Certainly, if you want to go immersive, it's absolutely important to have a very large field of view, at least 80 degrees if not larger, so that you can have much more immersive and realistic interactions with the media and content around you. Likewise for resolution: again, the application will dictate. For AR, for symbology, text, and graphics overlay, 720p is more than adequate; you actually don't need more than that. But certainly, when you go into an immersive environment, 1440p is a minimum, and probably much higher than that, in order to have the highest visual fidelity possible for rendering these images in the virtual space. And last but not least is the eyebox size. Given the large range of variation in the IPD, or interpupillary distance, a large eyebox is important in order to reduce the number of SKUs you need and to mitigate the need for any kind of IPD adjustment in the glasses. So being greater than 10 millimeters is certainly a really important criterion in that regard. So those are some of the requirements. Again, there are more, but at a top level these are some of the key things that need to come together, and they really drive the technology selection choices as well as the integration scheme and your overall design. So, laser beam scanning: how does that fit into it?
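The requirements just walked through can be captured in a small sketch. The figures are the speaker's ballpark numbers from this session, not a formal specification, and the dictionary layout and helper function are my own illustration:

```python
# Rough near-eye display targets as discussed in the talk (illustrative only).
REQUIREMENTS = {
    "smart_glasses_ar": {               # symbology / text / graphics overlay
        "peak_brightness_nits": 1500,   # outdoor-readable through lightly tinted lenses
        "total_power_w": 1.0,           # optics + photonics + electronics + sensors
        "weight_g": 70,                 # above this, nose-bridge pressure causes discomfort
        "fov_deg_diagonal": (30, 40),
        "resolution": "720p",
        "eyebox_mm": 10,                # covers the IPD spread without per-user adjustment
    },
    "immersive_mr_xr": {
        "motion_to_photon_ms": 4,       # immersion breaks above this latency
        "fov_deg_diagonal": (80, None), # at least 80 degrees
        "resolution": "1440p minimum",
    },
}

def within_weight_budget(weight_g: float, budget_g: float = 70) -> bool:
    """Return True if a frame weight stays under the all-day comfort budget."""
    return weight_g <= budget_g

print(within_weight_budget(65))   # True: comfortable for all-day wear
print(within_weight_budget(120))  # False: too heavy for eyeglass form factor
```

The point of the structure is the one the speaker makes: the two application classes pull the same parameters in different directions, so a single spec sheet cannot serve both.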
Well, it should be no surprise to hear me say that it fits really nicely into near-eye display, for several reasons. Number one, size matters, and in this case the smaller the better. Laser-beam-scanning displays, or microdisplays, give you very small form factors. These typically use MEMS micromirrors for the laser beam scanning. Down below, you'll see some images and pictures of what they look like. On the bottom left-hand side, you'll see a US one-cent coin, and next to it is the MEMS die. This is an example of an electromagnetically actuated mirror die, and right above it is the MEMS package; because it's electromagnetically actuated, it has magnets, but of course there are other actuation schemes which allow for different kinds of packaging. Next to it, on the quarter, is a fully packaged optical light engine. You see the flexible PCB connections for both the laser module and the MEMS. That is a pico-projector engine. Even though it's small, about the size of a quarter, it's still too big for near-to-eye glasses, so my colleague Marco will talk about what's new on the horizon that we can share in terms of small form factors that can enable near-to-eye. And then how it works: as you can see in the little diagram below, it's very simple. You have three illumination sources: blue, green, and red lasers. As you can imagine, these are all edge-emitting lasers and therefore have a very high divergence angle, so they need to be collimated through a set of collimating lenses. The beams then need to be combined so they're collinear and overlap. They go through a dichroic element, or beam combiners, and hit the MEMS mirror, which scans back and forth, either two 1D mirrors or one 2D mirror, to do the full raster scan. So that's the operating principle of a laser beam scanning device.
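As a back-of-the-envelope check on the raster-scan principle just described: the fast-axis mirror has to draw every line of every frame, and with a bidirectional (zigzag) scan one mirror oscillation draws two lines. The formula and numbers below are my own illustration, not from the slides:

```python
def mirror_line_rate_hz(lines_per_frame: int, frame_rate_hz: float,
                        bidirectional: bool = True) -> float:
    """Mechanical frequency the fast-axis mirror needs to draw every line.

    With bidirectional scanning, one full oscillation draws two lines,
    so the required mechanical frequency is halved.
    """
    line_rate = lines_per_frame * frame_rate_hz
    return line_rate / 2 if bidirectional else line_rate

# 720 lines at 60 Hz with bidirectional scanning -> a ~21.6 kHz fast-axis mirror
print(mirror_line_rate_hz(720, 60))  # 21600.0
```

This ignores vertical blanking and overscan, which push the real number somewhat higher, but it shows why resonant MEMS mirrors in the tens-of-kilohertz range are the natural fit for this architecture.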
As I mentioned, in order to get the size down, you start with the MEMS, but you also have to have very small illumination sources. OSRAM is here, and they'll talk about how advanced, very efficient laser diode packaging enables a very small form factor that can support these applications. Between those two, we can generate well over a million nits, which is really important so you have that kind of excess overhead of optical power given the various losses in the system. And we can drive the overall system at less than a watt with such a configuration. Then certainly, at the end, you have to have an imaging element; you have to form the image at the image plane. And to get very small-form-factor, lightweight glasses, using diffractive or holographic optical elements is a very good choice for coupling in the optical output from the optical light engine. Dispelix and Applied Materials are here to talk about how that couples into waveguide technology. And then finally, we'll talk in a little more detail about the different kinds of actuation schemes available for laser beam scanning. Now, design considerations. I want to spend a little bit of time here; this is really important. What we are finding, talking to a lot of people in the market and in industry, is that people tend to focus on one thing in terms of optimization. And this is a very complicated problem, as a lot of you can probably appreciate and imagine. This example diagram, this radar or spider-web diagram here, just illustrates the fact that you have to simultaneously optimize across multiple domains: you have to optimize power, and so on and so forth. And these all interact, as you can well imagine.
And so trade-offs have to be made very, very carefully, and you're not going to get everything; it's just not going to happen that way. So it's really important to have a bespoke design, or if you will a holistic design, a monolithic design, that takes all these considerations into account within the framework of the application. The application will determine where you need to be, what you want to trade off, what penalty you pay, and what benefit you receive. For example, as I said earlier, in the case of simple AR, with text, symbology, and graphical overlay, you're talking about a 30 to 40 degree field of view, a 10-millimeter eyebox, greater than a thousand nits, and so on and so forth; you can read all the specs there. That will then determine how you optimize the different parameters of the system and get to the kind of design that makes for a very appealing and sellable product in the marketplace. Conversely, when you look at a different application and go to a fully immersive experience with holographic rendering, you're talking about a different set of characteristics. Brightness is important, but not as critical, because a lot of those use cases are indoors, typically, for now anyway. So you can get away with hundreds of nits, or at least tint your optical waveguide elements or your lenses. And certainly the weight is important. You don't want to be too heavy, but you're not as driven to be eyeglass-type, rather HMD-type, so you can sacrifice some weight, though you've got to be under 200 grams, for example, and that allows some flexibility in the trade-offs of the design as well. And so the point here, again, and I just want to emphasize it, is that it's really, really important to design the system first and then work your way back down into the individual components and the architecture, not the other way around.
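The spider-web trade-off the speaker describes can be sketched as a weighted figure of merit. The axis names, weights, and scores below are entirely hypothetical, invented for the example; a real design review would derive them from the target application, exactly as the talk argues:

```python
# Hypothetical multi-domain trade-off scoring for a spider-web diagram.
# All numbers here are made up for illustration, not from the presentation.
def weighted_score(scores: dict, weights: dict) -> float:
    """Collapse per-axis scores in [0, 1] into a single figure of merit."""
    total_weight = sum(weights.values())
    return sum(scores[axis] * weights[axis] for axis in scores) / total_weight

# Simple-AR glasses weight power and frame weight most heavily.
ar_weights = {"power": 3, "weight": 3, "brightness": 2,
              "fov": 1, "resolution": 1, "latency": 1}
candidate = {"power": 0.9, "weight": 0.8, "brightness": 0.7,
             "fov": 0.5, "resolution": 0.6, "latency": 0.9}

print(round(weighted_score(candidate, ar_weights), 3))  # 0.773
```

An immersive-MR weight vector would flip the emphasis toward field of view, resolution, and latency, and the same candidate design would score very differently, which is the speaker's point about designing from the application backwards.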
And the nice thing about laser beam scanning, we believe, is that at least in power, form factor, brightness, scalability, and a lot of other parameters, it can address a lot of these constraints and give you a lot of knobs to turn for the design and the end product. And finally, it takes a village to do this, right? And so we are pleased to announce to you the LBS ecosystem. Some of you may have seen it, some of you may not, but we recently announced the LaSAR Alliance. LaSAR stands for Laser Scanning for Augmented Reality. It's a consortium of companies, and the members shown here are the founding members. And the idea really is to make it easy for companies to go ahead and develop products. These are very complicated; every piece of the puzzle is very, very complicated. So our goal is to work holistically amongst ourselves in order to develop the technology platforms, the technology frameworks, the interface schemes and mechanisms, and even business models, to enable anyone who desires to build HMDs or AR glasses. And that's the LaSAR Alliance. And so the key elements that came together are the illumination sources, the imaging, the waveguides, and also the end products, which we talked about earlier. You have companies like Mega1 who then take all these components and put them together into the light engine, and that in turn gets put into an overall end product. And Quanta, for example, is a world leader in developing a whole range of end products. So you have to really bring all these pieces together in order to drive the marketplace. Our goal is to facilitate as much as we can, to offer the ecosystem and provide a much lower-friction path that allows the various players in the industry to go to market. You will see two empty boxes here, for OEMs and application processors.
Suffice it to say, we're talking to a lot of different people. For the OEMs, we cannot put their names up. OEMs, for those of you who don't know the vernacular, are the end customers, the brand customers. A lot of them don't want to talk about their plans right now, so that will stay a white box for some time, for sure. But the application processor is very, very important, so you'll see that box being filled in pretty soon. And what I want to close with is that more partners are to come. We're open to others as well; this is an open society, not a closed society. So join the alliance, come learn more about us. We'd love to engage with you as the days and months go by. And next year, at the full event, we'll have a much deeper, full-day discussion from all the people on this call, as well as others, covering all aspects of laser beam scanning for near-eye display. So with that, I want to be sensitive of time, and I'll turn it over to my friend and colleague Marco Angelici. He's a director of micro-actuation at STMicroelectronics, responsible for, among other things, developing the business around MEMS micromirrors. He'll go into more detail about MEMS micromirrors: how they work, what's out there, and a teaser of some of the developments we're engaged in. So with that, Marco, let me turn it over to you. Thank you, Bharath. Thank you for passing the torch to me. Let me share my presentation. Tell me if you can see it. Can you see it? Looking good, Marco. Perfect. So thank you, everybody. Thank you, Bharath. Let me introduce a preview of laser beam scanning solutions, from ST's standpoint, for near-to-eye display. First of all, talking about MEMS mirrors, one of the key components in laser beam scanning, let me give you a brief history of MEMS at ST.
As you can see from this chart, ST has been developing and selling MEMS for at least 20 years, and we are committed to both sensors and actuators. We are serving several markets and applications, from consumer, mobile, and personal electronics to industrial, automotive, medical, et cetera. And for sure, we are committed to volume: as you can see in the bottom part, we have shipped to date 19 billion MEMS sensors and 5 billion MEMS actuators into the market. Taking one step down into the details: MEMS actuator technologies. ST is in mass production and is heavily investing in all the MEMS actuator technologies, from thermal, for inkjet print heads, dispensers, atomizers, and other kinds of fluidic actuators; to electrostatic and electromagnetic technologies, mostly related to MEMS mirrors; and piezoelectric, where we have been investing heavily in recent years, not only for MEMS mirrors but also for MEMS speakers, autofocus cameras, and ultrasound PMUT components. Talking about MEMS, it's not just about having the MEMS actuators, just the muscles; you also need to drive those muscles. So we are committed to providing our customers with the full system, with mirror drivers and laser drivers to drive those actuators. And as I said before, being committed to volume, we are focusing not only on development but also on mass-volume manufacturing. So we have two dedicated MEMS fabs in the world, in Milan and Singapore, and we also have two dedicated BCD fabs; BCD is the technology most used for our laser beam scanning drivers. That's very important, also in this time of COVID: we see the need to provide our customers with second sources and volume manufacturing, because we believe augmented reality, for instance, will be a big-volume market, and that's where we are investing.
Going another step down, I would like to show you what we are doing on our MEMS mirrors, which are a key technology, a key pillar, in laser beam scanning projectors. ST here has a wide range of mirror technologies: we provide our customers with electrostatic, electromagnetic, and piezoelectric mirrors, all of them in mass production, all of them integrated into our MEMS production lines. And thanks to these three technologies, we can fulfill our customers' requirements for all their needs, from ultra-low power to the need for big displacement and big masses. So we can cover the full spectrum of requirements, and these capabilities are making ST the undisputed leader in laser beam scanning solutions in the market, with more than 12 million mirrors already shipped to date. So that's our leadership; we have it and we plan to maintain it. And we're also showing results in augmented reality. Of course, it's not only augmented reality: we are investing in LiDAR and other technologies, and recent teardowns of products in both augmented reality and LiDAR show that ST is already present in the market. And this is the key slide of the presentation, I would say. I want to give you a preview of what ST is providing: ST is a one-stop shop for laser beam scanning. As you can see on the right side of the slide, I'm focusing on all the components required for laser beam scanning. MEMS mirrors I already discussed, but it's not just the MEMS mirror. We develop and produce mirror drivers for all the technologies I was mentioning, the electrostatic, electromagnetic, and piezo actuation technologies, focusing on performance and power consumption. With energy-recovery drivers, we have several patents and already-tangible results on how to move mirrors with a high field of view at ultra-low and very low power.
We also develop dedicated laser diode drivers for augmented reality, in particular with a 300-picosecond rise and fall time and a 300-megapixel-per-second pixel clock, to provide truly crisp pixel performance with ultra-low-power characteristics optimized for augmented reality, where the content is sparse and you don't have all the pixels illuminated at the same time. So we have developed the power management unit to ramp up and down, let me say, to move from a standby low-power condition to full operation in a few nanoseconds, and to really go into ultra-low-power mode within a few pixels, during black pixels basically. Of course, we provide three and four channels, for RGB driving or RGB plus infrared. And going further in the availability of support to our customers, we provide full system support. So we have developed hardware and software control loops for the mirror and for the laser. We provide the eye safety, all the calibration required for manufacturing, the video processing, the pre-distortion, the temperature and pressure compensation, and all the kinds of support required to develop a full system, not just the components. So we have know-how not only for component development and manufacturing, but also at the system level, and we have a team supporting this portion. Last but not least, we have recently engaged in the relay optics as well, because, working with our partners, as Bharath was mentioning at the beginning, this is a holistic approach: we have to work from the lasers all the way to the eye. And working with our partners on the waveguides, we have designed and patented a couple of designs to maximize the performance of the coupling between the laser beam coming off our mirrors and the in-coupling of the waveguides. If you go to the left side of this presentation, you can see the building blocks. And as I said, apart from the laser diodes, for which we rely on our key partner, and the optics, which ST does not manufacture,
all the rest is developed and manufactured by ST. On the bottom left you see an image of a real temple-arm reference design including all our electronics; those are all ST components. And on the right side is the optical engine, including the laser diodes, the mirror, and the optics, that generates the beam going out. Talking about the optical engine, you can see the evolution here. This is just a preview and a teaser of what will be detailed in the next presentations and what will be presented next year at SPIE Photonics West. Basically, we moved from our pico-projector of 1.7 cubic centimeters, developed and sold in the market in 2015, for which we built a proof of concept; you can see an example of that technology proof of concept on the bottom left. Then we have this new engine available today. We call it Helen. You see the real image, with a lens coming in from the top; that is a collimation lens used for the collimation process in our labs. So we have built the design of the optical engine. The dimension of this optical engine is 0.75 cubic centimeters, with similar performance to the previous pico-projector but much, much more compact, thanks to the 3-in-1 laser diode module from OSRAM. And thanks to the LaSAR Alliance that was announced two days ago, we are increasing our capability with our partners to manufacture the optical engine that we have designed, and the capability to go to a final product, a full pair of glasses, that should be available in the market by next year. We have a roadmap as well, and it's more than a roadmap: for 2021, we already have most of the components ready. New mirrors, what we call Star One here on the right: an optical engine going down to 0.65 cubic centimeters, increasing the field of view, increasing the resolution.
Most important, this will be 50% of the power consumption of what we have in the center, and what we have in the center is already much below one watt. That was what Bharath was mentioning: the target is to be below one watt in 2021 with a binocular solution. And this is thanks to our thin-film piezo mirror technology and the energy recovery: thanks to our piezo technology, we have enabled an ultra-low-power solution providing very good performance. But stay tuned, I would say; we'll be waiting for you in the coming weeks and months to see the evolution of the alliance and the evolution of those products. And of course, we are open to answering more questions about those products. Thank you very much. Thank you, Marco, very much. I really appreciate that; that's very illuminating. Great. Thank you. So our next speaker is Stefan Morgott from OSRAM Opto Semiconductors. For those of you who don't know, he's calling in at a very early hour in Germany, so I really appreciate Stefan taking the opportunity to be with us. Thank you, Bharath, for this introduction. So let me share my screen with you. I hope it's now in presentation mode. Okay. So now let's talk about the light source for laser beam scanning systems. As you can see here, for laser beam scanning you need three lasers: a red, a green, and a blue one. So far, for these laser beam scanning systems, individual packages were used: one package each for red, green, and blue, mostly the so-called TO-can packages, which are an industry standard for laser diodes. But as mentioned before, for near-to-eye or AR glasses it's important to make the projection engine very small, and therefore there is, of course, the idea to put all three colors into one package. And you see here a nice example from ST: on the left side, you see an engine using TO-can lasers.
So you see the illumination unit is quite big, due to the big laser packages and the optics you need to collimate and combine the beams. And on the opposite, right side, you can see what you can achieve by putting all three colors into one laser package. And this is exactly what we are doing now. So here you can see the image of the package. This is a kind of target specification, which we are already fulfilling. It's a very small device: the footprint is only 7 millimeters by 4.6 millimeters, and the height is 1.2 millimeters. It's an SMD device, and a so-called top-looker, so the RGB beams are emitted out the top. This is achieved by using edge-emitting lasers, which are placed inside the package with a certain gap in between; in this case, we have chosen 2.3 millimeters of spacing. And for each color, for each beam, a prism is used to deflect the beam from the horizontal to the vertical direction, to get a top-looking device. So there's no beam collimation and no beam combination, and the arrows you see here are only the optical axes. Of course, as is characteristic for edge-emitting lasers, the beam is emitted in a kind of cone, with an emission angle of about 7 degrees by 22 degrees. The power levels are chosen as 100 milliwatts for red, 50 for green, and 80 for blue; I will come to that on the next slides. This is important because, of course, you use the three lasers to mix the colors. So you can see here the CIE 1931 color chart, showing the three laser wavelengths, the primaries red, green, and blue. Now, by operating these three lasers, you can get any color within this triangle. And you have to operate the lasers by applying a forward current above the threshold, and this has to be done in a very short time: you have only some nanoseconds to achieve a resolution of 720p at a refresh rate of 60 hertz. So the lasers have to be pulsed very fast, and in a very exact way, to mix the right color for each pixel.
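The "some nanoseconds" per pixel can be made concrete with a quick calculation. The formula is my own illustration from the numbers in the talk (720p at 60 Hz), ignoring blanking and scan turnaround, which would shorten the real dwell time somewhat:

```python
def pixel_dwell_ns(width: int, height: int, refresh_hz: float) -> float:
    """Average time available per pixel, ignoring blanking and turnaround."""
    pixels_per_second = width * height * refresh_hz
    return 1e9 / pixels_per_second

# 720p (1280 x 720) at 60 Hz: each RGB current pulse has roughly 18 ns
print(round(pixel_dwell_ns(1280, 720, 60), 1))  # 18.1
```

Against this ~18 ns budget, the 300-picosecond rise and fall times mentioned in the driver discussion earlier consume only a small fraction of each pixel, which is what keeps the pixels crisp.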
And coming back to brightness: optical power, luminous flux, and luminance always get confused, so here's an example. To achieve a good, well-balanced D65 white point at 22 lumens, you need about 57 milliwatts of red, 36 of green, and 21 of blue. And if you were to make a full white screen, which is maybe not typical for the application but is a useful reference, you need about one watt of electrical power into the laser diodes. And now, going from the laser diodes to the eye, this is the kind of optical train. Putting one watt into the lasers means you emit about 25 lumens out of the lasers. Now, assuming 30% efficiency for the optical engine, you get out about 10 lumens, and those 10 lumens are then fed into the optical combiner. Assuming the optical combiner delivers 100 nits per lumen, you will get 1,000 nits to the eye. So this is an example of how the electrical power is transferred into the luminance of the whole glasses. So, those were some slides about the lasers. Thank you very much. Stefan, thank you so much for that. I really appreciate it. Thank you again. Our next speaker is Josh Littlefield from Dispelix. Josh is the executive vice president of North America sales at Dispelix, and he'll talk to us about waveguides. Josh, take it away when you're ready. Perfect. Can you see my slides? Not yet, Josh. Hold on a second. There we go. Got it. Thank you. All right, it's nice to meet everybody. My name is Josh Littlefield, and I'm with Dispelix. I've actually just joined a little over a month ago, so you're all stuck with me today; our technical team is more than happy to answer any detailed questions in the future. But until then, like I said, I'm going to walk you through where we're currently at. So we're here, obviously, to talk about diffractive waveguides and LBS.
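The laser-to-eye budget just described chains three factors, so it is easy to sketch. The function is my own formulation of the speaker's walkthrough; note that the strict arithmetic on his figures (25 lm at 30% efficiency) gives about 7.5 lumens, which the talk rounds up to "about 10 lumens" before applying the combiner's 100 nits per lumen:

```python
def luminance_at_eye_nits(laser_lumens: float, engine_efficiency: float,
                          combiner_nits_per_lumen: float) -> float:
    """Luminance reaching the eye from the flux leaving the laser module,
    the optical engine's throughput, and the combiner's nits-per-lumen figure."""
    return laser_lumens * engine_efficiency * combiner_nits_per_lumen

# Strict product of the quoted figures: 25 lm x 0.30 x 100 nits/lm
print(luminance_at_eye_nits(25, 0.30, 100))   # 750.0

# Using the talk's rounded intermediate flux of ~10 lm directly
print(luminance_at_eye_nits(10, 1.00, 100))   # 1000.0
```

Either way, the chain shows the order of magnitude: roughly one watt of electrical laser power yields on the order of 1,000 nits at the eye, which is why the earlier requirement of 1,500 nits for outdoor use leaves so little margin in the system.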
We at Dispelix are extremely excited about the overall display industry, and in particular the trajectory of AR performance innovation. So let's talk a little bit about some of the technology that's currently out: birdbath, and LCoS and DLP. There is some amazing technology in the marketplace leveraging these approaches. Birdbath does some really good stuff; it's been around for a long time, and for the use cases it serves, it does what it's supposed to do, but it does have limitations. As for LCoS and DLP: if you know my background, I actually just joined Dispelix from Magic Leap, so I am a massive fan of the Magic Leap One; I'm a big believer in the technology and where it's going. As a matter of fact, just earlier this week, Dispelix put out a press release on our new 30-, 40-, and 50-degree-FOV waveguides that support LCoS and DLP, and we can go much higher than that on the FOV as well. So we're huge proponents of this technology. With that being said, though, there are some limitations to LCoS and DLP. All of us on the call today are here because we are dedicated to helping, whether you're a consumer or an individual, or you're in the enterprise space, heavy industry, or defense; all of us are committed to bringing augmented reality to everybody. And in order to do that, we have to get past some of those limitations. With laser beam scanning, we are finally able to start attacking those issues, and those issues are in form, in function, in quality, and in cost at scale. Our goal is that if you're in industry, enterprise, or defense, you're focused less on the device on your head and more on the work you're doing, leveraging the tool you're wearing without having to worry about it falling off, or where it's at, or the weight, things like that.
Or if you're a consumer, you want to be able to look the way you're used to looking, right? So you don't feel out of sorts, you're able to see what's around you, and, again, you don't have to deal with the weight. So we're extremely excited about where laser beam scanning is going. There are a lot of advantages that come with LBS: higher resolution, lower power, much better contrast, greater transparency, and of course the smaller form factor and lighter weight. But in order to get there, there are a lot of technical challenges for the waveguides that you have to overcome. The coherence of today's laser sources contributes to interference and speckle, and of course can compromise image sharpness. And in order to get smaller, there are optical consequences; we could go on and on about all the different challenges involved in going to a much smaller LBS engine. With that being said, we were able to address these, to conquer the challenges and hurdles of going from an LBS in a lab to something actually ready to be a product. We have LBS demos that are out in the wild right now, and we have many customers coming back and telling us exactly where we stand in the industry; a lot of them feel that we are 12 to 18 months ahead of the marketplace. Our goal is to help our customers build compact eyewear with really thin, really light waveguides and a large eyebox, so people can start wearing it all day. There's a question in the Q&A where somebody asked what some of the use cases for all-day wear are. We could talk about that for a long time, but the big thing is that we want to give people, whether consumer, enterprise, or defense, the opportunity to choose what use cases they want to use it for.
Whether they're using it for five minutes or for ten hours, they need to be able to wear it in a way that's comfortable, and that's our goal. So how do we do that? I'm a big believer in the LaSAR Alliance. In order to have the form, the function, the quality, and the cost at scale, the only way we can do that is through the right ecosystem. By working with people like ST and OSRAM, Applied Materials, Quanta and Mega1, we were able to address all those major issues, to get the product out so that people can start testing and start building for their products' future. There are a lot of complex AR technology components, very individualized components, and everything has to work seamlessly; to accomplish that, there has to be collaboration between all of us, and that's what this partnership has done. As an example, Applied Materials is the manufacturer of all of our waveguides. We know that whenever we work with them, whether we're doing a standard high-quality display or a custom-built or NRE project, every single waveguide that comes out of production is of the highest quality. We also know that on whatever project we're working, whether it's a light engine with Mega1, or work with Quanta, ST, or OSRAM, in every single interaction they are bringing their best people so that our customers can actually achieve their goals and their visions. Again, I'm here more on the business side, although we have an awful lot of technology folks who are more than happy to answer your questions; feel free to email me, and I'm more than happy to put you in touch with my CTO. Thank you. Thank you so much, Josh, I appreciate that. A perfect segue. So yes, we've talked about devices so far; now we need to get into how you realize these devices as some kind of manufacturable product. So I think it's a great segue to Applied Materials.
So let me introduce Naama Argaman; she's a director of product marketing at Applied Materials, and she's going to talk to us about waveguide manufacturing. So Naama, over to you, please. Great. Thank you so much, Bharath, and thank you, Josh, for the kind words. Can you see my screen here? Yeah, looks good. All right, so let's start with a short video, if it works. Your smartphone is a computer that's millions of times more powerful, 40,000 times smaller, and 16,000 times less expensive than the first mainframe computer, and there are more than 2.5 billion in use today. How did that happen? Industrial scale. Every year, nearly 30 million wafers are processed for smartphones. Each 300-millimeter wafer contains hundreds of chips, all meeting the same rigorous specifications. Applied's world-leading expertise in materials engineering and in building equipment to manufacture chips has made this industrial scale possible. In partnership with our customers, we continue to make chips faster, more energy efficient, and more affordable. Today, our systems touch virtually every chip and display made. Even after more than 50 years of creating new possibilities for our customers to embrace, we're just getting started. The world is moving rapidly into an era of self-driving cars, robotic manufacturing, the Internet of Things, virtual reality, and more. Digital devices are becoming extensions of ourselves, sharpening our vision, deepening our insights, and enriching our experiences. These devices will be powered by a new generation of chips, requiring new materials and new ways to build them. Applied Materials is at the forefront of making this new world possible. Come join us. Right. 2.5 billion smartphones in use today, made on over 30 million wafers per year. Applied's industrial-scale materials engineering has made smartphones ubiquitous. Now imagine a world where this industrial scale is used to put augmented reality glasses on every face.
Applied is the world's number-one semiconductor and display equipment company, investing over $2 billion a year in R&D, with 21,000 people across the globe. All these people develop an incredible variety of materials engineering technologies in deposition, etch, and patterning, filling and removing layers at atomic precision using cutting-edge robotics, control, and software. Applied technology is in every phone, every TV, and every electronic device in the world, and we are committed to bringing this technology to the next wave of innovation in photonics devices, including AR. Our group within Applied, called Engineered Optics, is taking all the breadth and depth of capabilities developed within Applied Materials for the semiconductor industry and applying them to optics. For the first time in history, the semiconductor industry can now fabricate deep-sub-wavelength features in high-quality optical materials, opening up new possibilities in an industry that is on the verge of inflection. We're developing our technology on a 300-millimeter platform, but Applied has been making glass equipment for over 20 years in our display business, and today builds Gen10 display tools and roll-to-roll platforms that we could leverage, which gives us huge flexibility as we look into the future. Within our group, we have a large team that has been developing these capabilities for quite some time now, and we will soon be ready to launch a real product. Here are the first two applications that we're looking at: surface-relief grating waveguides for augmented reality, and flat lenses for consumer electronics. You can see two 300-millimeter wafer examples below. On the left are our high-index etched waveguides, demonstrating the best performance we have seen to date on a single-sheet, lightweight, thin substrate. On the right are our near-infrared lenses for 3D sensing, demonstrating over 90% efficiency.
Deep-sub-wavelength optics is a big inflection that we believe will enable a new emerging industry. It's bigger than waveguides, bigger than simple lenses; we expect many more applications to come and revolutionize the way we see optics. We're already getting interest for new applications, so if you think you could use our platform, my contact info is at the end. But we're not done working on waveguide technology. Engineered Optics has developed a unique set of capabilities in order to push waveguide performance forward. First of all, our ability to process 300-millimeter transparent substrates with high-quality films at over 2.0 index, and to etch the waveguides, pushes the envelope on productivity and reliability, but most importantly on image quality. We developed slanted structures that increase the waveguide efficiency and thereby allow for lower power consumption by the light engine and truly all-day-wearable glasses, even in daylight. These slanted structures can also reduce the relay-optics form factor. Our local-area processing and gradient features increase the image quality and color uniformity. And we can do all that with double-sided processing, allowing for a single sheet while maintaining a wide field of view; you can see our inline wafer flipper in the center-bottom image. We handle thin, lightweight wafers with patterned nanostructures on a fully automated back-end line to drive singulation, edge blackening, and stacking as needed. And all of this with high-index films that are inorganic and low-haze, deposited with a mature semiconductor process, so you can leave your smart glasses on your car dashboard on a hot summer day without worrying about yellowing or delamination. Applied's 300-millimeter silicon platform is mature and most advanced, processing today's 5-nanometer features, but processing substrates other than silicon requires a whole new way of thinking about mechanics, optics, chemistry, and physics.
Engineered Optics, as part of Applied Materials, has developed custom tools over several years to build a full suite of equipment that enables all process steps on transparent substrates. This is why Applied is uniquely positioned to drive high-volume manufacturing in this space. The platforms we're using are mature, high-productivity, and high-yield, and we take advantage of 50 years of materials engineering expertise from our parent company, while building a strong optics team and bringing out dedicated metrology and characterization tools for waveguides. In comparison, nanoimprint technologies are still lagging: they're not used at scale today, and we see them as a risk. We offer wafer services from GDS files all the way to yielded parts, at any scale from prototyping to millions of units per year, all with high-volume manufacturing in mind. And if you'd like to learn more about what we're doing, our GM, Wayne McMillan, will give a fireside chat to this forum in December, and I encourage you to attend. Okay, Naama, thank you so much, I appreciate that. Thank you very much. So, along those themes of, you know, high-level integration: we've talked about waveguide manufacturing; our next speaker is Makoto Masuda. Masuda-san is the chief operating officer at Mega1, and he'll talk to us about how to bring it all together: how do you bring all the various components together into a compact optical light engine? So, Masuda-san, please take over. Thank you. Is it clear? It's very clear. Thank you. My name is Masuda, from Mega1, and I will start the Mega1 presentation. Thanks to SPIE and ST Micro for inviting Mega1 to join this SPIE webinar session. Here is the agenda I would like to share: number one, Mega1's capabilities; number two, a factory tour; number three, LBS AR technology; number four, our role in the Laser Alliance. First, Mega1's capabilities.
Mega1 is a spin-off of Megaforce, which specializes in plastic and composite-material injection molding. Mega1 was spun off in 2015 and has cooperated with ST Micro since then. We collaborated closely with ST Micro's MEMS technology to develop the laser optical engine, and gradually gained mass-production capability for high-precision manufacturing and process handling. I joined Mega1 because, to my knowledge, laser is the perfect solution for AR. Mega1 was honored to be part of ST Micro's Laser Alliance. Mega1 designed our own automation system, the key being to collimate the lasers and position the optical elements. Next, second, the factory tour. Here I would like to introduce our production line, which gives you a better idea of the Mega1 factory floor. From wafer-level parts, we perform die bonding, wire bonding, and optical-part assembly, collimation, and calibration to form an optical engine; from wafer to optical engine, everything is done with self-designed AOI automation equipment. All processes are automated, built by our talented members, and we are proud to announce that. Next, LBS AR technologies. Mega1 has the ability to do optical simulation, is familiar with optical and electrical parts and supply chains, and can perform image-quality evaluation. In other words, optical design co-development and its evaluation are possible, and Mega1 can also tune image quality based on them. Therefore, Mega1 can provide the most stable laser module system for customers. We also specialize in ultra-precision manufacturing, with experience accumulated at Mega1 over 10 years. Final page: our role in the Laser Alliance. Mega1's role in the alliance is mass-production manufacturing of the laser optical engine, as well as evaluating and fine-tuning image quality with our key suppliers, which are ST, OSRAM, and Dispelix. Mega1 is glad to devote our strengths in compact design and precision manufacturing ability to our customers. Thanks to SPIE and the ST Micro event
for inviting Mega1 to join this session. We wish you a nice evening. Thanks. Thank you so much for that, I appreciate it. And last, but by no means least, let me introduce our final speaker. Yun Cheng Yu is senior director of the optical engineering division at Quanta Computer, and like some of the other panelists, he's also making a sacrifice: he's on vacation today, actually up in the hills, but he has kindly accepted to participate. So YC, I will leave it to you to do your presentation. Thank you. Thank you, Bharath. Thank you, Marco. And thank you to SPIE for holding this event; it is my honor and pleasure to join. So today I will talk about the near-eye display for head-mounted displays and smart glasses from a system point of view. On the first page, I would like to categorize near-eye displays into two groups. One is the head-mounted display, like the HoloLens and the Meta 2. The second is smart glasses, led usually by the Google Glass, and there are Epson, Vuzix, and North's glasses too, and we have Nreal coming. And this is my observation for head-mounted displays: there's an emerging trend of customization for professional occupations. It is a very good thing. All of these earlier products came from technology-driven design, from a company trend, but now we are getting new requirements, real demand, from the customer side. It's a good thing for the whole AR market. And the second trend is from the smart glasses: we have been chasing making smart glasses look like normal glasses, and now there are more and more material technologies. For example, we have polymer materials for manufacturing the waveguide, which is a good thing, and we have micro-OLED, which makes a product like Nreal's available. And the best thing is we have laser beam scanning; we believe it could make smart glasses real. And this is a new industry for professional purposes.
So we got the requirements, the real demand, from the doctor, from the warehouse worker, from the firefighter. They start their research, they start to explore their requirements for the hardware. And when they think about what ideal near-eye display they would like to have, they think it should match their occupational image. A doctor would like something like a surgical headlamp. A warehouse worker needs to wear a safety helmet. A firefighter needs a facial mask. For the field-of-view requirement, it's around 30 to 50 degrees. For the camera and sensor requirements, it is very interesting. Doctors don't need many sensors; they just want to connect to their endoscope, and they would even like to see if we could add an LED headlamp to the head-mounted display. A warehouse worker or a postman needs barcode scanning to scan their stock, and they even ask whether we could do a precision measurement of the box, the gift box. A firefighter needs a thermal camera. Police need a high-definition camera, and they may especially require facial detection. And smart glasses are the most wanted during this trial period; of course, we joined this SPIE forum especially for smart glasses. Beyond those factors Bharath mentioned, I want to elaborate more. For glasses, we are looking for a prescription lens, embedded or attached. For glasses, we hope there's a foldable hinge. And since glasses are part of one's personal image, a changeable stylish visor, or maybe a changeable frame. The near-eye display determines the whole form factor of the smart glasses, and we believe laser beam scanning can make the slimmest temple; you can see this part near the frame could be further reduced if we apply laser beam scanning. And here I would like to talk about designing smart glasses.
You can imagine: even if the optical engine is just one inch, it can interfere with the human head. Normally, we would like to have some wrap angle to give the glasses a nice look. But with that wrap angle and the optical engine, if the engine length goes to 34 millimeters, there will be interference. And even if we reduce the wrap angle to zero while keeping the optical engine at 34 millimeters, the gap between the engine and the head is still very small; it's nearly impossible to make a pair of glasses. So you can see how much the optical engine affects the form factor of the glasses. On this page, I would like to share the hinge considerations. You can see in picture A, the hinge is close to the frame, and with a shorter optical engine the opening angle of the temple is larger than in picture B. This is very important, since head width is a global fitting consideration; it's a technical term, not just a matter of looks, and we need to think about the range of head widths. In the middle column, you can see we change the location of the hinge. For example, C is the ideal case, the normal-glasses case; however, you need to break the connection between the optical engine and the waveguide, you need to break the optical path. It is very critical to align a very good incident angle into the waveguide, but this structure breaks the optical path, so it is very, very challenging for designers. Normally, we take scenario B and fix the optical engine to the waveguide; however, you then need to control the optical engine size. If we just use a single module along with the microprocessor board, the whole optical engine module will be very long, and you end up with scenario E, which is not good for the look of the glasses. So with its small form factor, laser beam scanning makes it very easy to place the hinge to accommodate most global head widths. So Quanta is committed to AR glasses; we have experience with near-eye display technology.
We have a proven record of optical engine production, and we are now trying to establish laser beam scanning production, working with ST. So we believe that with laser beam scanning technology, we can make smart glasses that look just like normal glasses. We also work on head-mounted display and smart glasses production; we have five years of experience working on this. So we can do a very good design review, with global fitting simulation, drop simulation, thermal simulation, manufacturing, and everything about AR. Of course, we can support prescription mass customization, because everyone's prescription is different. We will establish a system to handle these prescription lenses: once we get the prescription from you, we target sending you the whole pair of AR glasses within 10 days. It's a challenge for the operation and for the whole supply chain, but we are committed to doing that. Thank you for your time. Thank you very much, I really appreciate it, and again, thank you for being available on your holiday. So that concludes the main presentation portion of our session today. Let me turn my video on here so you can all see me. Once again, thank you to the panelists. This was a bit of a challenge: we tried to allocate everyone 10 minutes per session to leave time for Q&A, and it's quite challenging to compress all this information into such a small timeframe, but I think you did a pretty good job, so thank you for that. Clearly there's a lot more to discuss, and again, the idea of this session was to give you a teaser, a preamble or preview, for next year. So hopefully it will excite you enough not only to come next year, but also, in the interim, to ask and field questions and to join the Alliance. As we've been going along, I, as well as the other speakers here, have been trying to answer the questions, so we've answered as many as we could in real time.
That was a pretty interesting way of working, which I like, I must say. And I think there's time to answer some questions live as well. In particular, surprisingly, there are lots and lots of questions about MicroVision. I didn't quite expect that, which is a good thing, by the way; I mean that in a good way. I think you heard at the outset, Pamela mentioned that I was at MicroVision in the past. That's a great company, with great technology. I can't answer every single question; there are too many here. But the general theme is pretty clear, so the answer is the following: ST and MicroVision are very close partners. They're very important to us; they're a customer and they're a partner, and we really enjoy the collaboration and the partnership. Obviously, I can't disclose any more details about business arrangements or technologies or product roadmaps or investments and those kinds of things; as you can imagine, that's not something we can address. But suffice it to say, they continue to be a very strong and close partner of ours across the board. As for the question of joining the Alliance: as I mentioned earlier, it's something we created; it is not a closed alliance, it's a growing alliance. In fact, I look at the audience, virtually, to see if you're interested in joining or participating; you're all welcome as well. As more companies join and show interest, we will publicize that more and more. This is an open tent, not a closed tent. The goal is product and market development, to really drive the consumer market, so we can build the volumes we need and we can all benefit. If you have any further questions about MicroVision and how it fits in, without any of the detailed financial type of questions, please send me an email and we'll be happy to have a discussion with you offline. Pamela, are you still there?
Yep, I'm here; I'll just weigh in and jump in. Do you want to bring everybody on here? Absolutely, bring everybody on. We can discuss amongst ourselves; I think we get to have fun in the last 18 minutes here. We've got lots of questions. Is there anything that anybody would like to address? Thank you so much; those were fantastic talks, and very important for our industry. I can't tell you how much activity we've had, in the chat and even in the background here. Lots of activity, so thank you. Do you have any comments? I'll let you guide that, or ask if there's anything else. Sure. Let me go around the table, if I may, for comments first, in the order in which you presented. Marco, do you have any comments? You've seen a lot of the questions as well; feel free to answer any question here generally if you'd like. You can have fun. Not too much fun, but some fun. Thank you. I see lots of questions around augmented reality, about what's next. As discussed today, we put in place an alliance to enable the market first and to provide the right components in mass production, with the best players at the table to answer some of the questions. Prescription lenses can be embedded into waveguides. In terms of ST components, we are committed: we already have mass production, the MEMS mirror is in mass production, all the components, so we are ready to start at the beginning of next year, and we already have products in mass production this year. There are questions on where we are. There are also market reports, just out in the last two weeks I would say, on augmented reality and on LiDAR. I will not disclose details; you can buy those reports and find out who is there in terms of mirrors. That would be interesting. I have no other comments from my side. Thank you, Marco, appreciate it. Then I think after you came Stefan. Stefan, do you have any comments or anything to add?
No, I think if there are any specific questions regarding the waveguides, just feel free to send me an email if you have any questions. Thank you. Thank you. And for the questions regarding details of the equipment: what we have developed for this market is very, very unique, and we have high throughput in mind, right? Because we all want to keep cost low. So we've developed something especially for this market, and we're using it wherever we can. Terrific, thank you so much for that. Masuda-san, any comments from you, or anything you want to add? Thank you very much. I have no questions; I'm glad to meet you. Thank you very much. And YC, last but not least, any comments or questions from the list of questions you had? I see a little question about the prescription. Yes, that one I can answer easily. Right now we are thinking about something like a sandwich. One method is to laminate the prescription onto the waveguide, but it's risky if it interferes with the microstructures on the waveguide. The other method is to keep an air gap of about 0.3 millimeters in between, and put another prescription lens in the front; that's a normal structure. So this is ongoing now; I think every smart glasses maker is designing it this way. So it is on the way. Great, thank you so much. Now, I won't speak out of turn for Dispelix and Applied Materials, but I'm sure you guys are thinking about this as well, and it's a great question, an obvious question. There's some stuff going on that we probably can't talk about; maybe next year at the conference we'll have more to talk about. Clearly, if you can embed the prescription, why bother with anything else? Everyone knows that pretty well, and so we are looking at that.
And I'm sorry I can't give you more details than you might want to hear right now, but suffice it to say that there's quite a bit of technology development around that. Okay, a few more questions, and now we have a little bit more time, though I do want to be mindful of the clock. I've got somebody here who unfortunately typed their name as something like "ASSSD", so I'm not sure who they are, but anyway, it's fine; they had an interesting question: how does this group plan to navigate around the current pandemic? Is there a bigger sense of urgency concerning reaching a product ready for market? Great question. The interesting thing is, a lot of us have not met face-to-face in this alliance; in fact, none of us have. Naama, for example, and Josh, we haven't met face-to-face, and Marco hasn't met those guys either. But yet we were able to get an alliance together, able to collaborate, able to do things. As Pamela mentioned, the Dispelix development team is in Finland. The Applied team is, of course, in California. YC and the Quanta team are in Taiwan, Masuda-san is in Japan and Taiwan, Stefan is in Landsberg in Germany, and Marco's team is in Israel and Milan. No one can travel, but guess what? We're getting stuff done. So it's been interesting; we were able to navigate around it, and over calls alone. I was a little nervous about having a conference call with so many people, and it's been pretty smooth, by and large, I think anyway. So yes, we can navigate it. As for a sense of urgency: yes. What we're seeing is a lot more interest in remote work, obviously because of COVID, and as COVID extends further, these kinds of tools, having AR glasses and these kinds of capabilities, are gaining a lot more interest, both in the public press and in the technology press.
So I wouldn't necessarily say urgency, but I would say acceleration, for sure, in terms of getting these kinds of tools and assets available to the general marketplace at large. That's what I would say. Next, a question for AMAT, addressed to the "AMAT lady", okay? It says: are you using the high-index wafer for the waveguide and metalens products? This is an interesting question, because it's only partly a question for us; it's really a question for the Alliance, because there is a very tight feedback loop between design, manufacturing, and the system considerations and architecture. So the choice of which type of glass to use for each one of our products is actually a collaboration between all the members working on productizing this technology. We cannot do this alone; we cannot dictate "we're only going to use this and nothing else." It all comes from system considerations; it goes up and down the value chain through a lot of feedback and collaboration. And each type of waveguide and each type of metasurface that we make has its own substrate. That's a great answer, and it goes back to the same theme throughout: we have to design the system together, holistically. That's really what you're referring to as well. If we don't do that, we will sub-optimize everywhere. So it's really important to look at the requirements top-down, and then partition the sub-requirements into the necessary subsystems; sort of waterfalling them down is the way to really do things. Hopefully that answers your question. I'll give this next one to Marco; even though I could answer it, I want him to talk. Marco, does LBS have a resolution-loss problem? Yi Xing had that question. Marco, are you on mute? Yeah, I was muted, sorry. So basically, for laser beam scanning, we are able to target different resolutions, and I'll take the same line as the answer that was just given: resolution is part of the problem.
Or rather, it's part of the challenge, not the problem. The laser beam scanning itself provides a resolution given by the resonance frequency of the mirrors and the dimension of the mirror: a bigger mirror gives a smaller spot and higher resolution. In some custom-developed products for key customers, we are already providing a 100-degree diagonal field of view with 1080p resolution using laser beam scanning. Then you need to combine that with the whole optical path, up through the waveguides and to the eye. So there are certainly trade-offs in all of the system development, and I'll echo exactly what Naama was saying before: it's a matter of working at the system level. What I have loved about participating in the technical discussions is seeing the team at OSRAM working with the teams from ST, Dispelix, and Mega1 together to simulate the full system and validate, first, the specific performance and, second, the manufacturability of what we are doing. And having Applied Materials at the table means we are also talking about manufacturability, volumes, and quality. I hope that answers the question. Very good, very good. Again, the same theme, right, Marco and team? I go back to that spider-web diagram: you have to look at the total system optimization holistically, and that is really the key. Thank you, Marco; that is a very good answer.

Let's have some fun, maybe some fun questions. You told us to have fun, right, Pamela? OK, so let's have some fun. What are the particular requirements or criteria for joining the laser alliance? Well, we are very open. I will paraphrase Henry Ford: you can have any micro-display you want, as long as it's laser beam scanning. So the only requirement is, of course, a laser beam scanning architecture. But we certainly welcome all companies that are really engaged in that space.
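Going back to Yi Xing's resolution question for a moment: Marco's point that a bigger mirror gives a smaller spot and higher resolution can be sketched with the standard diffraction-limit relation for scanning mirrors, where the number of resolvable spots per axis is roughly N ≈ θ·D/(a·λ). This is a generic textbook estimate, not any member's design equation, and the numbers below are hypothetical rather than a product specification.

```python
import math

def resolvable_spots(mirror_diameter_m, optical_scan_angle_deg,
                     wavelength_m, beam_factor=1.0):
    """Estimate resolvable spots per scan axis for a scanning mirror.

    Classic diffraction-limit relation N ~ theta * D / (a * lambda):
    a bigger mirror (D) or a wider optical scan angle (theta) yields
    more resolvable spots, i.e. higher resolution, which is the
    trade-off Marco describes. beam_factor 'a' (~1-2) accounts for
    beam shape and aperture truncation.
    """
    theta = math.radians(optical_scan_angle_deg)
    return theta * mirror_diameter_m / (beam_factor * wavelength_m)

# Hypothetical example: 1.2 mm mirror, 40-degree optical scan,
# 532 nm green laser.
n = resolvable_spots(1.2e-3, 40.0, 532e-9)
print(round(n))  # 1575 resolvable spots per axis
```

Doubling the mirror diameter (or the scan angle) in this estimate doubles the spot count, which is why mirror size appears alongside resonance frequency as a key design lever.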
I think there's time for two more questions. Let's see, I was looking through the list over here. How does the group partner economically? Will you make strategic investments together? I will take that one quickly. This is a collection of like-minded, collaborative companies. There are no financial commitments; we're not asking people to buy and sell things. However, we all see a common interest, we all see common value, we all recognize each other's assets, and we all believe that we can collectively define, build, and grow the market. That's the approach we take here. So no, we don't partner economically in that sense. Certainly we care about costs, and we certainly talk about costs. We want to make sure the costs are consistent with what the marketplace will want, and we all do our part to manage and control them. And as Naama and Marco pointed out, we look at the system and the architecture and all the aspects to make sure the manufacturability, the costing, and the volumes are all there. But other than that, there is no direct economic incentive.

OK, a fun question for YC, Yung Ching Liu: where are you currently broadcasting from? Oh, I'm in a hotel in the southern part of Taiwan; it's a holiday village. OK, thank you for the fun question. I like that, very good. Let's see, who else is over here? Where is everybody presenting from? I think everybody should tell us where they're presenting from. Bharath, you're in California. Yep, right behind me is the Golden Gate Bridge. Did you see? I'm outside enjoying the bridge now; it's a backdrop of San Francisco. We're in Bellingham, Washington, between Vancouver, British Columbia and Seattle. And I think, Marco, you're just south of me, right? I'm in Redmond, Washington; I just moved here three years ago from Italy. That's very nice. Josh? Los Angeles. And Stefan?
Yeah, I'm in Switzerland. Oh, it doesn't look as good. Very nice. Naama? I'm in the South Bay, California. OK, I think Stefan wins on that; it's 1 o'clock in the morning there. Wait, no, it's more like 4 o'clock. It's 2:30. OK, good. So I hope everybody appreciates that all of these people came together today for this impressive session. Bharath, is there anything else, or does anybody else have any questions? I have a couple of things just to wrap up; I think we're within the time. Look, I appreciate all the questions. Apologies that we couldn't answer everything; we tried to go as fast as we could, and we don't want to take up too much of your attention. So thank you to all the panelists, and thank you for your questions. Hopefully you have all of our emails. If you don't, they're at the end of the presentation, which will be uploaded, so you can get them from there. Please feel free to reach out to any of us. We want to hear from you; we'd love to hear your ideas, and even the controversial ones are OK. We love to engage in conversations, so please feel free to reach out, and let's connect.

The last thing I'll leave you with is this. The market is still not there yet, whatever we all say; there is a lot of hype out there. But we all believe it's coming. We're all investing lots of money and lots of time, every one of us here and others. But we have to work together, collectively as an industry, to bring things together. There were a lot of questions about comparisons: what do you think of this versus that, the costs, and so on. I like everybody; I want everyone to succeed. We have to grow the market and build the market; that's really important. The view we take, as an alliance and as individuals, is that we want the market to grow and thrive, because that is really the most important thing. So thank you for your participation, everybody, and thank you for your attention and your questions. Please continue the conversation.
Please continue to follow all the SPIE events, great job, Pamela, and all the talks. You have the fireside chats; I mentioned the one coming up with Wayne McMillan in December, for example, and others. So stay tuned, participate in those, come to Photonics West next year, and above all, engage. Reach out. Don't be shy.

Thanks, Bharath. That's exactly it. I'm just so proud that Bernard and I were able to start this, what, in 2017, 2018, and really drive it forward. I mean, we're not there yet, but look at how we've brought everyone together: we're competitors, we're partners, but everybody's working on this together. I couldn't be more proud and honored to work with everybody that I work with, and I'm so thankful that we're all working together toward the XR hardware. And I'm just looking forward to the next event, so it doesn't stop here. I'll just share my screen; we've got a few more things coming up. There we go. In October, we're going to be interviewing Lucas with Bosch. We've got Wayne coming up, we've got Mike Brown, Mike Lee with Compound Photonics, and Bernard. If anybody hasn't joined any of the fireside chats, please do, because Bernard always gives a really great quick update of what's going on in the industry overall. And then we'll also have other interviews going on, and there are other workshops going on as well. Here are all the sessions coming up, so join any of these, and always check back on the website; there are more workshops. And Bharath, we're going to have a one-day workshop on this in March. Just so you know, there are two conferences: Photonics West, and then the AR VR MR conference, which is separate. So you'll be able to attend both, no problem. But join in; again, we're in this together.
And I really appreciate everybody joining tonight, and I look forward to working with you more in the future. Thank you, Pamela, and my fellow speakers; thank you so much. And particularly the folks overseas, and particularly Stefan, thank you so much; we really appreciate it. Thank you. Thank you so much, everyone, and we look forward to seeing you again very soon. Indeed. Thank you all so much. Thank you. Bye-bye now. Bye-bye. Bye-bye.