My name is Shawn Frayne. I'm one of the co-founders and the CEO of a company based in Brooklyn called Looking Glass Factory. We also operate an office out of Hong Kong, where I'm calling you all from now, and we make a new type of interface which we call the Looking Glass. It's the first holographic light field display that folks can get on their desks to pull any type of 3D creation into.

It looks so awesome. I mean, you've been working on light field holographic displays for a long time, but getting to something that could be mass-market and affordable, that people can actually have on their desk, I think that's a big deal.

Yeah, it's a big deal. We've been chasing a way for folks to get even a piece of the promise of the hologram onto their desk for over six years now at Looking Glass, and we think we've made an important step with the Looking Glass Portrait, which is the system that you're showing in the video here. It's a 7.9-inch holographic light field display. It pushes out dozens of different perspectives of a three-dimensional scene simultaneously for groups of people. One, two, three, four, however many folks can gather around the system, and they can see a view into a three-dimensional memory that you might have recorded, a video message that you might have received from somebody, or things like fully synthetic 3D digital creations that you might make in Unity or Unreal.

I've always been a fan of Nintendo, and at one point they had this portable parallax barrier, a two-eye 3D effect, right? But you had to sit in the middle and look at your 3DS. That was kind of awesome and cool, but you're doing the parallax barrier kind of thing everywhere. Is that true?
Yeah, it's not parallax-barrier-based, but it's the same idea you're putting out there. The Nintendo 3DS allowed one person, head-tracked in the newer 3DS models, to see a three-dimensional scene of the game they were playing, but if someone looked over your shoulder, they saw a sort of garbled version of what you were looking at. The Looking Glass Portrait, and the whole Looking Glass lineup that we've been putting out and that a lot of folks have been creating 3D work for over the last couple of years, is the first system that is super multi-view, meaning it doesn't have to be just one person. A number of folks can gather around this display, and they're awash in different three-dimensional, super-stereoscopic perspectives of whatever you've put into the system.

So at SID Display Week there were some prototypes, for example from JDI: I think a 14-inch 8K display, and it was splitting the image up, and I don't know how many views you need to split it into for it to be great, right? When I look at your videos, there are some on your Kickstarter page, it looks like you're talking about 72, or maybe 48, I'm not totally sure. How do you talk about the specifics, in terms of how many times you have to split up the image to make it work?

Yeah, it's between 45 and 100, depending on the content and a few other factors that go into that decision for a particular piece of content. The JDI system, I don't really recall how many views they're pumping out there, but there have certainly been other systems that pump out two perspectives.
In some cases they pump out four perspectives, but you've always had to be in a particular sweet spot to get the feeling of three-dimensionality from what the display was putting out. With the Looking Glass Portrait and this whole new lineup of Looking Glass holographic light field displays, this is very truly the first time there's been a group-viewable three-dimensional holographic experience that folks can buy. I mean, the Looking Glass Portrait is under $300 currently on our site, so it's really accessible. And you know, folks focus on the display, because that's the cool piece of technology sitting on your desk, but to make this whole thing work, to pump out dozens of different perspectives of a three-dimensional scene, of all types of content, at 60 frames per second, there are a lot of advances in the software stack that our team has had to make as well, and folks get access to that when they start to play around with their Looking Glass.

You ship it with a built-in Raspberry Pi, right?

You can believe it, yeah.

So, these have been progressing over the years, and the newest one has, I guess, a quad-core ARM Cortex-A53-class CPU and some GPU on there. Are you able to run the full thing on it? Because you said between 45 and 100 views at the same time, so is that like playing a video 45 times in real time, or what does the processing there mean?

So this system that you're showing, the Looking Glass Portrait, which is our newest system, has two modes: standalone mode and desktop mode. Standalone mode runs fully on the Pi 4 that's built into the system, and we had to do a lot of tricks. You can almost think of it as the tricks that early desktop PCs in the late 70s and early 80s had to do to squeeze as much potential as possible out of what was available at the time.
And so you can play back static depth photos, RGBD photos, a whole lot of different types of three-dimensional media, and moving media too. So a video that you might have recorded with one of the newer iPhones that has a depth camera on it, as well as recorded clips from a Unity or Unreal application: all of those play back in standalone mode at 60 frames per second. No interactivity, though, in standalone mode. So if folks want to add a peripheral to interact live with the holographic content, that's possible in the system as well. You just have to plug the system into a PC or Mac, which gives it the additional computational horsepower that's necessary, over the included HDMI and USB-C cables, and then you're off to the races with fully interactive holographic content as well.

So here's another video; you posted this on your Twitter. I can imagine that the job you have right now, getting this project to reality, could affect, I mean, everybody's watched the Star Wars movie where you see Princess Leia on the table. You want your kids to have a future where they can actually do holographic Zoom calls with yourself, with your friends. Is this going to happen? Can the Portrait potentially, I see you have a kind of holographic camera going on there, right? So you could potentially make it work as a holographic Zoom call?

You could. Some folks in our community, which is growing by the day, are creating demos of holographic communication. It's long been our dream at the company, and is actually its foundational objective, that one day we'll be able to have a conversation with one another across the world, just as we are right now, not represented as two-dimensional rectangular videos, but feeling like we're sitting across the table from one another as we look at each other through a looking glass.
For a while we had thought that future was 10, 20, 30 years away. But what's been happening over the last couple of years is that folks have phones with depth cameras in them, computers are getting faster at a phenomenal rate, and display technology like the Looking Glass Portrait is getting to the level where all of these things can be combined into a system that does allow for holographic Zoom. The Looking Glass Portrait can already do holographic video messages out of the box. So I can record this clip, for instance, and then send it to my mom, who bought a Looking Glass Portrait, and she can have me and her two grandkids running around as holograms on her desk almost instantaneously. It's a very small step to go from that video messaging capability, which exists today and which folks are already starting to use, to a live version of it, which is a dream that is not as far away as I think a lot of folks believe it might be.

The videos you've posted on your Kickstarter look really awesome. I guess you worked with some talented creatives to get this done?

We did. Shout-out to Postcard, the video crew in Hong Kong that pulled together this clip during a challenging time in the pandemic. Shooting this was obviously tricky to do in a safe way, and they pulled it off and were able to really represent this new step for the technology, which is very challenging to do over 2D media. So they pulled it off well, I think.

One thing that's totally awesome, and that I understand from watching your videos, is that everybody who has a smartphone with a portrait mode that blurs the background is already capturing 3D information, enough for you to take all those portrait mode photos and convert them into something that looks totally awesome on your Portrait.

Yeah, absolutely.
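(A quick aside for the technically curious: a portrait-mode JPEG usually carries extra images, such as the depth map, embedded after the primary picture. The sketch below is a crude, stdlib-only heuristic for spotting them by counting JPEG start-of-image markers. Real tools parse the actual container metadata instead, and the byte blobs here are made-up stand-ins for real files, not actual camera output.)

```python
def count_embedded_jpegs(data: bytes) -> int:
    """Heuristic: count JPEG start-of-image markers (FF D8 FF) in a file.
    Portrait-mode JPEGs often concatenate extra images (a depth map, a
    matte, etc.) after the primary image, so a count > 1 hints that
    hidden payloads are present."""
    count, pos = 0, 0
    while True:
        pos = data.find(b"\xff\xd8\xff", pos)
        if pos == -1:
            return count
        count += 1
        pos += 3

# toy demo: two minimal JPEG-like blobs glued together, mimicking a
# color image followed by an embedded depth image
blob = b"\xff\xd8\xff\xe0JFIF...\xff\xd9" + b"\xff\xd8\xff\xdbdepth...\xff\xd9"
print(count_embedded_jpegs(blob))  # → 2
```

Running something like this on a genuine portrait-mode photo will often report more than one embedded image; dedicated tools such as ExifTool expose the same payloads properly by parsing the MPF/XMP metadata rather than scanning bytes.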
So the clip that was showing just then, with the Time magazine shot of Fauci, that's a service we offer: converting two-dimensional archival photos that you might have sitting in your drawer, that you might scan in, of you or your parents or kids and so on. We create a depth map from those photographs, and that is then transformed through our software instantly into a holographic version of that photo. But what you're mentioning has even wider implications. Every phone capable of taking portrait mode photos, and that's most of the newer iPhones and quite a few Android phones as well, records a depth map behind the color image that everyone sees and shares. Right now that depth map is just being used to blur out the background, as you're pointing out. But we can also use that depth information, which is basically a grayscale version of the image where the lighter color is closer to you and the darker color is further away, to generate a high-quality holographic still of those photographic memories. And we have a software package, free for folks who get a Looking Glass, that lets them do that without any programming, which is a big change. We've had Looking Glasses out there over the last couple of years, but they've really been systems that required folks to be extremely technical to use. And while this new Looking Glass isn't for everybody, we're widening the aperture to maybe a hundredfold or five-hundredfold more folks than it's ever been available to before.

So I'm like a YouTuber, a video blogger, going around. I like to get a slightly better camera because I want this depth-of-field thing, the background blurring and all that. But I'm always very curious about when the smartphone vendors are going to be able to support blurry-background video.
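(The grayscale depth map just described can drive a toy view synthesizer: treat depth as per-pixel parallax and shift pixels horizontally to fake neighboring viewpoints. This is a minimal sketch of depth-image-based rendering, not Looking Glass's actual pipeline; it does no occlusion infill, and `n_views` and `max_disparity` are arbitrary illustrative values.)

```python
import numpy as np

def render_views(rgb, depth, n_views=48, max_disparity=8):
    """Naive depth-image-based rendering: shift each pixel horizontally
    in proportion to its depth to synthesize novel viewpoints.
    rgb: (H, W, 3) uint8; depth: (H, W) float in [0, 1], 1 = near."""
    h, w, _ = rgb.shape
    views = []
    for i in range(n_views):
        # camera offset sweeps from -1 (leftmost view) to +1 (rightmost)
        offset = 2.0 * i / (n_views - 1) - 1.0
        # nearer pixels (depth ~1) shift more than far ones (depth ~0)
        shift = np.round(offset * max_disparity * (depth - 0.5)).astype(int)
        cols = np.clip(np.arange(w)[None, :] + shift, 0, w - 1)
        views.append(rgb[np.arange(h)[:, None], cols])
    return views

# toy scene: a bright "near" square over a dark "far" background
rgb = np.zeros((64, 64, 3), dtype=np.uint8)
rgb[24:40, 24:40] = 255
depth = np.zeros((64, 64)); depth[24:40, 24:40] = 1.0
views = render_views(rgb, depth)
print(len(views), views[0].shape)  # → 48 (64, 64, 3)
```

Gaps that open up behind shifted foreground pixels are where the infill and hallucination tricks mentioned later in the conversation come in.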
So for example, this content right here: is that possible to do with a smartphone, a video that has this depth information?

It is. This one was actually shot with a separate depth camera called the Azure Kinect, which is a dedicated depth camera that folks can buy on their own. It's 400 bucks or so, although very limited supply right now is the word on the street.

That's right, it's that one sitting at the top there.

So those are our friends from Holoxica. They built a communication demo on top of the Looking Glass 8K, which is the bigger 32-inch system you see there; that's what was showing Missy in that clip in the Portrait before. The quality of depth video through the iPhone X, 11, and 12, including the Pro and Pro Max versions of those phones, is almost as good as the Azure Kinect dedicated depth camera. In some cases it's even better, and of course there's an advantage to having this thing in your pocket instead of having to connect it to a PC or Mac to do the recording. And there are third-party applications; a great one is called Record3D on the App Store. We're friends with the developer, and through an integration they made with the Looking Glass Portrait, folks can now record something with Record3D on one of those newer iPhones, hit an export-to-Looking-Glass button, export into the format the Looking Glass needs, and then run it on their Looking Glass Portrait. And I've been doing this. This just came out; I don't even know if it's fully public yet, but it's fine to chat about here, I think. I've been recording like crazy over this weekend: my kids talking about dreams, skateboarding outside, going around town. So now I have a lot more holographic memories that I can put into my Looking Glass than I had even a week ago, through these third-party apps that are integrating with the Looking Glass.

Sorry, I can't hear you.
There's one thing that's kind of fun: with the iPhone X, I think, they took away the fingerprint scanner, and now everybody's scanning their face with Face ID. So that means they have these sensors that do, I don't know if it's some kind of basic LiDAR or something like that, right? And do phones do that on the back also, this kind of LiDAR stuff, or do they just use two lenses combined to figure out distance?

Yeah, a bunch of phones do portrait mode photos differently and capture depth differently for a variety of applications, including some AR applications and some Looking Glass things. The iPhone X has the TrueDepth camera on the front: it projects points of light, and a camera then looks at the separation of those points to build a pretty good depth rendering of your face when you're unlocking your phone. But you can also use it for things like animating a character, recording a depth video, and so on, and we've got a bunch of videos about this that we're sharing on Twitter, in our Discord, and elsewhere. The back-facing cameras of the newest phones, like the iPhone 12 Pro and Pro Max, add a LiDAR camera, and that, in some cases combined with machine learning algorithms, gives superior depth maps of captured content. It's sort of the combination of all of these things: LiDAR gets a basic version of the depth of a scene, and then machine learning algorithms that recognize, oh, this is a person, this is a pet, this is a scene, can clean up the depth map in particular ways. This is an example from a great creator in Japan in our community, who has one of the early experimental versions of the Looking Glass Portrait, and I think that's a real owl that he captured with his iPhone.

But how did he get around the owl? It looks like he's going around it.
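(The LiDAR-plus-ML cleanup described above can be caricatured with a toy filter: median-smooth the raw depth, then pull each segmented region toward its own typical depth so a silhouette stays crisp. The segmentation mask and the fixed blending weight here are hypothetical stand-ins for what production pipelines learn with machine-learning models.)

```python
import numpy as np

def refine_depth(depth, mask, win=3):
    """Toy depth-map cleanup: median-filter the raw sensor depth, then
    blend each segmented region toward its own median depth so the
    subject's silhouette doesn't bleed into the background.
    depth: (H, W) float; mask: (H, W) bool, True = subject."""
    pad = win // 2
    padded = np.pad(depth, pad, mode="edge")
    # median filter: stack shifted copies and take the per-pixel median
    stack = [padded[dy:dy + depth.shape[0], dx:dx + depth.shape[1]]
             for dy in range(win) for dx in range(win)]
    smoothed = np.median(np.stack(stack), axis=0)
    # pull each segment halfway toward its own typical depth
    out = smoothed.copy()
    out[mask] = 0.5 * smoothed[mask] + 0.5 * np.median(smoothed[mask])
    out[~mask] = 0.5 * smoothed[~mask] + 0.5 * np.median(smoothed[~mask])
    return out

# noisy synthetic scene: subject (near, depth ~0.9) over background (~0.1)
rng = np.random.default_rng(0)
depth = np.full((32, 32), 0.1) + rng.normal(0, 0.05, (32, 32))
mask = np.zeros((32, 32), dtype=bool); mask[8:24, 8:24] = True
depth[mask] += 0.8
clean = refine_depth(depth, mask)
print(clean.shape)  # → (32, 32)
```

Real pipelines infer the mask itself ("this is a person, this is a pet") with learned models rather than receiving it as an input.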
So from a single perspective, there's just enough quality in the 58-degree view cone of the Looking Glass that you don't really see it start to break apart. And there are some tricks that we end up applying; I'm not sure what this particular creator applied for his import here, but if he was using our software, we apply some infill techniques, things to disguise or hallucinate the missing parts of the scene.

So you're kind of inventing what you think is there?

There are a lot of techniques that end up doing that. We end up doing something that's more real-time, rather than requiring a back-end machine learning process, but for the 2D-to-3D conversion, for instance, we work with a partner that's developed a great process that does use AI and machine learning to discern a depth map from a two-dimensional photo. So all of this stuff, more capable cameras that capture true three-dimensionality from the scene, which millions of folks have in their pockets, combined with these new algorithms that clean up those depth maps, is leading to what I think is an important threshold moment, where the quality of holographic displays and the accessibility of these systems, as you can see there, for under a few hundred dollars, we're hoping make it a big moment for this field that a lot of folks have dreamed of for many, many decades.

I think it's just huge. And you're in Hong Kong right now. Are you right there in the area where things might be getting mass-produced?
Yeah, so we operate out of a couple of floors in our facility here in Hong Kong, and then of course we work with contract manufacturers as well, and individual suppliers for optics and driver boards and whatnot. We make the full stack, or as much of the full widget as a team of a few dozen folks can possibly do: we design the optics, the driver boards, the whole structure, and the full software stack, hoping to leverage the elements that other folks are developing in the ecosystem, like the game engines Unity and Unreal, the cameras built into people's phones, and the 5G transmission networks that will carry increasingly high-fidelity light field information to these displays of the future. So we do as much as we can on our end, and then hope that enables us to work with more and more partners.

So the "Hey Google", sorry, I just triggered a bunch of Hey Googles right now, the Hey Google and Alexa kinds of speakers are a good start, let's say. But the next level, which maybe should be totally based around the Portrait, would be that I want my friend or a family member to be my smart assistant: with their voice, with their face, the whole AI, and the way they speak too. So there's a lot that Google and the others would need to add, I mean, they already have all this AI stuff, but wouldn't it be awesome if you could have your friend talk to you in the glass, with the full AI and their way of speaking? Is it going to happen?

Well, I think there's a very good chance. You never know what's going to happen, for sure, but I think there's a good chance, and there are certainly no technological barriers to that happening. In the background, we've built a lot of infrastructure that will open up that next step, to these widespread devices where it's not only tens of
thousands of systems but many millions of systems that folks are using every day. That is our goal, and integration with some of the applications you're describing is an important element of that. I don't know if that's going to be happening at that scale next year or a few years down the road, but I would be surprised if it doesn't happen sooner than folks realize. Folks having holographic displays integrated with other platforms, like the voice AI systems that Google or Amazon make, would allow a virtual avatar to speak to you, then transform into a memory you've recorded with your phone of your kids or family running around, and then you snap your fingers and it's photosynthesis happening in three dimensions, and then you snap your fingers again and you're back to, you know, Woody from Toy Story talking to you through the looking glass. I think all of these things are going to happen, and what we're really excited about is this new level of accessibility for a wider audience. Obviously not everybody; even at under a few hundred dollars, it's still an expensive piece of technology. But for folks who are really excited about dipping their toes into this piece of the future, it's certainly a lot more accessible than it's ever been in the past, and we hope it's an Apple II-style moment where a lot more folks are able to create.

Here, I see this image, and I'm trying to understand: where is the metadata in all these millions, actually billions, of portrait mode photos out there? Is it actually right there? People didn't lose that metadata, right? Is it still there, and in what format, is it in the JPEG?

It's still there, hidden in the JPEG, or in some of the new high-efficiency formats it's hidden in there as well. But it's there. That's actually a photo of my daughter, Jane, and
was that taken outside? It's just portrait mode?

It's just a portrait mode photo. And the craziest thing in the world is that this sits behind all of the photographs you already have. So this is not only for new photographs that you're taking: for photos sitting in the photo album on your phone right now, if you took them in portrait mode, they have that depth information behind them, and our software extracts it and then uses it to calculate dozens of different perspectives of the scene, and bada bing bada boom, you have a hologram.

And saved on all these platforms, like Google Photos and such, in the cloud, it doesn't disappear? All this information is still there, hopefully?

Yeah, it's preserved in most online platforms. So Google Photos, Google Drive, Apple's cloud, iCloud it's called, those appear to save it. I don't know if you go back three years whether they had different policies, but uploads to those systems in recent times still preserve that metadata. The only time it gets stripped is in a few cross-platform cases: when you're sharing something with a colleague on Slack, for instance, there's an optimization that ends up happening, so Slack is one of the few platforms that actually strips the metadata out. But otherwise, stuff people email around, stuff sitting on your hard drive, stuff in your online repositories, and certainly stuff in the album on your phone still has that metadata.

Because Google Photos has the high-quality mode to save bandwidth, right, where they compress everything? It's actually free and unlimited until June 1st, 2021. But I would guess that Google Photos, for example, with all these effects they offer, would want to improve how photos look, so they wouldn't want to delete that information. It would just stay there, even in the compressed versions of their photos, hopefully, because
there are billions and billions of photos out there, and people could display them in a totally new way. It's really mind-blowing how attractive your product is going to be on people's desks. It's going to be amazing.

Thanks. Yeah, I'd better double-check on Google Photos specifically. Google Drive preserves it, iCloud preserves it, and a number of the online platforms do. I'll double-check, for anyone who's curious, whether Google Photos itself preserves that metadata. I would be surprised if it didn't, but you never know; these folks change their policies as the years go on. But again, the photos that are on your phone right now, or in most of the places you'd be storing them, do have that metadata.

For the super nerds out there, there's a tool called ExifTool that extracts metadata out of image files, and you can actually extract and see the number of images hidden behind the JPEG. Sometimes it's more than one; sometimes it's not only a depth image, sometimes they're storing, say, the outline of your eyes to do better overlays of different effects and whatnot. Sometimes there are six or seven images hidden behind the one image you see as the color photo. And they have all kinds of tricks, like doing HDR with multiple images, and I don't know if some of that is in there so you could improve the effects even more. Because when I was looking at the image where you were editing the photo with your daughter up there, it looks like you have a tool where you can edit the color, the angle of the lighting, or something like that. Is that part of your tool?

Yeah, so there's one key hub for converting a variety of different types of three-dimensional content, including portrait mode photos, into holograms for the whole Looking Glass lineup, so this includes older Looking Glasses and newer ones, and that's called
HoloPlay Studio. This is sort of our lightweight editing, framing, and conversion tool, so there's some basic refocusing and framing you can do there, and then we have a couple of extensions, and all of this is free for folks who get a system. The one you were referencing, with my daughter Jane, showing the sort of disco lights on her, that's through a HoloPlay Studio extension called Diorama. In it you can add animated GIFs, change the background, change the lighting, and add effects, something we wanted folks to be able to experiment with out of the box.

Is there any chance, you know how when people sign up for Face ID on their phones they have to move the phone around their face, could you in theory record something like that? You call this a 10-second holographic birthday message, holographic videos, and it also says a good idea is to have the Azure Kinect or Intel RealSense cameras, something like that. But you also mentioned the iPhone, so people can just record a bunch of videos with their front-facing iPhone cameras?

Front-facing or back-facing, for the newer phones. And again, the phone video recording was done in collaboration with the folks who made this amazing app called Record3D. We're actually going to be putting out, I think in the next few weeks, four or five key tutorials that really walk through and demystify this process, because I know for a lot of folks it's almost unbelievable that you could record a real three-dimensional holographic recording with the phone in your pocket and then see it on a holographic display. We want everyone who wants to do this to be able to, and so we're putting out some tutorials to show how easy it is. It's really just a few clicks on your phone and then a
few clicks in our software on PC or Mac, and then you've got it in your system.

The idea you're bringing up, of waving a phone around in front of you to capture a higher-fidelity scene: there's actually some great work being done there by a number of research groups. The shorthand for it is "nerfies," and nerfies capture a full light field, all of the directionality of light around a scene, getting to really high-fidelity static captures of people's faces and scenes by moving a phone around like that. So those types of advanced techniques are things we're keeping a close eye on, not yet productized on our side, but if anyone out there is in one of those research communities, we would love to connect with you all and see when it makes sense to actually bring that to Looking Glass users. Of course, folks can do it experimentally right now, but that element of it is at the leading, bleeding edge, and only for the folks who are really unafraid of getting their hands dirty in code and whatnot. Almost everything else I'm talking about doesn't require any programming whatsoever.

And there's of course the whole world of 3D, because you're talking about Unity and Unreal Engine characters or imports or something like that. I guess there's a format for those that could potentially be imported. Could you even use it to play video games in holographic mode?

Oh, absolutely. So when you're in desktop mode, connected to a PC or Mac, you have that additional horsepower, which is necessary to calculate the full holographic light field 60 times a second, which in turn is necessary for interactive games. In that mode, there's a whole lot of Unity and Unreal applications that folks have built over the last couple of years on the previous generation of Looking Glass that folks
can start to experiment with on this newer version of the systems. The aspect ratio might be a little bit off, but for folks who want to make their own holographic apps, Unity and Unreal have really well-supported plugins that we provide, again for free, for folks using them for their own personal use. What it ends up doing, basically, for any Unity or Unreal developers out there: you drop the plugin into Unity when you're making an app, and you see a little box appear in your scene. Anything in that box in your Unity scene appears live in your Looking Glass, and then you can either record a 10-second clip of what you're seeing in your Looking Glass, in case you want to run it in standalone mode, or build that application into a full-blown app that folks can play as a game or interactive application.

In terms of interacting, is there a touchscreen?

There's no touchscreen built in, and that decision was mostly made because we wanted to get the core functionality as refined as possible, at a price level that got it into more folks' hands. But any peripheral that folks use in 3D land or in 2D land can be used with the Looking Glass. So in that clip you're showing, of those sort of blobs bouncing around, that person's hand is being tracked by a peripheral called the Leap Motion Controller. This is an infrared camera pointing upwards that automatically estimates where the joints of your hand are, and a synthetic version of your hand is then created inside the holographic scene in your Looking Glass in real time. This is a popular way for folks to interact with three-dimensional content.

Can anyone buy this controller?

Oh yeah, it's available as an add-on on our website, and it's still available in a few places online, so if you search for "Leap Motion Controller" you'll find it. It's available on our site, which is lookingglassfactory.com, as
well. But potentially there could be something very mass-produced, a little USB add-on. Are there USB ports?

So, for interactive applications, this system needs the additional horsepower of a PC or Mac; in fully standalone mode it's only designed to run media, like videos or photos. And that's just the speed of computers at these price points at the current moment; this will be different three or four years from now. But because most PCs and Macs have extra USB ports, we didn't build additional peripheral ports into the Looking Glass Portrait itself, so in that case the Leap Motion Controller is plugged into the same PC that the Looking Glass Portrait is plugged into.

So there's HDMI, and then, somehow, I don't understand how you do it, but you can send 45 to 100 perspectives over the HDMI cable? At the same time, in real time?

Yeah, that's right. They're interlaced together into a particular video signal that's unique to each device. There's a little bit of information that travels over the USB-C cable as well in that desktop mode, and that's how that works for desktop mode.

And it's so awesome that you can bring it to something that's affordable. Your company has been doing professional holographic displays for how many years now, about six?

Over six years now, yeah.

And so it's a big deal. I think once the iPad got mass-market, it kind of changed how all the pros were using tablets too, right? So if everybody starts using this, then maybe that will also change the ideas of the pro world, and they'll upgrade everything too.

Exactly. I mean, that's part of why we did this. The distinction between enterprise and consumer faded over the last year, even more so than it had before, as we're working from home, home studios, and whatnot. And so having a holographic display on your desk at
Those lines have certainly blurred a lot, and we take the point of view that, while there are different types of applications that professional users and individual consumers will prefer, having a system that can easily move between those two worlds, just as a lot of Apple products have in the past, is our strategy moving forward.

So can you explain a little, without giving away any secrets, how the optics work in a system like this? This is a product that's just going to be, what do you call it, the star of the show; when people have this at home at a party, everybody is just going to be looking at it for hours, no? Is that possible? I think that, at least, is what we've observed with the first-generation dev kits and the limited number of Looking Glass Portraits that are out there; again, we're just starting to get some beta units out, and the major shipments start at the end of next month. So we'll see if it's the star of the show; we certainly hope it is, and we hope folks can easily get content that's important to them into the systems.

In terms of how the optics work: we have a few optical overlays that chop up the RGB sub-pixels that make up a high-density LCD screen into a number of different perspectives, and that results in this three-dimensional holographic view that a group of people can see. A lot of folks with pen and paper right now might say, okay, if you take two million pixels and divide that by 45, is the output resolution really that low? And the answer is not really, because we're pumping
out one view about every degree or so, so you're getting a huge amount of stereo information that changes as you gently move and look around the scene. The perceived resolution ends up being much higher than the source display resolution divided by the number of views would lead folks to believe. That's a question we get a lot, and unfortunately, as the first holographic light field display out there, there aren't really great ways to communicate what the perceived resolution is other than seeing it for yourself. Surely that will change as more and more systems come onto the market.

It looks like a nice size. It's not an A4 page, but what size is it? It's about iPad mini size. I was imagining that the only way to make this work would be an 8K display that fits in that size, but you don't quite need that? No, though we do have an 8K system: the 32-inch Looking Glass 8K is our super-premium offering, mostly for folks in enterprises, just because of the costs associated with that system. There are very few computers that can actually run at 8K and calculate all the different perspectives at 60 frames per second. It would be great to have a small 7.9-inch 8K standalone system; it'll be a few years yet before that's possible. But this is the beginning of a whole future of improvements, from resolution, to integration with things like capturing the ambient lighting in a room to make stuff in the Looking Glass look better, and all the other wonderful things bubbling up in the community. This is the starting point. I would think that maybe all these display suppliers have some 8K stuff, getting smaller and smaller, in the lab at least; they're showing it at SID Display Week once in a while.
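The back-of-the-envelope numbers in that exchange, using only the figures quoted in the conversation (two million pixels, 45 views, roughly one view per degree over the 58-degree cone mentioned later):

```python
# Naive reasoning: split the panel's pixels evenly among the views.
panel_pixels = 2_000_000                              # "two million pixels"
num_views = 45
naive_pixels_per_view = panel_pixels // num_views     # roughly 44,000 per view

# What that misses: with about one view per degree across a 58-degree cone,
# each eye sees a different view and small head motions sweep through many
# neighboring views, so the brain integrates far more detail than any
# single view contains.
cone_degrees = 58
degrees_per_view = 1
views_across_cone = cone_degrees // degrees_per_view  # ~58 distinct perspectives
```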
But if one of these amazing companies has a display they'd like to supply to you, what more do you need to do to that little 8K display, adding all this optical stuff, to make it work? So some modules work and some don't, just based on how the sub-pixels are arranged in them. Then, without going into the nitty-gritty details, we design custom optics based on the particular module, and we have a proprietary calibration process that we push every unit through, done here in our facility, which then integrates with all of the software. So if there are any folks out there with experimental smaller-format 8K displays, they should send me a note, because it would be great to collaborate. Of course, there's a lot we do in the background that we don't talk about publicly; we try to get as much out into folks' hands as possible, much more aggressively on timescale than any other company, big or small, that I'm aware of in this space. If something's purely experimental and it's going to be ten years until it sees the light of day, that might be something we experiment with quietly with a group, but it won't be something we actively talk about.

All right. How about the way it sits on the table in portrait? Does it only work that way, or can you flip it to landscape too? Is there a specific angle that's better? So this system only works in portrait orientation. You can tilt it, and the illusion doesn't break; you get different three-dimensional information as you look around the scene within a 58-degree viewing cone, which is basically how you'd normally use a computer, with one, two, three, four
people, something like that. So you can't really tell that there isn't any vertical parallax, but it is a horizontal-parallax-only system. That means if you turned it on its side, into landscape orientation, the Portrait wouldn't produce a three-dimensional effect. That said, our bigger systems, the 4K and 8K at 16 inches and 32 inches, are landscape-oriented, because they're used by folks doing work that generally calls for landscape orientation.

Nice. So is your hope that this is going to be available everywhere, in all the retail stores, on Amazon, worldwide? Is that going to happen? Because your crowdfunding looks fantastic; you've already got a lot of backers, right? Yeah. We launched the Looking Glass Portrait on Kickstarter not to raise money but as a way to sell to a community of folks we knew really well, who were engaged with 3D scanners and 3D printers; we had done a few projects on Kickstarter before. It actually outpaced the original Oculus Rift launch on Kickstarter, which was sort of our high-water mark in the VR/AR/hologram space, and that blew all of our minds, especially happening when it did: we launched in December, with all the stuff that was happening in December. It really speaks to the excitement the community had around getting their first holographic display on their desk and creating for it.

To your question of whether this is going to be everywhere, on Amazon, Best Buy, and whatnot: no. The Looking Glass Portrait is designed for the audience of folks who touch 3D in some way. Somebody who bought an iPhone 12 Pro because it's got a lidar camera, someone who might have 3D printed something at some point, somebody who
works in Blender, Unity, or Unreal, folks who are curious about 3D depth photography or light field photography. That's tens of millions of people. It's not yet billions of people, but this is an important step toward that point.

Okay, I think it's billions of people. Everybody wants the second-best thing to being there, right? Especially in this crazy time, where family is far away sometimes, which is horrible and sad, and people are Zooming all day like crazy because they feel disconnected. But this is potentially, what's the word, psychovisually making you think you're actually in the room with a person. But then, you've been in this world for many, many years. Well, I think it's tens of millions of people in this step, and then the next step we take, fueled by the work the community is putting out there and all the excitement starting to stir up right now, is going to reach hundreds of millions and eventually billions of people. I think it's going to be right there alongside the two-dimensional screens we use to read the newspaper and send emails; this is going to be how folks communicate and create for the next generation or two.

And on your Kickstarter it says April. Is that really the start? Are you on track? It's often a question with these crowdfunding projects; sometimes it's delay after delay. Do you have the whole thing lined up, all the challenges in terms of quality control, yields, making sure it works, and that when you ship it, it doesn't shake around in the box and is still okay on the other side? Yeah. We've done this before, several times, and a lot of our team has deep experience in
hardware products at the level of complexity of the Looking Glass Portrait, which is simple when you use it but actually involves a few complicated elements to make it great. We've shipped all of the advanced beta units, so folks have some of the test units already, and we're getting great feedback on them; there were some third-party reviews done a couple of months ago as well, so there's great feedback from the community in that respect.

We are shipping some number of units in April. There is a delay; I posted about this for the folks who backed the Kickstarter project, so it isn't news to them, but there's a four-to-six-week delay on what we were initially targeting, which was to start major shipments in March. That's pushed to the end of April and into May, and it sort of cascades from there, because the Raspberry Pi 4s that run the systems in standalone mode are delayed coming from the UK because of shutdowns. I'm not trying to make an excuse; we're ultimately responsible for the end product getting into folks' hands, but that's one element we haven't been able to control as well as we'd hoped. We have some units here already, but the Raspberry Pi 4s are trickling in slower than we would have liked. If folks are curious, they can check out the update I posted on the Kickstarter page, or just email me or anyone on the team for more information about where we are on shipments. That's a long way of saying: basically on track, except for the Raspberry Pi 4 delay, which folks in the community have been really understanding about, and the quality is going to be exceptional when folks receive their systems.

What are we looking at here, with the basketball? Is that a game, or real people? Oh, those are real people. That was recorded; it's from a great
volumetric video studio called 4DViews. They set up a number of cameras that capture actors or other folks three-dimensionally, and they agreed to send us some of their volumetric video footage, which we pulled into the Looking Glass Portrait. Another great capture studio is Microsoft's Mixed Reality Capture Studios.

It would be awesome, and I kind of tried to ask this already, but many people have several smartphones, or can ask friends to place them around at different angles to capture scenes, with some kind of awesome app that links them up wirelessly and generates the 3D thing at the end. That's a dream people have had for decades, right? Maybe your Portrait can help accelerate the development of all that. And since you're using a Raspberry Pi, there's a huge community of open-source stuff happening around that. Are you running Android or Linux, or what? Okay, that's too many questions. Yeah, I encourage folks, if they have specific questions or want to bounce around ideas about the potential of this system, to come find us: there's a lot of activity on our Discord, and folks can go to look.glass/discord, which I'm pretty sure redirects to the sign-up for our Discord community. Folks are doing a lot of amazing experiments. By and large, probably 95 percent of folks will do what's out of the box and easy to do; this is a new format in a new medium, so that's what we're trying to enable, encouraging folks who are maybe not yet confident in making holograms of their own (we're hoping they will be). But for the one to five percent of folks who want to get to the very leading edge, we provide a lot of software tools to let them do what you're talking about, string together a couple of cameras if they're interested, and all sorts of experiments like that, which
we hope bubble up in the community on top of the baseline functionality.

How about the Animoji? How do you get them in? Is that just some format that goes out there and it just works? No, that doesn't just work, although someone might build an application on top of the Looking Glass Portrait that allows for it. That's an application that someone on our team, Nolan, made. That's a sloth, which is being controlled by Nikki on the team. Basically, the TrueDepth front-facing camera you were discussing before, in the iPhone X and some of the other newer iPhones, is capturing the blend shape values of Nikki's face and applying those to the rigging of the three-dimensional model of that sloth character, like how you would rig a puppet, for real-time control. I'm sorry, Charbax, I have to hop off here in a minute. All right, thanks a lot for this awesome interview. There were a couple more questions, but maybe you can answer them later; hopefully there will be a lot of YouTube comments. Thanks a lot for your time, and thanks so much for bringing such awesome technology to the world. I hope all the last pieces work out in terms of getting this out to millions of people, ten million people, why not? That'd be awesome. All right, thanks so much for having me. We'll be checking the comments too, so if folks have questions, post them there; we're looking forward to engaging with everybody. All right, thanks a lot, thanks for your time, see you. Thanks, everybody, for watching.
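The blend-value-to-rig mapping described there is standard morph-target (blend shape) animation: each captured blend value weights a set of per-vertex offsets on the base mesh. A minimal sketch of the math, not the actual app; the "jaw_open" shape name is made up for illustration:

```python
def apply_blendshapes(base, deltas, weights):
    """Morph-target deformation: offset each base vertex by the weighted
    sum of per-shape vertex deltas (weights typically in [0, 1], e.g.
    streamed from a face-tracking camera every frame)."""
    posed = []
    for vi, (x, y, z) in enumerate(base):
        for shape, w in weights.items():
            dx, dy, dz = deltas[shape][vi]
            x, y, z = x + w * dx, y + w * dy, z + w * dz
        posed.append((x, y, z))
    return posed

# Toy example: one vertex and a single hypothetical "jaw_open" shape
base = [(0.0, 0.0, 0.0)]
deltas = {"jaw_open": [(0.0, -1.0, 0.0)]}
posed = apply_blendshapes(base, deltas, {"jaw_open": 0.5})
# posed[0] is (0.0, -0.5, 0.0): the jaw half-open
```

Re-running this each frame with fresh weights from the camera is what makes the puppet track a face in real time.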