So everyone, before we begin, we would like to express our regret that we cannot be with you in person for this panel due to unforeseen circumstances. Some of us cannot attend the conference in person, but we are excited to share our thoughts and insights on the metaverse, the opportunities, the challenges, and the role of open source in shaping its future. We have prepared for this panel with great care and attention, and we're confident that we will deliver the same quality of discussion and insight as if we were there in person. We hope that you will find this recorded session valuable and engaging, and we hope to meet you in person soon. My name is Annie Lai. I will be the moderator of today's panel. I would like to introduce to you our experts in networking, edge, cloud gaming, web engines, and XR, who will share their unique insights and perspectives on the metaverse and the role of open source in its future. They are, in no particular order: Eric Meyer, developer advocate at Igalia; James Kaplan, the CEO and co-founder of MeetKai; Ranny Haiby, CTO of Networking, Edge, and Access at the Linux Foundation; and Tina Zhou, Chair of LF Edge, who is also a director at Arm. Welcome, panelists. Hello. Hi. Great. Okay. So, first question: please tell us about what you and your company or organization do in the metaverse space. Who would like to start? Oh, we can go in slide order. So, my name is Eric Meyer. I'm a developer advocate at Igalia, which is an open source consultancy headquartered in Spain. We've been doing a number of things in the metaverse space, but probably most significantly, we took over Firefox Reality and relaunched it as Wolvic, which is a browser built specifically for XR devices. So it's a way of bringing the entire web into an XR space. And we've been advancing that, releasing new versions over the past few months. And that's where most of our interest lies right now. Awesome. James? Perfect. I'm James Kaplan.
I'm the CEO and co-founder of a startup called MeetKai. We've actually been in the AI space primarily for the past few years, doing a lot of B2B AI work since we founded the company. But more recently, in the past two years or so, we've been moving into the metaverse space, operating on both the top end of the stack and the bottom end of the stack. At the top end, we're building a lot of white-labeled metaverse solutions for pretty large name brands. And at the bottom end, we're contributing to, and either sponsoring or helping develop, a number of long-term initiatives aimed at making the metaverse more accessible. Great. Ranny? Hi, I'm Ranny Haiby, CTO of Networking, Edge, and Access at the Linux Foundation. For those of you who still don't know, the Linux Foundation is of course the non-profit organization that provides a neutral platform for companies and individuals to develop open source technology. And we're active in many different technology domains. We recently, as you know, launched the Open Metaverse Foundation, which is aimed at accelerating the development of the technology layers related to the metaverse. My personal focus in the foundation is networking and edge computing, where I help our communities define their technology vision and find ways to collaborate. The metaverse is emerging as an interesting use case for many of our open source communities. It is a great opportunity, but it's also presenting some unique challenges to the infrastructure, whether it's the networking or the edge. And we're starting to address those use cases and analyze the requirements. It's going to be a very interesting time ahead as our communities adapt to this new set of metaverse use cases. Thank you. Tina? Hello, my name is Tina Zhou, and I'm the Chair of LF Edge and a director at Arm. Arm is a global semiconductor and software design company focused on providing energy-efficient computing solutions.
In the metaverse space, Arm's technologies and expertise enable high-performance and energy-efficient processing for various applications such as VR and AR devices, edge computing, and IoT devices. Awesome. Thank you so much, panelists. I'm so happy that you are on this panel. So the next question is: what do you think the metaverse is, and what opportunities does it present? Who would like to start? Yeah, I can go first, maybe. I just wanted to comment that we need to remember that the metaverse will come in many shapes and forms. When we hear the term metaverse, we all tend to immediately think about immersive AR and VR entertainment, which is part of the metaverse. But what we're seeing in our open source communities is that there are many other use cases that may mature earlier. If you think of industrial use cases, things like digital twins for production floors, or enterprise use cases like workforce education, these are some of the use cases that might happen sooner. And they're all part of the metaverse. From our perspective, the metaverse is something that could harness several of the technologies that have been in development in recent years, like 5G network slicing or smart orchestration of the edge. These are technologies that have been in the works for years, and now it seems like the metaverse will be a good use case, a good application, to utilize them. Things that seemed impossible a few years ago are now becoming possible through the use of these technologies in the metaverse. Great. Eric? Oh, James, sorry, James, go ahead. So I very much agree with what Ranny is saying about the metaverse taking multiple forms, not just stuff on goggles. From my perspective, the metaverse is primarily an evolution of a lot of current applications that are being carried out today in perhaps inefficient formats.
An easy example, following on the point about digital twins: there have theoretically been a lot of solutions for doing digital twins for a number of years now. But through better technology, those solutions go from being consulting projects that look interesting to actual on-the-ground implementations of real value. Likewise, a lot of applications on the consumer side get more interesting as costs come down and different types of experiences become extremely accessible, in the same way people are accustomed to, whether on apps or on the web. The metaverse needs to have somewhat of an accessibility moment, but it's not just in the goggles; it's more of that evolution of everything we do. Great. Yeah, I actually agree quite a bit with that. Accessibility is actually one of the areas that I personally am concerned about, and I think that will be one of the real challenges in the metaverse. To me, the metaverse is a new medium, but at the same time, it's an extension of the media that we have. So I think what the metaverse could do is both enrich what we already have and offer entirely new ways of interacting with things. One of the use cases I'd like to see: if you're looking at a Wikipedia page about regular polygons through a web browser on an XR device, there would actually be a polygonal solid that you could manipulate and look at. That's interesting in itself, but so is being able to explore informational spaces in actual three dimensions rather than a two-dimensional projection of a 3D scene onto a monitor. Tina? In my perspective, the metaverse is a collective virtual shared space that integrates the digital and physical realities, powered by advanced technologies such as AR, VR, AI, and blockchain. It presents numerous opportunities, such as creating new immersive experiences in entertainment, education, and remote collaboration.
Additionally, it fosters innovative business models, potentially transforming industries like e-commerce, gaming, and social networking. The metaverse also encourages the development of next-generation computing infrastructure to support its vast interconnected ecosystem. Great. Thank you for that. So, rather than a single software platform, Jon Radoff described the metaverse as a digital environment made of seven distinct layers that represent different facets. These layers must work together, from the experiences people seek to the enabling technologies that make them possible. These seven layers are infrastructure, human interface, decentralization, spatial computing, creator economy, discovery, and experience. Panelists, what are some of the technical challenges that you think need to be addressed in building the metaverse? I know Tina and Ranny, you come from the infrastructure and networking background. Could you give us some insight as to what kind of technology challenges we'll face in building the metaverse? Tina, would you like to start? Yeah, sure. From an infrastructure perspective, the metaverse requires a robust, scalable, and efficient computing infrastructure to handle massive amounts of data and support real-time, low-latency interactions. Key challenges include: energy efficiency, developing energy-efficient processors and systems to minimize the environmental impact of the massive computing resources needed for the metaverse; scalability, ensuring that the infrastructure can grow and adapt to the increasing demands of the metaverse ecosystem; and security and privacy, protecting user data and ensuring secure communication between devices and services in the metaverse. Great. Ranny?
Yeah, so I agree with everything Tina said, and I'd just like to add my perspective that when you start to analyze the infrastructure requirements of metaverse applications, you quickly realize that the demands on the network and the edge computing infrastructure are an order of magnitude greater than what we're used to in things like video streaming or existing internet technologies. And it's also very clear that a lot of the processing will need to be carried out at the edge. The devices are limited in their processing and energy capacity, and the cloud is just too far away in terms of round-trip delay. So the edge is becoming that sweet spot for doing all the processing, but the edge is also a very resource-constrained environment, which presents new challenges. It's also becoming clear that there needs to be much tighter integration between the metaverse applications and the infrastructure. We got away, I would say, with previous types of applications doing everything over the top, with the application not really knowing what's available to it in terms of the network and the computing power, but I think that would not fly in metaverse use cases. So there will need to be better integration between the apps and the network, and the apps will need to know what's available to them. They will need to be able to request resources from the network when they're about to do a resource-heavy action or carry out some heavy computational task. So it all calls for opening up interfaces, or APIs, between the network layers and the applications, so that they can be consumed by the applications, giving the applications tighter control and a better understanding of what the underlying infrastructure can give them. Right. So what about the layers above infrastructure? I know James and Eric, you guys are very experienced in all those layers above the network and infrastructure.
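Ranny's idea of applications negotiating resources with the network could be sketched roughly as below. Everything here is hypothetical: no such app-to-network API has been standardized, and the `NetworkQoSRequest` shape, the `buildQosRequest` helper, the numbers, and the endpoint are illustrative assumptions only, loosely inspired by the network-exposure APIs the industry is discussing.

```typescript
// Hypothetical sketch of an app-to-network QoS negotiation.
// None of these types are a real, standardized API; they illustrate
// the kind of interface that would let a metaverse app declare its
// needs to the underlying network before a resource-heavy action.

type WorkloadProfile = "cloud-rendering" | "voice-chat" | "asset-download";

interface NetworkQoSRequest {
  profile: WorkloadProfile;
  maxLatencyMs: number;      // round-trip budget the app can tolerate
  minBandwidthMbps: number;  // sustained throughput it needs
  durationSec: number;       // how long the reservation should last
}

// Map a workload to the resources it needs. The numbers are
// illustrative, not drawn from any specification.
function buildQosRequest(
  profile: WorkloadProfile,
  durationSec: number
): NetworkQoSRequest {
  switch (profile) {
    case "cloud-rendering":
      return { profile, maxLatencyMs: 20, minBandwidthMbps: 50, durationSec };
    case "voice-chat":
      return { profile, maxLatencyMs: 100, minBandwidthMbps: 0.1, durationSec };
    case "asset-download":
      return { profile, maxLatencyMs: 500, minBandwidthMbps: 10, durationSec };
  }
}

// An app about to start a heavy task would then POST this request to a
// (hypothetical) network exposure endpoint, e.g.:
//   fetch("https://edge.example.net/qos", {
//     method: "POST",
//     body: JSON.stringify(buildQosRequest("cloud-rendering", 600)),
//   });
```

The point is the direction of the conversation: instead of the over-the-top model, the application states its requirements and the network can grant, shape, or refuse them.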
Could you share your insight on the technical challenges that we might be facing in building the metaverse? Sure, maybe I'll start there. Accessibility is probably the key word I'm guessing both Eric and I are going to echo. One of the reasons why I think adoption has struggled to this point is that certain solutions to accessibility so far have been, toward Ranny's point, kind of lazy, where people think, oh, we're just going to pixel-stream or something and everything's going to work out. But actually we need much more comprehensive technical solutions at the application layer, built with the understanding that the infrastructure layer is going to move forward rapidly. So we have to be planning with all of that in mind in all of our development. I'm a big believer in the browser as the primary entry point for a lot of this, but I think we need more intelligence in the browser in terms of how it carries out workloads. There's a lot of technology that's been developed, but browsers are going to need to push forward much faster in terms of how we look at hybrid workers and hybrid rendering. Technology like that is, I think, going to be key to enabling application developers to make applications that load quickly and run well on cheap devices. Yeah, I fully agree with that, because not everyone can afford a PlayStation VR or even what we would now think of as a lower-cost headset. There are a lot of people who are resource-constrained in their own way, whether it's financial or a matter of what technology is available to them in their local markets. And if the metaverse becomes widely adopted, there are going to be real barriers around things like: how do you help users who have low vision or no vision? How will design for immersive spaces accommodate people who cannot hear? Those sorts of questions are still ahead, I feel.
And for that matter, in conjunction with the rapid advancement of the infrastructure, there needs to be a better understanding among developers of how to help users at the application level. Also at the server level, there are real problems right now where XR browsers are not recognized by websites, so they get served the mobile version of the site, which is not always appropriate and is sometimes completely unusable in that particular form factor. So it feels like there needs to be not just an advancement in the software and the hardware, but also an advancement in the collective understanding of how we do this, in the same way that there needed to be a shift in web design 13 years ago to understand that responsive web design was the way we would develop websites from then on. There's going to need to be, I think, a similar learning curve ascended by server developers, library developers, application developers, and device developers. Awesome. Yeah, I just want to place a little plug that the Linux Foundation's Open Metaverse Foundation has various SIGs, special interest groups, that cover all these topics. So I would highly recommend you check them out if you are interested in exploring these technical challenges and want to contribute to building the metaverse. Okay, moving on to the next question. So, ChatGPT. We're going through this major tidal wave of ChatGPT, and it's almost like we can't leave any tech conference without talking about it. Panelists, what could ChatGPT mean for the metaverse? Who would like to start? Maybe I can jump in. Sure. So actually, our background in conversational AI as a company is what motivated us to move into the metaverse in the first place. So we're all very excited and bullish on the idea of more conversational interfaces to this type of content.
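Eric's point above about XR browsers being mis-served mobile pages boils down to device classification. A minimal sketch of the server-side problem follows; the user-agent tokens are examples only, and robust sites should prefer feature detection (e.g. WebXR's `navigator.xr`) or UA Client Hints over string matching.

```typescript
// Sites that only distinguish "desktop" from "mobile" misclassify XR
// browsers, which often include "Mobile" in their user-agent strings.
// The token list is illustrative, not exhaustive.

const XR_UA_TOKENS = ["OculusBrowser", "Wolvic", "Pico"]; // example tokens

type DeviceClass = "desktop" | "mobile" | "xr";

function classifyUserAgent(ua: string): DeviceClass {
  // Check XR tokens first, since XR user agents often also match
  // the generic "Mobile" pattern and would otherwise be misfiled.
  if (XR_UA_TOKENS.some((t) => ua.includes(t))) return "xr";
  if (/Mobile|Android|iPhone/.test(ua)) return "mobile";
  return "desktop";
}

// A server could then pick an XR-appropriate layout instead of
// falling back to the mobile template:
// if (classifyUserAgent(req.headers["user-agent"]) === "xr") { ... }
```

The sketch shows why the mobile fallback happens: without an XR branch, an XR headset's browser trips the generic mobile pattern and gets a layout never designed for that form factor.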
I think every time I see a demo of some ChatGPT application in some frankly boring chat interface, I think how much cooler it would be if you had this entire world in which you could use it. And I think that's really where we're going to see the metaverse take off in the next 12 months: the idea that finally there's an interface pattern for these 3D worlds that isn't just using WASD in kind of a silly format. Yeah, I'd like to add that I'm seeing an interesting use for generative AI in our domain. The metaverse, of course, calls for the creation of new protocols for this communication, and previous speakers spoke about the interoperability between headsets and websites and web browsers. So we'll probably see a lot of need for the creation of new protocols and their implementation. And I've started seeing impressive work on using generative AI to actually generate the source code that implements a specification. If you think about protocol specifications, they are very structured documents that can easily be fed as training material to an AI model, which can then quite easily produce code that implements the protocol. So that can be a huge accelerator for putting in place all these necessary protocols that are required for the metaverse. Cool. Eric and Tina, feel free to add if you have any opinion. Yeah, sure. So I think ChatGPT can contribute significantly to the development and enhancement of the metaverse in several ways. Virtual assistance: ChatGPT can serve as an intelligent virtual assistant within the metaverse, helping users navigate the virtual world, answer questions, perform tasks, and provide information on various topics. Conversational AI: ChatGPT can be integrated into various metaverse applications to facilitate human-like interactions between users and virtual characters or NPCs, the non-player characters, enhancing immersion and engagement.
Content generation: ChatGPT can be utilized to generate dynamic and contextually relevant content for the metaverse, such as stories, dialogues, missions, or even in-world advertising. Language translation: by leveraging its language capabilities, ChatGPT can provide real-time translation services for users in the metaverse, enabling seamless cross-cultural interactions and collaboration. Personalized experiences: ChatGPT can be used to create personalized experiences for users based on their preferences, interests, and past interactions, tailoring the metaverse environment to individual needs. Sentiment analysis: ChatGPT can monitor user interactions within the metaverse to identify sentiments, trends, and behavioral patterns, helping developers optimize the user experience and address potential issues. Education and training: ChatGPT can be employed to create immersive educational and training experiences within the metaverse, simulating real-life situations and offering contextually relevant guidance. And the last one, accessibility, which you mentioned already: ChatGPT can help make the metaverse more accessible for users with disabilities by providing text-to-speech or speech-to-text capabilities, as well as offering assistance and guidance tailored to their needs. By integrating ChatGPT into various aspects of the metaverse, developers can create a more engaging, interactive, and immersive virtual environment for users, while also leveraging AI capabilities to address challenges and enhance the user experience. Wow, that's awesome. Thank you. Yeah, so it looks like we all need a ChatGPT special interest group in the Open Metaverse Foundation. Okay, so we're all here because of open source. What key role do you think open source can play in developing the metaverse? Who would like to start? Yeah, I can naturally start, coming from the Foundation. Of course, for me, there's no other way other than open source.
But more seriously, if you think about what the metaverse needs to be successful, I think one of the key aspects is interoperability. If we end up with Google's metaverse and Meta's metaverse and maybe Siemens' industrial metaverse, then I don't think it will be successful. In order for it to really take off, all the implementations need to work nicely with each other and integrate seamlessly. Open source communities have proven in recent years to be very successful in creating those necessary open standards for interoperability. So for the metaverse, with the Open Metaverse Foundation and the individual communities, we're already seeing good collaborative work going on in the industry between all players to create those necessary standards and protocols for interoperability. That's one benefit of open source. The other aspect is a more economical one. The metaverse, like any other technology, is a very deep technology stack. Usually the lower layers require a lot of investment, but there is not much room there for value-add and differentiation. So there's no point in each and every company doing it themselves and reinventing the wheel, so to speak. It makes much more economic sense to do it in a collaborative way, and open source projects are the perfect platform for this type of collaboration on the lower layers of the stack, so that companies can invest their resources in providing differentiation in the higher layers. And Tina, you are also very much involved in open source. Could you share your thoughts on that? Yeah, definitely. Open source can play a critical role in developing the metaverse by fostering collaboration, innovation, and standardization. Open source enables developers and organizations to work together, contributing their expertise and resources to address the complex challenges in building the metaverse.
This collaborative approach accelerates innovation, reduces development costs, and ensures interoperability between different systems and services in the metaverse ecosystem. As a result, open source can help create a more accessible, inclusive, and interconnected metaverse that benefits everyone. Absolutely agree. Okay, so we actually have a lot of interesting questions about the metaverse, but we don't have enough time, so I'd just like to give each panelist some time to share parting thoughts before we conclude. I would say that, as we were just discussing, open source will play a key role in developing the metaverse if it's going to be at all successful. That was one of the massive accelerators of the World Wide Web, right? All of the core specifications were not just open source but, in that case, placed in the public domain so that nobody could control them, so that you wouldn't have a Microsoft web and a Google web and a Lycos web and an AltaVista web or whatever. And that, I think, will be a major key. I also think that taking open source approaches to the development of not just specifications but the core components of the metaverse will be critical, because it allows not just a lot of eyeballs to find bugs, but a lot of eyeballs to find possible failure states that aren't code-related but are, let's say, societal, based more in the human layer, where someone might say: hey, I know we've been using ChatGPT for these other things, but does it actually make sense in this thing that we're talking about now? Does it make sense to synthesize this kind of content, as opposed to NPC dialogue or other fictional content? So having a community that looks at that and is able to see from different perspectives how certain challenges might be met, and how certain solutions might have unintended consequences, will, I think, be a major key to having the metaverse advance and really become what it can be. James?
Sure. I was going to say, given the scale and scope of all the technology that needs to be developed, there's going to have to be a lot of risk-taking in terms of tactical decisions. We can't always make safe bets about, okay, this is the right tool for this job, because what we're trying to do is at many points an order of magnitude more complex than a lot of previous things. There's already been a lot of initiative, I would say, shown by the open source community in terms of trying new technologies, whether that's Rust and Wasm or other ideas around WebXR. So it's just a matter of being willing to take risks as an entire development body, as opposed to playing it safe and potentially being too slow. Sounds good. Ranny? Yeah, first, I want to say that I really want to see the metaverse succeed. I'm seeing exciting use cases, again, not just in the entertainment domain, but also in healthcare. We have some of our communities demonstrating things like remote surgery or remote medicine, using 3D models of organs and so on, supported by metaverse technologies. So I really, really want to see it succeed and not become just an overinflated hype thing that crashes and burns, as we've unfortunately seen with other technologies not so long ago. And I think what we all need to do, as the software industry developing this, is to build in all the necessary guardrails so that it doesn't go out of control. We need to go fast, but maybe not so fast as to neglect all these guardrails and safety mechanisms, because it's very important to do it responsibly and make sure it succeeds. Open source communities are, of course, as previous speakers said, a good platform for doing things with responsibility, having a lot of eyeballs, and making sure everything is done properly. So I encourage everyone to take part in these communities.
But really, the end goal is developing a successful metaverse with all the guardrails in place so it can really take off. Tina? Yeah. As we continue to explore and develop the metaverse, it is crucial to prioritize sustainability, inclusivity, and collaboration. By leveraging open source technologies and fostering partnerships among various stakeholders, we can create a metaverse that is not only technologically advanced, but also socially responsible and environmentally sustainable. Awesome. Well, thank you so much, panelists, for your valuable insights; we're very lucky to have you join our panel. And for the next couple of minutes, I'd just like to briefly introduce the Open Metaverse Foundation. Let me share my screen here. Okay. The Open Metaverse Foundation was launched last year, and its mission is to create open software and standards to enable portability and interoperability for an open, global, scalable world which supports interactive and immersive experiences for the benefit of individuals and the industry. The foundation's goal is not to build the metaverse itself; rather, we're building the blocks, the enabling technologies, for this interoperable metaverse. And as I mentioned earlier, we have a lot of special interest groups that work on various aspects of the metaverse. Basically, what the groups do is create scenarios and then develop source code to demonstrate those scenarios, and throughout the process we build specifications and publish standards. And this is where you can come join us. So as we conclude this panel, we just want to emphasize that the metaverse is not the work of a single company or group of individuals. It requires collaboration, inclusivity, and a diversity of expertise from across various industries and communities. The challenges and the opportunities presented by the metaverse are too great for any one entity to tackle alone.
Therefore, we'd like to invite all of you to join us at the Open Metaverse Foundation to build a metaverse that benefits all of humankind. We need the collective knowledge and expertise of this wider community to create a metaverse that is truly inclusive, accessible, and equitable for everyone. We hope this panel has inspired you to join us in this effort, and we look forward to working together towards this exciting and transformative vision of the future. Thank you. Thank you for having us. Thank you, Annie. Thank you, everyone. Bye-bye. Bye.