Good morning all. My name is Colin Keenan. I'm a staff member with the North Carolina State University Libraries. Today I'm presenting a project called the Visualization Studio Development Kit for Unity, our development of a tool set for our users within our library community in response to a new space. This work is in collaboration with two very talented undergraduate researchers, Jaden Sansom and Elliott Schultz, who you'll meet throughout this presentation. All of these slides are already available online if anyone wants to browse on their own device. I'll also use this link later to provide a more accessible version of the video upon its posting. And that link will also be at the end of the presentation in case anyone wants to take it away with them at the end. So this is a story about a leading university library building community-centered creative tools to speed their community's innovation by building out template interfaces, tools, and project templates. And it features a really outstanding example of student and staff collaboration, kind of collaboration across career progression, in order to center teaching and learning for our community. So let's learn a little bit about that leading university library. Let's meet the North Carolina State University Libraries. I think a lot of vision statements can really misguide you on what an organization is all about and what they prioritize. I do not think that's the case for ours. I kind of love our vision statement: the Libraries are NC State's competitive advantage. At our university we have something of a hub-and-spoke model where the library gets to be a resource hub that sits central to the university's colleges. By building out resources within our libraries, we aid all of our colleges. We are a university with some leading graduate college programs, including textiles, engineering, and design. There are obviously overlaps between those academic pursuits, and our resources aid all of them.
We do that by putting users at the center of our service design, increasing accessibility wherever possible through experimentation and collaboration. We aim for diversity and agility, and keep in mind values of well-being. And I think my favorite addition to our list of values is delight. I think delight and enjoyment of being in your library spaces, engaging with library services, is what keeps a user coming back. We're not that different from businesses in the sense that when a user leaves delighted, they're excited to come back next time. We have several goals underneath our billing as the experiential library. I'll talk a little bit about how my work engages with that experiential teaching and learning pursuit, but it really is an organization-wide understanding of our mission. We aim to enrich educational experience, research productivity, and campus life broadly. We do that through creating accessible spaces and services that inspire and connect our users. These are both virtual and physical spaces. I think the last couple of years have really advanced a progression that was already ongoing of systemically connected virtual and physical services and spaces. I'm going to use one word quite a bit within this presentation: accessibility. I just want to clarify the way that I'm using it. I'm really using it in the totality of the word. Obviously, at the forefront of that is equity, and equitable and suitable adoption of our spaces and services for use by people of varying abilities and disabilities. We're also using accessibility within a user discovery context as well in this case. We have an emphasis on concierge services within our experiential teaching and learning spaces.
We're also aiming to create tools that are easy to approach, easy to use, and are reachable and obtainable, so that as staff members and as faculty members we don't become gatekeepers of services, and so that these are services that can scale beyond our available person time and create bilateral use of those resources independent of staffing. I think something our libraries are very well known for and are very proud of is our experimentation with and dissemination of emerging technologies to our user community. This is another case where we're aiming to create accessible, in all senses of the word, services and teaching and learning spaces. You can see some students here experimenting in our maker space with microelectronics. We're really lucky to have several independently maintained making spaces on our university campus beyond the ones that are maintained by our libraries. There are about five maker spaces on campus at this time. We were the first one. We will always aim to be the first maker space that people enter on our campus. We will always be aimed at novices and beginners, understanding that those users will matriculate on to more custom-built and niche spaces on campus, and they'll become members of those communities while remaining part of our maker community as well. So the department that I live within is Learning Spaces and Services. It's a rather large department within our library. We have about 25 staff members of the 205 total library staff members. This half of the department is really focused on a lot of what I'm about to talk about, but we really benefit from collaboration with our department mates who are focused on instruction, teaching and learning, and are very pedagogy-grounded. A lot of the reason that our department makes as much sense as it does is because we have kind of the dual action of pedagogy focus as well as experimentation with technology. This half of the department are my more frequent collaborators.
We are branded under the umbrella of the Studios. I'll talk about how we have several making spaces that fall under that umbrella, but we provide technology-rich spaces and hands-on access and support for students to make things. We're very focused on making, creation, and experimentation, and all of the spaces I'm about to describe in the Studios are staffed spaces. You can see we have a really broad range of making spaces, from digital media creation booths to a really acclaimed and excellent maker space that has a focus on 3D printing, which I think is what's usually associated with maker spaces, but we also have a heavy emphasis on textiles and soft materials making, which is really booming in our space right now. We also have a virtual reality studio and a portfolio of spaces for creating digitally fabricated objects using high-graphics-power computers; that's both in our VR studio, which really has led the way on that for us, and in digital media creation, which has really followed with that work as well. Those are not the only individuals within our organization who interact with the topics I'll be talking about today. While those are our staffed spaces, where we have two student colleagues staffing each of those spaces for the 50 hours a week, on average, that we keep those spaces open, we also have a team, and this is the group that you see on the right side of the screen, who's really located across the organization. This is not a comprehensive list of staff members who interact with these topics; it is really an organization-wide effort. But these are some of the folks who are really essential to the work I'm presenting today. These are our high-tech spaces team broadly: staff members in research engagement, IT, external relations I see in there, and I think I'm probably missing a couple of departments as well. But this is the group that really does the heavy lifting on putting really personified, joyful experiences in high-tech spaces and having that go well for our users and be enjoyable for our users.
We're not new to the concept of rolling out really bleeding-edge spaces for showcasing student work, research, instruction, and showcases of campus life. Just kind of bouncing around the screen here through some of our precedent spaces: we have large, high-def monitors throughout our buildings. You can see the Art Wall, which will actually have a renovated image there pretty soon, in the top left corner. Across from that we have the Tech Showcase, showing off technologies that we have in our lending collection. We are very proud of our users' ability to walk up to the desk and walk away with a piece of technology that will aid their work. There's another large immersive screen there with the iPearl Immersion Theater. A real precedent space for what we'll be talking about today, one that really led the way in a lot of the paradigms that we're going to be utilizing and standing on, is the Teaching and Visualization Lab in Hunt Library, right in the center of your screen there. It's a 270-degree projection surface. It's our largest screen and wraps around into the peripherals of the user as they're taking in the media. And then finally, in the bottom left corner, you see a space that's very near and dear to my heart. This is our interactive exhibit space called the Innovation Studio. This is something of a scratch space that's very graphics-imbued. There's top-down projection across the room with hand tracking available for interaction with the exhibits that are displayed on the tables. So you can see that's a space where we can turn over exhibits in about 15 minutes to showcase stories of campus life, campus research, teaching, and learning. A lot of the faculty and students that we've engaged with in past semesters have talked about how they build really beautiful things that they're very proud of in their coursework, especially when it engages with the libraries. And this is a space where those projects don't die after final grading.
They don't go into an archive somewhere, or maybe into a student portfolio that's not seen very often. These are opportunities to showcase work that our community members are proud of for the 1.6 million visitors a year who we see go through our spaces. So with that in mind, showcasing innovative student work: this is the space where a lot of our conversation will be based today. This is the Simon Rubin Visualization Gallery. This is our newest showcase space, one of our newest spaces at the NC State University Libraries. This is our first 360-degree immersive theater. Sorry, this is a follow-up on our first attempt at a 360-degree immersive theater. We took a lot of lessons away from the version of this that lived from about 2013 onward. And we really tried to enhance usability and a sense of awe when interacting with this research space. I think this space is really capably described by my colleague Hannah Rainey, who we'll hear from, from fall of 2021. I just want to make a note: these comments are from before the space was opened to the public, and we use an older name of the space, the Visualization Studio. You'll see that used throughout this presentation frequently. The name change to the Simon Rubin Visualization Gallery is something we're really excited about, and we're really proud to have the name of a really innovative NC State alumnus on that space. But it is something that's a relatively recent change. So let's hear from Hannah. This Visualization Studio is a reimagination of the original Visualization Studio that existed from 2013 to 2019. This renovation was an amazing opportunity to reimagine the space and push the boundaries of the technology. We also did something fairly unique in making a fully round room. You don't see round rooms many places. And so in creating this round room, we wanted to increase the immersive quality of the space. And so the space offers 360 degrees of projection.
These are high-definition projectors run by a single, very high-powered PC. And so the 360 degrees of immersive visuals combined with the auditory immersion can create a really unique experience for our students and faculty. The vision is to provide sort of a place where people can imagine what's possible with teaching and learning and presenting research. In developing the space, we had a team of eight people from the libraries, from all different departments with different expertise. We all got together and met regularly to talk about the different use cases and to make important technology decisions. We also demoed technology and are now working on developing programming together with faculty. The use cases for the Visualization Studio are primarily, I imagine, teaching and learning. So really expanding what is possible in the classroom. Faculty can bring their classes into the Visualization Studio and transport them to different physical locations, different time periods. They can even transport them off the planet. By providing that fully immersive experience, 360 degrees of projection and spatial audio, it can really create a unique learning experience. Beyond that, the space will be used to showcase research. We have a lot of amazing research here at NC State. And similar to the Hunt Library, we provide places where researchers can communicate with the public and have the public experience that research in a new way. I hope everyone uses this space. All of our NC State faculty, students, staff, and even our community are welcome to experience the Visualization Studio. I think the Visualization Studio has a lot of potential to impact learning outcomes for students and teaching outcomes for instructors. It's a unique space. So instructors can plan a special lecture utilizing immersive technology, whether that's Google Earth or their own unique content that they've created for that class.
It extends the impact beyond the classroom without having to physically go anyplace other than the library. And I think even though we can't gather in this space right now, it's really going to be significant in a post-pandemic world when we start to move away from individual devices back towards community experiences. And this space has enormous potential for that. Hannah's leadership on this project has been really inspiring, really instructive. She really understands the north stars of this project really well, I think. It gets me excited about this space again to hear those comments two years ago knowing that we're now living in that world where we can kind of gather in these spaces and having seen examples of that. Hannah describes the space really well. There's a challenge of this space that's not dissimilar to the challenge of showing virtual reality through a flat pancake screen like this where you really don't get a sense of what it's like to be surrounded by this content. However, you can see some examples of instructors using early phases of deployment of the space on the right there. So this is a story about building those community-centered creative tools to speed development and utilization of the space as well as possible. One precedent that we took away from the earlier spaces that have a similar pursuit to this space is the importance of maintaining the personal computing paradigm that users are used to on their own devices and in their day-to-day life. Users make great strides in utilizing these spaces when the user interfaces and the deployment experience is very similar to what they experience on their day-to-day machines. So the visualization studio, from a user perspective, operates like a normal desktop. It is a Windows desktop. 
It is much more powerful in terms of graphics and processing, but when a user opens up the display interface in there, they're met with the familiar blue Windows background, and everything is kind of where they would expect to find it. This is really useful for building executables as well. So this little blue dot that you see in the middle of the screen is a 1080, what we used to call an HD, monitor resolution. And just for perspective, this is the Visualization Studio's aspect ratio. So there's obviously some unique challenges to building when your display is like the windshield of a flying saucer. It's very, very wide. You get a very interesting view into the virtual world you're creating for your users, right? And we wanted to aid in the ease of thinking about that design consideration as much as possible. So our first goal, and really something that was precedent for all of the work I'll describe later in this presentation, is building tools and templates for the most familiar workflows. You know, something like 90% of presentations we see in our spaces are built with very familiar technologies: slides, video, web browser. So those are the templates we built first. This is really building off of the templating we've done for previous innovative, immersive showcase spaces. Those PowerPoint and Google Slides templates are very oft used. I think there's a reason I'm presenting a slideshow to y'all right now. Even though this isn't the type of media I spend the most time making, it's a great way to communicate to folks. It's a great way to structure a conversation like this. So we have seven Google Slides templates available for our users to adapt and use before they've even stepped foot into the space. Video is also a frontier that's really important for us to tackle; when people talk about creating the most immersive media they can create, a lot of our users are describing video in that case.
So we have a really high emphasis on user guides and usability guides for video. We have some really exciting developments in that space, only in the last couple of weeks. We now can record native resolution video, native resolution to that screen I just showed you, as well as capture events that have happened within the space for dissemination elsewhere. This is really important for accessibility, to be able to capture things that are happening in this space; whether or not a user wants to visit the space themselves, they can see the content that's being presented there. And then, I'm a web developer and game developer. That's like the development avenue closest to my heart. The web is a very, very capable disseminator of interactive experiences. Trevor Thornton, a staff member in digital learning, digital libraries rather, has built a set of HTML and JS templates for this space to present things ranging from historic timelines of State, you can see a really excellent one that's on exhibit now from my colleague Victor Betz on my right there, about Asian-American history at NC State, to experiences like video drama, which shows 10 distinct video views around the room for a somewhat disorienting kind of media experience, to use the full capability of what that room can do. Also, you know, the web can do 3D stuff too. I'm a WebXR developer, and some of the first experiments that we ran were natively 3D content that's interactive in the space. You can see a snippet from a Twitch stream where I coded a boat that you could drive around the room with a video game controller. This was very early days of using this space. However, when I heard that we were building this space, the first precedent that came to mind for me wasn't slideshows or video. I was thinking about kind of a next-gen tool.
When I heard we were building a responsive, immersive circular room, I could really only think of the holodeck and how we create responsive experiences that are user-stimulated and user-centered. And I thought about, you know, Data throwing the rock at the rear wall in the holodeck. So what are the tools that we use to enable users to create that kind of experience, not just demonstrate it for them, but put those tools in their hands? The answer is game engines, in my opinion. So game engines are integrated development environments. You can see some examples at the top of the screen of what we call off-the-shelf game engines. These are tools that are capable of real-time rendering, so rendering images to display on the screen in response to user action and the current state of the simulation. Physics simulation, which really embodies digital objects in a physical, immersive space, is really essential. Scripting and logic, obviously: the ability to create extensive, essentially, if-then loops to anticipate and respond to user behavior and stimulation. Asset management and import: these are broad media compositing environments, and they have the capability of bringing a really wide array of different types of media into them as development environments. That can include video; actually, some of our early prototypes of showing video in the space involved running it into a game engine and feeding it back out to the display, a really compelling way to maintain user perspective. Increasingly, game engines are in use in animation. If anyone's familiar with the recent Spider-Man animated movie, that really was pretty groundbreaking in terms of merging traditional computer-generated animation workflows with more real-time rendering, shader languages, things like that.
That's really a heavy point of student interest for us right now, between that kind of animation and the kind of soundstage sort of work that's happening with shows like The Mandalorian, where performers are performing in front of responsive environments. These are questions we're fielding from students. It's the media they're taking in and what they want to make. Also, a little less applicable for our purposes in this talk, but really important, is the idea that you can create a project once and build it to several dissemination avenues; that's something game engines are very good at. You can edit once and then build a Windows, a Mac, an Android, an iOS version of the same application. So there are two really mature engines; shout out to the open-source game engines, but they're maybe four or five years away from being as stable as their competitor off-the-shelf engines. When you're talking about off-the-shelf, for academic purposes practically free-to-use game engines, Unreal Engine from Epic Games in Cary, North Carolina, and Unity are the two major engines that we see our community interested in using. So these are immersive and responsive tools for research, teaching, learning, and creativity. We're looking at off-the-shelf, pre-existing, open-to-use engines because we're aiming for tools that have wide adoption already, high degrees of extendability, the ability for users to build on top of them and import assets that are relevant to the work they want to do, and long-term support from organizations that aren't us. So in early parts of this work, a lot of our work was environment scans: how does Unity intend for you to learn Unity? What are the steps in 2022 that a user is going through to learn this engine? The same kind of exploration with Unreal as well. We decided to address Unity first, for really boring reasons that are pretty in the weeds and actually might have been wrong, but it worked out pretty well.
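To make those engine capabilities concrete — real-time response to input, physics simulation, and if-then scripting — here is a minimal sketch of the update loop at the heart of any game engine. This is plain illustrative Python; none of these names come from Unity or Unreal, it's just the shape of the loop an engine runs for you every frame.

```python
# Minimal illustration of a game engine's frame loop: scripting/logic
# reacts to input, physics integrates motion, and rendering would draw
# the resulting state once per frame. All names are hypothetical.

def step(state, inputs, dt):
    # scripting/logic: an "if-then" responding to user stimulation
    if "jump" in inputs and state["on_ground"]:
        state["vy"] = 5.0
        state["on_ground"] = False
    # physics: integrate gravity over this frame's time slice
    state["vy"] -= 9.81 * dt
    state["y"] = max(0.0, state["y"] + state["vy"] * dt)
    if state["y"] == 0.0:  # landed: clamp to the floor
        state["vy"], state["on_ground"] = 0.0, True
    return state

def run(frames, dt=1 / 60):
    state = {"y": 0.0, "vy": 0.0, "on_ground": True}
    for frame in range(frames):
        inputs = {"jump"} if frame == 0 else set()
        state = step(state, inputs, dt)
        # real-time rendering of `state` would happen here, every frame
    return state
```

An engine like Unity wraps exactly this kind of loop inside its per-frame callbacks, with the camera system handling the rendering half; the value of the engine is that you only write the `step`-style logic.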
So we started experimenting with ideas of how we create the translation from digital creations to our physical space in summer of 2021, before any of us had actually been in the completed space. So we had some experiments; we kind of understood what the paradigm might look like. We went from these sketches in summer of 2021, and as you saw in the video that Hannah was speaking over earlier, one of these demos actually snuck its way into that video. It's kind of embarrassing to look at now, considering that we've gone from those kind of grainy ways of showing these environments to something that looks more like this in fall of 2022. This is the Aquarium. It's a fully responsive virtual environment. Any of the marine life that you see swimming around is procedurally navigated. So you could run this demo for, you know, pick your integer number of hours, and you'll never see the same thing twice, which is the first such experience I've seen built out like that to that degree at one of our spaces. It really feels like a precedent for the sorts of programming that our students will be creating in these spaces in the future. So let's talk about the students. You saw two additional names next to mine at the beginning of this presentation. Those are folks who we were able to integrate into this prioritized organizational project. This is the furthest thing from kind of student busy work. We're not really into that concept in our organization. We really want to engage students in prioritized organizational projects, and ones where they are collaborating with colleagues who are across the spectrum of career advancement. Those are the kinds of experiences that can become bilateral and can kind of aid creativity in both directions. In this situation, in this story, the Susan Ellen Everett Library Internship was essential to our capability to build out this set of tools and to employ our colleagues in the way we did.
I just want to talk a little bit about this alumni story, because I think the NC State alumni community is pretty amazing. R.O. and Edith Everett established the Susan Ellen Everett Library Internship in 1999. R.O. lived an amazing life. He's a graduate of State in 1947. He became an executive vice president at Wachovia Bank. Edith also was a really valued, essential member of her community in Salisbury, North Carolina. She served on the Rowan County Public Library Board of Trustees and their Friends of the Library board. She's a lifelong library advocate, and she was a member of the Historic Salisbury Foundation. And this is an endowment that offers exceptional undergraduate internship opportunities engaging in computer science, engineering, and electrical engineering topics. We were really lucky to be able to hire a cohort of two interns for this cycle. This is our second cohort of Everett interns. I do just want to shout out some really innovative work that was done by our colleague Kamai Guillory at a very difficult time to do exciting work. Kamai was with us in spring of 2020 and was building out holograms and utilization of 3D displays in our spaces for equitable and diverse public exhibition. He wrote a really impressive white paper about his work as well. So please meet my collaborators, Elliott Schultz and Jaden Sansom. This is our cohort of Susan Ellen Everett interns who are working on the Visualization Studio Development Kit. Elliott is a computer science senior in the game development concentration in CompSci who'll graduate here in May. Jaden is a rising senior, also in the game development concentration, and has another maker community over in the art and design program in the College of Design. They're both leaders within the campus organization known as the Video Game Development Club.
We honestly could give a presentation about how they have educated us on some aspects of this work over the last seven or so years. They've held a lot of events; they've held their weekly meetings in our spaces, and they have this really interesting and cool fabric of making of their own on our campus. When they hold their weekly meeting, you kind of walk into a room and they turn it into an indie game studio. That really is the only way it feels when you see 50 students kind of buzzing around, working on four or five projects at once, entirely self-guided. Jaden's got one of the coolest portfolios of any student I've ever interacted with. I would totally encourage you to go check out her Zhentavros website there; I'll have that at the end as well. And if you check out the slides afterward, I'll have links to all these student portfolios. There you can see a little mushroom that was on the front page of Sketchfab a couple of weeks ago that she animated. And Elliott is a pretty cool guy. You never see him not wearing his signature shade of yellow. I think it's pretty cool that he's branded himself with a color. I'm kind of inspired by that. And he's really interested in how the game engine works and in scripting languages. So you can see some examples of cool games he's made there. He plays indie games; his itch page is really fun. He posts new games there all the time. So how did we put student work into production in a project that was immediately going to be seen by their campus colleagues and campus community? I think we put a heavy emphasis on providing freedom to fail around this innovation: freedom to fail constructively and move the project forward, and our understanding forward. Here you can see a sketch that I think might have resulted from our first meeting with the Everett interns; they pretty immediately grasped all of the demonstrations we had done to that point. A cool little sketch of those concepts.
Apologies for the amount of text on the next couple of slides, but I just want to show some of the project management paradigms that we found really useful when working with our undergraduate colleagues. We started with an environmental scan phase, where we kind of banned them from doing any production. We wanted them doing an environmental scan on the existing paradigms and educational environment of the Unity game engine, as well as just exercising and displaying sustainable working norms. I think it's very easy, when students are excited about a cool project, for them to maybe overwork themselves early in the project. We wanted them having broad team introductions so they really understood the place in the organization their work occupied, and we gave them open access to maybe 50 different colleagues' contact information and calendars; we're really appreciative of all of those colleagues for being so generous with their time. In phase two, especially when the meat of this development was happening, we gave a lot of opportunities for them to reduce scope, and we had persistent one-on-one check-ins and presentation milestones. Shout out to Sean Bennett and Hannah Rainey, who were really essential in the process of co-managing these students' work over the last summer and fall. So let's talk about that user-centric experiential learning and teaching. We're aiming to build long-term support of these new spaces, tools, and services, and so this is the introduction of the Viz SDK, the developer toolkit for Unity. Software development kits frequently offer modular, resilient, and extendable tools for developers to create software in new hardware paradigms. When video game consoles release, they often release with a really dense set of documentation on how developers can engage with the new product. Things like the Xbox, the iPhone, and the standalone VR headsets that have really immediately impacted that market.
They're all aided by really comprehensive documentation and demonstration of how to use those tools. So the Viz SDK is built on top of Unity's existing, hugely prolific stack of community prefabs, tools, and assets, and we do that for the greatest possible standardization and long-term support. Our minimum viable product also involved an on-ramp to backward compatibility for existing projects on our campus. Unity is pretty widely used by researchers on our campus, and it's a pretty tough sell to go back and say: do you remember how you built a project for our space 10 years ago? Well, now we want you to start from scratch again. That's not a very fun conversation to have. Come on in, bring your project, and let's see it in a new space on the first day that you step foot in there? That's a pretty easy email to send, in my opinion. Approachability was aided by the fact that Unity has its own teaching workflow for learning their engine, so we had lots of resources, kept very up to date, that we could share with our users. So this is the Visualization Studio Development Kit. I just want to thank Dr. Mark Sandberg from the College of Education. Our markdown-based documentation is hosted in GitHub. It's available if anyone wants to use it. This is all open-source software, but the actual templating of how this web page works is built off of a Jekyll static site builder that Dr. Sandberg built. So we had a pre-branded way to deploy this to our community. There are three essential aspects of this development kit. The first is a 3D camera. And this is where I get to do game development vocabulary. When we talk about a camera in game development, we're talking about a virtual camera. So this isn't a physical camera that's in the studio. It's an object that moves around your virtual scene and relates the virtual scene to your hardware display.
So when you're running around behind Mario, there's a camera following Mario as you run through the game, and that's what's relating the 3D world to your 2D monitor. But it's not a camera in physical form. We'll talk about two cameras within this presentation, so I just want to point out the two different paradigms there. Perspective cameras work a lot like cameras in physical 3D reality. When you're moving a camera around in space, you're using the view frustum of the camera to relate the 3D objects that are in front of you down to a 2D view that can be displayed on a 2D monitor. There are also orthographic cameras. We often refer to these as 2D games, because the relation of scale and perspective is locked. So if you think of the first Mario games, where he's side-scrolling left to right, the camera remains the same distance from Mario as he runs along; that's why he never changes size. These are kind of offhandedly referred to as 3D and 2D cameras; I just want to make sure we're all on the same page in how we're using those terms. Okay, so the Viz SDK's 3D camera system is the core aspect of the development kit. You can see early versions of it in the previous slides. It stitches together eight different camera views that are all children of the same parent, so when the parent moves around, all eight cameras move with it, which maintains a seamless view of the 3D space on our 2D monitors. It has quick and easy installation; it's literally a drag-and-drop tool. And it's adaptable: anything you want to do with a camera within Unity, you can do with this camera. It's not really that unique; it is just specced out and tested really resiliently, so it's very QA'd. We've vignetted each of these eight views to aid the blending of the projectors in the room, which makes the image look a lot better. We're also running it through a Panini projection, which is just fun to say: Panini projection.
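Since people sometimes ask what that remapping actually does, here's a rough sketch of the textbook Pannini formula in plain Python. This is illustrative only; the function and parameter names are mine, not the SDK's, and our actual shader implementation may differ.

```python
import math

def pannini(azimuth, elevation, d=1.0):
    """Map a view direction (in radians) to 2D image coordinates.

    d is the compression parameter: with d = 0 this reduces to an ordinary
    rectilinear (pinhole) projection, while larger d values tame the
    stretched, fisheye-like look of very wide rectilinear views.
    """
    s = (d + 1.0) / (d + math.cos(azimuth))  # radial scale factor
    return s * math.sin(azimuth), s * math.tan(elevation)
```

A handy sanity check: with d = 0 the horizontal coordinate collapses to tan(azimuth), the familiar pinhole projection.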
The Panini projection helps you avoid the fisheye effect that comes from having flat camera views on a circular wall; it's a little math trick. And something that's really cool about this, and about the Viz SDK broadly, and is probably relevant to folks who do not have a 360-degree theater they work in daily, is that these paradigms are translatable to really any immersive screen. It's very adjustable: you can adjust its field of view and the output screen resolutions to match the screen that you're working with. It's a really quick way to deploy to large screens, or to screens that are unique in that they're widescreen or wraparound or something like that. We also have a 2D camera. I don't make 2D games myself, so this is where we really leaned on the expertise of our student colleagues and their network of makers. The 2D camera system is relatively rudimentary. It works really similarly to how a 2D setup usually works in Unity, because we're using the wraparound screen as a flat surface. So it's a simple setup; again, it's a drag-and-drop tool, and it's adaptable to different 2D styles and screen relationships. There's a seam at the back of the Viz Studio where the desktop's left side meets the desktop's right side, and this template actually has a checkbox where you can make it continuous, so you don't perceive that seam, or you can turn the seam into a game element. We have templates that work both ways. Finally, this is, in my opinion, the most important part of the Viz SDK. It's not a tool that everyone will use in their development workflow, but it's really important as an accessibility tool: a 3D emulator of the space. One of the problems with having a bespoke one-of-one theater is that only one person can be using it to test at a time. This can really slow the deployment process and make us gatekeepers of the space and gatekeepers of development, which we really don't want to be.
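To make that "continuous" seam mode concrete before moving on: the idea is just to wrap the horizontal coordinate around the play field's width. This is a toy sketch in plain Python with made-up names, not the actual template code:

```python
def wrap_x(x, circumference):
    """Wrap a horizontal coordinate so the screen's left and right edges meet.

    In continuous mode, an object that moves past the right edge of the
    desktop re-enters from the left edge, so players never perceive the
    physical seam at the back of the room.
    """
    return x % circumference

# On a 360-unit-wide play field, a sprite at x = 370 renders at x = 10,
# just inside the left edge.
```

Leave the wrap off, and the seam itself becomes a game element instead.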
So the 3D emulator allows users, on their own device, to see how their current build will look in a virtual version of the Viz Studio and Viz Gallery, and it allows at-home development. We actually have a really cool project happening right now: the Video Game Development Club is running their yearly hackathon, with the Viz Studio as one of the spaces they're working in, and they have four different groups making games at one time over the course of April in this space. That would be impossible to do if they didn't have the ability to emulate the environment from home, and we've been really thankful that we were able to get as far with that as we did. This is inspired by XR and virtual reality products, where the device that you're developing on is very different from the device you're deploying to. So this was really inspired by Microsoft's tools for the HoloLens; Magic Leap had a really excellent emulator; and our friends at Looking Glass Factory really set a precedent for a lot of our SDK work. We're using video material texturing for this, if anybody's curious. So let's talk about the demos. Our students made an array of really cool applications of these tools. I'm going to kind of fly through these, but definitely check the slides out. You can really see Jaden's style in a lot of these demos, which is so cool. Hank's Haven is a suite of 2D demo games. It has a Flappy Bird-esque game, where you play Flappy Bird around the room essentially; a platformer called Herdlin Hank; and a multiplayer around-the-room scavenger hunt called Hank's Heist. It's really fun to watch middle school students come in and use the Viz Studio and play Hank's Heist, which is a scavenger hunt where you shout out when you've found something, and they run around the room and have permission to be loud and run around in that space. It seems like one of the most impactful demos we show.
We're also interested in backward compatibility, so we solicited demos from some of our previous collaborators of games they thought might be difficult to adapt to the space. One of those was Mason and the Elegy of Time, created by our very talented colleague, R.J. Washington. This is an adaptation of a previous, existing 2D game that was created for a game development class. It's very pretty. We were really interested in how animation would work, so it involves these big zoom-outs to show the scale of the space you're within, and that was pretty ideal for our use cases. We also had the opportunity to work on a really exciting physical project. We were contacted by members of Centennial Campus Real Estate, a campus development partner. They were building a rapid-prototype physical space on one of our campuses and wanted to understand the views that would be present if they oriented the construction in different aspects or rotations. So we built a full simulator of their build site for them in a matter of two weeks using the Viz SDK, and it actually impacted the development of the physical thing that now exists. It's very trippy to be working one day in a virtual scene in Blender and then, three weeks later, it's a physical thing that you can walk around in. Since I'm not an architect, that's very unfamiliar to me. Our students love going and hanging out in that spot, knowing that they impacted how it was built. That's called The Corner now. And then finally, the Aquarium is a study in screen space versus game space relationships, which we were looking at earlier, and it demonstrates a kind of ambient interactivity. We've already used this demonstration for our campus wellness weeks: we set up study room space within the Visualization Studio and ran this projection around the room. Anecdotally, students have talked about it as a relaxing, grounding wellness activity.
So when we build those user-centric experiential learning and teaching tools, we're aiming to have persistent accessibility for our new and existing audiences. So let's just talk, as I wrap up here, about our ongoing goals. This is a tool in active development; I would say we might even be picking up speed now that we have the groundwork down. You'll notice I talked only about the Unity engine. Unity, if anything, is losing market share; maybe somewhere around 40 to 60 percent of indie games are built with it. But its other big competitor is Unreal Engine, and given the city that we are in, we have a lot of interest in using Epic Games' tools, which have come a really long way in the last two years with the move to Unreal Engine 5. This is the first time we're saying this publicly, but we have full Unreal Engine 4 and 5 support at this time, so we can support projects incoming from either of the off-the-shelf game engines that our community uses. And I'll congratulate Jackson Bosch, who defended his master's thesis yesterday really capably. We're also aiming to extend the support of this toolkit to all of our spaces that it's relevant to. So here's a cool demo video where we have the Aquarium running in our teaching and visualization space, which is interesting because it's not a 360-degree space; we were able to adapt the camera array to suit this 217-degree space, which is actually a large monitor. That tool will be in the deployed SDK by the end of the month. We're interested in helping adapt to other spaces if folks have unique paradigms. We're also looking to do more QA, have more users, and have a broader release. As I was mentioning earlier, there's an ongoing library hackathon happening with those video game developers right now.
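To give a sense of how the same camera-array idea carries over to partial wraparound screens, here's a tiny sketch in plain Python. The 360-degree, eight-camera arithmetic for the Viz Studio rig is from the talk; the function name and the seven-camera split shown for a 217-degree arc are just illustrative assumptions, not how the SDK actually divides that space.

```python
def camera_arc(total_arc_deg, num_cameras):
    """Split a horizontal arc of screen into equal per-camera views.

    Returns the horizontal field of view per camera and the yaw of each
    camera's center, measured from the left edge of the arc.
    """
    fov = total_arc_deg / num_cameras
    yaws = [(i + 0.5) * fov for i in range(num_cameras)]
    return fov, yaws

# The Viz Studio's full circle: eight cameras at 45 degrees each.
studio_fov, studio_yaws = camera_arc(360, 8)
# A hypothetical split of a 217-degree space across seven cameras.
partial_fov, partial_yaws = camera_arc(217, 7)
```

Because the cameras are children of one parent transform, rotating the parent slides every view around the arc together, so the stitched image stays seamless.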
And I wish I could use some of the demo videos of their development that I see popping up on Discord: things like a game controller's gyroscope controlling where in the room your avatar is looking, and room-scale multiplayer games. We're also really interested in building prefab interactions, like buttons in the virtual environment, and ways of relating to physical hardware nimbly. So this was a story about building community-centered creative tools to enhance our educational spaces, encouraging outcome-focused experimentation, the role of students on collaborative teams integrating with prioritized projects, and long-term support for our new spaces, tools, and services to empower our users as much as possible. So thank you all for your curiosity. I'm happy to take any questions; I think we've got a little bit of time here. The Viz SDK is available at the link on the screen right now if you'd like to check it out, and as I mentioned earlier, our slides are available at Viz SDK-CNI. If anyone would like to get in touch with me, I'm at go.ncsu.edu slash Colin, and all my professional contact information's there. And then I'm on Discord about 24 hours a day, so you're welcome to find me there, too. Thanks, y'all.

So I know this isn't directly related to the toolkit, but I'm curious about scheduling and staffing. Do you have priorities for scheduling? I imagine you are, if not now, soon going to be overwhelmed with requests for classes and other events taking place in this space. Do you always have to have a staff person present? How are you working all that out?

Yeah, that's a great question, and I appreciated your shout-out to our concierge services in your chat yesterday. We utilize that same approach for our high-tech spaces. So there's a public interest form indicating your interest in using any of our spaces that are liaised by the high-tech staff. Any of the other spaces I've shown are staffed, with open hours that are found on the web.
But for the high-tech spaces, we have them meet with a staff member and do a series of consultations. Through those consultations, we build out the schedule about six months ahead of time, with research and instruction being our most prioritized and most time-sensitive uses. They have lock-in for their curricular instruction times, so we schedule them first, and then as the semester goes on, we build out the rest of the schedule with folks doing testing and experimentation.

So this is another nuts-and-bolts question. For your visualization space, the infrastructure support and the refresh of equipment, how's that handled?

Yeah, some of those collaborators we had on that high-tech spaces team are AV integrators. Kante Farah and Ryan Hunter are really essential members of the high-tech spaces team because they're in the nuts and bolts of how cables are getting plugged in and where server racks are going, and we rely on their expertise quite a bit. For this project, we also had a relationship with Panasonic for contracting some of the alignment of the projectors, but that's now all maintained in-house, except for emergency situations, things like a projector coming drastically out of alignment. Having the ability to do it in-house really speeds us up and makes us a lot more resilient and responsive to user needs, for sure. Anybody else? I really appreciate you all being with me today. Yeah, please.

Thank you so much for that amazing presentation. I've been a part of XR initiatives at two different universities, and a lot of the questions I get is, why XR in a library? Why isn't it in the robotics studio, or in a media arts space, or in ITS? So I guess, how do you field that question: why XR in a library?

Yeah, that's a great question. I think four or five years ago, when we were justifying why we do XR in our building and why it's an organizational priority, we were kind of gesturing toward projects like this one, not really knowing what the horizon looked like.
XR is maybe not, in its totality, the future of computing; I'm not totally convinced all personal computing is going to look like XR in the future. But it will be an essential vertical for being multi-device, and accessibility features are especially at the bleeding edge in XR: the ability to understand the human frame, to understand physical actions and how they relate to computer environments. These are aspects of that work that we took from that understanding and applied here. It's no coincidence that half the people who are involved in this project think of themselves as XR developers, but we didn't use anything that you'd find in the VR section of Amazon in this project, necessarily. So we took a lot of those lessons forward.

Thank you.

Cool. Well, I appreciate you all. Enjoy the rest of your day.