So thank you all for being here today. What I'm going to talk about is three projects I've worked on over the last six months that demonstrate how typography can be integrated across immersive, mixed, and virtual environments. And I want to start with a disclaimer that this is far outside of my comfort zone. This isn't something I've been doing for years. But I think that's kind of a jumping-off point: a lot of people just need to dive in and do something. I'm also interested to hear from many of you about how you might use variable typefaces and work multiple languages into designs, especially in these virtual environments. Let me know.

Essentially, when you're working in virtual spaces, the rules of typography change. For instance, ideal line length is shorter than for print applications. And because there's more intensity of visual information in VR environments, less complex typefaces and heavier weights are generally more successful, particularly if there isn't a fixed background, as in mixed reality.

Additionally, just as when you're designing for websites or apps, motion is a primary design principle when considering AR and VR spaces. Google's Material Design does a lovely job of explaining motion for UI, but their focus is mostly on app development. They do provide some guidance for the Cardboard VR headset, but it's mostly about physiological considerations, such as how to help avoid causing motion sickness, which is good. As Google puts it, motion sickness is usually caused by the disparity between what one feels and what one expects to feel. They give the example of how many automobile passengers become carsick if they are looking out of the windows of the vehicle, yet drivers rarely get carsick because they can anticipate the sensations of movement before they happen. Anyway, I just wanted to introduce a few of the basic things I had to consider when creating the projects I'm going to discuss.

The first work I'm going to show is a virtual reality interpretation of Denise Duhamel's piece Mobius Strip: Forgetfulness, which is a non-linear poem about Alzheimer's disease. In her original text, which is shown on the right and bottom here, Denise tells the reader to create their own Mobius strip, or infinity loop, by xeroxing two copies of the poem, taping them back to back, and then connecting them in a twisted circle, which allows the viewer to begin at any point they would like. For the VR version of this project, we created a giant Mobius strip that viewers could walk through. We also gave the users wristbands that monitored heart rate, perspiration, and location, which both allowed them to be aware of other people in the VR space and gave us feedback on which parts of the poem might cause an emotional response and in which areas people spent the most time.

My role was to create the typographic videos. Zach Duer created the Unity model, and Ico Bukvic created the sound pieces for this exhibition, which you will not be hearing. For the videos, I arranged the lines of text irregularly to make each block of type a little more related to the lines next to it. Being a non-linear piece, I wanted to create an environment in which a user could read the poem all the way through, but I also wanted to create serendipitous misreadings of the text. So the videos had very slow-moving distortions of the type, mimicking the same sorts of confusions one might have with Alzheimer's disease.
And each animation looped back and forth and faded out to black, but in such a way that some text was always present. I had to pace my animations more slowly than one typically would, because when you're in a virtual environment, there's so much more to absorb that motion feels much quicker. Additionally, we rendered videos on both sides of the strip, creating a magical environment where, if you walked through the wall of the strip, you could look back at a different part of the poem. And as people moved through the space, they would hear abstract sounds that corresponded to the content of the poem at that particular location. Next, I'll show an Instagram promo video I made and another fly-through video, during which I'll read just a few snippets from the poem's text.

She tries to pee in the trash can, but misses. She puts her socks over her shoes. She wears all of her necklaces at once. She has another bruise. She hides her rings in the toaster slots. She shoves the toast in the VCR. She holds the Chili's menu upside down. She tries to eat an acorn. She likes to eat canned frosting. She is small and curled upon the bed. She can't smell anything good or bad. She uses her lipstick as eye shadow. She loses her favorite crucifix. She carries an empty pocketbook. She doesn't recognize the pudding. She doesn't recognize her son. She wears a baseball cap she stole from another Alzheimer's patient. She turns on the stove to watch the flames. She wears her sweater as a hat. She turns on the stove to touch the blue flowers. She tries to eat a checker, then a domino.

And what you see here in this rendering of the VR exhibition are the avatars of other users and the colorful trails they leave behind them to show where they've gone in the space. Although it was really strange when I was using this, because the other person didn't realize that the blob was me and they kept trying to touch it, so I was running through the space trying to avoid them.

The Forgetfulness project was designed specifically for the Cube at Virginia Tech, which is a four-story black-box theater. The Cube itself was rendered in the virtual environment that I just showed you, which kept people from walking into walls and such, because the actual wall was in the same place as the virtual one. The next piece that I'm going to show was also designed for this space, but as an immersive theatrical experience. It's called Shakespeare's Garden, an immersive sound stroll through his sonnets, soliloquies, and scenes, in which a team of designers and performing artists created a typographic experience that utilized spatial audio and flexible video projection.

My role was again to create a series of motion graphics alongside installations of audio recordings, which utilized both spatial audio, meaning sound all around you, and more localized, directional audio. Each piece uses the text from selected Shakespearean works, including passages from A Midsummer Night's Dream, As You Like It, and The Merchant of Venice, as well as several sonnets, with the goal of creating engaging visuals, which were projected onto a series of hanging scrims. In this environment, actors were replaced by the directional sound recordings of their voices, and the recordings of Shakespearean works were also complemented by the processed sounds of the garden, forming soundscapes throughout the space. We'll see if this plays, JB. Thanks. "Over hill, over dale, thorough bush, thorough brier, over park, over pale, thorough flood, thorough fire."
So during performances, the audience members follow this meandering path through the installation, actively engaged in their own exploration of Shakespeare's garden. Part of my research was to look back at Shakespearean texts and also to think about the type itself. In spite of tending toward sans serif typefaces in much of my work, I chose a typeface that would have been in use during Shakespeare's lifetime, even if the Adobe version I used would not have been. Garamond was designed around 1540 (I hope that's right with this audience), and its organic forms are also connected to the natural sounds used in the auditory compositions.

We also had to figure out a layout for our installation. We had six hanging scrims, each eight feet wide by twenty feet tall, and the positioning of these around the sound stations in the Cube theater was directly influenced by English Renaissance gardens, specifically André Mollet's design, which is seen here. It pays to have friends who are art historians.

The Cube itself has 360-degree audio, with 124 standard speakers, four subwoofers, and nine additional speakers that project hyper-targeted sound, like the aural equivalent of a spotlight. This enabled extreme control over the auditory experience. The Shakespearean recordings were played over these directional speakers, meaning there was about a six-foot area in which the voices could be heard. If you stepped outside of this zone, you could no longer hear the performers. Lighting cues hinted to viewers where these auditory stations occurred (they're the circles, if you're looking at the diagram here). Seen here are some of the student actors recording; note that many of the gender roles were flipped.

In addition to the recording stations, there were garden soundscapes that could be heard throughout the entire theater, as they played across the 124 standard speakers. This setup allowed Nichols to compose his sound piece in regions. For example, the bird sounds would come from higher-level speakers, while water sounds and such came from lower down. Both the audio and video transitioned through the seasons as you moved through the space. Seen here is some of the source video I shot, blurrily.

The first selections of Shakespeare's work are set in early summer, with themes of blossoming new love. As such, the first scrims were warmer toned, and the corresponding audio pulls in the sounds of summer. The next pair is a little cooler, with the text emulating rain, which is reinforced by the surrounding water sounds. And the final set is even cooler toned, for winter, and the text flows on and off the scrims, mimicking the wind, and moves from high to low to mimic the balcony. So while the final compositions are abstract, they're all built from collages of images and video that hold meaning, such as relevant flora and natural scenes, all from photos that I've taken myself.

Beyond this installation, I also created a program and a poster in which the negative space is meant to represent the meandering path that viewers took through the exhibition. And I found a way to translate this piece into a digital poster for use outside the theater and on social media. I really like this idea of subtle motion, where it doesn't necessarily have to be screaming for attention but can be ambient or used to capture a feeling. So even though immersive theater is not new, technological developments make this an ever-evolving field.
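To give a rough sense of how the six-foot station zones and the high-versus-low soundscape routing described above might work, here is a minimal sketch in Python. The station names, positions, radii, and height thresholds are hypothetical placeholders for illustration, not the Cube's actual spatial audio system.

```python
# Hypothetical sketch: each recording station is audible only within a small
# radius of its floor position, while ambient garden layers are routed to
# speakers by height (birds high, water low). Values are illustrative only.

from dataclasses import dataclass
from math import dist

@dataclass
class Station:
    name: str         # e.g. a sonnet or scene
    position: tuple   # (x, y) floor position in feet
    radius: float     # roughly 3 ft for a 6 ft audible zone

STATIONS = [
    Station("Sonnet 18", (10.0, 12.0), 3.0),
    Station("Midsummer - Puck", (24.0, 30.0), 3.0),
]

def audible_stations(listener_xy, stations=STATIONS):
    """Return the stations whose zone the listener is currently inside."""
    return [s for s in stations if dist(listener_xy, s.position) <= s.radius]

def plays_on_speaker(layer, speaker_height_ft):
    """Route a soundscape layer to high or low speakers by its content."""
    if layer == "birds":
        return speaker_height_ft > 15.0   # upper-level speakers only
    if layer == "water":
        return speaker_height_ft < 8.0    # floor-level speakers only
    return True                           # other layers play everywhere

if __name__ == "__main__":
    print([s.name for s in audible_stations((11.0, 13.5))])  # inside the Sonnet 18 zone
    print(plays_on_speaker("birds", 20.0), plays_on_speaker("water", 20.0))
```

The point of the sketch is simply that stepping outside a station's radius silences the voices, while the composed regions come from gating each ambient layer by speaker height.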
And while we created this project, we wondered: How will audiences react to a sound stroll? How will participants engage with their fellow audience members during this experience? How do spatial and aural environments affect the way in which an audience responds to and engages with Shakespeare? Will they hear his words more fully, reaching a higher level of understanding?

Studies show that in our era of smartphones and constant internet access, audiences have significantly reduced attention spans. So thank you all for being here, especially on the last day of a conference. In addition, it takes the typical listener approximately 10 to 15 minutes for their ear to become attuned to the archaic anomalies of Shakespearean language. As such, our team wanted to find a way to engage viewers on a multimodal level as their ears adjusted. Shakespeare can be a bit overwhelming. So our team helped present visual themes and pull out phrases via this typographic projection, which helped illuminate patterns within the text.

This project was intended to be a pilot for future experimentation in immersive theater. We're currently working with Virginia Tech student Dylan Q. Tehr, a virtual and augmented reality developer, on the next incarnations of this work. He's recreated a version using Unity for use with the Oculus Rift, but it really isn't done justice in this sort of presentation. I think of it kind of like the difference between early Simpsons and the Simpsons today: you can describe them both and they pretty much sound the same, but they're actually quite different. And I'm not Oprah, so I couldn't just bring 400 Oculus Rifts and be like, you get one, but...

Anyway, we're still playing with elements like opacity, ambient light, and that sort of thing, and Dylan's also set up a way where you can turn different aspects on and off. This is the sort of thing where I think variable typefaces could come into play, or working with something like Noto and all the different languages, though I have no idea how to check that, because I'm terrible at that. But even things where you could have an object representing a program and deal with it in this space. We struggled some with the VR aspects, considering we thought about VR after the project rather than during its creation or before. For instance, if I were to keep working on this, I wouldn't have just rectangles; I would have the text continue to float through the air with transparent backgrounds. We're either going to take this a step further, or we're going to implement some of these ideas for a project we're working on with Edgar Allan Poe.

The last project I'll show is one in real life, or IRL. It's called FutureHAUS, which is a prototype for a smart home, and it will be competing in the Solar Decathlon in Dubai this fall as the only US team. The team has done VR walkthroughs for it, and they've been exploring mixed reality applications, such as being able to test home features as a virtual overlay on top of your existing structure. But the aspect I want to bring up today is the concept of integrated reality systems, in which apps, touch panels, and smart devices work in tandem with mixed and virtual reality features. The FutureHAUS is a connected home that you can control using voice, gesture, or touch, or through an app, and there are also wall displays in each room allowing users to control the home.
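As a rough illustration of that integrated-control idea (one shared home state that voice, gesture, touch panels, and the app all feed the same commands into, so every display stays in sync), here is a minimal Python sketch. The device names and structure are hypothetical, not the actual FutureHAUS software.

```python
# Hypothetical sketch: a single source of truth for the home, where any input
# modality dispatches the same command and every connected display is notified.

class HomeState:
    def __init__(self):
        self.state = {"living_room.lights": "off", "thermostat.setpoint_f": 72}
        self.listeners = []   # wall displays, app screens, smart mirror, ...

    def dispatch(self, device, value):
        """Apply a command and notify every connected display."""
        self.state[device] = value
        for notify in self.listeners:
            notify(device, value)

home = HomeState()
home.listeners.append(lambda d, v: print(f"wall display updates {d} -> {v}"))
home.listeners.append(lambda d, v: print(f"phone app updates    {d} -> {v}"))

# The same command, no matter which modality produced it:
home.dispatch("living_room.lights", "on")     # e.g. from a voice request
home.dispatch("thermostat.setpoint_f", 70)    # e.g. from a touch panel
```

The design point is that the interface work described next (type, color, iconography) sits on top of one shared model rather than separate per-device logic.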
For example, there's a smart table in the middle here where you could pull up a recipe and then fling it to the backsplash, so it would be up there while you're cooking. Where I and typography come into play is in developing the visual interface design for the home, so when you have smart mirrors in the bathroom and such, they won't just have the default visuals. Right now I'm working with four students on creating an app to control the home, but we're also addressing the built-in displays and touch panels in each room, meaning our type treatments, color systems, and style guides must be flexible enough for a wide variety of applications but still have a sense of cohesion.

In addition to dealing with the general look of things, we're tackling how inactive and active states might differ and how to handle the structure of a really complex app. Since users would use their home so regularly, we don't want something overly busy, but we do want to integrate small moments of joy and some subtle animations, so when you adjust the thermostat, the colors might shift. In addition, the home is supposed to help you monitor energy and water usage, and it can also help you with your health, as there are built-in heart-rate and weight sensors throughout the home.

Translating these features across all sorts of screens can be difficult, but we're finding that consistent application of type, color, and icons is helping. One of the students, Izzy, has been working on developing this icon system, and while it might look similar to things you've seen out there, you'll note there are things like a drone delivering a package, which isn't really a default in most existing icon sets. There are also some cultural sensitivities and site-specific considerations. One of the features of the home is that the bathroom can be converted into a foot-washing station to help people prepare for prayer, so if we're having an icon represent the prayer setting, it wouldn't be the typical Christian prayer hands you might think of, because the hand position when going to pray is more like this. So a lot of the customization is out of respect for the culture, but also for the site-specific applications.

Unfortunately, this project is still being developed, so I don't have our final integrated system to show, but if you have any suggestions for me later, I would love to hear them. Many thanks to Taipan for having me, and to Virginia Tech, the Institute for Creativity, Arts, and Technology, and the Center for Human-Computer Interaction for all of their support. And most of all, thank you all for being here with me today, and my dad for driving down from Ashland. Hi, Mr. Dean.
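To make the "flexible but cohesive" design-system idea from the interface discussion above a bit more concrete, here is a minimal Python sketch of shared tokens resolved per surface, including an inactive state and the thermostat color shift. All names, sizes, and colors are illustrative placeholders, not the project's actual style guide.

```python
# Hypothetical sketch: one set of type, color, and state tokens, adapted per
# surface (phone app, wall panel, smart mirror) so every screen stays cohesive.

TOKENS = {
    "type.body.family": "Noto Sans",
    "type.body.size_pt": {"app": 16, "wall_panel": 22, "mirror": 28},
    "color.accent.cool": "#4A7FB5",     # e.g. thermostat set low
    "color.accent.warm": "#D97B4F",     # e.g. thermostat set high
    "state.inactive.opacity": 0.45,
}

def style_for(surface, active=True):
    """Resolve the shared tokens for one surface, dimming inactive controls."""
    return {
        "font_family": TOKENS["type.body.family"],
        "font_size_pt": TOKENS["type.body.size_pt"][surface],
        "opacity": 1.0 if active else TOKENS["state.inactive.opacity"],
    }

def thermostat_accent(setpoint_f):
    """Shift the accent color as the thermostat is adjusted."""
    return TOKENS["color.accent.warm"] if setpoint_f >= 72 else TOKENS["color.accent.cool"]

print(style_for("wall_panel", active=False))
print(thermostat_accent(68))
```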