everything's still working. We are going live on YouTube. All right, let me just switch screens. There we go. It's being placed. Oh, and one thing later, if it's not already in there, let's make sure the link to the slides is included on the YouTube stream. Okay, I will try to get there. You bet. All right, so we are live on YouTube, and I'll welcome everybody to our second workshop of the OSCC Community Conference. OSCC means OpenSimulator Community Conference. And today we have a presentation by David Flesen, who also works with AvaCon, the group that helped organize the conference. And David is going to talk to us today in a workshop titled Walking the Dead. Just a couple of ground rules here in order to help prevent any Zoom bombing: I'll ask people who are in the Zoom meeting to use the raise-hand tool, and I'll use that to give you permission to speak and turn on your web camera. And with that, I think, David, the floor is yours. Hey, thanks. Hi, thanks for coming. First off, let's talk a little bit about ground rules so you know what's going on before we start the workshop. This is the website, which I think I'm sharing on my screen, that you went to when you signed up to attend. And if you noticed, in here there is a download-slides link. I would request, if possible, that you go there and download the slides. This way you have something to walk away with, and it'll give you the information that we covered during the presentation. Some of it is also exploratory: there are links in there if you want to go and learn more about things. So that's enough on that. Let's go to the slideshow and switch devices. Okay. All right, so Walking the Dead: using motion capture to reanimate life. That's what we're going to talk about today. So I do work in animation; I'm an animator in the virtual world space.
I figured with everything that's going on with the metaverse, there are a lot of technologies right now that are good to look at. Motion capture has really taken off in some exciting ways. So I wanted to share that, and share some resources and some thoughts on animation as well. So first off, it all starts with a thing called computer vision. There are two essential technologies, as you see listed here: deep learning and convolutional neural networks. They basically shape this. If we take a look at it, basically AI enables computers to think, and computer vision enables them to see, observe, and understand. So it's a matter of the computer being able to look at something, understand what it means, and make decisions or recommendations off of that. So as you see here, IBM defines computer vision as a field of AI, or artificial intelligence, that enables computers and systems to derive meaningful information from digital images, videos, and other visual inputs, and then to take actions or make recommendations based on that information. So one of the findings: I always look for things that can be useful to a community that has a very open-source focus. So I try to look for open-source tools. Sometimes that's easier to do than others. In this case, this one I thought was open source, but it's actually just free to use right now. I believe this is from a student; while he was working on his degree, he put together some technology that he's still developing. So it's at a developmental stage, but it's pretty good, it shows a lot of promise, and it shows a bit of where things are moving in the future. We just talked about computer vision. I mean, we see computer vision all around us. If you sit here and you look at your phone: I go to my iPhone and I'm sitting here texting somebody, and guess what? I can look like a fox. I can put a fox over my face, and its lips move instead of mine.
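To make the convolutional-neural-network idea mentioned above concrete, here is a minimal pure-Python sketch of the convolution operation a CNN layer applies to an image. A real network stacks many of these filters and learns their weights from data; this hand-written edge-detecting kernel is just an illustration, not anything from the tools discussed in the talk.

```python
# Minimal sketch of the 2D convolution at the heart of a CNN layer.
# Pure Python, no frameworks.

def convolve2d(image, kernel):
    """Slide `kernel` over `image` (lists of lists of numbers) and
    return the valid-mode feature map of dot products."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            acc = 0.0
            for dy in range(kh):
                for dx in range(kw):
                    acc += image[y + dy][x + dx] * kernel[dy][dx]
            row.append(acc)
        out.append(row)
    return out

# A vertical-edge filter applied to a tiny image whose left half is
# dark (0) and right half is bright (1): the filter responds strongly
# exactly where the brightness changes.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
edge_kernel = [
    [-1, 1],
    [-1, 1],
]
feature_map = convolve2d(image, edge_kernel)
```

The strong response in the middle column of `feature_map` is the "seeing an edge" signal; a trained network learns thousands of such filters for eyes, mouths, limbs, and so on.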
It's all synced up to me because it looks at my mouth, my eyes, my nose, the shape of my face, my chin. It's able to correlate all that data and understand the meaning of what those things represent. So that's one way computer vision is working and how it relates to motion capture. So in this case, MoveMe: what this gentleman has done is make it so that you can take regular, commonplace video and turn it into motion capture. That's really impressive. So if you play around with it, it doesn't quite work the way you want at first. Realize, though, this is quite a leap forward. So even if it's not 100% yet, it shows a lot of potential. Now here we get some instructions, and I will tell you, if you see this ninja here, that's not the way to dress when you do it. Because the ninja is dressed in black, and this doesn't really care for black or white too much, or even probably shades of gray. It's more into colorful types of things that make you stand out from the background. And I have some steps here which you can review. Like I said, this is all in the slides, which you can download from the website that you used when you registered for the conference, the conference.opensimulator.org page, and it gives you all the steps right here. For one thing, you have to stay in frame, and you have to be fully in frame when you start, in a standing position. So it does have restrictions. I also think it was maybe 15 seconds' worth; you can't go much longer than that. So it's got limitations. But then again, he's kind of teased us with a free-to-use version, and later hopefully it'll become much more powerful when he's charging for the service. Excuse me, Dave. I neglected to start the Zoom recording. So please forgive me. There'll be a little announcement when that starts. Okay. So should I start over again now? No, I think please proceed. That should be fine.
No, I mean, if you need to get the whole thing, I'll go back. I don't mind. I think, no, let's keep going with your schedule. We can cover it later. Okay. All right. So we did have it the whole YouTube time, just not the whole Zoom recording time, right? Correct. Okay, good. Okay. So to show you a little bit about movement, you can see me here in my M&M pajama bottoms and green shirt. This is what the tool he put together looks like. Right now you can go to the site; let me get you the web address again. I've got it on there. You can see it here on the bottom: it's getmoveme.com. Okay. So if you want to test it yourself, you can. That's part of what a workshop is: you're trying it out too. So you can see here, this goes step by step. It's pretty straightforward, but I'll just go over it here, and then I'm actually going to demo it for you, partially anyway, because it does take a while to actually run the capture in a meaningful way. So we start off by just going to Add Media in the File menu. And then we come down here to where the number two is, and we're going to set what we want to capture. Do we want to capture the body? Do we want to capture the facial movements? You can do one or the other, or both. What quality do we want: low, medium, or high? And it's kind of like with virtual world browsers: going high isn't always the best answer. You've got to test it and see what works best. And then for number three here, we click this button that says Capture, and that brings up a capture window. Then we have to click Capture a second time, and then it'll start playing here, and it'll do things in passes. So it's going to capture maybe all the facial movements, and it'll go all the way through that and maybe get to about halfway. And then it'll go through and do all the body movements.
So it's actually using that computer vision to be able to tell where the head is, where the body is, where the torso, where the legs are, the hands, and be able to correlate that to the rig. And then once we're done with this, and it's captured everything, then we have an option to almost like running a server. We're going to start this thing up to activate render view. So we're starting up a service, and then we can play the animation on that service. But when you're done with it, you want to deactivate the render view because that's taking up your resources. And then you don't have to play it back, but if you play it back, you'll see it in this side here. So it'll show you what it's captured. You can compare it to what it was in the original. Then we come over back up to the top where we started for one. And right next to it, we have one for export. And then this window will pop up that we just basically choose the file name and the export location. And when we're all done, it'll say motion data is successfully exported. Now, I will warn you, when I was doing this on my computer, it put this window behind the screen that I was using for MoveMe. So I had to minimize this to be able to see that. But again, it's early development of the software. There's a number of file formats. I know FBX is one. That's the one I choose to use the most. If you're looking for its use in places like OpenSim and Second Life, it doesn't export out to BVH because that's a very specific format to those worlds and not as much into what most other applications would use. Yeah, thanks, Dave. Mike Higgins asked what file format does it export to? Yeah, FBX. So we have that there. And then I'm going to give you a demonstration of this. Let's go to MoveMe. Okay. And here's the MoveMe screen. All right. So what I do is, again, just like I did on the screen for you, I'm going to show you inside of MoveMe. We're going to add the media. 
And I did it in black and white here, but I want to do it in a color version. So I'm using my color version that you all see me in, in my M&M pajamas. So here I am in my backyard. I'm doing a thing here that I play with my dog. I've got a little Yorkie, Honey Bear, and she loves to be chased around the yard. And what I do is this thing, "zombie needs puppy brains," and she just loves it so much. She starts running all around, and she gets behind me. And then I do that thing kind of like in, if you remember, Monsters, Inc.: where did she go? I don't see her anywhere. So, I tried it with her in the scene, and I tried it with her out of the scene. I'm not exactly sure, but I think it's better for the technology to restrict it to just the one actor, so that you don't confuse the optics. So as you can see here, I'm playing back my motion of the zombie looking for the puppy. Okay. I won't show you everything, but there you go, just to show you a little bit of it. So the next thing I do is I come here and I can say body, and I can also capture the face. And you see, when I press face, it added the facial part of the rig as well. And then I have accuracy; it starts on low, but I can kick it up to high. Well, actually, I'm going to try low this time to see what happens. Maybe it'll rig it real quick. You can do custom capturing, but I'm going to leave that be. And then I come here to Capture, and you have to press the Capture button on this one as well. Now it's loading the materials. That's going a lot faster because I went with low. So you can see how it's capturing it. It says at the bottom here, face capture. So it's going to go through, and I've got a status bar that shows me as it goes all the way through. When I actually recorded this thing and did the capture for the file, it probably took me an hour and a half for what was a little less than a 15-second clip, a 14-second clip. Okay.
But when it's all the way done, if I let it play out all the way (for time's sake, I'm not going to do that with y'all), right here you would see the animation, and you would activate your render view and then play your animation, and it would play out on this right-hand side here. So remember, if you activate it, it's always best to turn it off afterward. It's almost like a server. You know, like when you stand up an OpenSim grid, you want to turn it off when you're done using it. Same kind of thing. So when I'm all done, I can just export; I don't even have to preview it like I did. I could have just gone here to Export. You go to Export, and it'll ask you what kind of armature: if you click here, you have a metarig, you have a basic human rig, and then you have a human rig with head controls. Because I did the facial capture, I'm going to go with that one, and I just give it a location and export. Okay. Any questions on that before I go on to the next section? No, nothing in chat yet. So I think it's pretty clear. Okay. It's pretty self-explanatory. The only thing is it's hard to get the results you're after, but like I said, early days; I think it has a lot of potential with movement. Okay. So let's go to the next one. All right. This is Mixamo. Oh, Dave, here's a question. Excuse me, Mike had a question, actually, and there are a few questions in chat. From Mike Higgins: what bone name standard does it use? Yeah, as far as the bone names, I don't know; I didn't look at it that closely. But I will be bringing it up on screen for you, so you'll be able to see the bone names when I'm importing it. Okay. As for general noise, I didn't really notice. I wasn't quite getting the results I wanted yet, so it wasn't an exact replication.
It was close, but not quite exact. The one problem I did see with it (and you'll see a little bit as I play the animation for you as I'm importing it) is that it doesn't move the avatar, the character, in the scene. It just has the movements, but it doesn't have them physically moving within the 3D space. So you'd have to animate that yourself, I think, or that might just be a thing that's still being developed. Okay. So the second one I wanted to show you is Mixamo. This is a company that was separate from Adobe at first, and Adobe saw it and liked what they saw, so they bought it up. They broke up the pieces; some of them are available, some aren't. I mean, they used to have Fuse, which was really cool too, but now it's very limited. But now Adobe is going in different directions with Mixamo, and it's integrated with Aero, as you'll see in a minute here as well, which is their AR app. But the nice thing with this is it gives you a motion capture library along with 3D characters that you can use. Now, you'll need to check their rights to see if it's okay to use it that way for your application, obviously. But it's really nice. And I think it's still free to sign up for an Adobe account. If somebody knows different, please type it in the local chat. But I think if you don't have an Adobe account already, you can probably sign up for one for free and still be able to use Mixamo, because this is a free service, not a paid service. Personally, I pay for Adobe every month, so I've got it under that. But let me show you what this is like. Oh, and you'll see here that these are the formats this exports to, FBX and Collada, so you can export the models out if you want to use them for OpenSimulator or Second Life as well. Okay, so let's go over here to Mixamo. So I'm in here and I've got my doctor, but you know, I don't know if I want to use the doctor. So up at the top here, you have Characters and you have Animations.
So I'm on the Mixamo site now, and this is just mixamo.com. I can do a search here for the type of character I want to use. In this case, well, why don't we do a zombie, right? Since we're talking zombies. They've got all kinds of horrible-looking zombies. Let's try the Girl Scout. This one's hilarious. It's not the same kind of Girl Scout that they had when I was a kid. All right, look at that. She's even got a baseball bat. Slugger. Okay. So what we're going to do is animate this. So we go up here, and we have Characters, and we have another one here for Animations. I typed in "zombie" already; I've got it in there. But you can also, as you notice here, restrict it if you want to, to only combat ones, adventure, whatever. You can do that as well. Let's say we have her do a zombie scream. Okay. So there she is doing her zombie scream. Now, left and right: I can flip it, because some of these are side-specific. So let's say we want a left-handed person instead of a right-handed person. I can click on this thing for mirror, and it'll flip it over to the other side. You don't notice it as much with this one, though, because both hands are doing almost the same thing. You have trim here, where you can change the length of it, where its starting point and ending point are. What's really nice is, whenever you get an animation, if possible, get the first frame and the last frame the same. Otherwise, you'll have to weave that in later if you want it to be seamless when you loop it. You have character arm spacing. So if I want her arms spaced way out, I can do that. And that's very important, because with some of these characters, you know, here we have a little girl, but let's say it was an older woman with rather large breasts. Well, we don't want the arms going through the breasts when she's doing the zombie scream. So you might have to change the spacing for that.
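The looping tip above (make the last frame match the first, or the playback jumps at the seam) can also be done in code after the fact, by cross-fading the final few frames toward frame 0. A minimal sketch, treating each frame as a flat list of channel values such as joint rotations:

```python
# Force an animation to loop seamlessly by blending its tail toward
# the first frame. `frames` is a list of equal-length value lists.

def make_loopable(frames, blend=3):
    """Return a copy of `frames` whose last `blend` frames are
    linearly interpolated toward the first frame, so that
    out[-1] == out[0]."""
    out = [list(f) for f in frames]
    first = out[0]
    n = len(out)
    for i in range(blend):
        idx = n - blend + i
        t = (i + 1) / blend  # ramps from 1/blend up to 1.0
        out[idx] = [a + (b - a) * t for a, b in zip(out[idx], first)]
    return out

# One channel ramping 0 -> 30; after blending, the tail returns to 0.
frames = [[0.0], [10.0], [20.0], [30.0]]
looped = make_loopable(frames, blend=2)
```

This is the programmatic version of the "weave that in later" step the talk mentions; trimming so the first and last frames already match is cleaner when the source motion allows it.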
And then the overdrive is the speed of it, if you want it very fast, or maybe she's just a very slow zombie. Okay. So you can set things up just the way you want with this. But let's say you have your own character you want to bring in. You've got an option right here, underneath the download button: Upload Character. And it'll actually go through and allow you to get your character rigged by Mixamo. That's a really awesome thing too. It takes a little bit to get used to; you're basically just going to place points for different parts of the head and the body, things like that. And it takes a little bit of experimentation to get it to work right. And the other one here is if you ever want to take your stuff and also use it in augmented reality: Send to Aero. It'll send it to the Aero app, and then you can use Adobe Aero to output it in augmented reality, on things like smartphones and tablets. So when I've finally got her just the way I want, I just hit download and pick the format. In this case, I'd probably go for FBX binary. I could do that, or Unity; they have one for that as well. They also have FBX ASCII, so you've got a couple of different versions of FBX, as well as Collada. Collada obviously would give you something you could use in OpenSimulator and Second Life. But it won't export. I don't think... does Collada have animations? Well, it looks like it. Well, no, let's try this once, I'm a little curious. I can't get it to select Collada. I guess it's not eligible for Collada because I have the animation in here. I'm guessing that Collada here is just the mesh without the animation. And then the skin: do you want it to have the avatar or not? I usually like to bring it in, because I can always take away the avatar afterwards; at least that way I can see it. And then uniform or none; I just leave these on the defaults.
Frames per second: basically, 24 is the film standard, 25 is what they use in Europe for broadcast, and 30, or really 29.97, is what we use for broadcast quality in the United States. And then 60 is what they tend to use for VR. So it depends on what you're using it for, but I'm just going to leave it on the defaults, and then I'd hit the download link. So that's how Mixamo works. All right. So, since I'm going to go in and show you how to animate zombies, I thought it'd be kind of cool to show you what I do to make zombies in the first place, you know, because that's kind of the secret sauce, so to speak. So what I'm going to do is show you inside of here. So this guy here, Alessandro, I think it is, was actually a free asset that they were providing to their customers. But I took him and I turned him into this guy here, my little zombie guy. Okay. I'm going to show you a little bit about how I do that. Yeah. A question from Al Scotch: no BVH export for Mixamo? No. Now, as far as I know, BVH is only used by Second Life and OpenSim, which puts it a bit behind the times, in my opinion. It's not a format that's widely used in the broader community. Now, the .anim format is used by Autodesk, and that has more potential. We'll get into that one a little later when I talk about the work being done for the community in Blender. Now, a reply: so why are we here? That's up to you. That's up to you. It's all a choice. So, this is a snapshot view of how it started and how I created the avatar, the 3D character. With Character Creator 3, it's very similar to some of the virtual worlds in that you have a marketplace, and you also have, in this case, a content store. They're two separate places. The marketplace is for developers when they first start; once they've sold enough things and become well established, that's when they can sell things inside the content store.
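The frame-rate mismatch that comes up again later in the talk (a 30 fps Mixamo export imported at a 60 fps default) is, at its core, just resampling keyframes by linear interpolation over time. A minimal sketch on a single scalar channel, assuming evenly spaced frames:

```python
# Resample a list of per-frame values from one frame rate to another
# using linear interpolation. One channel; a real animation applies
# this per bone channel.

def resample(frames, src_fps, dst_fps):
    duration = (len(frames) - 1) / src_fps
    n_out = int(round(duration * dst_fps)) + 1
    out = []
    for i in range(n_out):
        t = i / dst_fps * src_fps          # position measured in source frames
        lo = min(int(t), len(frames) - 2)  # clamp so lo+1 stays in range
        frac = t - lo
        out.append(frames[lo] * (1 - frac) + frames[lo + 1] * frac)
    return out

# Three frames at 30 fps become five frames at 60 fps, with new
# in-between values interpolated halfway between neighbors.
upsampled = resample([0.0, 1.0, 4.0], 30, 60)
```

Proper import tools also handle rotation interpolation (quaternions) rather than raw channel lerping, but the timing arithmetic is the same.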
So, that can be things like clothing, vehicles, hair, plugins, poses, animations, all that kind of stuff. Now, one of the really nice things is they're coming out with a new version of Character Creator and iClone that's being released in the first quarter of next year. So that really has a lot of promise as well. And like you're saying, well, why are we even here? I'll give you one thing back on that. I used to work with a group of futurists, and one thing I figured out from that is that when it comes to the future, your best way to know about it is to study it and to have conversations about it. So looking at things and seeing the direction they're going makes a world of difference. So, you know, that's my two cents on it. And you can send your characters between the two different applications, back and forth. So that means even though I've got this medical worker put together now, maybe later I need her to become more zombie-fied if she gets bitten. So I can actually push updates back and forth between the two packages. And this is what my zombie looks like. Inside Character Creator, just a quick overview of the interface to give you a little idea of how things are laid out. On the left-hand side, you have basically your assets: you'll have your scene view and your content and things like that, and your store. All the items that you purchased are going to be on this side. On the right-hand side are your modifying areas. So things like your textures; if you're going to do morphs on his face or his body, you know, things that you're doing to change it; your rigging; all that stuff's on here. This thing can do pretty much most of the stuff. I can edit the pixels and everything in here. Your pose as well, although it's not as good as Maya or Blender on that aspect, but you can export out to those and then bring things back in to a certain extent as well, especially with ZBrush.
Over here you have some panels that are pop-ups. There's pose and there's expressions, so the way that he looks on his face was done through expressions, and the pose is the way he was set up to stand here for his body. So let me go in there; I want to show you just a little bit of the magic behind the scenes. This thing's almost like a special effects workshop. I mean, when you go into these things and you look at it, you know, look at that detail. Horrifying, isn't it? Yeah, this guy's had more than one trip around the zombie farm. So, what I wanted to show you with this: when I come into this, I've got to select my avatar. So I go to my scene view and I select Alessandro. So I have the avatar selected, and then I want to go into this area that I have here that's called Appearance. I'm going to activate the editor. And the other thing, too, when you're talking about using things like this in virtual worlds: this can also give ideas and spur creativity about how you can do things in other environments. I mean, a lot of times, for example with Second Life, a lot of the things that they did, although it took them like a decade to get some of them, they were following what was already being done in the gaming industry. So understanding how these things work helps toward that. I mean, it'd be nice to do this kind of stuff in-world someday. So we come in here, and these are different levels of makeup that are applied to this avatar. Right now, I'm working just on the skin, not the makeup, and I'm on the head layer. So anything I do is only going to affect the head layer. There's a body layer, arms, legs, and nails. So I come down to the third one here. See this scar that's right underneath his eye? This great big three-slash scar. If I click this, it turns it off, because that's what that layer is, and that's one way to tell.
But I'm going to select it, and now I can come down here: if I wanted to scale this thing up to cover his entire face, I could do that, and put it back. But if I also want to offset it, I can move it. You can see how it's moving toward his ear there; hopefully you see that. You know, while you're doing that, Dave, just to clarify, Ball Peen Hammer asks: are these blood effects and morphs procedural? I don't know for sure. They're done with sliders, so I think they are procedural, but I'm not 100% sure. I'm not the one that made this portion of it. It's an effect that I purchased. Some of it comes with the software; others you have to buy as add-ons. As for the phonemes and the visemes for the English language, we're going to show that in just a minute. Right now I'm in Character Creator; you'll see that when I get to iClone, because iClone is where you do all the animation. Yes, it does work with visemes. If you're not familiar with visemes, they're kind of like phonemes, but they're basically your mouth shapes. So instead of the letters of the alphabet, it's the actual sounds being made, the shapes of the sounds. Okay, so when we come down here, as you saw, I can move it around. I can even change the bevel. I can change the colors, all kinds of cool stuff. But you'll notice, too, that he becomes a naked zombie. I know, that's horrible, right? It wasn't bad enough clothed. Because you're just looking at the skin: if you go to the makeup layer, you're just going to look at the makeup, and you won't see the skin stuff that we just did. So it limits it, so you can focus on one thing at a time. So if I want to come back out of this and get back to my regular mode, I just simply uncheck this little box up here by skin that says Activate Editor. I don't expect you guys to memorize this. There's no test afterwards, but I just wanted to share a little bit of the secret sauce to show you the things behind what you're liking.
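Since visemes come up again later with the lip-sync tooling, here is a small sketch of the idea: many phonemes share one mouth shape, so lip sync maps the phoneme stream down to a much smaller viseme stream. The grouping below is a common simplified set for illustration, not the exact table any particular product uses.

```python
# Map phonemes (ARPAbet-style symbols) to shared mouth shapes.
# Grouping is illustrative; real tools use larger, product-specific
# tables.

VISEME_MAP = {
    # bilabials all close the lips the same way
    "P": "MBP", "B": "MBP", "M": "MBP",
    # labiodentals: lower lip against upper teeth
    "F": "FV", "V": "FV",
    # open vowels
    "AA": "AH", "AE": "AH", "AH": "AH",
    # rounded vowels
    "OW": "OH", "UW": "OH",
    # tongue-behind-teeth consonants
    "T": "TD", "D": "TD", "N": "TD",
}

def to_visemes(phonemes):
    """Map a phoneme sequence to viseme shapes, collapsing adjacent
    duplicates so the mouth doesn't re-trigger the same pose."""
    shapes = []
    for p in phonemes:
        v = VISEME_MAP.get(p, "REST")
        if not shapes or shapes[-1] != v:
            shapes.append(v)
    return shapes

# "mama": M AA M AA alternates between the closed-lip and open shapes.
shapes = to_visemes(["M", "AA", "M", "AA"])
```

An animation system then keyframes one morph target per viseme, which is far fewer targets than one per phoneme would require.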
Maybe it'll get you interested in these technologies, and who knows, maybe it'll lead to them being used in other applications one day. Especially since I know we have a lot of programmers looking at these things too. And then what I can do with the avatar is transfer him: I can send the character to iClone, or I can export it out as an FBX. I can put it into Unity or Unreal. I can send it to ZBrush with GoZ. I can also export it out for use in Blender. So it has a lot of applications. Now, right now they have this thing called 3DXchange. I won't get into this too much because it's going away, I believe, when they go to the next version of iClone and Character Creator. But right now, this is the way we have to import things in, because Reallusion wanted to control what you're allowed to export out. Do you have a license for it? Are you a premium supporter who gets the option to export? I also have to buy things with exportable rights to be able to use them that way; I don't have those rights unless I pay for them. But one of the things I wanted to point out on this: sometimes when you're bringing things in, even though my file was saved out at 30 frames per second from Mixamo, for example, when I brought it in, I imported it at 60 frames per second, because if I don't go with the defaults, it's not going to work. So I don't want to get into too much of the interface here; I'm just going to give you an overview to show you what it looks like. And that'll also give that one person who was asking earlier about the bone structure a chance to see it. So this is the one that I recorded, the colorful export, "zombie needs puppy brains." So we're going to import that. This is iClone 3DXchange. So again, I get this dialog box here. It's asking for 60 frames per second. I'm going to say okay and just go with all the defaults.
Now, in this case, all it's doing is bringing in the armature; we'll wait for the window to finish. Hopefully I'm not going too fast for anybody; trying to keep it moving. Okay. So now it's in here, but the thing I found a little difficult, and I'll expand this over a little if I can... actually, can I open this up more? No, I don't think it lets me open that window. It's trying to show you more of the bone structure. But: MoveMe spine, spine.001, spine.002, .003. If that tells you anything about the bone structure you're looking for; cheek. Looks like the naming conventions are a little different from what I'm used to seeing. But we come down on the side here, and we'll see that there's the animation. So I go here and select the animation, and now it allows me to hit play. So I can see this is that one I did in my backyard. As you see, I'm moving around, but I'm not really moving as much as I actually did in the 3D space; I'm almost tied to one spot. Okay. But it was fairly similar to what my motions were. Like I said, it's still early days. Okay, Mike. Yeah, it's not a made-for-OpenSim asset. No. Again, I'm looking at technology that's leading toward the future, and that's not all going to be based on OpenSim. I am going to show you something about OpenSim in just a couple of minutes if you, you know, want to stick around for it. Okay. So once I've got this, I have to export it out. I go to Export and pick Send to iClone, and that will send and export the animations so they get into iClone. All right. So that's 3DXchange. Now we're getting into iClone, and we'll get into the animations a little more, which somebody was asking about.
Yeah, you might need to bring it into another tool to change the bone structure, or sometimes what you have to do is actually get in there and rename the bones, provided the rig correlates to the same bone structure that you have; it might just be the naming conventions. But I do agree that it's a lot easier when everything's set up for it. It's just that most of the industry is not focused on OpenSimulator right now. So when it comes to it, it's more what's used in animation and gaming that it's focused on. And that's where the disparity between the two sometimes makes it rougher for developers, because it's not using the same naming conventions and it's not using the same formats. You know, I don't use BVH in anything aside from OpenSimulator and SL. All right. So here, for Reallusion's iClone 7, it's very similar in its layout to what we saw in Character Creator 3: your inputs on the one side, you know, where you're getting your content from, and then on the right-hand side is what you're doing with that content, how you're animating it, things like that. So one of the things I've got here: this is my iPhone X, and with the X, one of the nice features it has... Am I still sharing on your screen there, James? You are. Let me switch over to that. I was distracted. That's okay. Okay. So if you see here, this is my face on the phone. Okay. And let me see if that actually is working. Where is the... And while Dave's doing that, if you're not familiar with Zoom, there's a View option dropdown at the top of the window, and if you choose the side-by-side view, you can enlarge the image coming from Dave's cell phone. Okay. So, the iPhone X: they incorporated a thing called the depth camera, and it gives you capabilities you don't have on other devices. I don't know if they've incorporated that yet on Android. I don't think they have, but it probably will come in time. But I tap my screen.
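The bone-renaming fix mentioned above can often be done with a plain text pass, since BVH is a text format: rewrite the ROOT/JOINT lines against a name mapping. The mapping below, from MoveMe/Blender-style names to SL/OpenSim "m"-prefixed names, is illustrative only; check both skeletons before relying on any particular pairing.

```python
# Rename skeleton joints in BVH text using a lookup table. The
# mapping entries here are hypothetical examples, not a verified
# MoveMe-to-SL correspondence.

RENAME = {
    "spine": "mTorso",
    "spine.001": "mChest",
    "neck": "mNeck",
    "head": "mHead",
}

def rename_bvh_joints(bvh_text, mapping):
    """Rewrite ROOT/JOINT declaration lines of a BVH skeleton."""
    out = []
    for line in bvh_text.splitlines():
        stripped = line.strip()
        for kw in ("ROOT ", "JOINT "):
            if stripped.startswith(kw):
                old = stripped[len(kw):]
                if old in mapping:
                    line = line.replace(old, mapping[old])
        out.append(line)
    return "\n".join(out)

sample = "ROOT spine\n{\n  JOINT neck\n  {\n    JOINT head\n  }\n}"
renamed = rename_bvh_joints(sample, RENAME)
```

Renaming only works when the two rigs really do share a structure, as the talk notes; if the joint hierarchy itself differs, a retargeting tool is needed rather than a rename.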
And as you can see, I have a mesh on my face now. So any movements I make — eye blinks, any kind of movements of my head — are going to be projected onto the avatar, and I can record that. Okay. So that's one of the nice things with the iPhone X; as far as I know it's only on the X and beyond. And then what it ties into is this thing called Motion Live, which I had to pay for — the app on the phone was free, but the one on here you had to pay for. Motion Live can be used along with Live Face on the phone. Oh, and another thing: if we look here at the very top right, you'll see they have different options. We're going to go through a couple of these. I can record my voice, or even better, there's one called AccuLips, which actually works with the visemes — it'll record the visemes for me. I can use an audio file, or I can do text-to-speech, where I just put my text in there and it puts it in place. And then there's one on the bottom here, the one called Live Face, which is separate. Personally, I like to use Live Face more for general movement of the head and the face, and leave the actual lip syncing up here so I get it with viseme support. But we'll see that in a minute. So let me take you into iClone and show you. Okay. Here's my zombie scene, and I'm going to take my phone and just put it down for now. I don't know if you want to switch back to your regular face for right now, James. So I — oh, okay. And while we're switching, Al Scotch asked: can it get tongue and teeth? I don't think the tongue and teeth move in this one — at least I haven't figured out a way to do that just yet. It has the teeth in there, and you can see them as the mouth opens, but I'm not seeing actual movement of the teeth, and I haven't seen movement of the tongue yet. It might just be that I haven't had applications to use that part yet. I mean, they're there.
I just don't know if it might have the option to actually animate that part of the geometry; but Live Face, as far as I know, doesn't auto-animate the tongue and teeth. All right. So I go into my scene here. Again, my inputs are on the left-hand side. I'm going to add a regular zombie — actually, here's what he looks like. Okay, there's my regular zombie, and if I hit play here, you can see he's not animated at all. He's just standing there with his meat cleaver doing nothing. So I can go into my content — these are ones that I imported from Mixamo before the class — and I'm going to drag this one and put it on him. And now when I hit play, you can see he's going to move around a bit, just a very slow idle. Okay. It's only going to go for a portion of time, though; you can go in there and loop it if it's loopable. One thing you want to do with loopable animations is make sure your first frame and your last frame are the same, so it's continuous without a point where it jumps. So we go back up here to the other zombie — that's the one I was making. That's actually an earlier version; I'm going to show you in a little bit what I'm going to do with him to make him a little more gruesome. But I've got the zombie in here and I want to put some of these animations on him. So normally you're going to start — you'll have this stuff here that takes you to 3DXchange and so on. There's a tab at the top for animating. Once you have the avatar selected, I can do things for the body here, I can do things for the face; there are morphs, there are plugins, and a plugin motion library we're going to see in a minute. So I'm going to go down here to the facial. Actually, before I do anything else, let me put that same idle animation on him, just so we've got something to start off with.
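The looping rule mentioned above — first frame and last frame should match so the cycle doesn't pop — can be sketched as a tiny check on a list of pose frames. The tuple values here are stand-ins for whatever per-frame data a real animation carries.

```python
def make_loopable(frames):
    """If the last frame differs from the first, append a copy of the
    first frame so the animation ends where it begins and loops cleanly."""
    if frames and frames[-1] != frames[0]:
        return frames + [frames[0]]
    return list(frames)

# Hypothetical (time, value) pose frames for a slow idle.
idle = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.5)]
loop = make_loopable(idle)
print(loop[0] == loop[-1])  # → True
```

Animation tools generally do this for you when you mark a clip as loopable, but it is worth checking by hand when a loop visibly jumps.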
And make sure that you see where you're at on the timeline, because wherever you are is where it's going to start recording, more or less, to put in things called keyframes. Yeah, Bento is specific to SL, not to the industry. So no, you're not going to find Bento; things are not going to be called the same names. So we come here and we do Create Script, and this comes up. The options are text-to-speech, audio file, script file, AccuLips, and record voice. I'm going to record voice — sorry, I skipped past that one while I was looking at the screen; I should have scripted my notes a little better. Okay. So I'm just going to record my voice; this is just using my microphone. Here we go. All right: zombies of the world unite. I can play it back — zombies of the world unite. If I don't like it, I can start over. I'm just going to say OK for now. And I can hit play, and you'll see it's lip-synced already: zombies of the world unite. And, you know, this is a starting point. It's not perfect, but it's a starting point to get your lip sync going. You might go in there and do additional things to push it a little further; sometimes you have settings on the tabs that you can adjust to push it a bit more. So at this point I want to do another one. I'm going to go here to Create Script — I want to show you that you can do text-to-speech. All right. I thought I'd do something a little more Cruella de Vil here: bring me those puppies. I'll just put one at the connection point.
And I can say here: it's "bring me those puppies." Okay. Cruella — I want to go female on this one, so I'm going to pick a different voice. And if I had additional voice libraries, I could use those as well. Bring me those puppies. Okay, I think her pitch would be a little higher, and I think her speed would be a little higher. Let's hear it again: bring me those puppies. Okay, I think that's pretty good. I say OK, and now it's applied. So let's see how it is — go all the way back, and we'll hear them both. Zombies of the world unite. Bring me those puppies. Okay. I can scrub it and see where it's at. I think we're good there. All right, and the next one is going to be another Create Script, but for this one we're going to do the one the person was asking about earlier, with the visemes, which are basically mouth shapes. So we're going to do the thing called AccuLips. We come in here with that file I did earlier, where I was talking about the zombie needing puppy brains for my little doggie. We're going to go here: zombie needs puppy brains. And here we can hear it: zombie needs puppy brains. Okay, so I hit Generate Text. What this does is place the text underneath the audio so I can see where each word is. But you're going to see something unusual here — it's not going to be quite right, and you can also see there are some things in red that it doesn't quite understand. Okay. So I can see this isn't quite right. It should say "zombie," then "needs," then "puppy brains" — not "braves" — and an exclamation point. And then I'm going to say "brains, brains" — just wipe all that out, just "brains, brains, brains." Okay. So now that I've typed it out — and another way to do this too: I did one the other day with a singer, where I took their closed captioning, copied it, and pasted it in here after I loaded the audio into AccuLips.
And then I had to still go through and make some adjustments, because closed captioning isn't perfect. So, I've got that all set. I say Align, and let's see how that turns out. We can preview it before we actually accept it, so I hit the play button: zombie needs puppy brains. As you saw there, it was a bit off, because there's an odd "uh" at the beginning that I forgot about. So I come to the very beginning here, put that "uh" in, and hit Align again. Okay, and now we play it: zombie needs puppy brains, brains, brains, brains, brains. And then I do have another odd "uh" at the end that I don't really need. So I go to just before that — they have beginning and end markers, just like when you're editing video. I set that one, and the first one is at the very beginning. And it's good to go now; it's got everything it needs. I say Apply, and now it's in there. Let's see the whole thing together: zombies of the world unite. Bring me those puppies. Zombie needs puppy brains, brains, brains, brains, brains. Okay, it looks like it still brought the sound in on that, but I can clip that off afterwards. And I can move these things around if I have to as well. But I can also do some body motion on this. So let's say at some point — he talks about "unite." Zombies of the world unite. Okay, so we've got "unite" here. I'm going to take his hands, mirror them, and go to Record and spacebar. Okay. Then I'm going to go a little further down and hit Record again, spacebar. Okay — I just added some motion. And what's really cool: in the new version that's coming out early next year, you'll be able to pilot this like you do an avatar in a place like Second Life or OpenSimulator, and record your motion capture of the avatar.
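The Record-and-spacebar workflow above is storing keyframes — a pose at one timeline position, another pose further down — and the software fills in the in-betweens. A rough model of that in-betweening is plain linear interpolation over (time, value) keyframes; real tools use editable curves rather than straight lines, so this is only a sketch.

```python
def interpolate(keyframes, t):
    """Linearly interpolate a channel value at time t from a list of
    (time, value) keyframes, assumed sorted by time."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)  # 0..1 position between the two keys
            return v0 + u * (v1 - v0)
    return keyframes[-1][1]

# Two recorded keys: arm rotation 0 degrees at frame 0, 90 at frame 10.
keys = [(0, 0.0), (10, 90.0)]
print(interpolate(keys, 5))  # → 45.0
```

Everything between the two recorded keys is generated, which is why sparse, well-placed keyframes are usually enough for broad body motion.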
So, you get to work like you're just playing a game or in a virtual world — you can record yourself and do motion capture inside of a virtual environment. I think that's so cool. So we can see this now as we play it out: zombies of the world unite. Bring me those puppies. And obviously this is very rough animation; you'd have to do smoothing afterwards — the curves and things like that. Which, by the way — you come up into the window here, and there's a workspace for animation that brings all that up for you, so you have your timeline, you can see everything, and you can work your curve editor. Control-2, and we'll go back. So the final thing I wanted to show you in here with the animations is Motion Live. This is the one where I'm actually working with my camera — so if you want to put the camera back on, James. Okay, so you're seeing, in James's view, this is off my iPhone X. I have to set it up, make sure the server connection is right, and then pick which character, since I have two zombies in the scene. I say go Live Face, and now I have a preview and a record button. You basically preview it first to test, then you do the record when you're ready, and you can use the spacebar to actually start it. So what I'm looking to do here is just add a little bit of facial movement and head movement, so that it feels a little more real. Okay. So we come into here, and make sure you can see my face well enough in the view. All right, I'm going to hit Record and the spacebar. Zombies of the world unite. Bring me those puppies. Zombie needs puppy brains. Brains, brains, brains, brains. Okay. So we can see what the final piece is like. I close out of that. All right, so we're going to go back a little bit. That's the scene here. Okay, and are we all the way back to the beginning? Okay. Control-2. Make it go full screen. Oh, I'm sorry —
Control-7. Okay, here you go: zombies of the world unite. Bring me those puppies. Zombie needs puppy brains. Brains, brains, brains, brains. Control-2. So, I mean, the intent here wasn't to make something that looked finished; I just wanted to show you the techniques and the technology so you can see how things work. I'll show you a more finished piece in a little bit. But let's get back — we did have some questions about OpenSimulator, and I wanted to share something with you on that as well. So let's go to that one. From what I heard from — oh, geez, I can't remember now; who shared that with me, James? Oh, Kayaker. Kayaker mentioned that Allies wrote a Blender plugin — he's working on it — that can write the .anim format for animation files. This is useful if you're doing work for the virtual worlds of Second Life or OpenSimulator. You can download it at the link we have here. In the past, you could only do this kind of thing through Avastar. So this allows you to export things from Blender in the .anim file format instead of doing it in BVH, and that's a pretty big improvement. The .anim format is used — I believe Autodesk uses that one as well — so it's used in industry. Information about it is listed there, in the background on the anim format in SL. And then, of course, there's the link for Blender if you're not familiar with the software. What it basically does, as you see here in the file menus, is give you a section for OS/SL that lets you do different things with the rig. So, good work on that — nice to see things being done that are actually usable inside of Blender as well. Now, while we're talking motion capture, it's also good when you're working with animation to understand the basics of what makes good animation. And there are what are called the 12 principles of animation.
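One aside before the principles, since BVH keeps coming up as the format SL and OpenSim historically relied on: unlike the binary .anim format, BVH is plain text — a HIERARCHY section describing the skeleton, then a MOTION section with one line of channel values per frame. The tiny hand-written example below (a single root joint, two frames) is illustrative, not a production rig.

```python
# A minimal BVH file as a string: HIERARCHY (the rig) then MOTION (the frames).
bvh = """HIERARCHY
ROOT hip
{
    OFFSET 0.0 0.0 0.0
    CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
    End Site
    {
        OFFSET 0.0 1.0 0.0
    }
}
MOTION
Frames: 2
Frame Time: 0.033333
0.0 0.0 0.0 0.0 0.0 0.0
0.0 0.1 0.0 0.0 5.0 0.0
"""

# Because it's plain text, trivial inspection works: read the frame count.
frames = int(next(line.split()[1] for line in bvh.splitlines()
                  if line.startswith("Frames:")))
print(frames)  # → 2
```

That human-readability is why BVH survives as an interchange format even though engines and virtual worlds convert it to their own internal representations.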
I've given you some links to learn more about them. You have Disney's Nine Old Men — Disney pretty much set the standards for what character animation, and animation in general, is. A lot of those techniques became these 12 principles, which are also outlined in the book The Illusion of Life, by Frank Thomas and Ollie Johnston. I highly recommend checking that out, and these links get into it. I'll show you a brief one, but I'm not going to get into it too much — I just wanted to show you where the material is so you can learn more. For example, on the Disney.com site they talk about squash and stretch, and you probably remember this as a kid, seeing cartoons that were over-exaggerated — you know, stretching and squashing. Then anticipation: when you see the bunny take off here, from Bambi, it pulls back before it leaps forward, so there's anticipation that something's going to happen. That's a big point. And then you've got staging — I'm not going to get into all the specifics — and straight ahead versus pose to pose. A lot of times I'll do pose-to-pose animation, where I'm going from one strong character pose to another strong character pose, and then working the in-betweens to make it smooth, make it feel right, the timing of it. You'll see all that stuff on there. And then the other one I wanted to show you — I got this from my animation instructor in school, and I thought it was really good. This is on Tumblr: the 12 principles. The nice thing with this one is it gives you a visualization of all 12 principles right on your screen. So you can really see what squashing and stretching look like; you can see the anticipation, back and forth; and solid drawing — making things look 3D even though they're on a 2D screen makes a big difference. So I encourage you to explore that as you have the opportunity.
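One of the 12 principles, slow in and slow out, has a direct numeric form that ties back to the in-betweening discussed above: instead of spacing in-betweens evenly, you ease the 0-to-1 parameter so motion starts and ends slowly. A common sketch of this is the smoothstep curve; this is one standard easing choice, not the only one animation software offers.

```python
def ease_in_out(u):
    """Smoothstep easing on a 0..1 parameter: slow in, fast middle,
    slow out -- the 'slow in and slow out' principle in numbers."""
    return u * u * (3 - 2 * u)

# Compare evenly spaced in-betweens with eased ones across five steps.
for i in range(5):
    u = i / 4
    print(f"linear {u:.2f} -> eased {ease_in_out(u):.5f}")
```

Eased values bunch up near 0 and 1 and spread out in the middle, which is exactly the spacing a hand animator draws when a motion should feel natural rather than mechanical.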
Let's go to the next slide. And then I give you some resources here. You've got the link for Get Move Me if you want to try that out — like I said, it's free to use for right now, but I don't know how long it'll stay free, because he is looking to make it into a paid product eventually. Mixamo, I believe, is free with an Adobe membership. Reallusion, if you're interested in the software, is a little more expensive, and you have to buy a lot of content — I've probably invested about $10,000 in it already personally, but for the kind of work I do, that's what I use. Animation books — these are some that I really think are among the good ones. One of my main instructors for animation at the Art Institute used this one, and he knew the author, Eric Goldberg, as well. The Animator's Survival Kit is a nice one; there's also a set of videos for it that are good, and there's even an iOS app for it — I don't know if it's still on the store or not. It's by Richard Williams. If you remember Who Framed Roger Rabbit, he was the animation director on that movie, and he's also done a lot of other animation you've probably seen growing up as a child — I think he was the one that did the witch and Bugs Bunny, if you remember that one. Then there's The Illusion of Life. So, some good resources to use there. So that's pretty much it — I just wanted to whet your palate and get you interested. Like I said, I think we're going to see some of these technologies really take off now that there's a metaverse focus and a lot of big money being spent; I think you're going to see computer vision really take off. And at this point, I think we can probably close off the YouTube stream as we go into the VIP after-party for those that registered early. That sounds great, David. Thank you so much. And thank you to everybody watching on—