I've been in the 3D industry for about 15 or 16 years now. I've done commercials, films, AR, VR, games, and back to short films again. So if it's 3D, I've been around it or done it. I'm a principal art director for the FuzzyPixel team. FuzzyPixel is a creative team within AWS, and we made this short using Nimble Studio, like you mentioned, and Blender. We're a group of industry vets from all over the industry, and our mission is to use our products and services as our customers would. We're doing real-world productions, creating the highest-end content we possibly can, to really battle-test all of our services and make sure they're ready for creatives when everything goes live. We're sometimes referred to as "customer zero": we hit everything first and make sure it's ready for you when it comes out. Our FuzzyPixel team has produced four shorts to date. We did Spanner back in 2020, then a couple of very short shorts, Shockingly Fluffy and Getting Fuzzy, in 2021, just to wrap our heads around Blender in preparation for our bigger short, Pichu, which we put out this year, in 2022. This is our main character, Mayu, and our splash art for Pichu. And then I just wanted to share something from our director, Amaro Zez, about the story of Pichu. This is from Amaro. He says: My mom was a big influence in my life. She taught me the importance of using my work to contribute to society, and that technology could be used with purpose, to create and transform. Connecting technology with the heart to tell a story transforms the work, the team involved, its leaders, and the people who watch the story. It moves us to act in our reality, and that transforms us as people. The concept of Pichu was a story I had already been working on since the end of 2020. While writing this story, I came to the cruel realization that Ecuador is not the only country that struggles with children's education.
It's a global problem. I wanted to create a story that has a great social impact. Pichu is the story of an Andean girl who has to face the challenges of life with the support of her mother. With great pride, Pichu is based on Ecuadorian culture, using its beautiful landscapes, clothing, and Andean music. So one of the things Amaro mentions there is authenticity, and this was really important to us. We really wanted to ensure, as we were creating Pichu, that we stayed true to the Andean region of Ecuador. It was important that people from Ecuador could look at something we create and say, hey, that's me, that they really feel connected to it. We didn't want them to feel like we did a poor job of representing them; we really wanted them to be proud. And you'll both see and hear how we explored Ecuadorian culture throughout the architecture, environment, clothing, and characters. To that end, we got Amaro's mom involved. She plays one of the characters, Kapa, the mother in the film, and is also our cultural advisor. As with anything in art, it all starts with reference. I needed to immerse myself in the cultural aspects of the Andean region. To back up just a little bit, I'm a character artist at heart; that's where I come from, and here we're doing production design for the film. So I always start with character. I'm looking at skin tones and eye, nose, and mouth shapes. You notice things in the people, like they get very sun- and wind-burned cheeks, and these things are very important to represent in the film. We also got to learn things like the differences between the boys' and girls' ponchos: the girls' ends up being more of a blanket that they wrap and pin together, whereas the boys' poncho is all one contiguous piece. And then hats and belts and all the costuming and things like that.
And to that end, I created a series of 14 speed sculpts, of which you can see a few here. Some of them are pretty good, some of them kind of crap, to be perfectly honest. But that's important: you try and fail until you get it right, basically. I started off with something more realistic, the top left one there, and then tried to push and pull. By the bottom left one, I had something looking sort of decent, but then I wanted to push it to something more unique, to create a unique style for Pichu. I ended up pulling her face horizontally and creating more of a chunky, round style where everything's a little bit inflated, a little bit puffy. There are no hard edges to anything, like you'll see in some of the architecture later: much larger bevels, slightly exaggerated proportions. But we did want to have a material truth, so the cloth feels like cloth and skin feels like skin. The eyes use refraction shaders and things like that to pick up the light properly. Blending the style and realism together created our own unique look. And this, if I can get it to play, is a turntable of our characters: Mayu there in the red poncho, Kapa the mother, and then her friend Saewa in the lavender poncho. You can also see the foliage; these are paper trees. They're very iconic to the region of Ecuador, growing up at high altitudes. And you can see how we put the style pass on that. The image on the left is a less stylized paper tree, with lots of little leaves like it would normally have; the leaf colors are warmer tones, and the bark is a little more detailed, things like that. When we put our style pass on it, we removed about half the leaves, inflated them, made them much more puffy, and even pushed the leaf colors cooler.
We really wanted to be intentional with our use of color throughout the film. Our character Mayu is in bright, vibrant, saturated colors, reds and things like that, and we wanted to put her at visual contrast with the environment to make it feel more dangerous as she's moving through her journey. So we made the environment much cooler in color, one, to draw your eye to the character, but also, like I said, to put her at odds with the environment visually. And then we went through the architecture as well; everything has rounded bevels. The image on the left is actually a door that Amaro's dad produced. He handmade it, because he's an architect; he made it as part of his thesis project. We put it through our style pass so it has no sharp edges, and again we made the colors cooler and desaturated, but we still have our material truth, so the metal feels like metal and wood feels like wood. Another example is the house, and then the school where Mayu ends up. And then, like I mentioned previously, we wanted even the music and the sound to be authentic to Ecuadorian culture. We hired local artists and singers; all the actresses in the film are from Ecuador. Our local musician here actually made all these instruments and played them for part of the film. He even went out into the wilderness and recorded the river sounds and the sounds of birds and frogs and things like that. So everything we could make authentic, we did. And at this point, I'll hand it back to Alexi to talk a little bit about the pipeline and the things we used with Nimble. Thanks. So that's the recipe for getting an ovation: you just need to pass it over to Snow. As I already mentioned, we made this short completely in the cloud using a managed service from AWS called Nimble Studio, which you see in the top left corner.
So it's kind of overarching the whole thing and abstracts away the management of the underlying pieces, which are depicted here. We used different workstations for the artists: some were running Linux, some were running Windows, for the different DCC software the artists were using. And I think Chris will go into some of these in more detail later on. We'll be milking this ovation thing, so there will be a couple of transitions. But before going into more detail on what Nimble Studio is and what it does, let me take a step back and look at a typical pipeline for such a project. You might have seen this before; it's an overly simplified version of what really happens on a project. A movie consists of shots, and you move those through this pipeline. You need assets like sets, props, rigged characters, and so on, and these are created in advance. Then every shot goes from layout to animation to character fixes and so on, over to, finally, editorial. What we show here is linear, but in reality there are iterations in every step, right? You want to make it better and better and better, and the cycle repeats many, many times. The more iterations you put into it, the better the quality level you end up with. And this is an important point, because the number of iterations you can make actually depends on the capacity you have in your underlying infrastructure. Supporting this pipeline is technology: artist workstations, storage for assets, an asset repository, and a render farm. Now that we're speaking about the render farm, probably all of you know this, but let me repeat it once again. Rendering is taking geometry, lights, and a camera, with their relative locations, handing it over to a machine, and saying, okay, compute what this looks like from the camera's perspective. And before it looks nice, this process is usually pretty time intensive.
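To get a feel for the scale involved, here's a quick back-of-the-envelope sketch. The five-minute runtime and the one-hour-per-frame figure are illustrative numbers for the math, not figures from the talk:

```python
# Back-of-the-envelope render farm math (all numbers are illustrative).

FPS = 24  # at least 24 frames per second, as mentioned in the talk

def total_render_hours(runtime_minutes: float, hours_per_frame: float, fps: int = FPS) -> float:
    """Total machine-hours needed to render every frame once."""
    frames = runtime_minutes * 60 * fps
    return frames * hours_per_frame

def wall_clock_days(total_hours: float, workers: int) -> float:
    """Calendar days if the farm renders `workers` frames in parallel."""
    return total_hours / workers / 24

# A hypothetical 5-minute short at 1 hour per frame:
hours = total_render_hours(5, 1.0)
print(hours)                                # 7200.0 machine-hours
print(wall_clock_days(hours, workers=100))  # 3.0 days on 100 nodes
```

And remember this is for rendering every frame exactly once; every iteration on a shot multiplies it.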
For many films it's typical to have render times of a couple of hours per frame; previously we were even talking about days of render time per frame. And you know there are at least 24 frames per second, sometimes more. Of course it doesn't make sense to run this on the same local machine where the artist is working, so we upload it to a render farm. Render farms were, logically, the first component that studios tried moving into the cloud, because they had been one of the most prominent bottlenecks in a studio's infrastructure mix. If we take a typical project, a studio usually starts with a certain pre-allocated server capacity, over on the left side of the slide, before things go wild. Then at some point they notice they've run into a bottleneck of not having enough capacity. Artists submit jobs, at some point it's too much for the render farm to handle, all available resources are consumed, and artists have to sit around waiting for their renders to come back before they can continue with their work. Not a very nice situation. The studio usually responds by adding some capacity, and depending on the duration of the project you do this several times, embarking on this game of catching up with the needs of the production: adding capacity and figuring out how to do it while juggling the other tasks you have. So there are certain inefficiencies to this approach. Anytime the studio has more capacity than it can fill, that represents wasted resources. These light areas, where no jobs are submitted to use the capacity that's in place, are essentially capacity you're paying for but not using. And when there is utilization above capacity, the studio is using all its resources, which is good, right? But it's a potential bottleneck: artists are waiting for the jobs that go over capacity to render before they can continue their work.
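That over- and under-capacity split reduces to a couple of lines. The weekly demand numbers below are invented, but the wasted-versus-backlogged accounting is the point:

```python
# Illustrative numbers: weekly render demand (machine-hours) on a project
# that crunches toward the end, against a fixed on-prem capacity.
demand = [200, 300, 400, 900, 1500, 2200]   # hypothetical machine-hours/week
capacity = 1000                              # fixed machine-hours/week

wasted = sum(max(capacity - d, 0) for d in demand)    # paid for, never used
backlog = sum(max(d - capacity, 0) for d in demand)   # jobs artists wait on

print(wasted, backlog)  # 2200 1700
```

With these numbers the studio pays for 2,200 idle machine-hours early on and still leaves artists waiting on 1,700 machine-hours of backlog during the crunch.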
Now the situation gets more dramatic if we take a typical real-life project, where everything results in a crunch. There can be many reasons for this; the usual suspects are the story being revisited or character designs being altered late in the process. Whatever the reasons may be, the result is that a lot of work is delayed, and then it all has to be done in very limited time, because the deadline is looming and you cannot move it. Teams try to iterate as many times as possible before the deadline to reach the quality level they're shooting for. Basically, it's a very difficult situation to handle. Of course, this was an awesome opportunity for studios to turn to the cloud and use virtual instances as render worker nodes. This was a huge win because it worked, and studios began looking for opportunities to move other infrastructure elements to the cloud too. But curiously enough, studios didn't use cloud render farms quite the way we, or they, were expecting. We assumed that many would just take the peaks of high utilization we saw on the previous slide, closer to the end of the project, and send those extra jobs to the cloud when the capacity was no longer enough. But it turns out that wasn't the case. Instead, studios spread out their jobs and took advantage of all this extra capacity at all times. Practically, this resulted in more iterations taking place throughout the whole production process, not only during crunch time. More iterations, more reviews, more notes, higher quality overall. Now let's zoom out a little. Rendering is not the only part you can move to the cloud, as I mentioned. Essentially you can build a studio in the cloud with all the other resources as well: you can place your artist workstations there, you can place your asset repository there.
You no longer need beefy workstations on everybody's desk. But building this yourself can be difficult. We released a guide describing how you can build a do-it-yourself studio in the cloud, and it was hundreds of pages. Not everybody has an army of cloud architects to build and support this. And we thought, as everybody was probably thinking, there must be a better way. And there is a better way, and this is where Amazon Nimble Studio comes in. It's a service that allows you to deploy cloud infrastructure and be up and running in a few hours, so you don't have to spend those hundreds of hours, or months, before you can start creating. And it's a service that was created by creatives, for creatives. Back when the Nimble Collective team came up with their idea, they were at DreamWorks and already had a lot of experience in the industry. Throughout their careers as animators and other professionals in the industry, they saw that many artists were struggling to get their ideas off the ground, because bigger studios only invested in a limited number of movies, while smaller studios couldn't afford the tech necessary to realize their big ideas. The team's main goal with Nimble Collective was to give smaller studios the technology available to bigger studios. In 2019 AWS acquired Nimble Collective, and we took some time to make the solution well architected and enterprise ready, relaunching it in 2021 under a new name, Nimble Studio. On a very high level, Nimble Studio provides exactly the building blocks we were talking about previously; I'll dive a little deeper later. These are workstations, shared storage for assets, and a render farm. And an important point: it's all in the cloud, so you don't have to invest upfront before you can start creating. If we zoom in a little, here's what a simplified architecture looks like. Let's take it step by step.
On a project you will have artists who do modeling, animation, visual effects, compositing, et cetera, and they need access to powerful workstations, preferably GPU based, for hardware-accelerated graphics. With Nimble Studio, an admin can define templates saying, okay, this group of artists needs this kind of software, and these other groups need something else, and then you let people spin up instances from these templates based on what they're supposed to be doing on the project. So it's not like everybody has access to everything. And the good thing is that you connect into it using something like NICE DCV or Teradici, so it's really bandwidth efficient and you can use the peripherals you're used to, like your Wacom tablet. It supports several high-res monitors, so you're not limited to only one screen, and specific features of the peripherals you're using are also supported with this kind of connection. One note here: we recommend placing these workstations in the nearest AWS region, and I'll show the map later, because latency is important. You want to make sure it doesn't exceed 50 milliseconds, and ideally stays around 20 milliseconds for the best experience. You have a very good setup here in Europe with connectivity, so it's workable. The left side of this architecture diagram shows the mix of infrastructure components that the pipeline depends on. I'll go through them real quick, at a very high level, not to bore you, but just so you understand what each does. There is asset storage, where your raw original assets are stored, and FSx, a cloud-based file system that you can mount on an artist workstation or a render node. There's a license server for software that needs one, so you can run it. Active Directory and SSO are the components that allow you to create users and let those users log into the portal. I will show what the portal looks like.
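That group-based access (admins decide which groups get which templates) amounts to a simple filter over launch profiles. A minimal sketch follows; the profile records here are invented for illustration, since in the real service they would come from the Nimble Studio API:

```python
# Sketch of the per-group access idea: artists only see the launch profiles
# their group is entitled to. The profile records below are made up; in
# production this data would come from the Nimble Studio service itself.
profiles = [
    {"name": "Design",      "groups": {"design"}, "instance": "g4dn.4xlarge"},
    {"name": "Compositing", "groups": {"comp"},   "instance": "g4dn.8xlarge"},
    {"name": "Rendering",   "groups": {"td"},     "instance": "c5.24xlarge"},
]

def visible_profiles(artist_groups: set, profiles: list) -> list:
    """Names of the launch profiles this artist is allowed to spin up."""
    return [p["name"] for p in profiles if p["groups"] & artist_groups]

print(visible_profiles({"design"}, profiles))          # ['Design']
print(visible_profiles({"design", "comp"}, profiles))  # ['Design', 'Compositing']
```

A designer in one group sees a single environment; an artist who covers more job types sees more, which mirrors what the portal shows.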
Deadline is our render management software, which is free to use on AWS, and the compute represents the actual worker nodes that Deadline will spin up when you need to render stuff. The customized AMIs are the templates you can create to include all the DCC software you need; you can customize them with the typical plugins you would use, and then reuse a customized template over and over for all the artists who need it. So here's what it looks like for an artist. Let's say we have Carlos, who is a designer on our project. When Carlos logs into the Nimble Studio portal, he sees one environment he can work in, called Design, logically. To begin working he just clicks Launch, and Nimble Studio spins up a machine for him with all the tools he needs to do his work. In this case he's using Blender; who wouldn't? And it's important that Carlos can operate Blender as freely as he would on his local machine. Those of you who saw our booth at SIGGRAPH or IBC could try what it feels like: you cannot tell the difference between a local machine and a cloud-based instance. You move the mouse and the cursor responds instantly, so you don't feel the 20 to 50 milliseconds of latency we're talking about. And you can use your Intuos Pro or Cintiq, whatever you're comfortable with. Now let's say the artist is happy with their work and wants to submit a render. They can use the integrated Deadline submitter, a plug-in for Blender. What this initiates is a queue of render jobs, and Deadline will spin up the workers. These will be so-called EC2 Spot Instances, the cheapest way to run a virtual machine instance on AWS, and they will only be spun up when there are jobs, so they will not be running at all times; after the tasks are finished, they are brought back down automatically by Deadline.
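For the curious, the same kind of submission the Blender plug-in performs can also be done by hand with Deadline's command-line tools, which take a job info file and a plugin info file. This is only a sketch: the paths and job name are hypothetical, and the Blender plugin key names (SceneFile, etc.) are written from memory, so verify them against your Deadline documentation before relying on them:

```python
# Sketch of a manual Deadline submission for a Blender scene. Assumes the
# standard `deadlinecommand` CLI; key names for the Blender plugin are
# from memory and worth double-checking for your Deadline version.
import pathlib
import subprocess
import tempfile

def write_job_files(scene: str, frames: str, name: str):
    """Write a minimal JobInfo/PluginInfo pair and return their paths."""
    tmp = pathlib.Path(tempfile.mkdtemp())
    job = tmp / "job_info.job"
    plugin = tmp / "plugin_info.job"
    job.write_text(f"Plugin=Blender\nName={name}\nFrames={frames}\nChunkSize=1\n")
    plugin.write_text(f"SceneFile={scene}\n")
    return str(job), str(plugin)

job, plugin = write_job_files("/projects/pichu/shot_010.blend", "1-240", "pichu_shot_010")
# On a real farm you would then hand both files to Deadline:
# subprocess.run(["deadlinecommand", job, plugin], check=True)
```

In practice the integrated submitter does all of this for you from inside Blender; the point of the sketch is just that a job is two small text files handed to the queue.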
An important point is that instances are only launched and used based on what is in the queue. It means you only have a render farm when there is stuff to render. This is a departure from where we were before, and it also means that when you're early on in production the farm is small, so you aren't paying for render capacity you're not using yet. And then when there's a crunch, you're able to scale as quickly and as much as you need, depending on how much you screwed up the planning. Now, animation is a team sport. It's not about one person being able to work, it's about multiple people being able to work together. So just as we have Carlos doing some design work, we also want Anna to do some compositing. Notice that her setup has more options than Carlos's, because she has more types of jobs she can handle. As a studio admin I can create multiple environments and say, okay, this artist only gets one, and this artist can do more jobs, so she gets more. These environments have the pre-installed software and so on, and they're configured in the form of so-called launch profiles: you say, okay, it will have this template, it will have access to such and such studio components, like render farms, storage volumes, license servers, et cetera. And artists will be able to pick between such and such instance sizes, so you also have guard rails around who can launch which sizes. If you know a group only needs to work with Photoshop, you probably won't give them the beefiest GPU-based instance out there. And this is what it looks like for Anna. It's completely abstracted for her: she can choose the machine size, the template, aka the AMI, and how to connect to it. She can use a browser or a native client, depending on whether she feels like installing something on her machine or not.
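Coming back to the earlier point that the farm only exists while the queue has work in it: that behavior boils down to a tiny scaling rule. Here's a minimal sketch; the tasks-per-worker and cap values are arbitrary, not service defaults:

```python
# Minimal sketch of "the farm only exists when there's work": derive a
# target worker count from the queue, zero when the queue is empty.
import math

def target_workers(queued_tasks: int, tasks_per_worker: int = 4, max_workers: int = 500) -> int:
    if queued_tasks <= 0:
        return 0  # no jobs, no farm, no cost
    return min(math.ceil(queued_tasks / tasks_per_worker), max_workers)

print(target_workers(0))       # 0
print(target_workers(10))      # 3
print(target_workers(100000))  # 500 (capped by the guard rail)
```

The cap plays the same guard-rail role as the instance-size restrictions above: you scale with demand, but only up to a limit you chose deliberately.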
And ultimately the underlying complexity is abstracted away, and Anna can focus on work and collaborate with Carlos. Even when they're using different OSs, they have access to the same assets, because it's all in the cloud and interoperable. One additional perk of running a studio in the cloud is improved visibility. When you create studio resources you can assign tags, and these tags allow you to pull up reports on how many resources were used for which tasks, and how much render capacity you were running throughout the project. This gives you visibility into how much you're spending and which tasks cost you the most, for example, and it also helps on subsequent projects: you'll have a better view into the breakdown of your costs. Before I hand it over to Chris again, I want to stress one point: placing your instances close to where your artists are is important, I would say crucial. Currently Nimble Studio is offered in four regions across the US and Canada, in London for Europe, and in Tokyo and Sydney for Asia Pacific. Further expansion is planned; the current state of things is on the map. Now over to Chris. So I'll get back into Pichu and how we used all of that specifically with Blender. We used Blender as our DCC backbone, the same way we would use Maya or anything else. We can use whatever applications are best for the task at hand, so maybe I use ZBrush or Substance Painter or something else as an asset creator, because that's what I'm familiar with as an artist, or our contractors do, but ultimately it all feeds back into Blender. We did our asset creation, all our rigging, our animation, we made use of the new geometry nodes, and we did all our lighting and rendering in Blender with Cycles.
I'll start off with the bad news first and then get back to the good news. Some of the stuff we didn't like: in lighting, we felt there were some missing features. The lack of light linking hurt us in creating a stylized film. A good example is that you want a very tight specular highlight to hit right in the character's eyes, in a very specific spot, throughout most of the film, and a good way to do that is to associate a light with a specific piece of geometry. We couldn't find a way to do that easily in Blender, so that part was difficult for us. We also do a lot of things like putting specific rims on a face, so we used a lot of cards to get around it: a lot of blocking cards, a lot of foliage with primary visibility turned off, things like that, to make up for it. Another thing was the lack of checkpointing. Rendering in the cloud, we could have a frame that's a two-hour render, it's about 95% of the way through, and you're like, okay, great, I'm almost done, I can start compositing, and then the instance gets taken away from you, because it's a Spot Instance and somebody willing to pay On-Demand prices takes it from you. With checkpointing, that frame could pick back up at 95%, but since Blender doesn't have that, it starts over from zero, so you lose a bit of render time there. It can be a little frustrating. Also, for hair, we moved that whole pipeline, for the most part, into Houdini, just for the simulation and grooming; we found the grooming tools in Blender weren't quite what we needed for creating braids and the hair we were doing in the film. And staying with the simulation theme, for our clothing we moved into Maya and used nCloth, because with Blender's cloth simulation, one, our artists weren't as familiar with it, and two, we couldn't really get the results we wanted out of it in a short amount of time, so we
quickly pivoted over to other solutions. Now for the good stuff. The geometry nodes: they're awesome. We used custom node setups for scattering, and some of the setups we pulled off the marketplace really allowed us to very quickly populate the foliage and all the rocks and things throughout our sets. Besides the marketplace setups, we also used our own custom geometry node setups, which I'll talk about in a minute. And then in animation, one of the things that was pretty neat is that you can use reference footage and the compositor to very quickly put things together, so the director can see what you're thinking before you actually do the work, just to make sure everyone's on the same page and you can move forward to execution instead of stumbling around and trying a bunch of things. The pose library is pretty cool too. We could use the same poses on multiple different characters, because we're using the same rig for basically all the characters, so we could have a key artist come in, do all the very nice poses, and then hand those out to all our contractors to keep everything on model as best as possible. And then lighting with Cycles. Full disclosure, I've been lighting in Arnold from Softimage to Maya and things like that, but working in Cycles I actually found really, really nice. The speed and stability were oftentimes better than Arnold, in my experience working in Maya, and the shader editor is, in my opinion, way better. Maya's is pretty clunky and has a lot of quirks that are just weird to work around, a lot of tribal knowledge about how to make things go, but Blender was very straightforward, easy to dive into, and I really enjoyed that part of it. So, our geometry nodes. Our effects artist Joaquin was able to create our own custom setup for the snake grass you can see in the film; it's part of this area that Mayu has to navigate through. He was able to make it so you could customize the size, the shape, even the animation,
so it bends such that where the grass touches the ground it's not bending, so it doesn't feel like it's wiggling through the ground, but up at the top it's wiggling quite a bit as the wind hits it, things like that. There were also the marketplace tools we were able to download to really speed up our production process. And then for animation, like I mentioned, using the compositor and the pose library, which has a nice, easy-to-use interface, really smoothed out that whole process. Some animators had never animated in Blender before, but we could get them in and working quickly and keep the production nice and smooth. Here are a couple of shots working through that animation process. You can see Jason filmed himself and composited himself in, so he can act out the part of Mayu: is this what you'd like her to do, is this the way you think she'd react when she shows her mom, like, oh, I'm coming home from school, check out what I did, Mom? It's a very nice and quick way to get on the same page and make sure everybody's thinking the same thing, and then you can move forward to execution. So you move forward into the blocking pass, make sure everyone's cool with that, and you get a bit more polish, though at this point everything's still kind of penetrating through the cloth, things like that. Then you add the hair and the cloth, which was a more complicated process, if you want to hear about it later: the back and forth between the mom touching her daughter and her cloth, and the daughter touching her, and the back and forth that has to happen between cloth and animation, is pretty interesting. But you can see the final result here. To talk a bit about the lighting: like I mentioned before, one of the things that was really nice was the speed. We rendered Pichu at 4K, and for the most part it was about an hour per frame, sometimes a little less, and our heavier shots were closer to two hours, but I think that's pretty snappy
overall, considering the size we're rendering at. I should mention the stability was great. We had a lot of very heavy scenes, you'll see a lot of foliage in there, and they actually opened comparatively quickly. On the previous project we did in Maya, it would sometimes take 20 minutes to open a shot; these shots would open in a couple of minutes and you're ready to go. Very few crashes: you can start up the Cycles render, move things around, scoot stuff around without restarting, so I could see it happening live, and that was a really great experience. I know in Maya, if you do that too much, you need to save a lot, because you're probably going to crash it. And then the shader editor, which I mentioned is easy and intuitive, I also found very powerful and flexible. Blender has all the little Lego blocks you need, if you know what you're doing, to make whatever you need happen. One example is the refraction shader on the eyes. Normally you look at an eye straight on and it looks fine, but when you turn to the side, in a lot of stylized films you just kind of look right through it, which is not what happens with a real eye: you actually see the curvature, and it refracts the color and the shape of the eye at a side view. I was able to do that in Blender, and I thought it was pretty cool. We also used the shader editor to mix the physical sun and sky with HDRIs for our lighting. Here's one of the lighting shots, and you can see a lot of the cards we used in the background as blockers. All those trees on the side aren't actually rendered; they're just there to cast shadows and block light, and they really allow us to craft each shot as we wanted. And then we did our compositing in Nuke, I'm sorry, but we went into Nuke, and actually that pipeline was great, because for all our compositors that's what we were comfortable with, that's what we knew, and we needed to move fast. And the
pipeline going from Blender into Nuke was super easy, and a lot of our artists used the tools they knew and were comfortable with. We did our final color in DaVinci, just to get that last little touch of goodness, make sure all the highlights are just right, balance all the colors between the character and the background, all those good things, just to finish it off. And at this point I'll hand it back to Alexi briefly to talk about moving your production to the cloud. Don't go far away, I need you. So I just wanted to say a couple of words about bottlenecks, and I brought a picture of bottlenecks, so I have all the components. The list here is your typical bottlenecks that you run into when running your on-prem infrastructure at a studio, but it's also a list of the things people talk about when they say why you cannot do animation remotely, and you will find articles on the web explaining why exactly it's impossible. Most of these articles are pretty old, though, and the laws of, I don't know, is it Murphy's law? No, it's another one, it's Moore's law. Yeah, you have to know your laws. So things are progressing, and none of these things are showstoppers anymore; I just wanted to make sure we're all on the same page about that. But it's not only technical bottlenecks that hold studios back from realizing their big ideas. Some of the bottlenecks are of a non-technical nature, and I've listed them here, and curiously, or interestingly enough, the cloud also helps solve those. Take the bottleneck of location and talent: you're no longer limited to your local talent pool, you can hire from everywhere. On to the next bottleneck, time and money: you can start small in the cloud and take it from there, get up and running in a short time, and deliver at a higher quality level with the resources you have. You're not limited to the one workflow your existing infrastructure locks you into; you can try out a real-time
workflow, you can try out creating for AR and VR, for multiple different platforms, and if something goes wrong, if some experiments don't work out, you can just turn it off and you're not paying for it anymore; it's not standing there in your server room accumulating dust and depreciating in the process. And also security: for a very long time, securing your IP was the topmost concern. It's still topmost, but in the cloud it's doable with the set of tools you have there, and many companies who moved from on-prem to the cloud say the level of security they're able to achieve in the cloud is at least on the same level, if not higher.

Today we talked about content production and how it can be done in the cloud, but our focus with media and entertainment doesn't stop there. Internally we divide it into different areas: apart from content production, there is media supply chain, there is broadcast, there is direct-to-consumer streaming (read Netflix here), and data science and analytics for media. These are the areas that we at AWS divide the media and entertainment industry into, and in every one of these sections we manage solutions similar to what you see with Nimble Studio: an abstraction layer that makes it easier for you to stay creative and do the things you want to do, without having to worry about the underlying building blocks and orchestrating those. So I encourage you to check those out; there will be a link on one of the subsequent slides. And over to you, Chris, again, for the last time.

All right, thank you, Alexi. So yeah, if you'd like to watch the short film Pichu, it's on YouTube; hit the QR code here and please give it a look. We think it's a nice-looking film with a good message. It took us about nine months and 23 to 25 artists spread over six different countries. And part of our mission as Fuzzy Pixel is to consistently give back to the community, so we've made Mayu, the main character, her notebook, and a tree branch available on GitHub. You're free to download the assets, play
around with her, use her in Blender, and do whatever your little heart desires. And if you want to know more about Nimble Studio or other managed solutions for media and entertainment, you can check out Nimble Studio and our M&E stuff here. If you have any questions...

How does it work for the licenses? We have to pay for those licenses; we have our own license server in the cloud, and we just run it from there.

Do we have a distributed simulation system that can run simulations across multiple machines? Yes and no. It was a little more manual; we didn't have anything working where we could send simulation-type stuff to the farm. But what we could do with Nimble is spin up multiple instances in the cloud, so our FX artists would spin up five or six machines and hand-distribute the work across all those different machines. I was also, as a lighter, able to spin up two or three machines, and lighting can take a while sometimes, so on one machine I'd get something going and just let it run so I could see what it's going to look like in the end, then fire up another machine and get that going. It just allows us as artists to work much more efficiently.

The way we had it, no, not currently. I imagine if you know what you're doing, though, you can make that work. For us, we were using Slack a lot, and we also used SyncSketch for all our reviews; we'd use a Zoom or Chime call, all get on that, and use SyncSketch while we talked. But directly built into Nimble? No, not currently. We did have the shared storage and everything, just so we could share all the same files and be looking at the same things, but as far as collaboration tools, we went a little bit outside of Nimble for that.

Yeah, over here. We did that through Kitsu; I believe that's another open-source thing. We had our engineers hook that right into Blender, and then we used a system where we were versioning as we went along, so we have all the versions of the files; if we need to roll back, they're always there. But we
also had an unversioned file, which is what we were using to link our references into our Blender scenes. That way, when you open a scene up, you always have whatever the latest approved file is.

At this point, I know with GPUs getting cheaper, that's a bigger debate. The nice thing, like Alexi mentioned, is that you can get rid of that GPU if you're not using it, so you're only paying for what you're using at the moment, which can lower your costs a bit. But if you're using that thing 24/7, that's maybe a different conversation to have.

Probably, if you're going to sell it later, you need to understand how much value it will lose by then, and whether you'll be able to buy a new one, the better version, whatever the current best thing is, with that same money. Probably not. So it's that kind of conversation; there are many other aspects we can talk about, not necessarily about the cost, but yeah, the producer will take over my 3090 Ti, and that's it. [inaudible] And I told them: buy a cloud solution. IP security is another point. For example, with the remote access we used, via NICE DCV, it's just pixel streaming; you're not downloading any assets onto your local machine, so the assets don't leave wherever they're supposed to be, and this is another plus: you don't have copies of your assets on everybody's machines in their homes.

It's usually a conversation, because there's a shared security model, so usually you end up in a conversation and then they go through and analyze what you set up. But we have to go through pretty crazy security in order to be able to be out there, so something you can do is have your solutions architects run through your system to make sure everything is okay. I mentioned that Nimble Studio was revamped to make it well architected, and one of the so-called pillars of the Well-Architected Framework is security, so make sure it's up to the standards of how you make things secure in the
cloud, and when it comes to being compliant with a certain standard, you can do that; there's really quite a bit of security tooling there.

Oh yeah, we used linking. I think for the most part the linking worked pretty well. One of the things I found was that I wasn't able to modify the linked rig afterwards to do things like shot sculpting. I tried to use geometry nodes for that at first, but I'm not the best at geo nodes yet; it worked, but I defaulted back to Maya, because there you can modify it and then pump it back in. So if you could modify linked assets in Blender, I would have just done it there; I just couldn't figure out how to do it under the time constraints. Yeah, we didn't use USD on this one, but there's got to be ways to do that shot-sculpting work there at the end, just to make everything look right.

One last question; sorry, they're cutting us off on time. No, it was not a huge problem. One of the ways we worked around that is that you can diversify the types of instances you're getting, so maybe we get a slightly slower render, but we know that customers aren't constantly pulling that type of instance, and we're able to get our renders through that way and scale pretty well. I think it was just an issue at first, until we recognized the problem and found workarounds to get reliable renders. We also found that a lot of those on-demand-type instances are pulled during working hours, and then 5 o'clock hits, everything comes back, and you can get a massive amount of capacity at that point and really shove everything through. But I think we're out of time at this point, so thanks, everybody, for showing up.