A very good morning and a warm welcome to all of you. Today I welcome you all back for more of the presentations we have in store. Let's start with the first group for today, Blender Animations, managed by Mr. Sameer Shah Sabuddi and mentored by Ms. Pooja Bhavasar and Mr. Nitin Ayer. Netra 3D is an e-learning application designed to acquaint a student with the anatomy of the human eye, with its varied features, components and functionality in a 3D view space. It offers both exploration and practice, so the user can review and evaluate learning at his own pace, with easy navigability replete with the Android interactivity cavalcade of touch, pinch-zoom and swipe. Let us see what the presentation has.

Good morning, everyone. In this session we are going to discuss the various stages we went through in creating our final product. We belong to the Blender Animations group, which has been creating educational applications for desktop and Android platforms. In this internship we were given the task of creating an educational app for viewing the internal structure of the human eye, using software such as Blender 3D and Unity. The use of media is increasing day by day, and people are more inclined to use tablets and smartphones, so we wanted to exploit this for an educational system. Presently, our means of education is through textbooks. A textbook is convenient for theoretical topics, but not for a topic such as the eye, where we have a 3D model that needs interaction: the eye is best understood when we can see its cross sections and view it from various angles. In textbooks we have 2D images with no interaction; they are not in motion and only show static cross sections at fixed angles.
So in our app we have added the interactivities that Android phones and tablets offer: swipe, touch, pinch-zoom and accelerometer rotation. Our app basically consists of three stages, in which we explain the internal structure of the eye, the working of the eye through animation, and the various practice activities tied to those explanations. Starting with the basic modelling of the eye, my teammate Rakshita will continue with the stages involved in creating the eye model in Blender.

Hello, everyone. During the first two weeks of our internship we were trained in the Blender software. So what is Blender? Blender is a 3D open source computer graphics package in which we can model 3D objects, create animations for them and build interactive applications for the desktop. We were completely new to the software, and by learning the features present in Blender we could model things like a cup, a pan, a chess pawn and so on. Not only did we learn Blender, we also got the chance to teach it: one of the labs in the Crescent approached our mentors and asked them to conduct a workshop for their interns, and we got the opportunity to teach them Blender. That was one of the most interesting learning sessions of this internship. Now, our project required a 3D model of the human eye, which was created by the Blender experts. This slide shows the eye model without materials and textures; the next slide shows the model after we added materials and textures in Blender to give it the realistic look of a 3D human eye. Blender comes with its own game engine, the Blender Game Engine, which can be used to create 2D as well as 3D applications, but only for the desktop: it does not give a .apk as its output.
So we cannot use that game engine for Android devices. If we want our Blender files to run on Android devices, we need the Blender Player, but the interactivities we see in other Android apps, like swipe and zoom, are not supported through the Blender Player. So we had to explore other game engines, both open source and proprietary, to get those interactivities for our app. From here, Himalini will continue.

Thank you, Rakshita. A very good morning to the august gathering present here. I will walk you through the approaches we took other than the Blender Game Engine. The Blender Game Engine was restrictive in that an additional Blender Player had to be installed for the app to run. So beyond that, we explored a cavalcade of alternatives. The major alternatives in the open source category were Min3D, GameKit, Ogre, LibGDX, JMonkey Platform, Catcake3D and Cocoa3D. It wasn't logical to spend the whole course of the internship on every one of these, so we put notches in our timeline and divided ourselves into teams, each with one simple task: include all the interactivities of touch, zoom, pinch and swipe in a single Blender model. If we could do that, we could definitely extrapolate to the full eye-model functionality. Some options we excluded from the beginning. Ogre, the Object-Oriented Graphics Rendering Engine, was eliminated because it has no sound documentation and the quality of the graphics it produced for us was poor. Our major USP for this app was that text is bland compared to visuals; good visuals are what appeal to students, so poor graphics quality was something we had to eliminate at the outset. Catcake3D was still in beta, so we could not continue working with it.
JMonkey Platform similarly is used for low-level game development, so again the graphics quality was poor and we had to eliminate it as an option. LibGDX does not include the textures of imported .blend models, so on graphics grounds we had to eliminate it right away. And Cocoa3D does publish apps on Android and iOS, but most of the work done in it has been for iOS only. After this sieving, we narrowed our options down to two. The first was Min3D, a lightweight 3D framework for Android. What you do is take your Blender model, the .blend file, and export its mesh as .obj and .mtl files; Min3D then flattens the object into an array of vertices. It requires the ADT bundle with the Min3D framework installed. So you can actually take the model and its vertices and manipulate them, making whatever changes or modifications you want in the Android application code. We did not use Min3D as our final option, though: for one model we could create the touch, zoom, pinch and swipe interactivities, but with multiple models many glitches appeared, so this option could not be considered. We then explored GameKit. GameKit is a cross-platform 3D game engine using OgreKit and Bullet Physics for Windows, Linux, Mac and iPhone. What GameKit provides is a bridge, a pipeline, from .blend to .apk. We can put in the entire Blender Game Engine logic we have created and use GameKit just to carry it from one place to the other. The way this helps is that we don't have to go through the hassle of adding more logic: whatever we have done in the Blender Game Engine can be used directly, from Blender to GameKit. But GameKit also fell at a major stopping point: because it exports the entire logic, we have no elbow room to add our own.
Suppose we want to change something or add more interactivity: we would have to go all the way back to the Blender Game Engine and feed in all those additional features there. The export as one complete package restricted us from adding any more features. And because the entire logic is exported, we had no access to the model for manipulation; if I wanted to explore the components of the model, I could not do that with GameKit. So GameKit, too, came to a premature stop. After exploring these game engines, we were introduced to the very promising Unity 3D. Unity 3D is a game development ecosystem. It has a powerful rendering engine, which won us over right away, because that is exactly what we needed for textures; visuals were our major selling point, the thing we kept saying students find more appealing, and Unity 3D delivered them. It has rapid workflows to create interactive 3D and 2D content. It has multi-platform publishing, so you can build your application for iOS, Mac, Windows, Linux and, obviously, Android. We had access to assets in the Asset Store, an inbuilt feature of Unity. It has sound documentation and an active forum. With the other open source game engines we had wound through tangles from here to there and gotten nowhere, because there was no documentation: once we reached a certain point, our exploration was terminated because we could not go any further. Unity had a very active community that could help us with any doubts. Its requirements were the JDK, the ADT bundle and the .NET framework. So Unity was our best bet. Unity is a proprietary product, not open source, but it was still our best bet, because it had the one thing that all the open source game engines we explored lacked: our major requirement, neat and crisp graphics.
These are the things that attract a student. A student will prefer an app like this over textbooks if he sees good graphics, something colourful, components rendered with specularity and all those features. All those open source game engines had poor presentation, so Unity 3D turned out to be our best option: we got correct textures, and we could manipulate every object individually. For every object and its subcomponents we could write scripts, and for scripting we had the flexibility of three languages: JavaScript, C# and Boo. Whichever of these we were comfortable in, we could attach scripts individually to models. Unity also had an extension for playing animation videos on the Android platform. Initially, when we were using the Blender Game Engine, we could not play the animation without having the Blender Player installed, which stopped us from going any further; here the extension was already built in. And the tangible .apk file we needed, Unity 3D gave us directly as an export. The best part was that we got a time-limited free licence to work with Unity 3D Pro. Our mentor, Professor Sameer Sahasrabade, approached the head of operations of Unity in India, Alex McCready, for a free licence so we could explore whether Unity 3D actually delivers what it promises. Our request was heard and we got a free licence for two months, giving us access to Unity 3D Pro features such as the glow effect, advanced graphics and animation playback. Another major incentive was the work that had already been done in Unity 3D: many games, such as Temple Run and Avon Guard, were created in Unity 3D. And some work had been done by our distant mentors from CMU, who could help us whenever we got stuck in Unity 3D.
So I think I should hand over the mic to Ravi, who will guide you through how we worked in Unity 3D.

Now we see how we implemented our app in Unity 3D. This is the Unity 3D logo; we got a licence, so we are happy. This is the outline workflow for the implementation of our app. We create the models in Blender and export them in FBX format; then in Unity 3D we import those FBX models. Next, we add interactivity to the models, because that is the intention of our app. There are many interactivities possible on Android, but for explaining the structure of the eye we stuck to four major ones: touch, swipe, pinch-zoom and rotate. To implement them, Unity provides two options: we can write the code in either C# or JavaScript, and we used C# for this app. Here is an example of the code for the touch interactivity, as you can see, and this is for the pinch-zoom interactivity. While explaining the internal structure of the eye, whenever the user taps on, say, the retina, that part of the eye has to be highlighted. To know which part of the eye the user has tapped, we need the touch interactivity; then we can highlight that portion of the eye and also provide information about that part. This is for pinch-zoom. Pinch-zoom and accelerometer rotation are the prominent interactions possible in our app. And this is the game plan we followed for the Netra 3D application, that is, for our app. It contains three phases, which can be better explained with the video, so I hand over to Ruchi.

Thank you, Ravi. Now I'll show you the video featuring this application. This is our Netra 3D icon, which we developed in Blender itself. After this splash screen there are three modules: structure of the eye, working of the eye and adaptation of the eye.
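The touch, pinch-zoom and accelerometer interactivities described above can be sketched as a single Unity C# script. This is only a minimal illustration of the general pattern, not the actual Netra 3D code (which the transcript does not show); the class name, field names and tuning values are all assumptions.

```csharp
using UnityEngine;

// Illustrative sketch (not the actual Netra 3D source): one script that
// handles tap-to-highlight, pinch-zoom and accelerometer rotation.
public class EyeInteraction : MonoBehaviour
{
    public Camera cam;                  // camera looking at the eye model
    public float zoomSpeed = 0.05f;     // assumed tuning value
    public float tiltSensitivity = 30f; // degrees of rotation per unit of tilt

    void Update()
    {
        if (Input.touchCount == 1 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // Touch: raycast from the tap to find which part (e.g. the retina)
            // was hit, so it can be highlighted and its description shown.
            Ray ray = cam.ScreenPointToRay(Input.GetTouch(0).position);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
                Debug.Log("Tapped part: " + hit.collider.name);
        }
        else if (Input.touchCount == 2)
        {
            // Pinch-zoom: compare the distance between the two fingers in this
            // frame with their distance in the previous frame.
            Touch t0 = Input.GetTouch(0), t1 = Input.GetTouch(1);
            float prevDist = ((t0.position - t0.deltaPosition) -
                              (t1.position - t1.deltaPosition)).magnitude;
            float currDist = (t0.position - t1.position).magnitude;
            cam.fieldOfView = Mathf.Clamp(
                cam.fieldOfView - (currDist - prevDist) * zoomSpeed, 20f, 60f);
        }
        else
        {
            // Accelerometer rotation: the device's tilt drives the model's
            // orientation, giving the "tilt to view in 3D" behaviour.
            Vector3 tilt = Input.acceleration;
            transform.rotation = Quaternion.Euler(
                tilt.y * tiltSensitivity, -tilt.x * tiltSensitivity, 0f);
        }
    }
}
```

Attached to the model's root object, with a collider on each named eye part, a script along these lines would give the tap, pinch and tilt behaviour the presenters describe.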
In the structure-of-the-eye module we have instructions for exploring its controls: how do we control it? The detailed structure of the eye is explained here. You can see that when we click on any component of the eye, its detailed information appears in the left panel: this is for the blind spot, this for the cornea. So we can have the whole view of the eye's anatomy in a 3D view scope. We have a zoom feature as well; we can zoom it like this. And then we have the accelerometer, so that we can get a 3D view. See, the object is tilting; you can notice a little latency here. By tilting the device we get the 3D view like this. I'll just forward the video a little. Then we have a practice activity as well, which I will show you: a quiz for students about the various parts of the eye. There are questions like "select the correct option for the highlighted part in the image". We select an option; I have chosen blind spot here, so we get a description and my score goes from zero to one. We have multiple questions here. That completes the practice activity in the structure of the eye. Then in the working of the eye we have three modules: the first is image formation, the second is defects of the eye, and the third is a practice activity. Here we can see the image formation; all this animation was created by previous interns, and we just imported it into Unity. Then we have the defects of the eye, and this is its practice activity. And here we have the adaptation-of-the-eye part: how the eye adapts to light intensity. We have a video showing light falling on the eye, and then a practice activity for this as well. This video shows the effect of light on the size of the pupil: you can see that when the light increases, the pupil contracts, and when the light intensity decreases, it dilates.
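The pupil response shown in the video, and the slider-based practice activity built on it, could be sketched in Unity C# like this. Again, this is only an illustrative guess at the approach, with assumed names and scale values, not the app's real code.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: a UI slider for light intensity drives the pupil's
// scale, so more light contracts the pupil and less light dilates it.
public class PupilResponse : MonoBehaviour
{
    public Slider lightSlider;           // 0 = dark, 1 = full brightness
    public Transform pupil;              // pupil mesh inside the eye model
    public float dilatedScale = 1.0f;    // assumed scale in darkness
    public float contractedScale = 0.4f; // assumed scale in bright light

    void Update()
    {
        // Linearly interpolate the pupil size from the slider value:
        // higher light intensity means a smaller pupil.
        float s = Mathf.Lerp(dilatedScale, contractedScale, lightSlider.value);
        pupil.localScale = new Vector3(s, s, pupil.localScale.z);
    }
}
```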
So this is the practice activity: we increase the light intensity using this slider bar, and you can see the changes in the pupil; it contracts and dilates accordingly. Now I'll hand this over; he'll tell you about the challenges we faced.

Good morning to all. Now I'm going to explain the various challenges we faced while making the app. Of the many technical challenges, we would like to show one prominent example: the installation of open source game engines. There is no proper documentation available for installing many of the open source game engines, so it is very tough, and we got stuck with it; no one knew how to do the installation, and no documentation was available. Sometimes we felt that coding in a game engine was easier than installing it; that was the situation we faced. Although we also faced many coding problems in Unity 3D, it has a forum with quick responses, so the coding was easier and our work was simpler with Unity. Coming to what we have learned: we came to know the various open source game engines and their pros and cons. We learned Blender, which is good enough for making the models we need, but we came to know the real importance of Blender when we were asked to conduct a workshop on it. Suddenly we were facing students who were asking questions and testing our knowledge of Blender; that was the biggest learning point we had. We also learned Unity, teamwork and team management. This is what we have learned here in the summer internship of 2014. And now over to Arpana.

Hello, everyone. Now I'm going to tell you about the future scope of our project.
We can introduce multiple Indian languages, like Hindi, Telugu and Marathi, into our application by replacing the text. We can apply this same approach of learning, with all these user interactivities, to various human body parts: the brain, the heart, the lungs and so on. We can also extend the features by adding gesture recognition, eye tracking and the like. And since Unity can give output for multiple platforms such as Windows, iOS and Linux, and for devices like the Xbox or the Oculus Rift, we could use this app with the Oculus Rift as well; the Oculus Rift lets us go into the virtual world of the application. This is our team.

I have one question. How much time did you take to actually develop the application, apart from the learning? If I give it to you now, how long will it take?
Sir, to actually create the application took less time; we could complete it...
Exactly, that's why I am asking.
Yes sir, two weeks.
Two weeks with how many people?
Two weeks and seven people.
Two weeks times seven people, so 14 person-weeks required for this.
Yes sir.
But 14 person-weeks of Blender experts, or Unity experts, or both?
Both. Most of the time was actually spent deliberating over which game engine to choose: we worked on each of the open source game engines to figure out the shortcomings of each one. Because Unity 3D had the crisp and snappy graphics we needed, we converged and stopped at that point, began the work, and finished it.
It works on desktop also?
We can create that, but currently it works on Android.
You said that it produces all kinds of outputs. Unity?
Yes sir, we can create a desktop application, but this one was built specifically for Android, with the touch, pinch-zoom and swipe interactivities.
You just now said Unity creates all kinds of outputs.
Yes sir, this can be created for desktop.
And for desktop, what is the output format?
A .exe.
A .exe. What about Linux?
Linux doesn't run .exe.
Sir, for Linux we have those options in the build settings of Unity itself.
Then wherever you host this, you should host builds for all the operating systems, so we can see whether Unity delivers. And make sure it is hosted somewhere for people to see.
Sir, that can be exported, but the interactivities will be different for different platforms, so we'll have to change the interactivity code accordingly and then build the application from that.
Are you sure?
Sir, pinch-zoom can't work for desktop applications; that has to be on Android only. For desktop we'll use keyboard and mouse interactivities. Sir, all these features are also present in the Blender Game Engine itself, so that version could be done in less time than we took for Android.
No, no, your basic objective is to create something cross-platform. You started by saying that the Blender Game Engine, though great, does not work with Android for some reason. Okay, that's why you chose Unity. Right?
Yes sir.
If you want to develop for both, you need a single development platform, and then it has to be only Unity.
Yes sir.
For the mouse and keyboard interaction, the only thing is that you have to attach the same listener to different kinds of events, and then it should work everywhere.
Thank you all for your patience.