Hello, good afternoon everyone. My name is Vinak Khilap, and I am here to present our topic: development on the Aakash platform. It consists of two parts: porting of Proximity to the Aakash tablet, presented by me, Nilesh Singh, and Pradhyam Oite; and peer-to-peer communication on the Aakash tablet, presented by Dilip Singh and Pushpak. Just moments ago, Vivek and Ajay told you about the published lecture. It is an interactive lecture that you can distribute on pen drives and SD cards, and they also told you about the web version. But the main question that comes to mind is this: we already have it on our desktops, so why do we need it on Android, that is, on the Aakash tablet? If you think about the future, you will see more and more people having tablets at home instead of desktop PCs, and the Aakash tablet, being the cheapest of all, has the biggest advantage. Here is some information about the Aakash tablet. It is an initiative of the MHRD, Government of India, and it will be distributed at very cheap rates throughout the nation. As for specifications, the Aakash tablet runs the Android 2.3 OS, has a 700 MHz processor, 256 MB RAM, and 2 GB of internal storage, and it supports up to 32 GB of external storage. Here are some of the media file formats supported by Android 2.3 on the Aakash: for audio, MP3 and AAC; for video, MP4 and 3GP; for images, JPEG, GIF, PNG, and bitmap. As Vivek showed us, this is how the published lecture looks on the desktop: it consists of the video, the slides, and a tree index, which contains the themes, lecture topics, and information. But the Aakash tablet has only a 7-inch screen, so fitting the entire desktop layout is not really possible. Instead, we are planning to build several published modes: one is the audio lecture, where you can listen to the audio of the professor, and the other is the video lecture. So here is the audio lecture.
This is how it will look. It will contain all the lecture information, like the course name and speaker name, and it will contain the lecture slides. Here is an image button that opens a sliding drawer, which will be explained to you later by Nilesh Singh. And here is an image of the professor. Why do we need the professor's image there? Suppose you are watching the news and there is a phone interview: an image of the person is displayed, which gives you the feel that you are seeing the person live. That is the main reason for displaying the image. As is obvious, the audio lecture consists of the professor's audio, which is extracted from the video using a multimedia transcoder, and it also consists of the lecture slides. The slides are synchronized with the audio lecture, as Vivek and Ajay explained. So how do we get the data, such as the professor's name and all? It is read from an XML file. If you look at it properly, you can see the course name there. Very good afternoon to all of you. My name is Nilesh Singh, and I am going to continue this topic from here. I am going to explain the video published mode. When we use videos in Proximity, we have to use some UI components. One of them is the sliding drawer, which you have already heard mentioned. I would like to show you what exactly the sliding drawer is. Here you will see some text: this is your sliding drawer. It is not clearly visible here, so let me show you. Look, this is exactly the sliding drawer we are going to use in Proximity for both the video and the audio lecture. When we pull it from the right side to the left side, it will show you some components.
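The metadata just described, such as the course name and speaker name, is read from an XML file. As a minimal sketch of what that parsing might look like in plain Java (the tag names `courseName` and `speakerName` are assumptions for illustration, since the actual schema was not shown):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class LectureMeta {
    // Parse the course and speaker names out of the lecture's XML description.
    static String[] readMeta(InputStream in) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(in);
        String course = doc.getElementsByTagName("courseName").item(0).getTextContent();
        String speaker = doc.getElementsByTagName("speakerName").item(0).getTextContent();
        return new String[]{course, speaker};
    }

    public static void main(String[] args) throws Exception {
        String xml = "<lecture><courseName>Data Structures</courseName>"
                   + "<speakerName>Prof. Example</speakerName></lecture>";
        String[] meta = readMeta(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        System.out.println(meta[0] + " / " + meta[1]);
    }
}
```

On Android 2.3 the same DOM API is available, so the activity could fill its text views from the values returned here.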
You can see here: on that sliding drawer we will show all this information, the index, in tree form. Now, we will use the same thing with the video also. This is one snapshot of the video mode. Apart from the sliding drawer, we have some other components: an ImageView, which we will use to display an image at a particular place; a VideoView for displaying the video; and an ExpandableListView, which will be new for you, and which we will use to display the tree structure inside the video's sliding drawer. Then there are Buttons for the controls and TextViews for showing text. Now, the video lecture published mode. It is similar to the audio lecture published mode; again we will have the lecture video and the lecture slides. First of all, what is the lecture video? You saw it in the very first slide, a snapshot of this presentation. Here we will have the sliding drawer and the lecture video of any professor, and they will be synchronized based on timing. Why timing? If we want to jump to some particular topic, or to a particular second, to listen to that particular topic, we can directly click on that topic in the tree. If we click on any topic, the video will directly jump to that particular point. That tree view you will find in the sliding drawer. Then the video lecture and the lecture slides are synchronized: if the professor is saying something about some topic, the slide for that particular topic will be shown simultaneously.
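The synchronization just described can be reduced to one lookup: given the start time of each slide, find which slide should be on screen at the current playback position. A sketch of that lookup, under the assumption (not stated in the talk) that the XML provides the slides' start times in sorted order:

```java
import java.util.Arrays;

public class SlideSync {
    // startTimes holds each slide's start time in seconds, sorted ascending.
    // Returns the index of the slide that should be shown at playback time t.
    static int slideIndexAt(int[] startTimes, int t) {
        int i = Arrays.binarySearch(startTimes, t);
        if (i >= 0) return i;          // t falls exactly on a slide boundary
        int insertion = -i - 1;        // where t would be inserted
        return Math.max(0, insertion - 1); // slide whose start precedes t
    }

    public static void main(String[] args) {
        int[] starts = {0, 30, 95, 180};
        System.out.println(slideIndexAt(starts, 100)); // prints 2
    }
}
```

Jumping from the tree index works in the opposite direction: clicking a topic seeks the VideoView to that topic's stored start time, and this same lookup then picks the matching slide.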
If the professor changes the topic, it will again jump to that particular topic's slide. Then there is reading data from the XML file. This is basically common to the whole Proximity presentation: as you heard earlier, we get all the information from the XML file, and that reading of data from the XML file is what we are working on here. Now I would like to call my colleague Pradhna to continue this presentation. I am Pradhna, and to give my introduction: in terms of project experience, I am the most junior person on this team, due to which I am suffering a very big disadvantage; my colleagues sitting here have given me the most boring part to explain. I do not have any videos to show you, and I do not have any fancy technologies to explain. I just have some design documentation. So let me ask: you are from a computer science background, I assume. What is the significance of a DFD, a data flow diagram? Can anyone tell me? For example, after a month, when your lead sees your output and tells you, "See, this is missing," you can say, "This is the DFD, and it has been approved by you, so that is not my mistake." To put it in a better way, a DFD gives you a reporting format for what you are going to develop, and a check on whether the whole team understands what we are designing. Right now we are talking about a very small application, Proximity, which we are going to embed on the Aakash tablet, and there is one more project coming, the peer-to-peer communication. You will be distributed among the modules, and at the end we want to see all those modules on the tablet.
So it is going to be a very big project, and all of us have to understand the importance of each module, where it stands, and how the data flows from one module to the other modules. These are some important factors to understand. All right. You have seen the application; this is one of the UIs planned for Android. Now tell me, with respect to the DFD, what is the obvious, important entity? The end user: the student who will be viewing the published lecture. The student is the most important person in our whole application; it is only for them that we are doing all this. So let us go to the diagram. The user performs actions on the app running on the tablet, and accordingly the response is sent back. What will those actions be? Very obvious ones: for example, if a video is being viewed, I might click next, pause, or go to previous. Even in this tree, what we have planned is to design it according to the theme. For example, a professor may cover several topics on, say, data structures: that can be linked lists, that can be something more. Suppose I would like to view only the linked list part: only the linked list slides, and only the linked list video. What will I do? I will simply navigate the index and go to that particular topic. That kind of flexibility the student will definitely want, so one of the actions will be navigating based on the index. If you see the topmost arrow, it is starting the application: first of all, obviously, you have to run the application. The second action would be to show the navigation index to the user, so that the user can navigate based on their requirement.
Then, based on whatever the student or user has clicked, show that output. Then, with respect to videos, handle clicks on next, previous, and pause, and send the response back accordingly. Now here is one more part of the diagram: view the published lecture. But where does this lecture come from? There has to be some executable, a jar file, that comes to mind, to run the application. Whatever the Proximity people showed you was a jar file being executed; that is how they took those snapshots. With respect to Android, it is a .apk file, which we will be installing on the Aakash tablet. So that will come from here: import and install the .apk on the tablet, and then view the published lecture. And where does this .apk file come from? It comes from the edit-and-publish mode. Initially you saw XML data being mapped to some output; that is what the edit-and-publish mode does. Basically, the Proximity edit tool imports the videos, the slides, and the time frame of each slide, and prepares an XML file which has the correct data to produce the output. Then we publish that XML file and create a .apk for Android, which is finally shown on the tablet. That is the flow. Specifically, viewing the published lecture is the part we are currently focusing on and developing. As Nilesh has shown you with the sliding drawer, that is exactly what we are trying to do right now: show that output on the tablet, with the video and the slides in whatever form we want. That is the current design. And what will be the future enhancement? This: we are planning to have the editing tool on the Aakash tablet itself.
The tool that Vivek and Ajay described initially is the desktop Proximity edit tool. Now what we want is that editing tool on the tablet, because we are saying we want every application on the Aakash tablet; there is no point in giving a desktop application to edit the XML and then importing and running it. So we are planning to give the flexibility to edit the XML on the tablet. That process will comprise these steps. First, reading the XML file on the Aakash tablet. The flow would be: the Proximity edit tool imports the videos, slides, audio, whatever is required, along with the time frames, and creates an XML file; that XML file we then provide in editable mode on the tablet. Say this is the XML file and I want to edit it. The simplest UI I can think of is labels on the left-hand side, like video name and course name, and on the right-hand side the corresponding values in text boxes, so the user can edit whatever they want. That will be the editing of the XML. Third, as I said, making the data fields in the GUI editable: the values are in text boxes, the user can edit and save, and accordingly the .apk will be generated. Fourth, binding the XML file and the GUI so that any change in one is automatically reflected in the other; definitely, once the data is saved it has to be reflected in the published lecture, only then does it make sense. These are the steps we have to follow for the future enhancement of editing the XML on the tablet. And this is nothing but an example of the XML file which we are going to edit on the tablet; basically, on the tablet we will give the user a GUI to edit and save.
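The edit-and-save step above boils down to loading the XML, overwriting one field's text from the text box, and serializing the file back out. A minimal sketch of that round trip in plain Java, assuming a field name like `courseName` (the real schema was not shown):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import java.io.ByteArrayInputStream;
import java.io.StringWriter;

public class XmlEditor {
    // Load lecture XML, overwrite one field's text content, and serialize it back.
    static String setField(String xml, String tag, String value) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        doc.getElementsByTagName(tag).item(0).setTextContent(value);
        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<lecture><courseName>Old Name</courseName></lecture>";
        System.out.println(setField(xml, "courseName", "New Course"));
    }
}
```

On the tablet, the returned string would be written back to the lecture's XML file before it is handed to the publish process.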
So a teacher, or the editing team, will have an XML file coming from the Proximity tool, and here our GUI will come, which will contain those values in text boxes. They will edit the XML file, and the updated XML file will be given to the publish process, which will do nothing but generate a .apk file for us. How to create a .apk file in Eclipse, how to create the Android application, we will be seeing soon, in the next couple of days. And the data is nothing but whatever is required: the XML file and the lecture data, that is, the videos, slides, and audio. So that is the future enhancement, and what I described before is the current development which is going on. That is all from my side. Thank you. Do you have any questions for the three of us on the development of Proximity on Android? Anything with respect to the flow, or anything you have not understood? Come on. Anything? All right, thanks a lot. I will invite Dilip Singh to present the peer-to-peer communication development on the Aakash tablet. Thank you. I am Dilip Singh, and we want to design peer-to-peer communication between tablets. A lot of tablets are coming to this institute, so we want communication between each tablet; hence we need to design peer-to-peer communication on the Aakash tablet. It is like a Skype-type project. That is the introduction to this project. Skype is an already available application, but we want to design communication through Wi-Fi, so this application is based on Wi-Fi. You can do both text chat and voice chat with a friend in the local Wi-Fi area, and it allows voice communication between two tablets without needing the internet; it is based purely on Wi-Fi, so you only need a local Wi-Fi access point.
So that is the main goal: why waste money on calling through your phones, or on the internet? Just use the local Wi-Fi and directly call your friends: connect to the local Wi-Fi access point and start talking. The architecture is a simple client-server architecture, with network protocols underneath carrying the voice data, and we are using the UDP network protocol for the communication between tablets. Why UDP? TCP is connection-oriented: if some data is lost it retransmits, so the communication gets interrupted by delays, and on failure the connection is closed. UDP is connectionless, and voice is a time-sensitive application, so UDP is preferable: dropping a packet is better than delaying a packet. You can visit this URL for the difference between TCP and UDP in voice communication. This is the flow diagram of the overall design, how to design the communication between tablets. This is the beginning part: first we connect to the local Wi-Fi network, find out our own IP address, and display the number of users connected on the local Wi-Fi. This is the overall structure of the communication between tablets: placing a call, accepting and picking up the call, and ending the call. There are a lot of applications here. We need a server application that stores the IP address, MAC address, roll number, and user name, so we can identify users by roll number. Suppose I want to call another person: I know the other person's roll number, so I can directly call that person and they pick up the call. The server maps the roll number to the MAC ID and IP address of that person's tablet; the connection is established through the IP address, and then we can talk to each other.
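The roll-number lookup just described can be sketched as a small in-memory registry on the server side. The field names and the `Peer` record here are illustrative assumptions; the talk only says the server holds IP address, MAC address, roll number, and user name:

```java
import java.util.HashMap;
import java.util.Map;

public class PeerRegistry {
    // One registered tablet: the connection record the server keeps per user.
    static class Peer {
        final String ip, mac, name;
        Peer(String ip, String mac, String name) {
            this.ip = ip; this.mac = mac; this.name = name;
        }
    }

    private final Map<String, Peer> peers = new HashMap<>();

    // A tablet registers itself when it joins the local Wi-Fi.
    void register(String roll, String ip, String mac, String name) {
        peers.put(roll, new Peer(ip, mac, name));
    }

    // Resolve a callee's roll number to the IP address we dial; null if unknown.
    String resolveIp(String roll) {
        Peer p = peers.get(roll);
        return p == null ? null : p.ip;
    }

    public static void main(String[] args) {
        PeerRegistry reg = new PeerRegistry();
        reg.register("113050017", "10.0.0.5", "00:11:22:33:44:55", "Dilip");
        System.out.println(reg.resolveIp("113050017")); // prints 10.0.0.5
    }
}
```

In the actual design this table would live in the MySQL back end mentioned later; the map is just a stand-in to show the lookup.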
As for further implementation: suppose I am connected to one local Wi-Fi and the other person is connected to another Wi-Fi. Then we need to design an ad hoc Wi-Fi, so we can communicate directly from one Wi-Fi to another, like a virtual private network. Another item is tracing the MAC ID. This is limited, because the MAC ID is only visible within one Wi-Fi; if a user is connected to another Wi-Fi, we cannot identify the MAC address of that Aakash tablet's hardware. That is the limitation, and we plan to work out how to trace the MAC ID of a user connected to another Wi-Fi. Another feature is video transmission: the Aakash tablet can keep showing a real-time video. Suppose I am here and a lecture is going on in the classroom; we can watch the real-time video directly, and recorded video also; we can hold recorded videos. We also plan to implement file sharing between tablets: photos, text, and video can be shared between tablets through the server. And another one is call conferencing, which we can also implement on the Aakash tablet. So there are a lot of applications here. First of all, we are designing a simple application: connect to the local Wi-Fi and talk between tablets. As for the overall requirements: the software interface is Android 2.3, and it is based on Java. A SIP (Session Initiation Protocol) library is also available, with many applications and APIs we can use, and for the back end we can use MySQL, which will hold the user name, roll number, MAC ID, and Android ID.
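The UDP voice path chosen earlier can be sketched with Java's datagram sockets, which are the same on Android as on the desktop. This is a minimal send/receive pair, not the real audio pipeline; the 320-byte frame size is an illustrative assumption (roughly 20 ms of 8 kHz 16-bit mono audio):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpVoice {
    // Send one audio frame over UDP; lost frames are simply skipped, never retransmitted.
    static void sendFrame(DatagramSocket sock, byte[] frame,
                          InetAddress peer, int port) throws Exception {
        sock.send(new DatagramPacket(frame, frame.length, peer, port));
    }

    // Block until the next frame arrives; returns its payload length.
    static int receiveFrame(DatagramSocket sock, byte[] buf) throws Exception {
        DatagramPacket p = new DatagramPacket(buf, buf.length);
        sock.receive(p);
        return p.getLength();
    }

    public static void main(String[] args) throws Exception {
        DatagramSocket rx = new DatagramSocket(0); // receiver on any free port
        DatagramSocket tx = new DatagramSocket();
        byte[] frame = new byte[320];              // one voice frame (assumed size)
        sendFrame(tx, frame, InetAddress.getLoopbackAddress(), rx.getLocalPort());
        byte[] buf = new byte[1500];
        System.out.println(receiveFrame(rx, buf)); // prints 320
        tx.close();
        rx.close();
    }
}
```

In the real application the peer's address would come from the server lookup by roll number, and the frames would come from the microphone; here the loopback address just demonstrates the connectionless round trip.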