My name is Kostu and I will be talking about the Android libraries. The last presentation, as Mahesh sir told you, was a little heavy, so hopefully mine will be much simpler, and I won't be covering too much about the Android architecture as a whole. Nibbith has already covered the Linux kernel, so I will be talking about the libraries. In one of the presentations given by the Android engineers, the libraries were classified into three or four different parts. The first is the libc, which is very important: without the libc, the entire Android OS cannot function. Then there is the Surface Manager, which is SurfaceFlinger, and another one is AudioFlinger. Those two are daemons, that is, services: once the Android OS starts, they keep on running. All the other libraries are just simple .so files, similar to the shared library files you get in a Linux OS, and these are loaded dynamically at runtime. So first let us start looking into the libc. In general, libc is the standard C library; the one on Ubuntu is glibc, the GNU libc, but for Android, Google developed their own library. The libc provides built-in macros, type definitions, and functions. Suppose you are writing a simple Hello World program: inside int main, all we write is printf("Hello World"). That printf function is defined in the libc. Like this, all the standard functions are defined in the libc, and we just write simple programs and compile them to get the executable. The different libc implementations are glibc, the GNU libc; uClibc, which is meant for embedded systems and is quite a bit smaller than glibc; and then Google developed their own version of libc, which is called Bionic.
Google did not write Bionic from scratch; it is a port of the BSD libc, and it supports the ARM and x86 architectures. The purpose behind creating a different libc rather than using glibc or uClibc was, first, the BSD license. glibc carries the GPL/LGPL license, and the concern was that applications linked against it would have to open up their source code; if they had kept glibc, all the applications using the libc would have had to make their sources open, which is not good for the developers. The second reason is small size: this libc is for embedded systems, for mobile phones and tablets, so the library has to be much smaller. And the third is that tablets and mobile phones have very limited power and the processor is also not very powerful, so speed is very important. Now, in Bionic itself, pthreads are not fully supported: they made a smaller implementation, and not all the pthread functions are present, so pthreads cannot be used completely with this libc. Next, they have not provided support for wide characters: if a character is wider than one byte, you cannot use it with this libc. Nor full locales: support is provided for only a limited set of languages, such as English. C++ exception support is also not provided. Suppose you write code in C++ with two functions, where one function calls the other, and the called function throws an exception. With other libraries, like glibc's runtime, the calling function is informed that an exception has been thrown; but with Bionic, even though the called function throws the exception, the calling function will not be notified.
Bionic also provides additional support for system properties and logging. There is a logging facility available in Android, and the same support is built into Bionic: all the daemons that use the logging facility get their logs written into the same log. On an actual mobile or tablet, the library is located at /system/lib/libc.so. The next point is that many components of Android depend on Bionic. Let me show you how the system starts booting and how the processes and daemons start running: many of those components depend on the libc. Next, if you have written code in C/C++, that is, in the native languages, then you have to use the Android NDK. The Android NDK is different from the Android SDK: SDK stands for Software Development Kit, whereas NDK is the Native Development Kit. It provides the compilers and toolchains you can use to compile C/C++ code and generate the libraries. For example, Angry Birds and the Adobe Flash Player are written in C/C++. Now, back to the fourth point, that many components of Android depend on Bionic: here is a dependency list. Take bootable/diskinstaller/installer: it is one of the files from the Android system, and the entries against it are the functions it uses from the libc, such as fprintf to write data into a file. Like this, dependencies exist for many components; in total there are around 1200 pages in this file, some 32,000 dependencies across the different components. The next library is WebKit. It is a web browser engine used to render HTML pages. It has full support for CSS, JavaScript, DOM, and Ajax. It is located at /system/lib/libwebcore.so, and pages can be rendered in full desktop mode also.
So the whole page can be shown just as it would appear on a desktop. The next library is SQLite. SQLite is a serverless, zero-configuration, transactional database engine. Why did they choose it? Had they chosen another database engine such as MySQL, it would have required a server running all the time. But SQLite, as this says, is serverless and zero-configuration, so you do not need to configure the database or its engine in order to use it. The Android system uses SQLite for many purposes, such as the phone book to store your contacts, the call history with all the received and missed calls, your SMS, MMS, and so on. The library is located at /system/lib/libsqlite.so. One warning they have given is that if an Android database is corrupt, the Android system will automatically delete the whole database without even asking the user; it simply gets deleted, and if there is code for creating a new database, that code executes and a completely empty database is created. Next is SGL, the Scalable Graphics Library, also known as the Skia Graphics Library. Skia is the name of the company which developed this library and which was acquired by Google in 2005. It is a 2D graphics engine, written in C++, and it provides the backend for the CPU-based rasterizer, for rendering PDFs, and for OpenGL; it is located at /system/lib/libskia.so. Skia was the 2D graphics library; Android also supports OpenGL ES, which is 3D graphics. OpenGL ES, the Open Graphics Library for Embedded Systems, is designed for embedded systems; it can use a built-in graphics processor for hardware acceleration, or, if that is not available, a software rasterizer is there as well. As for the API specification, Android 1.x provided support for OpenGL ES 1.0 and 1.1 with a CPU-based implementation.
So there was no hardware optimization there; from Android 2.2 onwards, support was added for the OpenGL ES 2.0 specification, and GPU acceleration was provided as well. So, like I said, support is provided for both hardware acceleration and software rendering. At the system library level there is one wrapper and two libraries. Whenever you write code that uses OpenGL functions, and you have specified that hardware acceleration should be turned on, the call first goes to the wrapper, and the wrapper checks whether hardware acceleration is on or off. If it is on, the vendor-implemented library with hardware acceleration capability is called; if software rendering has to be used, the existing Android software library is called. The libraries are located under names beginning with libGL, and there are several of them: the name of the library is different for the hardware-supported one and for the existing Android software one. The next thing is the media framework. In an Android system we can play videos and audio, and we can use the camera also. Android uses PacketVideo's OpenCORE framework, an open-source framework, for this. The audio formats with built-in support in Android are MP3, AAC, and AMR; the video formats are MPEG-4 and H.264; and support for JPEG and PNG images is also there. It also has support for hardware and software codec plugins: if the processor in the mobile or tablet has built-in support for a hardware codec, the Android system will automatically use that codec to speed up the overall rendering. So this is the media framework overview; the three different parts are audio, video, and the camera.
So the first part is the camera; the second is media recording, where we can use the camera to capture still images or record video; and the third is the media player. Those sit at the top, in the Java application layer. The next layer is the Java framework layer: for the camera it uses android.hardware.Camera and the Surface class. In the native part, the camera connects to the camera service. Actually, one arrow is not shown here, which would be this arrow: the camera service connects to the camera hardware, where the driver is provided for capturing the image or the video. The second flow goes through the Surface and SurfaceFlinger, or libui, to the frame buffer. The frame buffer is nothing but the screen of the mobile or tablet, so while an image is being captured it is also shown on the screen: one path captures the image or records the video, and at the same time the captured image is shown on the surface through the Surface and SurfaceFlinger. That was the camera; the media recorder likewise uses the OpenCORE framework, which is the PacketVideo framework, and the same hardware codecs are available to it for recording the video. For the player part, the media file can be audio or video. When you tap the media player to play a file, android.media.MediaPlayer is invoked, which calls the media player, which in turn calls the media player service. There are three players: PVPlayer, the MIDI player, and the Vorbis player.
PVPlayer is responsible for the video files and audio files; the MIDI player handles MIDI files, so support is provided for those as well; and the Vorbis player is responsible for the Ogg files, which can carry both audio and video. In this part, AudioFlinger is responsible for the audio; I will cover it in the next few slides. Second, about SurfaceFlinger: whatever you see on the tablet, everything that reaches the touchscreen display, goes through the SurfaceFlinger library, which in turn drives the display subsystem. It can combine 2D and 3D surfaces from multiple applications onto the frame buffer device. Suppose one application is running and, at the same time, a pop-up appears on the screen: at the back there is the application's surface and on top of it the pop-up's surface. Both surfaces are managed by SurfaceFlinger; they are overlaid on one another, composed, and displayed on the screen. For the surfaces to reach the frame buffer, the IPC mechanism, the Binder, is used: whole surfaces are passed as buffers to be displayed on the screen, and hardware acceleration can also be used for the composition. The last part is AudioFlinger. Audio can be heard from the earpiece or headphones, from the speakers, and over Bluetooth. Suppose multiple apps are running at the same time: an app can connect to the media player or to the tone audio, or if the application is a game it can connect to the game audio. Multiple inputs are passed to AudioFlinger, and the output path is selected accordingly: if you have plugged in earphones, AudioFlinger sends its output to the earphones; if the normal speakers are active, it is sent to the speakers.
So AudioFlinger manages the audio output devices, and multiple paths are possible on both the input and the output sides. These are the references, mainly the "Anatomy and Physiology of an Android" talk and material on the different architectures and the core libraries.