Good morning, everybody. Hello to everyone. We are here to talk about Android video streaming. It's not a very good commercial name, but let's think about it: Android, you know, the operating system for mobile phones developed by Google and the community, plus video streaming. We thought that maybe everyone with a mobile in their pocket could be a journalist: on each corner of every street, they could send a video to anyone who wants to see it. That's a good idea, and maybe that's the future. So we are talking about Android devices as video prosumers: we want people to send what they see, what they are looking at, to the whole world.

On this slide I could have put an index, but this is a good example of not having one. I'm going to move from the past to the future of this project. It all started with a very small development team: just one software developer, and me, a telecommunications engineer. We thought about this requirement: people could send what they see to the internet, to their friends, and who knows to whom. I work at HI-Iberia, a company from Spain. We developed this product because a client paid for it, so we had enough time to develop it; that's why we got into this. We are a company working with ICT companies, banking, public administration and public bodies, and in R&D: the Seventh Framework Programme, national programmes, regional programmes and so on. The company is about 20 years old; it has done technology consultancy, maintenance and quality development, and nowadays we are working hard on open source products and on research and development, as I said.

What is the project's origin? We were working on a public project of the Madrid City Council in the arts area.
They were thinking about involving the visitors: not just showing visitors what they have made and what they have thought, they needed feedback from the visitors. The cultural institution is Matadero de Madrid — I don't know if you know Madrid, but if you go there, they need interaction with the visitors. So they thought they needed three pillars. They need wireless infrastructure, because the people who go there need to connect to the world. They need video streaming, because they think that the things visitors see or think could be interesting to anyone in the world. And we developed a semantic search product for their archive: they have events, they have presentations, they have all kinds of video material, and they want people to find things quite easily. The project lasted eight months, and it was completely open source.

About the video streaming part, the idea is simple. Imagine we are looking at the new car we want to buy and we want to send it to a friend. So we take our mobile — mine is a Samsung Galaxy S; I don't have stock options in that company, but it's a good mobile phone, you could buy it. Then you capture, you record the video on your mobile phone, but you also send it to a transcoding server. We use the well-known transcoding application FFmpeg, and we use the free codecs and container — Theora, Vorbis and Ogg, all the things we've got from Xiph.org. So we transcode the stream and we send it to a streaming server: we use Icecast, and we use oggfwd to upload the content to the streaming server. And once it's over there, we can consume it on a laptop using Firefox with HTML5, VLC, MPlayer or whatever we want. That's the simple idea of the architecture.

Which technologies are we using? Well, we have tested it on Android devices: 1.6, 2.1, 2.2.
We use TCP as the transport protocol, because we are doing buffering anyway, so the retransmission behaviour TCP has is no problem for us, and it's quite easy for us to send from the mobile to the transcoding server over TCP. We thought about RTP and RTSP, but we preferred TCP.

We record the signal with the video and the audio in different streams — I'll tell you later why. The video goes in an MP4 container and the audio is Pulse Code Modulation (PCM), in different streams, and we convert them with these codecs and container: Ogg and Theora for the video and Vorbis for the audio. We use the transcoding application FFmpeg, and Icecast as the streaming server to which we upload the content. We use Java as the programming language — we love out-of-memory errors, that's why we use it — and, honestly, because we had a previous product we could reuse. We use shell scripting to control the processes and the pipes that connect FFmpeg with Icecast. And on the end-user side, we could use whatever we want: VLC, MPlayer, or HTML5 inside Firefox or another browser.

Some technical details. As I said, we've got two different recording threads, because if we record audio and video all together, we have problems sending the stream to the transcoding server: we don't have all the headers that the container format needs, we don't have timestamps, and we don't know which bytes are video and which bytes are audio. So we record them separately and we send them separately in two different threads, and on the server side we mix them. We use a control port, where we send XML data with the metadata information — the author, the title, or whatever the user wants to include — and the server responds with the listen ports. As I said, we send the video and the audio on different ports, and we use TCP to send the information.
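The control-port exchange described above can be sketched in Java. Everything concrete here is an assumption for illustration — the talk does not show the actual XML fields, reply format, or port numbers — but the shape is the same: the client sends one line of XML metadata, and the server answers with the two listen ports (video and audio) to stream to.

```java
import java.io.*;
import java.net.*;

// Sketch of the control-port handshake from the talk. The XML shapes,
// class names and port numbers are illustrative assumptions.
public class ControlHandshake {

    // Server side: read one line of XML metadata, answer with the two
    // listen ports (video and audio) the client should stream to.
    static void serveOnce(ServerSocket server, int videoPort, int audioPort) throws IOException {
        try (Socket s = server.accept();
             BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()));
             PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
            String metadata = in.readLine(); // e.g. <metadata author="..." title="..."/>
            System.out.println("server got: " + metadata);
            out.println("<ports video=\"" + videoPort + "\" audio=\"" + audioPort + "\"/>");
        }
    }

    // Client side: send the metadata, parse the reply, return {videoPort, audioPort}.
    static int[] requestPorts(String host, int controlPort, String metadataXml) throws IOException {
        try (Socket s = new Socket(host, controlPort);
             PrintWriter out = new PrintWriter(s.getOutputStream(), true);
             BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()))) {
            out.println(metadataXml);
            String reply = in.readLine(); // e.g. <ports video="5000" audio="5002"/>
            int video = Integer.parseInt(reply.replaceAll(".*video=\"(\\d+)\".*", "$1"));
            int audio = Integer.parseInt(reply.replaceAll(".*audio=\"(\\d+)\".*", "$1"));
            return new int[]{video, audio};
        }
    }

    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(0); // ephemeral port, just for the demo
        Thread t = new Thread(() -> {
            try { serveOnce(server, 5000, 5002); }
            catch (IOException e) { throw new UncheckedIOException(e); }
        });
        t.start();
        int[] ports = requestPorts("127.0.0.1", server.getLocalPort(),
                "<metadata author=\"visitor\" title=\"demo\"/>");
        t.join();
        server.close();
        System.out.println("video port: " + ports[0] + ", audio port: " + ports[1]);
    }
}
```

After this exchange the mobile would open two plain TCP connections, one per returned port, and push the raw MP4 video and PCM audio streams separately, as the talk describes.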
We use well-known transcoding technologies, so there's no problem there; they are well tested. The problem here is the audio delay. It's controlled, because we are using our own timestamps: we measure the delay introduced in the recording threads, and that's what we use on the transcoding server to mix the audio and the video. We use pipes to connect the applications we run — FFmpeg, oggfwd and so on. We have developed a version of cat called mycat, because we really needed to construct a stream with a constant rate of bytes; with plain cat we could get broken pipes and that kind of problem. And we have developed a little application called chop-pages, which we are trying to get into FFmpeg — we are talking with the developers. The only thing this application does is make the pages of our Ogg container a bit shorter, because they are quite big and we had problems with the Icecast server.

Here are some screenshots of the application. They are in Spanish because it's for a Spanish institution, but it's quite simple: if you have already worked with Android, it's quite easy to define all of this with XML. We have screens to define the server configuration, the metadata, and the camera view. It's quite simple. I thought about doing a live demo for you, but I didn't know if I would have Wi-Fi, and I don't have a 3G or roaming connection, so I beg you to trust me. I don't know if you say "In God we trust" like the Americans do, but well, please: in Fernando we trust. If you go to Madrid, maybe you could go to Matadero — I don't know if you have ever been there — and you can see it working over there. They use it when they have an event, or maybe the visitors can; they have everything set up.
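The mycat idea above — a cat replacement that keeps a steady stream of bytes flowing into the downstream pipe — can be sketched like this. The real tool was presumably C or shell; this Java version is an assumption-laden illustration of the technique (block size, flushing policy and names are mine), not the project's actual code.

```java
import java.io.*;

// Sketch of the "mycat" idea: copy input to output in fixed-size blocks,
// retrying short reads, so the downstream pipe (ffmpeg, oggfwd) sees a
// steady byte stream instead of bursts that can end in a broken pipe.
public class MyCat {

    static long copy(InputStream in, OutputStream out, int blockSize) throws IOException {
        byte[] block = new byte[blockSize];
        long total = 0;
        while (true) {
            int filled = 0;
            // Keep reading until the block is full or the input really ends,
            // so short reads from a slow network socket don't propagate.
            while (filled < blockSize) {
                int n = in.read(block, filled, blockSize - filled);
                if (n < 0) break;
                filled += n;
            }
            if (filled == 0) break;     // true end of stream
            out.write(block, 0, filled);
            out.flush();                // hand every block to the pipe immediately
            total += filled;
            if (filled < blockSize) break;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // In-memory demo standing in for stdin -> stdout between processes.
        byte[] data = "a steady stream of bytes".getBytes("US-ASCII");
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long n = copy(new ByteArrayInputStream(data), sink, 8);
        System.out.println("copied " + n + " bytes: " + sink.toString("US-ASCII"));
    }
}
```

In the real pipeline this would sit between the network socket and FFmpeg's stdin, smoothing out the bursty arrival of bytes from the mobile.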
They have the streaming server and the transcoding server running on their own machines, and you can send to them; they have posters saying you can send your video to this address, and you can use it.

Use cases. Beyond cultural institutions, we think that maybe everyone could be a journalist. On each corner of every street we could be recording what we are seeing, and maybe there is someone on the other side of the world who could be interested in that. For example, people in the streets of Madrid nowadays could send what they are seeing to the rest of the world. So everyone could be a journalist. Another use case: for example, if we want to see our children playing football at school, maybe we want to send the football match to our family or to anyone. That could be another good application. I don't have a photo of children playing football, so I used this one — that was the best moment of my life last year. Another good application could be... oh, sorry, I don't know what happened... adult content. That's not a joke: I was talking to an adult content provider, and he thought that if anyone in their house could send what they want to record to other people, that would be a very interesting thing. And I suppose you know — I think it's called Chatroulette — the webcam chat that changes partners in a random cycle. Maybe that could be another example of what we can do with Android video streaming.

And the future. Well, who knows the future, but this is under an open source licence, so maybe you could do things with this development. You can get support, and we are still deciding to which forge we will move the code, but of course it's completely open source — it's in the contract, and we know open source is the best way to develop. And if you want to contribute, or maybe if you...
have good ideas about this product or about the future of video streaming, you could contact me, or you could go to our website, give us your opinion and see what we have done. Okay, that's all, and we have a couple of minutes if you have questions.

[Question: Why do you need transcoding?] Because if you look at the Android API, we don't have the possibility to record using free codecs. We cannot record with Ogg, Vorbis and Theora, and we wanted to use them because we wanted to use a free streaming server, Icecast, so we had to use Ogg, Vorbis and Theora. We thought about putting the Ogg libraries inside the Android device, but it was much easier to send the stream as the mobile records it and transcode on the server. No more questions? Okay, thank you so much.