Hi, my name is Kempf. I've been working on VLC for a long time, and I'm going to talk about what we did to put VLC on mobile OSes. This is not the usual talk I give, so I'm sorry if there are mistakes in it; this is one of the main focuses of my development these days, so a lot of it is not completely settled. Don't hesitate to interrupt me and ask questions, because that would mean you're not sleeping, and that would be lovely.

So, just to remind you: I'm a French engineer, I'm 31 years old, I've been working on VLC for 8 years now, and I spend most of my time on VLC and related multimedia projects. Everything I do is open source, mostly. I'm the president of the VideoLAN non-profit organization, a French non-profit that is mainly here to help develop multimedia projects, including VLC, x264 and other libraries that I use a lot.

I'm not going to speak too much about how VLC was created, because that's probably the talk I'm giving on Thursday. You just have to know that the reason VLC exists is that the students on the campus of the École Centrale Paris could not play Doom over a token ring network, so they wanted a new network, and that's why you got VLC. VLC is a thing that no one knows but everyone uses: many people don't know VLC, they know the cone player that plays everything. The story about the cone is also very funny, but I can't explain it now; ask me later if you want.

VLC is known for two things: it plays everything, from completely stupid codecs, MIDI and stuff that no one plays, and it plays it everywhere. We are very proud of our latest port, to OS/2, done last year, and the three users are very happy too. We work on some weird OSes like BSD. The reason VLC was very portable is that at the time the project started, one of the main ideas was to be cross-platform, and by cross-platform we meant BeOS and Linux.
Windows and OS X were ports that were added later. VLC is big; big means around 900,000 downloads per day on our servers. That's outside Linux distributions, outside download.com and all those scammy websites, and outside SourceForge and so on. This slide is a bit outdated, I'm sorry, but since we started counting we've counted more than two billion downloads. What we do is open source in a very bizarre way: we don't have 400 million dollars per year; we are funded by donations, and we don't employ anyone with this money.

I'm going to focus now on the mobile ports, so this talk is mostly about the technical questions. For the usual non-technical questions you should refer to my FOSDEM talk, which I'm giving in two weeks, so please come to Brussels, it's not far. Here I'm just going to say a few things. I'm sorry, my mic seems to be going on and off.

VLC does not exist; there is no spoon. VLC is a very small wrapper around libvlc, about 200 lines of code, and libvlc is a stable API above the core, named libvlccore. As you see, we have lots of great ideas about how to name things, but we are not the only ones. libvlccore is basically the core of VLC; it's a small piece of code, around 80,000 lines, maybe 100,000 with comments, that does networking, threads and a few abstractions between OSes, but doesn't do anything by itself. Everything that is a codec, a video output, an audio output is done in a module. This is a very standard design; you see it in GStreamer, QuickTime, DirectShow, Media Foundation. It is a bit different from what you've seen with MPlayer, for example, which is just a monolithic thing. So all the modules are loaded at runtime and the pipeline graph is created at runtime. The good thing about that is that you can take the best on each platform.
For example, when you build a cross-platform media solution, for video you could say: well, okay, I'm going to do the output in OpenGL, because OpenGL works everywhere. Right? No, it doesn't. Every time you try to do OpenGL output on Windows, it's going to be painful. Same for audio outputs, or even codecs; it's always the same. So the thing that was decided, back when I was not even around and didn't know what a computer was, was to have those modules, of different types like decoders and video outputs, loaded at runtime.

So what do you need to port VLC to a new platform? You need a compiler. It seems obvious, but we are porting to a platform, whose name is Windows Phone not to name it, that doesn't have a compiler; I mean, doesn't have a working compiler, of course. You need a libc, a basic one: malloc, free, a few string functions; basically, handling files and Unicode strings would be enough. We need threads, and in VLC we also use thread cancellation points, which caused issues when porting to mobile. We need network, and by network we mean the usual BSD sockets; it seems obvious, but it isn't there on all platforms. And we need a way to load modules at runtime, which we also can't do on all platforms. Those are the needs of the core of VLC. Then we need an audio output to output sound, a video output to output video, and usually a UI, but the UI is boring so we don't care, and I won't cover UI stuff. On the mobile ports we also need the codec APIs.

I'm going to start with the worst, which is iOS. iOS is a very bad OS because it's a very closed OS; however, technically it's the best mobile OS we have around. Why? Basically, it's exactly the same OS as OS X, and when I say exactly the same, it means the exact same bugs.
The pthread and signal implementation on OS X is broken, and it's broken in the same way on iOS, but we've been working on OS X for 10 years, working around those bugs, so we know them. The biggest limitation is that everything needs to be statically linked: you don't ship .so files, you ship .a files on iOS, through the App Store but also on jailbroken systems. So one of the issues was that we had to compile everything into one big libvlc bundle, a .a that contains libvlc, libvlccore, all the modules and all the libraries together. That makes a .a of 14 megabytes, and we have to support multiple architectures, ARMv6, ARMv7, ARMv7s, ARMv8, which makes a big blob, but that's the way you do it on iOS. And iOS doesn't support multicast; I don't know why, we filed bugs, it's broken, and it works fine on OS X. There are a few more limitations: you use AudioUnit on iOS, not the output we use on OS X, because that one doesn't work, and AudioUnit on iOS is limited to stereo, I don't know why. And the shaders of the OpenGL output are completely buggy on iOS. Apart from that it was quite easy, which is the reason it was done first. VLC for iOS is 100% open source.

So now, Android. We support Android from 2.1, which means API level 7. It's a full video player; we play everything that we play on the desktop. The biggest difference is that it has a media library, with SQLite, and it's also a great audio player, and we can't say that about VLC on the desktop. Oops, sorry, my slides have a small issue.

Android is technically very bad. We went from iOS, where everything was quite correct; Android is bad, and it's a mess, no one understands anything. The NDK is completely limited, and all the manufacturers are hacking stuff everywhere, so it's very difficult. When people complain about fragmentation on Android, they're speaking about different screen sizes; that's just a joke for us, because every GPU, every DSP has a different bug, and every
manufacturer is modifying the headers that are supposedly public, so the structures are all slightly different, which means API breakage, API breakage all the time. And a lot of Chinese manufacturers solve their issues by adding extra cruft on top of StageFright, the middle-layer API, saying: well, it works now.

The first issue is that Android, compared to iOS, supports shared objects. Great, yes, but it's limited to 20 shared objects per process. Why? I don't know. On Samsung devices since 3.0 it goes up to 40 shared objects per process, but on HTC until 4.3 you're limited to 20. It's not documented anywhere; you just hit the issue. So we went the exact same route as iOS: we statically linked everything into one big shared object, libvlcjni.so, that contains everything. The resulting file is a shared object, but it's completely ugly, and you load everything at once.

It was too simple to take a working libc, so they had to rewrite one, of course, and of course it's completely broken. pthread was broken: the read-write locks were not implemented, and what existed was of course buggy. The solution was to take parts of the Android open-source libc, the version 11 one, and statically link them into libvlcjni.so. Every printf with wide chars and Unicode was also broken, so we did the same: took the code and linked it statically. They also decided, of course, that off_t was going to be 32 bits, on an OS built in the 21st century. I mean, wow; even Windows fixed that back in 1998. Incredible. Multicast is so broken that we disabled it. And those are just the ones off the top of my head while I was flying here; there are so many others. I don't understand why they didn't take a BSD libc. But that was the simple part, because we just care about actually outputting audio and
video. This is just a very early diagram; there are so many layers on Android, it's insane. Just for audio there is the AudioTrack Java API, native AudioTrack, AudioFlinger, different versions with a binder, MediaPlayer, there is AwesomePlayer, there is StageFright, and you need to know what is what. And of course nothing is supposed to be public except the Java APIs, while you're basically a C player, a multimedia framework you want to write in C. So you try many places and you hope that it's going to work.

For example, for audio, the only official API is the AudioTrack Java API. It's good, it's working; the problem is that it's quite slow and the delay computation function is broken. What is delay computation? It's mostly knowing how many buffers you've already sent to the audio output, in order to have lip sync; after about 40 milliseconds of error you see the lip-sync issues. This is the path we know works, but basically you're getting audio from the C side, memcpying it to Java, and then it's sent back down through the layers to the drivers. That's an awesome design: already, I don't know, five or six memcpys, and as people say, memcpys are murder in multimedia. So that's bad.

Well, no problem, let's go directly to native AudioTrack. AudioTrack is native, we can do that. That means pulling in headers that are, well, open but not public, and it doesn't have any delay computation function at all, so you're on your own. So we tried things, and probably what we're going to do is keep the native path but call into the Java side in order to get the delay. The funny part with native AudioTrack is that, starting in Android 4.2, they decided to inline most of the functions in the headers. That should be fine, except that a lot of manufacturers started to add new fields inside the structures in those same headers. The result is that it's very difficult to know if
you're going to be able to dlopen it and find the right functions since 4.2.

Well, then you complain to Google, we do that a lot, and they said: okay, we have a solution, we're going to implement new stuff for you, OpenSL ES on Android. Wow, it's open, it's supposed to be great. Of course, OpenSL ES on Android is implemented on top of AudioTrack, and of course it's not used by any Google application. Can you imagine how buggy it is? Yes, it is. It works fine if you never, ever resample: if you output 48,000 Hz it's going to work, but outputting 44.1 kHz is going to be broken, even on the Nexus devices. On the Nexus 5 it works on 4.2 but not on 4.3; it works on 4.4 but not on 5.0. And this is the official audio output we have in VLC for Android, because it's the only public API you have in C.

Great. Well, that's not a problem, we're going to go deeper: AudioFlinger. Yes, but then we have the issue that AudioFlinger is too low-level, and many hardware people fix their bugs above it, in AudioTrack, so that basically the YouTube app works. If the YouTube app works, that's great, because it used StageFright through the old MediaPlayer API. But starting from 4.2 and 4.3, for video they added a new API called MediaCodec, and new applications, like the new YouTube, are using this new stack. This is the reason you see so many devices stuck at Android 4.2: with the step from 4.2 to 4.3 you get the new YouTube app, and then it's broken. So AudioFlinger works great, but it's not portable.

Video? Well, video is the same. Before Android 2.1 it's almost impossible, except memcpying every frame in RGB, so it's not good for video, right? On Android 2.1 and 2.2 you don't have any way to output video correctly in C, so what you can do is dlopen SurfaceFlinger. SurfaceFlinger
is, of course, a private API, and you need headers that are from Android 2.3. You can only output RGB; anything else is not working. I mean, we are in the 21st century, we do video, all the chips have supported YUV since the mid-90s; why is there no way to output YUV? So it's pure RGB for everyone, and there is no way to access the scaler, so you do that on the CPU and you kill your performance.

Starting from Android 2.3 we finally have a standard API, ANativeWindow. It's great, it works fine, but you have access to only one buffer, which is a bit slow: while you're filling your buffer for display, you're blocking the whole chain. If you want more than one surface, you can use the private ANativeWindow API, and once again those are private headers that change. So what we do is that, when we compile the VLC library, we compile the native-window part as a shared object against the version 11, 14, 18 and 21 headers, and at runtime we try them and see which one actually loads; our video output then uses that library. That's what we do for the main release, VLC for Android 1.0. With that you get up to 16 or 32 buffers, which is great because then you can do full direct rendering using libvlc. You can even have YUV surfaces, and you can even, with a kind of hack, know whether your surface supports YUV or RGB. Of course it's NV21 and not plain YUV 4:2:0, but it's pretty cool. You can also manage opaque buffers, which is good when you do full hardware decoding. So in the end it should be good, but you need to do quite a few hacks.

We tried the OpenGL ES way; we don't recommend it to anyone. It's a mess, EGL is not working, and it also conflicts with the Java side: if you have to display any widget above it, for example, you're going to have a lot of pain. Don't do that.
Codecs, codecs, codecs. There is OpenMAX, I'm speaking of the IL one, of course, but you cannot access OpenMAX directly; you need to go through the binder, like everything on Android. So we have this thing called IOMX, which is OpenMAX IL over IPC. It works since 2.3. Of course you cannot do full GPU zero-copy, because it's very difficult to allocate the buffers; well, technically you can allocate the buffers, but you can't display them, because you need an extra set of private APIs. Or you could do it only in RGB, but most of the OpenMAX decoders don't output RGB. Starting from Android 3.0, which if I remember correctly was not open source until 3.2, you can use the same mechanism but you can access gralloc, which allows you to allocate those buffers correctly and to display them. It's a hack, because IOMX plus gralloc is completely non-public, but it mostly works. It works on the NVIDIA tablets, which were the important ones, the famous Tegra 2, amazing chips with no NEON, yes.

Starting in 4.1, after complaining a lot, we received MediaCodec. MediaCodec is finally a new API from Android that gives you access to hardware decoders. Great, great, great. They just said: okay, you need to output buffers in YUV 420. But they did not explain the layout, and there was no test case in the Android test suite to check the layout. What happened? Of course, all of them were broken, because everyone was using a different pixel format layout. We spent a bit of time with Martin from libav to finally get a test case merged, so starting in 4.3 we are sure that the output buffers are not only YUV 420 but also the same layout. So starting from 4.3 you can actually use MediaCodec, with software and hardware rendering, and it works fine, but it's still in Java. Starting with 5.0 there is a version you can use directly from C, except the C version is still broken in 5.0, so you
still need to use the Java one; it's going to be fixed in a later Android release.

So, we started porting VLC to Android two years ago. That was the first UI. We released it on Google Play, only for ARMv7 at the beginning. It is GPLv3; well, technically the code is GPLv2-or-later, but on Android it's mostly impossible to have GPLv2 code, because you need to link against code that is Apache 2, which makes it GPLv3. We made a lot of iterations to make it less shitty and less horrible. That was the first version that was actually usable, but it was black. Then we redesigned it to be a bit more Holo-like, and we finished the Holo design not too long ago, of course right when they started to give us the new design guidelines. And that's it for Android.

I'm just going to give you a few laughs by speaking about the WinRT port. For the WinRT port, we don't have a compiler. We don't have a libc, because you can't access files; you can't have UTF-8; you don't have a C99 compiler; you don't have thread cancellation; you don't have network, you don't have sockets. Yes, yes, there are only the new WinRT sockets, which are asynchronous sockets. For the thread cancellation points, we wake up every thread every 50 milliseconds. Amazing. We are able to dlopen the libraries that we ship, so contrary to iOS and Android we actually ship DLLs, but it doesn't work on Windows 8.0, only on 8.1, because it's buggy after 25 loaded libraries. For the audio output, well, you can't use WASAPI correctly, because if you do you don't have background audio, so that's problematic. For the video output, you can't have any YUV surface, so we wrote a special Direct2D filter, which of course does not work on all the Windows Phones. And of course everything that you do for the UI must be in C#, and in order to talk from the C# level to the C level you need a special language called C++/CX.
But we did it. Thanks. If you have questions, I'm available, and I think I already took a bit too much time, so I can also take questions afterwards; I'll give a talk on Thursday and one on the first day too. We can take quick ones. Any quick questions?

Q: With a lot of platforms to program for, how do you solve the problem of testing for all these platforms, and even all the devices and software versions?

A: The good part is that most of the code is shared, because most of the code is libvlc, and then we reflect it into the different languages to do the UI. So most of the code is shared, and only very limited parts exist for the video outputs, the audio outputs and so on. How do we test? I'm sorry to say that, of course we do what we can, but most of our testing is what I call crowd-sourced testing: we send a version, we wait for the bugs, we fix the bugs, we release. It's very bad, but there is no other solution, especially for Android. For the testing of the lower levels, I advise you to ask Diego, who is there, how they actually do the testing for the codecs, because that's one of the most useful cases for VLC.

Q: With the number of Android versions you're supporting, what are your plans for deprecating old versions and supporting new ones?

A: None. So now, version 1.0 was tagged, and we're working a lot on 1.1, which is the first version with the Material design and is almost released, and we spend our time checking that it still works since Android 2.1. The reason is that we try to do a software project that is cool and technically interesting, and we don't deprecate stuff just because we need to deprecate. Most of the Java compatibility libraries shipped by Android for Material design start at v7, which is Android 2.1, so we're doing our best to keep compatibility. I don't see us dropping it; we might drop 2.1 and 2.2 at some point because of the video output, but
I don't think we will ever drop 2.3.

Excellent. Well, thank you very much for that.