Welcome back, everybody. Here is Camilo from across the pond, and he is going to tell us something about his big project.

Hi, and welcome. Today I'm going to introduce you to Babe and talk a little bit about the approach I'm taking with it, and you will see. Before I start, I want to introduce myself. My name is Camilo Higuita and I'm from Medellín, across the pond, in Colombia, South America. I recently graduated from an arts program and also recently started studying computer engineering, and this is my first Akademy. I'm very passionate about software, contemporary art and graphic design. I've also been using GNU/Linux since I was in high school, and I'm really happy now to give back to the community.

So Babe is a music player, and I'm going to talk a little bit about why I started making it. Because you will say there are a lot of music players already, so why would anyone make another one? The first reason I started making Babe was that I didn't like the music players that were out there; they didn't appeal to me. Another reason is that I usually don't have a lot of local files; I just like to discover new music. So I'm constantly listening to new music, downloading it or streaming it. At the beginning I just wanted to keep Babe simple, because, as I said, I didn't find any other music player that was what I was looking for; I wanted something as simple as possible. Then I realized that Babe would have to manage that music and also, in some way, let me manage the discoveries, so I could keep on discovering new music. So those became the most important ideas driving Babe: to discover new music, and to have a suitable design that allowed me to manage those discoveries.
So right now Babe is a regular music collection player, like Amarok. But I'm trying to incorporate a contextual way of managing the music collection, aimed at users who, like me, collect music information and favorite music from online free streaming services. With this in mind, I started moving Babe from GTK3 to Qt. I wanted something simple that would play my favorite music. Everyone was saying, "I don't know about that music player; why do people keep making music players, why don't they just collaborate?" My answer was that it was partly because I was learning, and yes, it was simple, but it allowed me to work on a project I felt passionate about.

There were another two very important reasons why I wanted to make Babe: I want the music player to be semantic and contextually aware. Most music players are very static: they let you manage your collection, and they might do it very well, but those programs, as I will explain later, to me lack some sort of intelligence, because while the application is aware of the content, the content is not aware of itself, beyond the simple metadata it might have. They don't really know, for example, what the songs are about, so it's very hard for them to make suggestions, or for you to make searches beyond the simple metadata. Most music players are static collection players, and too many times they are too generic; they lack integration with the system. If you use one made for GTK or GNOME inside Plasma, it looks out of place. So I wanted to make a music player for KDE specifically, using KDE technologies, that would take advantage of them.
So, when the content is not aware of itself, or the application is not aware of the content, then in a music player you cannot make queries like "hip-hop music" or "artists that are similar to this one". When the application is aware of the content, and the content is aware of itself and of its context, then the content knows about its surroundings; for example, one of the files knows about the other music.

At the moment, Babe lets you discover music. This was the strategy I was using: I was discovering music via YouTube, so I created a little browser extension that connects to Babe and can add the music I like to the collection; that way I was able to keep discovering music. Right now, in its current state, Babe lets you browse your music like a regular music collection player, by album or by artist. You can also filter the music, but in a regular way: filter by artist, by year, by location. The idea is to go beyond this style of queries and results, not only locally, but also on the internet and the streaming services. This is the current design of Babe. You can also mark your songs with something I call moods, which in the future will be assigned automatically, by parsing the music lyrics or the sound waves.

Now I'm going to show you a screenshot from Babe, and I'll talk a little bit about its future. Right now, it uses the internet to learn about the music. Here you see that, like most music players, you get the lyrics, the album art and the artist information. We also get similar artists and tags that are related to the song. These tags work as links inside the library. So, for example, if you are listening to a song and you click on a tag, it will search for related tracks in the library and then generate suggestions. This is an example.
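The mood-tagging and filtering he describes could be sketched roughly as follows. This is a minimal illustration in Python, not Babe's actual code (which is Qt/C++), and all names here are invented for the example:

```python
# Hypothetical sketch of the "moods" idea: tracks carry user-assigned
# mood tags, and the library can be filtered by mood as well as by the
# usual fields (artist, year).

from dataclasses import dataclass, field

@dataclass
class Track:
    title: str
    artist: str
    year: int
    moods: set = field(default_factory=set)  # user-assigned mood tags

def filter_tracks(library, artist=None, year=None, mood=None):
    """Return tracks matching every given criterion (None = ignore)."""
    results = []
    for t in library:
        if artist is not None and t.artist != artist:
            continue
        if year is not None and t.year != year:
            continue
        if mood is not None and mood not in t.moods:
            continue
        results.append(t)
    return results

library = [
    Track("Song A", "Artist 1", 1975, {"calm"}),
    Track("Song B", "Artist 2", 1992, {"energetic", "calm"}),
    Track("Song C", "Artist 1", 1992, {"energetic"}),
]

print([t.title for t in filter_tracks(library, mood="calm")])
# → ['Song A', 'Song B']
print([t.title for t in filter_tracks(library, artist="Artist 1", year=1992)])
# → ['Song C']
```

In the talk's vision, the `moods` field would eventually be filled automatically by analysis rather than by hand.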
I call these "probable playlists": you start from one point and then keep exploring similar music; the suggested tracks are most likely similar to the song, so you can get lucky. This is the albums view, and this is the artist view.

So, the problem I found with most of the music players I was using is what is called the semantic gap, because those music players rely only on the metadata of the files. That adds a level of restriction, because most of the attributes in that metadata, far from being contextual and relational, are just descriptive, structural and administrative: they have information about the file itself, like rights, compression and encoding information, but they lack relational and contextual information.

So what do I mean by that? Relational because it aims to find relationships within your music collection: how does music about love from one decade relate to music about love from another decade? Those relationships could then be used to make suggestions. And contextual because it aims to get extra information about your music, for example, what is the meaning of the lyrics, what is the song talking about. It will also let you expand the static metadata information and cross it with online information. So the goal is to let the content learn from the internet. This is done with MDA, which stands for Music... sorry, Multimedia... okay, I forgot it. But this can also be applied to music, movies, videos, books, images and text documents. For example, for images you can extract information like where the photo was taken, and then that information can be crossed with information found on the internet; that way the content will be more aware. Another example is documents: it can parse a document and understand what the document is about.
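The relational idea above can be made concrete with a small sketch: once tracks carry topic information (hard-coded here; in the talk's vision it would come from lyrics analysis or online services), the player can relate, say, music about love from one decade to music about love from another. All titles and field names here are illustrative assumptions:

```python
# Sketch of relating tracks by shared topic, grouped by decade.

tracks = [
    {"title": "Seventies Love Song", "year": 1974, "topics": {"love"}},
    {"title": "Nineties Love Song",  "year": 1996, "topics": {"love"}},
    {"title": "Car Anthem",          "year": 1995, "topics": {"cars"}},
]

def decade(year):
    return (year // 10) * 10

def related_by_topic(track, library):
    """Titles of tracks sharing a topic with `track`, grouped by decade."""
    groups = {}
    for other in library:
        if other is track:
            continue
        if track["topics"] & other["topics"]:  # any shared topic
            groups.setdefault(decade(other["year"]), []).append(other["title"])
    return groups

print(related_by_topic(tracks[0], tracks))
# → {1990: ['Nineties Love Song']}
```

A real implementation would draw the `topics` sets from the kind of lyrics parsing and online enrichment described later in the talk.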
Then, when you are making searches, they won't have to be specifically by the name or the location of the document; instead, if you have a document about some subject and you enter a search related to that subject, it will appear. I think this will let you rediscover your own content, have smart local content, and have the content be aware of itself, by letting it learn from the internet.

Some of the techniques I've been using to do this are music information retrieval with data mining, crawlers, automata, and natural language processing. The first step I've been taking is to create an ontology that describes the lyrics. There are music ontologies on the internet, and I was making use of the Music Ontology, but the class that describes the lyrics was very limited; well, it exists, but it was just a stub. So what I'm doing to describe the music is to split the lyrics into a structure of intro, chorus and verse, and then, by using natural language processing, I tag that structure, saying, for example, this part of the song is about a car, or a girl, along with its frequency. This way each section has a meaning: it has text and it has a frequency, and there are relations between the sections, for example, the chorus is about love, and the bridge is about cars. So: what is the song about?
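A very naive sketch of that tagging step: split a song into named sections, then assign each section a topic by counting keyword hits, keeping the hit count as the frequency. Real natural language processing would be far more involved; the keyword lists and lyrics here are invented for illustration:

```python
# Tag each lyrics section with its best-matching topic and a frequency.

from collections import Counter

TOPIC_KEYWORDS = {
    "love": {"love", "heart", "kiss"},
    "cars": {"car", "road", "drive"},
}

def tag_section(text):
    """Return (topic, frequency) for the best-matching topic, or None."""
    words = text.lower().split()
    counts = Counter()
    for topic, keywords in TOPIC_KEYWORDS.items():
        counts[topic] = sum(1 for w in words if w in keywords)
    topic, freq = counts.most_common(1)[0]
    return (topic, freq) if freq > 0 else None

sections = {
    "intro":  "my heart beats for your love",
    "chorus": "love love love is all we need",
    "bridge": "take the car down the open road and drive",
}

tagged = {name: tag_section(text) for name, text in sections.items()}
print(tagged)
# → {'intro': ('love', 2), 'chorus': ('love', 3), 'bridge': ('cars', 3)}
```

The resulting per-section (topic, frequency) pairs are exactly the kind of structure that, per the talk, gets serialized so it can answer queries later.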
I'm serializing this to an XML file that can later be parsed to answer queries. With all of this I'm making a new project that will expand this approach from music to other kinds of files, like documents, images and videos. The project is called Pulpo; there's the URL where you can check it out. To me it's kind of like an agent: you give it some content and it will give you results similar to that track or that image.

If you have any questions, you can ask them, because this is my first time doing presentations.

Some of the challenges I'm facing right now with the project are: how to define frequency, and where its threshold lies; how to collect information from the user; and whether the parsing and analyzing is going to be local or not. Another question is whether to make annotations on the file itself or just use a database with all the extra information I get from parsing the file, and what amount of metadata that would be. The thing is, if I want to store the information I'm able to extract from a file, a document or a piece of music, is that information going to live only in a database the application uses, or can it also somehow be written into the file itself? And that's the same problem with the distribution of the generated information.

That's it. You can try the application right now; you can find the project through kde.org, and there are packages available. Hopefully we'll soon be able to generate suggestions not only for music but also for images. It's pretty stable right now and the code is available.

[Audience] I've seen many overlaps with what MetaBrainz is doing with several projects, as could be seen in the keynote. Have you collaborated with them, or are you going to talk to them?

[Camilo] Can you repeat the name of the project?
[Audience] MetaBrainz, the people behind MusicBrainz.

[Camilo] Yes, yes. No, I haven't talked to them. I've been using their API to extract information, but I haven't talked to them. That would be very interesting, for sure.

[Audience] Robert is here, so if you want an introduction to Rob, lots of us can do that.

[Audience] Can you show it? Do you have the application running?

[Camilo] Yes. This is the main interface. The music here is what I collected with the tool I'm also going to show you. So I collected music; it tries, and in this case it fails, to figure out what the title of the track is and who the artist is. The music I collect gets added to the local collection. As I said, we are a little short on time, but it goes through music services to look for information, and here you can find related tracks. The idea is to support custom queries in the search bar; right now it's just regular music searches. These are tracks I collected with the extension: I listen to a song, I discover a new artist, and, using a branch of the repository, instead of downloading from YouTube it extracts the track information from YouTube, lists it here, and you can collect it from the interface itself; that way it will be generating suggestions. So, this is what it is right now; there is a lot more work to do, and it will get more precise. This is what I have so far.
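Since the MusicBrainz API comes up in the answer above, here is a minimal, offline sketch of building a query URL for its web service (version 2). No request is actually sent, error handling is omitted, and the helper name is invented for illustration:

```python
# Build a MusicBrainz ws/2 artist-search URL (offline; nothing is fetched).

from urllib.parse import urlencode

MB_ROOT = "https://musicbrainz.org/ws/2"

def artist_search_url(name, limit=5):
    """URL that would search MusicBrainz for artists matching `name`."""
    params = urlencode({"query": name, "limit": limit, "fmt": "json"})
    return f"{MB_ROOT}/artist/?{params}"

print(artist_search_url("Radiohead"))
# → https://musicbrainz.org/ws/2/artist/?query=Radiohead&limit=5&fmt=json
```

Note that an actual request to the service should also carry a descriptive User-Agent header, as MusicBrainz asks clients to identify themselves.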