Okay, good afternoon. I hope everyone enjoyed their meals. Now I'd like to discuss the usage of modern peripherals in Python web development. I'll start by describing myself, then I'll talk about the WebSocket protocol, which is used in this project, and after that I'll go straight on to a few modern peripherals that have emerged in the last few years. I'll wrap up with links to where you can get the source code used in this presentation, and also how to reach me.

I come from Poznań in Poland, which is only a two-hour drive from Berlin, so it's close to civilization too. I've been learning Python only since the end of September last year, so I'm a newbie, but it was enough to get hired by an awesome company, STX Next. As a junior Python developer, I'm creating apps using Flask, Django and Pyramid. I'm also a junior mentor at PyLadies in Poznań, where I have the opportunity to teach Python and Flask to younger generations. Before that, I was a photojournalist, a war photojournalist, and head of local photography for one of the major publishing houses in Europe. You can see a few photos from when I was still working. I still do some photography, but mostly for pleasure. And this is my first long talk, and my first one about programming.

WebSocket. WebSocket is a protocol providing a full-duplex communication channel over a single TCP connection. Sorry, I'm using one of the devices I will be talking about to present, so it's not working as well as it should. I won't go into the details of the WebSocket protocol; I'll just briefly explain how it works. It starts with a handshake, which upgrades the connection from HTTP to the WebSocket protocol. After that, messages are exchanged from server to client, triggered by the server, depending on how the developer programmed the application. That's the part which interests us when communicating with the peripheral devices.
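To make the handshake a bit more concrete: the client sends a random `Sec-WebSocket-Key` header, and the server answers with a derived `Sec-WebSocket-Accept` value. Here is a minimal standard-library sketch of the server's side of that computation (not from the talk, added for illustration; the GUID and the example key/accept pair come from RFC 6455):

```python
import base64
import hashlib

# The GUID is fixed by RFC 6455; every WebSocket server uses this exact string.
WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"

def websocket_accept(client_key: str) -> str:
    """Compute the Sec-WebSocket-Accept header value for a handshake.

    The server takes the client's Sec-WebSocket-Key, appends the fixed
    GUID, and returns the base64-encoded SHA-1 digest. The client checks
    this value to confirm the server really speaks WebSocket.
    """
    digest = hashlib.sha1((client_key + WS_GUID).encode("ascii")).digest()
    return base64.b64encode(digest).decode("ascii")

# Example key taken from RFC 6455 itself:
print(websocket_accept("dGhlIHNhbXBsZSBub25jZQ=="))
# s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
```

In practice a library such as Flask-SocketIO performs this exchange for you; the sketch only shows what the "upgrade" actually consists of.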
And after that, there is a close message that ends the communication, from one side or the other.

There are a few ways to implement WebSockets in Python. The first is the built-in Python socket library, which is very nice and fast, but has one disadvantage: you have to create the JavaScript front-end part yourself, so I didn't use that. Instead, I was thinking about using the Socket.IO library, which has a Python port, but it would require me to run a second server just for WebSockets. So my solution was to use a Flask extension, the Flask-SocketIO library, which is an all-in-one solution and allows me to use just one server for all communication. Here is a pictured version of what we're doing: we have Flask, which handles the main stream of communication over HTTP, and also a second one for WebSockets. And here is our receiving part on the front-end for WebSockets. It's very simple: it just connects to the application part on the back-end and emits or receives signals from the back-end. One important thing: we'll be using version 0.9 of the Socket.IO client, because the current one is not supported by the Flask extension, so it may differ from what you'll see on the Internet right now. And this is the back-end code, the Python part. If you know Flask, you will easily use the Flask-SocketIO extension, because as you can see, there's a route for Flask and there's a handler on the socket connection. That's all you really have to worry about. For receiving and emitting you use the same kind of decorators, and then you just start the script.

The first device is the Eye Tribe, which is eye-tracking hardware, nowadays advertised as the cheapest solution on the market. But despite the self-explaining name, what is eye tracking? Eye tracking is the process of using sensors to estimate where a person is looking, the point of their gaze. The Eye Tribe uses an infrared sensor for it.
The hardware part is simple; the more complicated part is on the algorithm side of the project. Eye tracking can be used in a variety of applications, passive and active. I will talk more about the passive ones during the rest of the presentation, so I'll only mention some of the active ones. For example, device control: you can aim in games, control your mouse pointer, or even use it to log into your computer. Eye tracking isn't anything new, because you could get a photographic camera from the 80s and 90s which had an eye-controlled focusing system, like Canon had. And here is a short advertisement from the producer of the device, showing how they would like to see it used. They concentrate more on daily usage. You probably think this is only advertising and it can't work that well, but it's true. To prove that, on the next slide you will see what I think may happen in the future: an eye-following app written with this code. And it was working, as you see. Not so bad. It has some glitches, but it was still acceptable usage.

There are a few possible web applications. The first one is analysis: analyzing what the user sees and what he focuses on, to improve the layout of the page, the design of the page, and of course the ad placement on the page. That's already in common use nowadays. The second application is to control whole sites using eye tracking. It would allow even severely disabled people to use them. Everybody knows Stephen Hawking; he managed to use a computer daily, and probably more people will be able to do so if the devices get cheaper. And the third possible usage is ad control. That's what may happen once eye tracking is used on a daily basis: you will be attacked from everywhere by ads that just follow your eyes, on the site and in other places, even when you are driving a car. And when we stop using keyboards and mice, imagine that you will have to roll your eyes to close an ad, or blink twice. Okay.
You can download a simple Python wrapper for the Eye Tribe device, which is only one file that connects to the Eye Tribe daemon; the daemon has to be running while the device is in use. It's written very straightforwardly and provides all the necessary data. Here is the Python code that connects to the Eye Tribe daemon when the user connects to the WebSocket. After a start event, it sends data periodically. The Python solution wasn't too good for presenting a live stream of data, but it was much better for analyzing data. And this is the part for moving the text on the page. I didn't make a movie of it, because it was too glitchy to even show. The pure JS solution, without Python, was even simpler to use. It was not lagging too much, but it also required another daemon to run, so to fully run the site you would have to have at least three processes, one for each daemon, which is quite demanding on resources. Comparing the JavaScript solution and the Python solution: both are usable, but you have to choose what you want to do. If you want to acquire and analyze data to improve the website's look, you should use Python, because the rate of receiving data is much higher. If you want to create an eye-following app, you should use JavaScript. It may improve when the Socket.IO library with streaming capabilities is also ready for Python, but nowadays it's not, and I hope that will change. When I was thinking about this project, I came up with an idea for an application that would analyze how a user looks at a picture and rate his knowledge about art: by looking at where he focused, how he moved his eyes over the picture, or even how long he was looking at the picture, because we can measure all that easily now. And this is what the Eye Tribe looks like. It's really simple, small and portable, but it's hard to use with a notebook; you can only use it with a desktop PC, to be honest.
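The daemon connection described above can be sketched with only the standard library, since the Eye Tribe server speaks JSON over a local TCP socket. The port number and the exact message shapes here are assumptions recalled from the Eye Tribe API documentation, not taken from the talk, so verify them against the official docs before relying on them:

```python
import json
import socket

# The Eye Tribe daemon listens on localhost; port 6555 is the
# default as I recall it -- treat it as an assumption.
HOST, PORT = "localhost", 6555

def frame_request() -> bytes:
    """Build a request asking the tracker for a single gaze frame."""
    return json.dumps(
        {"category": "tracker", "request": "get", "values": ["frame"]}
    ).encode("utf-8")

def push_mode_request(enabled: bool = True) -> bytes:
    """Build a request asking the tracker to push frames periodically."""
    return json.dumps(
        {"category": "tracker", "request": "set", "values": {"push": enabled}}
    ).encode("utf-8")

def read_gaze(raw: bytes):
    """Pull the averaged gaze coordinates out of a frame reply."""
    reply = json.loads(raw.decode("utf-8"))
    avg = reply["values"]["frame"]["avg"]
    return avg["x"], avg["y"]

def stream_gaze(callback):
    """Connect to the daemon and hand each gaze point to callback.

    Requires the Eye Tribe daemon to actually be running, so it is
    not exercised here; the back-end would forward each point over
    the WebSocket.
    """
    with socket.create_connection((HOST, PORT)) as conn:
        conn.sendall(push_mode_request())
        for line in conn.makefile("rb"):
            callback(read_gaze(line))
```

Splitting the message building from the socket handling keeps the protocol part testable without the hardware.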
The second device is the Myo armband, which I'm using for this presentation, which is literally the thing it should excel at, because it's still advertised as the ultimate presentation tool. But maybe with some practice, and without moving your hands too much, you could use it better. Firstly, it's a new way to interact with computers. It's marketed with a focus on presentations because, as I mentioned, that's where it's supposed to excel. The armband reads your muscle activity using an electromyography sensor, measuring the electric impulses going through your muscles. It also has an accelerometer, to allow you to control software with the gestures and motions you make. And here is also an ad for the Myo armband. Some of it even works well. I didn't try the game part; that doesn't work well. Sorry, I crashed it two times. But controlling a movie or iTunes works well enough, though that's not a web app. I also didn't try to be a soldier. To be honest, the reality, as you can see during this presentation, isn't as colorful and cheerful as it is in the ad. Sure, you have seen scrolling with gestures, waving into and out of the site. The first part worked fine each time, but the second one sometimes sent the same event a few times, scrolling the whole page down. So there are still glitches on the Myo SDK side.

There are many possible web usages. As I said, page control: for example, you have a metal band's site, and imagine that to see the login screen you have to make the famous horns sign; only after you make it will you see the login screen. It can also be very useful in a harsh environment, or when you have to work on something remotely. For example, imagine a scientist working at a volcano: using a cable probably won't be possible because of the heat, but you can still make motions with your hands, and they don't have to be precise. And a usage I found that could be useful: it would be a fitting add-on to your application for when your client is working too hard and typing too many words.
He would have to make a pause, make a few gestures, and then he could continue the work he was doing. It could also be useful for programmers, to stop working and relax their hand muscles for a bit. The Python wrapper is still an active project with a very nice and helpful maintainer; you can ask him anything and he will help. It's much bigger than the Eye Tribe wrapper, because it sends and receives much more data, from the muscles and from the gestures, than just the location of your gaze and your eyes, which is all the Eye Tribe wrapper does. Here's the Python code, which is fairly similar to the Eye Tribe code. As you can see, on connection I start the daemon and send the current gesture: if it was a wave-in, it should send me a down event, and when I make a wave-out sign, it should send me an up event, which scrolls my page to the bottom or the top. The front-end socket part is also similar to the Eye Tribe part, and in the next example it will be a similar situation, because it's just for receiving data and starting the connection. I would like you to think about a Tinder-like application, where you can like or dislike people, and imagine using it with gestures: you dislike her, and you slap; or you like her, and you make a heart sign or a come-to-me gesture. And it's interesting, because you can even use two armbands, one for each hand. I think that would be rather more useful in games, where you could control your avatar with your hands; there are many movies of that on YouTube, and it works quite nicely. So that was the Myo armband.

And the third device is the Leap Motion, which is really small. I think it's the oldest of the three; many people have probably already heard something about it. It's also the most mature one. The Leap Motion brings gestures from phones, smartphones and tablets to your notebook or desktop by reading what you are doing in front of the screen.
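Returning to the Myo wrapper code for a moment: the pose-to-event mapping is plain Python logic, so it can be sketched without the hardware. The pose names follow the armband's built-in gestures, but the wrapper integration and the emitted event names are assumptions for illustration, not the talk's exact code:

```python
# Wave-in scrolls down, wave-out scrolls up, as in the talk's demo.
POSE_EVENTS = {
    "wave_in": "down",
    "wave_out": "up",
}

class ScrollController:
    """Turns a stream of recognized poses into scroll events.

    The Myo tends to fire the same pose several times in a row
    (one of the glitches mentioned above), so repeats are dropped
    until a 'rest' pose is seen again.
    """

    def __init__(self):
        self._last = None

    def handle_pose(self, pose: str):
        if pose == "rest":
            self._last = None          # arm relaxed: re-arm the trigger
            return None
        if pose == self._last:
            return None                # debounce duplicate pose events
        self._last = pose
        return POSE_EVENTS.get(pose)   # None for unmapped poses

ctrl = ScrollController()
events = [ctrl.handle_pose(p)
          for p in ["wave_in", "wave_in", "rest", "wave_out", "fist"]]
print(events)  # ['down', None, None, 'up', None]
```

In the real app, each non-None result would be emitted over the WebSocket so the front-end JavaScript can scroll the page; the debouncing addresses exactly the duplicated-event glitch seen in the live demo.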
It uses two infrared cameras with infrared lighting to get an image of your hands, and the algorithmic part converts this image to a precise 3D model of your hands. Its quality is good enough that you can get data on every bone and joint, right up to full gestures; everything is there. And here is how it is presented by the producer. This time the advertisement is quite realistic: most of these things are possible and work, it's not laggy, and it's the most precise of the three. And this is how it was done: I was grabbing and moving it in this manner four times. It works perfectly, but that was a predefined gesture. Then I tried to make a similar thing, where I point at a spot and it should give me a pin at that location. The problem was on the WebSocket implementation side, which was lagging because of the constant requests and responses, so it wasn't working that great. The web usage is mostly to control the site itself. You can imagine that pointing a finger at an exact place could replace the mouse, although it would be difficult to input text. Or extending the current usage of the website; more on that further on, in my imagined-app part. The Python wrapper was created by the Leap Motion manufacturer itself, so it's very precise. It's the biggest of them all and gives you the most data of all three wrappers. It only needs two libraries to work with the device, and they don't even have to be installed globally on your machine. This is the Python code just to get the index finger: when you point it, it will send the data on where the finger is. This is also the receiving part; it's very similar to the Eye Tribe one, because the usage was similar in this example. And now I would like you to imagine a touch-your-pizza application, where you see all the ingredients on your screen and can just move them onto your pizza, then wait to receive a pizza exactly as you created it, using your own hands.
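The finger-pointing idea above boils down to mapping a fingertip position from Leap Motion's millimetre coordinate space to pixel coordinates on the page. The official SDK exposes an InteractionBox that normalizes positions for you; this sketch does the same math by hand, with illustrative (assumed) box bounds, so the logic can be shown without the device:

```python
def tip_to_screen(tip_x: float, tip_y: float,
                  screen_w: int, screen_h: int,
                  box_half_width: float = 120.0,
                  box_bottom: float = 80.0,
                  box_top: float = 320.0):
    """Map a fingertip (in mm above the device) to screen pixels.

    Leap's x axis runs left/right across the device and y runs
    upward from it, so y must be flipped for screen coordinates.
    The box bounds are illustrative assumptions, not SDK values.
    """
    # Normalize into the 0..1 range, clamping at the box edges.
    nx = (tip_x + box_half_width) / (2 * box_half_width)
    ny = (tip_y - box_bottom) / (box_top - box_bottom)
    nx = min(max(nx, 0.0), 1.0)
    ny = min(max(ny, 0.0), 1.0)
    # Screen y grows downward, so invert the vertical axis.
    return int(nx * screen_w), int((1.0 - ny) * screen_h)

# A finger centred over the device, halfway up the box:
print(tip_to_screen(0.0, 200.0, 1920, 1080))  # (960, 540)
```

The back-end would run this on each frame's index-finger position and emit the resulting pixel coordinates over the WebSocket, which is exactly where the request/response lag mentioned above became a problem.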
There is already something like that in a few restaurants, where you have a tablet, but it's done locally, not remotely yet. Okay. There are so many new gadgets all the time; I only mentioned, and very briefly, three devices that emerged in the last years. There's even a ring you can put on your finger to make gestures. But to sum up: Flask-SocketIO is very simple and very easy to use with this technology, and a similar implementation could be made for other devices. But not every implementation worked as well as you would like to expect. Some worked well only when everything was done on the client side, especially the Eye Tribe; the Python socket plus a continuous JavaScript data stream was a little bit laggy all the time. I also had a lot of fun playing with all the devices, and I would recommend anyone who is into such gadgets to play with some of them. I think they give unlimited possible usages.

And I would like to get back to the demo part and try the presentation feature on the Myo armband, which I didn't want to try during the presentation, and make a few more gestures. One is that I can bring up a screen, but it's hard to navigate to show something. And the second one was zoom. No, I have to use the gesture to exit it, and it's stuck. And that was all, sorry. Okay, all the code that was used here will be available at my GitHub account, with the slides from the presentation; I just need to clean out some comments that are not necessary. You can contact me about anything you want. Even if you forget my email, you can Google it; it should get you straight there. And as I was told at one of the presentations, it's always a good thing to add a cute kitty to the presentation, because it gives you 15% better feedback, which I'm counting on from you: any feedback on how to improve. And yes, you can ask me any questions you want now.

Yes? [inaudible question] No, because it's only a sensor, so it doesn't send anything. Maybe you could engineer it.
Any more questions?

[Audience] My question is: how do you communicate between the wireless thing and Flask? Because Flask is just a web framework. Do you keep the tab active on your website and then communicate, or does it go through the operating system to Flask?

If you leave the tab you are using, the WebSocket will disconnect, so you will lose your communication. But while you are on the active tab, it communicates using WebSockets, as I explained at the beginning of the presentation. Oh, I think it does; it looked in the code as if you do have some feedback. You guys didn't hear that? Because it worked.

[Audience] Yes, I was wondering if there is a way of using maybe Flash to communicate with these devices, so that, I mean, something that is already in the browser, so we don't have to deliver the back-end to the user.

Yes, it's possible to use Flash. You mean to communicate with the back-end, or to communicate with the device itself?

[Audience] With the device, so that we can connect the device to the user. He installs it, or she, and it works with our application, which is on a server somewhere.

Yes, it's possible. You can use only JavaScript, for example; there's a library for each device. For the Leap Motion device and the armband, you just attach a JavaScript library. The Eye Tribe is harder, because you have to connect from the server to the client's daemon, and that is the hardest one to implement.

[Audience] They do have some connection, I mean, the browser can access them, but in which way do they work? Do they send just keyboard signals? Because the browser obviously doesn't understand these fancy new devices; it can read keystrokes, maybe mouse clicks. So do they send something like keystrokes or mouse clicks back to the browser?

If you are using the JavaScript library, you already have to have the device installed. If you have the drivers installed, that's enough for the JavaScript to receive the signal.
You only have to allow the communication in the browser. You get a pop-up showing that this site wants to use your Myo armband or Leap Motion, it connects to the JavaScript part, and then it is allowed to send signals.

[Audience] Is that a problem? I was thinking that, because you can also communicate with Flash.

That's also possible. I didn't try that, but it should be doable as well. And one funny thing is that both the Myo armband and the Leap Motion also have Chrome extensions, and there it works perfectly.

Any other questions? Well, if not, thank you very much.

Thank you, too.