Sound check? Is it working? Okay, cool. Good afternoon, everyone. Thanks for coming. My name is Andre, and I'm going to talk about building open source scientific equipment. Thanks to the organizers for inviting me; it's my first time here. I'm a biologist by background. Is there anybody else here from the biology field? Okay, one, two, cool. I've been working for a number of years with Open Science. Just really briefly, for those of you who may not know: Open Science is, in very crude terms, the equivalent of the open source movement inside science. We want to make things more transparent, more reproducible, and more reliable. I have a little website called Open Neuroscience, where I relay open source projects related to neuroscience: hardware, software, data collection. I work with an NGO called TReND in Africa, where a lot of this work comes from. And most recently, I'm working with Mozilla and the Wikimedia Foundation in Germany on a project to map what demands researchers have in terms of equipment.

So just a quick overview. We're going to talk about open science hardware, and then one example of it, something we've been developing for a number of years called the FlyPi. I'm going to show you some squishy things from the biology side: what we do with this device and other applications we have. Then we're going to look at the hardware and software, think about what comes next, and think about how we can make open science hardware a new norm inside research.

So, one example of hardware which is not open. When everybody thinks about a biology lab, this is what people think about, right? A traditional microscope. This design started being developed in the 17th century.
The lens design has stayed pretty much the same ever since. To buy one of these, you need at least 5,000 euros for a research-grade one, and if you want to add functionality, the price climbs pretty fast. There is no scale here, but these devices are about this big and weigh at least five kilos, so they're really hard to transport, and they're plugged into mains all the time. For some parts of the world, the combination of all of this makes them really hard to get. They're hard to customize: if you have a 5,000-euro piece of equipment, you don't want students in the lab opening it up, adding LEDs and changing things. They're hard to repair, because often only one company provides that one specific model. They're hard to get updated. And, as I said, they're only accessible in some parts of the globe, so a lot of people get shut out. We have to remember that this is a tool for science and education, and therefore one of the main drivers of development. Without one of these, for instance, you cannot do a lot of types of diagnostics; you cannot do basic research.

So, working with TReND in Africa, together with Lucia and Tom, we started playing around with hardware. This was about 10 years ago, when we first started. We found the Arduino platform and, as biologists, we started very simple: blink an LED, read a temperature sensor, things like that. We started applying these, and things got more complex over time. For all the engineers here, this is probably still a weekend project, but it took us a little bit of time because we are not from that background. Eventually, this came out as a publication, where you can read about it in more detail if you follow this link.
And basically this is what you can do with it, as we're going to show in a little bit. These are about 200 microns: parasites that you can find if you drink bad water, so you can use this for diagnostics. You can also use it at a macro scale rather than a micro scale, to record and analyze the behavior of small insects or small mammals. And you can do fluorescence. If you remember, I showed that to do fluorescence with a regular microscope, you need at least 15,000 euros; if you source everything from Chinese distributors, you can build all of this for about 100 euros. It's completely open source, obviously; I'm going to show that everything is on GitHub, and you can easily customize it. There's no scale here, but it basically fits in the palm of your hand, and you can power it with batteries, so you can take it anywhere. We're going to see more details in a bit.

You can also record videos with it. This is a very tiny zebrafish embryo, about 2 or 3 millimeters, and what you can see moving here is the vascular system, the blood flow in the little animal; you can see the heartbeat here. This is a very widely used animal model in research nowadays because of all the genetic engineering you can do with it. This is a larva of the fruit fly, and what you can see here on the edge, these bright spots turning white, is the activity of calcium going into muscle cells as they contract. You can measure this; it's called calcium imaging, and it's normally a very expensive technique to implement in the lab as well. Of course, these are all proof-of-principle experiments of what you can do with it.
And we're hoping that, because this is open source, more people are going to pick it up, bring it to their own fields of study, and help improve it. One thing we've learned from teaching ourselves electronics is that any biologist can learn it. And because we are part of this NGO trying to help development in Africa, we started running courses where we teach researchers at academic institutions in Africa how to put electronics together to build their own equipment. In Africa, AliExpress is also widely available, so you can get a lot of components there; it might just take a little more time. At first we showed them how to build these devices, but then we realized that's kind of silly, because not everybody needs one. So we switched the courses to: look, this is how you put an Arduino together with electronics, and then you get to do the biological things that you want. In the last course, last year, we actually asked participants beforehand what they wanted to build, what they needed in the lab, and they gave us answers. And I'm really happy to say that from that workshop, a group actually published an academic paper describing a new piece of hardware. So basically, they learned enough in two weeks that they could later continue the work by themselves and publish another piece of open source hardware for science. That makes me really happy, because I think we planted a seed, so that things which are fairly new in science here in Europe can also happen in other places across the world. So this is all I wanted to show from the squishy side of things. Now let's talk a little bit about the hardware and how everything comes together.
As you can probably guess, the whole frame for this is 3D printed. We use a Raspberry Pi, which is inside this case, mostly because we take advantage of the camera and the Python library written for it. Remember, I said I'm a biologist and things need to be super easy to use, and the PiCamera API is really easy to get started with. Then we have an Arduino Nano coupled to it and communicating with it, because we need timing precision down to microseconds for a lot of the assays we do in neuroscience, and it's much easier to get that precision with an Arduino than with a Raspberry Pi. We made a custom PCB with KiCad. You can power the whole thing from a 12-volt battery about this big for a couple of hours. Everything is released under an open hardware license.

This is how the PCB looks. It's basically a bunch of modules. You have 12 volts in, then two voltage converters, one for the Raspberry Pi, which you can power from this USB output here, and one for all the other devices on the side. Here you plug in your Arduino Nano. (This is an outdated version of the board, because the hardware is not under CC BY anymore but under the CERN license.) What you can see here is that we have a couple of repeating modules for the things we needed. We have general-purpose ports with transistors that let us control the LEDs we use for fluorescence. We have an Adafruit ring with 12 RGB LEDs, which you can use to control the lights and do optogenetics. There is a port for a servo motor; we use a continuous-rotation one to turn gears and change the focus of the camera. And there is an H-bridge, which you would normally use for driving motors, but which we use to drive a Peltier element, with a temperature sensor closing a feedback loop so we can control the temperature of the Peltier element nicely.
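The Peltier feedback loop just described can be sketched in its simplest form as bang-bang control with a hysteresis band. This is only an illustration of the idea, not the FlyPi code; the function name and thresholds are my own assumptions:

```python
def peltier_drive(current_temp, setpoint, hysteresis=0.5):
    """Decide the H-bridge drive direction for a Peltier element.

    Returns +1 to heat, -1 to cool, 0 to idle, using simple
    bang-bang control with a hysteresis band so the bridge does
    not switch rapidly around the setpoint.
    """
    if current_temp < setpoint - hysteresis:
        return +1   # too cold: drive the Peltier to heat
    if current_temp > setpoint + hysteresis:
        return -1   # too warm: reverse polarity to cool
    return 0        # inside the band: leave it off

# Example: holding a sample stage near 25 degrees C
assert peltier_drive(23.0, 25.0) == 1
assert peltier_drive(27.0, 25.0) == -1
assert peltier_drive(25.2, 25.0) == 0
```

A real loop would read the temperature sensor periodically, call a function like this, and set the two H-bridge pins accordingly; PID control would be the natural next refinement.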
If you were here earlier, you saw information about Kitspace, and we have everything on Kitspace, so with the 1-click BOM extension you can order all of the components and the board. We try to make it as easy as possible for everybody to buy all the components and get started. So, Kaspar, thanks for all the work on Kitspace; it's a really good tool, and I'd recommend everybody who has their own projects to put them there.

Right, so that's the hardware. On the software side we are using a very simple stack. It's the stock Raspbian that is distributed with the Raspberry Pi, and everything is written in Python 3. We take advantage of the PiCamera library; the user interface is Tkinter, although we're planning to move it to PyQt; and we use PySerial to communicate with the Arduino. There is an Arduino sketch, obviously, that takes care of the serial communication with the Pi and of the precise timing of things. The code we wrote is released under the Creative Commons BY 4.0 license. And this is a little schematic of how everything works. The idea is to keep all of this as modular as possible. Each one of the hardware components is a class in Python, so you can easily add or change the number of classes; for instance, you could repeat a number of ring classes if you wanted to use several rings in the device, or different LEDs and so on, provided of course you made the hardware changes to go with it. You can also run protocols, meaning you can tell the device to run things automatically for some time. For instance: please turn on LED one for 500 milliseconds, turn it off, wait another two seconds, turn it on again, and repeat this 10 times. So you can run automated experimental protocols.
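The protocol idea above can be illustrated with a small pure function that expands a step list into timed events, which a serial loop could then replay against the Arduino. This is a sketch of the concept under my own naming assumptions, not the actual FlyPi implementation:

```python
def expand_protocol(steps, repeats=1):
    """Expand a protocol description into (time_ms, command) events.

    `steps` is a list of (command, duration_ms) tuples; the whole
    sequence is repeated `repeats` times. Each event records when
    its command should be sent, relative to protocol start.
    """
    events, t = [], 0
    for _ in range(repeats):
        for command, duration in steps:
            events.append((t, command))
            t += duration
    return events

# "Turn LED one on for 500 ms, then off for two seconds", 10 times
protocol = [("LED1_ON", 500), ("LED1_OFF", 2000)]
events = expand_protocol(protocol, repeats=10)
assert events[0] == (0, "LED1_ON")
assert events[1] == (500, "LED1_OFF")
assert events[2] == (2500, "LED1_ON")   # second repetition starts here
assert len(events) == 20
```

Keeping the expansion separate from the hardware classes means the same protocol can drive any module (LEDs, ring, servo) that accepts serial commands.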
This is the user interface: you can turn these things on and off through the classes, and here is how the protocols look. If anybody has experience with user interfaces, we would really love some help improving this, because it works, but it's not fantastic.

So what's next? We try to keep everything as open as possible as GitHub issues, and here you can find the repository. Basically, we want to turn the PCB from one block into several smaller blocks: for instance, if you want to add four LEDs, you just have four little PCBs and a base one, and you stack them. This is a bit like what Seeed Studio has with their Grove system, if anybody has heard of that. As I said, we want to move the user interface to another Python library. Right now, because of the way the PiCamera library is written, you cannot transmit the image data directly via SSH, so we want to send the camera feed to a file or a buffer or something like that, to be able to control this remotely; right now you need a Raspberry Pi, a monitor, a keyboard, and a mouse. If you instead had everything over SSH, you could have several devices connected to one laptop and control them remotely. We also want to improve the camera resolution. Right now we're down to 10 micrometers, but we hope that by changing either software or hardware we can go lower. By going lower, we can do even more diagnostics and experiments that we're currently not able to do; for instance, at the moment we cannot detect malaria with this, because the parasites are too small for the resolution of the camera. We would also like to increase the types of fluorescence we can do, which is all about playing around with LEDs and light filters and things like that. And improve the user manual.
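One way to decouple the camera from the local display, as the plan above suggests, is to push encoded frames into a small in-memory ring buffer that a remote endpoint reads from. This is only a sketch of that idea (names are hypothetical); in practice something like PiCamera's continuous capture would feed it:

```python
from collections import deque

class FrameBuffer:
    """Keep only the most recent `maxlen` encoded camera frames.

    A capture loop would call push() with each encoded frame
    (e.g. a JPEG); a streaming or SSH-forwarded service would
    call latest() to fetch the newest one. Old frames are
    discarded automatically, bounding memory use.
    """
    def __init__(self, maxlen=30):
        self._frames = deque(maxlen=maxlen)

    def push(self, frame_bytes):
        self._frames.append(frame_bytes)

    def latest(self):
        return self._frames[-1] if self._frames else None

# Simulated capture loop: only the newest frames survive
buf = FrameBuffer(maxlen=3)
for i in range(5):
    buf.push(b"frame-%d" % i)
assert buf.latest() == b"frame-4"
```

The trade-off the talk mentions is real: any buffering plus network hop adds latency compared to the GPU preview path, so the buffer should stay shallow.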
There is a PDF version, but it's a bit outdated, and we're trying to put everything on something called Wikifab. Has anybody heard of it? It's an open source tool for documentation, very similar to what Prusa has for their printers: you have a photo and a little text description, and you can go step by step. This is all at this link. We also want to try to apply the device to areas other than neuroscience.

And then, last but not least, I'm starting a project with a couple of colleagues and friends in Germany to provide this device as a kit or as a completed unit, with everything still under open source licenses. If you search for Prometheus Science and FlyPi, you're going to find the website, and there is already a user forum where people have started to interact, bring up problems, and suggest things they would like to contribute. We want to use the FlyPi in this company as a proof of principle that you can do this for other open source hardware. Because what normally happens, at least inside academia, is that you publish a paper describing something; as an academic, that paper is what you want, and you don't care much whether the thing reaches many people. You have the paper, it can be cited, it's good for your career, and you move on to the next project. What we want to do is communicate with the people who are producing these things and become, in a way, the middleman between researchers who publish open source hardware and the users on the other side. We want to take on all the work of certification, documentation, and customer support, and provide it as a service for researchers and users of these devices. Whether this is going to work remains to be seen. And now for the last part of this talk. Sorry, did I skip something? No, this is correct. The FlyPi is but one of many, many projects available out there, and we're going to see that in a bit.
So, can we make this a new norm? Together with Louise and Kaspar and Javier and many other people around the world, we're working with the GOSH community, the Gathering for Open Science Hardware: people from all walks of life working together with the mission of making open science hardware ubiquitous by 2025. This is very ambitious, and we hope to get a lot of it done by then. If this is interesting to you, I would highly recommend looking for the GOSH community online and getting in touch on the forum, because people are really supportive and really welcoming to everybody.

And here, just to show you what I mean by many, many other projects: these are all open source projects related to science that are available out there. You have prostheses; you have different types of microscopes; you have artificial neurons for modeling and for teaching. There are little incubators in which you can grow several different types of plants. There is a whole farming robot, for those of you who might have heard of the FarmBot. One of the ideas we had lately was to make a smaller version of this to take care of plants inside a lab, so the device would handle everything, watering the plants and so on. There are many, many people doing many cool things out there, and we want to see whether this becomes a new norm. This one I think is especially cool: an atomic force microscope. These instruments are really interesting because they scan a surface with a tiny probe at close to atomic resolution, so they can measure really tiny structures, and it's pretty cool that there is an open source project for one. I think we're living in an open source hardware Cambrian explosion, because if you look on Wikipedia, there are more than 70 projects, and many of them are developed to a commercial level.
In these slides there are at least another 36 of them, and there are many, many more in repositories online. I think the reason for this, summarizing a little of what we've been hearing the whole day today, is that the tools to create hardware are getting better and easier to use. The software, KiCad and OpenSCAD and all these tools, is becoming much easier and better to use. Fast prototyping tools, 3D printers, laser cutters, low-volume PCB fabrication, are also getting easier to access. Manufacturing prices are low, and the internet infrastructure is there, so we can store all these videos, tutorials, and documentation. And there are companies that are now more than five years old and have proven that this works as a valid business model. Because this is what everybody is afraid of: "oh, if I don't have a patent, people are going to destroy my company and this is not going to work", which apparently is not true.

Last but not least, the last slide. The project I'm doing with Mozilla and the Wikimedia Foundation started with the observation that projects like the FlyPi grew out of local needs: inside one lab, in one department, in one institution, we had this one need for the FlyPi. So we spent all this time building a device which serves part of the scientific population there, but not necessarily with the biggest possible impact. So can we make a map of what researchers actually need, and once we have the map, build open source hardware based on it? Let's say that out of 1,000 researchers, 800 need a better version of a microscope: let's build that first, get a huge user base using open source hardware, and then move on to the next thing. There is already an online survey going on, so I would ask you to please share it widely.
The project has a landing page where you can find a lot of information, and all the repositories are on GitHub. So with that, I would like to say thank you very much. Please get in touch if you can.

Sorry, yeah, of course. So this is the Pi camera, really, the first version. How could I forget that; that's absurd. You can see it here, right? It has a little lens that comes from surveillance cameras, an M12 thread that you can get anywhere, and we just adapt it in front of the Pi camera. By changing how far this lens is from the sensor of the camera, you get a higher or lower magnification. You can find this version of the camera on Kitspace.

[Inaudible audience question.] Ah, sorry, so the first question was about the camera, which I had forgotten, and that's what I just talked about. And your question is why we want to use SSH and not some other protocol. [Inaudible follow-up about live experiments.] Right, so depending on the application, that's totally fine. The problem is if you want to use this for diagnostics, where you have privacy issues and patient data and things like that. And again, because I'm a biologist, this is the best that I know of; I don't know if anybody has a better suggestion for a transfer protocol. VNC? VNC or something like that, yeah, I don't know. I'm open to suggestions about this.

Yeah? Not very hard. So the question was how hard it was to get focus in the micrometer range. It wasn't very hard. As soon as we found the proper M12 lens, and I think this one has about a six-millimeter working distance, we just needed a stable 3D-printed mount where we could slowly move the thread of the camera relative to the object.
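The effect of moving the M12 lens relative to the sensor follows from the thin-lens equation, 1/f = 1/u + 1/v, with magnification m = v/u. A quick sketch, where the 6 mm focal length and the distances are illustrative assumptions only:

```python
def magnification(focal_mm, image_dist_mm):
    """Thin-lens magnification m = v/u, given 1/f = 1/u + 1/v.

    focal_mm: lens focal length f.
    image_dist_mm: lens-to-sensor distance v.
    Solves for the object distance u, then returns m = v/u.
    Pushing the lens farther from the sensor (larger v) raises m.
    """
    u = 1.0 / (1.0 / focal_mm - 1.0 / image_dist_mm)  # object distance
    return image_dist_mm / u

# Hypothetical 6 mm lens: moving it out from 8 mm to 12 mm from the sensor
m_near = magnification(6.0, 8.0)    # v = 8 mm  -> m = 1/3
m_far = magnification(6.0, 12.0)    # v = 12 mm -> m = 1
assert m_far > m_near
```

This matches the speaker's point: the 3D-printed mount only needs to move the lens thread a small distance to sweep through useful magnifications.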
So the application with the automated focus is still not complete. Right now it's really just pressing a button and making it move; we don't have the algorithms to make it automatic yet.

Yeah? Definitely. So the question was how this device compares with a commercial one. Of course, we're not doing everything that a commercial one can do; the resolution, for instance, is one of those things, and we haven't found a solution for that yet.

So the question is how close we are to commercial instruments in the sense of how many use cases we cover. We basically started by covering our own use cases, because that's what we know best. But, for instance, there are some people from plant physiology thinking about using it for their cases. I would say we're still fairly far from covering all possible uses.

Yeah? So the question is what the image resolution of the camera is. For the Pi camera there are two versions: one is five megapixels and the other is eight. Right now, five megapixels is more than enough for what we need. The problem here is not how many pixels you can put on the sensor, because they are already about one micron apart from each other, so the pixels themselves are small enough. The problem is the optics, or some smart algorithm to process the data.

One here? Yeah? I understand. It really depends. I see. No, I didn't. The question was whether I have considered using GTK instead of PyQt, because of the dependencies one has compared to the other. Again, because I just don't know any better, PyQt seemed to be the best thing we could find, so no, I didn't. And in terms of packaging, in a way we're stuck to the same hardware all the time.
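The missing autofocus algorithm mentioned above is typically built from a sharpness score: move the servo, score each frame, keep the position with the maximum score. A common, simple score is the mean squared intensity gradient; this pure-Python sketch (my own illustration, not FlyPi code) shows the idea on a grayscale image given as a list of rows:

```python
def sharpness(image):
    """Score image sharpness as the mean squared horizontal gradient.

    `image` is a 2-D list of grayscale pixel values. In-focus images
    have strong local contrast and score high; defocused images are
    smooth and score low. An autofocus loop would step the focus
    servo and keep the position where this score peaks.
    """
    total, count = 0.0, 0
    for row in image:
        for a, b in zip(row, row[1:]):
            total += (b - a) ** 2
            count += 1
    return total / count if count else 0.0

flat = [[10, 10, 10, 10]] * 4    # defocused: no contrast at all
crisp = [[0, 255, 0, 255]] * 4   # in focus: strong edges
assert sharpness(crisp) > sharpness(flat)
```

Real implementations usually use a Laplacian-based variant on downsampled frames for speed, but the hill-climbing structure of the loop is the same.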
So in a way it's a little bit easier: once we have everything working, we just prepare an SD card image and put it available online, so people can just download the SD image.

[Audience: What is the image delay? Can you work with your hands without too much delay?] Right, so the question is what the delay is between the camera feed and what we see on the screen. And this is really wonderful, because the PiCamera library sends everything via the GPU directly, so there is very little delay, if any; I could not observe any. This is really why you can focus. [Audience comment: it could also be useful for manual work, soldering small things and so on.] Yeah, I mean, that would work as well. And this is one of the things we are struggling with: transferring this to a buffer or a file, to then transfer via SSH, will definitely increase the delay, and we don't know how to work around that yet.

Thank you. OK, thank you.