This seminar: my name is Jean-Yves Tinevez, and I will be the main speaker for this conference. It's going to be about TrackMate. So roughly, this is the schedule, the outline of the seminar. It's going to be in five parts. The first one is simply how to use TrackMate, with a brief live demo. And after that, Julien Colombelli and the NEUBIAS people told me that you guys, the NEUBIAS Academy audience, are pretty advanced, so we can cover more advanced topics. So I would like to show you some discreet features of TrackMate; even if you know TrackMate, these are probably features you don't know exist. Then how to extend TrackMate, mainly with scripting, but I would also like to visit the extension mechanisms of TrackMate. I would also like to publicize some TrackMate extensions that actually make it better and push its scope further. I didn't write all these extensions, so every time I will tell you who wrote them and who to contact. And then finally, I will conclude with the future of TrackMate. It's a technical webinar, right? It's not a scientific lecture. I will not deal with the algorithms you can find in TrackMate. For this, I direct you to the excellent lecture Robert gave last week, which was recorded on YouTube. So if you want to know the mathematics and the algorithms behind TrackMate, you can watch that lecture as well. Very importantly, since it's mainly technical, we want to learn how to do things with TrackMate. There are going to be question-and-answer intermissions after each part. Don't hesitate to ask questions; we will do our best to answer them. And finally, this whole thing should last less than an hour and a half, and it's going to be supported by four courageous and very kind people: Daniel, Ian, Robert, and Anas. I'm very grateful to them. Are we okay? Okay. If I shake my mouse, do you see it?
That's going to be my laser pointer, right? Do you see it? Okay, in my case, I have the five faces on top of my screen. Do you see them too? Or do you see just the presentation? Just the presentation. Okay, so you don't see my face, right? That should be fine. Okay, what's next? I think I will start by telling you a little bit about why TrackMate exists, mainly describing its core features through its history. I'm working as a research engineer at the Institut Pasteur in Paris, in a small facility dedicated to service in bio-image analysis, right? And I'm lucky to work with these people. Recently, Sébastien Herbert, whom you may know, actually left us; he is now working at the Biozentrum in Basel. If you attended the NEUBIAS Academy events, you know Marion Louveaux; she's an application scientist dedicated to the Icy project in Jean-Christophe Olivo-Marin's lab, and when we are desperate, she comes and helps us. But before that: this facility was created at the end of 2017 and was functional early 2018. Before, I was a microscopy engineer and image analysis engineer in the imaging facility of the Institut Pasteur, led by Spencer Shorte and Nathalie Aulner. At the time, one of the projects I was responsible for was related to these beautiful chips that you see here. These are digital micromirror chips. You probably know them: that's literally a chip on which there are 256 by 256 micromirrors. There are plenty of them. I think this is most likely what powers the video projectors that we have. These chips were developed specifically by our partners so that they could be used properly in an imaging setup. Our role in this consortium was to put them at the back of the microscope and to run applications with that.
These micromirrors, I will not go into the details about them; they had nice features such as very good reflectivity and analog tilting, which gave us excellent control over the light that would shine on our samples. And so our goal was to say: look, these are three cells. This is the cone of light you use when you illuminate these three cells, and that's the image you get. The setup with the micromirrors allowed us to control not only where we would send the light, but also at what angle we would send it. And with that, we wanted to tackle very sensitive samples: imaging of samples that are very fragile and susceptible to light. Our goal was to image the sample with the lowest phototoxicity ever, compared to other illumination modalities. The question is: how do you prove that? How do you compare against other modalities and prove that you have the lowest phototoxicity? So that was the question we tried to tackle: how do you measure the phototoxic impact on a sample and compare microscopes with that? For this, we relied on the C. elegans embryo. And the principle was cool, because instead of using a microscope to learn something about a sample, we were using a sample to learn something about microscopes. So here is a short time lapse of early C. elegans development, from one to four cells. These animals have a very nice property: their development is really reproducible, in a physical sense. For instance, what you see here is a lineage. We will see some lineages during this presentation; it looks like that. That would be the first cell; time runs from top to bottom along the Y axis. And every time you have a cell division, you have something like this, right?
And so what you see here is actually not one lineage, but 20 lineages of 20 different samples that are overlaid. This is taken from a paper from the Waterston lab in 2008. And probably, like me, you just see one lineage. That's because C. elegans development is really timed; it's like a biological clock. If the temperature is constant, the development is very much the same from one sample to another, in the case where you don't have photodamage. And so we reasoned that we could use this property in something that would measure photodamage. For instance, let's say that you wait two hours after imaging the C. elegans embryo. If you count the number of cells and plot the lineage of the embryo, you should have something like this; you count about 50 cells. They did not use the same temperature that we used, but in our case, we also had 50 cells. If you now use a phototoxic illumination modality, the development will be delayed. And it turns out the C. elegans embryo was very sensitive, and we could measure that. So that's the kind of movie we had. When we used very low power, we had normal development; for intermediate illumination power, the development was somewhat slowed down compared to this one and this one. For instance, these three movies are synchronized. And for very high power, we have a catastrophic failure of the development: after a few divisions, the development literally stopped and, on top of that, we got glitches. So that was our metric for phototoxicity. I'm not going to spend a lot of time giving details about that, but basically, if you're interested, there's this paper that describes the method to measure the phototoxic impact of a microscope. The principle is to shine a very controlled amount of light on a single sample and to measure the number of cells you have after two hours.
For a low amount of light, what we call the light dose, you will get no phototoxic effect. As you increase the light dose, the development of the C. elegans embryo slows down or stops, and you end up with only a few cells. And so this range, and I'm showing you with my fingers, which sounds very clever, this range of light doses is the non-phototoxic range of your microscope. And the larger, the better. So that's how we compare microscopes. For instance, we compared the impact of how you deliver light: if you deliver light in short, intense bursts, you see that the range over which you have no phototoxic effect is smaller than if you use long and faint exposures. We even compared widefield versus spinning-disk microscopes, right? But the technical question behind all that was: how do you actually trace the lineages of these embryos? That was almost, well not quite, but almost a decade ago. There were already commercial tools such as, at the time, Imaris, or academic tools such as StarryNite or AceTree, that would allow you to do the cell tracking and even to edit lineages. The only thing is that in my case, the image quality resulting from this imaging modality was not something I could control; it was an input. And so I had to lineage this kind of movie, and this kind of movie, and of course every automatic method would fail. And this is how it happened: TrackMate was initially a way for me to do C. elegans lineaging in a way that would combine automatic approaches with manual curation, because there was no way an automated method could correctly track a whole movie like this at once. Typically, the approach we used for these studies, because there have been several, was to generate a first lineage with what we could, and then manually curate it.
And because we had to do a lot of C. elegans lineaging, TrackMate was made to be easily usable by hand. Okay, so that's the explanation of the core features of TrackMate. If you don't know TrackMate, it's a cell tracking tool that has several features. The first one that we needed was a good visualization of the tracks, just so that you know when they're wrong and you can find where the lineage is wrong. One of the pitfalls of tracking is that it always gives you an answer. It never fails, right? But you don't know if this answer is the right one. So you want to know that, and one of the ways is to directly visualize the tracks. So TrackMate has nice visualization tools, overlaid on the Fiji HyperStack window. There is also a lineage browser, and you can use features to color tracks, spots (cells), or links between cells, just to detect mistakes. At the time, there was also a 3D view of the cells, so that you can orient them in space. And finally, contrary to most single-particle tracking tools, we needed a tool that could store and detect cell divisions. For instance, when you follow vesicle traffic in the cell, you don't need to worry about division, right? You're going to have a track made of at most one location per time point. When you want to plot the lineages of cells, you can't do that; you have to handle cell divisions. You need to handle the situation where one particle becomes two. And so we needed a specialized data structure for that: a graph. That is what you can find in TrackMate. And finally, given the image quality we dealt with, TrackMate had to allow a lot of trial and error: trying a set of parameters and then going back. And so the user interface of TrackMate, even the early user interface, was built for that.
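To make the graph idea concrete, here is a minimal Python sketch of a lineage stored as a graph. This is an illustration only, not TrackMate's actual data model (which is a Java graph); the class and spot names are invented for the example. The point is that a division is just a node with two outgoing edges, something a flat one-spot-per-frame list cannot express.

```python
# Minimal lineage graph: spots are nodes, links are directed edges
# from a spot to its successor(s) in time.
class Lineage:
    def __init__(self):
        self.spots = {}     # spot_id -> (frame, x, y)
        self.edges = []     # (source_id, target_id)

    def add_spot(self, sid, frame, x, y):
        self.spots[sid] = (frame, x, y)

    def link(self, source, target):
        self.edges.append((source, target))

    def successors(self, sid):
        return [t for s, t in self.edges if s == sid]

    def divisions(self):
        """Spots with two or more successors, i.e. mother cells."""
        return [sid for sid in self.spots if len(self.successors(sid)) >= 2]

lin = Lineage()
lin.add_spot("m", 0, 5.0, 5.0)    # mother cell at frame 0
lin.add_spot("d1", 1, 4.0, 6.0)   # two daughters at frame 1
lin.add_spot("d2", 1, 6.0, 4.0)
lin.link("m", "d1")
lin.link("m", "d2")
print(lin.divisions())  # ['m']
```

A vesicle track, with no divisions, is just the degenerate case of this graph: a simple chain where every node has at most one successor.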
So you could navigate back and forth, and every time you would have a panel that says: hey, what do you want to use to filter out spurious spots? You could set the filters, and if you weren't happy, you could go back and change the parameters and so on. Plus the manual editing, but I will have a chance to speak about that in a minute. And then finally, it used a lot of the features in Fiji, and that was a blessing. If you build something for Fiji or Icy or any existing software platform, you benefit a lot from the facilities there, ROIs and so on. But again, I will have a chance to speak about that. Finally, in 2017, the TrackMate paper came out, and it develops a lot of these ideas. Now, are we okay? Now is the time where I have to stop. That's the end of the first part, actually. I hope I'm on time; I have no clock around me. This is the time where I try and run a live demo. I have attempted this several times before, and every time stuff crashed, so it's going to be fun. I suggest that if you don't know TrackMate, you do it with me. You just have to launch Fiji, and there's a test image that you can open directly in the Open Samples menu; it's there. Then launch TrackMate, which you can find there, and wait for me. If you already know TrackMate, it's not going to be fantastically interesting. While you download the test image and launch TrackMate, maybe we can do a quick question and answers. What do you say, people? By people, I mean Daniel, Robert, Ian, and Anas. Yeah, there was an interesting question: is it possible to get the velocity and directionality of individual particles? Yes. Okay, I'm known for short answers. So if you want me to develop, you have to ask. The velocity is built in, and for directionality, Ian, maybe you can say hi, because you just pushed it this morning, right? Your changes were merged into an extension that we will show.
And there are also some angle features in TrackMate already, but yes, there are add-ons for TrackMate about this, which we will cover in the third part, or the second part, I forget already. Any other questions? Okay, I will start the... Yeah, maybe another general-interest question: does it work only for lineages, or can you also do basic tracking without cell division? Yes. Okay, that's something I did not discuss in this history of TrackMate: I was, and still am, a research engineer in a facility, right? And so we get a lot of requests for different projects. You probably know the expression: when you have a hammer, everything looks like a nail. And so I started to apply TrackMate to a lot of different problems, including organelle or subcellular particle tracking too. We even took the single particle tracking challenge, which is normally just for that. So you can do basic tracking and cell tracking too; you don't have to consider lineages and everything. And again, I didn't make a slide about that, but the usage of TrackMate has been pretty diverse since the publication. People have used it in ecology, tracking animals, and in materials science for things I don't even know. And apparently it's been doing okay. It's probably not the greatest tracking algorithm ever, but the scope, the breadth of things it can address, is pretty okay. Are we good? Please say yes. Yes, I think there was one other question that might be interesting: how large was the largest dataset you applied TrackMate to? Not so large. We will discuss this point during the limitations, so if you could ask this question again at the end of the third part, we will discuss the limitations of TrackMate. But I can tell you right now: TrackMate accepts any image that Fiji can open as a hyperstack. For this, the full image must be in memory.
So in the end, it will depend on how much RAM you have on your PC. There are some extensions and some offspring of TrackMate, Mastodon, that are made to work around that. But again, we will speak about it in a minute. Okay, can I go on? You tell me, you're the boss. Well, TrackMate looks like this, right? This is a synthetic image of a time lapse of things that divide and merge together. I think you see it's a very simple image: there are very few spots and they are all drifting in the same direction. So it's an easy image to track, just to show you how TrackMate works. If you launch TrackMate, it looks like this. There are basically these Next and Previous buttons. There's a log that you can switch to, and here you can save the session at any time. On the first, start panel, you can verify that the metadata is set correctly. In my case, my image is uncalibrated, the pixel size is one pixel, but it's very important that you have it right in case you have actual spatial units, because in TrackMate, everything is stored in physical units. The physical locations of spots, X, Y, Z, are stored in microns if your image is calibrated in microns. So it's important to get it right. And then this is where you can set an ROI, but again, we'll come back to that. After that, it's all about clicking Next. Since TrackMate has a modular design, again we'll come to that, every time you can select from a series of algorithms there. TrackMate does single particle tracking by detection. That means it's a two-step tracking: you first have to detect the objects and then link them into tracks. The algorithmic part in charge of the detection is called a detector. There are a couple of them, but they're roughly all the same. If you don't know what to use, just pick this one: the LoG detector. LoG stands for Laplacian of Gaussian.
It's good at detecting bright objects on a dark background, which are roundish. You just need two parameters. The first one is the blob diameter. So typically here, what you would do is trial and error. I see here it's probably too big; a diameter of five is better. But you see that I have many spurious spots, so we will get rid of them later. If you want to get rid of them earlier, you can also play with the threshold parameter, which will reject spots of low quality. When you're happy with the detection, you simply click Next, and then TrackMate executes the detection. That's not a very big image, so it's relatively fast; it takes advantage of multi-core CPUs and everything. Then you can simply click Next. After that, you have a step that's called the initial thresholding. That's the histogram of all the spots that were found, in that case 22,000. The histogram is on the quality of the spots, the quality of detection: how likely a spot is to be a true one is reflected by the quality. The quality is a very important feature in TrackMate. The quality is high if your spot is bright and of the right size. A little bit earlier we said: okay, please find spots that are five pixels in diameter. The quality will go down if the blobs you're trying to detect are not five pixels. This step is optional; you could remove some of them here. I will include them all. After that, you have to select a view, where to display the results. I can tell you straight away, this one doesn't work anymore, so there's just one that works: the HyperStack view. It means it will display the results here. So you have all the results there, right? Okay, so in all cases, we have plenty of spurious spots. This is kind of like fishing with a fishnet with a very fine mesh. So you want to put filters to reject spurious spots, and this is what this panel does: it acts as a filter.
You simply click on the plus button here, and then you have a list of features that you can choose to filter on. For instance, I could filter on X, like this. That doesn't make sense in our case. You can filter on the quality, and then you see that if I put the filter here, I have only valid spots. Okay. Importantly, this is reversible, right? So you can always go back and say: no, finally, I just want to keep this one, this one, this one. Once you did that, sorry, you click on Next again, and then you have to select a tracker. There are a couple of them, but actually the only ones that matter are the LAP tracker and the simple LAP tracker. As explained here, it comes from the Jaqaman et al. 2008 Nature Methods paper. I didn't come up with this tracking algorithm; Khuloud Jaqaman did. This is just an implementation. And then there's the linear motion LAP tracker, and this one is good if you have particles that are transported. If you don't know what to use, simply start with this one. It just needs a few parameters, mainly: how far can a cell go? If we miss a detection, how long can it stay missed before we lose it? And how far should we look if we want to find the successor? You click Next, and then you should have the tracks there. Again, like for the spots, with this panel you can put filters on the tracks; maybe not on the intensity, it's not very interesting. You can put filters, for instance, on the track displacement and say: keep only the tracks that move a lot, or just leave it like that. So this is the basic usage of TrackMate, and in the end, you have your tracks there. Now you see, here I didn't allow for cells to divide, so I didn't detect cell divisions. Let's come back to that. If I want to detect cell divisions, I have to change the tracker. So you simply move back, because you can, and instead of using the simple LAP tracker, we will use the LAP tracker, the not-so-simple one.
It's the same tracker; it simply has more configuration possibilities. If you're wondering why I moved things around, it's because I have the Zoom window on my screen. In that case, you see the frame-to-frame linking; that's linking one spot to the one in the frame just after. Gap closing is for when you miss one, and then you can allow track segments to split, that is, to divide. So if I check this box and click Next, you see that now TrackMate was able to detect that here, there's a cell that divides, right? And otherwise, it's the same. Well, that's the basic tracking interface, with Next and Previous. When you reach the display options, this is the last panel, and this is where you can export your data to numerical values, or click on TrackScheme to get the lineage. And so this corresponds to that, this corresponds to that, and so on, like this. Okay, so that's the end of this first live demo, and it didn't crash, and I'm really happy. Before I go on, are there any more questions on this part? I think everybody is very silent. So, can we also track rings, as opposed to dots? No. Again, that's a very good question; it comes back a lot. I made a slide about it for later. The detectors I have are very basic. They are good, but they are good at one thing: detecting bright stuff, blobs, what I call blobs, you know, stuff that is round and bright. Any deviation from this shape is going to be lost, and that's a pity, because, for instance, you might like to track things in bright field, or, in my case, I'm very interested in bacteria, and bacteria are typically elongated, and it works okay-ish, but not as well as when you have round objects. Fear not, we can work on that, and that's going to be part three, or two, something like this. Shall I resume? Yes. Let's go, okay. So that was the basic part. For the next thing, well, I cannot be with you to look at your screen and teach you how to do things.
So mainly the next part is going to be a collection of resources, if you want to dig into a subject, and so on. Don't hesitate to ask questions about it. So we put a lot of effort into writing good documentation. If you ever have a question about TrackMate, how to do things, just try to go to its main page; that's the page on the Fiji wiki, and there you will find plenty of topics online, even technical documentation: what the algorithms are, what the performance and the performance limitations are. There's also a manual, which is a bit more than 150 pages, that tries to be pleasant to read and as complete as possible. There are tutorials to get started, and there are also advanced subjects. So if you have questions, it's a good idea to go there. Okay, now we move on to the next topic. It's typically the interoperability and how to extend the capabilities of TrackMate. An important feature when you have a tracking tool is interoperability. TrackMate is a tracking tool, which means that it takes an image as input and outputs tracks. And tracks are just collections of dots linked across time. But most of the time, the scientific question you ask is not just about getting the tracks. You want to know how fast things go, where they go, where they come from, and so on. So the last step in the scientific question is often track analysis: I have my tracks, I want to analyze them. TrackMate doesn't do that; it's not its goal. There are tiny pieces of visualization and things like this, but there's no big machinery for track analysis. And so this is why there are interoperability tools, which means importing TrackMate results into other software. There are mainly two of them that I know of. The first one is MATLAB, simply because I like MATLAB and I started with that. And so there are, shipped with Fiji, plenty of MATLAB functions to import a TrackMate file. And so if you remember in the... where's my window? I probably did something really wrong.
Sorry, I wanted just to show you this button here. There's a save button, and it will save the TrackMate session, the link to the image, the data, and every parameter you chose, in an XML file. That's basically a text file. And in Fiji, if you look in the scripts subfolder of Fiji, you will find five functions called trackmate-something.m. Those are MATLAB functions that will take these files and import them. I like MATLAB, that's a personal choice, but the nice thing is that in MATLAB, there's a graph data structure which is very neat and has a lot of facilities if you want to manipulate cell divisions and so on. For instance, here is one of the C. elegans embryos with a few of the cell lineages highlighted, so you can do that. These functions, I'm happy to say, are almost properly documented: if you type help followed by the function name, you will learn how to use them and how to import things. But if you want to know more, in chapter eight of the manual I was just showing before, there's a tutorial, widely explained, and so on. And so there you can further your analysis. Also, at the time I was working at Pasteur, and you may know that Pasteur is one of the institutes, I think the main institute too, that develops the Icy software. Typically, if we were in the same room, I would ask people who knows Icy and I would count how many people raise their hands; according to the helpers, not that many people know Icy, I'm surprised. Icy, I would say, is an open-source academic software for general image processing, but it doesn't have exactly the same goals as Fiji, I would say; it's more focused on applied mathematics and algorithms. They have a tracking interface too, which at this time mainly uses a probabilistic algorithm. And so I was in contact with Fabrice de Chaumont, one of the main Icy authors.
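Since the saved session is plain XML, any language can read it, not just the bundled MATLAB functions. Below is a minimal Python sketch; the element and attribute names follow the TrackMate file format as I understand it (spots grouped per frame under AllSpots, links as Edge elements under AllTracks), so treat them as assumptions to double-check against one of your own saved files, which will also carry many more attributes.

```python
import xml.etree.ElementTree as ET

# A tiny TrackMate-style XML snippet for illustration.
XML = """
<TrackMate version="7.0.0">
  <Model spatialunits="micron" timeunits="frame">
    <AllSpots>
      <SpotsInFrame frame="0">
        <Spot ID="1" POSITION_X="10.0" POSITION_Y="5.0" POSITION_Z="0.0" FRAME="0" QUALITY="52.1"/>
      </SpotsInFrame>
      <SpotsInFrame frame="1">
        <Spot ID="2" POSITION_X="11.2" POSITION_Y="5.4" POSITION_Z="0.0" FRAME="1" QUALITY="49.8"/>
      </SpotsInFrame>
    </AllSpots>
    <AllTracks>
      <Track name="Track_0" TRACK_ID="0">
        <Edge SPOT_SOURCE_ID="1" SPOT_TARGET_ID="2" LINK_COST="1.6"/>
      </Track>
    </AllTracks>
  </Model>
</TrackMate>
"""

def read_model(xml_text):
    """Collect spots and the edges that link them into a graph."""
    root = ET.fromstring(xml_text)
    spots = {}
    for spot in root.iter("Spot"):
        spots[int(spot.get("ID"))] = (
            float(spot.get("POSITION_X")),
            float(spot.get("POSITION_Y")),
            int(spot.get("FRAME")),
        )
    # Edges give the graph structure: source spot -> target spot.
    edges = [(int(e.get("SPOT_SOURCE_ID")), int(e.get("SPOT_TARGET_ID")))
             for e in root.iter("Edge")]
    return spots, edges

spots, edges = read_model(XML)
```

Because everything in the file is in physical units, whatever you compute downstream (speeds, displacements) comes out in those units directly.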
And so together, he developed on his side and I developed on my side. In my case, I made an exporter from TrackMate to Icy: it will generate a track file that you can import into Icy. And on his side, Fabrice made an importer for TrackMate files directly into Icy. So once you have generated tracks or lineages in TrackMate, you can go to Icy and go back and forth and exchange data like this. If you have never used Icy, give it a go; it's a great software and there are a lot of things in it. It's worth trying. Okay, now, are there any questions about this part? Nobody raised their hand, so I would like to spend the next 12 minutes speaking about the lesser-known features of TrackMate. I didn't dare to call them advanced features; TrackMate is a simple tool, so I didn't dare to use the word advanced. But typically, when I see people teach TrackMate, I see that they don't teach these a lot, and among users of TrackMate, these are things that they don't know exist, and most of them are not in the manual. So I would just like to quickly go through them and show you. The first one I would like to show you is that TrackMate actually plays well with the ROIs that you have in Fiji, for instance. I'm going to close this and then go back all the way to the beginning. Let's say that you have... where's my Fiji window now? Sorry. Ah, it's there. Let's say that you have an image, right? But you want to track only cells that are in a certain region. Nothing prevents you from creating an ROI in Fiji, and you can even make exclusions and so on. And so when you have drawn your ROI, you click on the refresh source button and then proceed with the detection as before. TrackMate will only include spots that are in this ROI and generate tracks from those. That's very handy if you have not-so-simple images and so on. And, blah, blah. Okay, and so you can generate a happy face like this.
The next thing I would like to show you, I didn't speak about it in the introduction, is the semi-automatic tracking. Now, TrackMate has some manual editing, so you can literally track cell by cell by clicking in the image. What you can do too is use semi-automatic tracking. All right, let me do that again. For instance, I have a cell here. You can... I hope it's going to work. Nope. I have to start again. Please bear with me. Manual tracking, for instance. So one approach you could take is to create a spot, position it over a cell in the image, adjust the size manually, and then press Shift-A. You can't see me pressing the keyboard, but just by pressing Shift-A, TrackMate will try to iterate and find spots in the vicinity. That's the semi-automatic tracker. It's very handy if you have a really noisy image and only want to follow some cells, and so on. So that's the semi-automatic tracking, and you can configure how it happens with this tool here. You see, when you launch TrackMate, there's this small icon that appears on the top right of the Fiji toolbar, and if you double-click it, this appears here. So there are several things, such as, for instance, a tool to select a track or something like this, but there are also tools to control how you do semi-automatic tracking, such as the quality threshold and the search range ratio. I'm going to put in something reasonable and then track things, right? And so you can quickly generate tracks that are semi-automatically found. I found that very handy when I wanted to follow, for instance, bacteria in a very dense area, but I was just interested in a couple of them. Good? Nobody answers. The next thing I wanted to show you: the feature values. Okay, so this is... how can I show you that? In TrackMate, there are what's called feature values. So I told you that TrackMate is not a track analysis tool.
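The semi-automatic tracker can be pictured as a greedy loop: starting from the user-placed spot, look for the best nearby spot in the next frame, accept it if its quality is good enough, and repeat until nothing qualifies. Here is a toy Python sketch of that idea; the threshold and radius semantics are simplified stand-ins for TrackMate's quality threshold and search range ratio, not its actual code.

```python
import math

def semi_auto_track(frames, start, quality_threshold=0.5, search_radius=15.0):
    """Greedy semi-automatic tracking sketch.
    frames: list of lists of spots, each spot = (x, y, quality).
    start:  (frame_index, spot) chosen by the user.
    Walks forward frame by frame, picking the nearest spot within
    search_radius whose quality is at least quality_threshold times
    the quality of the starting spot; stops when none qualifies."""
    t, spot = start
    track = [spot]
    min_quality = quality_threshold * spot[2]
    for frame in frames[t + 1:]:
        candidates = [s for s in frame
                      if math.dist(s[:2], spot[:2]) <= search_radius
                      and s[2] >= min_quality]
        if not candidates:
            break  # lost the cell: hand control back to the user
        spot = min(candidates, key=lambda s: math.dist(s[:2], track[-1][:2]))
        track.append(spot)
    return track

frames = [
    [(10.0, 10.0, 1.0), (40.0, 40.0, 1.0)],
    [(12.0, 11.0, 0.9), (41.0, 40.0, 1.0)],
    [(14.0, 12.0, 0.8), (80.0, 80.0, 1.0)],
    [(50.0, 50.0, 0.2)],   # too dim and too far: tracking stops here
]
track = semi_auto_track(frames, (0, frames[0][0]))
```

This is also why it works so well in a dense field when you only care about a couple of cells: the search stays local around the cell you seeded, instead of trying to solve the whole field at once.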
However, it computes certain values related to the spots (the spots are the detections), the links (that's what actually links two cells), and the tracks. A track is this one: it's a collection of cells that you follow through time. And so you can use these feature values, they would be here, to color things, to give an indication to the user of whether it's good or not. For instance, if I color my spots by Y, you see that I have this rainbow color scheme, and for low Y values it's blue, and for large Y values it's red, right? And you can even use that here: for instance, I'm going to color the links, which are also called edges, by velocity. And here you have a kind of display of the instantaneous speed of the cells, and so on. But you can use these feature values for other purposes. For instance, I'm going to start again and do the detection again as quickly as I can... I must have done something wrong. Good, hold on, there, right? For instance, if you go into the LAP tracker, you see that for the frame-to-frame linking, you have a maximum distance, that's the search radius from one spot to another, but you also have what's called feature penalties. And for instance, you could say: add a penalty when the quality is different. Let me explain that. Tracking in TrackMate is based on the minimization of link costs. When you have a cell that you're trying to link to another cell in the next frame, you actually compute the cost to link them. And when you have multiple candidates, you simply pick the one with the lowest cost. All right, this is the case when you want to link one to another, but in reality, you have to compute the cost to link all the spots in one frame to all the spots in the next frame, right? And these costs can be fine-tuned in TrackMate using feature penalties. You could say: look, I know that the fluorescence intensity of a cell is constant over time.
So if you see one spot, and then a spot in the next frame with a much, much higher intensity, it's probably not the same cell. Feature penalties let you say: in that case, increase the link cost according to how different the feature values are. When is this useful? Here is a simulated case: suppose you are trying to track a lot of spots that are very densely arranged. Typically it's very hard for TrackMate to link that kind of data, it's too dense. What you can do is put a penalty on the spot intensity, the fluorescence intensity, with a weight of five, and in that case TrackMate is able to retrieve the correct tracks. Almost: you see there is still a problem here. But it can help you in very difficult situations. So that was feature penalties. Finally, the color scheme, and this is where Jan Eglinger comes into play. The color scheme you saw before can be configured. By default it's the jet colormap, but you can change that: you find it under Edit > Options > TrackMate, and you can choose something else, for instance viridis. What many people don't know is that you can also set a manual scale for the coloring. Typically the colors are scaled from the minimal feature value to the maximal one, but you can enter a manual range yourself. The way to bring up this window is simply to double-click on the 'Set color by' label, and then you can set the color scale here. That was something asked for by a user from Kobe University. Next tip: interactive results tables. This is where I should not have played the fool that much. So let's say I have some results here.
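The penalty-weighted link cost just described can be sketched in a few lines of plain Python. This is a toy reimplementation of the idea, not TrackMate's code; the exact cost formula TrackMate uses is documented on its wiki, and the distance-squared-times-penalty form below is an assumption in that spirit:

```python
def link_cost(spot_a, spot_b, penalties):
    """Cost of linking two spots: squared distance inflated by feature penalties.

    spot_a / spot_b: dicts with 'x', 'y' and arbitrary feature values.
    penalties: dict mapping feature name -> weight (missing = feature ignored).
    The distance^2 * (1 + sum of relative differences)^2 form is an assumption
    in the spirit of TrackMate's LAP cost, not its exact implementation.
    """
    d2 = (spot_a['x'] - spot_b['x']) ** 2 + (spot_a['y'] - spot_b['y']) ** 2
    factor = 1.0
    for feature, weight in penalties.items():
        f1, f2 = spot_a[feature], spot_b[feature]
        if f1 + f2 != 0:
            factor += 3.0 * weight * abs(f1 - f2) / (f1 + f2)
    return d2 * factor ** 2

def best_match(spot, candidates, penalties, max_dist):
    """Pick the candidate in the next frame with the lowest link cost."""
    best, best_cost = None, max_dist ** 2  # links beyond max distance are rejected
    for c in candidates:
        cost = link_cost(spot, c, penalties)
        if cost < best_cost:
            best, best_cost = c, cost
    return best

# Two candidates at the same distance: the intensity penalty disambiguates them.
a      = {'x': 0.0, 'y': 0.0, 'intensity': 100.0}
dim    = {'x': 3.0, 'y': 0.0, 'intensity': 20.0}   # much dimmer: heavily penalized
steady = {'x': 0.0, 'y': 3.0, 'intensity': 98.0}   # similar intensity: cheap link
print(best_match(a, [dim, steady], {'intensity': 5.0}, max_dist=10.0) is steady)  # → True
```

Without the penalty both candidates would cost the same; with it, the spot whose intensity stayed constant wins, which is exactly the "intensity is constant over time" prior described above.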
If you click on the Analysis button, TrackMate generates three tables of feature values. For instance, in the spot table you find X, Y, Z, quality, mean intensity in each channel, median intensity, et cetera. What I wanted to show you is that these tables are interactive. For instance, here is the track table. On this image I have seven tracks (some of them are here too), so I have seven lines in the table. When you click one line, as you see here, it highlights the track you are interested in. That works for spots and individual links too. It can be very handy to find where you are in the data, and so on. Now, the track branch analysis. This feature was made to deal with the following situation; let me summarize. For this I will open the data of Alina Soma, who is a kind user of the facility. She has been following stem cells in a confined environment, a bit like us recently. It takes a little while to load. So this is one of Alina's results: she follows a cell over time as it divides. Let me close this window. If I look at the lineage, this is a rather big lineage with many divisions. Alina, and actually I as well, was interested not so much in the lineage itself, but in questions like: how long does it take for a cell to divide? This whole lineage is one track in TrackMate, so if you select the whole track, it selects all the cells, all the data, everything. And Alina said: look, I want to analyze cell by cell, from just after one division up to the next division. These would be branches. There is a module to do that in TrackMate, but to get it you must move beyond the display-options panel and go to what's called actions. This is where you find TrackMate's many miscellaneous actions. One of them is called branch hierarchy analysis.
If you execute it, it decomposes the lineage into these branches and gives you, for each branch, how many cells come after, how many before, how far it travels, at what speed, and for how long. And like the previous tables, it is interactive. Because the lineage is very big, I probably have to search for it below... I'm sorry, I probably did something that broke the link between the two. Oh no, there it is. So when you select a branch, you know what it is. It's very handy when you have to analyze lineages quickly, for instance to measure how much time a cell division takes. Initially this was developed for Milan Esner, who was a collaborator at the Institut Pasteur, and the first result of that was in this paper. Okay, what more can I show? Jean-Yves, maybe you should look at the time a little bit. We are at twenty past four now. Oh, twenty past four. Okay, I'm sorry. Maybe I skip that; you have it in the PDF. Let's go back to the presentation, sorry, and resume the normal flow. Okay: limitations of TrackMate. Maybe, Jean-Yves, I have a question that has come in several times and that is maybe a limitation of TrackMate: is it possible to track cells of different sizes, and if the size changes over time? No, we don't do that well. The LoG detector has only one size. Again, that's a question that comes up often, and I think someone needs to implement such an algorithm at some point. Okay, the limitations of TrackMate. This is exactly the question. There are limitations linked to the implementation, the way it's coded, and limitations linked to the algorithms it uses. In TrackMate you don't have shape information: there are no regions of interest. So if you have the results of a segmentation, you can't use them in TrackMate. The cells are simply represented by X, Y, Z and a radius.
As was asked at the beginning, the image must fit into memory, which makes TrackMate not so good for very large images. One thing we noted is that it becomes sluggish and unresponsive if you have a lot of objects, beyond roughly 100,000; you can maybe go to 200,000 or 300,000 cells. But if you think about it, that's not so much for modern problems. And it generates big files: the XML files are text files, that's the reason why. Finally, that's me saying that the programming standards are not fantastic; I tried to fix and patch them, but there's room for improvement there. I was young when we started. On the algorithmic side, exactly as the user's question suggests: the detectors only work well for blobs, roundish bright objects on a dark background, of constant size. If the size changes too much, it's not going to work well. So far there is nothing to reliably and robustly track anything but bright objects, mainly in fluorescence images. And the main weakness of TrackMate, I would even say the catastrophe scenario, is this situation: if the detector finds two spots per object you want to track, it will almost certainly lead to a tracking failure. In that case you see two tracks, two small tracks, for one object: instead of one very long track, you get two short ones. That's a very common tracking failure in TrackMate. So it's very important to have a good detection step, in which you detect all your cells correctly, if you want the tracking to succeed. Those are the main limitations, okay? How do you work around them? The first thing you can do is extend TrackMate, or script it. TrackMate was initially thought of as a user-interface tool, but you can script it to extend its capabilities. In my case I use Python, because that's a language I use a lot.
You can perfectly well script TrackMate to make it do things it was not supposed to do. Here, for instance, we used TrackMate to track cells that were represented by ROIs in Fiji. I don't have time to teach TrackMate scripting here, but fortunately there are very good resources, again on the wiki, and there's a lot of discussion about how to script things on the forum, so don't hesitate to go and look there. You will find plenty of ready-to-use scripts that do a variety of things; I encourage you to try them. But the most important part is TrackMate extensions, and to justify that, I will simply go through the TrackMate timeline. TrackMate started in 2010 with Nick Perry, this gentleman, who was an intern at the time, and I made him work on it. After he left, I continued working on it, and I was joined on the project by Johannes Schindelin, whom you may know as the person who founded the Fiji project with Albert Cardona, Pavel Tomancak and actually many others. The development of TrackMate took me that long: the paper was only accepted in 2016. Of course I was doing other things in parallel, but it took very long. If you think about it, this little human was this size at the beginning of TrackMate, and this size when the TrackMate paper was accepted. Just to say that the development of end-user software takes really long. It spans a lifetime, and this length is even longer than the duration of a PhD, a postdoc or a typical software project. So I would like to encourage you to consider that developing tools takes so much time that it's probably worth extending existing tools, or making your own tools extensible by others. Because at some point in research, what really matters is: how fast can we deliver a tool that works?
Particularly in facility work, that's very important. So this is my pitch for TrackMate: if you think about it, its only advantage compared to commercial software, which is always going to be more polished than TrackMate, is that you can extend it yourself, and very easily. TrackMate is a collection of modules, and what the GUI does is simply play the modules in order. Nothing prevents you from developing your own module. If you see something TrackMate doesn't do and you know a little bit of Java, you can make your own module for TrackMate and extend it, and it's going to be really well integrated. There's a discovery mechanism provided by SciJava, a super great project underlying Fiji that provides a lot of facilities for developing good tools for science. And there are plenty of examples. For instance, computing the direction in which a cell moves is this code: that's Java, but it's not incredibly complex, and it's a good exercise. Everything in TrackMate is made so that when you contribute an extension, it becomes a first-class citizen: it appears in the GUI here, you can use it for coloring, it shows up in the tables and everything. Jan, Robert and others all contributed. We could say that I'm the main TrackMate author, but there have been a lot of contributions from others, precisely because of this. So that's something I would like you to consider: building your own extension. Again, we put a lot of effort into the documentation, so you can find a kick-start page here and a programming tutorial that takes you through it. Online there are example code and templates for you to start from, because programming is mainly copy, paste and modify, and you will find those here. So if you have some Java skills, I encourage you to consider these solutions. Okay, I'm running out of time. I'm really sorry, I don't know what took so long.
Quickly: some TrackMate extensions have been contributed by others, not by me. A TrackMate extension is nothing more than a Java jar file, the product of what I just described, that you can drop here. There are plenty of them. There's the basic one, for instance, that adds features to TrackMate, like measuring these metrics here; others add new detectors and so on. I'm running out of time, so I will stop before the end just to leave a chance for questions and answers, but I would like to mention a few of them. There's even a TrackMate integration in KNIME now. There's an extension of TrackMate, for instance, that allows measuring cell-to-cell contacts: a detector that, as you see here, if you have two labeled channels, measures the contact area between two cells and tracks it over time. In our case, in this kind of experiment, that was instrumental to measure the formation of contacts between T cells and B cells: when they make a contact like this, calcium flows into the T cell. There was a cool application of that in infection and disease studies; it was really nice. I'm sorry, I would have liked to speak to you about MaMuT and the successor of TrackMate, Mastodon, but I'm really running out of time and I would like to respect the schedule. You have the PDF with you, and I encourage you to read it. So I think I need to finish there, thank all these people who contributed (you see, I'm absolutely not the only one here), and I'd be happy to take any questions if you have some. Okay, there is also a question that has come in two or three times: is it possible to import segmentations or detections from another software and put them easily into TrackMate? Typically, if people tracked using MTrackJ or another tool. Let me show you this quickly. There's a TrackMate extension.
Oops, sorry, not this one. There's a TrackMate CSV importer. If you have X, Y, Z, time, quality, track, et cetera, as columns, there's a user interface that lets you import almost any kind of CSV file into TrackMate. The limitation is again that we can't represent segmented shapes: if you have shape information, it's going to be discarded in TrackMate. But otherwise, there's the TrackMate CSV importer and it's documented here. Jean-Yves, may I interrupt you? The official timing of these webinars is one hour and a half, so you actually have time until five o'clock. We don't want you to stop too early thinking that you have to stop now. Ah. Well, then I have a question for Jean-Yves, since some people asked for it. First of all, what kind of data can you export: what kind of tables, XML files, things like that? And also some people asked how to export visualizations, for example videos where you see the tracks moving around. Maybe you could quickly show that. Yes, okay. Now I realize that I have more time, boom. I'm sorry, that's the stress, you know. Okay, so there were two questions: what can you export when it comes to data, and how to export visualizations. When it comes to data, let me find that... if you open a TrackMate XML file, you will see that it's a text file, and normally it should be pretty self-explanatory. At some point I talked about the MATLAB importer: it just reads the XML and reconstructs the data from it. That's one way you can play with the results. Otherwise, I showed you that you can generate tables by clicking on the Analysis button; these are ImageJ tables, so you can save them to CSV and play with your own data. Finally, how to export a visualization: there's a special action for that, and Robert is right, it's worth showing. It's very nice; never mind that people don't care about it.
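To give an idea of the kind of table such a CSV importer expects, here is a toy round-trip in plain Python. The column names (X, Y, Z, FRAME, QUALITY) are illustrative assumptions; check the importer's documentation and map whatever headers your file actually has in its UI:

```python
import csv
import io

# A minimal detections table: one row per spot. The header names here are
# assumptions for illustration, not the importer's required names.
raw = """X,Y,Z,FRAME,QUALITY
10.5,20.0,0.0,0,50.2
11.0,21.3,0.0,1,48.7
11.8,22.1,0.0,2,49.9
"""

spots = []
for row in csv.DictReader(io.StringIO(raw)):
    spots.append({
        'x': float(row['X']),
        'y': float(row['Y']),
        'z': float(row['Z']),
        'frame': int(row['FRAME']),
        'quality': float(row['QUALITY']),
    })

# Frames present in the file: the spots are now grouped per time point,
# which is the form a frame-to-frame tracker consumes.
frames = sorted({s['frame'] for s in spots})
print(len(spots), frames)  # → 3 [0, 1, 2]
```

Note that, as said above, only point-like information (position, frame, quality) survives the import; any shape or outline columns would simply be ignored.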
Let me look. Here, what we would basically like is to generate a movie like this, with the tracks playing. If you go to the very last panel, by clicking Next up to the actions, there's something called Capture overlay. What this does, when you press Execute, is generate an RGB stack of everything displayed here, captured as it is. It's not TrackMate data anymore, it's a movie. After that, with this image, you go to File > Save As > AVI; I'm not going to compress it with anything, and you can save it as an AVI movie. I don't know if it's done... yes. Sorry, that's not our problem. In my case I'm using a Mac, so if I double-click the AVI it gets converted to a QuickTime movie, and then I can put it in PowerPoint. This is how I prepared this presentation, and most presentations with TrackMate actually. Are there other questions, or did I lose you? There was one question, sorry, about whether it's possible to decrease penalties: basically, if you know the intensity is going to change, to account for that. If you don't specify a penalty, there is zero penalty associated. It works the other way around: unless you specify it, there is zero penalty on intensity or anything else. Okay, thanks. I also have a question, or actually I'm just forwarding one. Among the spot measurements there is a parameter called estimated diameter, and some people asked about measuring the size of spots. Maybe you can quickly tell us how this estimation works, how precise it is, what it can be used for and what it should not be used for. So, you see that in my C. elegans movies, as the cells divide and development goes on, the size of my spots changes, and I wanted to measure that.
That's what this was meant for. The way it works is that it takes concentric rings (spheres in 3D) of increasing size around the spot and measures the intensity. As soon as the intensity in a band around the ring drops, the algorithm considers that it has reached the border of the cell and says: okay, that's the estimated diameter. It works okay, even in 3D, provided you have nothing else in the vicinity. Otherwise, it's not robust enough to actually measure the size of something like this. Maybe we can resume, and I can say a little bit about the future and present of TrackMate. Are you okay with that? Yes, please. So, one of the key limitations of TrackMate is the way we store the image: the image must be a Fiji hyperstack and it must be in memory. At some point, as we were moving towards developmental biology, there was a fantastic imaging technology called light-sheet fluorescence microscopy that was very good at generating terabyte-scale images, and just handling them was a problem. A while ago, I started working with Tobias Pietzsch in the lab of Pavel Tomancak, and with Anastasios Pavlopoulos, then a postdoc in Pavel Tomancak's lab too, on a way to bridge, or rather to fix, these limitations. That was MaMuT. MaMuT cannot be considered an extension of TrackMate; it's a new application that depends on TrackMate. The goal was really to harness the very large images you get either from automated microscopy (in my case, I was interested in bacteria entering the gut and causing disease) or from developmental biology with multi-angle data, such as SPIM. That was the Parhyale embryo imaging project, involving all these very nice people, and the whole dataset we had to track was seven terabytes, so there was no way we would find even a good computer with that much memory.
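The ring-based size estimate just described can be sketched on a synthetic radial intensity profile. This is a toy version of the idea, not TrackMate's implementation, and the drop criterion used here (a fixed fraction of the center intensity) is an assumption:

```python
def estimate_diameter(profile, drop_fraction=0.5):
    """Toy version of the ring-based size estimate.

    profile[r] is the mean intensity on the ring of radius r around the
    spot center. Walk outward until the intensity drops below a fraction
    of the center intensity: that radius is taken as the cell border.
    """
    center = profile[0]
    for r, value in enumerate(profile):
        if value < drop_fraction * center:
            return 2 * r  # diameter = twice the radius where the drop occurs
    return 2 * (len(profile) - 1)  # never dropped: report the largest probed size

# A bright blob of radius ~5 on a dark background: the intensity collapses
# between radius 4 and 5, so the estimated diameter is 10.
profile = [100, 98, 95, 90, 80, 30, 10, 5, 5, 5]
print(estimate_diameter(profile))  # → 10
```

The caveat from the talk is visible in the sketch: a bright neighbor would raise the intensity of an outer ring and prevent the drop from being detected, which is why the estimate is only reliable when nothing else is in the vicinity.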
Fortunately for us, at the time Tobias Pietzsch, whom you probably know, had developed a very good viewer for very large images: the BigDataViewer. MaMuT is literally just what you see on the screen: it takes the BigDataViewer to visualize and load the images, and TrackMate as the data model for the tracks. So it is literally tracking, not with TrackMate, but tracking on multi-view data with the BigDataViewer. It looks like this, and you see that the interface is really close to TrackMate; we didn't reinvent the wheel, except that the images are BDV, BigDataViewer, windows. The ultimate goal was to take one of the SPIM embryos you saw and build lineages like this one. One of the main interests of MaMuT is that, if you think about it, it's one of the few, or even the first, applications where you can track objects on multi-view data. In our case that was SPIM, so we had several angles of the same sample, but you can annotate cells in any of these views independently. And if you think about it, it doesn't have to be SPIM: it could be any correlative imaging modality, like CLEM or anything else. That's propelled by the BigDataViewer, which is really fantastic. And very courageous people, mainly Anastasios Pavlopoulos and Carsten Wolff on this paper, used the MaMuT we built to do exactly that: starting from an early embryo of a shrimp, they were able to backtrack all the cells that generate the digitations that grow here. If you know TrackMate, using MaMuT will be fairly easy, and if you have large data, it's probably a good tool to start with. MaMuT is available via a Fiji update site that you can find simply by going through the MaMuT webpage. There is documentation associated with it, and more in this paper.
I hope that answered the question we had. The next and final part of this talk, and it looks like I have about five minutes to do it, is about the future of TrackMate. Are there any questions on MaMuT so far? None coming in, okay. So Julien kind of asked me: what's going to happen with TrackMate, what's going to be its future? As a teaser, I would say: no future. TrackMate has no future, and that's mainly fine, because TrackMate is now a tool that's relatively old, stable, and, I would like to say, well maintained. But it's a tool of the present. You can use it now, and if you start developing an analysis pipeline or workflow that depends on TrackMate, you're safe, because TrackMate will not change that much. With friends, you know, Jan, Robert and a few others, we work on it and improve it incrementally, but it's a stable tool, a tool of today. The future of TrackMate is not TrackMate: it's Mastodon. Mastodon is a full rewrite that tries to harness very large data, larger still. Mastodon doesn't fully exist yet: it's not released, it's still a development project that has been going on for probably more than four years, and it is incredibly difficult to code and to release. I would just like to show you, as a conclusion, what it's going to be when we finish it. Basically, it's made to harness big data. By big data we mean very large images, but also very large numbers of cells. I told you that TrackMate has limitations: it probably cannot go to one million cells, and certainly not to one billion. Mastodon is a full rewrite specifically to address that, keeping the nice TrackMate features such as automated, semi-automatic and manual annotation of the data, point-wise editing, plus other nice things.
I'm going to be frank with you: this is mainly an effort from Tobias and myself, but in reality it's mainly Tobias's brain in there. I just get the chance to benefit from it and give nice talks, but the real driver of this project is really Tobias. We can already show you preview features, and there's a preview you can get in Fiji by subscribing to an update site called Mastodon preview. There are already things that work: the new data model is much more efficient, takes less space in memory, and is much faster to browse. On top of that, we were able to put back the nice features we had in TrackMate, but at a larger scale. All the movies I'm showing you now are real-time captures, and it's quite good at handling very large numbers of objects, so we are very happy with that. Of course, it's based on the BigDataViewer, like MaMuT, because so far that's the main technology we have in the Fiji ecosystem for large images. And there are plenty of things to allow a user to orient themselves among the very many cells you have: plenty of visual cues, animations, and data shared between views. So even if you have to track a full organism, you can still pinpoint individual cells. Interestingly, it took inspiration from video games: when it comes to interactivity and responsiveness, they are a great inspiration. There are special contexts, for instance... I'm going to skip that. What I really like is this: there's finally undo/redo when you edit tracks. So undo, and redo. Undo and redo are super useful when you have to track a very large sample: the ability to go back and fix mistakes is really great. It doesn't exist in TrackMate, and it's very painful that it doesn't.
As for TrackMate, since it's an academic open-source software, its only interest compared to commercial software is that it can be extended by yourself. So you can always add, for instance, feature calculators like this one: there's a plugin mechanism, and this will be preserved. There's a table to browse things and to annotate with tags; everything is made to facilitate that. There's the extension mechanism I mentioned, so if you know Java, again, you can extend it. And contrary to MaMuT, there's fully automated tracking of cells, even for large data, on standard hardware, no GPU involved. In the future, with the contribution of another collaborator on this, if we get lucky, we will reinstate a 3D viewer for tracks and images in Mastodon. So this would be the TrackMate future: I would say it's Mastodon. It already took us a huge amount of energy, pain, sweat and time just to get there, and it still requires more. Please be patient with it. If you want to test the preview, it's in Fiji, but it's unsupported at this time. So, voilà. Okay, finally, I reached the last slide of my presentation. I hope you found something that was useful to you; I sincerely do. I'm available all afternoon and evening to answer questions, if you have some now. Again, thank you very much for your attention, and a special thanks to my courageous co-panelists here, sitting patiently, listening to me rambling again. And sorry for the small mishaps and mistakes I made. Thank you very much, Jean-Yves. I think we have a few follow-up questions. Robert, do you want to start? You're already unmuted. Yes, there was one question: is there a feature to estimate Brownian motion versus directed motion? Yes, but not in Fiji. I made something like this in MATLAB: you would generate the tracks in TrackMate and then import them into MATLAB.
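For readers who want to try this without MATLAB: the standard quantity for separating Brownian from directed motion is the mean squared displacement (MSD), which grows linearly with the time lag for pure diffusion and quadratically for directed motion. A minimal, generic sketch in plain Python (this is not the tutorial's code):

```python
def msd(track, max_lag=None):
    """Mean squared displacement of a 2D track (list of (x, y), one per frame).

    Returns a list where entry i is the MSD at time lag i + 1, averaged
    over all pairs of points separated by that lag.
    """
    n = len(track)
    max_lag = max_lag or n - 1
    out = []
    for lag in range(1, max_lag + 1):
        d2 = [(track[i + lag][0] - track[i][0]) ** 2 +
              (track[i + lag][1] - track[i][1]) ** 2
              for i in range(n - lag)]
        out.append(sum(d2) / len(d2))
    return out

# A purely directed track: constant velocity along x.
directed = [(0.1 * t, 0.0) for t in range(100)]
m = msd(directed, max_lag=10)
print(m[1] / m[0])  # ≈ 4: doubling the lag quadruples the MSD (quadratic growth)
```

For a Brownian track the same ratio would hover around 2 (linear growth); fitting the MSD curve, as done in the MATLAB tutorial mentioned here, makes that distinction quantitative.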
Do you want to see it? It's documented. We would have one minute of time, but you can also post the link later on the forum. Okay, I will post the link later; there's a documentation tutorial on that. Right, I will follow up with another question then. Can you recommend a good resource or repository of publicly available images to try out TrackMate, apart from the sample image that is included in Fiji? No, I don't know. I've been working with my users' images as test datasets and I don't know of a repository, sorry for that. But maybe other people know, the panelists or the audience. Yeah, I don't know what you think about the Cell Tracking Challenge; there is a lot of data there. Or the Single Particle Tracking Challenge, right? Yes, for this kind of data. I can put some links in the chat. Okay, thank you. Are we good? Well, maybe one more: is there anything relying on the GPU, is GPU processing available or planned? No. It's hard to make a short answer. No, and I'm very happy about it; I'm going to explain why. Initially, MaMuT, Mastodon and TrackMate were made as tools that should work out of the box. They should work on servers, on Linux, everywhere; everything is Java, so there should be no problem running Mastodon anywhere. As soon as you make a tool that depends on the GPU, you commit to very difficult support for your users. Robert can tell us a few words about that; maybe Robert has never met any problems with his tools, but it's very hard to support GPU code, because plenty of errors and bugs come from the GPU itself, which is very, very hard to debug, and it depends on libraries that are very technical, so it requires a lot of skill and time.
For instance, if I try, on this very laptop, to use a deep-learning plugin, my laptop immediately crashes for no reason: no error, no error message, nothing. So we put a lot of effort into having something that's efficient, even with a smaller number of features, but that just runs on the CPU, so that we're sure it runs everywhere and that it's simple to support. May I comment on that, Jean-Yves? Please. I'm convinced that GPU code always runs without any... no, joke aside, we are apparently going towards an age where the CPU is this one thing and the GPU is this other thing, and more and more stuff will run on the GPU, eventually also TrackMate and Mastodon at some point. But at the moment we don't have the support, because, as Jean-Yves pointed out correctly, it is a lot of work to have it running on many systems, on all computers, clusters, in the cloud and so on. We will get there at some point, step by step, plugin by plugin, and then we will have a great time on GPUs. But in the meantime, it's important to have something that works out of the box, right? And after that, you can take this extension approach: you have something that works on the CPU out of the box, and then you build the next brick with the GPU. Maybe, if I may, I'll forward one more question here: is there a way to export the jet color table, or any of the other visualization color tables? I didn't think about that, to be honest. The only thing I see that could do that would be to take a screenshot of the last TrackMate window, where the color scheme with the scale is drawn. Yeah, maybe another possibility would be a script interacting with the API directly and getting the values, right? So it's possible, but difficult. Yes. There is also a recurrent question about 3D: you did not show anything about 3D. Could you say a few words about tracking in 3D?
What would be the cost, or what would be the... yeah. So, TrackMate works the same in 2D and 3D. If you have a 3D image, TrackMate is just going to work out of the box with it; there's nothing special to do. Just check that the dimensionality is right, that you have 3D stacks over time and not just one big Z stack, and it's going to work. The only thing is that the 3D viewer, the visualization in 3D, does not work anymore since an update of Java or something like that. Also, a feature that not many people know: TrackMate works in 1D, 2D and 3D. So if you have an image which is made of just one line that you follow over time, TrackMate will work on that too. Oh, I'm just looking at the number of participants: I expected 50, which is why I even registered myself, so that at least there would be one person. So, I think there are a few remaining open questions, but they can probably be answered in the summary post that we will put on the Image.sc forum later on. Okay. But that's about it. Thank you very much for your patience with me and your time here. I sincerely hope everybody found something useful today, and at best, let's meet at the next NEUBIAS event or on the forum itself. Thank you.