Okay, let's get started. Welcome, everyone. I am from Thomas Euler's lab, and as your host for today, I would like to once again begin by thanking Tim Vogels and everyone who keeps these worldwide online seminars running so smoothly. With that, allow me, of course, to turn back to the reason we all gathered here today and introduce our guest from the Max Planck Institute for Biological Intelligence, Dr. Lisa Fenk. Lisa did her master's studies in physics and also studied a bit of biology, both at the University of Vienna, before her doctoral research at the Department of Neurobiology there, where, working with Axel Schmid, she investigated the performance of the visual system of nocturnal hunting spiders. Her first postdoctoral years were in Vienna with Andrew Straw, before moving in 2015 to Rockefeller University in New York and the lab of Gabi Maimon. And in 2021, as part of the Lise Meitner excellence program, Lisa joined the MPI for Biological Intelligence near Munich and started her lab, where they aim to utilize advanced methods from genetics and physiology to behavior, together with the model organism Drosophila, to build on the recent and very exciting finding that these fruit flies move their retinas via tiny muscles. Personally, I'm very excited to have Lisa with us today, as I imagine all of you on the audience side are, and to hear about her discoveries in her talk, entitled "Active Vision in Drosophila". So without any further ado from my side, please all welcome Dr. Fenk. Lisa, the stage is officially all yours.
Okay, well thank you very much for the kind introduction and also thank you very much for the invitation. This is a fantastic initiative and I'm super happy that I can talk here today. And I apologize in advance, I'm recovering from a cold, so I hope my voice holds and that you can all still understand me. So I'm just going to share my screen. Okay, so I'm generally interested in active vision, or active sensation, and more precisely in how active movements of animals interact with sensory processing. You can imagine that when animals move through the environment just to get from point A to point B, that movement generates visual input on the retina that the animals somehow have to interpret correctly. So this poses a challenge to the visual system. But then on the other hand, animals also move their eyes or heads on purpose to generate visual input. And actually many species do that. These are just two very well studied examples of eye movements. The first are so-called fixational saccades in the primate. Those happen when a monkey fixates a point of interest. In that case, the eyes are not completely stable, but there are small, fast eye movements around that fixation point. The second is a stability reflex called the optokinetic reflex, which can be elicited by moving external visual motion around an animal. That reflex consists in the eyes following the external motion and then snapping back from time to time. This is an example in a zebrafish larva, but you can find it in fish, in birds, in mammals, including ourselves. And in fact, when you view this stationary screen, your eyes are basically never stationary. They dart from one fixation point to another in large saccadic movements. And this is what such eye movements look like in a human observer viewing the bust of Nefertiti, in a classical example by Yarbus.
And all of those eye movements obviously come with benefits for visual processing, but they also come with a challenge, in the sense that they result in visual motion on the retina. And we are not aware of that. We constantly move our eyes, but we don't see the motion on our retina, and the visual world seems stable. And this is one aspect that I'm very interested in: what the brain mechanisms are that allow for this stable visual percept despite our ever-happening eye movements. In my postdoc time in the Maimon lab at Rockefeller University, I was working together with Anmo Kim on a mechanism in the fruit fly that is very closely connected to that concept. In a way, we found a mechanism that we think allows the flies to momentarily ignore self-generated visual input. To start that story, I want to remind you that flies have a very strong gaze stability reflex called the optomotor response. Basically, anytime you put a fly in a visual arena and you show visual motion around the animal, the fly is very likely to turn in the direction of optic flow, either by walking or flying in that direction or by moving its head. So in that case, for example with rightward optic flow, the visual system would respond and the motor system would be triggered to perform a rightward turn. But then of course, animals not only move in response to external visual motion, they also move spontaneously or voluntarily. So imagine a fly in a stationary environment that somehow takes the decision to perform a leftward turn. What happens then is that this leftward turn elicits rightward visual motion on the retina, the visual system is triggered, and an optomotor response to the right is triggered, opposing the voluntary left turn. In a way, this can be considered a reflex trap: strong gaze stability reflexes basically trap animals and oppose any voluntary, spontaneous movements. And this has been recognized very, very long ago.
And von Holst and Mittelstaedt came up with an idea of how this reflex trap could be escaped. They proposed that with every efference, with every command to turn, an efference copy could be sent to the visual system to cancel out the consequence of the turn. So the efference copy would cancel out the visual system's response, no optomotor response would be triggered, and the reflex trap would be escaped. That was the theoretical idea. In the Maimon lab, we then recorded from neurons in the visual system to look for signatures of such putative efference copy signals in the fly. We did this in a very specific set of visual neurons, the so-called horizontal system cells and vertical system cells. Those are probably among the best studied neurons in any invertebrate, and they are well known to respond primarily to wide-field motion, so when large parts of the visual field move in synchrony around the fly: the horizontal system cells to horizontal motion and the vertical system cells to vertical motion in different parts of the visual field. And it was recognized basically ever since their discovery that those neurons are very well poised to signal the movement of the fly through its environment, and they are therefore very good candidates for underlying those gaze stability reflexes. So we first wanted to check whether we could really pinpoint the behavioural role of those neurons in gaze stability reflexes. We did this by gluing flies to little tungsten wires, putting them in an LED arena, the Reiser lab LED arenas, and showing them visual motion. I'm going to play the movie, and you're not going to really see the wings because the frame rate is too low to visualize the wings very well. But I hope that you will be able to see that the head is moving in synchrony with the grating. And the reflex is so strong that you can basically make the head dance with this kind of visual motion. So this is a wild-type fly.
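The von Holst and Mittelstaedt cancellation scheme from a moment ago can be put in toy form. This is a minimal sketch with invented numbers, not measured data: a motor-related signal of opposite sign and matched amplitude nulls the visually evoked response that a self-generated turn would otherwise produce.

```python
import numpy as np

# Toy sketch of efference-copy cancellation (all numbers illustrative).
t = np.linspace(0.0, 0.2, 200)                    # 200 ms around a flight turn
visual = 5.0 * np.exp(-((t - 0.1) / 0.03) ** 2)   # mV, response to self-generated motion
motor = -5.0 * np.exp(-((t - 0.1) / 0.03) ** 2)   # mV, motor-related input:
                                                  # opposite sign, matched amplitude

net = visual + motor          # what the cell would report in free flight
print(np.max(np.abs(net)))    # 0.0: no optomotor response, reflex trap escaped
```

With an exactly matched motor signal the net response is zero; in the recordings described below the match is only approximate, so the cancellation would be partial rather than perfect.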
And what we then did was silence the horizontal and vertical system cells by expressing an inwardly rectifying potassium channel, Kir2.1. So we were hyperpolarizing those neurons, and they were unable to respond normally to the visual motion. And when we did that, we saw that indeed those flies did not show the same extent of head movement. So the head movements here in these red traces, which are the experimental flies, were strongly reduced. They're not entirely gone, so there might be another pathway, but this clearly showed that HS and VS cells are involved in the gaze stability reflexes of the head. So we went on to look for signatures of potential efference copies in those neurons. And in fact, this is data by Anmo Kim, the first author of that paper. The logic is the following. Flies fly through the environment in straight trajectories that are interspersed with fast flight turns. And those flight turns mainly happen in the horizontal plane, so the consequence of a flight turn is mainly horizontal motion. We therefore performed whole-cell patch-clamp recordings in those neurons and measured the mean response to the expected visual consequence of a flight turn, that is, to horizontal motion. And you see that all of those nine neuronal types respond to some extent: as expected, the horizontal system cells most, and then the vertical system cells to a varying degree. By the logic that I have outlined before, it would be beneficial for the fly, to escape this reflex trap, to momentarily ignore these visual responses to the self-induced motion. So we looked at what happens in those neurons when the flies are attempting a flight turn. Now we are not looking at the response of the neurons to visual motion, but just at what happens at the moment when the fly is attempting a turn. The flies are obviously rigidly tethered, because we are performing patch-clamp recordings, but we can pinpoint moments where we think they attempt a flight turn.
And at those moments, we saw motor-related signals coming to the same neurons that are opposite in sign and roughly matched in amplitude to the visual signals. So we think that if the fly were really free to fly around, these motor signals and the visual signals would cancel each other out, and that those motor-related signals in fact function as cell-specific, quantitatively matched efference copy signals. We were really excited, or I am still very excited, about these signals, and we would like to find out where in the brain they originate and what the mechanisms are by which they are distributed in this cell-specific manner. In a follow-up study, we investigated whether those signals also come during escape turns, because the signals I just described were all measured for spontaneous flight turns. And at least in this one cell group, we saw the same sort of motor-related signals coming in. And there is an extended dataset that was published by the Schnell lab a year later. So what you can take away from that is that if you swat at a fly next time, there is this beautiful mechanism unfolding in its visual system that allows flies to escape that efficiently. And so, as I said, we are still very interested in how these signals are generated. But before diving into that question, we paused and asked whether it is not perhaps possible that flies, like other animals, move their eyes as well. This was important for us to understand, since eye movements might come with an entirely different set of motor-related signals. And this is not how, classically, the field thinks about insect eyes, but on the other hand, it is known that other arthropods move their eyes. And as you have heard in the introduction, in my PhD I studied eye movements in spiders. Spiders are well known to move their frontal eyes. And spider eyes and human eyes are at least superficially quite similar: they both have a lens apparatus and a retina behind it. Insect eyes are very different.
They are so-called facet eyes, or compound eyes, and they consist of hundreds to thousands of lenses that each have a set of photoreceptor cells underneath. And classically, it has been assumed that those eyes move in perfect concert with the head. However, already in the 70s, Roland Hengstenberg found a muscle that attaches to a particular cuticular ingrowth inside the fly head and then runs and attaches to the frontal part of the retina. And he suggested that this muscle is very well poised to twitch the photoreceptors underneath the lenses. And this was in Musca. Twenty years later, Nicolas Franceschini found a second muscle. So there are two muscles that attach to these eyes. And he showed, in a series of conference proceedings, that those two muscles move the photoreceptors underneath the lenses. Interestingly, although flies were already a very important model organism for visual processing at that time, these very seminal studies have not really been followed up, and this was basically all that was known. And so we were, of course, extremely interested in knowing whether Drosophila, our model organism, also has eye muscles. And they sure do. So this is a stack that Igor Siwanowicz, our collaborator at Janelia Farm, provided. And you see here these two muscles, which he has stained using phalloidin, that attach to this particular membrane that surrounds the photoreceptors. And this is how he draws the 3D configuration inside the head. So there's this one muscle that runs from the frontal part of the retina to this particular ingrowth, like the muscle that Hengstenberg was drawing, and this other muscle that attaches to the frontal part and then attaches close to the internal cup. So there certainly are two muscles. And we were wondering, would there be any benefit for flies to move their eyes? I just told you flies have these facet eyes with many, many lenses, each with a set of photoreceptors underneath. So there are always eight photoreceptors.
One is stacked on top of another one, so if you cut through such a unit, you would always see seven photoreceptors. And flies basically see the entire world, I mean, not absolutely 100%, but very, very large parts of the world at the same time. And many flies, including Drosophila, do not have a very well developed fovea, so resolution is basically homogeneous all around at the same time. And of course our own eye movements often serve the purpose of repositioning the gaze of our fovea. So the question was, why would they move the eyes? We got inspired by published work in other model organisms, and we came up with a list of putative functions. So for example, eye movements are proposed to counteract photoreceptor adaptation: if you have a completely stationary eye viewing a stationary scene, the visual percept tends to fade out, and one can imagine that twitching the photoreceptors would bring that percept back. Second, and I think very excitingly, eye movements could help the discrimination of fine spatial detail. If you imagine again a stationary eye in front of a stationary environment, the resolution is harshly limited by the angle between the photoreceptors, which in Drosophila corresponds to the angle between these units, the so-called ommatidia. But you can imagine that if you can move the photoreceptors over time and sample different points in space at different time points, and if there were a neural substrate that would allow you to compare those measurements, you could increase the spatial detail you have access to. And flies and other insects would hugely benefit from any mechanism to increase spatial resolution, because their eyes are diffraction-limited, and so they have much worse resolution than vertebrate eyes of the same size. Then of course, eye movements could help image stability or gaze stability, or they could maybe, in some dynamic way, provide depth cues as well.
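The resolution argument can be illustrated with a toy sampling sketch. This is purely schematic (arbitrary units, optics and photoreceptor blur ignored): shifting a coarse receptor grid by sub-receptor offsets over time yields a finer effective sampling of the scene, provided downstream circuitry can combine the time-shifted samples.

```python
import numpy as np

# Schematic only: a static receptor grid with spacing 0.5 (arbitrary units)
# samples a scene; retinal micro-movements shift the grid between samples.
scene = lambda x: np.sin(2.0 * np.pi * 1.5 * x)   # fine spatial detail

receptors = np.arange(0.0, 4.0, 0.5)              # static grid, spacing 0.5
shifts = [0.0, 0.125, 0.25, 0.375]                # sequential sub-receptor offsets

# Pool the time-shifted measurements into one spatial sample set.
samples = sorted((x + s, scene(x + s)) for s in shifts for x in receptors)
positions = [p for p, _ in samples]
print(positions[1] - positions[0])                # 0.125: 4x finer effective spacing
```

The catch, as noted above, is the neural substrate: the gain is only real if the time-shifted samples can actually be compared downstream.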
And this is probably not an exhaustive list of putative functions, but those were the things that we could come up with by reading the literature. And so we were wondering, can we see any retinal movements? The first thing we tried was water immersion. Basically, we glued flies to holders that we normally use for whole-cell patch-clamp recordings, but turned the head 90 degrees. And because water and the biological material of the lenses have a very similar refractive index, you can see through the lenses onto the photoreceptors. First, I'm showing you an optical section up here, so you're going to see a cut through the lenses at the surface of the eye and some pigment cells. I'm now going to focus further down to that layer, and I hope you're going to see some whitish grape-like structures; those are the photoreceptor tips. And I hope that you can see them twitching around. There's dirt on this slide, and it's not the best movie, but this was the first time we could visualize retinal movements. It was a very exciting moment for us. However, water immersion is not the best way to study retinal movements, because it's relatively hard to show visual stimuli to the eye while at the same time measuring the retinal movements. So we made use of a trick of the insect eye, or of the fly eye, which is called the deep pseudo-pupil. The deep pseudo-pupil is called like that because it looks a little bit like a pupil. It looks like some darkish or whitish spot on the eye, but it has nothing to do with a vertebrate eye pupil. It's in fact an enlarged virtual image of the tips of the photoreceptors, and it lies deep inside the eye, so you have to focus into the eye to see this deep pseudo-pupil arising. And it's actually really easy to see. It's sufficient to shine light through the head of the fly, and we often use infrared light or far-red light because the visual system of the fly is not very sensitive to that.
And then it's sufficient to point an air objective at the eye, and you'll see this deep pseudo-pupil arising. So this is the compound eye, this is the head of the fly, and this is the deep pseudo-pupil. And if you are so inclined, you can even count the seven whitish dots for the seven photoreceptors. So if the head is rigidly tethered and does not move, and this pseudo-pupil moves around, it tells us that the photoreceptors themselves have moved. We roughly know the separation of the photoreceptors in the eye, so we can even calibrate roughly how much they have moved in angular space. And so we built, no, before we built an arena, we actually made sure that we were thinking about this correctly. We tried to image both the deep pseudo-pupil and the retina in water immersion in the same eye at the same time. So the top, dorsal part of the eye is seen in water immersion, and the lower part is seen here with this camera, where we focus on the deep pseudo-pupil. And those images are very bad; it's really hard to do both things at the same time, but I hope it's good enough to convince you that those movements are very, very well correlated and that we can use the deep pseudo-pupil to infer movements of the retina. And so we built an arena that allows us to look at the two pseudo-pupils at the same time, so two cameras are pointing at the fly. The flies are flying or hanging on a tungsten wire, so we glue them to this little wire. In that case, we glue the head to the thorax, so the head doesn't move independently, and we also measure the shadows of the wings. And at the same time, using the Reiser LED panels, we can show visual motion. And when we showed a grating moving around the fly, we saw these very reliable movements of the two pseudo-pupils. I'm not sure how well this comes across in this zoomed-out version, but we can zoom in. So now I'm highlighting the centroid of the right and the left pseudo-pupil.
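The calibration logic just described reduces to a simple ratio. All numbers below are hypothetical, and the roughly 5 degree interommatidial angle is only an approximate figure for Drosophila: the spacing of neighbouring photoreceptor images in the deep pseudo-pupil corresponds to roughly one interommatidial angle, which turns a pixel displacement into an angular one.

```python
# Hypothetical calibration of pseudo-pupil displacement to angular space.
INTEROMMATIDIAL_ANGLE_DEG = 5.0   # approximate figure for Drosophila (assumption)

receptor_spacing_px = 12.0        # spacing of adjacent pseudo-pupil dots (made up)
pupil_shift_px = 2.4              # measured centroid displacement (made up)

# One receptor spacing in the image corresponds to ~one interommatidial
# angle in the world, so the ratio converts pixels to degrees.
shift_deg = pupil_shift_px / receptor_spacing_px * INTEROMMATIDIAL_ANGLE_DEG
print(shift_deg)                  # ~1 degree
```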
And I'm going to draw down here the horizontal excursion over time. And I hope you'll be able to see that whenever the grating moves to the right, the retinas move to the left, and vice versa. So this was really cool for us to see. And it's actually a very reliable behavior: basically you see this in every eye of every fly that you put in this arena. It's even kind of spooky, because sometimes flies are just hanging on this tungsten wire and there's no way of telling if they're alive or dead, but then when you look at the retina, you see them moving around. And if you go through all the signs, the sign flip of the lens, et cetera, those movements are consistent with slowing down the relative speed of the image on the photoreceptors. So we think that in function they are basically an optokinetic reflex, very closely related to the optokinetic reflex I showed you in the zebrafish larva in the beginning. And when the flies are flying on this little tungsten wire, then everything gets a little bit more lively, and you see more of the snap-back saccades as well. Unlike ourselves, for example, these flies can move their eyes completely independently. So now we are just going to stimulate the left eye, and then only the left eye is moving and the right eye is very, very stable. And then at some point it's going to switch, and the right eye is going to be stimulated. This is also quite impressive in the average, where you really see that the one eye doesn't budge, and vice versa. So we think that this is a very reliably elicited optokinetic reflex. And then we were wondering if flies maybe also move their eyes in spontaneous, non-stimulus-triggered ways. So this is an example of a fly flying in a stationary environment, so there's no acute visual stimulus, and yet we see these jumps happening. Down here is the left-minus-right wing-beat amplitude, a measure of attempted turning, and you can ignore that for now.
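Going "through all the signs" works out as follows in a minimal velocity sketch (the velocities are invented for illustration; the lens inversion is the key step):

```python
# Rightward grating motion, leftward retinal movement: does slip shrink?
stimulus_v = 30.0      # deg/s, rightward external motion (invented value)
retina_v = -6.0        # deg/s, photoreceptors shift leftward under the lens

# The lens inverts the image, so a leftward shift of the photoreceptors
# moves the viewing directions (the gaze) rightward, following the grating.
gaze_v = -retina_v
slip = stimulus_v - gaze_v           # residual image motion on the receptors

print(abs(slip) < abs(stimulus_v))   # True: the reflex reduces retinal slip
```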
But here again is the position of the retina, and you see these jumps happening. Then at other times they are quite stable for a while, and then those little jumps happen again. And I say little jumps because if you plot the x-y position of the centroid of the deep pseudo-pupil, you see that those stable positions are very often separated by less than one degree, which is a fifth or sometimes a tenth of the inter-receptor angle in flies. And we are of course very interested in why flies would perform such movements, right? One idea would be that they could use them to refresh the image, as mentioned before, and/or to scan space in finer detail than their receptor spacing would otherwise allow, and so increase their spatial resolution. And this is something that we really hope to find out in the future. It's also kind of striking that superficially those movements look like the fixational saccades in the monkey that I showed you earlier on. And of course monkeys have a fovea, and lots of what is happening there, we think, is connected to the fovea itself. So I think it's going to be interesting to understand why flies are making very similar eye movements with a completely different eye. But for the moment, when we discovered that, we were actually kind of wondering if there would be any impact whatsoever, because those movements are tenths of the receptor angle, and it would have been entirely possible that the visual system doesn't care at all, that this is just something that happens and is too small to have any effect. To investigate that, we performed again whole-cell patch-clamp recordings in horizontal and vertical system cells. These are the exact same neurons that I mentioned earlier on when I was explaining the efference copy signals. In that case, we were recording from them mainly because we know so much about them, for no specific reason other than that.
And this is mostly data from Aditya Nair. So this is a classic example of how a horizontal system cell responds to horizontal motion in two directions: you see a hyperpolarization for one motion direction and a depolarization for the other motion direction. This is extremely well known and well described. It got really interesting when we looked at those neurons in the context of a stationary environment, because here what one would expect is a flat, stable baseline. And yet we see these depolarizations happening that are sometimes really large. So this is the same recording, same y-axis, and you see that those depolarizations go up to the response to the preferred visual stimulus. And when we plotted the retinal position on top of that, we saw that those depolarizations seemed to coincide with these small retinal jumps. So in order to analyze it further, we picked the moments of those saccades and averaged them. We first averaged them for one fly and then over a population, to draw averaged retinal jumps in two directions. And then we could ask what happened at those times to the membrane voltage. And what we found was that there was indeed a bidirectional membrane voltage change: a depolarization in one case and a hyperpolarization in the other case. And if you again go through all the signs, those membrane voltage changes are consistent with the retina moving in front of the stationary grating, which generates relative motion, to which the visual neurons then respond visually. And this is also consistent with the fact that this response is gone when you switch off the lights. Again, same recording, same y-axis. If you look carefully, you might wonder why this is not entirely flat: it's completely dark, but there's still something going on. So there's this initial hyperpolarization here and depolarization here that seems to be independent of the visual environment.
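The averaging step described here is a standard event-triggered average. A self-contained sketch on synthetic data (every parameter invented) shows the recipe: cut a snippet of membrane voltage around each detected retinal jump and average across events, so that the jump-locked component emerges from the noise.

```python
import numpy as np

# Event-triggered average on synthetic data (all numbers invented).
rng = np.random.default_rng(0)
vm = rng.normal(0.0, 0.5, 10_000)            # noisy baseline Vm in mV, 1 kHz
saccade_idx = np.array([2000, 4500, 7000])   # detected retinal-jump times

# Pretend each jump adds a small depolarizing transient to the recording.
transient = 5.0 * np.exp(-np.arange(300) / 80.0)
for i in saccade_idx:
    vm[i:i + 300] += transient

# Cut a window around every event and average across events.
win = np.arange(-200, 400)                   # samples relative to the jump
sta = np.mean([vm[i + win] for i in saccade_idx], axis=0)
print(sta[250] - sta[0])                     # clearly positive: a depolarization
                                             # locked to the jump
```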
And we think that this dark component is maybe a mini motor-related potential, and we're going to follow that up in the future. But the takeaway from that data now is that retinal movements that are a tenth of the inter-receptor angle still lead to a bidirectional response in visual neurons. So flies care about these small spontaneous movements. But then on the other hand, there are also large movements; in natural behavior we can see movements up to 5 to 10 degrees, so this is like 1 to 2 inter-receptor angles. And we were wondering whether that leads to the expected shift in the receptive fields of visual neurons, and whether vision is even maintained. Because if you remember the 3D animation by Igor Siwanowicz, you saw that the muscles attach to this frontal part of the retina, and it isn't entirely clear, if the muscles pull on this retinal sheet, whether the organization of the retina is well maintained. So we wanted to see: can flies still see when they make large movements, and does it lead to the expected shift in receptive fields? In order to address that, we performed again whole-cell patch-clamp recordings, this time in so-called LC14 neurons. Those are neurons that connect the two lobulae in the fly brain, and they have front-facing, relatively confined receptive fields. And the idea was: if the retina moves, the receptive fields should shift. So if we move a stripe around the fly, to which those neurons respond, we should see a shift of the response in angular space. In that case, we didn't wait for spontaneous movements to happen; we wanted to be more efficient and have reliably the same movement, and so we made use of optogenetics. We have access to the motor neuron that innervates the muscle, here shown in green. So this is the motor neuron, this is one of the muscles, and we can express CsChrimson, a red-shifted channelrhodopsin, and then activate this motor neuron using red light.
So here I hope that you'll be able to see those pseudo-pupils of both eyes moving when we shine light onto the fly. We can then analyze and quantify that, and I'm showing you here the position of the left pseudo-pupil in orange and the right in blue. You see that whenever the light goes on, the retinas, the pseudo-pupils, move to the front; we can hold them there, and when the light goes off they go back. And then we can show our stripe moving around in two directions and at the same time perform whole-cell patch-clamp recordings. As you can already see in these example traces, vision seems to be maintained, right? The neuron responds to the stripe independent of the position of the retina. We can then ask whether this leads to the expected shift in angular space: we just average trials without the optogenetic light on and trials with the optogenetic light on, so this is without and this is with, and you can see that indeed this shift in the receptive field is happening. In fact, here this is roughly 15 degrees, because with optogenetic activation you can pull the retina even further than flies tend to do themselves. So this is a shift of about three times the inter-receptor angle, and we are confident that even for movements of this amplitude, vision is still maintained. The take-home so far is that flies perform stimulus-triggered as well as spontaneous retinal movements. The stimulus-triggered ones, we think, are a compound-eye version of a very important vertebrate reflex, the optokinetic reflex, because when the head is fixed, these compensatory retinal movements stabilize the image of the environment on the retina. For the more spontaneous movements, we don't yet know what the flies use them for.
There are many possible functions, because there are movements in different directions, but we started with one hypothesis, namely that the retinal movements might aid depth perception. And we were inspired by a beautiful paradigm from the Strauss lab, with which they showed that flies can adapt their behaviour when crossing gaps, depending on the size of the gap. If the gap is small, they can do very acrobatic maneuvers to launch themselves to the other side; if it is too large, they rather turn around and take a detour. So they must somehow estimate the width of the gap. And in Drosophila, the distance between the two eyes is very small, and classical stereopsis is thought to be very limited, probably not sufficient for this kind of estimate. Other insects are known to use other strategies to estimate distances; some, for example, make peering movements and exploit motion parallax. So we thought that the retinal movements might help flies to estimate distances. To test that, we printed a catwalk that had gaps cut into it, let flies cross it, and filmed them with cameras. And to dampen eye movements, we expressed Kir in the eye-muscle motor neurons, using, I think, two different driver lines. The nice thing was that both driver lines were very sparse, really labeling the motor neurons specifically. On the other hand, they were also kind of weak.
So when we expressed the Kir, eye movements were not entirely abolished. They were reduced, but not entirely abolished. But we used those two driver lines and let the flies walk over the catwalk, and then used DeepLabCut to trace the trajectories. Those are two examples of crossing trajectories: one for a fly that was crossing mostly on top, and another one for a fly that was crossing sometimes on top and sometimes walked around in detours. And actually this pattern, that the Kir flies would walk around more often, held over a population of flies for the two different driver lines. So whenever we dampened eye movements, it seemed that the flies were a little bit more clumsy, or a little bit less efficient, in crossing the gaps. And so this maybe hints in the direction that they use eye movements for distance discrimination. And this would fit actually very well with the discovery of what the retinas are doing when the flies cross the gaps. To see that, we had flies walking on a little walking wheel that had two wedges cut into it that served as gaps, which allowed us to look at the retinal position and at the same time track the trajectory of the wheel. And this is an example trace, and this is actually data collected by Gabi Maimon. What you see here is the wheel position down here; the vertical lines indicate whenever the fly has crossed the gap, and on top you see the two example traces of the right and left pseudo-pupil. You might already be able to see that whenever the fly crosses the gaps, there's this large movement happening, and it's a large convergent movement. One can average over all these gap-crossing events and it still holds: you see an averaged convergent movement of the eyes in that fly, and it holds again over a population of flies. So when flies cross a gap, they make a convergent movement, and if you dampen eye movements, flies are a little bit more clumsy in crossing the gaps.
And this is of course not hard proof or anything that flies use eye movements for 3D vision, but it would point in that direction. And this is something again that we want to follow up on in the future. And one can think about how flies might use eye movements to do that. And one way would be to use them as a sort of dynamic binocular ruler, in which confined, front-facing receptive fields are swept through the frontal space of the fly. And then, after some coincidence detection, you could try to triangulate the distance. And candidates for underlying such a computation would for example be the neurons that we used to show the shifting receptive fields, because they have a confined receptive field that is front-facing, and they connect the two optic lobes. And with that I would like to summarize what I have shown you today. So first I wanted to show you that there are motor-related inputs in the fly that inform the visual system about ongoing behavior. And more precisely, I showed you that there are efference copy signals, or signals that look like efference copy signals, during fast flight turns. And I only told you about the data that I was involved in, but there's a whole set of modulations that come to the same visual neurons during walking, which interestingly have the opposite sign and which help in straight walking. And this is work from the lab of Eugenia Chiappe. So this is just to tell you that the visual system is modulated by ongoing behavior. And then I showed you that Drosophila has two muscles that attach to the retina and that it uses those muscles to perform retinal movements. Some, not all of them, some of them look like vertebrate eye movements. And the visual system cares even about the smallest of those movements. And I then showed you that we think that maybe these retinal movements might aid depth perception, because flies make this peculiar convergent movement when crossing gaps.
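The "dynamic binocular ruler" idea above boils down to simple triangulation: if the two front-facing lines of sight converge on a target, the vergence angle between them encodes the target's distance. Here is a minimal sketch of that geometry in Python; the baseline and angles are made-up illustrative numbers, not measured fly values, and this is only the abstract computation, not a claim about the actual neural implementation:

```python
import math

def distance_from_vergence(baseline_um: float, vergence_deg: float) -> float:
    """Distance to a fixated point, given the vergence angle between two eyes.

    Symmetric triangulation: the two lines of sight meet at the target,
    so d = (baseline / 2) / tan(vergence / 2).
    """
    half_angle = math.radians(vergence_deg) / 2.0
    return (baseline_um / 2.0) / math.tan(half_angle)

# Hypothetical numbers for illustration only: a 300 um interocular
# baseline and a few degrees of convergence.
for vergence in (2.0, 4.0, 8.0):
    print(f"{vergence:>4.1f} deg -> {distance_from_vergence(300, vergence):.0f} um")
```

Larger convergence corresponds to a nearer target, which is at least consistent with the convergent movements observed during gap crossing.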
And I think in the future it's going to be very interesting to think about how this would interact with photomechanical contractions. I imagine that many of you have seen the talk by Mikko Juusola earlier this year in the same Sussex Vision series. And his lab also suggested that those photomechanical contractions could aid 3D vision. So it's going to be interesting how those two things interact. And with that I would like to thank you for your attention, and I would like to thank my lab. Those are basically the very, very first members that joined in the last year, and they enormously helped in building the lab, bearing with me, and getting the first interesting data, and I'm very, very grateful to all of them. Then I want to thank the Maimon lab, first of all Gabi, for being an incredible mentor over the past years; Anmo Kim, who is now a group leader at Hanyang University, with whom I collaborated on the efference copy projects and who taught me a lot when I joined the lab; and then all of those people who have directly contributed with data and help to the retinal movement papers: Sophia, Jess, Aditya, Thomas, Stephen and Tatsuko, and our very important collaborator Igor Siwanowicz at Janelia Farm. And then I want to thank the funding agencies and the Max Planck Institute for Biological Intelligence for a very warm welcome in my first year. Thank you so much, Lisa, for this very interesting talk on your really astonishing findings. I cannot help but voice how jealous I feel sometimes when I see what amazing things you can do with Drosophila and how accessible they are. So let me go ahead: there are people thanking you on YouTube, and since you can't see the chat, I'm conveying this to you. I'm already posting the link in case they want to keep us company, and until they post their questions I will take advantage of the opportunity to ask a couple of mine.
So maybe it's a very stupid first one, but from my understanding, in the Drosophila retina the photoreceptor subtypes are aligned along shared optical axes, right? So when you move the retina, do they still remain in registration? So you're asking if the neural superposition is maintained, right? Yes, exactly. So I think that was partly why we did this experiment where we pulled on the retina optogenetically, because we were asking the same thing: if there were a slight distortion of the retina, this whole perfect neural superposition might be messed up and might actually harm ongoing vision. But first of all, experimentally we see that the fly just responds as usual. So whatever happens, it doesn't seem to harm vision in any measurable way. And second, when we pointed cameras at different parts of the retina, in the very front, in the middle and in the back, and again did this optogenetic activation, we saw that the whole eye was moving in concert. So what we think is that the whole retinal sheet moves as an ensemble, and that the neural superposition is maintained in that way. Okay, and going back to your idea that this could also improve the spatial resolution that eyes can achieve, which I find very interesting. The question I have is, because you already touched upon this, you mentioned that we also need the neural substrate to be in place for this to be of use, right? And this would not only be a spatial substrate but also a temporal one, right? What is the temporal window of integration? So my kind of related question to that is: do you think they would use the eye movements to the same extent when they are flying? So actually they move them more when they are flying.
So I glossed over that: if the flies only hang on that little tungsten wire and there is no visual stimulation, they move the eyes from time to time, but very rarely; and then once they start flying or walking, there are lots of those movements happening. But could this be because on the tungsten wire you had fixed the head, so it cannot... It cannot move, no, it cannot move. And you also see the two eyes moving independently in different directions, and we see the eyes being stable, so it's not that the movement of the wings or the vibrations... I mean, those also happen, it's not perfect, and the fly is vibrating a little bit on the tungsten wire, but we see really stable positions that then have discrete jumps, and those are independent of the ongoing vibrations from the wings. How that would then be implemented afterwards to increase spatial resolution is a very good question, and I keep thinking about that. I don't know enough yet about all this work that has been done by engineers, right? Engineers can use little vibrations to increase the spatial acuity of camera images, for example. And I definitely need to dig into that work a little bit more, but one thing that one could imagine is that, for example, the movement would just light up edges in the image, right? It's not that you really sample one point after the other and know how much you have moved and when you moved, etc., which would be very complicated; but you could just have the image fade out and then make a little jump, and then every dark edge in that direction lights up, and so you could scan horizontally and vertically and get all the finer structures, for example. But we want to figure that out in the future, hopefully.
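The "little jump lights up the edges" intuition can be illustrated with a toy image: differencing a scene against a slightly shifted copy of itself produces a response only at contrast edges along the shift direction. This is just a cartoon of the idea sketched above, not a model of fly photoreceptor dynamics; the scene and shift size are arbitrary:

```python
import numpy as np

def edge_response(image: np.ndarray, shift: int = 1) -> np.ndarray:
    """Absolute difference between an image and a horizontally shifted copy.

    After a small known jump, intensity changes are largest at contrast
    edges, so the difference image 'lights up' vertical edges along the
    shift direction while uniform regions stay dark.
    """
    shifted = np.roll(image, shift, axis=1)  # small horizontal jump
    return np.abs(image.astype(float) - shifted.astype(float))

# Toy scene: a dark vertical bar on a bright background.
scene = np.full((5, 12), 200, dtype=np.uint8)
scene[:, 4:8] = 20  # dark bar spanning columns 4-7

resp = edge_response(scene, shift=1)
# Response is zero inside uniform regions and large only at the bar's
# two edges (columns 4 and 8).
print(np.flatnonzero(resp[0] > 0))  # -> [4 8]
```

Repeating this with jumps in different directions would highlight edges of each orientation, which is one way the scanning idea could, in principle, recover finer structure.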
Yeah, I mean, for me, thinking about it very naively, I would expect that during flight the movements would not be so necessary, let's say, especially if you don't have a fovea-like specialization that needs to focus on stuff. But it's very interesting, and if I may, you should look into how that might be happening. And one last question before I go to the audience questions. Pretty much everyone out there is thanking you and mentioning how exemplary the presentation was. I'm just reading the last comment, from Simon Laughlin, who says: exemplary presentation of elegant work, exciting possibilities, can't wait to learn more. The comments will stay there, so we can see them later. So the question I had is: there is always this issue of self-induced motion of the scene versus actual motion out there. Which neurons would be the ones discriminating between "ah, it's me moving" and "it's the surroundings that are moving"? Is there a neuron that you have identified that can perform this computation? You mentioned the motion-sensitive neurons where you see both signals. But when it comes to the motor-related modulations, it looks like they are not clearly either preceding or coming after the actual stimulus-driven change, right? So couldn't you expect some latency? Yeah, so I mean, it depends on where it comes from, right? It could be from some visual feedback. Are you still here? Yeah, yeah, yeah, sorry. I just said we appear to be getting on people's screens now that we have guests also joining. So if you would think that it's feedback, right, from some mechanical sensors for example, then obviously it should come later.
But if you imagine that there's some predictive signal, let's say there's a center in the brain, some neuron that is going to command the eye movement, that is connecting to the motor neuron, and that one branches off to the visual system, or the motor neuron itself connects to the visual system somehow, then it should slightly precede it or be at about the same time. And I think what we see is that it's roughly at the same time, maybe a tiny bit early. So I don't think it's consistent with feedback. Okay, yeah, because I think in primates they always saw that there is this latency, always this difference between preceding and following; that is what motivated this question. For flight turns, the efference copy signals that I mentioned before come early; Anmo has quantified that very carefully, and I think on average it's 10 to 20 milliseconds before the wing moves. Okay, great. Thank you so much. So in case the guests that we already have in the room do not want to ask something publicly, I would thank at this point Lisa for her wonderful talk. Thank you, thanks for listening. I will remind our audience that they can click on the link that they can see in the YouTube chat to join us. In the meantime, I will be terminating the broadcast so we can continue in a more informal setting. Thank you very much. Thank you.