OK, so what I'm going to do is, in some senses, take you on from what you heard about yesterday from Douglas, and start to talk about the molecules that do stuff in cells. In the first lecture I'll just give you an overview of all the kinds of computations that they do. Is this thing on? Can you hear me anyway? All right, maybe I don't even need it; it's actually intended for the video. So is it working for the video? Yes. OK. And then in the second class, I'll focus on memory, or to be more general, plasticity: synaptic and other kinds. And I think that will actually be a lead-in to some things that, for example, Astrid and others will be talking about later on.

So let me start off. You've probably all seen a slide like this many times before: the brain computes at many levels. And we'll have a debate — a friendly discussion or some such phraseology, a dialogue, that was the word — in a couple of days in the evening about what is a good level of description to apply to the brain. But I'll make the argument, for now at least, that it's really short-sighted to restrict yourself to any one level of analysis. You really do have to try to be quite holistic; you have to have a broad canvas in order to figure out what's going on. Because the brain really doesn't restrict itself to any narrow range of computation, whether it's channels or networks or molecules. It really does employ all of these things in coordination and across levels, and we'll see a bit of that.

So these are some equations which I presume you are familiar with. Everybody knows these equations? OK, so what is this equation? Anybody? Come on, what does this remind you of? This is chemistry, right: A plus B gives C. What's that? Those are the ODEs — the rate equations for the same chemical reaction. Let me use this thing's pointer anyway. OK, so what about this? Everybody knows this one? It's the diffusion equation. You know the diffusion equation? This one? Nernst. That one? Cable, great. That one — those two, to be precise — Hodgkin-Huxley. And the one on top? No, actually this is one of the standard models for synaptic conductance: an alpha function. You give an impulse, which is the action potential coming in, and that's the typical conductance change in the postsynaptic channel.

Anyway, I should actually extend this list, because many groups, including us, are interested in extending this to the realm of mechanical and structural changes, so there should be some mechanics in there as well. But the point of this little list is that this set of equations is actually a very good starting point for doing a great deal to understand how the brain works. It's not everything by any means, but a lot of what happens is described pretty accurately by this set of equations. And we'll be seeing things, particularly at the lower couple of levels down here, in the next few hours.

OK, so this is just to reiterate the point: the brain does its calculations at electrical and chemical levels, in addition to others. The time scales range from very, very fast to quite slow. The length scales likewise range from very, very small to very, very large. And we're going to be sticking primarily to the chemical side of the picture.
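For reference, the alpha function is just a one-line formula. Here's a minimal sketch of it; the time-to-peak and peak conductance are illustrative placeholders, not numbers from the slide.

```python
import numpy as np

# Alpha-function synaptic conductance: g(t) = g_max * (t/tau) * exp(1 - t/tau).
# This is the standard textbook form; tau and g_max below are illustrative
# placeholders, not measured values.
tau = 1.0      # ms, time-to-peak (assumed)
g_max = 1.0    # nS, peak conductance (assumed)

t = np.linspace(0, 10, 101)  # ms
g = g_max * (t / tau) * np.exp(1 - t / tau)

# The waveform peaks at exactly t = tau, with amplitude g_max.
print(f"peak at t = {t[np.argmax(g)]:.1f} ms, g = {g.max():.3f} nS")
```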
OK, so this next slide is a thing I put up to annoy people who like to think of the brain purely in terms of networks and electrical computation. And the point here is that the brain does most of its computation, I will assert, through chemicals, not through electricity. So in this little calculation, let's say that the time course of a typical electrical computation is of the order of a millisecond — that's about the time course of your alpha function for synaptic transmission. Because of the fairly long electrical length constant of most nerve cells, you can expect to subdivide a cell into, say, 10 or so regions which are electrically distinct. And there are about 10^11 cells. So you end up with a certain number of potential operations or calculations per second. Now, that's the electrical view of the brain.

How about chemical computation? Granted, it's slower, generally — though strictly speaking, synaptic transmission is also chemical in nature, and that is extremely fast; the electrical part of synaptic transmission is sort of an afterthought, the last bit of it. The number of zones is very large, because the diffusion length constant for chemistry is quite small, so you effectively end up with one chemical computation zone per dendritic spine or synaptic region. And this is just extending that out: if you just look at what happens along the dendrite, you still get a large number. For the number of signaling pathways I'm being conservative: I'm saying that there are 100 distinct channels of information, though Douglas and others have pointed out just how many molecules are doing stuff. Same number of cells. Well, you multiply it all up, and you end up with about 100 times more calculations done chemically than electrically. So this is the perspective, at least, I want to get you started with: the brain is not even doing its computations primarily in the electrical domain. It's doing a huge amount of chemistry, and the electrical computations are sort of a minor correction to its real work, which is to think in terms of chemistry. So let's work on that.
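Here's that back-of-envelope comparison written out. The electrical numbers are the ones just quoted; the chemical event time and the spine count per neuron aren't specified here, so the values below are my assumed fill-ins, chosen to reproduce the stated hundred-fold conclusion.

```python
# Back-of-envelope: electrical vs chemical "operations" per second.
# Electrical numbers are from the lecture; the chemical time scale (~1 s)
# and spine count per neuron (~1e4) are assumed fill-ins chosen to
# reproduce the ~100x conclusion, not values read off the slide.

n_cells = 1e11            # neurons in the brain

# Electrical view: ~1 ms events, ~10 electrically distinct regions per cell.
elec_rate = 1.0 / 1e-3    # events per second per region
elec_zones = 10
elec_ops = elec_rate * elec_zones * n_cells

# Chemical view: slower (~1 s) events, but one zone per spine (~1e4 per
# neuron) and ~100 distinct signaling pathways per zone.
chem_rate = 1.0           # events per second per pathway per zone (assumed)
chem_zones = 1e4          # spines per neuron (assumed)
chem_pathways = 100
chem_ops = chem_rate * chem_zones * chem_pathways * n_cells

print(f"electrical: {elec_ops:.0e} ops/s")
print(f"chemical:   {chem_ops:.0e} ops/s")
print(f"ratio: {chem_ops / elec_ops:.0f}x in favor of chemistry")
```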
OK, so I'm going to skip briefly over these gentlemen, who I gather will be introduced more completely later. Everybody knows them? Good. You know them personally? Oh, very good. Hodgkin and Huxley. And this is another gentleman who is also very important for electrical computation, and again you'll be hearing more about his work later. And I'm going to skip over all these components of neural networks and jump straight to the last one, which is models incorporating chemical signaling. Not that you'll miss out on all the other good stuff — other people will deal with it.

OK, so even if you want to think about electrical computation, one of the fundamental things — and I'll be coming back to this in the next lecture — is the connection between cells. How do networks learn? In other words, how does a brain learn? And it depends on the molecules. So the basic idea is that synaptic weights change. We'll discuss this more later, but you have a whole lot of requirements for computation that decide when the weights will change, how the state changes will be maintained, and how you ensure a balance between these. And that's what we'll be getting into. Oops.

OK, so since we're talking about computation, how do you treat computation in chemical terms? Well, the basic idea is that you map the identity of a signal — let us say light falling on your photoreceptors — into chemical identity, which in fact is what happens in early visual transduction: it is chemical signals that get activated. The wires, in this case, are not distinct axons or different wires on a printed circuit board; they're distinct chemicals, even leaving aside the matter of spatial separation. And then stuff happens when you have chemical transformations: if you treat the chemical signals as carrying information, then when you have a chemical transformation through a reaction, that actually is a computation. That is, you are combining two signals and generating another signal. And so that's the basic idea. For example, this sequence of events — which could be electrical signaling or chemical signaling or just information flow — simply says that a signal A, or a chemical A, gets transformed into something else, A*. That causes the transformation of B into B*, which then interacts with something else to give you D*. Now, this could have been just a schematic representing information flow; you can also interpret it as chemical flow. And that's the line we'll be taking. That's the framework.

So let's start by looking at the kinds of molecular networks, and I have somewhat arbitrarily decomposed them into three kinds. First of all, there are DNA networks, where these arrows represent segments of DNA, those are operators with proteins binding to them, and those are proteins produced by the genes. So this segment of DNA produces that protein, which controls that segment of DNA, and vice versa. So you can get pretty elaborate circuits through the operation of genes and their operators, repressors, and so on, and actually quite a bit of computation is certainly done in this manner. There's another, more recently recognized level of fairly intense chemical computation, which is the networks of RNAi. That is, small RNAs are capable of regulating each other and therefore regulating which proteins are produced, and they do so through Slicer and Dicer and all these nicely named, recently described molecules. And again, I'm not going to be discussing this in much detail. What I will be focusing on are networks of proteins and their interactions. And this is happening independent of — or I should say downstream of — the DNA and protein-synthesis steps. Whoops, now it worked. Maybe I can go back. Wonderful.

OK, so this is the level at which I'll be describing the chemical computations. The same principles apply to the other levels as well, except that, of course, with DNA and other things there are further elaborations about specificity of binding and so on, which are not quite the same as what we have over here. OK, so here, just for reference, is a small diagram of DNA-based logic — of E. coli, or maybe it's not even E. coli, it's something else. Since we're talking about mesoderm, it's early development; it's probably sea urchin. Yeah. OK, anyway, so you have a zillion genes over here and they all regulate each other, and so this is a fairly interesting network. This is another interesting network at the biochemical level, and here are some — a small fraction — of the proteins that are active in doing processing in your neurons. And we'll be meeting a few of these. I'm not going to interrogate you about the alphabet soup we have up here, but you will get to see some of the kinds of things you can do through things like feedback loops and so on.
So, one of the misleading things about diagrams like this is that you look at them and imagine you've understood the system — that you can now make models of something at this scale. And this is something that Douglas also alluded to. The fact is that you actually have to go into quite a lot of detail, because these blocks and these pretty little arrows are very deceptive. Every single one of these blocks probably contains 10 to 20 different chemical reactions, each of which you will need to parameterize. Every single one of these arrows is very likely to contain its own complement of reactions, and in some cases the arrows mean completely different things. So for example, binding of that ligand to its receptor is one kind of event, but the phosphorylation of a molecule, which is also represented by an arrow, is a completely different kind of event. Sometimes you have arrows which indicate movement of a molecule. So arrows are also very deceptive.

So what one has to do, then, is to expand this kind of diagram out into more complete reaction-level diagrams. And this is one set of such reaction-level diagrams, where each of these can now be described in terms of the rate equations we saw earlier, right? Forward and backward rates, enzyme kinetics, and so on. And so here are some of them. Whoops, it went fast-forward there for some reason; it doesn't want to stop at that one. OK, I was trying to introduce some of the people who've done the work. Ah, success, OK. So this is Arnold and that's Pregati, who have been working on some pathways involving trafficking of receptors and local synthesis of proteins at the dendrites. And this too is a whole bunch of chemical reactions. When you put it on the computer — if you can persuade it not to flash past too fast — you get a kind of haystack diagram of this kind, where you've got lots of arrows going everywhere. But this is the sort of thing where you double-click on any one of these things and you get a whole list of parameters and so on. So it's hard work putting all the parameters in, putting all the mechanisms in, but this is what is required to make ODE-type models of systems like this. I say it's hard work; actually, it's even worse, because every single one of those arrows and rate constants is probably several years of work by some grad student somewhere, who spent time with nasty radioactive assays and things like that, actually measuring what happens in cells to get these numbers. So it's a lot of work putting this together, and ours is the least part of it.

But anyway, you put this together. Just to put things in perspective: about 12 years or so ago, this was a small subset of the essential signaling molecules known at the time, and this is what I had to work with when we did some of those early network diagrams. More recently, this is something which Douglas and others have worked on, to more systematically enumerate the molecules and interactions over there. And things, of course, start to get even more intimidating when you look at the numbers of possible interactions. Whoa.

OK, so now, what's distinctive about these networks? Well, compared to, say, most kinds of mammalian cortical networks — I won't discuss insect and invertebrate networks, because those are actually kind of interesting and distinctive in their own way — but compared to, say, a cortical network or the hippocampal network of neurons, signaling networks are a little bit unusual, because every single molecule is distinctive.
That is, it has its own complement of reactions, its own complement of substrates — things that it will bind to — its own rate constants, and so on. So everything is unique. You can't just say, here is a pyramidal neuron, and multiply it by 10,000 or 10 million — with, of course, cell-to-cell variation, but with the categories very much the same, so that you'd say "there's a pyramidal neuron, there's a pyramidal neuron" 10 million times if you went looking for those cells. Here it isn't like that. These are highly inhomogeneous networks. There are 25,000 genes and perhaps five times as many distinct proteins, based on splicing and so on. Each of those typically has five or so regulatory interactions, sometimes many more, not to mention the number of bases where you can possibly have these interact with the DNA. So this is a very, very large network. It's a very messy network, but this is what we've got to try to figure out.

OK, so let me change gears — and I seem to be having some trouble changing gears here, but never mind. No, this is working just fine; it just gives everybody a preview of what I'm going to say two slides down the line, which is all right. OK, so now I've told you how horrible this is, right? There are zillions of molecules, and each of those has lots of reactions and so on. So now I'm going to tell you how one actually goes about dealing with it.

OK, so how can you describe signaling? OK, good. The standard way — the test-tube approach, as it were — says molecule A binds to molecule B. In this case I'm describing a very simple series of reactions involved in receptor-ligand binding and then activation of a G protein. So a ligand binds to a receptor and forms a complex. This binds to the inactive form of the G protein and forms a bigger complex. This interacts with GTP, there's an exchange, and then further molecules form, go their distinct ways, and have their signaling effects. OK, so that's the zeroth-order description. This is what you would get if you were a chemist and put it all in a test tube and shook it well.

Of course, that's not what the cell does. So the level at which most people describe this kind of signaling is this: you say that there's a compartment — let's say the extracellular compartment, from where the ligand comes in; there's the membrane compartment, in this case where a lot of these reactions take place; and there's an intracellular compartment. And then you can have further compartments, say for the nucleus, or you might have a spine treated as a separate compartment because it's diffusionally isolated, and so on. So this is the level of description which most people use, largely because it's a lot easier on the calculations, but also because diffusion parameters are a pain to get accurately. I mean, you can make certain estimates, but it's a pain to do it right. OK, here's the diffusion level: now you consider space explicitly, and the calculations suddenly become a lot harder, but you can handle the diffusion of molecules in the cytoplasm and in the plane of the membrane and so on. So this is a more complete level of description, if you will. And as I said, I'm advertising for a couple of days from now, when we'll have a debate on what's a good way to do these representations. What's the right way to think about it?
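As a sketch of that test-tube, mass-action level of description, here is the first step of the cascade — ligand-receptor binding — as a pair of rate equations. The rate constants and concentrations are arbitrary illustrative values, not parameters for any real receptor.

```python
from scipy.integrate import solve_ivp

# Test-tube level description of L + R <-> L.R by mass action.
# kf, kb and the initial concentrations are arbitrary illustrative
# numbers, not parameters for any particular receptor.
kf = 1.0   # /uM/s, forward (binding) rate
kb = 0.1   # /s, backward (unbinding) rate

def rates(t, y):
    L, R, LR = y
    flux = kf * L * R - kb * LR     # net binding flux
    return [-flux, -flux, +flux]

# Start with 1 uM ligand, 0.5 uM receptor, no complex.
sol = solve_ivp(rates, [0, 20], [1.0, 0.5, 0.0], max_step=0.1)
L, R, LR = sol.y[:, -1]
print(f"steady state: L={L:.3f}, R={R:.3f}, LR={LR:.3f} uM")
# Sanity check: at equilibrium LR/(L*R) should equal kf/kb = 10 /uM.
print(f"LR/(L*R) = {LR/(L*R):.2f}  (expect ~{kf/kb:.1f})")
```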
Here's another way you have to think about it: in many cellular calculations at the chemical level, the numbers of molecules involved are very small. Because of this, the nice old-fashioned chemical reaction A plus B gives C cannot be represented in differential form — differentiation depends on being able to subdivide things indefinitely, but if you're down to integral numbers of molecules, you can't do that. A and B bang into each other, and then with a certain probability they react.

So I don't have a blackboard here, but I like to do this little calculation with people in settings like this, where we estimate the number of free calcium ions in a synaptic spine. OK, so let me just give you the numbers. A typical synaptic spine is about half a micron across. The typical resting concentration of calcium is about 0.1 micromolar. And you all know Avogadro's number, and you all know your volumes and so on. Someone here has already been through this calculation, so I'm not going to ask him. But if you grind through the calculation, how many think that the number of free calcium ions will be, say, of the order of a thousand? No takers. All right, how many think it'll be more than a thousand? OK, at least you're awake. All right, how about 100? A few takers for the 100 range. How about 10? How about one? How about zero? All right, and how about over a thousand — a million? A lot of people are just non-committal; they do not want to vote on this. Well, if you actually go through the calculation, it's of the order of ten — it's about six. You just put all of those numbers together. So that's definitely in the range in which you need to worry about individual chemical reactions. Individual — I should rephrase that — individual reaction events, where individual molecules bump into each other and decide whether or not they're going to combine.

OK, so that's the baseline level. And, right, that 0.1 micromolar goes up in the spine, by some estimates, almost a hundredfold. So you can get a lot of calcium ions when interesting stuff is going on. But you know, six is not an unusual number. For example, the number of AMPA and NMDA receptors in a typical spine is of the order of a hundred. Some of the key signaling molecules are in the tens, although CaMKII is an outlier — it's there in the several hundreds. So you have a range of numbers, but they're all relatively small. I mean, you have lots of water and ATP, but while those are relevant for these reactions, you don't usually have to worry about their influence on these calculations.

So this is a level at which many people have also chosen to do these calculations. And note that this is not exclusive of the diffusion and so on. In fact, here is what you would do if you wanted to do reaction-diffusion calculations including stochasticity: now you have to think about the random walk of individual molecules, sometimes in the plane of the membrane and sometimes not. In fact, you would probably want to go further and think about structural changes, which happen when you recruit actin and change the very shape of the dendritic spine and of the cell — which, of course, will eventually have a huge influence on what the network does. So these are all levels at which you may decide to describe your cell's chemical computations.
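Here is that little spine calculation spelled out, assuming an idealized spine head of about half a micron on a side:

```python
# Number of free Ca2+ ions in a resting dendritic spine.
# Geometry is idealized: treat the spine head as a 0.5 um cube
# (a 0.5 um diameter sphere gives roughly half this volume).
N_A = 6.022e23          # Avogadro's number, /mol
side = 0.5e-6           # m, spine head dimension from the lecture
conc = 0.1e-6           # mol/L, resting [Ca2+] ~ 0.1 uM

volume_m3 = side ** 3
volume_L = volume_m3 * 1e3          # 1 m^3 = 1000 L
n_ions = conc * volume_L * N_A
print(f"volume = {volume_L:.2e} L, free Ca2+ ions = {n_ions:.1f}")
# -> about 7 ions with these assumptions (the lecture's ~6 corresponds
#    to a slightly smaller volume): single-molecule territory.
```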
And the level which you choose depends, of course, on various constraints. To me, the biggest constraint is what you know — what parameters you have — but there are also constraints of computer time, the tools available, and the questions you're trying to ask. But I'm just laying before you the kinds of things that people do and the kinds of computations they do. OK, this is more of the amazing mechanical things that happen in the cell. OK, so this is mass-action kinetics, and this is the level at which I'll be conceptually operating most of the time, though I don't think it's very hard to span up and down once you've got the general principle in mind.

OK, so let's do some chemistry basics. I apologize to the many of you who've been through this kind of thing before, but just to remind the others of the basic chemical laws and principles which underlie the calculations I'm going to discuss. You'll almost certainly all know this, and I hate to belabor the point, but just so that everybody's on the same page: A plus B gives C, there's a forward rate, there's a backward rate, and you express that in terms of rate equations. This is how A depends on the rate constants and on the concentrations, OK? I'm sure this is all very, very familiar to you. You can write down equations for each of the constituents, and you can eliminate some of the terms through mass conservation and so on, though it's rarely necessary to do that. By the way, don't hesitate at all to ask questions, yeah? I'm chugging along at a pretty fast clip, but I really would appreciate it if you stop me and interrupt and ask questions, rather than have me barrel along and leave you behind.

OK, so this is what would happen if, instead of a single A, you had two As, and you just scale up the equation in the same way. So these are the equations — whoops, those were the equations — these are the equations which the computer is grinding away to solve when we have a chemical system of this nature. This is what the set of equations generates, and this is something you can easily measure. So for example, if A converts to B with those rate constants, you will get a curve of concentrations like this: supposing A started at one, its concentration would decline, and the concentration of B would go up. Now, this is a useful curve to keep in mind, because you can actually go back from curves like this — which is what the experimentalist measures — and estimate the rate constants, which is what you need to do if you are doing modeling. So I assume that many of you are interested at some point in doing modeling, if not necessarily of this kind then of related kinds. So it's good to know what to look for in the literature when you've got pictures of this kind.

And so, can anyone just glance at this and tell me how you would constrain those parameters? How do you know those parameters generated those curves? Anybody? How could you look at these curves and derive those parameters? Dan, you should remember. Hmm? It's like the capacitative time constant: you go to one over e, so you have something like one minus e to the minus t over tau. Very good. So you're talking about the time course. Yes? Great. The time course is excellent — hold on to that thought. But even before that, there's something very, very simple you can do. Perfect. Yes: the steady state of B is twice that of A, and that gives you the ratio of the rate constants. So that's the ratio. And now you throw in the time course. So if you just look at this, just by eye, the time to one over e is about somewhere over there — so about five seconds. And so that's tau, and one over tau is of the order of magnitude of the rate constants. And lo and behold, that's what it is. So you have the ratio, you have the tau, and then you can compute the individual rates.
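Here's that estimation recipe as a sketch, for a simple A-to-B interconversion. The numbers follow the example just described — a plateau with B at twice A, and a relaxation time of about five seconds — but the exact values are illustrative.

```python
# Reading rate constants off a relaxation curve for A <-> B.
# Numbers follow the lecture's example: B ends up at twice A, and the
# relaxation time is about 5 s. Exact values assumed for illustration.
ratio = 2.0          # B_ss / A_ss = kf / kb, read from the plateau
tau = 5.0            # s, relaxation time, read from the time course

# For first-order interconversion, tau = 1 / (kf + kb).
ksum = 1.0 / tau
kb = ksum / (1.0 + ratio)
kf = ksum - kb
print(f"kf = {kf:.3f} /s, kb = {kb:.3f} /s")

# Check by simulating: A(0) = 1, B(0) = 0, simple Euler integration.
dt, A, B = 0.01, 1.0, 0.0
for step in range(int(30.0 / dt)):
    flux = kf * A - kb * B
    A -= flux * dt
    B += flux * dt
print(f"after 30 s: A = {A:.3f}, B = {B:.3f}  (expect ~1/3 and ~2/3)")
```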
So we've just gone through this little exercise. This is to introduce you to what I had to do, and what students have to do, when they're grinding through the literature trying to parameterize those nasty messes of equations. And actually, we're very happy when we get beautiful curves like this to work with. Usually you don't. What people often do, though — and it's a huge help anyway — is give you the fold change: the number of times by which the product has increased following a reaction. And so at least you've been able to constrain one of those terms. And then somewhere in the methods they'll say, we did this measurement fifteen minutes, or five minutes, after the things were mixed. And so now you have a bound, at least, on the time course, and hence on the rate constants. Anyway, this is giving you a glimpse of what it takes to parameterize chemical reactions when you're presented with lots of stuff in the literature. Oh, I don't have to go there anymore. OK.

So, enzyme reactions are a little bit nastier — in fact, a lot nastier — and they've been the subject of intense study by biochemists. I'm not going to hammer the point in, except to say that this kind of simple enzymatic reaction scheme can be expressed in a simple form — this is the Michaelis-Menten form — or a more complicated form, or a yet more complicated form. This is the level at which we, and many people, do calculations. But it's relatively easy to see that, basically, it's a sequence of relatively simple individual reaction steps. The standard formulation is that a substrate binds to an enzyme, gives a complex, and the complex forms a product plus the original enzyme, which can then go back and do some more reactions. So this is what we use and what most people use. So there is the Michaelis-Menten formulation, and you see equations of this kind. And again, I'm not going to spend a lot of time on this, but you get curves like this, and the chemist will use those curves to give you values for Km, which is the Michaelis-Menten constant, and kcat, or Vmax, which tells you how fast the overall thing goes. This does not fully constrain the system, so you have to make some more assumptions. But the long and the short of it is that these are also the kinds of numbers that you get from the literature.

Question? You've got S going to P — I mean, in a normal chemical reaction, A plus B gives you a compound of A and B. Yes. But here, in the end, it's just S converted to P, isn't it? Are there actually other molecules involved in this? No, there don't have to be other molecules involved. Though very often there are hidden molecules, like ATP and water, and you just describe the thing in terms of the other species and get effective rates out of those. But even this is a perfectly legitimate type of reaction, and they do happen. Yeah? OK. All right, so these are all just the very basic biochemical steps. This is what such a system looks like.
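And here's that enzyme scheme simulated as its elementary steps, rather than through the Michaelis-Menten shortcut; the rate constants are arbitrary illustrative choices. With equal starting amounts of enzyme and substrate, it reproduces the transient dip in free enzyme described next.

```python
from scipy.integrate import solve_ivp

# Elementary-step enzyme kinetics: E + S <-> ES -> E + P.
# All rate constants are arbitrary illustrative values.
k1, km1, kcat = 10.0, 1.0, 1.0   # /uM/s, /s, /s

def rates(t, y):
    E, S, ES, P = y
    bind = k1 * E * S - km1 * ES   # net complex formation
    cat = kcat * ES                # catalysis
    return [-bind + cat, -bind, bind - cat, cat]

# 1 uM substrate, 1 uM enzyme, as in the curves described below.
sol = solve_ivp(rates, [0, 10], [1.0, 1.0, 0.0, 0.0], max_step=0.01)
E, S, ES, P = sol.y
print(f"min free enzyme = {E.min():.2f} uM (the transient dip)")
print(f"final product   = {P[-1]:.2f} uM")
# For comparison, Km = (km1 + kcat)/k1 = 0.2 uM here.
```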
If you start with one micromolar of substrate and one micromolar of enzyme, the substrate decreases, the product increases, and, because of the formation of the enzyme-substrate complex, you get a small but brief dip in the amount of free enzyme available to interact with anything else. OK. And again, you look at these things and you hope to figure out what the rate constants were. OK, so we've already gone through this little exercise, and so I'll spare you that.

But let me see if I can run a little demo. That's on a different screen. OK. Let me run through a little demo, which I think is a lot of fun, on what happens when you have reactions in different volumes. I was hoping to show you this in MOOSE, but in MOOSE you have to keep reloading it, so it's not quite as straightforward as a much older program that I use. I'm not used to the new Ubuntu. Ah, I can't find my GENESIS link. Oh, wonderful. OK, I can't run this demo, because, as you may have gathered, I've recently switched to the latest Ubuntu and it's giving me all sorts of hiccups. What I was going to show you was what happens when you play with the volumes in which your chemical system does its reactions, and how the quote-unquote noise in the system — in other words, the stochastic individual reaction steps — changes the shape, the appearance, of the outcome. So in many systems, it looks just like noise added to the smooth curves that describe the chemical reaction. But in systems which have high nonlinearity, such as bistable systems or some kinds of oscillators, you get things which look very, very different from the deterministic kinds of curves. So that's the take-home message; it's a pity not to be able to show it to you.

But here's a kind of picture. Here's a reaction involving protein kinase A. The solid line is the deterministic calculation, and the horrible jagged line is what you get if you do the same calculation — exact same rate constants and numbers of molecules — done stochastically. This is using an algorithm called the Gillespie algorithm. And it's absolutely hideous. In a slightly different context, where there are more molecules, you get something where, again, there's the deterministic curve, the solid line, and the gray cloud around it is the outcome of one particular stochastic run. So the point is that even if you think you have a lovely chemical system which is carrying information, once you take into account the fact that you are dealing with very few molecules in small volumes, the computations may not be so straightforward anymore. And this is actually a huge issue in studying these systems.
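For the curious, the core of the Gillespie algorithm is only a few lines. Here's a minimal sketch for the same A-to-B interconversion as before, with a deliberately small, assumed molecule count so the jaggedness is obvious.

```python
import random

# Gillespie stochastic simulation of A <-> B with few molecules.
# Rate constants match the earlier deterministic example; the total
# of 20 molecules is an assumed, deliberately small number.
kf, kb = 0.133, 0.067    # /s
A, B = 20, 0             # molecule counts, not concentrations
t, t_end = 0.0, 30.0

while t < t_end:
    # Propensities of the two possible reaction events.
    a_fwd, a_back = kf * A, kb * B
    a_total = a_fwd + a_back
    if a_total == 0:
        break
    # Exponentially distributed waiting time to the next event.
    t += random.expovariate(a_total)
    # Pick which event fired, proportional to its propensity.
    if random.random() < a_fwd / a_total:
        A, B = A - 1, B + 1
    else:
        A, B = A + 1, B - 1

print(f"at t ~ {t:.1f} s: A = {A}, B = {B} (deterministic mean: 6.7, 13.3)")
```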
Question? There's no doubt that you've got the numbers right and everything, but I'm wondering if the story is not quite as bad as it may seem at first sight, because a lot of these chemical processes really take effect on longer time scales — and if I average them over, you know, seconds or hours, it comes out fine. Yes, exactly — the average over that time scale will be reliable. But I think you would agree that for protein kinase A, that is quite something — and protein kinase A is one of the faster enzymes, just so you know; it actually affects even things like the afterhyperpolarization, and fairly rapidly changes the properties of cells. So for it to be doing that over even 100 seconds is actually a big deal. This is, of course, happening in a very small volume, a spine-like volume. Typically, it would have an effect on the cell if it were happening over a region of dendrite, over which you would simply have averaging effects smoothing out something horrible like this. So in other words, if you have some process with a threshold of, say, 0.4 over a period of 10 seconds, then the average would never get there.

And also — I don't know if I have that slide here; no — if you have very, very nonlinear events, where you get a one-shot stimulus, let's say a volley of synaptic input, and you have to make a decision based on that stimulus, and you now have noise like this, then very often you'll end up in a situation where it could go either way. So there will be a certain probability that it goes above or below, but on any given stimulus presentation, you don't know what's going to happen. And a very concrete example of this, which is easy to measure macroscopically, is that if you measure synaptic transmission through a single synapse in, say, the hippocampus, the probability of synaptic transmission is actually below half. So you get your perfectly reliable action potential coming in, but the number of vesicles available for release, and all of the biochemistry that triggers the release of neurotransmitter, means that you actually transmit information to the next cell less than half the time. And these transmission events are easy to measure, so you can see the effects of this kind of stochasticity in very real and straightforward measurements. Of course, if you're a single-channel physiologist, you see this sort of thing all the time.

OK, let's move ahead. So let's look at signal processing through these kinds of molecular logic. Has this little thing been sitting up there the whole time? OK, now I just have an arrow over there. OK, so let's look at some simple kinds of computation. Amplification: an enzyme is an excellent amplifier. Why? Anybody? Come on — what does an enzyme do? It's a catalyst, yeah? It can be reused. So you turn on an enzyme, and it's going to trigger the conversion of lots of molecules, until something else comes along and turns off the enzyme. In fact — we won't get into it right now — frequently the product goes around and turns off the enzyme. So you can get a large fold gain: you turn on one molecule of enzyme, you might get a hundred-fold gain. And that's actually very modest; some enzymes have an incredibly high turnover rate and produce a huge amount of product. So: enzymes.

OK, got it. So let's take just a simple binding reaction, or conversion reaction. This might seem like a really straightforward thing, yet you get some very nice curves, which can be very close to linear, or sigmoid, or something very close to a thresholding curve. So even a very simple binding reaction, depending on the order of the reaction, can give different kinds of nonlinearities, which can be used for computation. You put some of these together and you can get, for example — this is actually a block diagram of a very real set of chemical reactions — a linear step over here: these two forms of cyclase both produce cyclic AMP, so the amount of cyclic AMP adds linearly.
But the regulators of these interact in a very, very nonlinear way to control the levels of cyclase. And so you can get both linear and nonlinear kinds of calculations. You can put all of this together with some kind of thresholding operation — I've somewhat mixed together chemical and logic symbolism here — and you can have inversion operations and an OR, and depending on where you set your threshold, you can change the kinds of logic gates. These are all just trivial consequences of the kinds of analog computation you can do with the chemistry. So the point of all of this is just to say that there are lots of chemical operations available to you, and you can do pretty much any kind of analog computation you want.

So here is a kind of analog computation, chemical in nature, that is extremely important for learning and memory: associativity. All of you know about this, right? Yes? No? Maybe? OK. So this is the associativity of the NMDA receptor at the synapse. And it's associative because the channel acts like a molecular coincidence detector — an AND gate — saying that we have both the presence of neurotransmitter and the presence of depolarization at the synapse. When you have both of these present, then you get current flow and calcium entry into the postsynaptic cell. OK, so this kind of logic operation is performed at the level of a single molecule, with inputs of other molecules and voltage.

So now, once you've got all of these things, you can just picture to yourself — and I invite you to picture to yourself — the sheer amount of chemical computation that goes on even in a very, very reduced diagram of what goes on in the synapse. So for example, here's a feedback loop. Now, those of you who've got some inkling of electrical engineering know that you get all sorts of fun and games when you have feedback loops. There are negative feedback loops — see, there's a negative feedback loop; there's another negative feedback loop. There are positive feedback loops, some very, very tight, some rather larger, and some not really even apparent at this scale. And all sorts of crosstalk and other good things that cause interesting computations to take place.

So let me just run through another set of things that you can do with this. One is timing. You can put together oscillators through this very, very simple interaction — for some reason, this slide has faded. The IP3 receptor has a positive feedback loop with calcium. But there is an interesting bell-shaped curve for the opening of the IP3 receptor, which means that you start out with a very strongly positive feedback loop: you get some calcium coming out, that opens the receptor some more, more calcium comes out, and eventually you get a flood of calcium. And then you go into this regime, where there's lots of calcium in the cell, and now it shuts down the receptor. And the upshot of all of this is: here's the calcium rushing in, then it shuts down the receptor, the calcium goes down again — and you get calcium oscillations. Here's the Belousov-Zhabotinsky reaction. And this has actually a very nice, somewhat tragic story associated with it, which I won't get into — Zhabotinsky died just a few months ago; he was working at Brandeis with John Lisman. But anyway, maybe I can tell it to you over a coffee break. This set of reactions oscillates, too. These are other kinds of oscillators out there. We'll be playing with a few of these in the tutorial.
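As a flavor of how little machinery a chemical oscillator needs, here's the Brusselator — a classic abstract oscillator scheme from the same family of chemistry as the Belousov-Zhabotinsky reaction. It's a textbook toy, not a model of the calcium oscillations just described.

```python
import numpy as np
from scipy.integrate import solve_ivp

# The Brusselator: a classic abstract chemical oscillator.
#   A -> X, B + X -> Y + D, 2X + Y -> 3X, X -> E
# With b > 1 + a**2 the steady state is unstable and the
# concentrations settle into a limit cycle.
a, b = 1.0, 3.0

def brusselator(t, y):
    x, yv = y
    dx = a - (b + 1.0) * x + x * x * yv
    dy = b * x - x * x * yv
    return [dx, dy]

sol = solve_ivp(brusselator, [0, 50], [1.2, 3.0], max_step=0.01)
x = sol.y[0]
# Crude period estimate: time between upward crossings of x = 1.5.
crossings = sol.t[1:][(x[:-1] < 1.5) & (x[1:] >= 1.5)]
print(f"x range: {x.min():.2f} to {x.max():.2f}")
print(f"approx period: {np.diff(crossings)[-1]:.1f} time units")
```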
So I won't spend a huge amount of time on it, but you get nice oscillations of different kinds. So those are all the simple oscillators. Here's a complicated — whoops — OK, here's a moderately complicated oscillator. This is the cell-cycle oscillator, or at least one implementation of it, by John Tyson and Bela Novak. And this is pretty horrible — and this is also a block-level diagram, not individual chemical reactions; it's yet more complicated when you expand it out to the chemistry. But it produces beautiful oscillations, and it also incorporates the various cell-cycle checkpoints and other important things in the cell cycle. So all of this is chemical computation for you.

The circadian rhythm is another oscillator. Can you think of an even longer-term oscillator than the day-long cycle? So the circadian rhythm is one day — any longer oscillators you know of? Menstrual cycle. Menstrual cycle, excellent. Any even longer oscillators you know about? Hibernation. So that's triggered by — well, that may be constrained a bit by the weather. But there are some even longer ones. Yeah, the plague-of-locusts oscillator — that's almost two decades, some of them. So you have some amazingly long oscillators, and they have different kinds of summation and ways in which they accumulate information over extraordinarily long periods. OK, so you can do interesting computations once you have these tools at your disposal.

And — got it. OK, so this is Alan Turing. There's actually been a lot of interesting discussion of Alan Turing around his anniversary, but I'm sure you'll come across him in other contexts. I'm going to talk to you about his role in describing how you can get oscillations in space as well as in time. And once you have oscillations in space, that allows you to build patterns. And here are just some of the patterns that you can get through Turing kinds of interactions. These are very important, for example, in setting up regular structures in cells and in organisms during development. Zebra skin patterns and tiger skin patterns are just the superficial expression of this, but they're believed to be very important in defining the shape and function of your cells. OK, here are some interesting Turing patterns, evolving in time as well as space, just so that you get a picture of how cool they really are. These are all governed by the same basic set of equations. Here are some other kinds of pattern formation that you can get not just with Turing-type interactions but with interactions of just a couple of molecules. The Turing-type scheme is this: you have local feedback excitation and long-range inhibition, and that gives you the interesting spatial and temporal patterns. But even without those, you can get some rather spectacular combinations that give you organization in cellular development. OK — all of this is chemical computation.

So we've seen that. Here's another kind of chemical computation: the operation of differentiation in time. So what you do is you have a stimulus A that directly activates molecule B, and that produces an output. But there's a side branch which activates molecule C, which turns off molecule B. And because this is a side branch, it takes a little bit longer to come around and inhibit. So you get a stimulus on, there's a turn-on, and then molecule C kicks in and it turns off again. OK, so this way you can get differentiation. So what's going to happen once the stimulus ends? Anybody? A has gone off. We're over here.
What's going to happen next? Any thoughts? Come on, what do you have to say? What do you think is going to happen? First B goes off, then C — so what do you think will happen here? An upward spike, then zero? Well, A has gone off, but C is not yet zero, so B will go down. So it really will do a differentiation: it will now report the negative transient as the stimulus goes off. OK. So you can now take these computations in time and do interesting kinds of temporal filtering.

So this is a series of experiments and models that we did. This is the model which predicted that there was a kind of tuning — like an old-fashioned radio-dial tuning to a particular frequency, in this case a rather low-frequency repeated stimulus. This was in synapses; in fact, it was to do with memory. And you can think of it in terms of the question: what is the best way to remember a subject? If you have an exam coming up in two weeks, what's the best way to prepare for it? OK, so I know the approach that I used, which was to do everything at the very last minute and try to cram the night before the exam. And do you think that's the optimal approach? How many of you took the last-minute cram approach? Eh — some people are owning up. OK. The rest of you, doubtless, did the actually sensible and wise thing, which is: you studied, then you took a break and worked on something else for a while. Then you studied again, you revised, then you took a break and revised something else, and you revised again. And I really admire you for this, because I've never been able to do that. But this is repeated, spaced learning. Well, it turns out that, at least for flies, you can actually show this works chemically. That is, you have a peak of learning — quote-unquote — efficacy of synaptic plasticity, which you can measure physiologically (in this case actually in brain slices from a rat), which you can measure chemically as well, and which you can also measure behaviorally in flies. And it turns out that, just like you, flies also do best if they revise at spaced intervals rather than try to cram at the very last minute. OK. So, again, you can express this entirely through chemical networks; this was just an outcome of some of those chemical signaling networks you saw earlier.

OK. So how many of you recognize this? Yeah? Good. So this is the spike-timing-dependent plasticity curve. This is on a completely different time scale: a millisecond time scale, where that was on the many-seconds to hundreds-of-seconds time scale. So, a thousand times faster, you still have exquisite pattern sensitivity: which stimulus came first, pre versus post? OK, this is not the chicken-and-egg problem. This is the question of whether the presynaptic spike arrived at the synapse first, or the postsynaptic cell fired first. And that makes a huge difference to what happens in synaptic plasticity — whether the synapse gets weakened or strengthened. And this is a very important form of learning, and I imagine that you'll encounter it again. And yet this too has to be accounted for, and I don't think it's been completely accounted for in chemical terms. There's some interesting combination of receptor dynamics and calcium dynamics involved, but I won't say that it's been conclusively pinpointed yet.
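Going back to that differentiator motif for a moment — A drives B directly, and also drives C, which inhibits B — it's easy to watch in a toy ODE, including the negative transient at stimulus offset that we just reasoned out. All time constants here are invented.

```python
import numpy as np

# Incoherent feedforward loop as a temporal differentiator (toy model).
# A drives B directly; A also drives C, more slowly; C inhibits B.
# Time constants and gains are invented for illustration.
tau_B, tau_C = 0.2, 2.0      # s: B responds fast, C lags
dt = 0.001
t = np.arange(0.0, 20.0, dt)
A = ((t > 2.0) & (t < 10.0)).astype(float)   # square stimulus pulse

B = np.zeros_like(t)
C = np.zeros_like(t)
for i in range(1, len(t)):
    C[i] = C[i-1] + dt * (A[i-1] - C[i-1]) / tau_C
    # B relaxes toward the difference between drive and inhibition.
    B[i] = B[i-1] + dt * ((A[i-1] - C[i-1]) - B[i-1]) / tau_B

print(f"peak at onset:    {B.max():+.2f} (near t = 2 s)")
print(f"dip after offset: {B.min():+.2f} (near t = 10 s)")
# In a real chemical system B would dip below its resting level
# rather than go negative; treat B here as deviation from baseline.
```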
OK, so here's just a summary table of the time scales of computation, which go from 0.1 milliseconds all the way up to months. And you can go longer than that: as we discussed, some of the oscillators have extraordinarily long time courses. So all of these are in the domain of chemical computation.

All right, I will now move on to the last phase. How many of you recognize this painting? Hey, good — finally I get a response. OK. So this painting is The Persistence of Memory, and I bring it up as a counterpoint. Now it's, of course, going to flash past, and I'll miss my counterpoint. OK — yay, got it. The persistence of memory. So this is something which we'll spend the next lecture looking at more closely. The problem is that it's actually very, very hard to build any stable structure in the cell — most of all a really tiny structure like a dendritic spine, like a synapse. Not all synapses are on dendritic spines, of course, but they're all very, very small. How do you get stability in a structure which has at least three things going against it?

One: stochasticity. You already saw the issue of the very small number of calcium ions. How are you going to store information reliably when the chemical processes that presumably store the information are bouncing wildly all over the place? When you have a feedback loop — a bistable — as your model for doing this, you can express this much more explicitly in terms of the Kramers time, which is the expected time for the system to spontaneously flip from one state to the other, and which is actually hard to make very long if you have small numbers of molecules. Two: turnover, which I think is another huge problem. The lifetime of the molecules in your cells, in your spines, is of the order of minutes to days. So now, if you want to store information in a structure whose molecules are all turning over on that time scale, that's actually very hard. Three: if you just put a molecule there and hope that it's going to stay there, that won't work either, because it'll just diffuse away — so you have to take recourse to some kind of mechanical, structural change. So these are three strikes against the ability of small structures in your brain to retain information. And yet they do it. And I, and many people, have spent a lot of time and effort trying to figure out how you can store information reliably, how you can do chemical calculations reliably.

So here's one way of thinking about it. If you have a positive feedback loop — A turns on B, B turns on A — you can have two possible states. One is a resting state where both are inactive. And then you give it a strong stimulus, and then you have an active state where both are active: A is active, so it turns on B; B is active, so it turns on A. And with the correct kinds of rate constants, you can actually get something which is bistable. And you'll hear a lot more about this later. So you have individual molecules of this kind, and a strong stimulus will flip them from one state into the other. And the reason this matters is the following. Suppose you have a group of molecules — this very faint gray background is meant to be reminiscent of a dendritic spine, but never mind; I don't know if you can even see it. Can you see it? OK, I can barely make it out. All right. So you have turnover: you lose molecules. If this were to go on, you would forget whatever happened. However, you have fresh protein synthesis — except that the new molecules are completely naive. They don't know what happened earlier.
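A minimal sketch of such a bistable switch, with invented kinetics: cooperative self-activation competing with first-order deactivation gives two stable states, and the active state survives a turnover event that swaps in naive, inactive molecules.

```python
# Toy bistable switch: cooperative self-activation vs. deactivation.
#   dx/dt = k_act * (1 - x) * x^2 / (K^2 + x^2) - k_in * x
# where x is the active fraction of the molecule. Parameters invented.
k_act, k_in, K = 1.0, 0.5, 0.5

def step(x, dt=0.01):
    act = k_act * (1.0 - x) * x * x / (K * K + x * x)
    return x + dt * (act - k_in * x)

def run(x, seconds):
    for _ in range(int(seconds / 0.01)):
        x = step(x)
    return x

print(f"start below threshold: 0.10 -> {run(0.10, 100):.3f}")  # decays to 0
print(f"strong stimulus:       0.40 -> {run(0.40, 100):.3f}")  # settles ~0.5

# Turnover: replace 20% of molecules with naive (inactive) ones.
x = run(0.40, 100)
x *= 0.8            # dilution by fresh, unphosphorylated protein
print(f"after 20% turnover:    {x:.3f} -> {run(x, 100):.3f}")   # recovers
```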
So the great advantage of a bistable system for storing information is that it is resistant to perturbation. A bunch of naive molecules coming in — molecules that don't know about the original stimulus — is not a problem, because they will be recruited; they will be channeled into the same state, so they all become part of the memory. To put it in chemical terms: if they come in unphosphorylated, they'll become phosphorylated and now participate in the memory just like the others. So this is a way to get around the problem of turnover when you're trying to store information for a long time. OK, I'll skip over the details of how you can look at bistability through these curves; we'll come back to it later, perhaps.

OK, so what have I done? I've discussed chemical networks; levels of signaling, whether test-tube level or compartmental or 3D reaction-diffusion; we've gone over some chemistry basics; and we've discussed some of the kinds of signal processing that you can do through chemistry. And then finally, we've had a glimpse of memory, which I think is one of the very interesting applications of chemical signaling in how the brain works, and which we'll look at more closely in the next lecture. So, to wrap up: we've looked primarily at some of the scales here, with chemical computation, and I've made the argument that these are more important than those. Well, I won't say more important — I'd say that they're doing more computations; I'll leave you to decide which is more important.

And the way we do these calculations, as a preview, is through this friendly animal called a moose. We have a simulator called MOOSE, which stands for the Multiscale Object-Oriented Simulation Environment — the idea being that it can do calculations at chemical and electrical, and pretty soon also structural, levels. And that's what we use for our modeling. And that's Aditya there, one of the grad students involved in the project. MOOSE allows you, as I said, to do calculations all the way from molecules up to fairly large networks. And MOOSE is not a standalone effort: one of the inspirations, and in fact sponsors, of this course is neuroinformatics and the International Neuroinformatics Coordinating Facility. So the point of MOOSE — and, I would say, of most software designed to do these calculations these days — is that it should talk to the standards and talk to the databases. And we're designing MOOSE to do all of these good things: it can talk to various databases and various communication standards. And it uses Python, which is, of course, everybody's favorite programming language — unless you're one of those who like Fortran.

Okay, so let me just wrap up there. I'll give you a glimpse of what we're trying to do with all of this, which is that we would like to understand enough about what's going on in the dendritic spine and in the dendrite to be able to make, so to speak, a potted plant of it — in the sense that we would like to have a sufficiently complete understanding, or at least an outline of such an understanding, whereby, by looking at the inputs that come into the cell, synaptic and otherwise; by looking at the physical laws and the molecules that we know are present, and composing them; and by making some assumptions about somatic and nuclear events, which are hideously complicated and whose details we don't want to get into — we'd like to be able to put all of this together and make this plant, so to speak, thrive.
We would like, for example, to build into just the chemistry — just the interactions, and maybe some structural details — sufficient information and principles of the system that your spines will be able to undergo plasticity. But more than that: that your spines will be able to form and retract depending on the kinds of input they get, and that the excitability of the dendrite is also regulated by these chemical interactions. We'd like to be able to understand that — and it's not just us; many people here have been looking at this kind of regulation and feedback and homeostasis. Primarily in the chemistry, I would say, though of course the biophysics also has to come into it when you're doing this kind of modeling. We think that we should be able to understand this and implement our understanding as a computer model which does all of these things: you feed it the molecules and the physical laws, and then let the system take care of itself. So that's a very, very long-term goal that we have. But I think that this is something which is feasible, given the kinds of tools we have and, hopefully, the kinds of data that we'll be getting. Okay, so that's the brain, and that's what we'd like to understand — but at least starting at this level, of synapses and molecules. Okay, thank you all. Thank you.