So let me begin with the story of AlphaGo's victory over the Go master Lee Sedol. Many people have raised the question: is machine intelligence now superior to human intelligence? My answer is no, of course not. The machine can calculate the next move very quickly, but it is still a human who actually has to make the move on the board — the machine needs a human assistant.

Now, the brain is the most complicated structure in the universe. It is a product of evolution. At one point during this evolution there was an accident: certain mutations created an explosive growth of the human brain, producing a very thick tissue, and that is where intelligence resides. Modern brain-imaging technologies, such as PET, can tell us that different brain regions are devoted to different functions — vision, hearing, speaking. More complicated thought processes, such as thinking itself, involve global activity. This global activity is carried out by a very complicated network of about 100 billion cells of hundreds of different types, and specific functions are carried out within specific pathways in this network, which we call neural circuits.

The activity going on is electrical, and the unit of communication is called the action potential: a pulse of electrical current that passes through the network. We now know that neurons in the brain carry out coding for different purposes. For example, so-called place cells encode an animal's location in a familiar environment. The cell itself, the neuron, has a characteristic structure: many dendrites that receive input from other cells, and an output, the axon, that sends information out to other cells to make connections.
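The place-cell idea can be sketched as a simple rate model: a hypothetical cell that fires most strongly when the animal is at the cell's preferred location. The Gaussian tuning shape, the field center, and the peak rate below are illustrative assumptions, not measured values.

```python
import math

def place_cell_rate(pos, center, width=0.2, peak_rate=20.0):
    """Firing rate (spikes/s) of a hypothetical place cell with
    Gaussian spatial tuning around its preferred location."""
    d2 = (pos[0] - center[0]) ** 2 + (pos[1] - center[1]) ** 2
    return peak_rate * math.exp(-d2 / (2 * width ** 2))

# The cell fires at its peak rate when the animal sits in the
# middle of its place field, and falls off with distance.
print(place_cell_rate((0.5, 0.5), center=(0.5, 0.5)))  # 20.0
print(place_cell_rate((0.9, 0.9), center=(0.5, 0.5)))  # much lower
```

Reading out which place cells are active thus tells you, roughly, where the animal is — that is the sense in which these neurons "code" space.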
One of the most important discoveries of the last 50 years, I would say, is the realization that this communication at the synapse — where the signal is passed from one cell to another — can be modified by experience. If two cells have correlated activity, the efficiency of transmission between them is increased; if their activity is not correlated, the efficiency is reduced. This change in efficiency goes together with structural change in the network: the connection between the cells is enlarged or it shrinks. So the change in efficiency is actually caused by a change in structure.

Now, the idea that memory, or experience, is stored in the network — the best example of this is the idea of the cell assembly, or neuronal assembly. Think about our memory of a circle. Repeated activation of a group of cells by a circle will strengthen the connections among them. Our memory of our grandmother exists because exposure to her face during childhood linked together the groups of neurons that code for specific features of her face, and that linkage is the memory of the grandmother. Activating the assembly, or even part of it, can reactivate the memory of the grandmother.

Now, this network is formed after birth. During a baby's first two years there is a great increase in the growth of the network, and this network is shaped by experience. If you look at the connections — the synapses between cells — there is a great increase during the first two years, and then a decline in the number of synapses. Experience shapes the connectivity, pruning some connections to make the network more mature and more useful. This pruning goes on throughout life.

Now, artificial intelligence made big progress during the 1980s, when John Hopfield introduced the idea of plasticity of connections into artificial networks. He showed that if one can modify the synapses, then the network can learn.
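The Hebbian rule and the cell-assembly idea above can be sketched with a toy Hopfield-style network: correlated activity strengthens a connection (the weight between units i and j grows with the product of their activities), and a partial cue then recalls the whole stored pattern. The eight ±1 "features" standing in for the grandmother's face are, of course, an illustrative assumption.

```python
def train_hebbian(patterns, n):
    # Hebbian rule: correlated activity strengthens the connection,
    # anti-correlated activity weakens it (w[i][j] += x[i] * x[j]).
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]
    return W

def recall(W, state, steps=5):
    # Each unit repeatedly takes the sign of its total input from
    # the others, settling into a stored assembly.
    n = len(state)
    s = list(state)
    for _ in range(steps):
        for i in range(n):
            h = sum(W[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

# A stored "grandmother" pattern of +/-1 features.
grandma = [1, -1, 1, 1, -1, -1, 1, -1]
W = train_hebbian([grandma], len(grandma))

# A partial cue (two features corrupted) recovers the full memory.
cue = list(grandma)
cue[0], cue[3] = -cue[0], -cue[3]
print(recall(W, cue) == grandma)  # True
```

This is exactly the "part of the assembly reactivates the whole memory" behavior described above: the strengthened connections pull a degraded cue back to the stored pattern.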
This idea has recently been carried much further by deep neural networks, which are the basis of AlphaGo's ability, and in which the plasticity of connections plays an important part. New elements from brain science have also been introduced — for example, multilayer structures and recurrent connections. So today I am telling you about a few further ideas: the multiple cell types within brain networks, which have not yet been introduced into artificial networks; the modification of networks by experience; and the memory mechanisms by which we keep our memories of various things, like our grandmother's face. These need to be introduced into artificial intelligence. When that is done, we will have much more powerful artificial networks.
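As a minimal sketch of how plastic connections let a multilayer artificial network learn, here is a tiny two-layer network trained by gradient descent on the XOR task in plain Python. The layer sizes, learning rate, and epoch count are arbitrary illustrative choices, not anything taken from AlphaGo.

```python
import math, random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Every connection weight below is "plastic": it is adjusted by an
# error signal, the artificial counterpart of experience-dependent
# synaptic modification.
n_in, n_hid = 2, 4
W1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hid)]
b1 = [0.0] * n_hid
W2 = [random.uniform(-1, 1) for _ in range(n_hid)]
b2 = 0.0

def forward(x):
    h = [sigmoid(sum(W1[i][j] * x[j] for j in range(n_in)) + b1[i])
         for i in range(n_hid)]
    y = sigmoid(sum(W2[i] * h[i] for i in range(n_hid)) + b2)
    return h, y

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR

def mean_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

loss_before = mean_loss()
lr = 0.5
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        dy = (y - t) * y * (1 - y)          # error at the output unit
        for i in range(n_hid):
            dh = dy * W2[i] * h[i] * (1 - h[i])  # error sent back to hidden unit
            W2[i] -= lr * dy * h[i]              # modify output "synapse"
            for j in range(n_in):
                W1[i][j] -= lr * dh * x[j]       # modify input "synapse"
            b1[i] -= lr * dh
        b2 -= lr * dy
loss_after = mean_loss()
print(loss_before, loss_after)  # training reduces the error
```

The network only solves XOR because it has a hidden layer — a single layer of modifiable connections cannot — which is why the multilayer structure mentioned above matters.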