This course is about introducing you to a new approach or paradigm for modelling and analysing social phenomena based upon complex systems. Complexity science represents an alternative to our traditional scientific framework. As such, it brings with it a coherent alternative paradigm, a new set of theoretical models based upon that paradigm and a new set of computational methods. So in this module, we'll be taking a very high-level view of this whole approach to the social sciences and try to give some outline of how it differs from our more traditional approach. We'll talk firstly about this more traditional approach to make it explicit before going on to discuss social complexity. We can loosely define social science as the study of human beings and the relations between those individuals that give rise to macro-patterns of social organisation, what we call a society. Like all empirical sciences, it is engaged in the enterprise of trying to describe some subset of phenomena in our world. In this case, the phenomenon of interest is human society, and we do this by amassing empirical data and developing logically consistent theoretical models for effectively interpreting patterns within that data. But this scientific enterprise does not happen in a vacuum; it happens within a certain cultural context. It depends upon a certain set of philosophical assumptions about the way the world is. Physicists don't go into their lab every day and question whether there really is such a thing as an objective universe out there. This is a philosophical question, not a scientific question. So what really happens is that researchers go into their lab every day and operate based upon a set of assumptions about the way the world is, what are important questions to be asking, what are valid processes of reasoning and so on.
As long as the whole community of researchers shares these assumptions, then they have the supporting context within which to conduct their research enterprise. This set of assumptions that supports a scientific domain and constitutes the whole philosophical framework within which they work is called a paradigm. The Oxford Dictionary defines a paradigm as "a worldview underlying the theories and methodology of a particular scientific subject." The paradigm, or set of assumptions, within which the enterprise of modern science operates was born approximately 500 years ago with the massive cultural transformation of the Renaissance and the scientific revolution that gave us the cultural foundations of our modern world. This new paradigm really came together and found its most coherent, full expression within the work of Sir Isaac Newton, whose work was extremely influential for centuries to come and laid down the foundations for the enterprise of modern science. And of course, built into these foundations was a set of assumptions about how the world works. This whole set of assumptions is called the Newtonian paradigm or the clockwork universe. In slightly more technical terms, it can also be called linear systems theory. Linear systems theory forms the backbone of all of modern science. It is used in every domain from physics to biology to economics to psychology. The Newtonian paradigm is materialistic and atomistic in nature. It sees the world as a set of isolated objects that interact in a linear cause-and-effect fashion. The Newtonian clockwork universe receives its name because within this paradigm the universe is seen to be comparable to a big mechanical clock. It continues ticking along like a perfect machine, with the turning of its gears governed by the laws of physics, making every aspect of the machine perfectly orderly, knowable and predictable.
Within this paradigm we can understand and know this whole machine of the universe by understanding all of the parts and the simple linear interactions between those parts. The whole clock is nothing more than the sum of its parts, and thus to understand it we can use the process of inquiry called reductionism, also called analysis, whereby we break the whole thing down into isolated individual parts, study the properties of those parts in isolation and how they interact with each other. If we can then create a set of equations that describe this, then it's game over. We've completed the process of inquiry and now know everything there is to know about that system. This approach to scientific inquiry called analysis was very successful within classical physics, came to define what modern science is considered to be and got applied to many different areas through the 18th, 19th and 20th centuries. Its application within the social sciences has given us what is called methodological individualism, which is used in many different areas of the social sciences, most prominently within standard economics. Methodological individualism is the requirement that causal accounts of social phenomena explain how they result from the motives and actions of individual agents. It considers that the only things in the social world that are real are the things that we can touch and see, which is individual human beings. This is the materialistic and atomistic nature of the Newtonian paradigm. All phenomena have to be traced back to some discrete, tangible entity that can be defined in isolation and described in terms of a set of properties. Within this paradigm, when all is said and done, society can be nothing more than all of its constituent individuals.
This paradigm of methodological individualism then gives us a whole approach to studying social phenomena, one that is focused on the properties of the individuals and their linear interactions. So in using this approach we're going to want to amass data about the properties of the individuals, like a national census where you fill in your age, gender, occupation and so on. Once we have all of this data we are then going to look for linear interactions between the variables, what is called correlation analysis, a statistical technique that can show whether and how strongly pairs of variables are related. For example, we might ask if there is a linear correlation between an individual's level of education and their income. We would then collect data about individuals and do a scatter plot to see how closely the values of these properties move together. This approach can describe simple linear interactions, the interaction between two, three or four variables. It works well on the micro level, and this was the primary focus of science before the 1800s, when we were dealing with things like the relationship between temperature and pressure, population and time, production and trade and so on. During the 1800s scientists developed methods for dealing with macro systems composed of many parts by using statistical methods and probability theory, with most of this happening within the domain of statistical mechanics, where they were trying to model such phenomena as a gas in a chamber with billions of atoms. Phenomena of this kind are sometimes called disorganized complexity. In such cases we're dealing with systems composed of many disorganized parts, that is to say a large set of random variables. The variables have to be independent and identically distributed, what is called IID. If each random variable has the same probability distribution as the others and all are mutually independent, then these statistical methods will work.
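To make the correlation-analysis idea concrete, here is a minimal sketch in Python. The education and income figures below are invented purely for illustration; the point is simply that the Pearson coefficient measures the strength of the linear relationship between two variables on a scale from -1 to +1.

```python
import numpy as np

# Hypothetical illustrative data for ten individuals:
# years of education and annual income (in thousands).
# These numbers are invented for demonstration only.
education_years = np.array([10, 12, 12, 14, 16, 16, 18, 18, 20, 21])
income_thousands = np.array([22, 27, 30, 35, 48, 52, 60, 58, 75, 80])

# Pearson correlation coefficient: how strongly the two variables
# move together linearly (-1 perfect negative, +1 perfect positive).
r = np.corrcoef(education_years, income_thousands)[0, 1]
print(f"Pearson r = {r:.3f}")
```

A value near +1 here would show up as a tight upward-sloping cloud on the scatter plot described above; note that the coefficient says nothing about causation or about any non-linear relationship between the variables.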
These assumptions only really hold within linear systems, but by imposing them we can say things about the macro system without actually getting our hands dirty and looking at what is really going on inside. We can say that the macro system will follow the law of large numbers and the central limit theorem. We can use mean field theory. We can make estimations, talk about the average normal person and so on. This is a very important and useful shortcut, but it has its limitations. We aren't going to go into any more of the details of this approach, but suffice to say linear systems theory works well for linear systems, that is to say systems that have a finite number of independent, homogeneous elements interacting in a well-defined fashion with a relatively low level of interconnectivity. This is often not what we see when dealing with social phenomena. Many social phenomena, such as whole economies, social institutions, cultures and human psychology to name just a few, are fundamentally complex in nature. By complex we mean that they consist of very many autonomous and diverse components that are highly interconnected and interdependent. In these complex systems the scientific underpinnings of our traditional formal approach begin to break down. And this leaves the social sciences somewhat divided in their response to the question of whether to go on using these formal methods, whose assumptions when applied to social systems are fundamentally flawed, or to abandon formal methods altogether. For example, we can see this divide between economics and sociology, where standard economics has fully embraced linear systems theory, giving it quite powerful formal mathematical models, but in order to do this it has to package up quite subtle and complex social phenomena into a relatively simple set of assumptions, leaving it subject to continuous critique surrounding these foundational assumptions.
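The statistical shortcut described above can be illustrated with a small simulation. Assuming IID draws from a uniform distribution (the specific sample sizes and seed below are arbitrary choices for demonstration), the law of large numbers shows up as the sample mean converging to the true mean, and the central limit theorem as the means of many samples clustering tightly around it:

```python
import random
import statistics

random.seed(42)  # fixed seed so the run is reproducible

# IID draws: every sample comes from the same uniform distribution
# on [0, 1) and is independent of all the others.
def sample_mean(n):
    return statistics.fmean(random.random() for _ in range(n))

# Law of large numbers: as n grows, the sample mean approaches
# the true mean of the distribution, which is 0.5 here.
small = sample_mean(10)
large = sample_mean(100_000)
print(f"mean of 10 draws: {small:.3f}, mean of 100,000 draws: {large:.3f}")

# Central limit theorem: the means of many IID samples of size 100
# cluster around 0.5 with a predictable spread, roughly
# sqrt(1/12)/sqrt(100) ≈ 0.029 for the uniform distribution.
means = [sample_mean(100) for _ in range(2_000)]
spread = statistics.stdev(means)
print(f"spread of sample means: {spread:.4f}")
```

This is exactly why the IID assumption matters: the moment the components become interdependent, as in the complex social systems discussed above, these convergence guarantees no longer apply.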
Much of sociology and the other social sciences feel this approach is throwing the baby out with the bathwater, and they continue to pursue their inquiry without the support of any real coherent formal system, which leaves certain doubts surrounding their status as a science, as formal languages are an integral part of the whole enterprise of science. And this is giving us what is called economic imperialism, where economics, the only social science that has a formal basis, is increasingly dominating the others. Complexity theory is fundamentally a set of formal models, so we'll just make a quick side note about formal methods before moving on. Formal languages are what make a scientific domain coherent and robust; without them the domain is essentially a descriptive science. As the scientist Ernest Rutherford once said, all science is either physics or stamp collecting. This is clearly a very arrogant statement, but there is some truth to it. Physics is by far the most robust and advanced domain of science, largely because it is directly supported by the sophisticated formal language of standard mathematics. The higher mathematics used in fundamental physics is not about the X's and Y's that you learnt in high school algebra. It's about fundamental and powerful concepts that describe patterns of organization in terms of symmetries, transformations and invariance. It is these very abstract and powerful concepts, captured within the language of higher mathematics, that give physicists the tools to tackle very difficult phenomena in a coherent fashion. The social sciences often lack these abstract formal methods that are powerful tools for solving difficult questions.
A formal language is what gives a scientific domain the capacity to speak with one voice; without the support of a formal language you end up with many different sub-domains speaking many different languages without the capacity to interrelate them, and when someone comes asking for the answer to some question you end up giving them a hundred different models. Over the past few decades we've seen the beginnings of a formal language for modeling the complex systems that social scientists study without resorting to reductionist methods. Complexity theory is based upon very abstract formal mathematical models, but probably not the kind you're used to, and we should be clear from the start that although a lot of complexity theory really originates in mathematics and physics, it's not another excuse for trying to reduce social life to little particles of matter that get moved around en masse by forces. It starts with the recognition that these reductionist methods have their limitations. So complexity theory starts with an alternative paradigm to that of analysis; this paradigm is really inherited from systems theory. Systems theory is based upon the process of reasoning called synthesis, which is the opposite of analysis and reductionism. This paradigm is referred to as holistic, meaning that it is characterized by the belief that the parts of something are intimately interconnected and explicable only by reference to the whole. Synthesis means the combination of components or elements to form a connected whole. It is a process of reasoning that describes an entity through the context of its relations and functioning within the whole system that it is a part of. Thus synthesis focuses on the relations between the elements, that is to say the way those elements are put together or arranged into a functioning entity. Within this paradigm we're trying to identify the complex relations within which an entity is embedded.
Its place and function within the whole. Within systems thinking, this context is considered the primary frame of reference for describing something. We are then not particularly interested in breaking things down and talking about the properties of the parts and their simple linear interactions. We are more interested in these interactions and what emerges out of them. Paradigms like this are always quite abstract, so let's take a quick example to solidify it. Let's say we're trying to understand the origins of the First World War. Well, from the analytical perspective we would talk about how the Archduke Ferdinand was assassinated in Sarajevo and how this event caused a reaction from Russia, which caused another reaction from Germany, which in turn caused England to react and so on. In this paradigm we would talk about the properties of the parts and the cause-and-effect interactions. Now, from a systems perspective we would be looking at the whole context, both in space and the process in time, the nexus of relations out of which this phenomenon emerged. We might then talk about how, through industrialization and nationalism, the international political environment within pre-war Europe self-organized into a critical state, and it was out of this whole context that we got the emergence of the First World War. The assassination did not cause the war; nothing directly caused the war. It was out of the non-linear interactions of many different factors that we got a critical state to the system, and out of this critical state we got the emergence of this phenomenon of the First World War. So this gives us some insight into this alternative paradigm, but how does this actually translate into models that we can use?
Complexity theory represents a combination of a number of different modeling frameworks that have developed in different areas in response to dealing with complexity, all of which have in common a focus on the interactions between parts and how these interactions give rise to emergent phenomena on the macro level. Agent-based modeling is one good example of this. Agent-based models are a class of computational models for simulating the actions and interactions of autonomous agents in order to try to model their effect on the system as a whole. As an example we could think about trying to model the spreading of some virus within a population. We have a traditional equation-based model called SIR which would describe this process in a top-down fashion, but we can also describe this with agent models, where we ascribe simple rules to the agents and then run the program to see what aggregate phenomena emerge from the bottom up. Another major modeling framework within complexity theory is that of network theory, which is focused on the connections between actors and how the structure of those connections affects the actors and the system as a whole. Network theory gives us a formal language to model such things as power and influence within social systems by looking at the structure of connections that surround an individual. It also gives us a language for talking about how things spread through a network. Non-linear systems theory is another major modeling framework that helps us in talking about the non-additive interactions between agents in space and over time: how, through these non-linear interactions of synergies and interference, we get the emergence of macro-level non-equilibrium phenomena that make the whole more or less than the sum of its parts.
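The bottom-up, agent-based version of the virus example above can be sketched in a few lines of Python. This is not the equation-based SIR model itself but an illustrative agent simulation; the population size, contact rate and transmission and recovery probabilities are all invented parameters. Each agent follows simple local rules, and the epidemic curve is an emergent, macro-level outcome:

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

# Illustrative parameters; all values are arbitrary assumptions.
N = 200          # population size
CONTACTS = 5     # random contacts per infected agent per step
P_INFECT = 0.05  # chance a contact transmits the infection
P_RECOVER = 0.1  # chance an infected agent recovers each step

# Agent states: 'S' susceptible, 'I' infected, 'R' recovered.
agents = ['I'] * 3 + ['S'] * (N - 3)

for step in range(100):
    # Local interactions: each infected agent meets a few random others.
    newly_infected = set()
    for i, state in enumerate(agents):
        if state == 'I':
            for j in random.sample(range(N), CONTACTS):
                if agents[j] == 'S' and random.random() < P_INFECT:
                    newly_infected.add(j)
    # Recovery: each infected agent may recover this step.
    for i, state in enumerate(agents):
        if state == 'I' and random.random() < P_RECOVER:
            agents[i] = 'R'
    # Apply the new infections.
    for j in newly_infected:
        agents[j] = 'I'

final_recovered = agents.count('R')
print(f"{final_recovered} of {N} agents were infected at some point")
```

Nothing in the agent rules mentions an epidemic curve or a final outbreak size; those macro-level patterns emerge from the interactions, which is exactly the bottom-up logic the agent-based framework is built on. Replacing the random mixing here with a fixed contact network would turn this into the kind of spreading-on-a-network question that network theory addresses.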
This language of feedback loops and chaos helps us in talking about non-equilibrium processes of change where the whole system moves rapidly in one direction, and this is just a quick sample of some of the topics we'll be covering during the course. Finally, we'll talk about the new set of practical methods and tools that complexity science uses. Complexity science is a science fundamentally based upon computation. The rise of computation within the social sciences is one of the quiet but major revolutions taking place within contemporary science, and I'll quote the social network scientist Duncan Watts as he describes this phenomenon: "Up until about 10 years ago it was impossible to observe these social interactions, and it is very, very hard to do science when you can't observe things. It is very hard to do science when you can't measure the things you're interested in, and what has changed in the past 10 years or so, and why it is so exciting for people like me to be at the intersection of social and computational science, is that the internet has really unveiled, has really made the invisible visible, has really given us the capacity to measure the interactions between hundreds of millions of people in real time and over extended periods of time. It feels like for many of us in the social sciences that we've stumbled upon our equivalent of the telescope, the device, the technology that made the invisible visible, and historically that has led to dramatic improvements in science." Until recently, the primary sources of data for social scientists were survey research, government statistics and one-off in-depth studies of particular people. The statistical databases of governments and the World Bank are full of information about individual people and their properties. They tell us almost nothing about the connections between those individuals, because up until very recently we didn't have the computational capacity to manage and utilize large complex databases of this kind.
But with the rise of the internet, and particularly online social networks, this is all changing. We're going from a limited amount of randomly selected historical data on individuals to a mass of real data about the connections between people, and this big data is said to be revolutionizing our insight into human interaction. The future of the social sciences has a lot to do with the new approaches that are arising from these new computational capabilities and data sources. With these new opportunities, for the first time we have the capacity to model societies not just in terms of their simple statistical interactions or aggregate behavior but instead in terms of the actual context. We have for the first time, in a rigorous way, the capacity to map and model context: the context of choice, the context of behavior, the complex interplay of a lot of different free parameters all at once. This has always been very difficult because of the lack of data and computational intractability. These new tools of computation and data sources are very important, but at the end of the day they are just tools. They will not in themselves help us solve difficult problems within social theory: age-old questions about the relationship between individual agency and social structure, questions about the exercising of social power, about the formation of the individual, about the rise and fall of civilizations. But with these new computational methods and a new set of sophisticated theoretical tools from complexity theory, we can apply them to see what fresh insight we can get on these perennial challenges within the social sciences. In this module we've been giving a quick overview of the application of complexity theory to the social sciences, what we call social complexity. We started off with a very broad discussion surrounding the scientific enterprise as we talked about paradigms in general and the Newtonian paradigm in particular.
We saw how it forms the basis for modern science and how this approach of reductionism gets translated into methodological individualism within the social sciences, an approach that is focused upon the properties of the components within the system and the linear cause-and-effect interactions between them, with the whole being a simple summation of its parts, allowing us to use statistical analysis and probability theory. We talked about how the basic assumptions underpinning our traditional formal approaches begin to fail when we start to deal with complex systems consisting of many autonomous, diverse components that are highly interconnected and interdependent, as is often the case within the social sciences. We briefly introduced complexity theory as an alternative approach to modeling these complex systems, an approach that is based upon a paradigm inherited from systems theory that uses synthetic reasoning instead of analysis. We briefly touched upon some of the major modeling frameworks that operate within this paradigm, including agent-based modeling, network theory and non-linear systems theory. Finally, we talked about how complexity science is based upon a new set of computational methods and how big data is expected to have a transformative effect on the social sciences in the coming decades.