Information technology is an integral part of the theory and science of complex systems. Due to their scale and number of components, it is essentially the only way through which we can access data, process it and infer patterns within these systems. The same is true of complex engineered systems: although they are much more than just information technology, they are also the product of it and virtually impossible without it. All of this technology that we have developed has to be managed, operated and maintained in some way, and information technology is today essentially the only method for managing the massive technological infrastructure of advanced economies. In the same way that researchers can't study and interact with vast networks without computation, it is becoming increasingly unviable for us to interact with these complex engineered systems without being enabled by information systems; the two are critically intertwined.

The information revolution is in many ways the backbone of the current rapid proliferation and transformation in our technology landscape. As we've already noted, the information revolution that started with the advent of the digital format, the microprocessor and the personal computer has today moved on, as we move up from the micro level of individual computerized devices to whole systems of people, technology and information. We may have a good grasp of the individual computerized components and their internal workings, but we have a very limited understanding of the systems that emerge out of the interaction between all of these different devices, people and physical technology. These are what we call information systems. When we scale the basic operations of computing, the storage, manipulation and exchange of information, up to the macro scale, data sets become big data: no longer a single file on your hard disk, but a cloud of data points streaming from a network of different sources. 
This data is often unstructured and noisy, such as the millions of images uploaded to the internet every day, the hashtags on social media or data from financial trading platforms. At this macro scale, computer programs become advanced analytics: the set of algorithms, including sophisticated statistical models, deep learning and other advanced data mining techniques, that reveal patterns in large data sets. We can't go into the details of how these algorithms work here, but at a very high level, they automate the process of turning data into information that can be acted upon. These machine learning and deep learning algorithms are cutting-edge technologies that have only really come of age in recent years, with companies like Google scrambling to get their hands on people with expertise in this area. We've never before had the capacity to automatically turn very large data sets into valuable information. This technology already provides solutions to many problems in image recognition, speech recognition and natural language processing, and it will very likely be the engine behind many innovative, groundbreaking technologies in the years to come. This is the forefront of the information revolution today, and it is still a radically disruptive force. The last technologies that we'll briefly mention are social networking and mobile computing. Social networking gives people a representation in this world of information systems and makes our actions explicit. Computing is becoming increasingly pervasive as the digital and physical worlds converge. Social networking and gamification are coming out of their box and will become increasingly a part of, and overlaid on top of, the physical world to give everything a social dimension. This again is very much cutting-edge technology, but there is growing research on the convergence of the internet of things and social networking to give us what is called the social network of things. 
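At the smallest possible scale, this business of turning raw data into actionable information can be illustrated with a toy sketch in Python. The readings and the two-sigma rule below are purely hypothetical, standing in for what real analytics pipelines do with far more sophisticated models over far larger data sets.

```python
import statistics

def flag_outliers(readings, z_threshold=2.0):
    """Return readings more than z_threshold standard deviations from the mean."""
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    return [x for x in readings if abs(x - mean) > z_threshold * stdev]

# A noisy sensor stream with one anomalous spike (illustrative values)
stream = [10.1, 9.8, 10.3, 10.0, 9.9, 25.0, 10.2, 9.7]
print(flag_outliers(stream))  # the spike at 25.0 is the only flagged value
```

The data (eight numbers) becomes information (one value that warrants action); advanced analytics automates this same move across millions of streams at once.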
All of these different technologies combine to give us the acronym SMAC, which stands for social, mobile, analytics and cloud. These technologies are currently at the forefront of shaping our IT infrastructure. The formal definition of an information system is the combination of users, technology and processes to complete a given goal. Information systems collect, store, process and exchange information. Today they are used in all forms of organization of any size, from enterprise and manufacturing information systems to transportation and urban information systems. Information systems serve a number of critical functions within complex engineered systems: basic control, automation and coordination between different systems. We'll discuss each of these functions separately. A control system is a specialized subsystem for controlling another system. A very high-level generic model of this would consist of a sensor for receiving information about the system being controlled and its environment, a logic unit that processes this information according to some set of instructions, and an actuator that executes the controller's instructions. In order to control a system, all of these elements need to be present and working together. How we control technology has of course evolved over time. With a hand tool like a shovel, all of these control functions are performed by the person operating the technology, who is also inputting the physical energy into the system. Industrial technologies and new energy sources removed humans from direct physical control over the system as it became mediated through mechanical levers. The electrical revolution gave us electrical interfaces, but these were still largely controlled by human operators. With the advent of information technology, basic control processes, such as those on production lines and in other industrial processes, have become automated. 
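The sensor-logic-actuator triad can be sketched as the simplest possible control loop, a bang-bang thermostat. The function below is the logic unit; the temperature argument stands in for the sensor reading and the returned value for the actuator command. All names and values are illustrative, not any particular controller's design.

```python
def thermostat_step(temperature, setpoint, heater_on, hysteresis=0.5):
    """Logic unit: decide the actuator state from one sensor reading."""
    if temperature < setpoint - hysteresis:
        return True   # actuator: switch heater on
    if temperature > setpoint + hysteresis:
        return False  # actuator: switch heater off
    return heater_on  # within the deadband: hold the current state

# One pass of the control loop: sensor reading -> logic -> actuator command
print(thermostat_step(temperature=18.2, setpoint=20.0, heater_on=False))  # True
```

The hysteresis band keeps the actuator from chattering on and off around the setpoint, which is why all three elements, sensing, logic and actuation, have to work together for control to be stable.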
There are only so many technologies that a single human can interact with and manage. At a relatively low level of technological saturation, such as in pre-modern societies, we can interact with and directly control all the technologies we own. But as we increase the number of technologies and the complexity of the technological infrastructure, this is no longer possible. Developing the large infrastructure systems of the industrial age required a certain level of automation, allowing any single individual to be enabled by many more, and more diverse, technologies. It was no longer possible for us to manually interact with, directly control or even understand all these technologies. The more technology we have and the more complex our infrastructure becomes, the more we need information to interact with it and manage it. The advent of digital computing and advanced telecommunications is driving a new level of automation that is required to manage the ever-growing complexity of the technological infrastructure that supports post-industrial societies. A single premium-class automobile may contain close to a hundred million lines of software code executed on 70 to 100 electronic control units networked throughout the body of the car. The physical operations of whole mass transit rail systems, such as that of Dubai, have now become automated, as have many other basic control processes: manufacturing processes in factories, switching in telephone networks, the steering and stabilization of ships and aircraft, and many other applications. General-purpose process control computers have increasingly replaced standalone controllers, with a single computer able to perform the operations of hundreds of controllers. A process control computer can process data from a network of PLCs, instruments and controllers in order to implement control of many individual variables. 
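As a rough sketch of how one general-purpose process computer can stand in for many standalone controllers, the loop below scans a table of control channels and computes a simple proportional correction for each in a single pass. The channel names, readings and gains are invented for illustration; a real system would poll them from networked PLCs and run far richer control logic.

```python
# Hypothetical scan cycle: one process computer services many control loops.
channels = {
    "boiler_temp":   {"setpoint": 80.0, "reading": 78.5, "gain": 0.4},
    "tank_level":    {"setpoint": 3.0,  "reading": 3.4,  "gain": 1.2},
    "line_pressure": {"setpoint": 5.5,  "reading": 5.5,  "gain": 0.8},
}

def scan_cycle(channels):
    """Compute a proportional correction for every channel in one pass."""
    commands = {}
    for name, ch in channels.items():
        error = ch["setpoint"] - ch["reading"]
        commands[name] = ch["gain"] * error  # P-only control, for brevity
    return commands

print(scan_cycle(channels))
```

Adding a hundred more loops is just a hundred more rows in the table, which is essentially why one computer can absorb the work of hundreds of dedicated controllers.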
They can also analyze data, create real-time graphic displays for operators and run reports for engineers and management. For example, the Union Pacific Railroad placed infrared thermometers, microphones and ultrasound scanners alongside its tracks. These sensors scan every train as it passes and send readings to the railroad's data center, where pattern-matching software identifies equipment at risk of failure. Increasingly, these systems will be connected to cloud platforms in order to run the kind of advanced analytics we discussed above, with major corporations such as General Electric and Cisco currently investing heavily in this technology. Information systems also play an increasingly important role in coordination, load balancing and optimization between disparate systems. With pervasive networking we can sense our world like never before and get real data about how these things are performing, and there is a vast amount of space for optimization at both the micro level and the macro level. It is estimated that we waste somewhere between 40 and 70% of electricity on the grid worldwide. The cost of traffic gridlock in Europe is estimated to be a few percentage points of the entire GDP. And reports have estimated that over 30% of traffic in a city is caused by drivers searching for a parking spot. All of these different systems could be greatly optimized through common protocols and platforms that enable information exchange and coordination. As previously mentioned, the industrial model of organization that underpins the infrastructure we have inherited was very much domain focused. We have departments for the domain of energy, departments for the domain of water, departments for transportation and so on. What we don't have is departments for processes that cut across domains, and thus our infrastructure systems may be somewhat optimized in isolation, but they are currently not optimized on the aggregate level. 
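A heavily simplified sketch of the kind of pattern matching involved in the trackside example: fit a trend line to successive scans of one component and flag sustained drift before outright failure. The readings and the alert threshold below are hypothetical, not Union Pacific's actual criteria.

```python
def trend_slope(readings):
    """Least-squares slope of readings over their scan index (a drift detector)."""
    n = len(readings)
    mean_x = (n - 1) / 2
    mean_y = sum(readings) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(readings))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Bearing temperatures from successive trackside scans (hypothetical values)
healthy = [41.0, 40.6, 41.2, 40.8, 41.1]
failing = [41.0, 42.5, 44.1, 45.8, 47.3]

ALERT_SLOPE = 0.5  # degrees per scan; an assumed maintenance threshold
for name, scans in [("healthy", healthy), ("failing", failing)]:
    print(name, "at risk" if trend_slope(scans) > ALERT_SLOPE else "ok")
```

Both bearings may be within absolute limits on any single scan; it is the trend across scans, only visible once readings flow to a central data center, that reveals the one at risk.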
Getting these different systems to talk to each other is key to developing sustainable systems through energy efficiency and recycling. Smart cities are good examples of this, where different systems have to work together to actually make the whole system smarter. For example, when an emergency is reported in the city of Barcelona, Spain, the approximate route of the emergency vehicle is entered into the traffic light system, setting all the lights to green as the vehicle approaches through a mix of GPS and traffic management software, allowing emergency services to reach the incident without delay. This is the kind of complex cross-domain coordination that is required to make the whole system more efficient and smarter. Because of the siloed nature of our industrial systems, they are not optimized for how end users actually use them, that is, as part of processes, and this is where social networking comes into the mix. By having a digital presence and making our activities explicit, we can begin to design systems for aggregating different services and coordinating them along the processes that people are actually engaged in, and eventually do this in real time. Once a process is made explicit, different systems can be notified and begin to coordinate their activities to enable the process to take place in a seamless fashion. These types of adaptive systems that operate in real time require a very different architectural paradigm, one that is called event-driven architecture. When supply chains or manufacturing processes become networked, they can respond to events occurring in other systems immediately. 
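The shape of event-driven architecture, in the spirit of the Barcelona example, can be sketched as a minimal publish/subscribe bus: one system publishes an event and any number of other systems react to it independently, with none of them polling the others. The event names, intersections and handlers here are all invented for illustration.

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe bus: systems react to events as they occur."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self.handlers[event_type]:
            handler(payload)

# Hypothetical cross-domain coordination: the dispatch system publishes a
# route, and other city systems react to it without knowing about each other.
bus = EventBus()
lights = {"A1": "red", "B7": "red", "C3": "red"}
bus.subscribe("emergency.route", lambda route: lights.update((i, "green") for i in route))
bus.subscribe("emergency.route", lambda route: print("transit holds trams near", route))

bus.publish("emergency.route", ["A1", "B7"])
print(lights)  # A1 and B7 pre-empted to green; C3 untouched
```

Adding a new consumer, say a parking system clearing spaces near the incident, means one more subscription and no changes to the dispatch system, which is what makes this paradigm suit cross-domain coordination.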
When prices on the electrical grid can adapt in real time to supply and demand, things become more contingent upon time and upon events that play out through processes. This is not just about making things faster: when a system reaches a critical level of dynamism, the whole paradigm changes towards one that is process orientated, with systems driven by event signals in time. The aim of predictive analytics is to find statistical patterns in these real-time data streams so that we are no longer reacting to things that have already occurred but can be preemptive, preventing them from occurring in the first place. Lastly, before wrapping up, we'll touch upon the subject of security, which is of course a major issue here. We should not be naive about the scale of the risk involved as our critical infrastructure becomes automated, networked and remotely controlled via common IP platforms. Researchers have demonstrated that a typical car's airbags, steering and brakes can all be hacked and controlled remotely for malicious ends. Control systems in nuclear power plants can be broken into, and with the rollout of IoT platforms, software will soon be permeating all types of technologies as our critical infrastructure becomes increasingly dependent upon it. We can think about security with respect to control in terms of either access to control or the use of control itself. Distributed systems like the internet and IoT drive a new form of security. Traditional security is built around having something to secure: some well-defined information or system that typically belongs to one organization, around which we can employ a professional IT security team to build a secure wall, because we know what is part of the system and what is not. 
But in this world of distributed systems like the internet, we're dealing with billions of devices that may belong to end users with little awareness of or concern for security, and these many exposed and vulnerable end-user devices can be harnessed for distributed attacks. In this way security can become a tragedy of the commons. It may seem of little consequence for me to leave the default password on my router, but when millions of other people do likewise, the net result can be a macro-scale security issue, with many devices vulnerable to being harnessed for attack. This is just one example of the nature of security within distributed systems, but as mentioned, security is more than just preventing attackers from breaking into a system; it is ultimately about the appropriate use of control and power, and with this next generation of information systems we're consolidating and handing over an extraordinary amount of power to automated algorithms. A system is only really in control when awareness, responsibility and power are all aligned. This means the exercising of control through a multi-tiered framework, with more intelligent and aware systems guiding systems that are lower in their capacity for information and knowledge processing. Whereas information and data may be growing at an exponential rate, this only works to make intelligence an increasingly scarce resource. Information technology commoditizes information and data, driving their value right down, but because of this it also increases the value of knowledge and intelligence, making them scarce resources. Wherever there is demand for a scarce resource, there is a hierarchy based upon access to that resource. This drives a new kind of hierarchical structure that is emerging out of the information revolution, captured in the acronym DIKW, which stands for data, information, knowledge and wisdom. 
Controlling these systems in a long-term, sustainable and secure way means understanding this hierarchy and building it into our systems of technology, so that this world of complex engineered systems that we're going into is governed and controlled by true knowledge and, ultimately, some form of wisdom. In summary then, the information systems that have developed over the past few decades are both an added source of complexity within our technologies and also the solution to complexity for end users. These information systems enable us to harvest vast amounts of data, harnessing it to coordinate and optimize systems. Through the convergence and integration of a number of technologies, such as cloud computing, analytics, pervasive sensing and social networking, we're reshaping our technology infrastructure to make it more adaptive, process-orientated, dynamic and real time. This gives us the capacity to greatly increase the efficiency of our systems of technology through automation and real-time coordination happening within a new event-driven architecture, but it also raises many security concerns that require intelligent design and management to achieve long-term sustainable solutions.