Now it's time to explore the new economic space that transformational technology creates. This is the fun part, where venture capitalists get to play after the transformative infrastructure has been deployed and the technology is ready for commercial prime time. In the aftermath of the great tech bubble, Brad DeLong of Berkeley had a lot of fun himself. When the bubble burst in 2000, the losses were huge, but the losses of investors in building the internet, as in building the railways a century before, were outweighed by the social returns from what those investors financed. As noted, when the companies that constructed the infrastructure (the railways, the electricity grid, the internet) went bust, no one pulled up the tracks or the dark fiber or tore down the power lines. But DeLong has another message too. It takes time to identify and exploit the financially and commercially valuable uses of the new technology, the killer apps. In the case of the railroads, it was mail-order retail, deployed almost 50 years after construction began on the first American line, the Baltimore and Ohio. Montgomery Ward and Sears Roebuck ran on Railway Express, a level of abstraction that insulated users from the underlying technology. All transformational technologies, very much including electricity and digital computing and communications, mature through such levels of abstraction, and the new economy emerges through trial-and-error experimentation, both with respect to different ways to implement the technology and with respect to the different applications it supports. Now, the innovations that really matter constitute general purpose technologies. Tim Bresnahan of Stanford explains that the key attribute of a GPT is the positive feedback between improvements in the core technology and the invention and deployment of novel application sectors, what he calls AS, and that feedback produces innovation complementarities. A single model of a GPT is inadequate.
Note the multiple dimensions of potential innovation: enhancements to the core technology driven by its internal logic, application-specific enhancements of the core technology, and new applications, obvious and not so obvious. So railroads were an application sector of the GPT steam power, and note that both the core technology and one or more application sectors may be the focus of speculation: companies generating and distributing electricity and companies driving radio wireless communications off that electricity infrastructure; digital communications and social media. Many alternative institutional structures complicate modeling the innovation process. Now, a critical feature of the deployment of GPTs has been the construction of networks, from waterways to the internet by way of telephone and telegraph lines and electricity grids and highways. Metcalfe's law says that the value of a network is a function of the number of nodes on that network. But the value of a network is not fully captured by Metcalfe's law. It is not merely a function of the number of nodes; the value is also a function of the applications riding on the network, such as mail-order retail or e-commerce. And the links between the GPT and the application sectors are various. In the history of industrialization, steam power is the prototypical GPT. Steam engines were first invented to pump water out of coal mines, and the diffusion process was slow. Eventually, the positive feedback between steam power and coal mining was profound. But first, there were critical limitations that especially blocked the development of steam-powered railroads. James Watt, the great inventor, and his effective venture capitalist, Matthew Boulton, used their patents to block enhancements to the core technology. Protecting intellectual property is not always socially optimal. The impact of a GPT across all the sectors of the economy is extremely heterogeneous.
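The point about network value can be made concrete in a minimal numeric sketch: a quadratic Metcalfe term alone, versus that term plus value from applications riding on the network. The functional forms and numbers below are illustrative assumptions, not taken from the lecture's sources.

```python
# Illustrative sketch (made-up numbers): network value under Metcalfe's law
# alone versus Metcalfe's law plus applications riding on the network.

def metcalfe_value(nodes: int) -> float:
    """Classic Metcalfe term: value proportional to possible pairwise links."""
    return nodes * (nodes - 1) / 2

def network_value(nodes: int, app_values: list[float]) -> float:
    """Augmented value: pairwise links plus application value that scales
    with the installed base (mail-order retail on the railways,
    e-commerce on the internet)."""
    return metcalfe_value(nodes) + nodes * sum(app_values)

links_only = metcalfe_value(1000)                # 1000 * 999 / 2 = 499500.0
with_apps = network_value(1000, [50.0, 120.0])   # adds 1000 * 170 in app value
print(links_only, with_apps)
```

The design point mirrors the lecture: node count alone understates value; each application multiplies the worth of the same physical network.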
I must say I like Bresnahan's use of the word interesting in the first line of this excerpt. Financial accounting software and diagnostic imaging systems are radically different application sectors. It's the interaction between these various application sectors and the core general purpose technology that generates increasing returns for society. Bresnahan then maps out the externalities created by the general purpose technology and the application sector investors. His symbols simplify the complexity. T stands for the technological level in the core and in each application sector. V is the total social return to innovation. IC denotes the innovation complementarities generated by the positive feedback of innovation at both the core and the sector. And lambda is the fraction of the value that is actually captured by the inventors, whether in the core or in any application sector. The subscripts G and A refer to general and application respectively. Now note, as Arrow and Nelson each stated, neither type of inventor can capture all the returns they create. And of course, neither can know in advance what the total return or their share in it will be. But the tendencies are well defined. Social increasing returns are generated by positive feedback between advances in the core GPT technology and in the application sectors. But there are externalities. Each application sector wants others to invest more than the return captured by the inventors in the application sector justifies. An increased capture of value by core innovators in the general purpose technology means either there will be excessive investment in the core, ahead of the absorptive capacity of the application sectors, or a reduced return to application-specific innovation and less effort applied to it. Now, critical to generating increasing returns is that improvement in the core GPT enables new application sectors. The ability to envision new markets matters, not just lower costs of implementing the technology.
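The notation just described can be collected in compact form. This is a sketch reconstructed from the lecture's verbal definitions, not Bresnahan's exact equations:

```latex
% T_G, T_A : technology levels in the core GPT and an application sector
% V(T_G, T_A) : total social return to innovation
% Innovation complementarities (IC): advances in one raise the
% marginal return to advances in the other
\[
  \frac{\partial^2 V}{\partial T_G \, \partial T_A} > 0
\]
% Private capture: inventors appropriate only fractions
% \lambda_G, \lambda_A < 1 of the value they create
\[
  V^{\text{private}} = \lambda_G V_G + \lambda_A V_A < V
\]
```

The wedge between private and social return in the second line is what drives the externality the lecture describes: each side under-invests relative to the social optimum.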
The scholars Helpman and Trajtenberg outlined the conditions for a new application sector to be defined and developed. A key factor is the number and difficulty of complementary inventions that are required, as well as the intensity of demand. Forecast errors about the trajectory of a new GPT are to be expected. Remember, the needed information is not distributed; the needed information does not yet exist. Note that at AT&T, the great telephone company, the early focus on hearing aids was not a pure accident. It was related to the fact that the mother and the wife of Alexander Graham Bell, the founding inventor, were both deaf, and he had actually made his living as an elocutionist teaching speech to deaf people. Now, the infrastructure that implements the old technology in a specific application sector can get in the way of deploying the new. The classic example is the belts and pulleys that transmitted power in a factory, first from the water wheel and then from the steam engine to the machines. When electricity generators and motors became available, at first they just replaced the steam engine. The productivity revolution in manufacturing that followed depended on two key complementary innovations. The electricity grid distributed power locally and eliminated the need for site-specific generators. And the unit drive motor embedded power in the machines themselves, so the production process could be reconfigured without tearing down the building. More generally, the impact of a GPT on the economy as a whole may well depend on its deployment as a system. Certainly this was true of electricity. And even if steam engines and computers were first deployed individually, networks of railways and of computers followed. In all this, it is critically important to remember that technology revolutions do not appear as one unified wave front. They diffuse over time.
There are no dates on the x-axis, the horizontal axis of this chart, but diffusion can take decades, even generations. A well-known wisecrack by Bob Solow, the great MIT economist, from 1987: you can see the computer age everywhere but in the productivity statistics. Well, you might have seen computers everywhere on the desks of tenured professors of economics at top universities, but they certainly had not become pervasive by 1987. And similarly, around 1900, electricity generators had not become pervasive. In this chart, track the median, the 50th percentile, versus the 90th percentile. And note that the first wave of adoption came before unit drive motors were widely disseminated. But compare this with the dissemination of IT, of computerization, through 2005. The steep increase for the 90th percentile to 20% of the capital stock maps to the tech bubble, of course. But the median, the 50th percentile, in 2005 was still very, very low, with only about 4% of total capital stock accounted for by digital machines. Effective deployment of a novel general-purpose technology requires investment in intangibles, training, for example, that are not measured. So GDP productivity numbers that do not take account of this necessary investment are understated. This analysis, however, does not explain the recent and continuing productivity slowdown, which I'll discuss in the next lecture, as the missing investment is much larger before 2004 than afterwards. The productivity J-curve here shows the initial retreat in measured productivity when resources have to be devoted to training the workforce on the new technology. Now, this message is reinforced from the perspective of Schumpeterian growth theory. During the period between the discovery of a new GPT and its ultimate implementation, national income will fall. Well, this is an important insight, but it is entirely focused on the supply side of the economy.
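The J-curve logic can be sketched numerically: when early intangible investment in training is expensed rather than capitalized, it subtracts from measured output while the intangible stock it builds raises true output only later. All parameters below are made-up illustrative assumptions, not the lecture's data.

```python
# Toy simulation of the productivity J-curve: front-loaded training costs
# depress measured productivity at first; the accumulated intangible stock
# then lifts it well above the pre-adoption baseline.

def measured_productivity(years: int = 10) -> list[float]:
    series = []
    intangible_stock = 0.0
    for t in range(years):
        training_invest = max(0.0, 5.0 - t)           # front-loaded training cost
        true_output = 100.0 + 4.0 * intangible_stock  # payoff from accumulated intangibles
        measured = true_output - training_invest      # investment expensed, not capitalized
        series.append(measured / 100.0)               # relative to pre-adoption baseline of 1.0
        intangible_stock += training_invest
    return series

prod = measured_productivity()
print(prod[0], prod[-1])  # starts below the 1.0 baseline, ends well above it
```

The dip exists only because the statistics expense the intangible investment; counting it as capital formation would erase the initial shortfall, which is the measurement point the lecture makes.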
How quantitatively significant is the shift of resources into research and development and training versus the demand-generating effect of deploying the new applications of the general-purpose technology? Railroads, electricity generating and distribution systems, computers and data networks have all involved massive amounts of investment, increasing aggregate demand. Attempts to quantify the economic effects of the American railroads go back to Robert Fogel, Nobel Prize winner, and his book Railroads and American Economic Growth, published in 1964. Fogel forced the railroads into a neoclassical model in which all that mattered was the reduction in transportation costs they achieved versus a hypothetical counter-history with no railroads but an expansion of canals and turnpikes, on the assumption that all resources would always be fully employed in the most efficient ways. Hornbeck and Rotemberg, in a very recent article, demonstrate that the impact of the railroads was an order of magnitude greater than Fogel estimated. The radical reallocation of manufacturing based on differential gains in market access drove an enormous, sustained increase in productivity, a phenomenon completely outside Fogel's analysis. And also note the very large discrepancy between social and private return, as in the article by Nick Bloom and his colleagues that we reviewed in lecture four. Of course, as also discussed in lecture four, the state can induce acceleration in the development and deployment of a new general purpose technology, as the U.S. Department of Defense did with digitalization. But the state can also induce the search for new application sectors through the use of sticks as well as carrots. In the first half of the 1960s, the Defense Department became much more concerned with efficiency in the defense supply chain. DoD forced competition among suppliers, who in turn reduced prices and managed inventories tightly.
The policy was designed and implemented by Kennedy's defense secretary, Robert McNamara, who came from the Ford Motor Company and brought with him a set of quantitatively driven whiz kids. When the lucrative military market tightened radically, the suppliers whom DoD had sponsored were driven to the emergent commercial markets, computers and telecommunications, and to alternative public sector customers, NASA and the Apollo program. They had to learn in real time, under intense financial pressure, to create markets. This was a real-world historical example of facing and managing market risk. Let's now revisit the basic models of creative destruction, Schumpeter Mark I and Mark II, through a remarkable survey conducted by a Duke University team that defined the division of innovative labor between innovating firms on the frontier and established firms with channels to the market. The question is, where do innovations come from? The availability of external sources of invention affects both the overall rate of invention and the extent of investment in invention within the firm. If inventions can be acquired at a fixed cost that is less than the marginal cost of internal invention, then, first, the total stock of invention increases and, second, external research and development replaces internal research and development at the margin. Here's the geometry. The demand for invention is a function of marginal revenue, here designated R. The supply of internal inventions is a function of the marginal cost of inventing, the heavy line designated C. External invention is available at a fixed cost. Q2 on the horizontal axis is the equilibrium rate of invention without external supply. Q3 is the greater equilibrium rate of invention with external supply, and Q2 minus Q1 represents the substitution of external for internal inventions. Now, this assumes a homogeneous content of invention between internal and external sources, and of course that may not be empirically correct.
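The geometry just described can be reproduced with simple linear curves. The coefficients below are illustrative assumptions; only the structure (marginal revenue R, internal marginal cost C, a constant external cost, and the quantities Q1, Q2, Q3) follows the lecture.

```python
# Numeric sketch of the invention-sourcing geometry: linear marginal
# revenue R(q) = a - b*q, rising internal marginal cost C(q) = c0 + c1*q,
# and external inventions available at a constant cost c_bar.

def invention_equilibria(a=100.0, b=1.0, c0=10.0, c1=2.0, c_bar=40.0):
    """Return (Q1, Q2, Q3):
    Q2: equilibrium without external supply, where R(q) = C(q)
    Q3: larger equilibrium with external supply, where R(q) = c_bar
    Q1: internal inventions retained, where C(q) = c_bar
    Q2 - Q1 is the substitution of external for internal invention."""
    q2 = (a - c0) / (b + c1)   # a - b*q = c0 + c1*q
    q3 = (a - c_bar) / b       # a - b*q = c_bar
    q1 = (c_bar - c0) / c1     # c0 + c1*q = c_bar
    return q1, q2, q3

q1, q2, q3 = invention_equilibria()
print(q1, q2, q3)  # with these assumed coefficients: 15.0 30.0 60.0
```

With any coefficients satisfying c0 < c_bar < R(0), the same qualitative result holds: total invention rises (Q3 > Q2) while internal invention shrinks (Q1 < Q2), exactly the two effects the lecture lists.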
But the survey that was conducted by this team at Duke University is quite fascinating. The survey explored the significance and sources of inventions during the years 2007-2009. It generated a 30% response rate, which is very high indeed. NOSI is an invention that's new to the firm or significantly improved. NTM is a much more important invention, one that's new to the market. NOSI minus NTM measures imitation. And the focal innovation from the survey is the single most important product innovation that the firm achieved. Large firms are either units of Fortune 500 companies or non-Fortune 500 companies with more than 1,000 employees. The key column to focus on is B. 18% of all firms introduced inventions that were new to the market, and 43% of large firms did. Focus on the row for all manufacturing, the row that's fourth from the bottom. 49% of all firms sourced inventions externally. Customers were the largest source, especially for firms that sell to other firms, but customer-sourced inventions are pervasive. They're cheap to develop and commercialize, but they're cheap, it would appear, because they represent incremental improvements to existing products. Clayton Christensen of the Harvard Business School, in his well-known book The Innovator's Dilemma, pointed out that such demands by customers for marginal improvements inhibit fundamental innovations. Henry Ford is supposed to have said, my customers wanted a faster horse; they didn't want an automobile. And Steve Jobs was notorious for saying, a lot of times people don't know what they want until you show it to them. The most valuable inventions in this survey come from tech specialists, and that supports the need to take seriously this Schumpeter Mark III, the model with the division of innovative labor at its core. But is outsourced R&D a perfect substitute for internal R&D? It may be in pharma and biotech.
For high-tech companies, including pharmaceuticals, market channels, not customers, are the most important source of external inventions. Market channels are defined to include consulting relationships with specialist firms, licensing of technology, and outright acquisition of innovative new companies. Finally, startups play a disproportionate role in this division of innovative labor. Startups accounted for 14% of the innovative activities of other established firms, while constituting only 2.5% of the sample. This is a roundabout, perhaps backdoor, way of validating the role of venture capital in the innovation economy: providing that reservoir of new inventions which, even if not successfully exploited by the startup itself, feeds the overall innovation system. In the next lecture, we're going to consider the maturation of the digital economy over recent years. It has been accompanied by very large increases in industrial concentration, persistent and pervasive acquisition of startups, especially by the digital giants, inequality returning to historic peaks and now perhaps even exceeding them, and an overall decline in American economic dynamism, along with intensified political polarization and paralysis. This lecture was fun. The next one, not so much.