Published on Oct 10, 2015
Measuring the impact of the information economy is difficult. By some intuitive measures, the power of information technology seems almost miraculous; by many orthodox measures, however, it seems tepid. We encountered this paradox in the 1980s, when Robert Solow famously noted that “You can see the computer age everywhere but in the productivity statistics.” Computers finally showed up in the data during the brief productivity boom of the late 1990s and early 2000s. The conventional view that median incomes have stagnated for the last four decades, however, runs counter to the presumption that Moore’s law, now 50 years old, should have substantially improved living standards.

In 2011, George Mason economist Tyler Cowen ignited the debate with his book The Great Stagnation, which argued that the Internet was a great source of “cheap fun” but not of jobs or incomes. Paul Krugman had famously predicted in 1998 that by 2005 we would view the Internet as no more important than the fax machine. With “The Demise of U.S. Economic Growth” in 2014, Northwestern’s Robert Gordon drew an even darker picture.

And yet, according to The Wall Street Journal, 66 American startup firms are worth more than a billion dollars. The market values of just seven publicly traded tech leaders — Apple, Google, Microsoft, Facebook, Oracle, Intel, and Amazon — total nearly $2.38 trillion, more than the entire value of the German or Australian stock markets. Some believe technology is so powerful that “robots are taking all the jobs.”

What is the true state of American technology and its impact across the economy? What are the prospects for American innovation? And what key public policy decisions will help define America’s technology future?