America Needs to Innovate Its Innovation Ecosystem
GUEST POST from Greg Satell
The world today just seems to move faster and faster all the time. From artificial intelligence and self-driving cars to gene editing and blockchain, it seems like every time you turn around, there’s some newfangled thing that promises to transform our lives and disrupt our businesses.
Yet an article published by a team of researchers in Harvard Business Review argues that things aren’t as they appear. They point out that total factor productivity growth has been depressed since 1970 and that recent innovations, despite all the hype surrounding them, haven’t produced nearly the impact of those earlier in the 20th century.
The truth is that the digital revolution has been a big disappointment and, more broadly, technology and globalization have failed us. However, the answer won’t be found in snazzier gadgets or some fabled “Golden Era” of innovation from years long past. Rather, we need to continually innovate how we innovate to solve problems that are relevant to our future.
The Productivity Paradox, Then and Now
In the 1970s and 80s, business investment in computer technology was increasing by more than 20% per year. Strangely, though, productivity growth declined over the same period. Economists found this turn of events so bizarre that they called it the “productivity paradox” to underline their confusion.
Yet by the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial first-mover advantage would lead to market dominance. The mystery of the productivity paradox, it seemed, had been solved. We just needed to wait for technology to hit critical mass.
Yet by 2004, productivity growth had fallen once again and has not recovered since. Today, more than a decade later, we’re in the midst of a second productivity paradox, just as mysterious as the first one. New technologies like mobile computing and artificial intelligence are there for everyone to see, but they have done little, if anything, to boost productivity.
Considering the rhetoric of many of the techno-enthusiasts, this is fairly shocking. Compare the meager eight years of elevated productivity that digital technology produced with the 50-year boom created in the wake of electricity and internal combustion, and it’s clear that the digital economy, for all the hype, hasn’t achieved as much as many would like to think.
Are Corporations to Blame?
One explanation that the researchers give for the low productivity growth is that large firms are cutting back on investment in science. They explain that since the 1980s, a “combination of shareholder pressure, heightened competition, and public failures led firms to cut back investments in science” and point to the decline of Bell Labs and Xerox PARC as key examples.
Yet a broader analysis tells a different story. Yes, Bell Labs and Xerox PARC still exist but are only shadows of their former selves; others, however, such as IBM Research, have expanded their efforts. Microsoft Research, established in 1991, does cutting-edge science. Google runs a highly innovative science program that partners with researchers in the academic world.
So anecdotally speaking, the idea that corporations haven’t been investing in science seems off base. In fact, the numbers tell an even stronger story. Data from the National Science Foundation show that corporate research has increased from roughly 40% of total R&D investment in the 1950s and 60s to more than 60% today, while overall R&D spending has also risen over time.
Also, even where corporations have cut back, new initiatives often emerge. Consider the DuPont Experimental Station, which, in an earlier era, gave birth to innovations such as nylon, Teflon and neoprene. In recent years, DuPont has cut back on its own research, but the facility, which still employs 2,000 researchers, is also home to the Delaware Incubation Space, which incubates new entrepreneurial businesses.
The Rise of Physical Technologies
One theory about the productivity paradox is that investment in digital technology, while significant, is simply not big enough to move the needle. Even today, at the height of the digital revolution, information and communication technologies only make up about 6% of GDP in advanced economies.
The truth is that we still live in a world largely made up of atoms, not bits, and we continue to spend most of our money on what we live in, ride in, eat and wear. If we expect to improve productivity growth significantly, we will have to do it in the physical world. Fortunately, there are two technologies that have the potential to seriously move the needle.
The first is synthetic biology, driven largely by advances in gene editing such as CRISPR, which have dramatically lowered costs while improving accuracy. In fact, over the last decade, efficiency gains in gene sequencing have far outpaced Moore’s Law. These advances have the potential to drive important productivity gains in healthcare, agriculture and, to a lesser extent, manufacturing.
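To put a rough number on what “outpacing Moore’s Law” means, here is a minimal back-of-the-envelope sketch. The sequencing cost figures are illustrative assumptions based on widely cited estimates (roughly $10 million per human genome in 2007 versus about $1,000 by 2015), not data from the article:

```python
# Illustrative comparison with assumed figures (not from the article):
# Moore's Law is taken as a doubling of compute per dollar roughly every
# two years; sequencing costs are approximate, widely cited estimates.

def annual_improvement_factor(start_cost: float, end_cost: float, years: int) -> float:
    """Implied multiplicative improvement in cost-efficiency per year."""
    return (start_cost / end_cost) ** (1 / years)

moores_law = 2 ** (1 / 2)  # doubling every 2 years -> ~1.41x per year
sequencing = annual_improvement_factor(10_000_000, 1_000, 2015 - 2007)  # ~3.2x per year

print(f"Moore's Law:     ~{moores_law:.2f}x per year")
print(f"Gene sequencing: ~{sequencing:.2f}x per year")
```

Under these assumptions, sequencing efficiency improved at roughly twice the annual rate implied by Moore’s Law, which is what “far outpaced” looks like in practice.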
The second nascent technology is a revolution in materials science. Traditionally a slow-moving field, it has been transformed over the past decade as improved simulation techniques and machine learning have made materials discovery dramatically more efficient, which may have a tremendous impact on manufacturing, construction and renewable energy.
Yet none of these gains are assured. To finally break free of the productivity paradox, we need to look to the future, not the past.
Collaboration is the New Competitive Advantage
In 1900, General Electric established the first corporate research facility in Schenectady, New York. Later came similar facilities at leading firms such as Kodak, AT&T and IBM. At the time, these were some of the premier scientific institutions in the world, but they would not remain so.
In the 1920s, new academic institutions, such as the Institute for Advanced Study, as well as the increasing quality of American universities, became an important driver of innovation. Later, in the 1940s, 50s and 60s, federal government agencies such as DARPA, the NIH and the national labs became hotbeds of research. More recently, the Silicon Valley model of venture-funded entrepreneurship has risen to prominence.
Each of these did not replace what came before, but added to it. As noted above, we still have excellent corporate research programs, academic labs and public scientific institutions, as well as an entrepreneurial investment ecosystem that is the envy of the world. Yet none of these will be sufficient for the challenges ahead.
The model that seems to be taking hold now is that of consortia, such as JCESR in energy storage, the Partnership on AI for cognitive technologies and the Manufacturing USA institutes, which bring together diverse stakeholders to drive advancement in key areas. Perhaps most conspicuously, unprecedented collaboration sparked by the Covid-19 crisis has allowed us to develop therapies and vaccines faster than previously thought possible.
Most of all, we need to come to terms with the fact that the answers to the challenges of the future will not be found in the past. The truth is that we need to continually innovate how we innovate if we expect to ever return to an era of renewed productivity growth.
— Article courtesy of the Digital Tonto blog