Strategy for a Post-Digital World
GUEST POST from Greg Satell
For decades, the dominant view of strategy was based on Michael Porter’s ideas about competitive advantage. In essence, he argued that the key to long-term success was to dominate the value chain by maximizing bargaining power over suppliers and customers while minimizing threats from new market entrants and substitute goods.
Yet digital technology blew apart old assumptions. As technology cycles began to outpace planning cycles, traditional firms were often outfoxed by smaller competitors that were faster and more agile. Risk-averse corporate cultures needed to learn how to “fail fast” or simply couldn’t compete.
Today, as the digital revolution comes to an end, we will need to rethink strategy once again. Increasingly, we can no longer just move fast and break things; we will have to learn to prepare rather than merely adapt, to build deep collaborations, and to drive skills-based transformations. Make no mistake: those who fail to make the shift will struggle to survive.
Learning to Prepare Rather Than Racing to Adapt
The digital age was driven, in large part, by Moore’s Law. Every 18 months or so, a new generation of chips would come out of the fabs that was twice as powerful as what came before. Firms would race to leverage these new capabilities and transform them into actual products and services.
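The arithmetic of that cadence is worth making concrete. Below is a back-of-the-envelope sketch in Python; the 18-month doubling period comes from the text above, while the specific time horizons are illustrative assumptions.

    # Back-of-the-envelope arithmetic for the Moore's Law cadence
    # described above: capability doubles roughly every 18 months.
    def moores_law_factor(years: float, doubling_months: float = 18.0) -> float:
        """Capability multiple after `years` of steady doubling."""
        return 2 ** (years * 12 / doubling_months)

    # Illustrative horizons (assumptions, not from the article):
    for years in (3, 9, 15):
        print(f"{years:>2} years -> ~{moores_law_factor(years):,.0f}x as powerful")

Fifteen years of that cadence compounds to roughly a thousandfold improvement, which is why planning cycles could never keep pace.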
That’s what made agility and adaptation key competitive attributes over the past few decades. When the world changes every 18 months, you need to move quickly to leverage new possibilities. Today, however, Moore’s Law is ending and we’ll have to shift to new architectures, such as quantum, neuromorphic and, possibly, biological computers.
Yet the shift to this new era of heterogeneous computing will not be seamless. Instead of one fairly simple technology based on transistors, we will have multiple architectures that operate on very different logical principles. These will need new programming languages and will be applied to very different problems than digital computers have solved, as the sketch below suggests.
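To give a feel for just how different one of those architectures is, here is a minimal sketch using Qiskit, IBM’s open-source Python toolkit for quantum programming (IBM’s quantum effort is discussed below). It builds a two-qubit entangled state, an operation with no classical equivalent; the example is illustrative, not a production workload.

    # A minimal quantum circuit in Qiskit (pip install qiskit).
    # Entangling two qubits has no analogue in classical, transistor-based logic.
    from qiskit import QuantumCircuit

    qc = QuantumCircuit(2, 2)     # two qubits, two classical result bits
    qc.h(0)                       # put qubit 0 into superposition
    qc.cx(0, 1)                   # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])    # measurement collapses both together

    print(qc.draw())              # render the circuit as ASCII art

Even in this toy example, the programmer reasons about superposition and entanglement rather than loops and branches, which is the sense in which the logical principles differ.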
Another shift will be from bits to atoms, as fields such as synthetic biology and materials science advance exponentially. As our technology becomes vastly more powerful, increasingly serious ethical concerns arise. We will have to come to some consensus on issues like how much accountability a machine should bear and to what extent we should alter the nature of life.
If there is one thing the Covid-19 crisis has shown, it is that if you don’t prepare, no amount of agility will save you.
Treating Collaboration as a New Competitive Advantage
In 1980, IBM was at an impasse. Having already missed the market for minicomputers, the company watched a new market for personal computers emerge. So its leadership authorized a team to set up a skunk works in Boca Raton, FL. A year later, the company would bring the PC to market and change computing history.
So, it’s notable that IBM is taking a very different approach to quantum computing. Rather than working in secret, it has set up its Q Network of government agencies, academic labs, customers and start-ups to develop the technology. The reason? Quantum computing is far too complex for any one enterprise to pursue on its own.
“When we were developing the PC, the challenge was to build a different kind of computer based on the same technology that had been around for decades,” Bob Sutor, who heads up IBM’s Quantum effort, told me. “In the case of quantum computing, the technology is completely different and most of it was, until fairly recently, theoretical,” he continued. “Only a small number of people understand how to build it. That requires a more collaborative innovation model to drive it forward.”
It’s not just IBM either. We’re seeing similar platforms for collaboration at places like the Manufacturing Institutes, JCESR and the Critical Materials Institute. Large corporations, rather than trying to crush startups, are creating venture funds to invest in them. The truth is that the problems we need to solve in the post-digital age are far too complex to go it alone. That’s why today it’s not enough to have a market strategy; you need an ecosystem strategy.
Again, the Covid-19 crisis is instructive, with unprecedented collaborative efforts driving breakthroughs.
Driving Skills-Based Transformations
In the digital era, incumbent organizations needed to learn new skills. Those that mastered disciplines such as lean manufacturing, design thinking, user-centered design and agile development enjoyed a significant competitive advantage. Unfortunately, many firms still struggle to deploy critical skills at scale.
As digital technology enters an accelerated implementation phase, the need to deploy these skills at scale will only increase. You can’t expect to leverage technology without empowering your people to use it effectively. That’s why skills-based transformations have become every bit as important as strategic or technology-driven transformations.
As we enter the new post-digital era, that need will only grow. Digital skills, such as basic coding and design, are relatively simple; a reasonably bright high school student can become proficient in a few months. As noted above, however, the skills needed for this new era will be far more varied and complex.
To be clear, I am not suggesting that everybody will need deep knowledge of things like quantum mechanics, neurology or genomics a decade from now, any more than everybody needs to write code today. However, we will increasingly have to collaborate with experts in those fields, and that requires at least a basic understanding of their domains.
Making the Shift from Disrupting Markets to Pursuing Grand Challenges
The digital economy was largely built on disruption. As computer chips became exponentially faster and cheaper, innovative firms could develop products and services that displaced incumbent industries. Consider that a basic smartphone today can replace a bundle of technologies, such as video recorders, GPS navigators and digital music players, that would have cost hundreds of thousands of dollars when they were first introduced.
This displacement process has been highly disruptive, but there are serious questions about whether it’s been productive. In fact, for all the hype around digital technology “changing the world,” productivity has been mostly depressed since the 1970s. In some ways, such as mental health and income inequality, we are considerably worse off than 40 or 50 years ago.
Yet the post-digital era offers us a much greater opportunity to pursue grand challenges. Over the next few decades, we’ll be able to deploy far more powerful technologies against problems like cancer, aging and climate change. It is, in the final analysis, these physical-world applications that can not only change our lives for the better, but open up massive new markets.
The truth is that the future tends to surprise us and nobody can say for sure what the next few decades will look like. Strategy, therefore, can’t depend on prediction. However, what we can do is prepare for this new era by widening and deepening connections throughout relevant ecosystems, acquiring new skills and focusing on solving meaningful problems.
In the face of uncertainty, the best way to survive is to make yourself useful.
— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay