We Need a More Biological View of Technology
GUEST POST from Greg Satell
It’s no accident that Mary Shelley’s novel, Frankenstein, was published in the early 19th century, at roughly the same time as the Luddite movement was gaining momentum. It was in that moment that people first began to take stock of the technological advances that brought about the first Industrial Revolution.
Since then, we seem to have oscillated between techno-utopianism and dystopian visions of machines gone mad. For every “space odyssey” promising an automated, enlightened future, there seems to be a “Terminator” series warning of our impending destruction. Neither scenario has ever come to pass, and it is unlikely that either ever will.
What both the optimists and the Cassandras miss is that technology is not something that exists independently of us. It is, in fact, intensely human. We don’t merely build it, but continue to nurture it through how we develop and shape ecosystems. We need to go beyond a simple engineering mindset and focus on a process of revealing, building and emergence.
1. Revealing
World War II brought the destructive potential of technology to the fore of human consciousness. As deadly machines ravaged Europe and bombs of unimaginable power exploded in Asia, the whole planet was engulfed in a maelstrom of human design. It seemed that the technology we had built had become a modern version of Frankenstein’s monster, destined from the start to turn on its master.
Yet the German philosopher Martin Heidegger saw things differently. In his 1954 essay, The Question Concerning Technology, he described technology as akin to art, in that it reveals truths about the nature of the world, brings them forth and puts them to some specific use. In the process, human nature and its capacity for good and evil are also revealed.
He offers the example of a hydroelectric dam, which uncovers a river’s energy and puts it to use making electricity. In much the same sense, Mark Zuckerberg did not so much “build” a social network at Facebook, but took natural human tendencies and channeled them in a particular way. That process of channeling, in turn, reveals even more.
That’s why, as I wrote in Mapping Innovation, innovation is not so much about coming up with new ideas as about identifying meaningful problems. It’s through exploring tough problems that we reveal new things, and those new things can lead to important solutions. Not all who wander are lost.
2. Building
The concept of revealing would seem to support the view of Shelley and the Luddites. It suggests that once a force is revealed, we are powerless to shape its trajectory. J. Robert Oppenheimer, upon witnessing the world’s first nuclear explosion as it shook the plains of New Mexico, expressed a similar view. “Now I am become Death, the destroyer of worlds,” he said, quoting the Bhagavad Gita.
Yet in another essay, Building Dwelling Thinking, Heidegger explains that what we build for the world depends heavily on our interpretation of what it means to live in it. The relationship is, of course, reflexive. What we build depends on how we wish to dwell, and that act, in and of itself, shapes how we build further.
Again, Mark Zuckerberg and Facebook are instructive. His insight into human nature led him to build his platform on what he saw as The Hacker Way and to resolve to “move fast and break things.” Unfortunately, that approach left his enterprise highly vulnerable to schemes by actors such as Cambridge Analytica and the Russian GRU.
Yet technology is not, by itself, deterministic. Facebook is, to a great extent, the result of conscious choices that Mark Zuckerberg made. If he had had a different set of experiences from those of a young, upper-middle-class kid who had never encountered a moment of true danger in his life, he might have been more cautious and chosen differently.
History has shown that those who build powerful technologies can play a vital role in shaping how they are used. Many of the scientists of Oppenheimer’s day became activists, preparing a manifesto that highlighted the dangers of nuclear weapons and helped lead to the Partial Test Ban Treaty. In much the same way, the Asilomar Conference, held in 1975, led to important constraints on recombinant DNA research.
3. Emergence
No technology stands alone; each combines with other technologies to form systems. That’s where things get confusing, because when technologies combine and interact they become more complex. As complexity theorist Sam Arbesman explained in his book, Overcomplicated, this happens because of two forces inherent to the way that technologies evolve.
The first is accretion. A product such as an iPhone represents the accumulation of many different technologies, including microchips, programming languages, gyroscopes, cameras, touchscreens and lithium-ion batteries, just to name a few. As we figure out more tasks an iPhone can perform, more technologies are added, building on each other.
The second force is interaction. Put simply, much of the value of an iPhone is embedded in how it works with other technologies to make tasks easier. We want to use it to access platforms such as Facebook to keep in touch with friends, Yelp so that we can pick out a nice restaurant where we can meet them and Google Maps to help us find the place. These interactions, combined with accretion, create an onward march towards greater complexity.
It is through ever-increasing complexity that we lose control. Leonard Read pointed out in his classic essay, I, Pencil, that even an object as simple as a pencil is far too complex for any single person to produce alone. A smartphone—or even a single microchip—is exponentially more complex.
People work their entire lives to become experts on even a minor aspect of a technology like an iPhone, a narrow practice of medicine or an obscure facet of a single legal code. As complexity increases, so does specialization, making it even harder for any one person to see the whole picture.
Shaping Ecosystems And Taking A Biological View
In 2013, I wrote that we are all Luddites now, because advances in artificial intelligence had become so powerful that anyone who wasn’t nervous didn’t really understand what was going on. Today, as we enter a new era of innovation and technologies become infinitely more powerful, we are entering a new ethical universe.
Typically, the practice of modern ethics has been fairly simple: don’t lie, cheat or steal. Yet with many of our most advanced technologies, such as artificial intelligence and genetic engineering, the issue isn’t so much doing the right thing as figuring out what the right thing is when the issues are novel, abstruse and far-reaching.
What’s crucial to understand, however, is that it’s not any particular invention but ecosystems that create the future. The Luddites were right to fear textile mills, which did indeed shatter their way of life. Yet the mill was only one technology; when it was combined with other advances, such as improvements in agriculture, labor unions and modern healthcare, lives greatly improved.
Make no mistake, our future will be shaped by our own choices, which is why we need to abandon our illusions of control. We need to shift from an engineering mindset, in which we try to optimize for a limited set of variables, to a more biological view, growing and shaping ecosystems of talent, technology, information and cultural norms.
— Article courtesy of the Digital Tonto blog