Pattern #3



Appropriate Innovation v1.0

We are nature’s creativity on speed. Something is “appropriate” when it fits. To be more a blessing than a curse, our creativity needs to fit within natural constraints and the wisdom embodied in and promoted by these patterns. So create benign but effective ways to wisely monitor our powerful collective creativity.

Related: 17 Deep Time Stewardship, 26 Full Cost Accounting, 33 Iteration, 40 Nature First, 51 Restrained Liberty, 60 Systems Thinking, 69 Wise Use of Uncertainty

Go deeper …

This is an edited version of the video on this page.

This pattern comes out of an awareness of the trade-offs associated with the technological breakthroughs that we’ve been making for the last 50-100 years – a development that’s accelerating.

Innovation is currently held up as an unqualified good. But when you look at some of these innovations – particularly when they’re being hyped and few people are seriously considering their potential downsides – it becomes obvious that we are creating an increasing capacity to generate collective disaster.

Bill Joy – one of the creators of Java and one of the gurus of the Internet – wrote an article for Wired magazine in 2000 entitled “Why the Future Doesn’t Need Us”. In it he describes how advances in biotech, nanotech, computing power and robotics will give individuals or small groups the capacity to create self-replicating entities that can consume or destroy us or the environments we depend upon. And when (not if) we reach that point, it is hard to imagine how human extinction could be avoided, whether on purpose or by accident.

There are a number of people who would love to eliminate humanity, since they think of us (with some justification) as a cancer on the earth. Imagine giving them the power Bill Joy was referring to. Or just think about people futzing in their garages with the power to genetically engineer viruses…

We have extended our power far beyond our naturally evolved cognitive systems and ways of responding to the world around us. We now operate at microscopic, atomic and planetary levels. We are fiddling around there and empowering ourselves to do so more simply, more efficiently, more cheaply, and with fewer people.

So now our creativity – our capacity for innovation – is itself an issue in our collective survival. In one of the other patterns – Nature First – a key question is, “What is nature telling us?” We think we’re independent from nature but we’re not, and we need to take into account the potential downsides of this unbelievable power we’re accumulating as a collective civilization.

Now, appropriate means “it fits”. Nature is about fit. Evolution is about fit. Ecology is about how things fit together into coherent wholes. Natural selection is about removing things that don’t fit. To be more of a blessing than a curse, our creativity needs to fit within natural constraints. Nature tells us where we shouldn’t go. We need to really measure our creativity in the light of that and constrain our urge to learn and do simply for the sake of learning and doing – or worse, for the sake of profit, power, or comfort and convenience.

We also need to constrain our creative power within the wisdom embodied in and promoted by these patterns. These patterns try to offer guidelines for creating systems that could – among many other things – monitor our innovative impulses. If we don’t create ways to wisely monitor our collective creativity and our collective power, we will almost certainly destroy ourselves. It is all but inevitable, even though it is hard to see. We will become more and more powerful – and so will our creations (like artificially intelligent robots). From The Sorcerer’s Apprentice to Frankenstein and countless science fiction tales, there are many archetypal cautionary stories that warn us about our technological and creative capacities. These stories are facets of our collective intelligence and wisdom, our collective effort to come to terms with our role in the world.

We need to think twice – together and effectively – about the immense trade-offs we face by freely exercising our collective brilliance and power. This may not be easy to handle. But clearly it is part of taking into account what needs to be taken into account for long-term broad benefit.

Video Introduction (7 min)

Examples and Resources

A key example of monitoring for appropriate innovation is the precautionary principle. It says that a technology should not be applied in any broad or potentially risky way until it is proven benign. The precautionary principle is an extremely conservative one, very different from the progressive principle that says we are – and always should be – developing and advancing. Everything is up and up and up all the time, which is our civilization’s bias at the moment. So the precautionary principle is understandably resisted by ambitious technologists. And it is actually very hard to apply in a broadly collective way. If the U.S. adopted the precautionary principle, what about China, what about Al Qaeda? How do you get the precautionary principle applied everywhere?

That question should not be seen as a rhetorical question. It should be seen as a real question that demands some creative answers: “Okay, how do we do this clearly necessary thing?”

Full Cost Accounting is another of these patterns that is very relevant here. Let’s not just look at the upsides of our developing technologies. We have this brilliant ability to make tiny robots that can go around inside us and kill cancer cells. But if we can do that, someone should also be able to create tiny robots that go around inside us and kill brain cells or heart cells. In the wrong hands, this technology would confer very troubling capacities. So do we want to go there? Let’s look at the full cost accounting of any new technology. If we thought in terms of full cost accounting, I suspect we would more often apply the precautionary principle.

An existing protocol very much along these lines – albeit a mild one, given the way it is usually applied – is the environmental impact statement (EIS). Essentially, an EIS asks: “If you are going to do this new development project or create this new technology, what is its environmental impact going to be?” Unfortunately, the system has become corrupted in practice. But the idea behind it is much in line with this particular pattern.