Today, it seems that if there is one thing companies should focus on above all others, it is innovation. Often, this is a question of the company’s survival. The marketplace is more cut-throat than ever: it is characterised by global competition, the dynamism of start-ups, and empowered consumers who are ever more difficult to impress. Innovation is necessary simply to keep the company’s head above water. And, more dramatically perhaps, it is also a question of humanity’s survival. Humanity faces many threats, but with politics and behaviour change having only mixed success in addressing them, the onus has fallen on companies to innovate us out of trouble.
Given the importance of innovation to both companies and society, we might expect to hold a clear understanding of how to do it. However, the innovation literature, despite being sizeable, falls short.
A typical innovation book offers two things: case studies of successful innovation and guidelines on how to create the organisational culture and structure needed for innovation. Case studies might allow the author to derive some best practice guidelines on innovation, but they rarely offer a theory or even a structure on how innovation should take place in the future. Organisational guidelines might indeed allow a culture of innovation to be fostered, but this hardly guarantees that the ideas a company produces are innovative.
Consider the most influential book on innovation, Clayton Christensen’s The innovator’s dilemma: When new technologies cause great companies to fail. Christensen’s argument is that “great companies” focus on pleasing shareholders and existing customers. This leads them to innovate incrementally, rejecting riskier disruptive innovations that, when they first appear, lack demand. Such disruptive innovations are instead embraced by smaller companies, which make them viable and take over the market before the “great company” can adapt.
The innovator’s dilemma is compelling, but ultimately offers little guidance on how to create disruptive innovations. Instead, it focuses on the barriers that stopped great companies from innovating. It is, then, less a theory of disruptive innovation than a theory of why disruptive innovation does not happen. Furthermore, the fact that Christensen bases his argument on only two key case studies hardly provides the foundation for a more general theory of innovation.
Similar issues beset another core innovation text, W. Chan Kim and Renée Mauborgne’s Blue Ocean Strategy. The authors position their work as a counterpoint to the strategy orthodoxy that companies should focus on differentiating themselves within existing markets. They call these existing markets “red oceans”, reflecting the bloody nature of the competition within them; red oceans offer limited prospects for growth. In contrast, “blue oceans” are new markets that offer opportunities for high growth and quick returns. Success, then, means identifying these new spaces and innovating into them.
This all sounds great in principle, but the question remains: how does one chart and sail into these blue oceans? In other words, how should we create innovations that create new markets? On this question, the authors offer little.
Beware the fetish of creativity
Given this void, it is no surprise that we turn to magic dust. This particular substance is so potent that it promises to save the world and let us rediscover ourselves. It is creativity.
It seems sensible to ‘pursue’ creativity. After all, creating truly path-breaking innovations requires a certain imagination and vision, a capacity to look beyond the status quo.
The trouble is that we have made a fetish of creativity, and in doing so have lost sense of its meaning and use. Specifically, it has become an end rather than a means. Frequently, it seems as if what matters most is being creative rather than using creativity to transform. Thus, organisations focus on fostering the conditions for creativity, which usually means hiring young people, creating homely working conditions, and embracing design. Here, creativity becomes less a practice than a series of signs. Signifying that you are creative often stands in for actually being creative.
The perils of this are reflected in the state of the world’s cities. In his 2002 book, The Rise of the Creative Class, Richard Florida described how a new creative urban elite was transforming cities. Ironically, his analysis helped bring about the change it described: cities began investing in creativity, promoting the growth of creative industries and encouraging the right sort of people to move in. But the creative city has turned out to be not just a false promise but a destructive one. It has created new social problems, as reflected in the title of Florida’s most recent book, The new urban crisis: How our cities are increasing inequality, deepening segregation, and failing the middle class. The cities that took Florida’s advice worshipped creativity for its own sake, without thinking sensibly about the specific ways it should be applied.
Furthermore, creativity is frequently contrasted with ‘thinking’. In other words, to be creative is to suspend all prior knowledge and to get in touch with our inner child. As Stanford professor Robert Sutton puts it, in the creative process, “ignorance is bliss.”
Such an idea emerges from our binary view of the brain, in which one is either right- or left-brained, with the creative right side most valued today. This belief has some unfortunate implications for innovation. For example, it leads to the creative brainstorm being positioned as the ultimate way to innovate. Here, a bunch of creative people are placed together in a room and asked to churn out ideas at random with little or no guiding structure and few criteria to evaluate them. Such brainstorms are more frequently characterised by the excitement they generate than the quality of their end product.
None of the figures venerated in innovation texts, Darwin and Jobs among them, created breakthroughs by splurging random ideas. Their breakthroughs came from the ingenious application of bodies of knowledge built up over a lifetime. True creativity involves thinking. This is an insight supported by neuroscience. As researchers at the University of Florida found, “not all people who have high levels of specialized knowledge or talent are creative, but we also know that to be creative one often needs to have specialized knowledge.”
Apparently aware of the oversights of his own work and that of other innovation scholars, Christensen has, in subsequent contributions, tried to spell out how innovation should happen. In Seeing what’s next: Using the theories of innovation to predict industry change, for example, he offered a series of models for understanding how industries are evolving. Yet this trend-spotting fell short of a model of innovation.
Christensen finally provides this with his jobs-to-be-done theory. It proposes that innovation should be based on the study of the particular tensions consumers face in everyday life. These tensions are resolved through products: consumers “hire” and “fire” products based on their efficacy in resolving the tension. Innovation should therefore focus on identifying new jobs-to-be-done, or existing ones that are inadequately addressed by current market offers.
Whilst Christensen positions his theory as revolutionary, it is in fact a repackaging of old ideas from various disciplines. For example, it borrows a methodology from anthropology, design research and psychology, and a model of the consumer from neoclassical economics and economic sociology.
Nevertheless, jobs-to-be-done has a compelling quality. It is simple, easy to understand and relatively easy to do. It respects the consumer, in fact elevating her from consumer to entrepreneur via the language of ‘hiring’ and ‘firing’. It moves us past obsessing over product features to products’ actual contexts of use. And it lays out a clear pathway for innovation: once the jobs have been identified, there is an apparently strong foundation for product development.
However, the theory has two substantial weaknesses. First, it is reductionist in how it views the consumer. Essentially, it rests on a view of the consumer as a rational chooser. Christensen does not deny that consumption decisions may have emotional and social dimensions, but it is clear that, for him, they are ultimately cognitive. The logic is that a consumer is presented with a challenge in her everyday life and then makes a rational choice of the product that best answers it. This is reflected in the language of ‘hiring’ and ‘firing’, which implies deliberate and decisive choice.
Consumption is a good deal more complicated than this. Many of the choices we make are not the product of rational deliberation at all. As cognitive neuroscientists are fond of telling us, 95% of mental activity is non-conscious. And as psychoanalysts have long argued, many consumption choices are based on unconscious self-ideals. Thus, jobs-to-be-done elides the actual complexity of consumption.
Christensen’s concern with rational individuals means that he lacks any conceptualisation of cultural value. Consumption is not just an individual matter, concerned with the dynamics of individual challenges and choices; it is also a cultural matter. It exists within a particular cultural context. The cultural values that shape consumption in the United States are very different from those that shape consumption in, say, China.
Jobs-to-be-done does not explicitly reject the influence that cultural value may have on consumption, but its focus on the tensions in individual lives means it has no way of understanding it.
The second problem with jobs-to-be-done is that it is not able to conceptualise what product needs may arise in the future. Its focus on understanding and resolving tensions in consumers’ lives means that it is present-biased. The tension the innovation attempts to solve might well be relevant in the present, but there is no way of knowing whether it will be relevant in the future. Jobs-to-be-done is not a future-proofed approach.
Again, this is because it lacks a conceptualisation of cultural value. The types of tensions that consumers confront are partly shaped by culture, and culture is constantly evolving. Therefore, in order to create robust and future-proofed innovations, we need the means to understand how value is changing, and what this means for consumer needs, ideals and rituals.
Jobs-to-be-done is a step in the right direction, but ultimately falls short. We need better ways of innovating.
The Cultural Framework of Value
To cure this myopia, we need to move beyond reductionism to a holistic view of value in consumers’ lives. This means building a Cultural Framework of Value. The creation of such a framework is driven by two questions:
- What are the needs, ideals, tensions, interactions and rituals shaping value for consumers in the present?
- How is value changing for consumers and how will value be generated for consumers in the future?
Such an approach moves beyond jobs-to-be-done by offering a much more complete picture of the influences shaping value in consumers’ lives, and by allowing innovations to be future-proofed via exploring how value is changing.
A number of methodologies can help answer these two questions, but two are particularly powerful: ethnography and semiotics.
Ethnography allows for a deep understanding of consumers’ lives and the relationships they have with products. A skilled ethnographer can reveal in depth the processes through which value is constructed, encompassing the full gamut of cognitive, emotional and cultural influences. Unlike jobs-to-be-done, ethnography does not reduce research to individual cognition; rather, it locates consumers within a rich, dynamic lived context. Doing so does not take away consumers’ agency but acknowledges that the influences on consumption go beyond needs, simple tensions and cognitive processes.
Semiotics is the study of meaning. Semioticians decode what products mean to consumers, and how. They identify the different elements of a product that construct meaning (e.g. typography, iconography, colour) and analyse them according to shared cultural meanings. Semiotics provides a rigorous methodology for understanding how products come to have symbolic value for consumers.
Furthermore, semiotics provides frameworks for understanding the evolution of culture. Semioticians identify ‘codes’: clusters of meaning that provide the ‘rules of the game’ within any given area of culture or consumer category. For example, there are many different codes of masculinity in culture, or, rather, many different blueprints for being a man. Semioticians can identify which codes are emergent (new and still small), dominant (big and mainstream), or residual (declining and dated). The emergent codes provide insight into the future of value; they are a foundation for creating meaningful futures.
Semiotics and ethnography work well together. Ethnography explores value in the lived context of consumers, while semiotics explores value at the level of culture and provides powerful tools for understanding how value is evolving. Used together, they yield a comprehensive Cultural Framework of Value: one that gives a proper perspective on sources of value in the present and on possible sources in the future.
More than ever, we need good innovation. At a societal level, innovation is essential to addressing the seemingly ever-increasing problems the world faces. At a business level, in an era of cut-throat competition, companies often need to innovate simply to keep their heads above water. Our models for innovating are, however, inadequate. Too often, we get lost in ideology, erroneously assuming that magic dust is enough to solve complex problems (e.g. the creative brainstorm). Alternatively, we embrace a seductively simple approach like jobs-to-be-done that fails to account for the complexity of consumers’ lives and is not future-proofed. We need better, more rigorous approaches that account for the multifaceted ways that objects become valuable to consumers. We need a Cultural Framework of Value.