Innovation leaves structures intact, developing new processes to monetize the dysfunctional systems we already have

The innovator has always been part savior, part confidence man. “In a multitude of men there are many who, supposing themselves wiser than others, endeavour to innovate, and divers Innovators innovate divers wayes, which is a meer distraction, and civill warre,” Thomas Hobbes wrote in 1651. In 1837, a Catholic priest in Vermont devoted 320 pages to denouncing a Protestant cleric he referred to scornfully throughout as “the Innovator.” At the turn of this century, a Denver processed cheese executive told a reporter: “If you don’t innovate every day and have a great understanding of your customers, then you don’t grow.” Meanwhile, a 21st-century scholar delivering a post-mortem on a shuttered college said that universities’ main task now is not to teach or to research — these are so last century — but to create “space to innovate.” (Innovate what, and for whom? Don’t ask.)

What links the priest, the philosopher of power, and these 21st-century shills of discount cheese and higher education? On the surface, not much. But the strange phrase “innovate every day,” used without an object, retains, like an epiphany, the magic of the visionary. What was once a pejorative for heresy and malicious self-interest — Hobbes used “innovator” as a synonym for “plotter” and “charlatan” — is now our finest modern virtue, a term that unites the solitary, prickly creativity we’ve learned to associate with artists and a basically benevolent understanding of the profit motive. “Innovate” has long been used in business contexts to refer to a discrete embellishment of a product, but conventionally it required an object; you innovated on something else. “Innovation” is most popular today as a stand-alone concept, a kind of managerial spirit that permeates nearly every institutional setting, ... 

... So it is with us: A veritable army of “brand strategists” and consultants stands at the ready to counsel anxious employees and nervous graduates on how to “be more innovative.” If “disruption” offers an apocalyptic view of historical change, as Jill Lepore has argued, innovation is its sunnier, though no less anxious, counterpart. With its inward-looking moralism, innovation also appeals to an evangelical strain in American culture, in which the country is both chosen by Providence and perpetually backsliding. In 1980, an author in Newsweek lamented what he saw as the widespread fear among his compatriots “that it’s all over for us, that the age of innovation has ended.” Or as one of the mightiest idols in the Innovator pantheon, Peter Thiel, said more recently: “innovation in America is somewhere between dire straits and dead.”

Innovation, therefore, is a strangely contradictory concept, simultaneously grandiose and modest, saccharine and pessimistic. The prophetic meaning embedded deep in its etymology allows “innovation” to stand in for nearly any kind of positive transformation, doing for the 21st century what “progress” once did for the 19th and 20th. On the other hand, its fetish for technology signals a retreat from the transformative visions of 19th- to mid-20th-century “progress,” in all their various forms and with all their terrible faults. As the scholar Paul Erickson has pointed out, innovation transforms processes and leaves structures intact. Thus, instead of reinventing housing or transit, “innovators” mostly develop new processes to monetize the dysfunctional housing and transit we already have, via companies like Airbnb and Uber. It’s one thing to celebrate novelty indiscriminately — as if meth labs and credit-default swaps were not innovative — but what if the new isn’t even very new at all? When Alec Ross, Hillary Clinton’s former “Senior Advisor for Innovation,” says that “If Paul Revere were alive today, he wouldn’t have taken a midnight ride from Boston to Lexington, he would have just used #Twitter,” all this tells me is that Twitter is basically no different from a fast horse.