Bad Metaphors, Bad Tech

January 28, 2013


Human flight began with laundry. In 1777, Joseph-Michel Montgolfier was watching clothes drying over a fire when he noticed a shirt swept up in a billow of air; six years later, he and his brother demonstrated their hot-air balloon, the first manned flying machine.

In Paris, a grand monument was planned to honor the balloon flight, but the project stalled in its earliest stages; all that’s left of it is a five-foot clay model in New York’s Metropolitan Museum of Art. At first glance, it’s hard to see that the sculpted clay depicts a balloon at all — the flying machine is encrusted with layers of tiny, winged, chubby-cheeked cherubs, stoking the fire, wafting up in gusts of hot air, hanging on for the ride. Two grown-up angels ride on the scorching clouds toward the top, one blasting a herald’s trumpet, the other (cheeks puffed up like Louis Armstrong) blowing to make the whole contraption move, straining himself as if it would take all of his lung-power to overcome the combined weight of several dozen dangling babies.

And here is my best guess at an explanation: when that clay was modeled, a flying, fire-powered, human-carrying sack of fabric was something entirely new, the highest of high-tech. But flying cherubs? They were old — little putti had been soaring across the ceilings of Europe’s churches and palaces for centuries. It’s only in terms of what’s old that the newest technologies make initial sense. Without the help of the old, they’re incomprehensible, which is as good as invisible.

It’s not surprising that technology comes into the world wrapped in metaphor. With the help of a metaphor — a flight powered by angels rather than expanding air — something as alien as a flying machine was domesticated into a visual culture where it seemed to make solid good sense. That’s how we’ve always communicated progress to one another, even when the results risk looking ludicrous with a few centuries’ hindsight.

More than smoothing over progress after the fact, metaphors themselves often drive progress. The insight that turned a balloon into a piece of Baroque art was the same kind of jump that turned a billowing shirt into a flying machine. But if smart figurative thinking can spark and explain new technologies, defective metaphors can do just the opposite. When the words and images we use to familiarize the new become too familiar — when metaphors start to die, or when we forget that they’re only tools — they can become some of the most powerful forces against innovation. It’s not always technical walls that stop change in its tracks. Sometimes, innovation is limited by language itself.

When was the last time, for instance, that you used the word “desktop” to refer to the actual surface of a desk? Our desktops are imaginary now — but in the days of the earliest graphical user interfaces, comparing a computer to a piece of office furniture was odd enough that tech companies had to spell it out for us. “First of all,” read one of the earliest Macintosh print ads, “we made the screen layout resemble a desktop, displaying pictures of objects you’ll have no trouble recognizing. File folders. Clipboards. Even a trash can.”

In 1984, when the image was still fresh, your computer interface resembled a desktop; now, it just is one. In 1984, the desktop justified the ways of Jobs to man; but soon enough (to mix metaphors just a little) it became a tyrant in its own right. An intuitive image for a screen resting on a desk made little sense for a screen resting in your hand. The mobile desktop didn’t fail for lack of trying: as Mike Kuniavsky explains in Smart Things, his book on computing design, one of the clunkiest early mobile operating systems failed because it took the desktop so literally.

Start up the Magic Cap OS, which debuted on Sony and Motorola tablets in 1994, and you were faced with an actual desk, complete with images of a touchtone phone and Rolodex. To access other apps, you clicked out of the “office,” walked down the “hallway,” and poked into any number of “rooms.” The internet browser was a further trek: out of the office building, down the main street to the town square, and into a diner, where the web was finally accessible by clicking on a poster.

Jump ahead a decade to the iPhone and iPad. To argue that they are so intuitive “because of touchscreens” is to ignore the first step that made their simplicity possible: abandoning a worn-out metaphor. We’d grown so used to desktops, folders, and all the rest that they’d ceased to remind us of objects outside our computers. And once Apple recognized that the metaphor was dying a natural death, it was clear that the desktop could be discreetly buried.

(A bit more tentatively — because there are still quite a few old-school PDA fans — I’d suggest that the awkward handwriting-recognition systems of devices like the Newton and PalmPilot were themselves products of faulty metaphors. A PDA may resemble a pen-and-paper notepad, but it’s hardly meant to work like one.)

The awareness that metaphors can inhibit innovation as much as they advance it leads any number of technological misfires to make an odd, new kind of sense. Early cars weren’t simply called “horseless carriages,” they were literally designed to resemble carriages with the horse removed; the Model T, in turn, was one of the first cars to successfully eliminate the carriage metaphor. If driverless cars are ever feasible, we might expect the pattern to repeat itself: early entries modeling themselves on familiar sedans and minivans long after their function is gone, and successful competitors breaking through the metaphor entirely, into shapes we haven’t yet imagined.

Why, to take another example, were we so attached to manned spaceflight that we spent decades and billions on space shuttle busywork? One reason: from Captain Kirk to the word astronaut (literally “star sailor”), we’ve been taught to view space exploration through the metaphor of seafaring adventure. Yet the Curiosity rover team, without resembling swashbuckling sailors in the least, has sent back more knowledge of our solar system than any astronaut to date.

Science and math may increasingly be the curriculum’s glory subjects — when’s the last time you heard a politician demanding that schools churn out more classics majors? — but innovation has always demanded just as much verbal creativity, a feeling for the possibilities and limits of words themselves. Innovators need an eye for what George Orwell called “dying metaphors”: not those newly vivid ones (like “desktop” in 1984), nor the dead ones that have stopped reminding us of images at all (like the “hands” of a clock), but the images that have outlived their usefulness.

And we need an eye, too, for all the silent biases that creep into tech-talk unawares. As Kuniavsky observes, the metaphor of “cloud” computing suggests an amorphous vapor that “extends beyond our reach and does not have a defined shape or boundary. Events that happen in the cloud may be outside the control of any one person in it.” Does the image of data stored in a cloud lead us to settle for less privacy?

Consider the desktop one more time: surely there are powerful economic reasons for the “digital divide,” but hasn’t the desktop metaphor contributed in its own way? From the moment it comes out of its box, your computer presumes that you’re the kind of person who spends most of your time at an office desk.

We’re free to write language, images, and anything else with the mushy look of the humanities out of the history of progress. We’re even free, like the state of Florida, to consider charging more for a college education in the comparatively “useless” fields of English and history. But the result might be a generation of would-be innovators even more likely to be unaware of, and trapped in, the dominant metaphors of the day — like the sculptor too busy modeling little angels to give much attention to the miraculous flying machine underneath.

Incidentally, Paris finally did get a balloon monument, though it took more than a century; it celebrates the aerial messengers of the Franco-Prussian War. Convincingly weightless even cast in bronze, the hot-air balloon sails up out of a circle of human figures. There’s not a cherub in sight.

Image via Metropolitan Museum of Art

The author is a Ph.D. student at Columbia University and a former congressional speechwriter, and the co-author of Rome's Last Citizen: The Life and Legacy of Cato, Mortal Enemy of Caesar.