It begins with an announcement. One small crumb of information. And it sets the tech world atwitter (literally, Twitter). Speculation follows for months, and finally, one day, winding queues form outside all of Apple’s glass monoliths. Some new thing – a computer, tablet, mp3 player, a technological Swiss army knife – is released, and the crowd gets its fix.
Then, the previous version of whatever this new thing is, the version that garnered the same anticipation a year ago, the version that still works perfectly, looks just as modern, and was more than adequate before this new one, is thrown out by the most diehard consumers. And, in time, more follow suit. Yes, recycling programs exist, but that doesn’t change the fact that this thing is done away with. Apple seems to top itself year after year, rendering anything that came before it irrelevant – and that obsolescence was all carefully planned. But the consequences of this high-frequency mass consumption are wreaking havoc on the planet, personal ethics in general, and of course, your wallet.
A HISTORY OF PLANNED OBSOLESCENCE
The notion of producing disposable goods is a relatively novel one, and it began in the 1800s with detachable men’s shirt collars. Hannah Montague of Troy, New York, didn’t want to wash her husband Orlando’s entire shirts when all she needed to wash was the collar. So she cut it off, cleaned it, and sewed it back on. A lightbulb lit in Orlando’s head, and soon he opened a factory that made detachable collars, cuffs, and dickeys. By the late 1800s, business was booming.
Fast forward to 1924, when a group of light bulb manufacturers, including Philips and General Electric, came together to form the Phoebus cartel, meant to control pricing (by hiking it up), eliminate competition on an international scale, and manufacture bulbs that wouldn’t exceed a life expectancy of 1,000 hours. A company that did make a longer-lasting bulb would be subject to fines, meaning consumers weren’t going to find any more sustainable options while the cartel existed. (There were always, of course, candles.) The cartel and its agreement were supposed to last until 1955, but ended in 1939 at the beginning of World War II because, well, it was World War II.
Such shenanigans had received a name seven years prior, in 1932, when a New York real estate broker named Bernard London coined the term in a brochure entitled “Ending the Depression Through Planned Obsolescence.” His proposal was innocuous, benevolent even. London just wanted to create jobs. After all, if products have an expiration date, then at some point people will be hired to create more. Furthermore, jobs could be created to manage the destruction of the worn-out goods. Or, in London’s own words:
New products would constantly be pouring forth from the factories and marketplaces, to take the place of the obsolete, and the wheels of industry would be kept going and employment regularized and assured for the masses.
Fair enough. But in 1954, the idea was taken a step further by the industrial designer Brooks Stevens (you may know him as the man behind the Oscar Mayer Wienermobile). He defined planned obsolescence as “instilling in the buyer the desire to own something a little newer, a little better, a little sooner than is necessary.” And that’s exactly how it has played out since.
In April 2010, Apple started selling the iPad. In March 2011, it launched the iPad 2. The new version was thinner, lighter, and faster. But in the eleven months between the two releases, was anyone really complaining that the first iPad was too clunky? Too slow? The idea of planned obsolescence, as Stevens defined it, was so successful that it trained a nation of consumers to favor desire over practicality. The attitude isn’t, “I need to buy a new one.” It’s, “I get to buy a new one.”
Maybe the term itself is due for an update. Is the first generation iPad really obsolete? Of course not. But then what was the point of releasing a new version less than a year later? The new one makes the old one look just a little sadder – precious, even, in its attempt to look equally as modern as the new one. Maybe it’s more about perceived obsolescence, especially when it comes to cosmetic updates.
Except, even changes to hardware that look purely cosmetic can be insidious. Apple’s iPhone 4, released in June 2010, came with different, more obscure screws than its predecessors. Why is this important? Because the corresponding screwdrivers are obscure too. Meaning Apple controls how and by whom the phones are opened. Now, this doesn’t really seem like a problem, except when the battery in your iPhone dies and it’s time for a replacement. You can’t take your phone to a licensed repair shop because they can’t open the phone (and they aren’t authorized by Apple to replace the battery anyway). So you can either have Apple replace the battery for a fee, except they’ll have to erase the memory on your phone thanks to their integrated battery, or you can just buy a new iPhone. Oh, and look, the new model just came out! And so it goes.
There is an argument that planned obsolescence is on the decline, because old hardware can support new software. For instance, this article was written on a five-year-old MacBook updated with OS X Leopard (I know, I know). And so some products can live on, just not forever (try installing Leopard on one of those candy-colored iMacs from the ’90s). But this argument doesn’t account for rapidly updating tech support, or for the hardware upgrades necessary to run new software. And even if planned obsolescence is on the decline, would the lines of people outside of Apple stores let that happen?
Look at your computer. Consider the hardware. Does it look particularly disposable? Is it designed to have a short lifespan? It appears to be relatively hard wearing. And yet, we replace our computers frequently. According to one infographic, on average, consumers get a new computer every two years. Maybe you keep yours a little longer, but still, every few years, you have to trade up. You feel compelled to. Computers, desktop or otherwise, which harness some of the most sophisticated technology we have, have become disposable items. How is that possible?
Statistics from the Environmental Protection Agency report that in 2009, America produced 2.37 million tons of electronic waste, and only 25 percent of that was recycled. That means most of it ended up in landfills, unleashing unspeakable toxins into the environment. (Not to mention the now-wasted rare earths ignominiously extracted to originally compose it.) This stuff isn’t just computers – it’s everything electronic that’s thrown away, which also shines a light on other culprits of planned obsolescence.
Printer ink cartridges are especially problematic. Some are designed to reject refills, meaning you have no choice but to buy a new one. And if only one of the levels in a tri-color cartridge is low, the leftover ink in the other two colors doesn’t stand a chance. The cartridges are also quite small, meaning it won’t be long before the ink runs out. And finally, new cartridges can cost nearly as much as, or more than, the printer itself. All this translates into cold, hard cash for whatever company makes the cartridges, which is exactly why it’s able to give you a printer for free with your computer. But it also means trash, and lots of it. There are some great recycling programs for cartridges, but they don’t mean squat if people don’t use them.
There are too many other offenders to list here, but cars (with yearly remodels and discontinued parts), razors (most of them wholly disposable, instead of the lasts-forever straight razor of yesteryear), and clothes (not all of them end up at Goodwill) all bear the mark of an industry predicated on planned obsolescence. If it’s old, it’s out. Destined to be, well, obsolete. But the question remains: will our tolerance for all this ever meet the same fate?
Leading image AP Photo/Paul Sakuma.