If you’ve been paying any attention to tech news lately, you might have noticed that Apple’s getting some flak for withdrawing its products from the EPEAT rating system. For those of you who live under a rock, EPEAT’s one of those *things* like Energy Star that companies can stick on their products like a badge of environmental honor. Apple decided it was too good for EPEAT and tore off that badge this week.
The widely suspected motive behind this move was that the design direction Apple was taking ran contrary to the virtues embodied in EPEAT ratings, specifically that a 10-year-old with tools should be able to completely disassemble any product. Disassembly is an important precursor to recyclability, which is generally considered a good thing. But disassembly also means using removable fasteners, like screws, nuts, and bolts, which take up a finite amount of space in any design. For a company like Apple, this is unacceptable. In the name of thinness and shininess, everything that can go, must. The Retina MacBook Pro, currently the crown jewel of every Apple Store in the country, is also the least disassemble-able laptop in the history of mankind.
Regardless of whether or not sacrificing serviceability is the right thing to do, it’s something we as a society will have to get used to. If we are ever to reach a state where embedded systems are seamlessly integrated into the world as we know it, we will have to accept non-traditional form factors that are specially produced in a factory and beyond the comprehension of the average person. Every single piece of technology you’ve ever seen in Star Trek was serviced with some fancy laser-emitting gadget that magically fixed it. Not just because inventing the mechanisms by which these devices worked would have been more tedious than inventing the entire Klingon language, but because future technology was too small to be manipulated by the oafish flesh sticks of humankind.
We still live in a world where our computers are user-serviceable. Processors, graphics cards, memory. They are all discrete units that can be swapped out. But even now, the trend in technology is to integrate multiple components into a single package, like the CPU and GPU. This is especially true in cell phones and tablets. RAM might be next, tied directly into the CPU cache for faster throughput. If this trend spills over to the PC segment, reducing a computer to a single chip, then EPEAT’s hosed. Useless. Hundreds of dollars of semiconductors per device per person will be even harder to recycle than they are now. And if you want future technology to be Minority Report-sexy, then you have to embrace that future, where computer systems are so seamlessly integrated with the physical world that a screwdriver no longer suffices for accessing all of their inner workings.
Another example: cars. We have the ability to control satnav, climate control, and infotainment from a computer the size of a lobster (yes, food is on my mind). Everything you see in front of you in a new car is a facade, an interface that a big clunky human can use. But if something breaks, unless it’s a knob or a button, you may no longer be able to dive in and troubleshoot your problems by following a wire. The digital world is shrinking. And humans simply cannot come along for the ride.
In short, if you want a future where you’re holding a paper-thin, flexible tablet computer inside a smart, computer-controlled house with intelligent sensors everywhere, then you’re going to have to accept Apple’s choice as a necessary evil. If, on the other hand, you want to, you know, upgrade your gear and treat your gadgets like companions and not disposable silicon…? Then you won’t be able to have shiny things. You cannot make an iPad out of Legos.