Today in Tedium: CES is here, and with it lots of technology you don’t need. But sometimes, something stands out to you so much that it gives you a little bit of hope for the future of the tech world. And for me, that was the Intel Ghost Canyon NUC. If you’re not familiar with the NUC, or Next Unit of Computing, it’s a kind of computer that’s intentionally sold as a barebones kit, and in practice is a tinier but more upgradeable version of the Mac Mini. But the version being shown off this week uses a new form factor that replaces the small box with a board designed to go into a PCIe slot. This means you can upgrade it easily and use it in new kinds of ways—turning the process of building a computer, often a mess of cable management and thermal paste, into something closer to what a mere mortal can do. It’s cool. And when my Xeon finally kicks it, maybe I’ll get one, as long as Intel doesn’t screw it up. But it has me thinking about the nature of modularity, why it’s so powerful as a concept … and how business goals often compete with modular, component-based systems. Today’s Tedium takes modularity apart, and puts it back together again. — Ernie @ Tedium
(New sponsor below—and it’s for a podcast that Tedium fans will really dig. Check it out!)
Why did people fear coffee, novels, and teddy bears? In each episode of the podcast Pessimists Archive, we look at the moment that something new came along—something that’s commonplace now!—and try to understand why it freaked everyone out. Our goal: By understanding why people fear change, we can become better at embracing it.
Today’s Tedium is sponsored by Pessimists Archive. (See yourself here?)
“There is ample evidence that modular design in general and grids in particular are influencing the process of design in ways that can no longer be ignored.”
— Allen Hurlburt, a designer and author best known for his work on Look magazine, discussing the importance of grid-based design in layout in his landmark book The Grid: A Modular System for the Design and Production of Newspapers, Magazines, and Books. Grid-based design, which has actually become more common in the internet era, is often seen as a way to build structure around design, particularly involving columns of text.
The great benefit of modularity is consistency and greater consumer access
Ask anyone who shops for real estate in the modern era, and something you might hear is that “they don’t make homes like they used to.” The idea being that fewer corners were cut in the 1920s and 1930s and homes tended to have better building materials and a more distinctive design aesthetic.
More recent homes, particularly manufactured or mobile homes, which are generally modular in nature, often face a harsher reputation than those old-school houses that date back to this earlier era.
But what if I told you that many of the homes that people built in the early 20th century were actually prefabricated in the same ways that modular homes are? It’s true, and it’s all thanks to Sears, Roebuck, and Company, which made a killing selling homes by catalog throughout the earliest decades of the 20th century.
The shipments worked like this: A person would buy a home (at the phenomenon’s peak in the 1920s and 1930s, the catalog had more than 450 choices), and the parts would be delivered by train to the family, and that was it. Like the most complex IKEA manual ever, families had to build their own houses, and build they did, presumably adding in differences along the way but overall sticking to the script. As Curbed notes, Sears bungalows sold by the tens of thousands around the country and influenced home design trends. (There was also a civil rights element to the story: Sears, which didn’t discriminate against its paying customers, helped many black Americans get access to goods they might not have otherwise had.)
These homes sell for a premium these days, a reputation you could never imagine a modern modular home earning.
So when did prefabricated become modular? After World War II, the general idea gained in popularity. In fact, companies predicted that there would be a need after the war for cookie-cutter homes and buildings, and planned for it. In a 1944 article, J. Ernest Fender of the Structural Clay Products Institute spoke of the “new modular sizes of brick and tile for early post-war use” that he said would help the country develop lots of new buildings quickly.
“Perfected as a means of lowering building costs and improving the quality of construction, modular design is particularly well suited for large public structures, such as schools, hospitals, and other public buildings, where the cutting and fitting of materials on the job adds greatly to construction costs,” Fender told the Harrisburg, Pennsylvania, Evening News that year.
This sort of modularity quickly gained popularity with consumers after the war. If cogs can be built, reused, and plugged in, why not homes? According to a 1958 Popular Science article, roughly 10 percent of new homes were manufactured and delivered from the factory.
Even back then, the industry was fighting a reputation for cookie-cutterishness.
“Prefabs today are not look-alike cheese-boxes,” writer John L. Springer explained, noting the thousands of choices available. “They have individuality.”
But even in a world where manufactured homes imply a mobile vessel on a cinder block, they still have their place. They create access where there once was none, and highlight how manufacturing creates new kinds of efficiencies that ultimately save the consumer money.
Applied beyond homes, modularity improves consumer access and cuts down on costs, making products easier to both own and maintain. Generally, that’s good for the consumer—but not always for the company making the modular component.
19 billion. The number of bricks that the toy company Lego makes each year, according to National Geographic, at a rate of more than 35,000 per minute. The company’s simple system design is versatile thanks to its modularity—just six Lego bricks can be combined in hundreds of millions of ways.
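For the curious, the per-minute figure above is easy to sanity-check against a yearly total. A back-of-envelope sketch (assuming round-the-clock production, which is an assumption about Lego’s schedule, not a reported fact):

```python
# Rough check: 35,000 bricks per minute, running all year,
# lands in the tens of billions of bricks annually.
bricks_per_minute = 35_000
minutes_per_year = 60 * 24 * 365  # 525,600 minutes

bricks_per_year = bricks_per_minute * minutes_per_year
print(f"{bricks_per_year:,}")  # 18,396,000,000 — roughly 18.4 billion
```

Close enough to the reported annual figure that the two stats hang together.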
The great downside of modularity is the ease of copying
In 1978, Lego—a company whose most iconic product is a set of building blocks with a modular design—lost something really important.
Its primary patent for the block system it invented in the 1950s expired, meaning competitors could step in and damage its market share.
Soon, competing companies decided to move in on its territory, claiming compatibility with a toy brand that had been active for decades.
Initially, its main competitor was Tyco, a toy company known initially for die-cast locomotives and slot cars, which released a Lego-compatible set called Super Blocks starting in the mid-1980s. Lego eventually sued, and while Tyco was allowed to continue selling the bricks, it had to make its lack of affiliation with Lego clearer.
“If Tyco is forced to market its blocks honestly and truthfully, we believe we’ll beat the pants off them,” Lego lawyer Allan Zellnick told The Washington Post in 1987.
Tyco won the legal battle, but lost the war for the hearts and minds of toy fans.
More recently, Lego has found itself locked in battles with the Canadian company Mega Bloks, which has avoided going the pure discount route that Tyco did, instead building its brick sets around creative themes, something Lego itself has done for decades.
Lego, in an attempt to find another route to protect its intellectual property, has tried to sue based on trademark law, claiming that its brick design is part of its overall brand—a trick that worked at first in the European Union, but ultimately failed under scrutiny.
These days, perhaps the primary thing keeping Lego alive and successful is its reputation and willingness to try different things to help maintain the brand and overall company. Not only can its product be cloned, it can easily be resold in parts, leading to companies that resell Lego components. It’s not like an old computer chip, which eventually grows obsolete—a brick sold in the 1950s will literally snap into a set sold today, nothing stopping it.
And that’s a boon for resellers. It may have Lego’s logo on the top, but Lego isn’t the one making money on the sale.
For Lego, modularity helped it build a long-term success story—but the very thing that won it success in the marketplace eventually became a threat.
And this has played out many times with different types of modular products. It’s hard to be king when your crown can be taken by a competitor.
How modern consumer technology breaks the modularity equation
Why is it that, when Apple released a modular desktop computer for the first time in six years, it started in the high four figures and went up from there?
And why is it that when you buy a desktop computer or even a laptop from Apple, that machine is basically locked down these days, almost completely non-serviceable?
It might be because Apple knows that, even considering issues such as sustainability and recycling, modularity is poisonous to its consumer business.
In his book The Innovator’s Dilemma, Harvard Business School professor Clayton Christensen (who sadly passed away just a couple weeks after I initially wrote this) ponders the way that modularity often breaks down large companies. Using the example of Digital Equipment Corporation (DEC), Christensen explained how the company’s monolithic design approach ran into the bulldozer of modularity:
The processes for designing and manufacturing minicomputers involved designing many of the key components of the computer internally and then integrating the components into proprietary configurations. The design process itself consumed two to three years for a new product model. DEC’s manufacturing processes entailed making most components and assembling them in a batch mode. It sold direct to corporate engineering organizations. These processes worked extremely well in the mini-computer business.
The personal computer business, in contrast, required processes through which the most cost-effective components were outsourced from the best suppliers around the globe. New computer designs, comprised of modular components, had to be completed in six- to twelve-month cycles. The computers were manufactured in high-volume assembly lines, and sold through retailers to consumers and businesses. None of these processes required to compete successfully in the personal computer business existed within DEC. In other words, although the people working at DEC, as individuals, had the abilities to design, build, and sell personal computers profitably, they were working in an organization that was incapable of doing this because its processes had been designed and had evolved to do other tasks well. The very processes that made the company capable of succeeding in one business rendered it incapable of succeeding in another.
This is an astute point. As computers became more common and transistors became smaller, it was easier to put different kinds of parts together to create newer, more efficient types of computers, with modularity proving the differentiator.
Great news if you’re in the market for a computer. Not so great news if your business is built around selling computers. It means a flood of competition that never quite goes away, where folks can beat you on price and performance. Even Apple was initially a victim of this situation with the fairly modular Apple II.
“Given the shortcomings of its design structure, the fact that the Apple II had been assembled quickly out of cheap, off-the-shelf components made it prey to competition,” notes the 2000 book Design Rules: The Power of Modularity.
Apple eventually figured out that developing machines that were proprietary, with limited upgradability but unique features, was the best strategy.
And when the company floundered in the ’90s, part of the reason was that it allowed competition to seep in. Eventually, it moved back to modularity for a time, then started to close up shop again once it found success in consumer goods.
Christensen seemed confounded by Apple’s success in more recent years, surprised that it was able to flourish selling things that seemed like commodities for premium prices. “Apple may think the proprietary iPod is their competitive advantage, but it’s temporary,” he said in 2006, as a point of example.
What Christensen missed—and what Ben Thompson at Stratechery was quick to point out—is that Apple turned a modular commodity into a premium product with its own unique attributes. People don’t go out of their way to buy the cheapest watch. If they have the budget, they’ll go for the fanciest one. We buy things not for modularity but for status, especially if there are advantages to design and functionality you can’t get from something modular.
Sure, make a cheaper option, but kneecap it. Or let your competitors make the knockoff—they weren’t your customers anyway.
The problem is, this model works well for consumers, but not for businesses. Companies don’t buy a high-end workstation because they want it to look pretty—they want it to perform. And Apple forgot this in 2013, the last time it tried to build a workstation.
So the new hotness for corporations is modular, but modular with a price—so as not to attract the market that would buy modular to save money. The Mac Pro includes components that are technically modular, but that you can only use in machines sold by Apple.
By avoiding modularity—and creating a fortress of luxury around its brand—Apple has managed to find success where long-gone predecessors have failed. The secret is remembering who the customer is.
So, back to the Intel NUC. The company’s strategy for this new device (the brain, with the included CPU and many of the machine’s primary parts, is called a NUC Element) is pretty interesting and will likely prove a huge benefit to consumers as the price of the technology goes down.
It fixes a lot of problems with the existing modularity structure used in modern computers, which often forces users to make tough decisions about how to organize their storage, memory, cabling, and airflow—decisions that make building a computer something akin to a light engineering project rather than playing with Legos.
Computers have long been built with the assumption that you need lots of spaces for CD-ROMs and hard drives. As desktop computers have dropped such bulky components in favor of tiny NVMe solid-state drives and components placed directly on the motherboard, there’s been a movement to shrink these builds to smaller sizes. But the builds get increasingly complex, because the components are still organized around a structure that is no longer necessary.
Intel’s rethink of the NUC this week, depending on how it handles heat and other performance issues, could solve these problems with modularity in the long run. Even if Intel doesn’t get it right at the start (there are quite a few critics online), it launches an important conversation that will eventually benefit the consumer greatly. Because ultimately, modular technology saves consumers money, especially down the road.
It feels like a better future for modularity than the Mac Pro, if you ask me, even if it’s not quite as profitable.
Find this one an interesting read? Share it with a pal! And thanks to Pessimists Archive for sponsoring. Be sure to check out the podcast! Cheers.
Editor’s note (01/24/2020): Updated to note the passing of Clayton Christensen.