CISC-y RISC-ness

An unusual type of processor from the early 2000s seemed to offer the best of all worlds—and may be the most inventive approach to the CPU ever developed.

Today in Tedium: If you’ve kept a close eye on the technology space of late, you probably know that this is perhaps one of the most interesting times for processors in many years. After a number of stagnant generations, Intel has started competing again; AMD’s Ryzen chips are still pretty solid; ARM is where a lot of the innovation is happening; and RISC-V looks like it’s going to be the coolest thing in the world in about a decade. But none of these chips, honestly, can hold a candle to the interestingness of the chip I’m going to tell you about today. It didn’t set the world ablaze; in fact, it was designed not to. In the end, it was used in relatively minor systems, like internet appliances and palmtops. But technologically, it bridged the gap between two camps—RISC and CISC. And that’s what makes it interesting. Today’s Tedium looks back at the Transmeta Crusoe, perhaps the most interesting processor to ever exist. — Ernie @ Tedium

Today’s GIF is Tom Cruise from Risky Business, except upside down, because I personally found it amusing.

News Without Motives. 1440 is the daily newsletter helping 2M+ Americans stay informed—it’s news without motives, edited to be as unbiased as humanly possible. The team at 1440 scours 100+ sources so you don't have to. Culture, science, sports, politics, business, and everything in between—in a five-minute read each morning, 100% free. Subscribe here.

A Raspberry Pi, a common type of computer that relies on a RISC processor—specifically, an ARM chip. (osde-info/Flickr)

Before we talk about Transmeta, let’s explain the difference between CISC and RISC

If you’re a computer nerd, you’ve likely heard that there’s been a long battle brewing in computing for generations. It’s a battle that starts and ends with the size of the instruction set that a given processor supports.

In the early years of computing, it was common to build a number of basic functions directly into a processor, to help keep the size of programs relatively minimal. After all, when space is at a premium, you want to keep the code fairly dense. That line of thinking made sense during the mainframe and minicomputer eras, and came to be known as the complex instruction set computer (CISC) approach.

But over time, some came to question the value of this approach, suggesting that it might be better to increase processing efficiency by cutting down on instruction complexity. In other words, the chip has fewer types of operations it can execute, but it can execute them faster and more often. Some processors actually worked this way before the concept got a name, but eventually, it did get one—the reduced instruction set computer (RISC) approach.
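To make that trade-off concrete, here's a toy sketch in Python (strictly an illustration; the instruction names and semantics are invented and match no real architecture). The same multiply takes one dense instruction in the CISC style and four simple ones in the RISC style:

```python
# Toy illustration of the CISC/RISC trade-off. Instruction names and
# semantics are invented for demonstration; they match no real ISA.

memory = {0x10: 6, 0x14: 7, 0x18: 0}
registers = {"r1": 0, "r2": 0}

# CISC-style: one dense instruction performs a whole memory-to-memory
# multiply; the hardware has to decode and sequence every step.
def mul_mem(dst, src_a, src_b):
    memory[dst] = memory[src_a] * memory[src_b]

# RISC-style: the same work as simple load/operate/store steps. The
# program is longer, but each instruction is trivial to decode and
# easy for the hardware to pipeline.
def load(reg, addr):
    registers[reg] = memory[addr]

def mul(dst, src):
    registers[dst] *= registers[src]

def store(reg, addr):
    memory[addr] = registers[reg]

mul_mem(0x18, 0x10, 0x14)   # one complex instruction...

load("r1", 0x10)            # ...or four simple ones
load("r2", 0x14)
mul("r1", "r2")
store("r1", 0x18)

print(memory[0x18])         # 42 either way
```

The RISC bet was that the longer program was worth it, because a chip that only has to handle the simple steps can be made smaller, faster, and cheaper.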

RISC was popularized by a computer scientist named David A. Patterson, who formally began developing the concept at the University of California, Berkeley under the auspices of the VLSI Project, an effort funded by the U.S. Defense Department with the goal of boosting U.S.-made microprocessor designs.

The iconic paper that started the RISC revolution. Note the names of the authors.

In the midst of this process, Patterson ended up coauthoring a seminal paper that made the case that RISC was the best choice for processing. The title of this paper: “The Case for the Reduced Instruction Set Computer.”

The paper makes the case, among other things, that the use of high-level programming languages, such as BASIC, had made pure assembly language something of a novelty, and as a result, a number of the complex instruction sets in CISC processors essentially sat unused:

One of the interesting results of rising software costs is the increasing reliance on high-level languages. One consequence is that the compiler writer is replacing the assembly-language programmer in deciding which instructions the machine will execute. Compilers are often unable to utilize complex instructions, nor do they use the insidious tricks in which assembly language programmers delight. Compilers and assembly language programmers also rightfully ignore parts of the instruction set which are not useful under the given time-space tradeoffs. The result is that often only a fairly small part of the architecture is being used.

One estimate the paper cited was that just 10 instructions accounted for 80 percent of the code executed in a study of an IBM 360 compiler, with most of the remaining instructions appearing only rarely. So, by reducing the number of instructions, the thinking went, CPUs could have simpler designs, shorter development times, and lower costs.

And while those rare, specialized instructions could speed up particular programs, the authors argued that the benefit seldom extended to the system as a whole, and certainly not enough to outweigh the gains from simplifying the instruction set. From the paper:

There are undoubtedly many examples where particular ‘unique’ instructions can greatly improve the speed of a program. Rarely have we seen examples where the same benefits apply to the system as a whole. For a wide variety of computing environments we feel that careful pruning of an instruction set leads to a cost-effective implementation.

This paper shaped an industry, and its thinking became the foundation for many types of processors. The ARM architecture is perhaps the most famous RISC-based design, while other well-known members of the RISC family include PowerPC, Sun’s SPARC, the DEC Alpha, and the MIPS architecture.

Notably, the x86 processor line, which was developed long before RISC became common, isn’t on that list. Nor is the Motorola 68000, the other prominent 16-bit processor line of the era.

So, if RISC is a more efficient kind of processor technology, why did CISC ultimately come to dominate the PC market? Chalk it up to timing and market momentum. See, RISC’s benefits were harder to realize at first—it’s worth keeping in mind that when the concept was first developed, system memory on personal computers was measured in kilobytes, and RISC’s simpler instructions mean longer programs, which naturally competed for those limited RAM resources. And by the time RISC chips became commercially viable, the PC was already well-established, which made it hard to transition to something new.

But about two decades after that paper first came out, an innovative technologist who literally cowrote “The Case for RISC” with Patterson somehow found himself arguing for a different approach entirely.

VLIW

An acronym that stands for “very long instruction word,” a type of CPU architecture that packs several simple operations into a single wide instruction that the hardware executes in parallel. This approach, first developed in the 1980s, is similar to RISC, in that each individual operation is simple, but it also borrows from CISC, in that a single instruction word carries multiple operations. Crucially, the compiler, rather than the chip, decides ahead of time which operations can safely run side by side. In other words, it’s kind of a best-of-both-worlds approach.
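To picture it, here's a toy Python sketch of the idea. The two-slot bundle, register names, and functional units are all invented for illustration; real VLIW encodings are far wider and stricter:

```python
# Toy VLIW sketch: each "instruction word" is a fixed bundle of slots,
# one per functional unit. Which operations share a word is decided
# ahead of time (by the compiler, on a real VLIW chip), not by the
# hardware at runtime. All details here are invented for illustration.

regs = {"r0": 2, "r1": 3, "r2": 10, "r3": 4, "r4": 0, "r5": 0}

def add(d, a, b):
    regs[d] = regs[a] + regs[b]

def mul(d, a, b):
    regs[d] = regs[a] * regs[b]

def nop():
    pass

# Each word holds two pre-scheduled operations (adder slot, multiplier
# slot) that are guaranteed not to depend on each other.
program = [
    [(add, "r4", "r0", "r1"), (mul, "r5", "r2", "r3")],  # both slots busy
    [(add, "r4", "r4", "r5"), (nop,)],                   # one slot idle
]

for word in program:
    # In hardware, the slots in a word execute simultaneously; here we
    # just show that one wide word carries several simple operations.
    for op, *args in word:
        op(*args)

print(regs["r4"])  # (2 + 3) + (10 * 4) = 45
```

The hard part in practice is the scheduling: if the compiler can't find independent operations to fill the slots, the words go out half empty.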

A thin client using a Transmeta Crusoe chip. (Epsem Klem/Flickr)

Transmeta: The tech company that dared to build processors differently

In 1995, David Ditzel found himself, after a successful run at Sun Microsystems, trying something risky for a change. Not really RISC-y, mind you, though he knew all about RISC: he had helped his onetime mentor David A. Patterson make the case in that seminal paper, and as Sun’s chief technical officer, had helped develop many generations of the Sun SPARC processor.

But with Transmeta, he was trying to do something a bit different. The startup spent roughly half a decade with its head down, developing a new type of processor. When you mention “stealth mode” in Silicon Valley, Transmeta is the kind of company a lot of people think of.

By the time it emerged, the speculation had become almost comically overblown. “Because Transmeta has been so secretive, some reports have speculated in jest that the company is using alien technology,” CNET writer Brooke Crothers wrote.

It was certainly alien if the way you thought about processors was limited to x86 and RISC.

Gradually, though, details about what the company was working on came out in the form of patent filings, the first of which, filed in 1996 and granted in 2000, hinted at the kind of technology the firm was building: “Combining hardware and software to provide an improved microprocessor.”

The text of said patent filing certainly supported the alien-technology argument. Essentially, Transmeta had come up with a way to emulate x86 and other instruction sets at high speed through software working in tandem with a simpler host processor built on a VLIW design. From the filing:

Rather than using a microprocessor with more complicated hardware to accelerate its operation, the present invention combines an enhanced hardware processing portion (referred to as a “morph host” in this specification) which is much simpler than state of the art microprocessors and an emulating software portion (referred to as “code morphing software” in this specification) in a manner that the two portions function together as a microprocessor with more capabilities than any known competitive microprocessor.

(Side note: When you think of the phrase “code morphing software,” about half a dozen Hollywood sci-fi movies come to mind, don’t they?)

Other patent filings by Transmeta suggested a processor that ran in a kind of Goldilocks mode:

It is, therefore, an object of the present invention to provide a host processor with apparatus for enhancing the operation of a microprocessor which is less expensive than conventional state of the art microprocessors yet is compatible with and capable of running application programs and operating systems designed for other microprocessors at a faster rate than those other microprocessors.

Ditzel, when speaking about his company’s work in a 1998 speech ahead of its release, worked to emphasize that this was actually the floor of what was possible, not the ceiling.

All of this made it kind of weird when the processor finally came out and appeared not in high-end workstations but in handheld devices, internet appliances, and thin clients. Somehow, Transmeta had made a processor capable of emulating an up-to-the-moment Pentium III, yet the firm aimed it at power-sipping machines in a pre-iPad world.

The reason for that is that, well ahead of much of the rest of the industry, Transmeta had figured out that the real battle for technological supremacy was not going to play out in huge machines with massive processors, but in mobile devices that could greatly benefit from simplified instruction sets. The chip the company had developed, the Transmeta Crusoe, worked differently from x86 chips in other ways, too. Synthetic benchmarks couldn’t take its measure, in part because it cached translated instructions so that they would run faster the next time you ran them—so the benchmarks, which generally work by running the same tasks repeatedly, would make it look unnaturally good.
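Here's a minimal Python sketch of that translate-once, cache-and-reuse idea, with invented guest opcodes and host operations; Transmeta's actual code morphing software was vastly more sophisticated, re-optimizing hot traces of real x86 code as it ran:

```python
# Minimal sketch of the translate-once, cache-and-reuse idea behind
# "code morphing." The guest opcodes and host operations are invented;
# Transmeta's real translator optimized actual x86 traces and was far
# more sophisticated than this.

translation_cache = {}  # guest block -> translated host function

def translate(block):
    """Translate a tuple of guest instructions into one host function."""
    host_ops = []
    for op, dst, src in block:
        if op == "MOV":
            host_ops.append(lambda r, d=dst, s=src: r.__setitem__(d, r[s]))
        elif op == "ADD":
            host_ops.append(lambda r, d=dst, s=src: r.__setitem__(d, r[d] + r[s]))
    def run(regs):
        for host_op in host_ops:
            host_op(regs)
    return run

def execute(block, regs):
    if block not in translation_cache:       # slow path: translate once
        translation_cache[block] = translate(block)
    translation_cache[block](regs)           # fast path: reuse the cache

regs = {"eax": 1, "ebx": 5}
block = (("MOV", "eax", "ebx"), ("ADD", "eax", "ebx"))
for _ in range(1000):       # a benchmark-style loop pays the translation
    execute(block, regs)    # cost only on the first iteration
print(regs["eax"])          # 10
```

The cache is the whole trick: the cost of translation is paid once, which is exactly why repetitive synthetic benchmarks flattered the chip.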

In many ways, Ditzel made it clear that the RISC dream that he presented in the early ’80s had gotten complicated, along with the chipsets.

“Today we have large design teams and long design cycles,” he said during his 1998 speech. “The performance story is also much less clear now. The die sizes are no longer small. It just doesn’t seem to make as much sense.”

(One look at the PowerPC G5, a RISC device that ran so hot that Apple couldn’t figure out a way to put it in a laptop, is ultimately enough to support his claim.)

(0xf2/Flickr)

The Transmeta Crusoe represented a back-to-basics approach in some ways: a simple chip that didn’t use much power, re-architected for modern-day needs. One of the key technologies the company developed was called LongRun, which could automatically adjust the processor between performance levels based on computing demand, an approach that’s standard in modern computers but that Transmeta developed first. For all the weird stuff this chip could do, this actually turned out to be the important thing.
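In spirit, that policy looks something like this Python sketch; the frequency steps, thresholds, and decision rule are invented for illustration, and the real LongRun logic, like the frequency governors in modern operating systems, was considerably more refined:

```python
# A crude sketch of the dynamic frequency-scaling idea behind LongRun:
# step the clock up when the workload keeps the CPU busy, down when it
# idles. Frequency steps, thresholds, and policy are all invented here.

FREQ_STEPS_MHZ = [300, 400, 500, 600, 700]

def next_step(step, utilization):
    """Pick the next frequency step from recent utilization (0.0-1.0)."""
    if utilization > 0.8 and step < len(FREQ_STEPS_MHZ) - 1:
        return step + 1    # busy: step up for performance
    if utilization < 0.3 and step > 0:
        return step - 1    # idle: step down to save power
    return step            # in between: hold steady

step = 0
for load in [0.95, 0.90, 0.85, 0.50, 0.10, 0.05]:  # simulated samples
    step = next_step(step, load)
    print(f"load {load:.2f} -> {FREQ_STEPS_MHZ[step]} MHz")
```

Because power consumption rises steeply with clock speed and voltage, even a policy this crude saves real battery life on a bursty mobile workload.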

With this processor, Ditzel—one of the two guys who helped formulate the RISC concept that had taken over the entire industry outside the dominant IBM PC clone—had managed to flip the entire CISC vs. RISC debate on its head. Suddenly, there was this third option, and it worked well enough that it didn’t really matter if Intel was inside anymore.

But as impressive as the chips were, the years of hype, encouraged both by Transmeta’s lengthy stealth mode and its association with chip-development royalty in the form of Ditzel, meant that there was pushback on the general idea from some corners. In a 2000 piece in Maximum PC, processor pundit Tom Halfhill, a former Byte editor, made it clear that it was all a bit too much:

Let’s cut through the hype, which is thicker than the Pacific fog that rolls over Silicon Valley on winter mornings. Transmeta’s “revolutionary” new Crusoe processors are actually proprietary VLIW chips with x86 software emulators. They aren’t revolutionary, and they aren’t nearly as fast as their clock speeds imply: According to Transmeta, a 700MHz Crusoe delivers about the same performance as a 500MHz Intel Pentium III.

But even given that critique, he was quick to concede that, yes, these chips were nonetheless worth talking about. “Transmeta has employed some innovative technology in Crusoe, particularly for conserving power,” he added.

It was easy to dismiss what Transmeta was doing as “just” emulation, but that proved a limited way of thinking about it in the long run. The company was also doing something else—cutting back on the need to fit so many transistors on a single chip, a race to keep up with Moore’s Law that by 2023 has led us to a state of affairs where we genuinely don’t know whether transistors can shrink much further. It was an interesting idea—a computer designed to fit a specific thermal envelope, rather than one that treated the thermal envelope like it didn’t exist—and one that later machines, especially on the ARM side, have proven correct, even if they didn’t start emulating x86 instructions on a VLIW core.

In fact, chipmakers of the era took inspiration from this point, to Transmeta’s detriment. The Pentium M, released in 2003, seemed to be something of a response to the Crusoe: proof that machines could be designed with a thermal envelope in mind without needing any of that “code morphing” magic that was at the center of Transmeta’s appeal.

“TM put battery life on the map as a key issue for mobile products, and they succeeded in convincing both Intel and AMD that in many situations, portability and flexibility are more important to mobile users than performance,” Ars Technica’s Jon Stokes wrote of the Pentium M, the fulcrum around which Intel’s Centrino mobile computing platform was built.

The Centrino approach was a huge hit—and likely played a role in convincing Apple to move to Intel—but it was not good news for Transmeta, which had finally brought excitement to the mobile computing space, only to get undercut by boring old CISC.

The Sharp PC-MM2, one of the few machines to ship with an Efficeon. (Wikimedia Commons)

In 2004, Transmeta extended this approach with the Efficeon, a processor comparable to a Pentium 4 in processing capability but with a significantly smaller die, meaning it was within shouting distance of the processor it was emulating without all the extraneous stuff. The company needed it to be a hit—as noted in a 2004 InfoWorld article, the firm had lost nearly $600 million trying to bring the Crusoe to market, only for Intel to swoop in with the Pentium M and eat its lunch.

Unfortunately, it was not, and by 2005, the company had stopped developing its own chips and begun licensing its key technologies to outside vendors such as Sony, NEC, and later NVIDIA.

Transmeta, for understandable reasons, also sued Intel, arguing that 10 of its patents had been violated in the development of recent Pentium chipsets—a case that eventually led to a $250 million settlement in 2007.

The firm, despite having some luck licensing its patents, sputtered from there: it was sold in 2009 to Novafora, a firm that shut down almost immediately after the acquisition went through, and its patents eventually ended up with a patent holding company.

Its ideas proved greatly influential on the processor space, but Intel is just a hard company to beat.

“Our community should rally around a single ISA to test whether a free, open ISA can work.”

— David A. Patterson and Krste Asanović, in a paper titled Instruction Sets Should Be Free: The Case For RISC-V, which (like the prior paper that Patterson and Ditzel wrote 35 years earlier) makes the case for a new type of processor architecture to take over. Rather than going with a newer type of architecture like VLIW, which the authors note was never commercially successful, they argued for a modernized version of RISC that could then be open-sourced and built upon.

In the years after Transmeta moved from stealth startup to also-ran with a killer patent portfolio, David Ditzel gradually moved away from the company, leaving entirely in 2007.

Interestingly, soon after the patent lawsuit with Intel cleared up, he actually went to work for Intel, which continued to build upon the code morphing approach that Transmeta first developed behind closed doors. (As The Register amusingly put it, “flying pig freezes in hell.”)

But it’s perhaps his most recent act that’s been most interesting. In 2013, he helped to launch another startup, Esperanto Technologies, which has been developing high-end chips using RISC-V, the latest iteration of the RISC technology he helped to formalize with David Patterson many decades ago.

There are so many cores on this single card. (via Esperanto Technologies)

The firm is building high-end processors for machine learning that leverage low-power cores to maximize performance within a limited power budget. In 2020, the company made major waves with its ET-SoC-1 processor, which put 1,100 tiny cores on a single chip—making them highly performant while needing nowhere near the power consumption of a high-end GPU. This is a chip designed to work on a PCIe card, while accepting that the brains of the server are going to be x86. Then, as now, Ditzel’s company is trying to keep power consumption down.

“Esperanto’s challenge was how to put the highest recommendation performance onto a single PCIe-based accelerator card using no more than six of our chips and no more than 120 watts,” Ditzel said in a 2021 HPC Wire interview.

In a 2017 interview with EE Times, Ditzel said that he had found deep inspiration in RISC-V, whose open-source nature and maturity offer deep flexibility for building new types of processors for widely varied use cases. (Patterson, his old professor, even joined the company.)

“I wasn’t going to do something unless it could be bigger than Transmeta—it’s get big or go home,” he said.

In a world where it’s easy to see things as black or white, RISC or CISC, it’s good to know that people are developing chips that stretch the expected logic. The jury’s still out on whether RISC-V might upset the apple cart, but it might just be possible, in the long run, to make a chip as interesting as the Transmeta Crusoe.

--

Find this one a fascinating read? Share it with a pal!

Also, be sure to give our quick-hit newsletter Lesser Tedium a look—and get a two-minute look at one of the many elements in Tedium’s deep archives.

Is now the time to upgrade your news diet? Our sponsor 1440 can help.