Tuesday, June 16, 2020

Apple ARM Mac rumors

The latest rumor is that Apple is going to announce Macintoshes based on ARM processors at their developer conference. I thought I'd write up some perspectives on this.


It's different this time

This would be the Macintosh's fourth processor architecture, but only Apple's third such transition. The original Macintoshes in 1984 used Motorola 68000 microprocessors. Apple moved to the PowerPC (a joint Apple/IBM/Motorola design) in 1994, then to Intel's x86 in 2005.

However, this history is almost certainly the wrong way to look at the situation. In those days, Apple had little choice: each transition happened because the processor they were using was failing to keep up with technological change. They had to move to a new processor or fall behind.

This no longer applies. Intel's x86 is competitive on both speed and power efficiency. It's not going away. If Apple transitions away from x86, they'll still be competing against x86-based computers.

Other companies have chosen to adopt both x86 and ARM, rather than one or the other. Microsoft's Surface laptops come in both x86 and ARM versions (the Surface Pro 7 and the Surface Pro X). Amazon's AWS cloud servers come in both x86 and ARM (Graviton) versions. Google's Chromebooks come in both x86 and ARM versions.

Instead of ARM replacing x86, Apple may be attempting to provide both as options, possibly an ARM CPU for cheaper systems and an x86 for more expensive and more powerful systems.


ARM isn't more power efficient than x86

Every news story, every single one, is going to repeat the claim that ARM chips are more power efficient than Intel's x86 chips. Some will claim it's because they are RISC whereas Intel is CISC.

This isn't true. RISC vs. CISC was a meaningful distinction in the 1980s, when chips were so small that the instruction set dictated the underlying architecture. Since the mid-1990s, when "out-of-order" processors arrived, the instruction set has been almost completely decoupled from the underlying microarchitecture. Instruction-set differences account for at most about 5% of the difference in processor performance or efficiency.

Mobile chips consume less power simply by being slower. Scale a mobile ARM CPU up to desktop speeds and it consumes desktop levels of power. Conversely, scale an Intel x86 processor down to mobile power-consumption levels and it's just as slow. You can test this yourself by comparing Intel's mobile-oriented "Atom" processors against the ARM processors in a Raspberry Pi.
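
If you want to run that comparison yourself, the following is a minimal sketch of the sort of program I mean: a tiny, portable single-core benchmark in C that you can compile unchanged on an Atom box and on a Raspberry Pi, then compare the run times (and wall power, if you have a meter). It's my own illustrative example, not a rigorous benchmark.

    /* crude single-core benchmark: time a fixed amount of integer work */
    /* build the same way on both machines, e.g.:  gcc -O2 spin.c -o spin */
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void)
    {
        struct timespec start, end;
        volatile uint64_t x = 0;   /* volatile keeps the loop from being optimized away */

        clock_gettime(CLOCK_MONOTONIC, &start);
        for (uint64_t i = 0; i < 1000000000ULL; i++)
            x += i ^ (x >> 3);     /* simple dependent integer work */
        clock_gettime(CLOCK_MONOTONIC, &end);

        double secs = (end.tv_sec - start.tv_sec) +
                      (end.tv_nsec - start.tv_nsec) / 1e9;
        printf("1e9 iterations in %.2f seconds\n", secs);
        return 0;
    }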

Moreover, the CPU accounts for only a small part of overall power consumption. Mobile platforms care more about the graphics processor or video acceleration than they do the CPU. Large differences in CPU efficiency mean small differences in overall platform efficiency.

Apple certainly balances its chips so they work better in phones than an Intel x86 would, but these tradeoffs mean they'd work worse in laptops.

While overall performance and efficiency will be similar, specific applications will perform differently. Thus, when ARM Macintoshes arrive, people will choose just the right benchmarks to "prove" their inherent superiority. It won't be true, but everyone will believe it to be true.



No longer a desktop company

Venture capitalist Mary Meeker produces yearly reports on market trends. The desktop computer market has been stagnant for over a decade in the face of mobile growth. The Macintosh is only about 10% of Apple's business -- so little that they could abandon it without noticing a difference.

This means investing in the Macintosh business is a poor business decision. Such investment isn't going to produce growth. Investing in a major transition from x86 to ARM is therefore stupid -- it'll cost a lot of money without generating any return.

In particular, despite having a mobile CPU for their iPhone, they still don't have a CPU optimized for laptops and desktops. The Macintosh market is just too small to fund the investment required. Indeed, that's why Apple had to abandon the 68000 and PowerPC processors before: their market was just too small to fund the development needed to keep those processors competitive.

But there's another way to look at it. Instead of thinking of this transition in terms of how it helps the Macintosh market, think in terms of how it helps the iPhone market.

A big reason for Intel's success against all its competitors is that it's what developers use. I can use my $1000 laptop running Intel's "Ice Lake" processor to optimize AVX-512 number-crunching code, then deploy it on a billion-dollar supercomputer.
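
To make that concrete, here's a hypothetical sketch of the kind of code I mean, written with the standard AVX-512 intrinsics (the function itself is my own example, nothing special): it sums an array sixteen floats at a time, and the exact same source file runs on an Ice Lake laptop and on an AVX-512 Xeon server.

    /* sum an array of floats, 16 lanes at a time, using AVX-512 intrinsics */
    /* build: gcc -O2 -mavx512f sum512.c -o sum512 */
    #include <immintrin.h>
    #include <stdio.h>
    #include <stddef.h>

    static float sum_avx512(const float *a, size_t n)
    {
        __m512 acc = _mm512_setzero_ps();
        size_t i = 0;
        for (; i + 16 <= n; i += 16)
            acc = _mm512_add_ps(acc, _mm512_loadu_ps(a + i));
        if (i < n) {
            /* handle the tail with a masked load instead of a scalar loop */
            __mmask16 m = (__mmask16)((1u << (n - i)) - 1);
            acc = _mm512_add_ps(acc, _mm512_maskz_loadu_ps(m, a + i));
        }
        float lanes[16];               /* horizontal sum of the 16 lanes */
        _mm512_storeu_ps(lanes, acc);
        float sum = 0.0f;
        for (int k = 0; k < 16; k++)
            sum += lanes[k];
        return sum;
    }

    int main(void)
    {
        float data[100];
        for (int i = 0; i < 100; i++)
            data[i] = 1.0f;
        printf("sum = %f\n", sum_avx512(data, 100));   /* expect 100.0 */
        return 0;
    }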

A chronic problem for competing processors has always been that developers couldn't develop code on them. As a developer, I simply don't have access to computers running IBM's POWER processors. Thus, I can't optimize my code for them.

Developers writing code for ARM mobile phones, either Androids or iPhones, still use x86 computers to develop the code. They then "deploy" that code to mobile phones. This is cumbersome and only acceptable because developers are accustomed to the limitation.

But if Apple ships a MacBook based on the same ARM processor as their iPhone, then this will change. Every developer in the world will switch. This will make development for the iPhone cheaper, and software will be better optimized. Heck, even Android developers will want to switch to using MacBooks as their development platform.

Another marketing decision is to simply fold the two together in the long run, such that iOS and macOS become the same operating system. Nobody knows how to do this yet, as the two paradigms are fundamentally different. While Apple may not have a specific strategy for getting there, they know that a common hardware platform is one step in that direction, so that a single app could run on both platforms.

Thus, maybe their long-term goal isn't so much to transition Macintoshes to ARM as to make their iPads and MacBooks indistinguishable, such that adding a Bluetooth keyboard to an iPad makes it a Macintosh, and removing the attached keyboard from a MacBook makes it into an iPad.


All the tech companies are doing it

The model we have is that people buy computers from vendors like Dell in the same way they buy cars from companies like Ford.

This is not how major tech companies work. Companies like Dell don't build computers so much as assemble them from commodity parts. Anybody can assemble their own computers just as easily as Dell. So that's what major tech companies do.

Such customization goes further. Instead of an off-the-shelf operating system, major tech companies create their own, like Google's Android or Apple's macOS. Even Amazon has their own version of Linux.

Major tech companies go even further. They design their own programming languages, like Apple's Swift or Google's Golang. They build entire "stacks" of underlying technologies instead of using off-the-shelf software.

Building their own CPUs is just the next logical step.

It's made possible by the change in how chips are made. In the old days, chip designers were the same as chip manufacturers. These days, that's rare. Intel is pretty much the last major company that does both.

Moreover, instead of designing a complete chip, companies design subcomponents. An ARM CPU is just one component. A tech company can grab the CPU design from ARM and combine it with other components, like crypto accelerators, machine-learning engines, memory controllers, I/O controllers, and so on, to create a chip perfect for their environment. They then go to a company like TSMC or GlobalFoundries to fabricate the chip.

For example, Amazon's $10,000 Graviton 1 server and the $35 Raspberry Pi 4 both use the ARM Cortex-A72 core, but on radically different chips with different capabilities. My own microbenchmarks show that the CPU cores run at the same speed, but macrobenchmarks running things like databases and webservers show vastly different performance, because the rest of the chip outside the CPU cores is different.
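
A sketch of what I mean by the micro/macro split (my own illustrative example, with made-up sizes and counts): the first loop below stays inside the core and mostly measures the Cortex-A72 itself, so it runs about the same everywhere; the second chases pointers through a buffer far larger than the caches and mostly measures the memory system built around the core, which is where the Graviton and the Pi differ.

    /* core-bound vs. memory-bound loops on the same CPU core */
    /* build: gcc -O2 corevsmem.c -o corevsmem */
    #include <stdio.h>
    #include <stdint.h>
    #include <stdlib.h>
    #include <time.h>

    static double now(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec / 1e9;
    }

    int main(void)
    {
        /* core-bound: dependent integer arithmetic, no memory traffic */
        double t = now();
        volatile uint64_t x = 1;
        for (uint64_t i = 0; i < 500000000ULL; i++)
            x = x * 6364136223846793005ULL + 1;
        printf("core-bound:   %.2f s\n", now() - t);

        /* memory-bound: pseudo-random pointer chasing through 256 MB */
        size_t n = (256u << 20) / sizeof(uint32_t);
        uint32_t *next = malloc(n * sizeof(uint32_t));
        for (size_t i = 0; i < n; i++)
            next[i] = (uint32_t)((i * 2654435761u + 1) % n);
        t = now();
        uint32_t p = 0;
        for (uint64_t i = 0; i < 50000000ULL; i++)
            p = next[p];
        printf("memory-bound: %.2f s (%u)\n", now() - t, p);
        free(next);
        return 0;
    }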


Apple is custom ARM

When transitioning from one CPU to the next, Apple computers have been able to "emulate" the older system, running old code, though much slower.

ARM processors have some problems when trying to emulate x86. One big problem is multithreaded synchronization. The two architectures have subtly different memory-ordering rules, differences most software developers never have to think about, such that multicore code written for x86 sometimes has bugs when recompiled for ARM processors.
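
Here's a deliberately buggy sketch (my own example) of the classic case, using C11 atomics: the writer sets a flag with "relaxed" ordering, which doesn't force the earlier write to data to become visible first. x86's stronger memory model keeps stores and loads in program order, so this code happens to work there anyway; on ARM's weaker model, the reader can occasionally see the flag set while data still reads 0. The fix is release/acquire ordering, noted in the comments.

    /* message-passing with ordering that is too weak -- works on x86, can fail on ARM */
    /* build: gcc -O2 -pthread reorder.c -o reorder */
    #include <stdatomic.h>
    #include <pthread.h>
    #include <stdio.h>

    static int data = 0;
    static atomic_int ready = 0;

    static void *writer(void *arg)
    {
        (void)arg;
        data = 42;
        /* buggy: should be memory_order_release */
        atomic_store_explicit(&ready, 1, memory_order_relaxed);
        return NULL;
    }

    static void *reader(void *arg)
    {
        (void)arg;
        /* buggy: should be memory_order_acquire */
        while (atomic_load_explicit(&ready, memory_order_relaxed) == 0)
            ;                        /* spin until the flag is set */
        printf("data = %d\n", data); /* may print 0 on ARM */
        return NULL;
    }

    int main(void)
    {
        pthread_t a, b;
        pthread_create(&a, NULL, reader, NULL);
        pthread_create(&b, NULL, writer, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        return 0;
    }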

Apple's advantage is that they don't simply license ARM's designs, but instead design their own ARM-compatible processors. They are free to add features that make emulation easier, such as x86-style memory ordering among threads. Thus, while x86 emulation is difficult for their competitors, as seen on Microsoft's ARM-based Surface notebooks, it'll likely be easier for Apple.

This is especially a concern since ARM won't be faster. In the previous two CPU changes, Apple moved to a much faster CPU, so the slowdown in older apps was compensated by the speedup in new and updated apps. That's not going to happen this time around, as everything will be slower: a ton slower for emulated apps and slightly slower for native ARM apps.


ARM is doing unto Intel what Intel did unto others

In the beginning, there were many CPU makers, including high-end architectures like MIPS, SPARC, PA-RISC, and so on. Essentially all of those high-end CPUs have since disappeared.

The reason came down to the fact that you often ended up spending 10x the price for a CPU that was only 20% faster. In order to justify the huge cost of development, niche CPU vendors had to charge insanely high prices.

Moreover, Intel would come out with a faster CPU the next year that matched yours in speed, while it took you several more years to produce your next generation. Thus, even if your next model was faster than Intel's when it launched, in the stretch right before it shipped you were slower. On average, year after year, you didn't really provide any benefit.

Thus, Intel processors moved from low-end desktops to workstations to servers to supercomputers, pushing every competing architecture aside.

ARM is now doing the same thing to Intel that Intel did to its competitors.

ARM processors start at the insanely low end. Your computer likely already has a few ARM processors inside, even if it's an Intel computer running Windows. The hard drive probably has one. The WiFi chip probably has one. The fingerprint reader probably has one. Apple puts an ARM-based security chip in all its laptops.

As mobile phones started getting more features, vendors put ARM processors in them. They were incredibly slow, but slow meant they consumed little power. As chip technology became more efficient, batteries held more charge, and consumers accepted carrying larger batteries, ARM processors got steadily faster.

To the point where they compete with Intel.

Now servers and even supercomputers are being built from ARM processors.

The enormous volume of ARM processors means that ARM can put resources behind new designs. Each new generation of ARM Cortex processors gets closer and closer to Intel's in performance.


Conclusion

ARM is certainly becoming a competitor to Intel. Yet the market is littered with the corpses of companies who tried to ride this wave and failed. Just Google "ARM server" stories from the last 10 years to see the glowing coverage of some company releasing an exciting new design, only to go out of business a year later. While ARM can be "competitive," sometimes matching Intel feature for feature, it really has no "compelling" feature that makes it better, that makes it worth switching. The supposed power-efficiency benefit is just a myth that never pans out in reality.

Apple could easily make a desktop or laptop based on the ARM CPU it already ships in the iPhone. The only reason it hasn't done so already is marketing. If they just produce two parallel notebook lines like Microsoft does, they leave customers confused about which one to buy, and many customers respond by buying neither.

One option is to replace their entire line and make a complete break with the past, as they've done twice before. Another is to differentiate by market segment, such as the education market or the thin-and-light market (like their previous 12-inch MacBook), while still providing high-end systems based on beefy Intel processors and graphics accelerators from Nvidia and AMD.







