In 2006, open-source pundit Eric S. Raymond (ESR) published a document, "World Domination 201", in which he discussed the upcoming transition to 64 bits. He claimed that Microsoft would have a troubled transition with 64-bit Windows, which would give Linux the opportunity to step in and take over the desktop.
Yet the reverse has happened. Windows long ago seamlessly transitioned to 64 bits, but Linux still hasn't completed the jump.
Here is today's purchase page for Dell's cheapest desktop. It allows you to choose your OS -- but only between two different versions of 64-bit Windows. I looked further into it: Dell will not sell "home" users (of desktops/notebooks) 32-bit Windows. You must order a "business" system for that.
Linux is the reverse. The most popular distros recommend 32-bit. Here is a picture taken today (Feb 8, 2011) of the Ubuntu download page, where they recommend that people download their 32-bit version:
Suse also defaults to 32-bits:
Fedora defaults to 32-bits:
As for Mac OS X, every system is both "64-bit" and "32-bit" at the same time. The 32-bit kernel can run 64-bit applications, and the 64-bit kernel can run 32-bit applications. You can tell the kernel to boot in either 32-bit or 64-bit mode; low-end systems (notebooks) default to 32-bit, while high-end systems (like the Mac Pro) default to 64-bit mode.
So what went wrong?
ESR's paper predicted what would go wrong:
"Linux is still an operating system developed by geeks and hackers for geeks and hackers. The disconnect between us and the non-technical end user is still vast, and too many of us like it that way and will actually defend our isolation as a virtue."
There is nothing technically wrong with 64-bit Linux on the desktop. Linux has been "64-bit ready" since 1995, and was the first operating system to run on x64 in 2001 -- two years before x64 hardware shipped. You can even get most 32-bit applications to run by setting up a special environment.
But Adobe Flash doesn't work. WINE is buggy. While 32-bit apps should work in theory, many don't in practice. Distros like Ubuntu don't even come with 32-bit backwards compatibility installed -- you must install it separately. An enormous amount of open-source software has never been upgraded -- it compiles on 64-bit, but doesn't run properly.
Techies can get around all these problems, but average users cannot. Techies can run 64-bit; average users will find it difficult.
Not to mention the failings of even the 32-bit desktop. Windows and Mac users now get to enjoy GPU-accelerated web browsers (including Flash). Heck, Linux still suffers from simple issues like the mouse pointer getting stuck when the system is overloaded. It's not something techies care about, but it is something average users (albeit unconsciously) care a lot about.
How can Linux dominate the desktop?
Linux is already the dominant operating system, from supercomputers to servers to home devices to smartphones. My home wifi gateway runs Linux. So does my DVR from the cable company. So does my Sony TV. My mobile is an iPhone, which runs a BSD-derived operating system, but more Android mobiles (which run Linux) shipped this year.
Stop fighting the battles of the past. Who cares if Microsoft owns the desktop? The future is in other devices, where Linux will surely win.
If anything, root for Apple. Linux techies will never have the compulsive desire to simplify computers that Apple has. If anything will displace Windows on the desktop, it will be Apple, not Linux.
One of the arguments for 32-bit over 64-bit is that it performs better. This is Apple's reasoning for shipping a 32-bit kernel that runs 64-bit apps.
The performance differences are minor. 64-bit code bloats a little, because pointers are now twice as big. This in turn puts more pressure on the cache, slowing things down. On the other hand, x64 doubles the number of registers, which in turn speeds things up.
I find in my own code that 64-bit is slightly faster overall. Either way, it's not a convincing argument why Linux desktops are stuck at 32-bit.
64-bit isn't needed yet
Nonsense. I just purchased 24 gigs of RAM for my desktop for $300.
Certainly, not all applications need 64-bit. For example, Visual Studio 2010 is a 32-bit application (for producing 64-bit code). That's because nothing it does can exceed 4 gigs of RAM.
On the other hand, Microsoft Office is now 64-bit. That's because high-end users do things like creating monster Excel spreadsheets.
I see this most often when I load large packet-captures using Wireshark. The 32-bit version crashes because the files are too big. The 64-bit version happily loads these monster files.
So what do I run? I'm writing this on a MacBook Air running Windows 7 64-bit (and BackTrack 4 under VMware).
Apple's hardware is fantastic, but in the end, I'm a Windows fan.
The overwhelming reason is "keyboard shortcuts". Windows does this better than Mac or Linux. It allows me to whiz through applications much faster than with a mouse. Most people don't care about this, but I care a lot.
The second reason is Visual Studio, Microsoft's C compiler (and other languages). Its source-level debugging is far beyond what you can get with GDB and GDB-derived GUIs like Xcode. I've used GDB since the 1980s, and I still passionately hate it.
Which is weird. I develop my code for Linux, but I develop it under Windows and Visual Studio, and debug portability issues under Mac OS X and Xcode. I hate Linux IDEs that much.
To restate my point: Dell refuses to ship a 32-bit version of Windows for home users, only 64-bit. In contrast, the major Linux desktop distros recommend 32-bit versions of their desktop.
There's a big difference between Dell shipping a new system designed to work with a 64-bit OS, and a download that could be installed on a rather old machine.
If someone asked you which version - either Windows 7 or Linux - to install on an unspecified machine, I bet you would recommend 32-bits as being more certain to work.
You can still get caught out on recent hardware - I had 4 apparently identical HP DC7100 SFF systems, but one had a slightly different CPU variant that did not fully support 64 bits. (I had to check the processor flag bits to spot the difference, the model numbers were identical.)
So it's not surprising that the Linux distros default to 32 bits. I'd hope that pre-installed new Linux systems would pick 64 bits, as Dell does for Windows 7.
If that were true, then the default "server" download would be 32-bit. But on Ubuntu/Fedora, if you select a "server" system, it recommends 64-bit.
The cost of wrongly choosing 64-bits is that the install breaks, and you need to redownload the 32-bit version.
The cost of wrongly choosing 32-bits (assuming 64-bits is better) is that you limp along with a working 32-bit system.
No, if 64-bit Linux gave a superior user experience on 64-bit hardware, they would recommend it.
As I pointed out, though, it doesn't. The basic reason is that Adobe Flash isn't supported. I predict that when Adobe officially supports a 64-bit version for Linux, Ubuntu/Fedora will switch to 64-bit as their recommended platforms.
Good post! A bit off-topic, but I was curious: what keyboard shortcuts do you use so much that you can't live without them?
I myself also use keyboard shortcuts for almost every task I can think of, and I'm always interested to know what other people use :)
First of all, there are the text-editing shortcuts, like <shift-up-arrow> to select the previous line. The fascinating thing is, I don't know what the shortcuts are. If you asked me how to go to the top of the document, I wouldn't be able to tell you. But if I wanted to do it, I would unconsciously press the right combination (which, as it turns out, is <ctrl-Home>).
I prefer the two-key combinations that select menu options over the one-key operations: "<alt-F> S" for "save", rather than <ctrl-S>. Likewise, when looking for something I don't already know, I'll pull down the menu, then use arrow keys to navigate it.
In dialog boxes, I'll use things like <tab> to switch fields.
It is so painful sitting next to somebody else, working with them on something. Whereas I accomplish something in a fraction of a second (like deleting a line), they take several seconds: reaching for the mouse, selecting the line, then reselecting the line because the mouse wasn't quite in the right position, then moving the mouse up to the Edit menu, then selecting Delete.
This is why I'm not a believer in devices like the iPad. For advanced users like me, I simply will something to happen, and it does. I'm no longer consciously aware of the keystrokes I used to accomplish the task -- as far as I'm concerned, the computer has read my mind.
It's like touch typing. I'm no longer aware of typing each individual key. I simply will the text to appear, and it does. That's a problem, because at times unconscious thoughts appear on the screen. I sometimes make typing errors with wholly unrelated words (rather than single-keystroke errors). It's because I was thinking of a word somehow, and it appeared in the text.