OSS Usability and Linux: What?

I found an interesting article in the Digg queue today titled, "Windows & OSS: The Usability Problem".

After giving it a read, I realized that I just don't agree with most of the author's points. Allow me to unleash the hounds:

Lack of standardized user interface

Users of open source operating systems are spoilt for choice: Gnome, KDE and Xfce, to name only a few desktops, and Blackbox, WindowMaker, AfterStep, FluxBox, fvwm and mwm, just to name a few of the available window managers. Yes, diversity is generally a good thing, but consider how confused an average Windows user must feel when all the programs look and behave differently among different desktops.


Well, I've got some ground-breaking news for you: It's been a long time since I've used a distribution of Linux that actually asked me which desktop environment I wanted to use. I think it's safe to say that most Linux users use either KDE or GNOME [1].

On all my systems, except on the Solaris boxes, I am running KDE. Why? Because it lets me concentrate on getting the work done and does not bother me with trivial tasks like mounting and unmounting devices and the like.


I've got some more news here: That's why everyone runs GNOME too. Both desktop environments offer a complete set of good applications for daily activities that are easy to use.

No, I am not using Gnome because of the poor GTK architecture and the lack of basic stability. On all my systems, from laptops to workstations, Gnome did not perform well at all.


Poor GTK architecture? Lack of "basic stability"? Since the author carefully failed to provide evidence to support his argument, I'm going to ask all the GNOME/Ubuntu users out there: Have you experienced a lack of "basic stability"? Perhaps someone can point me to the source of this basic instability?

Diversity among Linux kernels

While the author does make some valid points here, I think many people view the Linux kernel the wrong way. Vendor-specific kernels allow vendors to do better QA testing to ensure their distro works as stably as they'd like, while also letting them add new features in between kernel releases. For example, the Ubuntu kernel often has features backported from the next "unstable" kernel release, to the benefit of the users. In turn, new features and bugfixes that the Ubuntu team develops are sent back "upstream" to the official Linux kernel [2], so that other distributions can benefit from them as well. That's how open source is done these days. Essentially, new features and bugfixes are only vendor-specific until they're sent upstream and included in the next official Linux kernel release.

Hardware known to work on one system does not work on the other due to missing drivers or modules


Let me ask the reader the obvious question: If your hardware works in Ubuntu, would you expect it to work on OS X? Of course not. So is it fair to expect that it would also work in Fedora Core? Fedora is a different operating system from Ubuntu, just as OS X is.
I'll finish this thought below...

Effectively no software and hardware certification standards

The author again makes valid points here, but I'd like to mention my own thoughts:
As far as I can tell, a piece of software that's included in the default GNOME desktop environment is as close to being "certified software" as possible. It's guaranteed to be stable, have a consistent, friendly UI, and, in general, be useful. I'd rather have my apps be GNOME- and Ubuntu-certified than LSB-certified. See where I'm heading with this?

Poor stability of many user programs

The days of horribly unstable Linux apps are mostly gone, and that's entirely due to the quality standards that have been set by distributions like Ubuntu and Fedora. It's up to the distributions to make sure they package stable software, and the major ones do!
Also, that big paragraph about Sally installing an RPM doesn't really apply to apt-based distributions like Debian and Ubuntu. (If the package wasn't already in a repository, then a properly created package downloaded from the web wouldn't have a problem.) :P
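To make the "properly created package" point concrete, here's a minimal sketch of what dpkg actually consumes on a Debian/Ubuntu system. The package name (hello-demo) and all the metadata below are hypothetical, invented for illustration; dpkg-deb can build and inspect such a package without root, and it's this control metadata that lets apt resolve dependencies cleanly instead of leaving Sally in RPM purgatory:

```shell
set -e
tmp=$(mktemp -d)

# A .deb is just an archive with a DEBIAN/control metadata file inside.
mkdir -p "$tmp/hello-demo/DEBIAN"
cat > "$tmp/hello-demo/DEBIAN/control" <<'EOF'
Package: hello-demo
Version: 1.0
Section: misc
Priority: optional
Architecture: all
Maintainer: Example Person <demo@example.com>
Description: Trivial demonstration package
EOF

# Build the package (no root needed for this step)...
dpkg-deb --build "$tmp/hello-demo" "$tmp/hello-demo_1.0_all.deb"

# ...and show the metadata apt/dpkg would use when installing it.
dpkg-deb --info "$tmp/hello-demo_1.0_all.deb"
```

Installing it would then be a one-liner (dpkg -i, or apt if it lives in a repository), with dependencies declared right in that control file.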

I'm not even going to start on the documentation stuff. (Actually, I lied: GNOME software is documented well.)

My Conclusion

Over the last few years, I feel a greater distinction has emerged between "Linux" and Linux distributions. What exactly is Linux these days?
When I use Ubuntu, I'm using a Linux kernel enhanced by the Ubuntu team. When I browse the internet, I'm using GNOME and Firefox, both of which were tweaked by the Ubuntu team. The Ubuntu team didn't choose all of this software and tweak it accordingly just to be different: They did it to tackle the usability problem and the standardized user interface problem. They did it to provide a rock-solid kernel with the best hardware compatibility out there, to ensure users have stable software, and to ensure that software is quality software. Did I mention they do it every six months too?

The article in question is another example of an article that would have held its ground three years ago, but my, how things have changed.


[1] Desktop Linux 2006 Survey Results
[2] Search the 2.6.18 kernel changelog for the word "ubuntu" to see what I mean.