Friday, December 30, 2011

No Rants Just Sincere Concerns for Linux at Home

2012 is a few hours away. Thankfully, no one across OSS forums or other social media is proclaiming the "Year of the Linux Desktop" anymore.

The Linux desktop is not going to die, but it won't win more home desktop users either. The present stack of servers will continue to work day and night in the racks, Linux servers will power my office website, Android cell phones will probably gain more users, Windows 8 may go unnoticed on the mobile platform, a million other appliances will run on Linux, more Linux distributions will surface, some will die unmaintained, there will be hue and cry about major improvements in some of the core and sub-core components of Linux, four new kernel versions will be released, GNOME Shell will ride version 3.6, we'll see some more forking of GNOME and of other desktop environments as well as distributions... so on and so forth. But the desktop user share will keep hovering around 1%.

There'll be arguments and counter-arguments: Windows comes preinstalled, resistance to change, lack of games on the Linux platform, lack of drivers, lack of marketing, Microsoft and the hardware vendors are partners in crime, and on and on...

Are these arguments valid? If yes, to what extent? More importantly, are they going to bring any change? A big NO. Why?

I sense there are far more serious issues than the lack of application software, device drivers, games, advertising/marketing, or Microsoft's ill practices. I've seen vast improvements in all these areas. Where Linux and the community lag behind is in getting the basics of home computing right. In other words, the platform suffers from bad integration of components. How?

Gone are the days of dirty commands on a text console. In 2012, 99% of users are going to use computers as just another electronic appliance. They are not going to worry about what's going on under the hood as long as it works. I've been using Linux for more than a decade (two of my home PCs run Linux - Debian and PCLinuxOS; hundreds of my office workstations run Linux - CentOS and Fedora), and I even administered some desktops for some time. Here is my list of the basics that always go wrong in Linux:

 - Bad Workarounds
 - Overlapping Packages/Procedures
 - Problematic Sub-core Components
 - Fast Moving Base and Major Components
 - Ever Changing Desktop Metaphor

  • Bad Workarounds: In today's appliance-oriented world, either it works or it doesn't. Anything in between is an annoyance. Device drivers in the kernel staging area, especially for graphics and wifi, fall into the user-annoyance category. Here are a couple of issues from my personal experience. Let's first talk about the Radeon HD 6250 graphics that came with the Asus Eee PC 1215B. AMD proclaimed it had open-source drivers (in kernel 2.6.38) ready for all the Fusion APUs (my chip falls into this category); besides, it had proprietary binary Linux drivers too. Sadly, driver support in both cases is a workaround, far from comparable to the Windows counterparts. The open-source driver was damn slow, lacked much of the device's features, and even its power management was shoddy. Installing the proprietary driver was cryptic: it had to match a certain version of xorg, required the libva and xvba-va-driver packages, and required the media players to be set to xv or x11 or opengl output (can't recall the exact one). After all that, be prepared to see some conflicts with your desktop environment. I had a similar experience with my Broadcom wifi card. First there was no open-source driver support, but it worked via ndiswrapper. Then one morning it broke: the open-source b43 driver superseded it. I had to blacklist the module (a sketch of that dance appears after this list) and finally switched to the Windows driver, which would switch itself on and off like a pain in the ass. Only with the Linux 3.0 kernel did it work as it was supposed to. In my view, neither the device maker nor the community should tout that "it works" or "it is supported" if it doesn't work the way the manufacturer's specs claim. Users don't care whether a driver is open source or closed; all they expect is no ugly surprises. For your information, the open-source Radeon HD 6250 driver performs worse than the GMA 3150 IGP that ships with any Atom Pine Trail CPU.
  • Overlapping Packages/Procedures: Choice is good as long as the procedures don't multiply and overlap. Human life is too short to waste reinventing the same wheel. All the distros are free, but the time it takes to settle on one after a bit of hopping is not. There is no benefit in learning multiple procedures for doing the same thing, which should otherwise happen unnoticed. Take package management, for example. In your distro-hopping you're likely to run into terms such as apt-get, yum, pacman, PPA, official repo, unofficial repo, free, non-free, restricted, local installation, remote installation, package pinning, compiling from source, and so on (see the package-manager sketch after this list). The other day, a Linux guy at my office was fumbling with yum commands on my Debian Squeeze box (naturally, none of them worked); he's just too used to yum on CentOS. No one to blame. Similarly, you'll see a bewildering multiplicity of commands/tools/packages for configuring your desktop, setting up the network, and doing user administration. Overlapping packages are even funnier. I can't remember exactly whether it was on Ubuntu, but after installing both KDE and GNOME I saw network-manager-kde and network-manager-gnome in my list of installed packages - yet only network-manager-gnome would work, on both KDE and GNOME. There are a dozen other packages just to monitor/show/configure your network. It's not funny, it's ugly. You'll see similar multiplicity in the sound components - ALSA, PulseAudio, OSS, blah, blah, blah. Multiplicity in application software such as office suites and graphics software is OK; those don't change across distributions. But there should be a certain degree of order in desktop configuration tools/packages. For now the situation is a mess.
  • Problematic Sub-core Components: Consider the sub-core components such as sound, graphics, network devices, printers, etc., and their integration on top of Linux. In Windows you need to install a certain driver (100% of the time supplied by the manufacturer), and that's it. But in Linux, proper sound is not just about something called a driver - it involves the kernel, ALSA, OSS, PulseAudio, and some packages to wrap those packages around each other. What's worse, even after all these packages talk to each other well, sometimes you can't hear any sound, or can't send output over HDMI, or the sound is too choppy. Then, of course, you fiddle with sound-channel settings, reconfigure ALSA, and what not (a sketch of this layer-by-layer troubleshooting appears after this list). You are overwhelmed by the presence of half a dozen items in your application menu just for sound - pulse volume manager, alsa-mixer, audio manager, sound, etc. Same thing with graphics: there's no such thing as just a graphics driver; it's a set of packages - kernel, xorg, the proper modules, proper players, plugins, proper wrappers, etc. You take some time to figure it all out and put it together well. You're happy until an update breaks it all.
  • Fast Moving Base and Major Components: Frequent releases of base and major components often ward off prospective Linux users. It's like the world's fastest roller coaster: it never stops long enough for you to get on. Consider the desktop environments GNOME and KDE. There was an almost perfect KDE 3.5. Then came the disastrous KDE 4.0 final (quite some time after the official alpha, beta, and RC releases). The 4.0 final was worse than the industry-standard alpha release of most software, and it took roughly three more years of baking to be usable by all. The same can be said of GNOME 3. The point is that the Linux community doesn't let a stable, well-working major component stay around long enough to be used. A major percentage of the software timeline here is a kind of never-ending work-in-progress. Any major upgrade of the software stack in the GNU/Linux world, even after it goes through the usual alpha, beta, and RC cycle, is far from usable/stable/bug-free; the bugs outweigh the features. Compiz, PulseAudio, and many other major packages have had similarly buggy health long after they had been touted as final versions.
  • Ever Changing Desktop Metaphor: In some ways this is similar to the previous issue. At some point in Linux history, KDE 3 was very popular. It kept improving and became rock-stable by 2008 with its 3.5.9 iteration. Then came the unholy KDE 4 in Fedora 9; it smelled more pungent than sulphur. Even some KDE fanboys, after being stuck with KDE 4 for a couple of months, left the camp to seek refuge in good old GNOME 2. Sure, it was nowhere near as configurable as KDE 3, but it worked. Alas! The GNOME team gave a similar jerk with their 3.0 release in Fedora 15. Nobody likes endlessly configuring a desktop that never gets quite right. You'd be up to your neck in some urgent work and your system would fuck the hell out of you. The desktop environment designers should understand that users develop a workflow and keyboard/mouse habits, get used to certain tweaks and tricks, and settle into an environment. Upending all that for the heck of it is, in my opinion, the worst thing one can do. Users will go away looking for their familiar working desktop metaphor.
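
To make the wifi mess concrete, here's roughly what the blacklist dance looked like - a minimal sketch, assuming a Broadcom chip served by the in-kernel b43 driver; the exact file under /etc/modprobe.d/ varies by distro:

    # Stop the in-kernel b43 driver (and its ssb companion) from
    # loading at boot:
    echo "blacklist b43" | sudo tee -a /etc/modprobe.d/blacklist.conf
    echo "blacklist ssb" | sudo tee -a /etc/modprobe.d/blacklist.conf

    # Unload them from the running kernel, then load the ndiswrapper
    # module that wraps the Windows driver instead:
    sudo modprobe -r b43 ssb
    sudo modprobe ndiswrapper

This is exactly the kind of thing a home user should never have to see.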
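
As for the multiplying procedures, here is the same trivial task - installing the VLC media player - expressed in four dialects (a sketch; on CentOS it additionally assumes a third-party repository is already configured):

    sudo apt-get install vlc     # Debian, Ubuntu
    sudo yum install vlc         # Fedora, CentOS
    sudo pacman -S vlc           # Arch
    sudo zypper install vlc      # openSUSE

One task, four vocabularies - and that's before repos, pinning, and source builds enter the picture.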
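
And here is what "no sound" troubleshooting typically looks like, layer by layer - a sketch, assuming a stock ALSA + PulseAudio stack; the sink index below is hypothetical:

    aplay -l                  # layer 1: does the kernel/ALSA even see the card?
    alsamixer                 # layer 2: unmute Master/PCM (the M key toggles mute)
    pacmd list-sinks          # layer 3: which sinks does PulseAudio know about?
    pacmd set-default-sink 0  # route output to sink 0 (hypothetical index)
    speaker-test -c 2         # test raw ALSA output, bypassing PulseAudio
    paplay /usr/share/sounds/alsa/Front_Center.wav   # test through PulseAudio

Three different subsystems just to answer "why is my laptop silent?" - that's the integration problem in a nutshell.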

I'm sure the situation is not going to change. I will still be using Linux at home and at the office. The other 99% will keep using Microsoft or Apple products, despite Linux being inherently more secure.
