Monday, January 16, 2012

AMD Opensource Gallium is a PITA

Last night I was reading an article comparing the proprietary Catalyst drivers with the open-source Gallium drivers. It was an eye-opener. Shoddy performance like this is the reason people prefer proprietary blobs to FOSS bits and pieces. Here is just one example. For more, please visit http://www.phoronix.com/scan.php?page=article&item=amd_rv770_linux31&num=2



Tuesday, January 3, 2012

Did Linux Abandon Netbooks?

Four years back, when the Asus Eee PC line surfaced with its 7" netbooks, we were told that Linux was reviving home computing. The first Eee PC models came with a modified version of Xandros, the then-popular commercial desktop Linux. The popularity of those tiny computers inspired almost every other hardware vendor to come up with comparable models. We saw followers in HP, Lenovo, Sony and Acer, among many others. Keeping pace with the hardware, we also saw many customized Linux distributions that gave full support to the Asus Eee PCs, Acer Aspire One PCs and HP Mini notebooks.

The prominent distributions among them were Eeebuntu, Eeedora, Debian Eee PC, Jolicloud and UNR. Then, suddenly, the trend reversed in favor of XP and Win7. Now I see most of those hobby projects abandoned, though there's no official statement to that effect.

Here are screenshots of the Eee PC support projects from two prominent distributions, Fedora and Debian, taken today. The screenshots (click on them to see them at full resolution) say a lot about their status.

First, Debian Eee PC. The screenshot shows the supported models. As you can see, none of the most popular models from last year are there.

Second, Fedora Netbook. It touts support for the entire Eee PC line (sadly, on the 2.6.35 kernel). I have tried the latest (ridiculously, that's seven months old) Fedora netbook kernel and the eee-control package; neither works on modern Eee PC or Aspire One models.
Third, Eeebuntu, now Aurora, has not been updated for the last year. The release page reads "Aurora Gnome Edition will be the Primary Release from......" That "will be" kills the mood. Have a look at the screenshot.
Of course, you don't always need these customized distributions for your netbook. Many of the models work quite well on the current stock releases of Fedora, Debian and Ubuntu. But the tragedy is that, most often, the special features of these gadgets, such as hotkeys and power control, don't work on the official releases of the mainstream distributions. Funny but true, Android works better on many of these models, though the lack of productivity apps keeps users from deploying it on their netbooks.

Friday, December 30, 2011

No Rants Just Sincere Concerns for Linux at Home

2012 is a few hours away. Thankfully, no one is proclaiming the "Year of the Linux Desktop" anymore across OSS fora or other social media on the web.

The Linux desktop is not going to die, but it won't win more home desktop users either. The present stack of servers will continue to work day and night in the racks, Linux servers will power my office website, Android cell phones will probably gain more users, Windows 8 may go unnoticed on the mobile platform, a million other appliances will certainly run on Linux, more Linux distributions will surface, some will die unmaintained, there will be hue and cry about major improvements in some of the core and sub-core components of Linux, four new kernel versions will be released, GNOME Shell will ride v3.6, we'll see some more forking of GNOME and other desktop environments as well as distributions... so on and so forth. But the user share will keep hovering at roughly 1%.

There'll be arguments and counter-arguments: Windows comes preinstalled, resistance to change, lack of games on the Linux platform, lack of drivers, lack of marketing, Microsoft and hardware vendors are partners in crime, and on and on...

Are these arguments valid? If yes, to what extent? More importantly, is rehashing them going to bring any change? A big NO. Why?

I sense there are far more serious issues than the lack of application software, device drivers, games, advertising/marketing, or Microsoft's ill practices. I've seen vast improvement in all these areas. Where Linux and the community lag behind is in getting the basics of home computing right. In other words, the platform suffers from bad integration of components. How?

Gone are the days of dirty commands on a text console. In 2012, 99% of users are going to use computers as just another electronic appliance. They are not going to worry about what's going on under the hood as long as it works. I've been using Linux for more than a decade (two of my home PCs run Linux, Debian and PCLinuxOS; hundreds of my office workstations run Linux, CentOS and Fedora), and I've even administered desktops for some time. Here is my list of the basics that always go wrong in Linux:

 - Bad Workarounds
 - Overlapping Packages/Procedures
 - Problematic Sub-core Components
 - Fast Moving Base and Major Components
 - Ever Changing Desktop Metaphor

  • Bad Workarounds: In today's appliance-oriented world, either it works or it doesn't. Anything in between is an annoyance. Device drivers in the kernel staging area, especially for graphics and wifi, fall into the user-annoyance category. Here are a couple of issues from my personal experience. Let's first talk about the Radeon HD 6250 graphics that came with the Asus Eee PC 1215B. AMD proclaimed it had open-source drivers (in kernel 2.6.38) ready for all the Fusion APUs (my chip falls into this category); besides, it had proprietary binary Linux drivers too. Sadly, driver support in both cases is a workaround, far from comparable to the Windows counterparts. The open-source drivers were damn slow, lacked many of the device's features, and even power management was shoddy. Installing the proprietary driver was cryptic: it has to conform to a certain version of xorg, requires the libva and xvba-va-driver packages, and requires the media players to be set to xv or x11 or opengl (can't recall the exact one). After all this, be prepared to see some conflicts with your desktop environment. I had a similar experience with my Broadcom wifi card. First there was no open-source driver support, but it worked on ndiswrapper. Then one morning it broke: the open-source b43 drivers superseded it. I had to blacklist them and finally switch back to the Windows drivers, which would turn on and off on their own like a pain in the ass. Finally, with the Linux 3 kernel, it worked as it was supposed to. In my view, the device maker or the community should not tout that "it works" or "it is supported" if it doesn't work the way the manufacturer's specs claim. Users are not worried about whether it's open source or closed. All they expect is not to see any ugly surprise. For your information, the open-source Radeon HD 6250 driver's performance is poorer than that of the GMA 3150 IGP that comes with any Atom Pine Trail CPU.
  • Overlapping Packages/Procedures: Choice is good as long as the procedures don't multiply and overlap. Human life is too short to waste reinventing the same wheel. All the distros are free, but the time spent settling on one after a bit of hopping is not. There is no benefit in learning multiple procedures for the same thing, which should otherwise happen unnoticed. Take package management, for example. In your distro-hopping you're likely to encounter terms such as apt-get, yum, pacman, PPA, official repo, unofficial repo, free, non-free, restricted, local installation, remote installation, package pinning, compiling from source, and so on... The other day, a Linux guy at my office was fumbling with yum commands on my Debian Squeeze box (apparently none worked); he's just too used to yum on CentOS. No one to blame. Similarly, you'll see a bewildering multiplicity of commands/tools/packages for configuring your desktop, setting up the network and doing user administration. Overlapping packages are even funnier. I can't remember exactly whether it was on Ubuntu, but after installing both KDE and GNOME I saw network-manager-kde and network-manager-gnome in my list of installed packages, and only network-manager-gnome would work on both KDE and GNOME. There are a dozen other packages just to monitor/show/configure your network. It's not funny, it's ugly. You might see similar multiplicity in your desktop's sound components: alsa, pulse, oss, blah... blah... blah... Multiplicity in application software such as office suites and graphics software is OK; those don't change across distributions. But there should be a certain degree of order in desktop configuration tools/packages. For now, the situation is a mess.
  • Problematic Sub-core Components: Consider the sub-core components such as sound, graphics, network devices, printers, etc., and their integration on top of Linux. In Windows you need to install a certain driver (100% of the time it is supplied by the manufacturer). That's it. But in Linux, proper sound is not just about something called drivers: it involves the kernel, alsa, oss, pulse, and some packages to glue those packages to each other. What's worse, even after all these packages talk to each other well, sometimes you still can't hear any sound or send the output over HDMI, or the sound is choppy. Then, of course, you fiddle with the sound channels, reconfigure alsa and what not! You are overwhelmed by the presence of half a dozen items in your application menu just for sound, such as pulse volume manager, alsa mixer, audio manager, sound, etc. Same thing with graphics: there's no such thing as just a graphics driver, it's a set of packages - kernel, xorg, proper modules, proper players, plugins, proper wrappers, etc. You take some time to figure it all out and put the pieces together well. You're happy till an update breaks them all.
  • Fast Moving Base and Major Components: Frequent releases of base and major components often ward off prospective Linux users. It's like the world's fastest roller coaster that never stops; there's never a chance to get on. Consider the desktop environments GNOME and KDE. There was an almost perfect KDE 3.5. Then came the disaster of the KDE 4.0 final (quite some time after the official alpha, beta and rc releases). The final 4.0 was worse than an industry-standard alpha release of most software, and it took roughly three years of further baking to become usable by all. The same can be said of GNOME 3. The point is that the Linux community doesn't let a stable, well-working major component stay around long enough to be enjoyed. A major share of the software timeline here is a kind of never-ending work-in-progress. Any major upgrade of the software stack in the GNU/Linux world, even after the usual alpha, beta and rc cycle, is far from usable/stable/bug-free; bugs outweigh features. Compiz, PulseAudio and many other major packages have had similarly buggy health long after they were touted as final.
  • Ever Changing Desktop Metaphor: In some ways this is similar to the previous issue. At one point in Linux history, KDE 3 was very popular. It kept improving and became rock-stable by 2008 with its 3.5.9 iteration. Then came the unholy KDE 4 in Fedora 9; it smelled more pungent than sulphur. Even some KDE fanboys, after being stuck with KDE 4 for a couple of months, left the camp to seek refuge in good old GNOME 2. Sure, it was nowhere near as configurable as KDE 3, but it worked. Alas! The GNOME team gave a similar jerk with their 3.0 release in Fedora 15. Nobody likes spending time configuring a desktop that never gets quite right. You'd be neck-deep in some urgent work and your system would fuck the hell out of you. The desktop environment designers should understand that users develop a workflow and keyboard/mouse habits, get used to certain tweaks and tricks, and settle into an environment. Upending all that for the heck of it is, in my opinion, the worst thing a project can do. Users will go away looking for their familiar working desktop metaphor.
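For the record, the wifi blacklisting mentioned under "Bad Workarounds" above was just a one-line config drop. This is a sketch from memory; the module names are the ones in use at the time (ssb is the bus driver b43 rides on):

```
# /etc/modprobe.d/blacklist.conf
# Stop the in-kernel b43 driver (and its ssb bus driver) from grabbing
# the card, so ndiswrapper can claim it instead.
blacklist b43
blacklist ssb
```

After dropping the file in, rebuild the initramfs or reboot so the blacklist takes effect.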
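To make the package-manager multiplicity under "Overlapping Packages/Procedures" concrete, here's a tiny sketch. The wrapper function is hypothetical (my own invention for illustration); only the underlying package-manager commands are real:

```shell
# Build the install command for a given package manager. Three managers,
# three different spellings of "install this package".
pkg_install_cmd() {
    pm="$1"; pkg="$2"
    case "$pm" in
        apt-get) echo "apt-get install $pkg" ;;   # Debian/Ubuntu
        yum)     echo "yum install $pkg" ;;       # RHEL/CentOS/Fedora
        pacman)  echo "pacman -S $pkg" ;;         # Arch
        *)       echo "unsupported package manager: $pm" >&2; return 1 ;;
    esac
}

# My colleague's mistake, in one line: the yum spelling simply does not
# exist on a Debian box.
pkg_install_cmd apt-get vim   # -> apt-get install vim
pkg_install_cmd yum vim       # -> yum install vim
```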

I'm sure the situation is not going to change. I will still be using Linux at home and at the office. The other 99% will be using Microsoft or Apple products, despite Linux being inherently more secure.

Thursday, November 3, 2011

Debian Squeeze on Asus Eee PC 1215B



Device status and workarounds:

Sound: Works. You need to change settings in alsamixer to get the speakers/headphone jack and the mic working.
Graphics: Works. The default 2.6.32 Squeeze kernel requires radeon.modeset=0 at the end of the kernel line to boot properly, and the resolution tops out at 1024x768. Proper 1366x768 resolution and graphics acceleration are achieved after installing the 2.6.39 kernel (http://cdimage.debian.org/cdimage/unofficial/backports/squeeze/squeeze-custom-amd64-0808.iso) from the backports repo and the proprietary Catalyst driver (http://www2.ati.com/drivers/linux/ati-driver-installer-11-10-x86.x86_64.run) from the AMD website.
Webcam: Works out of the box.
CPU: Works out of the box. Detects both cores and steps up/down as per the load.
Network: Works. The Atheros LAN card works well on the 2.6.39 kernel. The Broadcom 4313 WLAN card works well after installing firmware-brcm80211 (http://packages.debian.org/squeeze/all/firmware-brcm80211/download) from the non-free repo.
Bluetooth: Works out of the box. With gnome-user-share, transferring files over Bluetooth is absolutely painless.
Hotkeys: Don't work, even after installing eeepc-acpi-scripts (http://packages.debian.org/squeeze/all/eeepc-acpi-scripts/download) and adding acpi_osi=Linux to the kernel boot line. eeepc-wmi may resolve this issue, but it's not available yet for Debian Squeeze.

Here's the lspci output you might like to see:

00:00.0 Host bridge: Advanced Micro Devices [AMD] Family 14h Processor Root Complex
00:01.0 VGA compatible controller: ATI Technologies Inc Device 9804
00:01.1 Audio device: ATI Technologies Inc Wrestler HDMI Audio [Radeon HD 6250/6310]
00:04.0 PCI bridge: Advanced Micro Devices [AMD] Family 14h Processor Root Port
00:05.0 PCI bridge: Advanced Micro Devices [AMD] Family 14h Processor Root Port
00:11.0 SATA controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 SATA Controller [AHCI mode]
00:12.0 USB Controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 USB OHCI0 Controller
00:12.2 USB Controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 USB EHCI Controller
00:13.0 USB Controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 USB OHCI0 Controller
00:13.2 USB Controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 USB EHCI Controller
00:14.0 SMBus: ATI Technologies Inc SBx00 SMBus Controller (rev 42)
00:14.2 Audio device: ATI Technologies Inc SBx00 Azalia (Intel HDA) (rev 40)
00:14.3 ISA bridge: ATI Technologies Inc SB7x0/SB8x0/SB9x0 LPC host controller (rev 40)
00:14.4 PCI bridge: ATI Technologies Inc SBx00 PCI to PCI Bridge (rev 40)
00:15.0 PCI bridge: ATI Technologies Inc SB700/SB800 PCI to PCI bridge (PCIE port 0)
00:15.2 PCI bridge: ATI Technologies Inc Device 43a2
00:18.0 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 0 (rev 43)
00:18.1 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 1
00:18.2 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 2
00:18.3 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 3
00:18.4 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 4
00:18.5 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 6
00:18.6 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 5
00:18.7 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 7
01:00.0 Network controller: Broadcom Corporation BCM4313 802.11b/g/n Wireless LAN Controller (rev 01)
02:00.0 Ethernet controller: Atheros Communications AR8152 v2.0 Fast Ethernet (rev c1)
07:00.0 USB Controller: ASMedia Technology Inc. ASM1042 SuperSpeed USB Host Controller

Tuesday, September 13, 2011

Why Linux Sucks on Desktops and How to Save Your Ass

In terms of pure resource usage, performance, stability and security, Linux wins. Pick any distro (Debian, Scientific, openSUSE, Mint, Arch, PCLinuxOS...) and compare it with Windows 7, and you'll know what I mean. Discard the X desktop and the stuff beyond it, and the pure Linux shell is perhaps the most powerful tool for computing. I've never been a Windows guy, but quite often I come back to it at odd times, when I'm almost pissed off by the so-called direction (or lack of it) in the Linux world. So what are those glitches that sour the desktop experience?

Here's why Desktop Linux Sucks

Desktop Graphics Drivers

If you have a plain Intel IGP on your mobo and you are not a gamer, you'll almost always have a smooth experience. Similarly, you'll have a painless experience with older Realtek, Atheros, Huawei and many other devices. But if your hardware is new, shiny and non-standard (as far as Linux driver support goes), you're stuck. Take, for example, Nvidia's Optimus technology for switching between the IGP and discrete graphics. It's been more than two years, yet the graphics stack is half-baked. The ATI/AMD side of the story, especially concerning the recent Fusion-series APUs, is grimmer. Though AMD was very vocal a year back about open-source drivers for its Fusion-series APUs, driver support for Linux is lame at best at the time of writing this post.

I have an Asus 1215B Eee PC based on the Fusion platform, sporting a C-50 APU (AMD Ontario CPU + Radeon HD 6250 GPU). Windows 7 runs quite well on it and offers a thrilling graphics experience powered by UVD and DirectX 11. Linux? Fusion graphics is muddied by multiple wrappers, drivers and methods. When kernel 2.6.38 flaunted Fusion APU support through the Gallium drivers, it didn't disclose that this meant merely decent graphics. You can't expect anything beyond that; forget support for UVD and 3D acceleration for the next couple of years. For a better graphics experience you're left with distro-specific fglrx drivers, XvBA/VA-API wrappers and suitable xorg pieces. But the distro-specific drivers are generally dated, so I pulled the latest Catalyst driver sources from AMD and compiled them for Debian Squeeze. I had a good go, but the graphics performance was still inferior to that of Windows 7.

Correct the Basics

The Blue Screen of Death is history. Modern Windows (XP onwards) ensures you land at least in a basic VGA mode if the install disk lacks proper display drivers; then you're ready to install the proprietary drivers. But Linux graphics problems sometimes slam you with a black screen (call it the Black Screen of Death), and if you are unlucky you can't even get to a rescue shell. Sure, there are a dozen cheat codes to put you on a workable shell: nomodeset, radeon.modeset=0, nouveau.modeset=0 or i915.modeset=0 if KMS is messed up; vga=<mode> or xforcevesa for a plain VESA screen; or some similar ACPI cheat codes on the kernel line. But who wants to deal with these not-so-dirty but definitely cryptic codes? Distributions should come up with fool-proof measures to land users on a VGA desktop without much fiddling around.
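For the uninitiated, here's where such a cheat code actually goes on a GRUB-legacy system like Squeeze's. The kernel version and device names below are illustrative, not from any particular machine:

```
# /boot/grub/menu.lst (GRUB legacy): append the fallback parameter
# to the end of the kernel line.
title  Debian GNU/Linux, kernel 2.6.32-5-amd64
root   (hd0,0)
kernel /boot/vmlinuz-2.6.32-5-amd64 root=/dev/sda1 ro quiet nomodeset
initrd /boot/initrd.img-2.6.32-5-amd64
```

The same parameters can be typed once at boot time by editing the kernel line from the GRUB menu, which is handy for testing before making the change permanent.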

The desktop paradigm is on its way out. It's the era of mobile computing, where sleep/suspend/hibernate/resume is essential. The Linux world has been fighting with these features for years. They work fine on standard distributions running on standard hardware; sadly, they are far from stable on very new or esoteric hardware.

Bewildering Choice vis-a-vis Rapid Development

Choice is good, but bewildering choice is very bad. The masses look for a few working applications, not a million shoddy clones. The situation is slowly improving in this regard, thanks to leading and serious flavors such as RHEL (and its clones), Debian, Arch and, most recently, Mandriva practicing frugality in their choice of applications and desktop environments. Fewer configurations and fewer packages mean less clutter. Bewildering choice and plurality in design philosophy shrink the mindshare. They also burn developer hours in reinventing the wheel. Coupled with rapid development, this makes things worse. Take the most popular distribution of our time, Ubuntu. Though it pulls packages from Debian testing/unstable, it puts effort into developing a few packages and polishing them, and it follows fully automated packaging and testing. However, given a six-month release cycle, it can't be putting in more than a month of real development. Who'd expect fidelity from such a fleeting woman!

Linux != Open Source, but the latter is blamed for the plurality in Linux. In Windows, if a certain version of a package works, it works. In Linux, that's not always true. For example, Pidgin 2.7.3 on Windows, owing to the singularity of the platform and API standards, will be the same across XP, Vista and Win7. But Pidgin on Fedora might behave differently than it does on Debian. The difference lies in how the software is packaged across distributions. The same is true of core components such as the kernel: kernel 2.6.38 in the Debian backports repository is not 100% the same as the one in Remi's repository for RHEL and its clones. The same goes for Ubuntu, Mandriva, Arch and Slackware. Each distribution carries its own peculiar set of kernel patches, and its own build flags and dependencies for any particular piece of software.

Features vs. Polish

Firefox undoubtedly has more options than Chrome, and OpenOffice is more versatile than many a proprietary office suite. Both are feature-rich, but both lack polish. Firefox is trying to catch up with Chrome on the desktop, but it still lacks Chrome's core philosophy: frugality. Firefox still caches aggressively like a hungry beast and sometimes forgets to flush. OpenOffice has been jumping from Sun to Oracle to the Document Foundation, and it's as slow as a sloth. A performance overhaul is long overdue for OpenOffice (now LibreOffice).

So, How to Save Your Ass?

Hardware and Distribution

1. Choose standard hardware. Save the output of the "lspci" command from any live CD and post the text on popular forums to find out which distributions fully support your devices. Of the supported distributions, choose a stable one: Ubuntu LTS, Debian stable, CentOS or Scientific Linux. If you're a hardcore gamer, forget Linux for a while.
2. Pin the critical packages. If your current setup runs your devices well, pin the core packages such as xorg, the kernel, the sound base packages and other device drivers, so that future upgrades won't break your setup. I've faced sound problems, graphics hells and many boot problems caused by upgrading core packages.
3. Don't tinker much. Choose your favorite distribution, customize it to your liking and forget about it. There's no need to keep pulling in newer bits and pieces; newer is not always better, and the new features may have nothing to do with you. Go by perceivable experience of performance, features and stability, not by version numbers and benchmarks.
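On Debian and its derivatives, the pinning recommended in point 2 boils down to a small config fragment. A minimal sketch, assuming APT; the package name and version are examples only, substitute whatever your working setup actually runs:

```
# /etc/apt/preferences -- forbid upgrades of the working kernel image.
Package: linux-image-2.6.32-5-amd64
Pin: version 2.6.32-5*
Pin-Priority: 1001
```

A quicker alternative on the same systems is putting the package on hold: echo "linux-image-2.6.32-5-amd64 hold" | dpkg --set-selections.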

Personality and Distribution

1. If you really want to learn Linux and expect a pain-free experience for a long time to come, choose Arch or Slackware. The things you learn there will last forever, and a well-built Arch or Slackware setup will rarely go wrong.
2. If you want to make a living out of Linux, go with Scientific Linux, because it's perhaps the most sincere clone of RHEL, the present king of the enterprise world. Though it doesn't follow RHEL's bug-for-bug philosophy, it's more predictable and open than its more popular cousin, CentOS.
3. If a great no-nonsense home desktop is all you want, choose from PCLinuxOS, MEPIS or Mint. All three guarantee a superb out-of-the-box desktop experience. PCLinuxOS gathers the best from across the Linux distros, Mint does Ubuntu better than Ubuntu, and MEPIS polishes Debian to the extreme for a hassle-free desktop.
4. If you don't fall into any of the above and are apathetic to Windows, choose FreeBSD: tame it with extra care and make it your own. It's very, very Unix to the core and very systematically designed. If you don't want to shed that extra sweat, choose OS X: buy Apple hardware, or assemble your own Mac following the InsanelyMac website and put OS X on it. OS X's Mach-based kernel is heavily inspired by Unix, and you get many POSIX features, including the Bash shell.

That pretty much sums up my two cents!

Thursday, June 9, 2011

Why AMD Fails


With the merger of ATI and AMD, the new AMD Fusion platform offers far better value for money than Intel. However, AMD is still nowhere near Intel when it comes to mass adoption. Why? The reason is twofold:

1. Creates hype early but comes to the party very late: Take the Fusion APU hoopla. AMD announced this groundbreaking technology six years back and kept shelving it until Intel appropriated similar technology into Sandy Bridge. Sure, even now Fusion is a better proposition than any of Intel's Atom architectures. But sadly, Atom became ubiquitous before Fusion even knocked on the door.

2. Great hardware but poor driver/software support: Creating new technology and throwing around benchmarks is not everything. AMD raved about DirectX 11 and UVD support on the Fusion platform. That's great. But it failed miserably in bringing out open-source drivers within a comparable time frame. Even today, open-source Fusion driver support is bad at best. Be it Gallium, Catalyst or the in-kernel drivers of 2.6.38, none work as well as they were touted to. A whole round of device innovation trashed for lack of proper software support. Intel, in contrast, promptly provided device drivers for Sandy Bridge. Intel's open-source VA-API implementation for its graphics hardware is better in many regards than AMD's XvBA under Linux. These days VA-API works quite well, and Intel recently introduced support not only for video decoding but also for video encoding through the VA-API library on the new Sandy Bridge hardware. Intel's next-generation Ivy Bridge support should be the same way. The recent Intel SNA work has boosted performance on Sandy Bridge, as well as on earlier Intel IGPs, to a great extent. Here AMD lags.

Weird Dependency of Packages in Linux World

Often, even today, I come across problems related to package dependencies in Linux. Most of them bite while you are installing packages offline or building packages from source. The worst of all, though, in my opinion, are those weird packages that you can't remove: if you dare to remove them, you end up uninstalling some key packages of the system, which may render your PC useless!

There are thousands of such cases. Here I will cite just four: fortune cookies, cowsay, libthai and libgweather.

Last night, I was trimming Linux Mint Julia on a friend's netbook. The netbook runs on a bare-and-basic Intel N450 platform. I removed as many packages as possible, but while removing those four weird packages I got frightening warnings.

While removing libthai, I got a warning (click on the pic below to see it at its true aspect/size) that it was going to remove alacarte, artha, bleachbit, avidemux and some 94 other packages. Damn, what does libthai have to do with my computing life? I never browse Thai websites for work or fun. Why the heck is it a dependency of core/key packages? Why isn't libindic or libafrica (if they exist at all) such a dependency?


The next weird dependency is the combination of the fortune cookies (fortune-min, fortune-mod, fortune-husse) and cowsay. Try to remove them and you'll get a notification to remove ubuntu-minimal, mintsystem and some other core/key packages (click on the pic below to see it at its true aspect/size). WTF? Why have the GNOME or Mint (or whoever) people bundled these fancy programs as mandatory? A fortune or cowsay message is a nag for many, yet they can't remove it. Sure, there are workarounds to stop fortune and/or cowsay. But...?


Finally, why is libgweather an integral part of GNOME? Removing the libgweather packages warns that gnome-panel and the indicator applet will be removed as well (click on the pic below to see it at its true aspect/size). What was wrong with making it an optional package? I just want a clock, a volume applet and a network monitor in my GNOME systray; sane and usable.


As I said, there are hundreds of such weird packages that are wired in as key dependencies of other really important packages. It seems the Linux world is leaning more towards fancy than function.

Monday, May 30, 2011

Linux Kernel 3.0 is Not Far Away


Linux kernel 2.6.39 has just been released, much earlier than expected. The reasons?

The kernel is coming of age after 39 updates to the 2.6 line. The Linux 2.6 series is now on its way to its 40th release after seven years of development. The 2.4 series had about 24 releases before Linux 2.6.0 came out, and as of today it is up to Linux 2.4.39. Linus (along with the community) is getting ready for the next-gen kernel: 3.0. The 2.6 series will still get patches and updates, just as the 2.4 line still does.

Is it just a change in the versioning scheme, or is there much under the hood? Well, the jump from the 2.4 to the 2.6 line brought some striking features, and the same will happen now. Most importantly, kernel 3.0 will remove some of the old cruft gathered over its roughly 20 years of life.

This jump in versioning won't reflect in the package lists of enterprise Linux distributions such as Red Hat (and its clones), which are adamant when it comes to security and stability. That means Red Hat, CentOS, Oracle and Scientific Linux will stick to the 2.6 series for roughly a decade, backporting only select features.

Thursday, May 19, 2011

How to Get Rid of "Unlock Keyring" Message in Linux after You Changed Login Password

I've been running Linux Mint Julia for quite a long time now. Only yesterday I updated the entire system (including the kernel, Firefox, Chromium, OpenOffice, Pidgin and all that) and also changed my login password for security reasons. No ugly surprises, no lag in performance. But every time I started the Chromium browser, it popped up a "Login keyring" message that read "Enter password to unlock your login keyring".


It seems Seahorse uses the login password as the master password to unlock its passphrases. Sadly, when the user changes the login password, Seahorse isn't updated, and that "Login keyring" popup keeps coming up.

I visited both the Mint and Ubuntu fora for a fix. All the solutions there pointed to Seahorse (two items under System >> Preferences: "Passwords and Encryption Keys", and "Encryption and Keyrings").

However, the easiest and unfailing fix for this message is to remove ~/.gnome2/keyrings/login.keyring; a fresh keyring is created with your new password the next time it's needed.
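Here's that fix as a minimal shell sketch. The helper function is my own (hypothetical) wrapper; the only real ingredient is the path ~/.gnome2/keyrings/login.keyring. Moving the file aside is a touch safer than deleting it outright:

```shell
# Move the stale login keyring out of the way; GNOME recreates it the
# next time it's needed. Takes the keyring directory as an argument so
# the function can be exercised against any path.
clear_login_keyring() {
    dir="$1"                      # normally "$HOME/.gnome2/keyrings"
    f="$dir/login.keyring"
    if [ -f "$f" ]; then
        mv "$f" "$f.bak"          # keep a backup, just in case
        echo "moved $f aside"
    else
        echo "no login keyring found in $dir"
    fi
}

# Usage: clear_login_keyring "$HOME/.gnome2/keyrings"
```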

Saturday, May 14, 2011

Preparing a Bootable USB to Install Windows 7 on Asus 1215B


The Asus 1215B perhaps puts the components around the Fusion APU together in the best way: great design, brushed-aluminum finish and quality components. However, with its 12" form factor it couldn't house an optical drive; that's the typical problem of notebooks below 13". So if the product ships without any OS preinstalled, you are left with just two options: 1. install an OS using a borrowed/bought USB optical drive, or 2. prepare a bootable USB with the OS of your choice.

I don't have a USB optical drive, and no one around me has one either, so a bootable USB was the way to go. Had I been installing Linux, I could have done that with either a plain dd command or the UNetbootin tool. But from what I see, this gadget still doesn't have out-of-the-box support in Linux land (I'm sure things will improve with kernel 2.6.38), so meanwhile Windows 7 is the best OS for this notebook. Creating a bootable Windows 7 USB on a Linux machine, however, demands that you get dirty with the command line.

I did try writing the Windows 7 ISO image to the USB drive using dd (the most common disk-copy method; what I did was "dd if=/home/msahu/Window7Ult.iso of=/dev/sdb1"). The installation started but got stuck midway. Then I followed the old tried-and-tested formula, and it worked like a charm. Here is a rundown:

1. Blanked the boot code area of the USB drive (note: the whole device, not a partition):
dd if=/dev/zero of=/dev/sdb bs=446 count=1

2. Ran fdisk on the whole device:
fdisk /dev/sdb

3. Removed all the partitions on the USB drive, created a single primary partition and turned on its boot flag.

4. Formatted the new partition as NTFS:
mkfs.ntfs /dev/sdb1

5. Finally, extracted the contents of the Windows 7 ISO image onto the USB drive and booted the Asus 1215B from it.
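For convenience, here are the five steps folded into one guarded shell function. This is a sketch of my procedure, not a polished tool: the device argument must be the whole disk (/dev/sdb, not /dev/sdb1), the scripted fdisk answers assume the util-linux fdisk of this era (drive it interactively if unsure), and bsdtar stands in for whatever ISO extractor you prefer (7z x works too):

```shell
# Prepare a bootable Windows 7 USB stick. DESTROYS everything on the stick.
make_win7_usb() {
    dev="$1"; iso="$2"
    [ -b "$dev" ] || { echo "not a block device: $dev" >&2; return 1; }
    [ -f "$iso" ] || { echo "ISO not found: $iso" >&2; return 1; }
    dd if=/dev/zero of="$dev" bs=446 count=1           # 1. blank the MBR boot code
    printf 'o\nn\np\n1\n\n\na\n1\nw\n' | fdisk "$dev"  # 2-3. new table, one bootable primary partition
    mkfs.ntfs -f "${dev}1"                             # 4. quick-format it as NTFS
    mount "${dev}1" /mnt &&                            # 5. copy the ISO contents over
        bsdtar -xf "$iso" -C /mnt &&
        umount /mnt
}

# Usage: make_win7_usb /dev/sdb /home/msahu/Window7Ult.iso
```

The block-device check up front means a mistyped path fails loudly instead of shredding a regular file.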
