CBC: fail.

I long ago learned to treat, extremely gingerly, any mainstream media news story that attempts to use statistics, but this one's just too sad for words.

The CBC excitedly informs us that 32.6% of hospital beds in Quebec are occupied by current or former smokers. This, it is suggested, means "caring for smokers in hospital costs Quebec's health-care system $930 million a year" and "smokers are taking up beds and costing the Quebec health-care system millions a year".

What is sadly lacking from the article is any mention at all of the other piece of data without which the number cited means precisely nothing...

yes, chorus along with me - so, how many people in Quebec actually smoke, then?

Well, 23.3%, apparently. So it would appear there's something to this story, in that smokers seem to be taking up a disproportionate amount of hospital space. Still, you'd have absolutely no way of knowing that from the story as presented by the CBC. And in fact, that number doesn't account for the 'and former smokers' wrinkle, so it's entirely possible there's virtually no substance to this story at all.
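For illustration, the comparison the article never makes takes about two lines (a rough sketch only: the 23.3% figure covers current smokers, while the 32.6% bed figure covers current *or former* smokers, so the true base rate is higher and the real ratio lower than this):

```python
# Back-of-envelope check of the CBC numbers. Caveat: the population figure
# below is current smokers only, while the bed figure includes former
# smokers, so this ratio overstates any real disproportion.
beds_share = 0.326  # share of Quebec hospital beds occupied by current/former smokers
pop_share = 0.233   # share of Quebecers who currently smoke

ratio = beds_share / pop_share
print(f"Overrepresentation factor: {ratio:.2f}")
```

Run as-is, that gives a factor of about 1.40; but without the 'current or former' base rate, the honest answer is that the headline number alone tells you nothing.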

(I'm not suggesting smoking has no health effects, here. Of course it does. That's been well demonstrated by years of medical studies, many of them written by researchers with a post-kindergarten grasp on the use of statistics. I'm just pointing out a particularly egregious example of the journalistic use of the utterly meaningless statistic...)

In the interests of balance (or, Good job, Canonical!)

Yes, don't fall out of your chairs yet.

Here's Canonical being rather decent about upstreaming.

Multitouch seems like exactly the kind of thing Canonical ought to be working on, and they're taking this upstream at the right time and with the right approach, AFAICS. That's exactly what I was talking about in this other post. I can't tell whether the proposal is awesome or crack; I'll leave that to the experts. But the process here looks awesome. So nice job with that one.

Finishing up controversial crap week: What Canonical ought to do

Okay, just one more I need to get out of my system. =)

Edit: forgot to include the normal disclaimer: as usual, this is all my personal opinion and in no way representative of Red Hat (or anyone else, for that matter).

Second edit: I've had a few Canonical contributions pointed out to me which to some extent cover the areas I've identified. I'll update the post more extensively soon, but for now, just bear in mind that Canonical is actually active in a few more areas than I identified, and that is certainly awesome.

Reading the comments on my previous post, Greg's posts, and some of the replies to both, it seems clear that quite a few readers aren't exactly sure what it is I (and some others in the debate) are suggesting. The top layer of the debate is fairly simple - Canonical is/is not contributing to $FOO - but I guess it might help to spend a bit more time spelling out the implications.

One thing a lot of people seem to assume is that this is some form of jealousy or sour grapes - we're just hatin' on Canonical because they are, in some way, beating us (where 'us' is Red Hat or Fedora or whatever). But really, that's not it at all. Red Hat and Fedora don't really compete with Canonical at all in their main wheelhouse - the end-user desktop; Fedora's target user (and overall raison d'etre) is rather different from Ubuntu's, and they can coexist perfectly happily. Yes, Canonical is making initial moves towards the enterprise market, but they're pretty early. Novell is a far more significant enterprise competitor to Red Hat, yet no-one ever seems to suggest that RH staff are jealous of Novell (or vice versa), and the relations between RH and Novell are pretty fine.

So, no, I'm not just bitching because I hate Canonical and want to score points off them, or something. The point is that Canonical has established itself as a big player in the F/OSS world, and to make the F/OSS world better for everyone in it - including Canonical - it's important that everyone contributes; not just to marketing or UX design or whatever, but to the fundamental engineering. The argument isn't 'Canonical doesn't contribute to $FOO so they're a bunch of losers, nee ner nee ner!', it's 'Canonical doesn't contribute to $FOO and it would really be better for everyone if they did'.

Look at it this way. (Again, this is my personal reading of the situation, not Official Red Hat Gospel). When Red Hat identifies something lacking in the F/OSS world that goes into the distribution it sells services around, broadly speaking, it works to make sure it gets resolved. Usually, that boils down to 'hire someone to write the thing'. Take virtualization. It was obvious that this was going to be a major need for the companies that actually use Red Hat products and buy Red Hat services, so RH backed Xen. When it became clear that Xen wasn't working out so great, especially in terms of kernel integration, RH bought Qumranet and bankrolled the development of KVM. It's important to note that the basic theme here is self-interest. There is idealism in how RH operates, definitely - there are all sorts of ways RH could perfectly legally make it much trickier for others to leverage our work, make it much harder for Unbreakable Linux or CentOS or Scientific Linux or whatever to exist - but it doesn't. But insofar as actually writing the code goes, in a way, RH would be dumb not to do it. Not working on a good virt layer, and sitting back and hoping someone else will write it so RH can use their work, just wouldn't fly very well. There are pages and pages of examples of this, but the shape of the story is simple: figure out what it is that needs to be in RHEL, then write the code, and contribute it properly.

This is what Canonical needs to do - for the benefit of the overall F/OSS world, yes, but also for the benefit of Canonical. And there are some ways in which they seem to get this. The cardinal example of a significant Canonical code contribution is upstart, and that's a legitimate one to be sure. It's a properly organized open project which is funded by Canonical but accepts contributions from others and genuinely works to be integrated into other projects, and it's been a pretty broad success, with other distros taking it up (Fedora is currently planning to move to systemd in F14, but that kinda thing happens, and it doesn't invalidate the value of upstart). Their usability work (including work on next-gen desktop concepts like Unity) is indeed an example of the same right way of thinking, though in some ways they've been doing it wrong (ignoring XDG standards in their new notification system, for instance, so that it only works with apps that Canonical custom patches in Ubuntu, and they have to ship the standard system anyway to handle apps they haven't got around to patching yet; it doesn't work out well from any angle). But the overall idea is right - they've identified usability as an area where improvement would be a significant benefit to the product they want to make a living selling services around, so they're trying to do that work, and - even if not optimally - they are trying to share that work. So, again, that's an area where broadly they've been getting it right.

That's about all the examples I can think of, though. EDIT: David Treumann correctly points out Simple Scan as another good Canonical project. I'd love for SS to get into the GNOME suite in future. (Clicking around from there, I see the SS developer is also involved in a display manager (we sure need another of those...) and something called Omsk that's listed as proprietary. Huh. Never heard of that. Has anyone else? I looked around. Nothing on the Google. You can find this mysterious page in Launchpad: https://launchpad.net/omsk . It's a Canonical OEM Project. Lots of people seem to be involved with it. There's no code you can see. Judging by the couple of bugs marked as affecting it, it seems to be some sort of secret OEM customized Ubuntu variant. Curious...) So here are some simple suggestions: these are the things it would be best for everyone, including Canonical themselves, to step up and contribute to.

  1. Audio. Thanks to Lennart Poettering for pointing this out. Sound is one of the fundamental bits of just about any consumer desktop. Most desktop users aren't going to use a computer that can't play sound, or has problems with it. Yet Linux audio is massively understaffed. Lennart says there are three people in the world who are paid to work on Linux audio - there may be others Lennart and I don't know about or are forgetting, but there sure aren't a lot. Red Hat employs Lennart to write PulseAudio (though he does other stuff too) and Jaroslav Kysela to work on ALSA. Novell pays Takashi Iwai to work, in part, on ALSA (though this isn't his full time job). Canonical doesn't pay anybody to do any work in this area. It's almost ironic - Novell and Red Hat would cope far better in a world where Linux audio was completely neglected than would Canonical. I don't sell RHEL so I'm not the most informed, but I rather suspect that the vast majority of Red Hat's and Novell's significant customers couldn't really give a toss whether their servers can play Lady Gaga or not. But Canonical's users are far, far more likely to be worried about audio functionality. So why are RH and Novell supporting this vital area of infrastructure - even if not really to the extent it needs - and Canonical isn't doing it at all? It would help everyone, but Canonical as much as or more than anyone else, for Canonical to find two or three people who can grok kernel hacking and signal processing and pay them to work full time on ALSA and PulseAudio and desktop sound integration. Hell, I can suggest one person for starters (though I'm not sure if he's free to take a job with Canonical) - Colin Guthrie, who's been a contributor to PulseAudio for a while.

Edit: Colin and also Daniel Chen pointed out to me that there actually are a couple of Canonical developers working on audio, something that I managed to miss while looking stuff up for this post. :) I'm looking into this more closely to rewrite the section above, but for now, please note that Canonical does indeed seem to be making some efforts in this area, which is great.

  2. Graphics. Same story as audio, pretty much. Red Hat and Novell both employ major upstream X.org contributors. Intel pays people to work on the intel drivers. AMD has a few people on staff who contribute to the ATI driver work. Heck, Mandriva has/had pcpa on staff (not sure if he's still around) and tried to make sure he had some spare time for upstream work. Canonical has one X developer on staff (Bryce Harrington), and he has no time for upstream contributions; he simply works full time on packaging X for Ubuntu and managing bug reports. Yet again, same story as audio: Canonical stands to lose the most if graphics development is neglected. Again, many of Red Hat's and Novell's customers could probably get by with vesa without really losing that much; Canonical's users are the ones who need proper accelerated drivers with 3D and video playback acceleration support. So why is Canonical contributing nothing to this development? Why would they trust a vital component of their product to people who work for other companies, or volunteers, and not take a stake in X development at all? How is that good for them in the long term? There's a ton of qualified people Canonical could hire here.

  3. Networking. Starting to sound like a broken record, but yet again this is an area which is more vital to Canonical than other companies, yet Canonical contributes less. Infamously, Canonical has no-one making significant contributions to the kernel, where network drivers live; Red Hat and Novell both employ kernel developers (not sure whether Novell has network driver developers offhand; Red Hat has at least David Miller, who's in overall charge of the networking stack, and John Linville, who is in charge of the wireless stack). Red Hat pays Dan Williams to run, and write most of the code for, the NetworkManager project (which Ubuntu uses). Canonical...well, all I can find is one set of commits to NetworkManager back in October 2008 from a guy called Alexander Sack. Yet again, this is an area that's arguably more important to Canonical than anyone else, at least in parts. Probably most big Red Hat and Novell customers are mostly using ethernet, which is a fairly static area and doesn't require a lot of coding work; a new driver every now and again for a new chipset, but there's far fewer new ethernet chipsets than wifi chipsets, and the manufacturers often provide the drivers themselves these days. The areas of networking which really need development are, yet again, consumer focussed ones: wifi and mobile broadband. This is stuff Ubuntu users really really want to work; wireless networking is one of the classic knocks against desktop Linux, mobile broadband is up and coming. Yet, again, it's not Canonical staff doing the work here. Dan Williams has done almost all the work on implementing mobile broadband support into NetworkManager; the kernel level stuff for mobile broadband and wifi is done by a range of people from a range of companies, but no Canonical involvement. Yet again, why isn't Canonical contributing to this area that's so vital to its interests? Why can't they hire three or four engineers to contribute to writing drivers for new networking hardware and help out with improving NetworkManager? Yet again, it would help them as much as or more than anyone else.

  4. Desktop applications. This one's a little different, since everyone could stand to improve a lot here. The other big vendors don't do a huge amount of work on typical, non-infrastructure apps; the big ones like Firefox and OpenOffice.org are mostly supported by non-Linux vendors, small apps tend to be written by small companies, independent developers, or even Linux vendor staff working mostly on their own time. There are significant exceptions - Novell pays people to work on Banshee, OpenOffice, and Evolution (actually Novell probably does more than anyone else in this area), Red Hat supports the development of Totem and Rhythmbox to a degree, and has one or two others working in this area (RH has an Evolution developer on staff, I think, probably a few others I'm forgetting). But really, the story of major vendor support for Linux desktop app development is pretty crappy, and yet again, it's Canonical that's losing out the most. Yet again, RH and Novell's customers can get by without this stuff; Canonical's users really can't. Again, you'll notice I'm focussing on the classic big knocks against the Linux desktop here, and this is one of the biggies. All down the years, we've heard that big apps are missing. These days, the classics that get pulled out all the time are graphics editing and video editing, and there's a lot of truth in that. GIMP is good but it's missing stuff that Photoshop users rely on; our video editing story is terrible.

Perhaps the best contrast here isn't one of the other Linux vendors, but Apple. Apple realized back at the start of the OS X era that providing desktop apps that people really want to use is a great way to sell your desktop. Apple develops and supports the development of a lot of the best OS X apps, and bundles them in with OS X - the best example being Garage Band - or sells them relatively cheap. So, why isn't Canonical doing this? Canonical needs the Linux desktop to be an attractive choice in order for its business model - selling services to Linux desktop users - to work; sure, it won't make any money directly by funding the development of a kick-ass open source video editor, but it needs there to be a kick-ass open source video editor for its plan to make money to actually work. This is the conceptual leap Canonical needs to make more often, in a nutshell. Red Hat doesn't make any money directly from funding the development of bits of the Linux kernel, but it needs that development to happen for its business plan to work. Canonical needs to go out there, find the people working in scraps of spare time on promising but fundamentally incomplete or broken desktop apps, hire them, and polish those things till they gleam. Go out and find the best attempt at a Linux video editor, hire the top five developers, give them an office and let them develop the project - not in secret in Launchpad, but right out on its existing project page. In the end, as long as Ubuntu is the leading desktop Linux distro, it's still ultimately going to be Ubuntu that sees the benefit more than anyone else, even if everyone else gets to use it too. Find the top five contributors to GIMP, hire them, go do a survey of Photoshop users and find out what it is they need in an open source photo editor, and damn well give it to them. Hire the top contributors to Audacity, Jack, Hydrogen, Rosegarden and all the other jumble of disconnected Linux audio creation apps and frameworks, stick them together in an office, and build a kickass integrated audio creation suite. Just go out and read those articles about the key desktop applications Linux is missing, and hire some people to write them. It ain't rocket science, and ultimately, it's self-interest as much as anything else. But it's the right thing for Canonical and the right thing for the rest of the F/OSS world.

The President of the U.S. famously said something about lipstick and pigs. Yes, Canonical's steps towards usability and interface work are important, but the prettiest interface in the world to a desktop operating system isn't enough if the underlying hardware support isn't there, or the applications that people need to run aren't available. It's Canonical that needs these things to exist, more than anyone else; so why wouldn't Canonical want to be the ones to get them done? Hoping that other companies or volunteers will write them for you is not the best plan, it really isn't.

More controversial crap, or: where did all the Linux netbooks go?

So, I've been involved in a long discussion here and on IT Wire with just the most charming and polite adversary one could wish for, Mr. Telic. This has actually caused me to go out and do some boring research on stuff which I had a feel for but hadn't really put any hard data to, and which is fairly depressing. (There's a summary at the bottom for impatient types: search for 'tl;dr' - that's 'too long; didn't read', for anyone else who was baffled by that cryptic little snippet as long as I was).

So, netbooks. Mr. Telic holds out the netbook market as the counter to my theory that Linux is doing pretty crappily in the traditional operating system 'market'. He cites numbers mainly drawn from 2008 and 2009, when Acer and Asus executives were on record as saying they were shipping 20% and "30 percent to 40 percent" of systems, respectively, with Linux installed, and one analyst said that 32% of netbooks shipped in 2009 ran Linux, and predicted that the majority would run Linux by 2012.

On the face of it, hey, that's a pretty strong argument. On my Canonical-hatin' sidetrack I did note that neither Acer nor Asus shipped Ubuntu on their netbooks - Asus uses/used Xandros and Acer uses/used Linpus, a Fedora derivative - which makes Mr. Telic's point rather weaker in support of the theory that Ubuntu's doing all the work of promoting Linux for ordinary people. (Only Dell of the major-tier manufacturers has shipped netbooks with Ubuntu pre-installed; the other major tier vendor we've discussed, HP, ships/shipped SUSE). But really, what I'm interested in with this post is the question of how Linux is doing.

So, here's the thing. Are we really on the smooth track from 32% (claimed, at least - I suspect that figure was actually a little optimistic) in 2009 to a majority in 2012? Well, not to put too fine a point on it, I don't think we are, no.

The obvious question to ask was 'what's the figure for 2010?' Interestingly, I can't find one. Not from any analyst. Especially not from ABI, the firm that was very loud about its survey which gave the 32% figure for 2009 and the 'majority' extrapolation for 2012. This year they have released a new study which seems to mush things up a bit, and consider netbooks, smartbooks, tablets and basically anything else in between a laptop and a phone as one big market segment - they talk about 'Linux-based mobile devices'. The press release for the study (you can't read the study for free, of course) is noticeably lighter on big headline numbers than the little publicity tour they sent one of the authors of the 2009 Linux-on-netbooks study on. The only number the press release cites is right up there in the first paragraph - "ABI Research anticipates that Linux-enabled mobile devices, led by the success of Google’s Android and upcoming Chrome OSs, will comprise 62% of the operating systems shipping in all (non-smartphone) mobile devices by 2015." (Again, my pesky Canonical-bashin' muscles note that 'led by Google's Android and upcoming Chrome OSs'...and while I'm at it, let me bash analysts by cynically noting that their 2009 press tour didn't really mention Android much at all. Boy, those analysts sure are great at predictions).

I Googled around and checked a few other analyst firms, but no-one seems to be giving any numbers on operating system share for netbooks for 2010. (Best I can find is a cheery prediction from December 2009, fairly typical of the genre around then, in which "mobile computing expert and Web entrepreneur Sascha Pallenberg" reckons that 50-70% of netbooks in 2010 will be running Android. Hmm.)

So, in the absence of that kind of data, and related to some back-and-forth nitpicking with Mr. Telic, I went out looking for netbooks with Linux preloaded instead. What I found was, well, a wasteland.

Let's go with the big dogs first. Asus! The company that introduced the netbook (and whose first models only came with Linux). The company that was shipping 40% Linux models in 2008, and expected it to stay at 30% after that. Now, Asus has about sixty bajillion very slightly different models listed on its website, and life's frankly too short to go through all of those. Let's look at what you can actually buy. So, to Best Buy!

As of writing this, Best Buy offers 16 Asus netbooks through its website. How many come with Linux pre-installed? That would be none. Yup, it's Windows all the way - a couple with XP Home, the rest with 7 Starter.

Well, that ain't so good. Still, Best Buy's not the only retailer, right? Let's go with somewhere a little more geek-friendly. How 'bout NewEgg? Surely they'll have at least a token system or two for the geeks, right? Well, uh, nope. NewEgg lists 21 Asus netbooks. 21 of 'em come with Windows.

Well, hey, that's just the U.S., y'know, the biggest computer market in the world. No biggy. Let's check some other countries. Canada? Well, uh, nope. Future Shop doesn't sell Eee's, oddly enough. Best Buy Canada sells one, with Windows. Netlink, where I buy my stuff, sells (or at least lists, half of them probably aren't really available any more) 42 models - every bloody one with Windows. The UK? Nope. Comet has five, all with Windows.

Well, okay, then. Asus obviously fell a bit out of love with Linux. Still, they're yesterday's news. Even though they introduced the sector, Acer rules it; they ship the most netbooks of any manufacturer. Best Buy doesn't list any Aspire Ones on its US website (odd), so I went with CDW, who list 17...all with Windows. NewEgg, 8, Windows. Best Buy Canada, 5, all Windows. Future Shop, 3, Windows. Netlink, 8, Windows. Dixons, one, Windows. Comet, two - both with Windows pre-loaded, but one (it comes in a couple of colors, so there's two search results, if you're duplicating my results) with an Android dual boot! Finally!

So, Acer goes 1 for 44. Hey, that's one better than Asus. Swing, batter batter...

So, well, hey, we're not doing so great with the market leaders. Still, Mr. Telic is keen on HP and (especially) Dell, so let's take a look at them, shall we?

So, Dell. Yes, Dell sells Ubuntu. Three current models with a current Ubuntu release, a netbook, a laptop, a desktop, all at reasonably competitive prices. That's not horrible. It's about as well as any distributor has done for a major OEM deal. So, moderate kudos to Canonical there. Yes. Now, excuse me while I pile on.

One, do you know how Dell expects you to get to that page? You go to www.dell.com, where you see precisely no mention of anything remotely to do with Linux. You then click on 'For Home' (but not any of the drop-down options under it; none of them gets you anywhere you can buy an Ubuntu system). This takes you to a big, busy page, where you may notice the bold-face, fairly prominent strapline reading "Windows®. Life without Walls™. Dell recommends Windows 7." What you're going to have to work harder to notice is the third section of the sidebox labelled "Essential PC Links", which is headed "PC Operating Systems" (you'll have to scroll down to see it, unless you have a very high resolution and/or very small fonts). Here, under four different types of Windows, is a link called "Open-Source PCs" (and if that doesn't scream "Buy Me!" to the average PC buyer, I don't know what does!) Click this, and you get taken through a buffer page which starts out "We’re glad you found Dell’s Ubuntu website." (well, yes, you did make it quite an excursion, I think the pat on the back is deserved...) and which has a small 'Shop for Ubuntu' button right at the bottom (I missed it the first two times I saw the page) which takes you, finally, to the page I linked to right off the top, where you can actually buy Dell systems with Ubuntu pre-loaded.

Whew. You think the above causes Microsoft to lose huge amounts of sleep? I'm betting no.

The models sold here are the Mini 10n, the Studio XPS 7100 n-Series, and the Inspiron 15n ('n' seems to be Dell shorthand for 'comes with Ubuntu'). Try this. Pull down the 'Laptops & Netbooks' menu and click 'Mini - Netbooks'. You get the Mini 10 and Mini 10v Nickelodeon Edition. Do you get the Mini 10n? No, you do not. The only way you get the Mini 10n is to find your way to the Ubuntu ghetto page. Try it for the laptop - 'Laptops and Netbooks -> Inspiron'. Again, you see the non-n, Windows-y models; you don't see the n, Ubuntu models. Only with the 7100 is there a 'Looking for Ubuntu?' button under the 'Customize' button for the Windows config. Dell really doesn't seem to try at all to make it easy to buy an Ubuntu system unless you go looking for it. (For bonus fun, go to the Ubuntu ghetto and click on the picture of the Mini 10n system, rather than the 'Personalize' button. It takes you to the main Mini page...where the 10n isn't listed. Same deal if you click 'Product details'. *headdesk* No cookies for you! It's like they fill the page with booby-trapped links which pull you out of the tiny tiny sliver of the Dellverse where you can actually buy an Ubuntu system...)

Quick price check. The base config of the 10n is $299, with 1GB of RAM, a 160GB hard disk, 802.11n wireless, a webcam, and a 3-cell battery. The base config of the Mini 10, with Windows 7 Starter, is $299 for identical specs - except you get a 6-cell battery. So the Windows model is slightly better, for the same price. The 15n costs $579; the $529 model of the Inspiron 1545, with Windows 7 Home Premium, has a bigger hard disk (500GB vs. 250GB), more RAM (4GB vs. 3GB), and a better battery (6 cell vs. 4 cell). So the Windows model is $50 less with better specs. The XPS 7100n, again, is the best story: the Ubuntu model is $459, versus a Windows model with exactly identical tech specs for $499. Finally.

So Dell's netbook story is a single model with inferior specs to the Windows model, which you can only find on a rather well-hidden special page, for the same price as the Windows model. It has a laptop which costs more for worse specs, and only the single desktop model seems to have positive elements to it. Isn't it sad that this is the best we've managed?

Finally on this little tour of depression, HP is up. As far as I can find, HP offers a single netbook model with Linux pre-loaded - the Mini 2102, in its build-to-order incarnation. (They previously offered the Mi editions of three other models, and the 2133, but these are no longer available; they have an ARM-based smartbook with Android pre-loaded coming up). I can't find anything that discusses Linux for non-server machines on hp.com, anywhere - clicking through I see nothing, searching for 'Linux' or 'SUSE' just brings up a bunch of server-related pages clearly aimed at enterprise server purchasers. I was only able to track this system down through the third party Linux Netbook site, which has handy listings of pretty much every Linux netbook that's ever existed. If you go to the main Mini 2102 page - not sure if that link will work, it's one of those crazy ones full of numbers, but it's what I've got - there's no indication that Linux is available on it, though on the general Mini page there's a small mention of "SUSE Linux Enterprise 11" and "FreeDOS" in the Operating System box. To actually buy it with SUSE, you have to click 'Configure your model', then 'Configure PC' (why there's so many freaking pages to click through on major vendor sites I will never figure out), which takes you to the Small Business site, where you click 'Configurable', then 'Configure and Buy'. You can probably short-circuit that a bit by going straight in through the Small Business site, but jesus. Here you can change the operating system to SUSE Linux, take out the OS Label and EnergyStar Label (yikes, big loss), and arrive - finally - at a Mini 2102 configuration with SUSE Linux preloaded, for...$461. If you ditch Bluetooth, you can get down to $452. Now, if you reverse all those steps and take the cheapest pre-configured Windows Mini 2102 model, you get the exact same specs - with Bluetooth - for $329. That means you're paying a 'Linux tax' (huh?) of $132 if you buy the model with SUSE installed. Boy, that's gonna get 'em stampeding through the doors, that is.
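Pulling the price comparisons from this whole tour together (figures exactly as quoted above; a positive delta means the Linux configuration costs more than the comparable Windows one):

```python
# Price deltas in USD: Linux config minus the comparable Windows config,
# using the figures quoted in the post above.
comparisons = {
    "Dell Mini 10n vs Mini 10 (Win 7 Starter)": 299 - 299,  # same price, but the Windows model gets a 6-cell battery
    "Dell Inspiron 15n vs 1545 (Win 7 Home Premium)": 579 - 529,  # and the Windows model has better specs
    "Dell XPS 7100n vs 7100 (Windows)": 459 - 499,  # the one case where Ubuntu is actually cheaper
    "HP Mini 2102 (SUSE) vs 2102 (Windows)": 461 - 329,  # identical specs, Bluetooth on both
}
for pair, delta in comparisons.items():
    print(f"{pair}: {delta:+d} USD")
```

One model out of four where Linux comes out ahead, and that's the desktop, not a netbook.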

tl;dr summary: where the hell did all the Linux netbooks go? In 2007 you couldn't buy a netbook with Windows; in 2008 to 2009 you could still walk into a big box store just about anywhere and pick from a few with Linux; now, you can buy one from one store in England with an Android dual boot, one from a hidden page on Dell's site with an inferior configuration to its equally-priced Windows equivalent, and one from a very well hidden bit of HP's site with a $132 premium over its identically-specified Windows equivalent. Yes, please do go ahead and point out the ones I've missed in the comments - I had a quick look at MSI but not much else - but it ain't gonna be much compared to the flood of Windows I found here.

This is a freaking far cry from having the majority of the market by 2012. It makes me wonder why there hasn't been more shouting, aside from this SJVN article back in May. If the netbook is the forefront of Linux's assault on the general computing market...well, yikes. Look, I know this is pretty depressing, and not in itself terribly productive. But I think it's important to have a clear-eyed view of things. The Linux press has a tendency to report the good news and ignore the bad; there was a lot of trumpeting of those 30-40% numbers back in 2008-2009, but it seems no-one's too keen to note how things haven't been going so well since then. It's important to have a clear view of where we are, and right now, where we are - for Linux in the traditional consumer computing market - is not a very good place. I'm not claiming to have the answers for how to make it better, and I know it's a hard thing to sort out; the cynical side of me can come up with a lot of explanations as to where all those pre-loads went, and all of them involve large amounts of money going out of Microsoft bank accounts. In a way I'm fortunate that the fortunes of Red Hat and Fedora aren't really tied up in this hellhole of a market. But if you do have a keen interest in that market...pay attention.


Short thought: I've posted some things that are fairly harsh on Canonical, here and in other places, over time. There's one thing I'd like to make clear. I think Canonical is a far, far better player and contributor to the F/OSS community than most other companies. Sometimes people have suggested that I (and others) are being overly harsh on Canonical compared to companies that are way worse, and to a degree that's a reasonable criticism. So here's a bit of balance.

The reason I don't complain much about Microsoft or Apple or Google is that I pretty much consider them lost causes. They exist to make a buck, and they have cultures that are more or less fundamentally at odds with proper collaborative F/OSS development. Even Google: Android is pretty much a manual on how not to run a proper F/OSS project, and their ultimate goal appears to be to own every piece of information about everyone in the world, which frankly creeps me the hell out. I don't expect anything more than minimal legal compliance with open source licenses from Google and Apple and Microsoft, and most other software companies. Whenever we get anything more, I'm pleasantly surprised.

I get passionate about Canonical precisely because they're not as bad as those companies - they're definitely not a lost cause, they do a lot of good stuff, and they have a great opportunity to do even more really good and constructive work. Mostly I see Canonical as being on the Side Of Justice, and my bitching is hoping they can do a better job at stuff, not hating on them because they're on the Side Of Evil. I try not to spend too much time worrying about the companies that just don't get it, because it's not going to get anyone anywhere.

So, yeah, despite my carping, I think Canonical is trying to do the right thing, whether they get it wrong sometimes or not, and there's a lot of good people there doing good work. I moan out of love, honest. =)

The success of Ubuntu

(final edit: okay, I posted this, had second thoughts about the prudence of posting it and hid it, but it seems like everyone saw it anyway, so what the hell, I'll unhide it again. It's maybe not the most diplomatic thing to post, but...erf.)

Very important disclaimer: this post is utterly and entirely my own personal thinking and in no way whatsoever reflective or representative of my employer.

Quick foreword, edited in later - I forgot to explain the reason for this post. When people talk about Ubuntu's shortcomings they often hedge it around by saying 'but hey, they're good at marketing', or 'hey, they make a successful desktop'. I'm arguing that this isn't so clear cut.

I'm going to try and keep this as succinct as I can, as this kind of thinking has gotten me in Big Trouble before, but commenting on Greg's blog post got me thinking. Has Ubuntu, really, been successful?

Becoming the 'leading' Linux desktop distribution is a pretty paltry measure of success, after all. As I wrote on Greg's post, no-one's really taken SUSE seriously since Novell bought them (though that seems rather unfair, and probably is a problem of perception / marketing - from all I hear, it's still a very solid distro), Mandriva has been basically bankrupt since 2003 and has huge management problems (also, can't string together a press release in correct English, which makes appearing more professional than Mandriva a freaking low bar. Don't get me wrong, I love Mandriva, but the company has huge problems. I never said it didn't when I worked there. I just didn't talk about them. Publicly. :>), and Fedora and Debian aren't really Uncle Bob-type distributions. All the rest are derivatives of the others, or second-stringers. If you show up with a couple of graphic designers, anyone who's passed Media Relations 101, and a bit of cash, you can pretty much win by default, which is what Ubuntu did.

So, let's not judge by the kiddy pool stuff. After all, that's not how Ubuntu says it's measuring success. Remember Bug #1, filed in August 2004? It doesn't say 'Mandrake has the biggest share of the Linux market', it says 'Microsoft has a majority market share'. So, let's look at Ubuntu's record in those terms. Let's look at the W3Schools operating system share statistics.

In July and September 2004 (so presumably also in August), Linux is at 3.1%.

In June 2010, after nearly six years of Ubuntu as the generally-perceived Linux desktop standard bearer, Linux is at...4.8%.

In March 2003, Linux was at 2.2%. So that's growth of 0.9 percentage points over the 16 months to July 2004 - 0.05625 percentage points per month. The growth from July 2004 to June 2010 is 1.7 percentage points over 71 months - 0.02394 percentage points per month. The margin of error in those numbers is likely huge, because we're playing with such small numbers, but even so, it sure doesn't look like Ubuntu has even managed to increase the rate of growth of Linux one iota over the 'leading desktop distributions' that preceded it (in the 2003-2004 range that was probably Mandriva; before that there were Gentoo and Red Hat Linux, and SUSE was always there or thereabouts).
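For the curious, here's the arithmetic above as a throwaway Python sketch (the figures are the W3Schools numbers quoted in this post; the function name is just mine):

```python
# Per-month change in market share, in percentage points, between
# two share figures a given number of months apart.
def points_per_month(start_share, end_share, months):
    return (end_share - start_share) / months

# March 2003 (2.2%) to July 2004 (3.1%): 16 months, pre-Ubuntu.
pre_ubuntu = points_per_month(2.2, 3.1, 16)
# July 2004 (3.1%) to June 2010 (4.8%): 71 months of the Ubuntu era.
ubuntu_era = points_per_month(3.1, 4.8, 71)

print(round(pre_ubuntu, 5))  # 0.05625
print(round(ubuntu_era, 5))  # 0.02394
```

Which is to say: by this (admittedly noisy) measure, the rate of growth less than halved.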

It's hard to find stats from the other places that track operating system usage that go back as far, but going back as far as they do - to around 2007 or so, usually - they seem to tell much the same story. I can't find any which show really significant growth in general Linux adoption, or a significant increase of the rate of growth at any point in Ubuntu's tenure.

So, is this success? Really? Discuss. :)

Given what's discussed in this post, I'd say the top 3 damage-dealers to Microsoft since Canonical started up have been Mozilla, Google, and Microsoft itself (hello, Vista). Canonical probably ain't in the top 10.

(To pre-empt the obvious point - yes, I was working for Mandriva and Red Hat during most of the period under discussion, and we haven't revolutionized the world in terms of Linux market share either. But Mandriva never exactly aimed to; it aimed to be a successful commercial Linux distributor, and was always more about effectively serving a relatively small market than trying to Take Over The World. I'd argue Mandriva's usually provided an excellent product to its mostly-loyal user base, and almost all the company's problems, which are significant, are in management and marketing. Red Hat, also, has never said we're aiming to make significant inroads into general user desktop market share. This is not the same as 'we don't care about the desktop', note, but the emphasis is slightly different. We maintain a great desktop distribution, which I work on, and do a large amount of work on the F/OSS desktop - see Greg's post - but not in exactly the way Ubuntu claims to be aiming for.)

Poulsbo (GMA 500) driver in RPM Fusion repository for Fedora 13

So, I finally did it - I've uploaded a mostly working driver for the Intel Poulsbo (GMA 500) graphics chipset to the RPM Fusion repository for Fedora 13. As of right now, it's in non-free updates-testing; it'll be moved to the main updates repo at some point. To install it, set up RPM Fusion repos according to the instructions, then run this command:

yum --enablerepo=rpmfusion-nonfree-updates-testing install xorg-x11-drv-psb

You may also want to do this (it'll make it work even when a new kernel comes out and Fusion doesn't yet have a kmod-psb package for it):

yum --enablerepo=rpmfusion-nonfree-updates-testing install akmod-psb

Then just reboot, and usually it should work. Remember, you can get an mplayer build with video playback acceleration support which will let you play HD videos smoothly from my video-experimental repo: details in this post.

If you have trouble, check which kernel-devel package(s) are installed. Depending on your exact hardware - many Poulsbo-sporting machines don't actually support PAE - you may also need to manually install (with yum) the i586 kernel-devel package instead of the i686-PAE kernel-devel package that the dependencies may pull in.

If it works, but 3D acceleration and VA-API support don't seem to be working, check the permissions on /dev/dri/card0; they should be 666 (rw-rw-rw-). If not, add this section to /etc/X11/xorg.conf:

Section "DRI"
    Mode 0666
EndSection

Here are some system-specific tweaks:

If you want to have an external monitor side-by-side with the laptop display, you'll want to add a Screen section to /etc/X11/xorg.conf that looks much like the one in this file. You need to combine the dimensions of the displays you'll have connected appropriately. For instance, I have a 1600x768 panel in my laptop and the external monitor would be a 1680x1050 screen, which would be to the right of the laptop panel. So I add together the horizontal dimensions - 1600+1680 = 3280 - and use the tallest single vertical dimension - 1050 - and wind up with a size of 3280x1050. That's what I put in the 'Virtual' lines in the Display section. If your laptop display was 1024x768 and your external monitor 1280x1024, you'd do 2304x1024, and so on. Then you can configure the displays with gnome-display-properties or krandrtray or xrandr once X starts up.
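If it helps, the size calculation works out like this (a quick Python sketch, nothing to do with any actual X tool - the function name is made up):

```python
# Work out the 'Virtual' size for two side-by-side displays:
# add the widths together, take the tallest single height.
def virtual_size(panel, external):
    (w1, h1), (w2, h2) = panel, external
    return (w1 + w2, max(h1, h2))

print(virtual_size((1600, 768), (1680, 1050)))  # (3280, 1050)
print(virtual_size((1024, 768), (1280, 1024)))  # (2304, 1024)
```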

If you're using a Sony Vaio P (first generation), you'll want the kernel parameters 'mem=1900MB nohz=off' to make everything work smoothly (this is nothing to do with the Poulsbo chipset, just other quirks of the Vaio P; I'm not sure if they're needed on the second-gen P).

In case you're someone who reads my blog but uses another distro, the driver is still packaged in Mandriva and should work out-of-the-box on 2010 and 2010 Spring (you may need to install additional packages from non-free to get 3D and video playback acceleration support, I'm not entirely sure, and I'm not sure how well it works on 2010 Spring). Ubuntu users will want to see this Wiki page and the recent pages of this forum thread for instructions and info. I'm not sure about other distros - do feel free to add info in the comments.

Note that this is entirely something I do in my spare time, and is not an official Red Hat project of any kind. Nothing to do with it is in any way representative of, or endorsed or supported by, Red Hat or the Fedora project.

Edit: I totally forgot to include credits in this post, very rude of me! I'm sorry. Thanks go to Olivier Blin at Mandriva, who did a lot of the work of porting the driver to work with X server 1.7 and later, and to the Ubuntu poulsbo community who did some more work, particularly Yves de Muyter who solved some thorny issues with EXA support. I really just packaged up the stuff other people have made for Fedora. :)

And now for something completely different...

I like it when you find out someone's got somewhere before you.

It being a warm and lazy afternoon, I was watching the Blue Jays and ruminating upon the rules of baseball. Scenario: score is tied, bottom of the ninth. A speedy runner is on first base. The juiced up^H^H^H^H^H^H^H^H skilled hitter takes a mighty swing and hits a ball to the wall in right field. The man in right field chases it in despair, knowing the speedy runner will surely score before he can get the ball to home plate.

Can our poor right fielder simply grab the ball and throw it into the stands, causing the play to be an automatic double, forcing the runner to take third base and no more, and saving the game for his team? Is there a provision against this in the rules?

Well, it being a very lazy afternoon, I took the trouble to look it up, and the answer is yes, there is such a provision:

7.05(g) "...When such wild throw is the first play by an infielder, the umpire, in awarding such bases, shall be governed by the position of the runners at the time the ball was pitched; in all other cases the umpire shall be governed by the position of the runners at the time the wild throw was made..."

That is, if an outfielder throws the ball out of play, the runners get two bases from whichever base they'd already reached by the time he threw it, not two bases from where they started the play. Unless our hapless right fielder is an extremely poor fielder and can't make an out at home plate on a runner who hasn't yet reached second base when he gets the ball, he cannot profit from throwing the ball away.

Well, back to the TV for me!


Slashdot comment threads can vary wildly in quality (thank you, I'll accept my Understatement Of The Decade Award later), but just occasionally they're brilliant. Here's fuzzyfuzzyfungus rendering years, and billions of dollars, in free market rhetoric into a handy-dandy pocketable one-liner:

"Any failure by the real world to precisely replicate the predictions of an Econ 101 student with a B or better average is caused by government meddling and could be solved by cutting taxes."

MMOFPS, or why Penny Arcade rocks

One thing I do with the time for which Red Hat pays me so many millions of (some of) your precious dollars is to lounge around in my underwear reading a bunch of web comics. (This comes under the heading of 'community research'). One of my favourites is Penny Arcade, and this is one of the big reasons why.

Yup, that's a link to a news post. It's not even got much to do with the comic it was posted alongside. On most comic sites - Nuklear Power is a great example - most of the news posts are eminently skippable and generally fairly dull. On Penny Arcade, they're most of the meat. The PA comics are very, very funny, but the news posts form probably the most perceptive long-term body of criticism of the video game industry ever written (in addition to being very, very funny). I don't play games anywhere near as much as I used to, but I still hang around the borders of that world, dipping a toe in occasionally (I got the 242nd star in Super Mario Galaxy 2 last week). PA has a knack for looking at issues of game design from a perspective you never thought of before, but which seems blindingly obvious as soon as they do it, and that article's a top-notch sample of the above.