OpenID (and others) login support

So, I've had many annoying whines^H^H^H^H^H^H^H^H^H^H welcome requests from readers in the past for OpenID authentication for posting comments. I didn't implement this for a long time. I told people it was because I was worried about spam but really it's because I'm freaking lazy.

I tried to enable this a while back with the standard WordPress OpenID plugin, but it didn't seem to work (as many of you have noticed). I left it in this broken state for months, again because I am freaking lazy.

Today, because I'll do almost anything to avoid writing an FWN beat, I actually got this implemented in a way that works, using a different plugin - this one. It uses an external authentication intermediary called RPX. This is slightly worrying, but I did look around a bit and read the license terms carefully and they don't seem to be evil. Please do let me know if you know anything to the contrary.

The good bit is that it works - I tested - and it allows login with Yahoo, AOL and Google accounts as well as OpenID. So that should make it easier for people to comment in future. YOU'RE WELCOME! You can also still log in with your existing account on my blog, if you like - there's a little link down at the bottom of the RPX login screen for that.

I reserve the right to disable this again if it causes me to get spam comments.

Video acceleration repository updates

Yesterday and Tuesday I pushed several updates to my experimental video acceleration repositories. Most obviously, there's now an F11, an F12 and a Rawhide (F13) repo, all in the logical places. I also updated all components: libva is up to sds7, vdpau-video to 0.5.2, mplayer-accelerated to 20091106, and mplayer-mt to svn rev 29934 (basically, the latest mplayer and ffmpeg-mt checkouts as of yesterday).

Note that Splitted Desktop Systems have made xvba-video available. It's the equivalent of vdpau-video for the ATI proprietary driver: a bridge from ATI's 'xvba' video acceleration system to VAAPI, so you can use any VAAPI-compliant player with ATI cards. Unfortunately, it's proprietary, and I'm not sure whether I want to package it because of that. I probably will at some point, but...ick. Anyway, the fact that it ships as a prebuilt binary makes it very trivial to install: basically, you just grab the latest tarball for your arch from here and extract the single file it contains to the appropriate place on your system. Then make sure you have libva and mplayer-accelerated installed and try it.
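For concreteness, here's roughly what that looks like on 64-bit Fedora. This is a hypothetical sketch: the tarball name, the driver file name, and the target directory are my assumptions from memory, so check them against what you actually download.

```shell
# Hypothetical file names throughout - use whatever the SDS tarball
# actually contains.
tar xzf xvba-video-x86_64.tar.gz
# The single file inside is a VA-API backend driver; libva loads
# *_drv_video.so modules from its dri directory (lib64 on 64-bit Fedora).
su -c 'cp xvba_drv_video.so /usr/lib64/dri/'
# Check that libva can see a driver, then try accelerated playback with
# the patched mplayer:
vainfo
mplayer -vo vaapi -va vaapi some-video.mkv
```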

I'll try and find some time to work on the RPM Fusion mplayer package soon. Basically I'd like to update it to a newer snapshot and enable VDPAU support.

Engineering fail

I'm not normally much of a one for the Windows / Microsoft-bashing - I tend to just take a live-and-let-live approach - but this write-up of a Windows kernel engineer's talk is just hideous. Highlights:

"In order to make MinWin, Microsoft had to split existing DLLs that had these unwanted dependencies, such as Kernel32.dll. The team created KernelBase.dll, which has only the base functions MinWin requires. Applications expect to find these functions in Kernel32, but they are simply forwarded to KernelBase. Kernel32 itself is outside MinWin."

"A related problem is that Microsoft has been in the habit of combining unrelated APIs into the same DLL for performance reasons. Its solution is to create virtual DLLs, which are the API sets programmers call, but which are implemented in logical DLLs that might combine several virtual ones. A schema file that is mapped into every process tells Windows where the real API resides."

"Microsoft observed that 15 per cent of all user-mode crashes and 30 per cent of shutdown crashes were caused by heap corruption: applications that try to access freed memory, or memory beyond what is allocated. Its solution was a feature called the Fault Tolerant Heap (FTH). If an application crashes a few times, Windows will automatically apply a shim that intercepts its memory operations. FTH will over-allocate memory, and keep a copy of freed memory so that attempts to re-read it will succeed. The user gets better reliability at the expense of performance, which suffers by up to 15 per cent or so, while buggy applications work better than you would expect."

Windows: Fixing Things Wrong Since 1985

I'm being a bit unfair, as it does seem that Microsoft (or at least the kernel engineer in question) finally understands how hideous this is, and that the current slate of horrible, horrible workarounds is something of a first step towards doing things properly. But the above is a nice illustration of how far things have gone wrong. The virtual DLLs thing made me throw up in my mouth a little bit. I hate to think what Microsoft's attrition rate for kernel developers is; they must go through 'em like ninepins. I imagine the sight of a soon-to-be-ex-kernel developer running through the corridors, foaming at the mouth and screaming 'NO MORE! NO MORE!!!' is a regular occurrence around Redmond way...

I'm thinking this post would make a wonderful thing to point to when people start asking why F/OSS distributions and components tend to insist on fixing things the right way the first time, no matter how much short-term pain it involves.

Fedora 12 release celebration menu

To start: home-made garlic bread

Main course: Gnocchi in a tomato, garlic and mushroom sauce, topped with grated cheese
Accompaniment: Osoyoos Larose 'Pétales d'Osoyoos' 2005

Dessert: Tiramisu or chocolate cheesecake
Accompaniment: Smith Woodhouse LBV 1995


Get Fedora 12 here. We're halfway through the port and hence a little hazy on specifics right now, but I'm pretty sure it's good.

Oh, pictures from Seattle (with the shiny new camera!) are up here.

New camera

We're mostly done with the Fedora 12 release now, and I'm going for a weekend break to Seattle with my partner tomorrow, so I thought it was a perfect time to indulge myself in a shiny new toy - a new camera. My last one was a fairly cheap Sony compact bought three years ago, so it was definitely showing its age. It annoyed me to take quite nicely-composed shots like:

Lake twilight

which are mostly ruined by the high noise level. Its various other inadequacies were also getting to me: it's pretty slow by modern standards, it doesn't have many manual controls, and the ones it does have are a bit finicky to use - the usual stuff.

So I've been thinking about getting a new one for a while. I considered a low-end SLR, but rejected that on the grounds that I'm just not committed enough to lug the thing around with me. It'd end up sitting at home all the time, and the old aphorism is that the best camera is always the one you have with you.

I also considered the Olympus E-P1 and especially the Panasonic DMC-GF1, the two popular somewhere-between-SLR-and-compact micro Four Thirds-based cameras. Especially the GF1. Eventually I reluctantly decided against this option for a couple of reasons. One, it's really pretty expensive; a bit more than I, as a casual photographer really, can justify. Two, they're only really portable as long as you're using the fixed focal length pancake lens option. (For the even-less-photographically-knowledgeable than me, that basically means 'no zoom'.) There are lots of rave reviews which go on at great length about how this is really less of a big problem than you'd expect, but these are clearly all written by the type of drooling photo geek who has seventeen cameras available at any given moment, most of which could zoom in on Alaska in an emergency. For someone like me, for whom this will be my only camera, it's a much bigger deal. You can, of course, buy a wide range of variable focal length (zoom) lenses for these cameras - that's kind of the point of the whole system - but they all make the thing much bigger once attached. And then I have the portability problem again; I know the zoom lens would wind up sitting in a bag in a corner somewhere, being a waste of money, and I would be out with the pancake lens cursing that I can't zoom in on something.

So, although the micro Four Thirds options called very strongly to the shiny-odd-things geek part of me, I manfully rejected them. This left me in the market for a newer compact camera. For a while, I was considering the Sony DSC-TX1 - the new model of their high-end extremely tiny camera, the line with the sliding lens covers that you've probably seen in any big-box electronics store. I considered this for a few reasons. Perhaps oddly, I'm something of a Sony fan, as laughable as that may be at present; I generally like Sony design and interfaces, and have found Sony stuff is generally well-made and solid, which is a quality I care about quite a lot. I am a sucker for extremely small instances of things (viz. Exhibit A, my laptop). And it's actually supposed to be a rather good camera for its size, with the video and low-light capabilities quite widely praised.

In the end, though, I decided you're just inevitably giving up too much in raw potential quality to gain such a tiny tiny body, however well Sony have managed to cope with it. I don't really need something so tiny, my pockets and bags can accommodate something a little bigger, and if I am going very lightweight my phone can take acceptable 'fun' pictures at a pinch. What I really wanted was something small enough that I won't wind up leaving it at home all the time and small enough to be pocketable in a pinch, but that was all.

Eventually I wound up back at the usual suspect: yup, Canon. Sony make a mean-looking and very tiny widget, but it seems like the serious choice always winds up being a Canon. At first I was looking at the new G11, the old-fashioned-style compact which feels like you could run a tank over it and has extremely highly-rated image quality. But a few of the reviews pointed me in a different direction. If you don't mind not having the articulated LCD, external flash hotshoe, and optical viewfinder - and I don't mind any of those - you can get the same sensor, and an arguably better lens, in the much slimmer body of the PowerShot S90.

So, I did. (Whew, you were getting tired there, eh?) And I'm very happy with it so far. It's a revelation compared to any other compact camera I've ever used; in operation (though not physically, obviously) it feels much more like an SLR. The controls, as many reviews have remarked, are wonderful - it's amazingly easy to twiddle with all sorts of options, from the fairly vanilla (zoom, ISO setting, white balance presets) to exposure compensation, flash power, a very wide range of shutter speed and aperture size settings, and all sorts of clever bits like moving face autofocus and slow synchro modes and heaven knows what else. It provides two dial controllers - a typical one on the back and a big one around the lens, like many SLR lenses have for manual focus - and a plethora of modes so you can set each one to control whatever you'd like to have fast access to at any given time. It's just brilliant. Easy enough to have me fooling around on my balcony with the thing sitting on top of my barbecue (tripod? hah!), taking two-second exposure shots of the street outside (it's pretty dark). The processing is again like no compact I've used before - in super-dumb auto mode you can just point it at anything you like and usually get a pretty decent shot, and it's incredibly responsive.

The quality is really amazing for this kind of camera, and just worlds apart from what I was using before. You're never going to get SLR quality out of the lens and sensor that can be stuffed into a compact body at a sub-$500 price point, but it really does wonders with those provisos. Anything up to ISO 800 is very usable, and the image stabilization gives workably sharp hand-held shots at shutter speeds as slow as 1/60 (that's good, very good, for your dimly-lit inside/night shots). I'm very sure my lakeside picture would've come out a hell of a lot better with this camera. The flash is a typical compact-camera flash that you can't point anywhere but straight at what you're shooting; it's never going to be brilliant, but it does as well as anything this size could. The build quality is great, which as I said is a big deal - I like to keep my stuff a long time, and I do tend to drop things. The casing is metal and feels very solid, all the buttons and wheels feel chunky and non-breakable, and even the port covers aren't bad as port covers go.

As you can tell, I'm really happy with this thing! I'll have pictures taken with it up on Flickr after I've been to Seattle - there's nothing interesting to point it at around here tonight, really. But yep, if you're in the market for a new camera, the price is in your range, and you want something that's neither really tiny nor really big, it's a fantastic choice.


You wind up doing some very odd things as part of QA:

mdadm --force --create /dev/md/ddf --metadata=ddf --raid-disks 1 /dev/vdb

yes, that is as weird as it looks - forcing creation of a DDF (firmware RAID format) array with just one member disk, purely in the name of QA...

The snow cone-ometer indicates we're gurgling at approximately 3,674.56 Belgiums per Connery!

or, Fun With Benchmarks, Part #5,468.

That fine and most reliable periodical WorksWithU decided to take a look at the nouveau driver. After a fairly innocuous introduction, they decided it'd be a great idea to do a performance test.

It's worth pointing out that they are, at least on some level, aware of a rather important fact: "To solve this problem, the cleverly named nouveau project was launched a few years ago to develop a full-featured, open-source video driver for nVidia chips. As its feature status chart demonstrates, it’s still maturing and doesn’t yet offer any real 3D functionality, but 2D support is implemented." So, yes, to recap, that's quite correct: nouveau offers no 3D acceleration.

So, after (one presumes) careful consideration, how did they decide it would be best to thoroughly benchmark the performance of nouveau? Why, let's hear it in their own words:

"I used glxgears (yes, it’s not a good tool for benchmarking overall video performance, but it’s a useful basis for standardized comparison of FPS rates under different video drivers) to measure video frames per second under the nouveau, nv and nvidia (closed-source) drivers."

(No, I am not quoting selectively to misrepresent. The single test, the sole rigorous examination of driver capabilities which was chosen as the basis on which this plucky little driver would stand or fall, was glxgears).

You used glxgears.

You...used glxgears.

Really? glxgears? Really? That tool which, insofar as it measures anything at all, measures...3D rendering performance?

It's just...I mean...guh. As the fine television football show puts it, "c'mon, man". If there's a moral here, it's...well, it's that epic fail is prevalent on the intarwebs, obviously. Also that, as you really ought to know already, it's really easy to get into trouble with benchmarks. Though it does take some ingenuity to get it that wrong...

Fedora 12: Crunch time

It's been a crazy week. We're still pushing to make the Fedora 12 final release on time, but without compromising on quality. It's been a little hairy over the last two days, but we've got what we think is a solid package set in at last, and a first release candidate build has been cut. We still need to do some heavy testing on it and make a final call on whether we're going with the planned release schedule - that will happen on Monday - but at the moment I'm hopeful. We'll make the right decision either way - if we ought to slip the release, we will - and Fedora 12 should be one of the highest-quality Fedora releases in a while.

So, what can you do to help? Oh, so much! First, don't let the magic words 'release candidate' mean too much: the RC compose per se really only exists to test the actual image compose process, and the DVD / multi-CD installation process. The bits it contains are just the same ones in the repositories. If you have an F12 installation already - say, from the Beta - and you update it daily, you're getting the exact same bits. You can also get on the F12 train from the nightly builds, and testing that install path is just as valuable as testing the DVD or net install paths.

So whichever method you choose to get on the Fedora 12 train, we need you to test out the package set that will be hitting the repositories tomorrow - there's a small number of updates from today, but they're all very significant ones - and make sure it works. We want to hear about any catastrophic problems, and especially about any cases where the packages from 2009-11-06 work worse than the ones from 2009-11-05. That's key. Post a comment here, file a bug and mark it F12Blocker, tell us on a mailing list or find someone on IRC - any of these will work in such a case.
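If you're on the daily-update path, that testing takes no special setup at all. A quick sketch, assuming a stock yum configuration on an existing F12/Beta install:

```shell
# Pull the current F12 package set:
su -c 'yum update'
# If something regresses, knowing exactly what changed helps the bug
# report; this lists installed/updated packages, most recent first:
rpm -qa --last | head -20
```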

If you have spare systems, drives, partitions or virtual machines we certainly need installation testing, either from the RC compose or from doing a network install from a Rawhide mirror. We have the installation test matrix up here. There's a couple of ways you can use it. The easiest way is to just do an install the way you usually would, then see which cell on the table that installation method maps to. If it's not filled in already, fill it in. If it's filled in but your result doesn't match what's in the table already, edit the page to say so - and include a link to a bug report, or contact details, some way for us to follow up! The other way is to look at the table first, and look for tests that haven't been done yet - blank cells. See if you have the hardware or configuration needed to do the test, and if you do, then do it and fill in the cell. In all cases, if you try an install and hit a serious bug, make sure to file a report to bring it to our attention.

The bottom line is, the more testing the better; just run F12 code anywhere you can and let us know where it's broken. Information is always good.

It's been incredible to see how hard everyone in the QA, release engineering, development and documentation teams is working to get this release done and make sure it's great. I'm sure the other teams are all working just as hard, as well. I don't think there are many other distributions where I could start the day watching one of Red Hat's full-time engineers start working on an issue in France in the early morning my time, discuss it with more developers in Europe and the U.S. during the course of the day, and go to bed in the evening (okay, early morning) knowing a couple more paid Red Hat engineers are working on it in Australia. It's a great process to be a part of - insofar as I've been helping by running around like a headless chicken, bashing everyone in sight over the head with the blocker list and demanding fixes yesterday!

On another note, congratulations to Ubuntu and Mandriva on their 9.10 and 2010 releases. I hope to get some time to play with MDV 2010 once this craziness subsides :)

Test Day update: i18n Test Day 2009-10-29

Tomorrow (for me, anyway; probably today for most of you) - 2009-10-29 - sees the final Test Day of the Fedora 12 cycle, on i18n (internationalization). This is a key topic for many Fedora users, covering the rendering of text in non-Roman alphabets, and input methods for those languages that require them. The Test Day page is all set up and ready with test cases, and you can test using any Fedora 12 environment - the Beta live image, an installed Beta, or the nightly live images. Testing is under way right now in #fedora-test-day on Freenode IRC, and will continue throughout 2009-10-29, whenever that is in your time zone! Please come along and help make sure Fedora 12 is a great experience no matter what language you speak.