Data loss and other pontifications

The Register rejoices in the fairly unusual status of having a remarkably clueful readership. This article - a database professional's evaluation of the massive screwup whereby a U.K. government agency lost the personal details (including banking details) of nearly half the nation's population in the post - is a gem, but the comments are even better. The article argues that the agency's protestations that removing unnecessary fields from the data before sending it would have been prohibitively expensive are tosh; the author estimates the job at about half a day's work, at a cost of £500. A commenter pointed out that this estimate only holds in the world of sense. In the world of government / business, it goes something more like this:

"I think that there might be a markup put on your work as well as some overhead of writing the requirements, formulating the demand for extra IT work when costs are being reduced, 5 layers of management to get the request approved after choosing the appropriate budget line, possibly a committee or two to pass through, then the costed proposal from the outsourcer, the quality plan for the work, fully detailed PERT chart, test specifications, approval process, proposed modifications to service level agreements, approval of tea breaks (oops, only kidding) etc. etc. etc.

Techies like you make things sound EASY. ;-))"

Would be funny if it weren't true...

I've been playing Super Mario Galaxy for the last week or so. Every time I play a main series Mario game I'm struck again by how unreasonably far ahead of the curve Nintendo still is when it comes to pure game design. I have never played a game made by any other company which produces so much pure enjoyment from ridiculously simple designs. I'm just incredibly impressed that you can give a Nintendo designer a set of blocks and a couple of enemies with AI about two lines of code long, and they'll come up with about seven thousand different puzzles that will leave you grinning stupidly and playing till 4 a.m. without even noticing it. I've played games that are as good as Galaxy, but never in quite the same way, and never with quite the same level of pure simplicity. Anyone with a Wii must go buy it right now, and if you don't have a Wii, consider getting one.

I made chocolate spread yesterday. This would probably be incredibly easy with a food processor, but I feel rather embarrassed using any cooking gadget which my grandmother managed to live without till well into her 80s, so my method involved crushing chocolate chunks and roasted hazelnuts together with a mortar and pestle. I've never had bruises on my palms before. Tastes good, though.

Guitar Hero III - no, thanks

So I see the new Guitar Hero is out. Some choice quotations from the IGN review:

"...but this title does play a tad different than what you're probably used to if you've been following the series closely. For starters, the window of time in which you can successfully hit a note has been extended by quite a bit...The difficulty on the lower tier songs has been reduced, making the game much more accessible to new players. It also makes it so that intermediate players can feel like a rock star right from the get go"

"The character models have been given a new art direction and its one that we're not particularly fond of. Although a lot of motion capturing was done to get the signature moves of Slash and Bret Michaels, as well as to nail the singing animations, the way the band and crowd move just doesn't feel like rock and roll."

"There's even a bit of slowdown that comes infrequently as star power is activated. It's a rare occurrence, but the fact that we saw it at all is inexcusable in a music rhythm game."

"The standard campaign mode has only been slightly tweaked for Guitar Hero 3. The game still plays as a list of songs broken into tiers with a surprise encore at the end of each one."

"One of the little scenes shows the band getting upset at accusations that they've sold out. That's fine, except that Activision clearly sold Guitar Hero out in every way it could. The story is rather hypocritical. Ads are fed into the game, an entire stage was sold to Pontiac, and there is even an Axe Body Spray guitar."

(they still gave it an 8.9, but hey, that's review sites for you).

So...basically, the team behind Guitar Hero realized that with two releases they had done everything that really needed doing with the game. Instead of hanging around to create endless retreads, they left to go and do something new and cool (Rock Band). The publisher, as per usual, just wants to wring the teat until the milk don't come no more, so they bring in a new studio to basically re-build the exact same game from scratch with a different song list, make it even more ridiculously easy (yes, to any GuitarFreaks / Drummania player, Guitar Hero is stupidly easy - the timing is insanely lax), throw in some utterly inexcusable bugs (slowdown?!) and drape a nasty patina of corporate sponsorship over the entire thing. I fully expect they'll keep doing exactly this until we hit Guitar Hero VIII and no one cares any more.

The coolest thing about Guitar Hero in the first place was that it was clearly a labor of love, a type of game that doesn't come along very often any more. Lots of people call it 'original', but as it was an acknowledged remake of GuitarFreaks for North America, it wasn't. However, it was done with absolutely perfect style by a bunch of people who clearly love the whole rock music world deeply, and understand it just as deeply. To anyone who owns several hundred CDs and goes to dozens of concerts a year, it felt perfectly right on a huge number of levels. Sure, it had corporate sponsorship - but from music firms, which was only appropriate. The game would've felt worse without Fender guitars, Marshall PAs and so on.

The Guitar Hero III / Rock Band differential paints the whole situation in a painfully clear light. Guitar Hero III is a pointless cash-in on an existing franchise, built by people who clearly do not understand the mindset and motivation of the original developers. It adds nothing new but bugs, and it compromises the whole spirit of the exercise. Rock Band is looking to be exactly what Guitar Hero was - a labor of love, with a clear purpose and a clear sense of development from what went before. Sure, viewed purely as a rhythm game it's going to have problems - it'll likely still be too easy for hardcore players, and will miss the crack-like mechanics that keep people playing GFDM for years on end (pass the song? good, go back and S it. S it? good, go back and FC it. FC it? good, go back and SS it. SS it? good, go back and excellent it. Not that good? Just keep trying to get your skill points for the song up, one by one by one...) But it's going to have the perfect atmosphere that Guitar Hero initially had, which is a huge part of what made it so popular in the first place.

GH 3 will sell a ton of copies on inertia, but I think there'll be a significant tail-off from GH 4 onwards. Rock Band should sell truckloads; it hasn't been well-promoted, but if there's any justice, it still will.

Compiz 0.6.2 and Compiz Fusion 0.6.0 backported to Mandriva Linux 2008

Yep, the backports for the new Compiz and Compiz Fusion releases have hit the 2008 repositories. If you want them on 2008, just set up your backports repositories - go to the repository configuration tool and enable them - then update the appropriate packages (compiz, libcompizconfig, compiz-bcop, compiz-fusion-plugins-main, compizconfig-python, compiz-fusion-plugins-extra, ccsm).
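
If you'd rather do the whole thing from a console, something like this should work - just a sketch, assuming a reasonably recent urpmi and that your backports media are already defined but disabled; the media names below are guesses, so check 'urpmq --list-media' for the real ones on your system:

    # mark the backports media as enabled and refresh them
    urpmi.update --no-ignore "Main Backports" "Contrib Backports"
    urpmi.update -a
    # pull in the new Compiz / Compiz Fusion packages
    urpmi compiz libcompizconfig compiz-bcop compiz-fusion-plugins-main \
          compizconfig-python compiz-fusion-plugins-extra ccsm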

New ATI drivers

Just to let everyone know, we (as in, the Mandriva development community) are aware of the new ATI driver release. Anssi Hannula (who does the actual ATI / NVIDIA driver packaging), myself (who handles the graphics card / driver detection tables), and Olivier Blin and Colin Guthrie (who handle compiz and drak3d) are co-ordinating to bring this new version to both Cooker and 2008. We're aiming to upload the new driver to 2008 /backports simultaneously with backports for ldetect-lst and drak3d, which will allow the graphics card configuration tool to automatically configure the new driver on appropriate cards, and set drak3d to allow 'native support' (i.e. AIGLX) for the new driver.

So if you'll give us a couple of days, you should be able to get a nice smooth and easy upgrade to the new drivers. Of course this won't stop the bleeding-edge enthusiasts running off to amd.com to try and install it themselves, but if you're willing to be a bit more patient it should be easier for you :)

We'll also consider a 2007.1 backport of some kind, but that would be rather a lot more complicated and may not be possible.

Mistakes

I was a bit surprised recently when I read about Ubuntu's PPAs. After years of being irritatingly good at everything infrastructure-y, Ubuntu has made a mistake so hilariously bad I don't know how they managed it. Basically, they've institutionalized the third-party repository.

A PPA gives any registered Ubuntu developer a personal repository into which s/he can upload new packages for any supported Ubuntu release. These are built on a central buildsystem (which is good) but, crucially, they are not built against each other (which is very, very bad).

The key problem with third party repositories, as the Mandriva world learned extremely painfully between 2000 and 2004, is that they cause all sorts of interaction problems.

As Mandrake / Mandriva, until recently, had no real provision for doing significant package upgrades / changes for official releases, third party repositories naturally appeared to fill the need. These would include version upgrades, new software and other significantly changed packages compared to the official release. And because there's a body of users who always want the latest shiny everything, they became very popular.

They also caused huge amounts of problems. As there's some history here, I should be careful to note that in most cases these were really not the fault of the third party packagers, exactly. The problem is with the whole setup.

The biggest problem comes with upgrades. Upgrading between releases of a Linux distribution is, as most people who've tried it know, a fairly dicey proposition in any case. Adding packages from third party repositories almost inevitably made it break. The installer is built with the latest official packages in mind. I don't know of anyone who installed something substantial from a third party Mandrake / Mandriva repository (except one specific one that has, well, special circumstances) who then got a trouble-free upgrade to the next Mandrake / Mandriva release. It just breaks stuff. Ubuntu has, by all accounts, been having increasing problems ensuring good upgrades with each release (although it's something they started out being very good at). PPAs are just going to make that about five hundred times worse. I confidently predict that more people will have trouble upgrading from 7.10 to 8.04 than with any other upgrade in Ubuntu's history, and a lot of them will be people who used PPAs.

The second biggest problem is that third party repositories do not play well together. I mentioned above that PPAs are built on a proper central buildsystem, which is good. This will at least avoid the problems caused by amateurish third party repositories that don't understand the importance of a clean buildsystem and upload badly built packages. However, they are not built against each other, which is very bad. The classic problem with this scenario goes as follows. Package Foo depends on Package Bar. Packager Bob adds Package Bar to his PPA - he upgrades the version, makes some significant packaging change, whatever. Packager Jasmine adds Package Foo to her PPA. As Jasmine's PPA is not built against Bob's PPA, Jasmine's Foo is built against the official Ubuntu Bar. But what if a user installs Bob's Bar and then wants to install Jasmine's Foo? Trainwreck, that's what happens.
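
To make the Foo / Bar trainwreck concrete, here's a sketch - all the names and version numbers are invented for illustration, and the comments describe the hypothetical state of the user's system:

    # Hypothetical situation (invented names / versions):
    #   official Ubuntu archive:  bar 1.0
    #   Bob's PPA:                bar 2.0 (rebuilt, ABI changed)
    #   Jasmine's PPA:            foo 3.2, built against the official bar 1.0,
    #                             declaring only "Depends: bar (>= 1.0)"
    # With both PPAs enabled, the user runs:
    apt-get install bar   # gets Bob's bar 2.0 - the highest version wins
    apt-get install foo   # installs fine: "bar (>= 1.0)" is satisfied on paper
    # ...but foo was never built or tested against bar 2.0, so it misbehaves
    # at runtime, and nothing in the package metadata flags the mismatch.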

If you're lucky they'll explicitly conflict and Jasmine's Foo will just fail to install. Then you've just got a mildly narked-off user. What can also happen, though, is that there won't be any explicit package conflict and Jasmine's Foo will install perfectly happily...but it won't quite work right. There'll be some bad interaction that causes all sorts of odd buggy behaviour. The user will head off to the Ubuntu forums and report this behaviour. "I can't reproduce, it works fine for me," some helpful person will say. Then another. They will suggest things for the user to try that will do nothing to resolve the problem. The user will get increasingly frustrated. Finally, someone will think to check what version of Bar the user is using, and the problem will be revealed. Or not - the user will just be stuck with the broken behaviour and never know why.

The fact that there are literally hundreds of these PPAs, and that their design is such that they're intended to be small and virtually single-purpose, is going to make this kind of problem massively more likely. There were only ever maybe 10-20 commonly used third party repositories for Mandrake / Mandriva, and we had all sorts of trouble like this. With hundreds of independent little repositories it has the potential to be a gigantic nightmare.

The last big problem is related to the previous one, and it's this: third party repositories are a support nightmare. As noted in the example above, the only way to figure out our poor notional user's problem is to find out that they're using a PPA Bar package. Now imagine that every time you want to help anyone with any issue, you have to check what versions they have installed of all the packages related to the issue. Yep. If they have a problem in Epiphany - check their entire GTK+ stack. Check their Firefox package. Check all sorts of stuff. Do this every time. Forever. Learn, or look up, the entire dependency stack of every package anyone ever brings you a problem with.

The people who help others out in the Ubuntu forums are either doing their nut about the whole PPA thing right now, or they will be very soon, as they start to encounter problems like this. It makes it just incredibly hard to help people.

It only took Mandriva about six years to come up with a good answer to this problem. Fundamentally, the solution is to have one official repository for non-security, non-bugfix, major changes to packages in a released version, have the packages in that repository built against each other, and have a policy for what can go into it (e.g. no important libraries). This is the /backports system used since Mandriva 2007. Prior to that we had the 'community' tree for the 10.2 and 2006 releases, which fulfilled basically the same function but in a much less well organized way. You would have thought that other distributors would learn from our experience, but apparently not.

Of course, the fact that Ubuntu also has a backports system - which will apparently just roll alongside the PPAs - and a proliferation of true third party repositories (small repositories with random updated versions of applications, built and provided entirely outside the official Ubuntu framework) and random .debs being uploaded to project websites is only going to make the interactions that much more difficult to trace.

Part of me wants to sit back with a gigantic grin and a bag of popcorn and watch the carnage, but honestly, I feel sorry for the users, and also for the poor sods who do Ubuntu support.

More reviews, some miscellaneous news

So there have been some more reviews of 2008...

Here's one! Here's another! Here's another! And here's one more!

All pretty nice reviews so far, which is great.

As for Cooker, it now has the new beta 3 of KDE 4, mostly. We are looking at backporting it to 2008, but this has some problems which would need to be worked around.

Personally, I spent half of yesterday updating our mc package to the latest CVS snapshot (most other distros now carry a CVS snapshot rather than 4.6.1) and reworking the entire patch set. That got a bunch of cruft out and should have made the package a lot better, but if it broke anything, feel free to whine. :)

Also, for anyone who's read about the upcoming Club changes and is wondering when their paid Club account will be converted to a Powerpack subscription account and extended (for Silver members) - I've been told this conversion will be done at the end of October.

New NVIDIA, ATI driver backports for 2007 Spring

Okay, so we're all about the shiny new 2008, but I know there are those of you out there who are perfectly happy with 2007 Spring thankyouverymuch and won't be upgrading till it goes out of maintenance. So if you're in this group but you'd like the latest proprietary NVIDIA or ATI driver for your card, never fear - Anssi Hannula has come to your rescue. He's backported the NVIDIA 100.14.19 and ATI 8.40.4 driver versions to 2007 Spring, in the /non-free/backports repository (yeah, see, and you thought we were never gonna use it ;>). You'll find ati, dkms-ati, ati-control-center, nvidia97xx and dkms-nvidia97xx packages there. I have also backported the latest ATI and NVIDIA device IDs in ldetect-lst and released ldetect-lst-0.1.167.2-1mdv2007.1 to /main/backports, so if you install that package too, NVIDIA and ATI cards released since 2007 Spring came out will be automatically detected and configured to use the new drivers by the Mandriva graphics card configuration tool.
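
If you'd rather grab these from the command line, something along these lines should do it - a sketch only, assuming your non-free and main backports media are already defined; the media names here are guesses, so check 'urpmq --list-media' for the exact ones:

    # enable the relevant backports media and refresh
    urpmi.update --no-ignore "Non-free Backports" "Main Backports"
    urpmi.update -a
    # NVIDIA cards (the packages mentioned above):
    urpmi nvidia97xx dkms-nvidia97xx
    # ATI cards:
    urpmi ati dkms-ati ati-control-center
    # updated detection tables in either case:
    urpmi ldetect-lst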

2008 One GNOME out

Okay, so those of you who kept on bugging me about the GNOME version of 2008 One can stop now, it's out! Grab the torrent or find the ISO on the public mirrors.

Updated both my mail and web server VMs to 2008 today. Went very smoothly in both cases, via urpmi. Everything seems to be working well so far.
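
For anyone wondering, the upgrade itself was just the standard urpmi release-switch dance - roughly the following, where the mirror URL is a placeholder rather than a real one:

    # repoint urpmi from the 2007 Spring media to the 2008 tree
    urpmi.removemedia -a
    urpmi.addmedia --distrib http://a.mirror.example/mandriva/official/2008.0/i586
    # then let urpmi compute and run the whole upgrade
    urpmi --auto-select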

WebKit in Mandriva 2008

Just a follow-up to my posts from a few months back about GTK+ WebKit stuff. I haven't actually checked the other major distros, but I believe Mandriva 2008 may actually be the first to ship a WebKit-based GTK+ browser. It includes Midori 0.0.6, built against SVN revision 25144 of WebKit (WebKitGdk, as the GTK+ port was called at release time; WebKitGtk, as it's called now) - not the gtk-webcore port I was initially packaging, as that seems to have mostly fallen by the wayside now. Pascal Terjan has stepped up to package this incarnation of GTK+ WebKit and keep midori up to date, so thanks to him. As I wrote a few months back, Midori is missing a lot of the features you'd expect of a real mature browser application, but as far as just rendering normal web content goes, it's pretty advanced already, as you'd expect given that WebKit is a mature engine. It's fun to play with. So if you're running 2008 and want to look at the future of GTK+ web browsing, set up your contrib repository and install the midori package.
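
In urpmi terms that's just the following - assuming your contrib media is already defined, with the media name a guess as usual ('urpmq --list-media' will tell you the real one):

    # make sure contrib is enabled, then install Midori
    urpmi.update --no-ignore "Contrib"
    urpmi.update -a
    urpmi midori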

On a side note, this site will be down for a bit - I'm upgrading the machine that runs its VM to 2008.

Edit: okay, the VM host machine is updated to 2008 and now running VMware Server (rather than Player as before) - damn nice piece of software, I must say. Tomorrow I will tackle updating the mail and web server VMs themselves to 2008, so expect more downtime. This upgrade went nice and smooth once I swapped out the DVD drive in the machine for one that actually works. Moral of the story - never buy a Lite-ON DVD drive. I've bought three and they've all died within 18 months.

Public Service Announcement: NOKEY errors final

A final message on the NOKEY issue: all mirrors in the official list are now okay, with the sole exception of ftp.cica.es. Any other mirror proposed by the Mandriva repository configuration tools should now be fine. As mentioned in the previous message, if you have already set up your repositories and are encountering the NOKEY issue, you will need to either run 'urpmi.update -a --force-key' as root or remove and re-add your repositories to stop the errors from showing up in future. Once again, we're sorry for the inconvenience.
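
Concretely, that means either (as root):

    urpmi.update -a --force-key

or, if you'd rather wipe and re-add your media in one go - the mirror URL below is a placeholder for whichever mirror you prefer:

    urpmi.removemedia -a
    urpmi.addmedia --distrib http://a.mirror.example/mandriva/official/2008.0/i586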