
Month: June 2005

Hackability as a selling point

Hardware: On my home network, I recently replaced my NetGear MR814 with a brand new Linksys WRT54G.

My top criteria for what hardware to buy for this job weren’t price, form factor, how pretty the hardware is, or even what features it had — instead, I bought it because it’s an extremely hackable router/NAT/AP platform. Thanks to a few dedicated reverse engineers, the WRT hardware can now be easily reflashed with a wide variety of alternative firmware distributions, including OpenWRT, a fully open-source distro that offers no UI beyond a command-line.

Initially, I considered a few prettier UIs — HyperWRT, for example — since I didn’t want to spend days hacking on my router, of all things, looking stuff up in manuals, HOWTOs and Google. In the end, though, I decided to give OpenWRT a spin first. I’m glad I did — it turned out to be a great decision.

(There was one setup glitch, btw — OpenWRT now defaults to setting up WPA, but the documentation still claims that the default is no crypto, as it was previously.)

The flexibility is amazing; I can log in over SSH and run the iftop tool to see what’s going on on the network, which internal IPs are using how much bandwidth, how much bandwidth I’m really seeing going out the pipe, and get all sorts of low-level facts out of the device that I’d never see otherwise. I could even run a range of small servers directly on the router, if I wanted.

Bonus: it’s rock solid. My NetGear device had a tendency to hang frequently, requiring a power cycle to fix; that bug had been open for nearly a year and a half without a fix from NetGear, who had long since moved on to the next rev of cheapo home equipment and weren’t really bothering to support the MR814. I know this is cheap home equipment — which is why I was still muddling along with it — but that’s just ridiculous. None of that crap with the (similarly low-cost) WRT. OpenWRT also doesn’t contain code to DDOS NTP servers at the University of Wisconsin, which is a plus, too. ;)

Sadly, I don’t think Cisco/Linksys realise how this hackability is making their market for them. They’ve been plugging the security holes used to gain access to reflash the firmware in recent revisions of the product (amazingly, you have to launch a remote command execution attack through an insecure CGI script!), turning off the ability to boot via TFTP, and gradually removing the ways to reflash the hardware. If they succeed, the hackability market will have to find another low-cost router manufacturer to give our money to. (Update, June 2006: they have since split the product line into a reflashable Linux-based “L” model and a less hackable “S” model, so it appears they get this 100%. Great!)

Given that, it’s interesting to read this interview with Jack Kelliher of pcHDTV, a company making HDTV video capture cards:

Our market isn’t really the mass market. We were always targeting early adopters: videophiles, hobbyists, and students. Those groups already use Linux, and those are our customers.

Matthew Gast: The sort of people who buy Linksys APs to hack on the firmware?

Jack Kelliher: Exactly. The funny thing is that we completely underestimated the size of the market. When we were starting up the company, we went to the local Linux LUG and found out how many people were interested in video capture. Only about 2 percent were interested in video on Linux, so we thought we could sell 2,000 cards. (Laughs.) We’ve moved way beyond that!

Well worth a read. There’s some good stuff about ulterior motives for video card manufacturers to build MPEG decoding into their hardware, too:

The broadcast flag rules are conceptually simple. After the digital signal is demodulated, the video stream must be encrypted before it goes across a user accessible bus. User accessible is defined in an interesting way. Essentially, it’s any bus that a competent user with a soldering iron can get the data from. Video streams can only be decrypted right before the MPEG decode and playback to the monitor.

To support the broadcast flag, the video capture must have an encryptor, and the display card must have a decryptor. Because you can’t send the video stream across a user accessible bus, the display card needs to be a full MPEG decoder as well, so that unencrypted video never has to leave the card.

Matthew Gast: So the MPEG acceleration in most new video cards really isn’t really for my benefit? Is it to help the vendors comply with the broadcast flag?

Jack Kelliher: Not quite yet. Most video cards don’t have a full decoder, so they can’t really implement the broadcast flag. ATI and nVidia don’t have full decoders yet. They depend on some software support from the operating system, so they can’t really implement the broadcast flag. Via has a chipset with a full decoder, so it would be relatively easy for them to build the broadcast flag into that chipset.

Aha.

Project management, deadlines etc.

Work: I took a look over at Edd Dumbill‘s weblog recently, and came across this posting on planning programming projects. He links to another article and mentions:

My recent return to managing a team of people has highlighted for me the difficulties of the arbitrary deadline approach to project management. Unfortunately, it’s also the default management approach applied by a lot of people, because the concept is easy to grasp.

The arbitrary deadline method is troublesome because of the difficulty of estimation. As John’s post elaborates, you can never foresee all of the problems you’ll meet along the way. The distressing inevitability of 90% of the effort being required by 2% of the deliverable is frequently inexplicable to developers themselves. Never mind the managers remote from the development!

I’ve been considering why my experience of working with open source seems generally preferable to commercial work, and this may be one of the key elements. Commercial software development is deadline-driven, whereas most open source development, in my experience, has not been: ‘it’s ready when it’s ready’.

Edd suggests that using a trouble-ticket-based system for progress tracking and management is superior. I’m inclined to agree.

Irish SME associations quiet on patenting

Patents: yes, I keep rattling on about this — the vote is coming up on July 6th. I promise I’ll shut up after that ;)

UEAPME has issued a statement regarding the directive which is strongly critical of its current wording (UEAPME is the European small and medium-sized business trade association, comprising 11 million SMEs). Quote:

‘The failure to clearly remove software from the scope of the directive is a setback for small businesses throughout Europe. UEAPME is now calling on the European Parliament to reverse yesterday’s decision at plenary session next month and send a strong message that an EU software patent is not an option,’ Hans-Werner Müller, UEAPME Secretary General, stated.

‘There is growing agreement among all actors that software should not be patented, so providing an unequivocal definition in the directive that guarantees this is clearly in the general interest. We are calling on the Parliament to support the amendments that would ensure this,’ said Mr Müller.

‘The cacophony of misinformation and misleading spin from the large industry lobby in the run up to this vote has obscured the general consensus on preventing the patenting of pure software.’

That’s all well and good. So presumably the Irish members of UEAPME (ISME and the SFA) are agreeing, right? Sadly, neither of these has issued any press releases on the subject, as far as I can see, and approaches by members of IFSO have been totally fruitless.

Since both have made recent press noting that Irish small businesses face difficulties with the rising costs of doing business, this would seem to be a no-brainer — legalising software patents would immediately open Irish SMEs up to the costs associated with them: licensing fees, fighting spurious infringement litigation from ‘patent troll’ companies, the ‘chilling effects’ on investors noted by Laura Creighton, and of course the high price of retaining patent lawyers to file patents on your own innovations. One wonders why they aren’t concerned about these costs…

Happy Midwinter’s Day!

Antarctic: Happy Midwinter’s Day!

I’ve just finished reading Big Dead Place, Nicholas Johnson’s book about life at McMurdo Base and the US South Pole Station, with anecdotes from his time there in the early years of this decade.

It’s a fantastic book — very illustrative of how life really goes on at a distant research base, once you get beyond romantic notions of exploration of the wild frontiers. (Like many geek kids, I spent my childhood dreaming of space exploration, and Antarctica is the nearest thing you can get to that right now.) A bonus: it’s hilarious, too.

Unfortunately it’s far from all good — as one review notes, it’s like ‘M*A*S*H on ice, a bleak, black comedy.’ There’s story after story of moronic bureaucratic edicts emailed from comparatively-sub-tropical Denver, Colorado, ass-covering emails from management on a massive scale, and injuries and asbestos exposures covered up to avoid spoiling ‘metrics’.

Here’s a sample of such absurdity, from an interview with Norwegian world-record breaking Antarctic explorer, Eirik Sønneland:

BDP: I was working at McMurdo when you arrived in 2001. I remember it well because we were commanded by NSF not to accommodate you in any way, and were forbidden to invite you to our rooms or into any buildings. We were told not to send mail for you, nor to send email messages for you. While you were in the area, NSF was keeping a close eye on you. What did the managers say to you when you arrived?

They asked us what plans we had for getting home. The manager at Scott Base (jm: the New Zealand base) was calm and listened to what we had to say. I must be honest and say that this was not the way we were treated by the U.S. manager. It was like an interrogation. Very unpleasant. He acted arrogant. However, it seemed like he started to realize after a couple of days that we didn’t try to fool anybody. He probably got his orders from people that were not in Antarctica at the time. And, to be honest, today I don’t have bad feelings toward anyone in McMurdo. Bottom line, what did hurt us was that people could not think without using bureaucracy. If people could only try to listen to what we said and stop looking up paragraphs in some kind of standard operating procedures for a short while, a lot could have been solved in a shorter time.

One example: our home office, together with Steven McLachlan and Klaus Pettersen in New Zealand, got a green light from the captain of the cargo ship that would deliver cargo (beer, etc.) to McMurdo, who said he would let us travel for free back to New Zealand if it was okay with his company. At first the company was agreeable, but then NSF told them that the ship would be under their rent until it left McMurdo and was 27 km away. Reason for the 27 km? The cargo ship needed support from the Coast Guard icebreaker to get through the ice. Since, technically, the contract with NSF did not cease until the ship left the ice, NSF could stop us from going on the ship. At which point NSF offered to fly us from McMurdo for US$50,000 each.

He also maintains an excellent website at BigDeadPlace.com, so go there for an idea of the writing. BTW, it appears the UK also maintains an Antarctic base. Here’s hoping they keep the bureaucracy at a saner level over there.

The meaning of the term ‘technical’ in software patenting

Patents: One of the key arguments in favour of the new EU software patenting directive as it’s currently worded, from the ‘pro’ side, is that it doesn’t ‘allow software patents as such’, since it requires a ‘technical’ inventive step for a patent to be considered valid.

Various MEPs have tried to clarify the meaning of this vague phrase, but without luck so far.

Coverage has mostly noted this as meaning that ‘pure software’ patents are not permissible, for example this Washington Post article, FT.com, and InformationWeek.

But is this really the case, in pragmatic terms? What does a ‘technical inventive step’ mean to the European Patent Office?

Well, it doesn’t look at all promising, according to this report from the Boards of Appeal of the European Patent Office from 21 April 2004, dealing with a Hitachi business method patent on an ‘automatic auction method’. The claims of that patent application (97 306 722.6) covered the algorithm of performing an auction over a computer network using client-server technology. The actual nature of this patent isn’t important, anyway — but what is important is how the Boards of Appeal judge its ‘technical’ characteristics.

The key section is 3.7, where the Board writes:

For these reasons the Board holds that, contrary to the examining division’s assessment, the apparatus of claim 3 is an invention within the meaning of Article 52(1) EPC since it comprises clearly technical features such as a “server computer”, “client computers” and a “network”.

So in other words, if the idea of a computer network is involved in the claims of a patent, it ‘includes technical aspects’. It then goes on to discuss other technical characteristics that may appear in patents:

The Board is aware that its comparatively broad interpretation of the term “invention” in Article 52(1) EPC will include activities which are so familiar that their technical character tends to be overlooked, such as the act of writing using pen and paper.

So even writing with a pen and paper has technical character!

It’s a cop-out, designed to fool MEPs and citizens into thinking that a reasonable limitation is being placed on what can be patented, when in reality there are effectively no limits if any kind of equipment is involved beyond counting on your fingers.

The only way to be sure is to ensure the directive as it eventually passes is crystal clear on this point, with the help of the amendments that the pro-patent side are so keen to throw out.

(BTW, I found this link via RMS’ great article in the Guardian, where he discusses software patenting using literature as an analogy. Recommended reading!)

Latest Script Hack: utf8lint

Perl: double-encoding is a frequent problem when dealing with UTF-8 text, where a UTF-8 string is treated as (typically) ISO Latin-1, and is re-encoded.

utf8lint is a quick hack script which uses perl’s Encode module to detect this. Feed it your data on STDIN, and it’ll flag lines that contain text which may be doubly-encoded UTF-8, in a lintish way.
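The underlying heuristic is simple: valid UTF-8 that still decodes as UTF-8 after a Latin-1 round trip has probably been encoded twice. Here’s a rough sketch of that check in Python rather than Perl (the function name and exact heuristics here are mine, not utf8lint’s):

```python
def looks_double_encoded(line: bytes) -> bool:
    """Flag lines that may be doubly-encoded UTF-8: the bytes decode as
    UTF-8, and the result survives a Latin-1 -> UTF-8 round trip, which
    is the signature of UTF-8 text that was misread as Latin-1 and
    re-encoded."""
    try:
        text = line.decode("utf-8")
    except UnicodeDecodeError:
        return False          # not valid UTF-8 at all
    if text.isascii():
        return False          # pure ASCII can't be double-encoded
    try:
        text.encode("latin-1").decode("utf-8")
    except (UnicodeEncodeError, UnicodeDecodeError):
        return False          # round trip fails: probably single-encoded
    return True
```

Run it line-by-line over STDIN and print the line numbers of matches, and you get the lint-style behaviour described above.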

BSA Spams Patent Holders

Patents: An anonymous contributor writes:

‘I just received this letter and these pre-addressed postcards in the post this morning. I was surprised when I saw the envelope, because I’d never received anything from the BSA before. It turned out that they had extracted my name and address from the European Patents database, because I registered a software patent once. So a lot of these letters have probably been sent out.

According to the letter, from Francisco Mingorance, the draft directive is being turned around to ‘rob small businesses of their intellectual property assets’.

I find it hard to see how that could be true. However the BSA’s letter has an important message you should heed – it is critical to contact your European representatives (your MEP and your country’s Commissioner) within the next two weeks. Let them know that the European Union should curtail software patents once and for all.

Get out your best stationery and write to your MEP at the address given on this page.

Make sure your message is short and clear. SMEs don’t benefit from patents. Few patents are held by SMEs, and the cost of applying for, maintaining and defending them is crippling.’

jm: I would suggest noting that you support the position of rapporteur Michel Rocard MEP, and/or the FFII — details here. Please do write!

BTW, the contributor also offers: ‘if anyone is interested in doctoring up the BSA postcards, I can provide the hi-res scans.’ ;)

Amazing article series on Climate Change

Science: in April and May, the New Yorker printed an amazing series of articles on climate change by Elizabeth Kolbert, full of outstanding research and interviews with the key players.

Unlike much coverage, it includes the expected results of climate change in the US:

Different climate models offer very different predictions about future water availability; in the paper, Rind applied the criteria used in the Palmer index to GISS’s model and also to a model operated by NOAA’s Geophysical Fluid Dynamics Laboratory. He found that as carbon-dioxide levels rose the world began to experience more and more serious water shortages, starting near the equator and then spreading toward the poles. When he applied the index to the GISS model for doubled CO2, it showed most of the continental United States to be suffering under severe drought conditions. When he applied the index to the G.F.D.L. model, the results were even more dire. Rind created two maps to illustrate these findings. Yellow represented a forty-to-sixty-per-cent chance of summertime drought, ochre a sixty-to-eighty-per-cent chance, and brown an eighty-to-a-hundred-per-cent chance. In the first map, showing the GISS results, the Northeast was yellow, the Midwest was ochre, and the Rocky Mountain states and California were brown. In the second, showing the G.F.D.L. results, brown covered practically the entire country.

‘I gave a talk based on these drought indices out in California to water-resource managers,’ Rind told me. ‘And they said, ‘Well, if that happens, forget it.’ There’s just no way they could deal with that.’

He went on, ‘Obviously, if you get drought indices like these, there’s no adaptation that’s possible. But let’s say it’s not that severe. What adaptation are we talking about? Adaptation in 2020? Adaptation in 2040? Adaptation in 2060? Because the way the models project this, as global warming gets going, once you’ve adapted to one decade you’re going to have to change everything the next decade.

And how the anti-climate-change side are attempting to control US public opinion:

The pollster Frank Luntz prepared a strategy memo for Republican members of Congress, coaching them on how to deal with a variety of environmental issues. (Luntz, who first made a name for himself by helping to craft Newt Gingrich’s ‘Contract with America,’ has been described as ‘a political consultant viewed by Republicans as King Arthur viewed Merlin.’) Under the heading ‘Winning the Global Warming Debate,’ Luntz wrote, ‘The scientific debate is closing (against us) but not yet closed. There is still a window of opportunity to challenge the science.’ He warned, ‘Voters believe that there is no consensus about global warming in the scientific community. Should the public come to believe that the scientific issues are settled, their views about global warming will change accordingly.’

They’re a great synthesis. Go read the articles — part 1 (‘Disappearing islands, thawing permafrost, melting polar ice. How the earth is changing’), part 2 (‘The curse of Akkad’), and part 3 (‘What can be done?’). They’re long, but if you’re still on the fence about this one, they’ll wake you up.

Bayesian learning animation

Spam: via John Graham-Cumming’s excellent anti-spam newsletter this month comes a very cool animation of the dbacl Bayesian anti-spam filter being trained to classify a mail corpus.

And Laird’s explanation:

dbacl computes two scores for each document, a ham score and a spam score. Technically, each score is a kind of distance, and the best category for a document is the lowest scoring one. One way to define the spamminess is to take the numerical difference of these scores.

Each point in the picture is one document, with the ham score on the x-axis and the spam score on the y-axis. If a point falls on the diagonal y=x, then its scores are identical and both categories are equally likely. If the point is below the diagonal, then the classifier must mark it as spam, and above the diagonal it marks it as ham.

The points are colour coded. When a document is learned we draw a square (blue for ham, red for spam). The picture shows the current scores of both the training documents, and the as yet unknown documents in the SA corpus. The unknown documents are either cyan (we know it’s ham but the classifier doesn’t), magenta (spam), or black. Black means that at the current state of learning, the document would be misclassified, because it falls on the wrong side of the diagonal. We don’t distinguish the types of errors. Only we know the point is black, the classifier doesn’t.

At time zero, when nothing has been learned, all the points are on the diagonal, because the two categories are symmetric.

Over time, the points move because the classifier’s probabilities change a little every time training occurs, and the clouds of points give an overall picture of what dbacl thinks of the unknown points. Of course, the more documents are learned, the fewer unknown points are left.

This is an excellent visualisation of the process, and demonstrates nicely what happens when you train a Bayesian spam-filter. You can clearly see the ‘unsure’ classifications becoming more reliable as the training corpus size increases. Very nice work!

It’s interesting to note the effects of an unbalanced corpus early on; a lot of spam training and little ham training results in a noticeable bias towards the classifier returning a spam classification.
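The decision rule Laird describes reduces to a few lines of code. A minimal sketch (the function names are mine; dbacl itself just reports the raw scores):

```python
def classify(ham_score: float, spam_score: float) -> str:
    """Each dbacl score is a distance; the best category is the lower one.
    On the plot, ham score is x and spam score is y, so points below the
    diagonal y = x come out as spam. A point exactly on the diagonal is
    ambiguous (both categories equally likely); it's treated as ham here."""
    return "spam" if spam_score < ham_score else "ham"

def spamminess(ham_score: float, spam_score: float) -> float:
    """One way to define spamminess: the numerical difference of the two
    distances. Positive means the document is closer to the spam category."""
    return ham_score - spam_score
```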

Flickr as a ‘TypePad service for groups’

Web: a while back, I posted some musings about a web service to help authenticate users as members of a private group, similarly to how TypeKey authenticates users in general.

Well, Flickr have just posted this draft authentication API which does this very nicely — it now allows third-party web apps to authenticate against Flickr, TypeKey-style, and perform a limited subset of actions on the user’s behalf.

This means that using Flickr as a group authentication web service is now doable, as far as I can see…

DVD annoyances

Hardware: I’ve been needing a decent backup solution, since I’ve got 60GB of crud on my hard disk that isn’t being rsynced offsite yet. So I bought myself a nifty DVD writer from woot.com a week ago, supporting DVD+RW, DVD+R, DVD-RW, and DVD-R, and a spindle of 20 DVD+Rs from Target. Little did I realise the world of pain I was entering.

Did you know there are no less than 6 barely-compatible DVD formats? Prerecorded DVD, DVD-RAM, DVD-R, and DVD-RW, from the DVD Forum, and DVD+RW and DVD+R, from the ‘DVD+RW Alliance’. Interoperability is, needless to say, a total mess, even with the Sony 4-format drive I picked up.

I eventually managed to burn myself a DVD+R backup of bits of my home dir, making several coasters in the process (DVD+Rs apparently do not support simulated-write dry-runs, at least not with growisofs). So, great!

Next thing to do was try it out on my laptop’s internal CD/DVD drive to make sure it worked. Needless to say, it didn’t.

Apparently, single-session, single-track DVD+Rs are virtually identical to DVD-ROMs, which most generic DVD-reader drives support. However, Sony drives do not support setting the ‘book type’ bits, which is the trick that turns a DVD+R ‘into’ a DVD-ROM-compatible disc. Guess why (hint: it’s Sony). Yep, that’s right, paranoia about piracy. Well, thanks a bunch, Sony — my backups are now of decidedly limited usefulness, since I don’t know if I’m ever going to be able to read them again! (more info from the OSTA.) I think I now see why Woot were flogging them cheap.

I’m not sure where to go with this — do I have a spindle of 17 shiny frisbees? I have a very nasty feeling I’m heading into dead media territory here. What a mess…

Aaaanyway. Here are some possibly-useful bookmarks.

OTOH, I got to watch the BBC’s new documentary, The Power of Nightmares, a fantastic history of the two parallel ideological worlds of al-Qaeda and the US neo-conservatives. Mind-boggling, but highly recommended.

European swpat update letter

Patents: Ian Clarke copied the FSFE-IE mailing list with a good mail he sent to Mairead McGuinness MEP, detailing the current state of proposed fixes to the European software patenting directive. He discusses a comment from an Ericsson employee asking for software patentability:

It may be the case that this employee was concerned about Ericsson’s ability to compete against smaller competitors if Ericsson cannot use software patents against them. I would argue that it is not the responsibility of any EU institution to protect Ericsson against legitimate competition from other companies, indeed competition must be encouraged. Software patents will have a stifling effect on competition in Europe, and this is why some large companies like Ericsson are strong advocates for this directive.

And a brief overview of the amendments we want:

The Foundation for a Free Information Infrastructure, an organisation whose line we endorse, has prepared an analysis of the amendments, indicating which will help to ensure that software does not become patentable, and which will not. This document may be downloaded here.

In particular, we support the position and amendments of Piia Noora Kauppi MEP, who has taken a strong position against the introduction of software patents within the EPP group, and also the position of Michel Rocard MEP who is the rapporteur for this Directive.

The only other thing it misses, in my opinion, is a paragraph discussing the ‘as such’ loophole that has been heavily relied upon by most pro-swpat politicians recently — the trick of saying ‘this directive does not permit software patenting, as such‘.

Indeed, it does not permit patenting of all software techniques, but it does permit the patenting of software techniques as long as they are of ‘a technical nature’ — without defining what that means. Given that it’s clearly arguable that all software is technical, and since patent offices earn money based on the patents they accept rather than those they reject, this is a loophole the size of a bus. Many of the desired amendments concern cleaning up this obvious omission.

Anyway, here’s the full text of Ian’s mail from the list archive.

Dot-coms and geographical insularity

Web: I caught sight of this posting (8 June 2005, Interconnected) on the geographical insularity of the dot-com boom. A good read:

The huge influx of cash at the turn of the millennium led to the whole Web being built in the image of the Bay area. The website patterns that started there and – just by coincidence – happened to scale to other environments, those were the ones that survived.

Lots to think about. He’s spot on, of course — many of the web’s big commercial success stories are almost shamelessly US-oriented, and if they work outside that, it’s purely by accident.

I’d love to see more web businesses that work well for other parts of the world, but that’ll take money — and from what I saw in Dublin, the money either (a) just isn’t there, or (b) frequently goes to the companies that talk the talk, but then piddle it away on ludicrous ‘e-business architectures’ and get nothing useful out the other end.

On both counts, Silicon Valley has an ace up its sleeve. The VCs are smart and well-funded, and the developers have experience, and know which tools are right for the job.

I’d be curious to hear how other high-tech hotspots in the US (Boston, for example) find this.

IBM patents web transcoding proxies

Web: I link-blogged this, but it’s generated some email already, so it deserves a proper posting.

One thing you quickly learn about IBM where software patents are concerned is that if IBM Research is making noise about a new software technique, they’ve probably patented it already. A few years ago, IBM was keen on HTTP transcoding — rewriting web content in a proxy to make it more suitable for display and access on less-capable devices, like PDAs and mobile phones.

So I probably should not have been surprised today when I came across USPTO patent 6,886,013, which is an IBM patent on a ‘HTTP caching proxy to filter and control display of data in a web browser’. It was applied for on Sep 11 1997, and finally granted on Apr 26 of this year.

The first claim covers:

  1. A method of controlling presentation on a client of a Web document formatted according to a markup language and supported on a server, the client including a browser and connectable to the server via a computer network, the method comprising the steps of:

    as the Web document is received on the client, parsing the Web document to identify formatting information;

    altering the formatting information to modify at least one display characteristic of the Web document; and

    passing the Web document to the browser for display.

Notice that there’s actually no mention of a HTTP proxy there — in other words, an in-browser rewriting element, such as Greasemonkey or Trixie, may be covered by that claim. However, the claim does indicate that the document is passed from the ‘client’ to the ‘browser’, so perhaps having the ‘client’ inside the ‘browser’ evades that.

It appears this really wasn’t original research even when the patent was applied for — there’s probable prior art, even if the patent itself doesn’t cite it. For example, WWW4 in 1995 included Application-Specific Proxy Servers as HTTP Stream Transducers, which discusses ‘transduction’ of the HTTP traffic and gives an example of ‘A “rewriting” OreO (transducer element) that encapsulates each anchor inside the Netscape Blink extension, making anchors easier to spot on monochrome displays’. On top of that, Craig Hughes notes that his ‘senior project at Stanford in 1992 was an implementation of a content-modifying HTTP proxy. It re-worked HTML in http streams to add some markup to enable full navigability through touch screen or voice control, for screen-only kiosks.’
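To see how little machinery the claim language actually requires, here is a toy rewriting element in the spirit of the 1995 ‘OreO’ transducer example: it parses out some formatting, alters one display characteristic, and passes the document on (my own illustration, not code from the patent or the WWW4 paper):

```python
import re

def transduce(html: str) -> str:
    """Wrap each anchor in <blink>, the OreO example's trick for making
    links easier to spot on monochrome displays: parse the markup to find
    formatting, alter a display characteristic, and pass the document on
    for display."""
    return re.sub(r"(<a\b.*?</a>)", r"<blink>\1</blink>",
                  html, flags=re.IGNORECASE | re.DOTALL)
```

That is the whole of claim 1 in miniature: parse, alter, pass on.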

Add this to the ever-growing list of over-broad software patents.

Getting JuK to output sound via ALSA

Linux: Linux sound is still a mess. Due to the ever-changing ‘sound server of the week’ system used to decide how an app should output sound, it’s perfectly possible to have 3 apps on your desktop happily making noise at the same time, while another app complains about requiring exclusive access to /dev/dsp — or worse, hangs silently while it attempts to grab an exclusive lock on the device.

This page gives a reasonably good guide to getting software mixing working across (virtually) all apps, using ALSA software mixing and esd.

However, some cases are still very kludgy — in particular, JuK, the excellent KDE mp3 jukebox app, has a tendency to play poorly with others, requiring playback via no fewer than two sound servers — artsd and esd — to work correctly in the above setup. In addition, the support for mp3 files in artsd is buggy — it’s frequently unable to open certain mp3s, depending on how they were encoded.

Well, good news — the current release of JuK now supports direct playback from GStreamer via ALSA. Here’s how: by adding these lines:

[GStreamerPlayer]
SinkName=alsasink

to ~/.kde/share/config/jukrc, you can skip sending JuK mp3 playback via 2 sound servers, and just play directly to the hardware from the mp3 player. An improvement! Not quite optimal, and certainly not user-friendly — but getting there…

Patents come to computer gaming

Patents: in a recent discussion about games and patents, it emerged that these common elements are patented:

Looks like software patenting is coming to computer games in a big way. I’m not sure how any game on a modern platform can avoid the ‘streamed loading’ patent.

Naturally, I can remember playing games on the Commodore 64 in the 1980s that included these…

Yet another non-smoking weblog

Life: seeing as yesterday was World No Tobacco Day, it’s worth noting that I gave up smoking last Thursday.

This is the first time I’ve taken the step of quitting with any seriousness. I’ve been smoking since I was 18 or 19, without any real attempts to quit before now. It was a gradual process, but imagining a smoker’s future, with the diseases and reduced life expectancy it involves, made quitting seem quite sensible in the end. So far, it’s going pretty well — lots of occasional pangs, but nothing I can’t say no to… especially with the aid of Liquorice Altoids. Wish me luck!