Centralisation dressed as federalism?



There seems to be an assumption on all sides that the Smith Commission’s rather lacklustre devolution proposals (which, one will recall, are still only proposals) will be revisited in the current Parliament. It is in this spirit that the Bingham Centre for the Rule of Law has produced A Constitutional Crossroads: Ways Forward for the United Kingdom. It suggests further devolution of powers to Scotland – though without joining the bunfight over precisely which powers should be devolved – and a ‘new Act of Union’ to formalise the relationship between the devolved institutions and the UK.

One of the more contentious proposals in the report is for any new federalist settlement to include restrictions on ‘secession referendums’ (the choice of nomenclature perhaps showing the viewpoint from which the authors approached the issue) – which might reasonably be described as a UK version of Canada’s Clarity Act. Adam Tomkins, one of the report’s authors, argues that:

Self-determination should be subject to the rule of law, just like any other process of constitutional governance. They need a clear legal basis and the law should provide for such matters as who may vote and how frequently such referendums may be held. “Once in a generation” should be a matter of law, not a matter for the First Minister or her predecessor randomly to determine.

The question that inevitably arises is one of legitimacy: if voters are not allowed to determine the frequency of referendums at the ballot box (by voting, or not, for a party proposing one), then by what alternative mechanism would such a restriction gain democratic legitimacy? The authors repeatedly point out – correctly – that constitutional changes in a constituent country of the UK require (as a matter of political reality, not of law) the consent of the people of that country, and yet propose that a major constitutional change should be made without such consent even being sought.

It is perhaps stating the obvious to point out that this would not go down at all well in Scotland. Indeed the lack of consent would be explicit: it would certainly be opposed by an overwhelming majority of MPs from Scotland, and there would probably also be a resolution in the Scottish Parliament condemning the idea. It is perhaps even possible that, if the time limit idea were taken up by the UK government, it might constitute the ‘material change in circumstances’ which Nicola Sturgeon said would be required for her party to seek a mandate for a new independence referendum.

Why I switched to WordPress.com



I’ve been with NSDesign for as long as I’ve had this site – since 2000 in fact. Lately though I’d been noticing a steady decline in performance – the admin interface could sometimes take 4 or 5 seconds to open, and tests with site performance diagnostics didn’t produce encouraging results either. This doesn’t have any obvious cause, as I hadn’t written any posts lately (hence traffic was infinitesimal) and only had five user-facing plugins installed. It might just be my imagination, but I have the impression that this decline in performance started when NSDesign spun off their hosting service to a separate company (‘Broadband Cloud Solutions’) – though it could simply be down to running PHP in CGI mode, which the WordPress Codex article on performance regards as a Bad Thing.

I’d half considered (even before deciding to change hosting) using something like a static site generator – the idea being that instead of storing your posts in a database and having script files of some sort to generate pages from that, you write the posts as Markdown or reStructuredText and then use the generator to turn that into a fully functional set of HTML files for a blog. The obvious benefit to this is that the web server doesn’t have to do anything difficult: it just gives each visitor a copy of the static page they requested. Since I’ve done some Python development before, Pelican would have been an obvious fit. The problem though was that although this is obviously a blog, the capabilities of static site generators as a category didn’t meet my particular use case:

  • You obviously can’t do anything interactive, given that you’re using static files. There are ways of getting around some of these limitations, but I’m not a particular fan of Disqus and wanted to avoid using it if possible
  • You lose the ability to update your site from any computer, whether a desktop, a laptop, or a mobile device. In fact you can only really update from a machine where you have administrative rights, a command line, and permission to install software.
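For all those limitations, the core idea of static generation is appealingly simple: transform source files into finished HTML once, at build time. Purely as an illustration (real generators like Pelican handle templates, feeds, tags, and proper Markdown), a static generator in miniature might look like this:

```python
# Toy static site generator: turns "posts" into standalone HTML files,
# the way Pelican or Jekyll do at much larger scale. Illustrative only --
# the "conversion" here just wraps each line in a paragraph tag.
import html
import pathlib
import tempfile

TEMPLATE = ("<!DOCTYPE html><html><head><title>{title}</title></head>"
            "<body><h1>{title}</h1>{body}</body></html>")

def render(title: str, text: str) -> str:
    # Each non-empty source line becomes an HTML paragraph.
    paragraphs = [f"<p>{html.escape(line)}</p>"
                  for line in text.splitlines() if line.strip()]
    return TEMPLATE.format(title=html.escape(title), body="".join(paragraphs))

def build(posts: dict, out_dir: pathlib.Path) -> list:
    # posts maps slug -> (title, body text); one HTML file per post.
    out_dir.mkdir(parents=True, exist_ok=True)
    written = []
    for slug, (title, text) in posts.items():
        page = out_dir / f"{slug}.html"
        page.write_text(render(title, text), encoding="utf-8")
        written.append(page)
    return written

out = pathlib.Path(tempfile.mkdtemp())
pages = build({"hello": ("Hello", "First post.\n\nStatic files are fast.")}, out)
print(pages[0].read_text(encoding="utf-8"))
```

The web server then has nothing to do but hand over those files – which is exactly why static sites survive traffic spikes that would flatten a dynamic CMS.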

Since I knew I definitely didn’t want to go down the cloud VPS route (I just didn’t want that sort of hassle for a personal site), that excluded the various Django-based solutions I might otherwise have gone for. Ultimately then, with Textpattern not having advanced hugely and Ghost relying on a VPS-type setup, I knew I would end up using some form of WordPress. Installing WordPress yourself on a hosting account has the obvious advantage that you get full control over your WordPress install. You can use any theme you want, edit its structure however you want, and install any plugin you want. WordPress.com doesn’t allow you to do any of these things: you can only select themes from a gallery of presets, while installing plugins is verboten.

What you get in return for that loss of flexibility is actually quite good though. Premium gets you a total of 13GB of storage (everyone gets 3GB, buying Premium gets you an extra ten), which is roughly comparable to what the more reputable shared hosting providers offer at this price point. You also lose the ads that occasionally appear on free sites and gain the ability to modify the CSS (but not the structure) of whatever theme you end up choosing. The key advantage – and what ultimately persuaded me to go with WordPress.com – is that everything else is unmetered. You don’t need to worry about bandwidth (‘unlimited’ bandwidth shared hosting for £2 per month? Pull the other one!) or CPU usage (the real killer) – if your site gets Slashdotted then it just continues to work, rather than slowing down to a crawl, eventually dying, and getting your account suspended.

UPDATE: I actually wrote most of this as a draft before I ported the site over. Unfortunately the changeover went a lot less smoothly with my old host than would probably have been the case with their pre-outsourcing customer support team. I filed a fairly simple support ticket asking for the nameservers on the domain to be changed to the WordPress.com ones, and nothing happened except a query the next day effectively asking if I knew whether changing the nameservers meant the nameservers would be changed. I confirmed that I did, and still nothing happened. Perhaps I’m being unreasonable, but I don’t think it should take more than 24 hours to copy-paste three lines into a domain config – even at the weekend! It’s a shame that a 15-year relationship with a company had to end with transferring the domain to a different registrar in order to get something so simple sorted.
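For the curious: the ‘three lines’ in question are just NS records at the registrar. As an illustrative sketch (using a placeholder domain – WordPress.com publishes its own nameserver hostnames, which at the time were ns1–ns3.wordpress.com), the delegation looks something like this in zone-file notation:

```
example.com.   86400   IN   NS   ns1.wordpress.com.
example.com.   86400   IN   NS   ns2.wordpress.com.
example.com.   86400   IN   NS   ns3.wordpress.com.
```

Most registrar control panels present this as three text boxes rather than a zone file, which is rather the point: it is a thirty-second job.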

How to use iTunes (rather than the buggy VLC) to play the BBC’s HLS streams



A few months ago the BBC discontinued its old MP3 radio streams in favour of the more modern HTTP Live Streaming (HLS) format. For PC users (unlike users of dedicated internet radios) there is no great technological problem, though the BBC haven’t done a great job of publicising the addresses of the new streams. Presumably they want us all to use their disappointing Flash-based ‘iPlayer radio’ web player. The main obstacle is that VLC (normally the go-to media player for all sorts of unusual formats) has a bug that causes very high CPU usage when listening to HLS streams – and the lead developer has refused even to attempt to fix it, for the sole reason that the person reporting the bug doesn’t know how to produce a full stack trace!

But enough about obnoxious people on web forums: the key question is how to play the streams without VLC. As odd as it might seem, the only other software I’ve come across that can play them is QuickTime. Using the QuickTime Player itself might be a bit of a pain in the neck, but luckily you can also use iTunes (because QuickTime is what iTunes uses to actually play video and audio). To do this you need to get the URL of the stream (such as this one) and change the protocol (the bit at the beginning before the ://) from http to itls. You then go to File -> Open Stream in iTunes and copy-paste your modified URL into the box.
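The protocol swap is trivial enough to script. A minimal sketch – note that the stream address below is a made-up placeholder, not a real BBC URL:

```python
# Convert an HLS playlist URL into the itls:// form that iTunes'
# File -> Open Stream dialog accepts (QuickTime handles the playback).
def to_itunes_url(hls_url: str) -> str:
    scheme, sep, rest = hls_url.partition("://")
    if sep != "://" or scheme not in ("http", "https"):
        raise ValueError(f"not an http(s) URL: {hls_url!r}")
    return "itls://" + rest

# Placeholder address -- substitute the real stream URL here.
print(to_itunes_url("http://example.com/radio/stream.m3u8"))
# itls://example.com/radio/stream.m3u8
```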

This is, of course, not a very slick or user-friendly process. To avoid all that, you can use the links below.

NB: These are the UK streams (they include sports coverage for example) so they won’t work elsewhere.

Blocking the Wikipedia fundraising ad banner



Every year Wikipedia runs a fundraising drive to help pay its significant operating costs. During that drive the normal ‘no adverts’ rule is suspended in order to display a fundraising ad banner, which goes beyond usual adverts by actually including a payment form within the advert itself! One can take positions for and against this idea (I don’t particularly want to get into that argument here), but this year’s implementation seems to be more obnoxious than usual – taking up half or more of the available vertical screen space on some devices. If you want to block the banner, you can do so by adding the following text as a filter rule to your Adblock variant of choice.


Giffgaff jumps the shark with 66% price increase



Update: 24 hours of what Three passes off as customer service have made me much more amenable to the Giffgaff price hike

Update 2: As jo pointed out in the comments, existing customers get a hike to £15 (matching Three) rather than £18

One of the factors hampering smartphone adoption was always the lack of unlimited data contracts: 1GB is actually a very small amount of data – watching a single TV programme on the train home could exceed it – so if that’s your limit for an entire month then you effectively don’t have a mobile internet connection. Giffgaff, an MVNO using the O2 network, changed this by offering unlimited data for £10 a month – a price they achieved by only offering ‘SIM free’ deals rather than subsidised phone purchases. This became £12/month about a year ago – even then, there were still persistent problems with network outages and non-existent data connections.
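A back-of-the-envelope check on that ‘one TV programme’ claim – the 1.5 Mbps stream bitrate below is an assumed, illustrative figure, not a measured one:

```python
# How far does a 1GB monthly allowance go against a mobile video stream?
def stream_megabytes(bitrate_mbps: float, minutes: float) -> float:
    # megabits per second * seconds -> megabits; divide by 8 for megabytes
    return bitrate_mbps * minutes * 60 / 8

one_hour = stream_megabytes(1.5, 60)   # one hour-long programme
print(f"{one_hour:.0f} MB used; {1024 - one_hour:.0f} MB of a 1GB cap left")
```

At that assumed bitrate a single hour of video eats roughly two-thirds of the month’s allowance, which is why a 1GB cap amounts to not really having mobile internet at all.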

All of this should make the recent announcement on changes to ‘Goodybags’ (the name for the company’s calls+texts+data addons) less surprising than it seems to be for a lot of Giffgaff customers. To get unlimited data will now cost those who remain with Giffgaff £20/month, while the £12/month deal will revert to the pathetic 1GB/month that was the main reason most customers left their old network in the first place.

The obvious question then becomes whether there’s any point to Giffgaff as a network: would Ryanair, for example (because it has the same ‘cheap price, no frills’ business model as Giffgaff), still have customers if it announced it was increasing its prices by 66% without changing its service levels? The elephant in the room here is Three, which offers the same ‘truly unlimited’ data as Giffgaff on its ‘All in One £15’ plan – 25% cheaper than Giffgaff’s new pricing, and undoubtedly with more reliable mobile internet.

Of course there are one or two things you’d need to replace when making the switch: Giffgaff offers unlimited calls to other people on the same network, whereas Three doesn’t. Adapting to this is by no means a mammoth task though – requiring a whole three clicks for Android users for example. Similarly Three offers 3,000 texts per month on its deal whereas I’m fairly sure the limit on Giffgaff is 5,000. So those of you who send more than 100 SMS messages per day (there must be some out there) might have to resort to instant messaging – again, hardly insurmountable.

One would have thought that such a dramatic price hike would have merited more explanation than 11 slides posted to a message forum.

Useful links:


Olympus 15mm f8 body cap lens



My most recent photographic acquisition is the quirky Olympus body-cap lens. While mirrorless cameras are well known for having more compact lenses than DSLRs (though the excellent Pentax DA Limiteds come pretty close), this lens takes it to new and intentionally-silly extremes. It has a front element you very nearly need a magnifying glass to see, and it protrudes no further from the camera body than an actual non-photo-taking body cap. This, the price, and the f8 maximum aperture should tell you that it isn’t going to win any awards for image quality. It certainly isn’t as bad as some of its detractors claim though: Photozone’s test results show that it has pretty good centre resolution, as well as better distortion than any Olympus or Panasonic prime between 12 and 25mm (!). Of course there is a fair amount of chromatic aberration (fixable in Lightroom, right?), and the less said about the corner resolution the better, but every lens involves tradeoffs and border resolution is the tradeoff for this lens.

As Ming Thein highlights, the simplicity of the lens is not a ‘downside’ but is in fact the whole point: you don’t need to worry about – and spend time on – choosing your AF point or aperture. Just set it to the infinity mark, and at 15mm f8 just about everything will be in focus anyway – particularly on micro 4/3. What it amounts to is a fun additional option to have in your photographic kit – where its limitations are part of the charm and its unique ‘close-up or infinity’ focusing part of the challenge. It takes up the same space as the body cap, meaning that in essence it becomes a permanent wide-angle option in your camera bag: on days that you don’t have room for the kit lens, or even room for the 12mm f2, you will still have room for the BCL-15. As the shots I’ve posted below hopefully demonstrate, you can get good images out of this lens if you compose with its capabilities in mind and post-process appropriately.
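The ‘everything in focus’ claim is easy to sanity-check with the standard hyperfocal-distance formula, taking a circle of confusion of around 0.015mm for Micro Four Thirds (an assumed, commonly quoted value):

```python
# Hyperfocal distance: H = f^2 / (N * c) + f
#   f: focal length (mm), N: f-number, c: circle of confusion (mm)
def hyperfocal_mm(f: float, n: float, c: float) -> float:
    return f * f / (n * c) + f

h = hyperfocal_mm(15, 8, 0.015)   # BCL-15 at f8 on micro 4/3
print(f"hyperfocal distance: {h / 1000:.2f} m")
# Focused at the hyperfocal distance, everything from H/2 to infinity
# is acceptably sharp -- hence "set it to infinity and forget it".
```

With these numbers the hyperfocal distance works out to just under 2 metres, so anything from about a metre to infinity is acceptably sharp – which is why the fixed-everything design is workable in practice.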


Bygone era

PS: Don’t read anything into the fact that these shots are both black and white – the colour rendition of the lens is fine!

On comments



In my prolonged absence from blogging, I seem somehow to have missed a debate (bordering at times on a heated argument) as to whether comments remain the central feature they were when blogs first became popular. A post I read elsewhere (on the anti-comments side of things) was particularly thought-provoking, and since it was the spark that led to me opting for WordPress over Ghost – largely due to the latter’s lack of comments – I thought it would be a good idea to write a response of sorts on here.

The most frequent argument against enabling comments is in terms of moderation: you have to prevent spam, and you also have to moderate for content/tone to prevent the comments section becoming a cesspool of trolls. There isn’t really a counter-argument to this, because it’s true: sites with comments have to deal with comment spam. Though the rate of this isn’t as bad as it used to be, and the anti-spam filters are much better than they used to be, you do still have to deal with the occasional spam comment that gets through – whether submitted by a bot or a human. Similarly, trolls are a fact of life in any online discussion – and were part of the reason Popular Science shut down comments on their articles. I suppose it comes down to the amount of trolling a particular site has to deal with relative to the amount of time its authors have available to deal with the issue: a newspaper website with a paid moderating staff can handle trolls more easily than an individual blogger can.

There’s also a question of site performance: the most common method of supporting blog comments is to stick a ‘cloud comments’ widget like Disqus or Livefyre at the bottom of the page. Even national newspapers like the Telegraph (and even the paywalled Times) are doing this, despite the problems caused by iframes. This can cause delays in page loads, as well as privacy concerns given that such widgets can track users (even unregistered ones) from one site to another. If anyone with a privacy plugin (Ghostery etc.) won’t see your blog comments in the first place, you have a problem. The privacy problem (and the performance one) can, however, be solved by not relying on cloud widgets for comment functionality: with the notable exception of Ghost, all the major self-hosted (WordPress.org, Movable Type, Textpattern) and service-hosted (WordPress.com, Blogger, Tumblr) blogging platforms support on-site comments.

Added to concerns over workload and technical issues, Hugh Rundle makes two arguments for turning off blog comments on principle (my term, not his). The first is that in a world where social networks are a thing (they weren’t really when blogging got started 10 years ago), the walled garden of one’s own blog comments represents a sub-optimal means of interaction with readers. On this view, encouraging readers to respond on social networks instead has the potential to widen the scope of the discussion (and the audience for the site) to a great many people who wouldn’t otherwise have even read the post.

It’s obviously true that readers sharing your links on social networks has the potential to greatly increase your audience. Other than rants about politics, Twitter’s main utility is as a social news/link-sharing service – a Delicious for the 2010s, if you like. Similarly, encouraging people to comment directly on social media (rather than expecting them to trust your blog with their Facebook/Twitter login) would lead to greater engagement from people who might otherwise be reluctant to involve themselves in a debate. There’s no reason though why this should be mutually exclusive with blog comments, and the format of most social networks makes them impractical for the sort of reasoned, extended discussion that blog authors – whether for or against comments – really want.

With Twitter the character restriction is so obvious that it seems almost silly to mention it, but it is an issue in terms of site feedback. With 140 characters (minus 20 or so for the shortened URL, and say another 10 for an @reply to the author) you really don’t have room for any meaningful response at all – Twitter is great for sharing links to interesting pages, but the character restriction means that feedback is restricted to the very one-liners that are cited as a criticism of blog comments.

Facebook doesn’t have the same size restrictions, but even there the condensed format of the news feed (about 1/4 of the screen width) means that anything beyond a paragraph starts to look less than optimal (and probably falls into the TL;DR category for most people). Assuming you aren’t running a parallel Facebook fan page for your website, you also start to run into issues of privacy and post visibility with Facebook: if I share a link (with a hopefully-cogent response) to a blog post on my Facebook feed, my friends will see it but there’s an overwhelming chance that the blog author won’t – and nor will I be able to access his/her FB page to post it there.

Hugh’s second argument is that if comments are turned off then other bloggers will instead write a response post on their own blog. I’m not sure that this really holds true in terms of improving the quality of the debate – though I’m aware of the irony in seeking to contest this suggestion while doing exactly what he describes! Anyone who has their own blog will already prefer to write longer responses on their own site (rather than on someone else’s comment section), but I don’t think it’s necessarily true to divide responses into ‘post-length’ and ‘not-useful’: a paragraph or two might well be a useful and thought-provoking contribution to the debate, but it wouldn’t be worth a post on its own. So too contributions from non-bloggers: if someone doesn’t have the inclination to write their own blog, they’re unlikely to start simply because comments are turned off on a site they want to reply to.

As for this blog? Comments are turned on, but I’d encourage you to start your own blog and respond that way if you’re at all inclined. Hit up WordPress.com and it takes 30 seconds to get started with a blog – no technical knowledge needed, no maintenance hassle. If you link to one of my posts, I’ll automatically get notified (via the trackback/pingback system that both WordPress.com and self-hosted WordPress support) and my site will link back to yours.
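Under the hood, a pingback is just an XML-RPC call to the target blog’s endpoint (WordPress exposes it at /xmlrpc.php). As a sketch using Python’s standard library – the URLs below are placeholders, and the request is built but deliberately not sent:

```python
# Build (but don't send) a pingback.ping XML-RPC request, as defined by
# the Pingback protocol that WordPress implements.
import xmlrpc.client

source = "https://example.org/my-response-post"   # placeholder: your post
target = "https://example.com/original-post"      # placeholder: the post you linked

payload = xmlrpc.client.dumps((source, target), methodname="pingback.ping")
print(payload)
# To send it for real you would POST this payload to the endpoint the
# target page advertises (e.g. in its X-Pingback response header),
# typically via xmlrpc.client.ServerProxy.
```

Both blogs handle all of this automatically, which is the appeal: linking to a post is itself the notification.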

Site relaunch – still with WordPress



I fell out of ‘love’ with blogging for a long time (though love seems an inappropriate term for writing stuff on a website!) and intentions to relaunch this site properly have been on the back-burner for too long. The process of starting to write again was – predictably – hampered by the traditional procrastination over blog platform (which one to use, or whether to write one myself, etc.) and over hosting changes that might or might not be required as a result. This post is a brief explanation – for what it’s worth – of why I think WordPress is still the best option for a blog.


The reality of having to devote most of my time to academic work (summer project!) meant that the ‘to self-code or not’ question was answered in the negative. If I had more free time (and wasn’t spending most of my work time writing code at the moment!) I might have been more inclined to code a blog platform myself (most likely in Django), but deprived of such time the obvious best option was to use someone else’s proven, feature-complete code to do all the heavy lifting for me and just concentrate on writing blog posts again. Duplicating stuff that’s already been done elsewhere just doesn’t make sense.

The key question for a blogger in 2014 is whether to use WordPress (the established, ‘800-lb gorilla’ of the blogging and CMS world) or the up-and-coming Ghost (lightweight, node.js-based, shiny). I’ve perhaps got a slight attachment to WordPress, in that (barring brief experiments with Movable Type and then Textpattern) this site has been WordPress-based for as long as it’s been a blog. My best estimate, looking back at my email archives, is that I started using it at WordPress 1.5 – I certainly remember WordPress 2.0 being a big event (with a theme change added in).

There are lots of objections to WordPress, but most of them centre around its performance impact and on the complexity of its admin interface – it’s certainly weird to have WordPress talked about in those terms when I first encountered it as the upstart free-licensed blogging system challenging the dominance of (the then-paid and proprietary) Movable Type. The admin interface doesn’t bother me – I use the bits that are relevant to me and ignore the bits that aren’t, but I can see how it could be intimidating for non-techies (the perennial problem of FOSS: being primarily built by geeks, free software has a tendency towards geek-friendly rather than user-friendly UIs!). WordPress does have an advantage though, in that it can run on shared hosting (more on that below).

Ghost is the new kid on the block – node.js based, AJAX to its core, and with development funded by a Kickstarter appeal last year. Its admin interface is a thing of beauty, and its Markdown-based post editing is refreshingly lightweight (though not without its limitations). The problem with Ghost though is that it is missing so many basic features. The clue to that is of course in the version number (0.4.2 at the time of writing), but the project roadmap suggests that a number of its design philosophies were just not what I was looking for. A key case in point is comments: the Ghost project does not support them and never will. Third-party plugin authors will be able to supply a comments system, but only when the plugin API is released (current estimate Q3 2014).

The current alternative – which really seems like the preferred option of the Ghost developers – is to use a third-party comments service like Disqus. Now there are arguments for and against blog comments (I’m going to deal with that in a separate post, which I’ll link to when it’s done), but I think any CMS which attempts to be a blogging platform really should give authors the option of enabling comments without resorting to third-party services. Without any degree of interactivity in blog posts, there really isn’t anything you can currently do in a default install of Ghost that you couldn’t do with static HTML/CSS (or perhaps with something like Jekyll to preserve your sanity). ‘Copy-paste an iframe-based widget from some cloud service’ should never be the answer for core functionality, particularly when such an emphasis has been placed on responsive and fast-loading pages.

There is no denying that Ghost is faster than WordPress. It would be unreasonable (in fact downright obtuse) to expect Ghost to match WordPress for features, since it is explicitly aimed at being a modern ‘back to blogging’ lightweight system – but if your site can run with ‘just post editing’, why not do that with static pages, which will run on any hosting and are faster than any CMS?

There’s no denying the aesthetic beauty of Ghost though. I’ve borrowed a small portion of that beauty through Lacy Morrow’s WordPress port of Ghost’s default theme. Update: For various cosmetic reasons I’m not using ghost-wp after all. I’m now using a child theme (a snazzy non-destructive way of modifying WordPress themes) of the default Twenty Fourteen theme. I’ll likely be tweaking a few bits of the theme over the next few days – so if you come to the site and something looks odd then drop me a line in the comments!


As mentioned above, an advantage of WordPress is that (since it runs on PHP/MySQL) it will happily run on just about any shared hosting service. This shouldn’t really determine your choice of blogging platform – the cost of moving to something like a DigitalOcean instance isn’t that great – but it is a fringe benefit of using WordPress if/when one decides in its favour on other grounds. Running on something like a DO droplet also requires that you handle OS updates, firewall rules, and so on yourself: the time and hassle to do all that (and to replace the stuff that shared hosting includes, like email accounts/forwarding), and to make sure it doesn’t negatively impact the performance of the server, is a significant constraint in itself.

I’ve been with NSDesign for more than a decade, and the hosting is great. Yes, it’s shared hosting: it has the same limitations in terms of control and customisation as every other shared hosting service, but for any given level of hosting the key differentiator is really customer service. On that score I can’t fault them: I get answers to support tickets quickly (though admittedly I haven’t had much reason to use tech support since their Broadband Cloud Solutions joint venture) and from a human being rather than an auto-responder. While there are the usual limitations on storage/bandwidth I see no reason to switch to another host unless this site becomes popular enough to make the latter an active rather than a theoretical limitation (and I’m not holding my breath for such a deluge!)