Pints, royals, strawberries and cream, bulldogs, cricket, fry-ups, football hooligans, and porn filters. All things British. The last item on the list being a recent arrival announced by Prime Minister David Cameron in a speech last week.
The filters will require an active selection to be able to view porn — plus anything that gets misidentified as porn — on one’s home internet connection.
Now, given that many Brits seem barely able to flirt without the aid of alcohol (sorry!), one wonders exactly how many people will go to the trouble of getting the filters removed… but allow FT Alphaville to note that we kinda put “porn” in at least a couple of headlines over the last year. Just FYI.
David Cameron’s speech last week covered both the horror that is child pornography and legal porn, which makes it awkward to instigate debate about filtering the latter. Even we hesitated. One suspects this blurring of the two is why there hasn’t been more coverage.
So, to clear up any potential confusion before we continue, let’s separate the two topics as clearly as possible:
• Child pornography = illegal and evil. Let us hope everyone does as much as is in their power to stop it. Concerning technology, internet service providers (ISPs) and search engines already make efforts to block such content. Where they can do more, they should.
• Legal pornography = legal and make up your own mind. David Cameron is concerned that children can access this legal content over the interwebs and therefore the ENTIRE COUNTRY needs to have it filtered out by default.
This represents a U-turn by the government. Previously it had been discussed that households would have to opt in to put filters on. Cameron’s speech announced the opposite: that households will get the filters and they’ll have to opt out if they don’t want them.
His speech also announced that ISPs have until the end of next year to get customers to decide on their filter status. It’s up to the ISPs how they do this, and those that supply 95 per cent of the country have already stated their agreement.
Sit back and take a moment to imagine what the future might sound like.
“Hello, you’re through to BT. This call may be recorded for training purposes. How may I help you, Mr Smith?”
“Uhhh, I, errr… filters… ummm…”
“Would you like to view adult content, Mr Smith?”
“Oh uh, no! I mean, I think it’s blocking some educational material, you see I…”
“Sure, Mr Smith. Do you have the approval of the other account holder, Mrs Smith?”
Aside from all the potential for embarrassment in a nation of people who go to great lengths to avoid that particular emotion, there are some issues with this that go beyond one’s attitude towards legal pornography.
Where do the blocks stop?
Let’s say a magic machine can successfully block porn. Now, what about other sites the government or ISPs deem bad? As reported by The Independent (emphasis ours):
Speaking on the BBC’s Jeremy Vine programme, Mr Cameron said what would be included in the filters would evolve over time. “The companies themselves are going to design what is automatically blocked, but the assumption is they will start with blocking pornographic sites and also perhaps self-harming sites,” he said.
“It will depend on how the companies choose how to do it. It doesn’t mean, for instance, it will block access to a newspaper like The Sun, it wouldn’t block that – but it would block pornography.”
Mr Cameron said he did not “believe” written pornography, such as erotic novel Fifty Shades of Grey, would be blocked under the plans. But he added: “It will depend on how the filters work.”
The Open Rights Group, a digital advocacy organisation, spoke to ISPs and wrote in a post entitled “Sleepwalking into censorship”. That post outlines the group’s expectations that the potential blocks will also include “extremist and terrorist related content”, “anorexia and eating disorder websites”, “suicide related websites”, “alcohol”, “smoking”, “web forums”, etc. This seems extreme, but it does illustrate the next issue rather well.
Cory Doctorow wrote about porn filters for the Guardian last year when the topic was being debated in the House of Lords. We highly recommend reading the whole thing. Here’s a snippet about past attempts at filtering:
In 2003, the Electronic Frontier Foundation tested the censorware used by US schools to see how many of the most highly-ranked documents on concepts from the national school curriculum were blocked by the school’s own censorware. They discovered that 75-85% of these sites were incorrectly classified. That percentage went way, way up when it came to sensitive subjects such as sexuality, reproductive health, and breast cancer.
That study is a decade old, and dates from a time when the web was comparatively minuscule. Today’s web is thousands of times larger than the web of 2003. But labour isn’t thousands of times cheaper, and good content has not gotten thousands of times easier to distinguish from bad content in the interim.
On why filters do such a bad job, Doctorow writes:
To filter content automatically and accurately would require software capable of making human judgments – working artificial intelligence, the province of science fiction.
As for human filtering: there simply aren’t enough people of sound judgment in all the world to examine all the web pages that have been created and continue to be created around the clock, and determine whether they are good pages or bad pages.
Even if the filters were accurate, there are ways around them. As The Economist writes:
The commentary on such filters tends to focus on whether tech-savvy users (often code for “teenagers”) can defeat them. Usually, they can. One simple filtering method is called “DNS poisoning”, in which an ISP manipulates the tables that allow computers to translate human-friendly URLs (such as http://www.economist.com) into the numerical addresses that computers use (such as 126.96.36.199). Requests for dodgy sites are simply redirected elsewhere, often to a page that displays a blocking message. But such restrictions are easy to bypass. A proxy—a third-party website that fetches pages on your behalf—will defeat such a system, as will instructing your computer to use untainted DNS tables, which are freely available online (from Google, among others).
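The DNS-poisoning mechanism the Economist describes can be made concrete with a toy Python sketch. All hostnames, lookup tables, and IP addresses below are invented for illustration (the addresses come from the documentation-only ranges): the point is simply that an ISP filter implemented as a doctored lookup table fails the moment a computer is pointed at an untainted one.

```python
# Toy model of DNS poisoning. A DNS table translates human-friendly
# hostnames into numerical addresses; a "poisoned" ISP table redirects
# requests for blocked sites to a block page instead.
# Hostnames and addresses here are invented, for illustration only.

BLOCK_PAGE_IP = "203.0.113.1"  # where the ISP's blocking message lives

# The ISP's poisoned lookup table: dodgy.example has been redirected.
isp_dns = {
    "www.economist.com": "198.51.100.5",
    "dodgy.example": BLOCK_PAGE_IP,
}

# An untainted third-party resolver, freely available online.
public_dns = {
    "www.economist.com": "198.51.100.5",
    "dodgy.example": "198.51.100.7",  # the site's real address
}

def resolve(hostname: str, dns_table: dict) -> str:
    """Translate a hostname into a numerical address via the given table."""
    return dns_table[hostname]

# Through the ISP, the dodgy site lands on the block page...
assert resolve("dodgy.example", isp_dns) == BLOCK_PAGE_IP
# ...but switching the computer to an untainted table defeats the filter.
assert resolve("dodgy.example", public_dns) == "198.51.100.7"
```

A proxy works on the same principle one level up: the user's computer asks the third party for the page itself, so the ISP's table is never consulted for the blocked hostname at all.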
No judge, no jury
For those not taking the long way around the filter, what about cases of incorrectly blocked sites?
The only way to know a site is blocked would be to compare unblocked access with blocked access. Even if the owner of a site manages to figure this out, what will the procedure be for getting the block(s) lifted? At present, it’s an unknown.
The Porn-Watchers list
Soon there will be lists of opt-ins and opt-outs. Now those would be interesting lists to have…
According to the European Court of Justice, insurance companies can’t discriminate on the basis of gender when pricing policies. What about discriminating on the basis of one’s porn filters being on or off, whether in insurance or any other field?
How about when a person requests a criminal records check because they want to work with children and it’s required for the job — will their porn filter status affect the result? Let’s hope not. More information about the protections we can expect would be helpful.
Dissent in the ranks
Not all ISPs are in agreement about filters. One small provider, Andrews & Arnold Ltd, describes their approach as follows:
It’s the implementation
We’re writing about this because David Cameron wants to make it less likely that children will view porn on the internet. That seems like a fine goal for any parent to have. It’s the suggested implementation that’s questionable, at least in this blogger’s opinion.
Such filters have the potential to create an illusion: that protecting children online can be effectively outsourced to the government and/or ISPs.
This illusion may lead to complacency. Even accepting an error rate in order to block some porn sites, or indeed other sites deemed bad by anonymous people or algorithms, does nothing to protect children against online bullying or grooming.
Given the lack of reasonably clear upsides, and the potential downsides, perhaps the question we should try to answer is: when they grow up, will our children thank us for this?
Further unfiltered reading:
Online pornography: Cameron’s ‘war’ muddles two separate issues – The Guardian
David Cameron’s War on Porn: Is He Still Using Netscape and Lycos? – Slate
Q&A: UK filters on legal pornography – BBC
David Cameron’s speech in full – GOV.UK