Five reasons why an opt-out porn filter is not a good idea
Spurred by the UK’s new anti-porn filter, Canadian MP Joy Smith is proposing that our country jump on the bandwagon and protect our children from viewing explicit content online. Naturally, Twitter responded immediately, with many worrying that this will amount to censorship of the internet, and with Smith responding that clicking a button to opt out is hardly onerous and is a reasonable sacrifice when it comes to Keeping Our Children Safe.
I’m not going to get into the debate about the effect of pornography on children, or even the one around the role of parental responsibility in this matter. You don’t need to touch on those things to understand why a filter like this is not a good idea. There are plenty of other reasons.
1. It won’t work.
China and Iran have tried to block pornography, along with plenty of other material, from the internet entirely. Their methods are likely broader than anything Canada would be willing to attempt, since worries about being accused of censorship are pretty much non-existent there. And it still doesn’t work. The internet is always changing; sites pop up and disappear by the minute. That’s not going to change.
If kids want to find this material, they will. The easy response to this is: Well, at least it’ll make it a little harder, and isn’t doing something better than doing nothing? The less easy answer is: That depends. How much will it cost to implement this? Who maintains it? Is it going to make any difference? Or is it just going to be an empty gesture? Hard to say, and maybe it’s best to let the British situation play out a bit before trying to answer it.
2. Someone needs to decide what pornography is, and we’re not very good at that.
There are some examples of sites that are plainly pornographic. They put it right up in their name. But there are lots of things that walk the line, and putting that judgment in the hands of a regulatory agency is always tricky. Is this just an image filter? Are tasteful nudes subject to it? Anatomical drawings? What about written content? Does erotic literature count? What about literature with some erotic sections? Can a 16-year-old read Lady Chatterley’s Lover? Someone in the government will have to make that call.
But Smith isn’t just talking about porn — she’s talking about “adult content.” What will that include? Tim Hortons recently got into trouble for blocking a gay news website as being inappropriate for public viewing, despite it being a valuable resource for LGBT teens. Timmy’s says they weren’t the ones who blocked the site, laying the blame on a third-party agency, but that really just goes to show that you can’t trust anyone but yourself to agree with you on what is and isn’t appropriate. But if the filter ends up far-reaching enough that a kid doesn’t have access to educational content that would help them to understand their sexuality, that’s a problem.
3. Someone would then need to tag all the porn, and people aren’t good at tagging “infringing” materials in a way that doesn’t vastly overreach.
Look at the DMCA in the US. Companies are only supposed to send takedown notices about content that they know for sure is infringing their copyright. And yet notices get sent to take down unrelated content. They get sent to take down content that is clearly fair use. Companies have sent takedown notices to their own official sites. People just aren’t good at this stuff. The filters will overreach, guaranteed. And that means kids will be denied access to legitimate, non-adult material. That’s bad.
4. If you want the filter to be effective, it will have to monitor personal communications.
Think about it: the internet is designed for sharing. If one kid finds a site, they’ll email it around, or seed a torrent, or something, and they’ll give it a name that isn’t pornographic. If you want to stop that, the only way is to look at every file that’s being transferred and every email that’s being sent. Don’t want to do that? Good, you shouldn’t; it’s creepy and over-reaching, and I fully believe it’s pretty far from what’s being proposed. But if you don’t do it to everyone, both those who keep the filter and those who opt out, then you won’t have much of a filter.
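To make the point concrete, here’s a minimal sketch (all domain names hypothetical) of the kind of name-based filter being proposed, and why a renamed or mirrored link sails straight through it:

```python
from urllib.parse import urlparse

# Hypothetical blocked domain, for illustration only.
BLOCKLIST = {"example-adult-site.com"}

def filter_allows(url: str) -> bool:
    """Naive name-based filter: block only URLs whose hostname is listed."""
    return urlparse(url).netloc not in BLOCKLIST

# The listed site is blocked...
print(filter_allows("http://example-adult-site.com/page"))  # False
# ...but a mirror or link shortener pointing at the same content passes,
# so catching it would require inspecting the content of every transfer.
print(filter_allows("http://short.example/abc123"))         # True
```

The only way past this limitation is to stop checking names and start checking the contents of every file and message, which is exactly the surveillance problem described above.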
5. The content that people are getting most upset about is already illegal.
This is more of a UK thing, but it’s already cropping up in the articles about MP Smith’s proposal, so let’s be as clear as possible: anyone who brings up child pornography in connection with these measures is, intentionally or not, misleading and emotionally manipulating the discussion. That material is already illegal and already blocked, and police already crack down on it. Could they be tougher on it? Could they do more? Maybe, but this measure is not about that. This is an entirely separate issue: trying to prevent children’s exposure to legal images. Don’t let anyone hijack the debate on that front; it is disrespectful to the real problem of child abuse and shuts down legitimate dialogue.