Automattic/WordPress.com fight back against Censorship

Automattic – the company behind WordPress.com – has taken a decisive step in the fight against bogus DMCA claims.

Under the Digital Millennium Copyright Act, people can submit a takedown notice to web service providers where their intellectual property is being used without permission. This is the legislative attempt to protect hosts like Google, WordPress, Tumblr, etc from being held responsible for the content that their users post – provided that they swiftly restrict access.

However, whilst this system is designed to strike a balance between protection and enforcement, in reality it is frequently abused by those who wish to silence critics, or to censor views with which they disagree. The Church of Scientology, for example, infamously issued thousands of DMCA takedown notices to stop the spread of anti-Scientology views on YouTube. This tactic is highly effective: the content is almost always restricted (at its peak moment of attention), and the process to challenge a notice (a ‘counter notice’) isn’t something that creators are, or arguably should be, familiar with. In effect, it becomes a virtual game of ping-pong, with the burden of proof shifting to the ‘author’ of the content to prove that they actually have the rights to publish. Sites themselves can take action, but given the sheer volume of notices they receive, it is often impractical, and rarely a route that businesses want to go down.

I’m both pleased and proud to see that WordPress is fighting back against two such bogus DMCA claims, as announced in this latest blog post, where you can find all the details of the two cases in question.

For the full text of the original post from Oliver Hotham – one of those who fell victim to a misrepresentative DMCA notice – continue reading below, where it is republished with permission.


No, It’s Not Just About Porn

The Glasgow Guardian is the student newspaper of the University of Glasgow, and is generally better than most publications of this type that I’ve come across – and I’m not just saying that because they used to use my pictures.

I was surprised to come across an article in the last issue entitled ‘The day the porn was still there’, which discussed the Internet filtering that the Government was seeking to impose on ISPs in the UK.

It wasn’t great. Have a read for yourself.

I wrote a response, which has been published in the current issue, on page 11.

For those of you that aren’t fond of viewing content online in a format suitable for print (and why would you be?) the text is below:

I write in response to the article entitled ‘The day the porn was still there’, published on the 16th of September 2013, by Imants Latkovskis.

In the interests of full disclosure, I am a member of the Open Rights Group Supporters’ Council – the ‘London based NGO’ whose views were criticised in the original piece. However, the purpose of this response is not to rise to defend the content of the organisation’s claims; that job is for somebody else – and not what I signed up for.

Aside from the overuse of emotive language and massively reductive statements (“ticking the ‘I want porn’ box is unlikely to be the start of a dystopian future.”), Latkovskis has managed to miss the point completely; the thrust of the article seems to be that ‘illegal porn is bad, so it’s good that they want to block it’.

Latkovskis states that in justifying the proposed filter, David Cameron was ‘referring to child pornography, which seems to be forgotten about quote after quote.’ However, he himself is guilty of confusing two quite distinct issues. This isn’t something that he should feel too bad about though, as the proposals have been deliberately designed to have this effect.

We have to separate out whether the purpose of an on-by-default Internet filter is either to:

1. Prevent access to illegal material, e.g. child pornography

or

2. Prevent people (ostensibly children) from accessing any pornography

These are two very different aims, and require equally different approaches.

It’s a moot point for the purposes of this article that none of these types of filtering system actually works effectively in any regard. If you aren’t sure about that, just ask yourself when the last time was that you or a ‘friend’ used a VPN to get onto American Netflix, or a proxy to peruse certain eye-patch-clad torrent websites. Never, I’m sure!

The concern that any sort of default Internet filter will inevitably block other content as well is not an unfounded one. Mobile operators such as Orange and T-Mobile have already blocked sites under categories headed ‘Chat’ and ‘Forums’ unless you explicitly indicate that you want access – something you often cannot do until you have been a customer for six months or longer. Nobody should have to submit their name to a database with tick-boxes against the categories of content they have chosen to view, or the information they want to exchange, and that is a realistic consequence of these proposals.

This fundamentally isn’t about pornography, and to suggest that those who question a blanket, State-mandated Internet filter are advocating free and unfettered access to ‘material depicting rape and child abuse’ is at best disingenuous, and at worst dangerously misinformed.

Yet another badly thought out, technologically incompetent piece of legislation (if ISPs are not pressured into this ‘voluntarily’) will do nothing to protect children, nothing to stop the spread of illegal material, and only serve to further squeeze the freedom to communicate and disseminate information online.


Online Abuse: Asking the Wrong Questions

The whole discussion about anonymous abuse online is asking the wrong questions. Rather than target the technology, let’s look at what’s really going on. It’s far too easy to point fingers at the web, rather than analyse how we use it and why.

Head over to the Open Rights Group website to read more from me on this.