Nov 07, 2019

Written By Becky Kells

Sex and censorship: Dissecting the UK’s now-abandoned “porn ban”

The Government recently dropped a proposal to limit access to pornographic websites—a proposal that had sparked debates about child protection, internet censorship and data breaches, to name but a few. Here we look at the now-abandoned porn ban, as well as examining the context surrounding it.

Until last month, commercial porn websites—accessible to all following a quick Google search—were threatened with a digital barrier. In a world first, users would have had to prove their age before continuing, on a landing page notably absent of all the videos, images and text one might expect to find on a porn site.

This proposed age-verification step emerged following a tangled debate around internet safety, web censorship and technology that had dominated the UK porn landscape. The so-called “porn ban” stems from the UK’s Digital Economy Act and had been in the works for a long time. In fact, it was actually meant to hit UK IP addresses in April, but concerns about its efficacy pushed the deadline back, until its abandonment in October.

The proposed benefit of a porn ban is the protection of underage internet users. The proposed drawback: putting porn behind a verification system meant that users would have to enter data, and data collection creates a risk of data breach.

How would it have worked?

When an IP address from the UK tried to access a website where pornographic material made up more than a third of the content, that user would have been redirected to a landing page—devoid of any sort of porn—which would then have prompted them to verify their age. Unlike some pre-existing age-verification gateways, with their simple tick boxes or date-of-birth entry forms, these landing pages would have been much more stringent: no longer would underage users be able to fudge a birthday or tick a box in bad faith.
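For readers who think in code, the gating logic described above can be sketched roughly as follows. This is purely an illustration: no implementation was ever mandated in code, and every name here (the threshold constant, the function names, the country code check) is invented for the example.

```python
# Hypothetical sketch of the proposed age gate. All names are invented
# for illustration; the real checks were never deployed.

PORN_CONTENT_THRESHOLD = 1 / 3  # sites over this fraction of porn were in scope

def needs_age_gate(country: str, porn_fraction: float) -> bool:
    """True if a visitor should be redirected to a verification landing page."""
    return country == "GB" and porn_fraction > PORN_CONTENT_THRESHOLD

def handle_request(country: str, porn_fraction: float, age_verified: bool) -> str:
    if needs_age_gate(country, porn_fraction) and not age_verified:
        # The landing page contains no explicit material, only a prompt to verify.
        return "redirect:/verify-age"
    return "serve:content"
```

The key point the sketch captures is that the `age_verified` flag could only be set by a stringent third-party check, not by a self-declared tick box.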

It would have been up to the porn sites themselves to implement the technology, and the British Board of Film Classification (BBFC) had been given the responsibility of checking that this actually happened.

Verification methods would have varied from site to site, as it was up to the individual porn sites—or the companies that own them—to implement the checks. Some might have opted for third-party verification providers, prompting users to submit their passport, driving licence or card details; others might have let users purchase a pass from a local shop. A number of third-party providers had spoken out amid fears of data collection and breaches: AgeChecked said it would not store any verification data; AgeID said it would not “see” any of the data; and AgePass planned to use blockchain so that the age-verified person couldn’t be tracked.

Those in favour…

Porn has long been a divisive subject, particularly in the UK. While the US and Europe made hardcore material legal in the 1970s, the UK held back until 2000. The decision to create a verification system for pornographic content tied in with conversations about online freedom of expression taking place all over the world—but the UK has historically made stronger efforts than its neighbours to regulate online content. In 2014, it was listed as an “enemy of the internet” due to high levels of censorship. Reporters Without Borders, which created the category, grouped the UK with China, Iran, Russia and Saudi Arabia.

Two years after this, a study commissioned by the NSPCC found that 53% of the 11- to 16-year-olds surveyed had seen sexually explicit content online. The debate shifted from concerns about censorship to fears that children were being exposed to pornographic content.

The children surveyed had varying motives to seek out porn. Some admitted that they sought it out of curiosity, or to learn about sex and sexual identities. Others did it to “freak out” their friends, “for a laugh”, to “break the rules” or to “be disgusted”.

Whatever the motive, the NSPCC research suggested far-reaching consequences from porn use by children: “more negative attitudes towards roles and identities in relationships”; “risky sexual behaviour”; and “unrealistic expectations of body image and performance”. One anonymous boy said: “I’m always watching porn and some of it is quite aggressive. I didn’t think it was affecting me at first, but I’ve started to view girls a bit differently recently and it’s making me worried.”

Childline delivered 158 counselling sessions related to online porn in 2017-18, while a further 147 concerned child-sexual-abuse images.

It was these findings that brought us to the proposed ban. Baroness Shields, the UK Minister for Internet Safety, explained that the age verification would “help make sure children aren’t exposed to harmful sexual content online”, adding: “Just as we do in the offline world, we want to make sure that online content that is only suitable for adults is not freely accessible to children.”

Those against…

It’s hard to argue that pornographic content should be accessible to children. But those who access pornography come from a huge range of backgrounds and ages: while 60% of users on the popular pornographic tube site PornHub are under the age of 35, the average user age is 35.3 years.

Recent history has given birth to a new kind of security risk, with Ticketmaster, Facebook and the NHS all suffering data hacks and breaches in the past few years. The Ashley Madison data breach of 2015 had a moral edge to it: when user data was leaked from the website—which facilitates extramarital affairs—families were broken apart and some users reportedly took their own lives.

Pornography comes with a similar threat of shame: porn use is hardly a public topic of conversation, and many took issue with any sort of verification system that prompted data entry. “It might lead to people being outed,” said Jim Killock, executive director of the Open Rights Group. Speaking to The Guardian in March, Killock emphasised that while hypothetical victims of a breach wouldn’t have been breaking the law, a leak could “destroy your reputation”, citing the example of a “teacher with an unusual sexual preference” whose pupils would find out as the result of a leak.

There was also concern that the law would be implemented too quickly, as a knee-jerk reaction to one research report. Many also pointed out that all that’s required to bypass a UK-based porn ban is a VPN. And as cybersecurity journalist Charlie Osborne pointed out in a ZDNet article, “the teenagers who are meant to be ‘protected’ against this material are the same ones [who] have grown up with such technologies their entire lives and are, in many cases, far more familiar with technology than our lawmakers appear to be…they are not going to be stopped by an ill-thought-out, location-based verification check [which] can be bypassed in a few clicks”.

Porn in other places

While the NSPCC acknowledged the benefits of the proposed ban, it also noted that children are more likely to stumble across porn online, via social media or pop-up pornographic images, than to seek out the “commercial porn sites” that alone would have been subject to the verification system. Porn has a huge presence on the internet: approximately 30% of the internet’s content is estimated to be pornographic. Ensuring that verification gateways were in place on commercial porn sites would have been a mammoth task in itself; regulating porn on other sites, blogs and social-media feeds goes beyond the scope of a single country or law.

That said, there has been a recent effort on the part of some social-media platforms to clean up their acts—or at least the acts of their users. Microblogging site Tumblr made headlines last year when it banned all ‘not safe for work’ (NSFW) content from its platform. Tumblr markets itself as “a place to express yourself, discover yourself, and bond over the stuff you love”—by no means a commercial porn site. Yet it’s clear that a lot of users saw it as a ‘safe space’ for porn: since Tumblr banned pornographic content in December, traffic has dropped by nearly 30%, according to tech site The Verge.

Tumblr’s decision to ban porn drew attention to a group perhaps not considered enough in the constant attempt to regulate porn: the porn performers themselves, for whom internet presence is a livelihood. In the simplest terms, Tumblr allows users to customise the look and layout of their own page—a far cry from the rigid, uniform profiles of sites such as Twitter and Facebook. “I could do things my own way, on my own terms,” said Vex Ashley, a porn performer whose work originated on Tumblr. “Away from the immediacy of Twitter’s timeline and the punishing curated Instagram grid, your Tumblr was your intimate space, your messy bedroom.”

Reacting to Tumblr’s decision in December last year, Ashley wrote a Medium post about Tumblr porn’s benefits for “young, queer people”, heralding it as a platform where they could “take control and represent themselves”.

She continued: “There’s often moral handwringing about young people putting themselves in potentially dangerous face-to-face situations while they learn about their boundaries with sex, but then the avenues for relatively low-risk experimentation and curiosity are being systematically closed as our online life becomes more sterilised and censored.”

Shadow banning

Tumblr has approximately nine million users in the UK, and 45% of its audience are under 35. Also under the microscope for allegedly censoring NSFW content are Instagram and Twitter, with 14 and 13 million UK users respectively.

Neither Instagram nor Twitter has announced an outright ban on pornographic content—but recently, users have speculated that they may have been shadow banned from the platforms.

Shadow banning is a phenomenon in which a user’s content is visible only to their followers, rather than appearing in searches or under hashtags as standard. Say you’re the owner of a non-commercial, pornographic social-media account, posting regularly. If you were shadow banned, your posts might be partially or completely hidden from users who would ordinarily be able to discover you. You might notice a drop-off in likes or interactions, but no official indication would be given to explain what had happened, or why.

The social-media site is banking on your eventual frustration, and is expecting you to leave the site of your own accord. Shadow banning has been used in the past by the likes of Reddit and Craigslist, and can be effective in the case of trolls and spam accounts.
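The visibility rules described above can be boiled down to a small sketch. To be clear, no platform publishes its actual moderation logic; the function and parameter names below are invented purely to illustrate the reported behaviour.

```python
# Illustrative sketch of shadow-ban visibility rules. Platforms do not
# disclose their real implementations; every name here is invented.

def post_visible_to(viewer_follows_author: bool,
                    author_shadow_banned: bool,
                    via_search_or_hashtag: bool) -> bool:
    """A shadow-banned author's posts stay visible to existing followers
    but disappear from search results and hashtag feeds."""
    if author_shadow_banned and via_search_or_hashtag:
        return False  # hidden from discovery, so no new audience
    if author_shadow_banned:
        return viewer_follows_author  # followers still see the posts
    return True  # no ban: normal visibility
```

The sketch shows why the ban is hard to detect from the inside: a follower’s feed looks unchanged, and only discovery quietly fails.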

However, with Instagram recently increasing its review of 18+ content, some users are claiming that using hashtags that indicate NSFW content can land them with a shadow ban. While it’s a hard case to prove, Instagram does have a list of 114,000 banned hashtags, with words as ambiguous as #adultlife, #bikinibody, #eggplant and #girlsonly.

With social-media accounts taking steps on a global scale to reduce NSFW content, the controversy surrounding the UK’s porn ban came at a timely point in the censorship debate. At the very least, the verification process would have introduced a tedious extra step into the process of porn watching. At the most, it would have censored performers and increased the likelihood of a data breach.

Jim Killock, executive director of civil liberties organisation Open Rights Group, welcomed the Government’s U-turn: “We are glad the government has stepped back from creating a privacy disaster that would lead to blackmail scams and individuals being outed for their sexual preferences. However, it is still unclear what the government does intend to do, so we will remain vigilant to ensure that new proposals are not just as bad, or worse.”
