Why We Keep Getting 230 Wrong
by Kate D’Adamo, partner, Reframe Health and Justice
One of the biggest questions before Congress and the administration in 2021 is going to be the regulation of digital platforms and websites. Centered in this conversation will be Section 230 of the Communications Decency Act. Section 230 says that internet platforms are not responsible for content posted on them that they did not create, a provision which has allowed the flourishing of social media, among other digital explorations. Congressmembers and op-ed-based experts have also decried it as the primary reason why Facebook cannot be made to stop encouraging genocide, and why revenge porn exists. In attempts to regulate digital spaces, Congress has introduced dozens of bills with a range of provisions, and passed FOSTA/SESTA in 2018. Thus far nothing has been effective, and in the case of FOSTA/SESTA, there have only been collateral consequences.
Only weeks into the 117th Congressional session, there is already discussion about this topic, especially after the booting of serial sexual predator and former TV host Donald Trump from the platforms. But if the conversation continues, Congress is looking not to make advances, but to double down on its mistakes because of flaws in the foundation of the approach. Below are just some of the reasons why even the current approach to, and understanding of, 230 regulation and broader regulation of digital space is replicating harm, and inching us no closer to resolution.
Provisions around 230 aren’t Anti-Violence efforts — they’re Digital Gentrification
Efforts to clean up the internet aren’t new, especially when it comes to targeting the sex trade. Sex and sex workers have been at the forefront of expanding technology and, more importantly, making it desired and user friendly. As the internet expanded peer-to-peer connections, sex and pornography became a significant part of that. Early message boards, bulletin boards where you could upload pictures but not yet music or videos, were prime places for people to find and access sexually explicit materials. Porn and individual explicit sites were at the forefront of developments around video streaming, monthly subscription services (Danni’s Hard Drive was one of the first, owned and operated by a Seattle-based stripper at a time when the other major subscription service was the Wall Street Journal), pop-up ads, and double opt-in processes.
But now that digital space has become accessible, making it even more profitable means erasing those who made it that way. Just like Las Vegas and Times Square, erasure of the sex trade is essential for companies looking to invest in a more family-friendly atmosphere. Just like increasing law enforcement on 42nd Street pushed sex workers into different, more isolated and dangerous areas of the city, or crackdowns on the Strip make working at exploitative brothels more appealing, attacks on sex workers in digital spaces are just the advancement of an agenda of gentrification.
Since internet regulation began, sex work and sex workers on the internet have raised the ire of Republicans and Democrats alike. In 2009, Richard Blumenthal (D-CT), then Connecticut’s Attorney General and now a Senator who has sponsored and championed SESTA and EARN IT, was threatening Craigslist with criminal investigation because of its “adult services” section. Backpage faced Congressional hearings, years of federal investigation, and two failed criminal prosecutions brought by then-California Attorney General Kamala Harris before its executives were finally arrested on promotion of prostitution charges.
This past year PornHub has become the new target to craft legislation around, and Congress continues to use the same efforts of policing and violence, simply moving criminalization from physical to digital space — which is why we’re getting the exact same harmful results.
Most of the efforts, though, are crafting expansions of civil instead of criminal policies — a tactic which has been a long-standing tool of gentrification. While sex workers and clients may be targeted for criminal prosecutions, the platforms themselves often do not meet the standard for criminal prosecution — which is why SESTA and now EARN IT were expansions of civil liability, which has a lower standard and doesn’t require law enforcement involvement.
Expansion of civil liability for spaces has always been about targeting the people who turn a blind eye and allow unwanted behaviors to occur, and, more often, allow unwanted people to feel safe. When people were expanding Westward in the early 1900s and becoming frustrated that their towns had brothels, they lobbied for “red light abatement” laws, which allowed individual homeowners to sue brothels for bringing down their property values. When Memphis wasn’t satisfied with stings on massage parlors which arrested workers, it hit the parlors themselves with civil suits instead.
SESTA was written out of anger at the DOJ for not prosecuting Backpage fast enough, so a bill had to be written which not only expanded the civil provision, but also gave the ability to sue to every state’s attorney general. The goal of digital gentrification, and the tactics being used, have always had the same impact — displacement and disposal of unwanted people, in order to make a space at least look a little nicer.
Regulating Sex Work isn’t Regulating 230
A fundamentally flawed understanding of the regulation mirrors the assumptions about the sex trades: that it is monolithic, that there is one typical experience, and that it is completely different from everything else. Websites that have anything related to sexual content are not all alike, and their relationships with the people who might experience harm look different, too. Sometimes open platforms are used to transmit pornographic material — think about sending naked pictures on Signal. Some sites post material which would be considered pornographic under overly broad definitions, such as the one proposed last year by Sen. Josh Hawley, including health and sexual education sites. The websites actively targeted by Congress have been an integral part of the sex trade and the livelihoods of sex workers — but even those are different.
Instead of understanding the nuanced dynamics of these situations, experiences, and sites, Congress is looking to treat every website as the same and regulate them all with a single hatchet swing. Despite being radically different companies, they are all about digital sex, so they get swept under the same singular approach.
Backpage and PornHub are both websites which are integral to the sex industry, but they operate in ways which are unique and require different approaches. The way these spaces have been utilized by sex workers, and the relationship they had to the industry, is starkly different. Regulating and setting expectations for these two different sites means having at least a basic understanding of the sex trades — something no Congressmember has a baseline in, and too few are reaching out to develop.
Backpage was an open platform which directly produced no content, and had almost no relationship with the people who posted on the site or those who accessed it to find services. Sex workers used Backpage like a phone or a subway ride — it was integral to the function of one’s work, but sex workers were purchasing the service of another entity. No one’s income was processed through Backpage as an entity, and Backpage’s ability to moderate content was limited. PornHub, on the other hand, functions as a platform but has a significantly different relationship to sex workers. PornHub processes payments for workers and has a different level of control over how content is displayed and accessed. The tip function also acts as a way for viewers to pay for their porn more directly (side note: don’t forget to pay performers). Ultimately, it is an intermediary between workers and clients, and performers have a more long-term and integrated relationship with the site. The relationship between a platform like PornHub and those who utilize it, the ability to engage in oversight, and the opportunity for workers who use the platform to organize are all vastly different from an open platform like Backpage. Sites like PornHub and OnlyFans use the labor of independent contractors to derive income, while Backpage was a service employed by sex workers. These things are not the same.
While both are engaged in conversations exclusively around content moderation, this ignores the reality that we’re talking about very different companies. The only reason that they are being viewed as the same and the approaches are the same is because bills are written by people who do not consider sex workers as workers.
These nuances are foundational to the way that companies and platforms can be approached and regulated. Companies with a dependent workforce have different stakeholders, leverage points, and responsibilities than companies which operate as a third-party service to truly independent clients. Ignoring that there are more kinds of websites than “ones with nipples” and “ones without”, which is the backbone of bills like Hawley’s, is negligent in how little it understands the topic at hand.
Congress Keeps Making it Worse
No one would disagree that fighting platforms like PornHub and Backpage makes great headlines. It’s a way for Republicans to demonstrate they are family friendly and anti-porn, and the only downside is that the lives of sex workers are damaged. But these conversations and prosecutions are not just making sex workers’ lives more dangerous; they are making open conversation and the investment of good actors more difficult.
With the passage of FOSTA, the Mann Act (formerly known as the White Slave Traffic Act) was expanded to include internet-based facilitation of prostitution. This means that all websites and operators which acknowledge the sex trade are currently vulnerable to a federal charge. After FOSTA/SESTA passed, dozens of platforms closed because of the expanded liability, and many either moved off-shore or were purchased by people or companies based off-shore — meaning they are less vulnerable to a federal charge, in countries that are probably not going to extradite for something like that.
Arrests and seizures have also targeted US-based companies who had staff within US borders to charge. Despite what anyone thinks of Backpage, one distinct facet of importance was simple: it was based within the United States. Over the last six years, multiple websites have been taken down, including MyRedBook (based in the Bay Area), RentBoy (based in lower Manhattan), Backpage (based in California), and CityXGuide (based in California). All these companies had staff within the United States, and most importantly, responded to US law enforcement when it came to subpoenas.
Further, what was used as evidence that Backpage was helping traffickers often included things like transparency about what can and cannot be posted. Instead of the current default of having something flagged, often without knowing why, Backpage actually told you what could and could not be said, essentially being transparent about moderation. Also cited was an acknowledgement that sex workers used the site. On page four of the Congressional investigation, this is described as “When moderators had the courage to point out illegal activity” — the illegal activity being sex work, not trafficking. The lesson is that transparency about who is using a site, intentionality about violence and trafficking as opposed to broad anti-sex work stances, and access to things like harm reduction information and bad date lists will all be used against a platform. This incentivizes the marginalization and erasure of sex workers, not effective solutions to trafficking.
And all of these mean we can’t have a real conversation.
The unspoken priority is the bottom line.
There’s also one more consistency in the bills introduced to, in name only, address harm: expanding liability is free. Broad and unclear expansion of liability is even cheaper because it inspires self-censorship, so companies will do the gentrifying for you.
None of these bills are attached to victim support, either in addressing these issues or the long-term impacts of harm. The initial claim of both SESTA and EARN IT was that victims were not getting appropriate compensation for being harmed, and that the fear of these lawsuits was going to force websites to change. These bills assume that, in cases where platforms hosted information related to a situation of violence, the only barrier to compensation was the existence of 230 — and that’s bullshit.
The barriers to filing a civil suit against a company like Facebook are innumerable, and are compounded for people who face other social and financial obstacles. Retaining a civil attorney in a complex case is expensive, and finding one you trust with this level of harm is almost impossible. High-profile suits mean victims may be exposed, if not publicly, then at least to family and intimate community. Facebook has a civil litigation budget the size of a small country, so at best, these suits will take years. Lawsuits like this are re-traumatizing. Mental health care is inaccessible. But no one is pointing to the number of cases filed and dismissed because of 230, because there just aren’t that many. Most of them (see the most recent NCOSE/Twitter suit) fail in court because they don’t meet a civil legal standard, and when cease and desist letters are sent to companies whose platforms are being used by bad actors, they take the content down. The problem isn’t 230, and the solution is a lot more costly.
Despite much hand-wringing, Congress did nothing to address the serious problem that companies are hard to sue for reasons beyond liability, especially when the issue is humiliating. Congress offered no expansion of pro bono civil support for those whose videos are still non-consensually being displayed or sold, which can often require things like cease and desist letters. Congress did not offer expanded mental health services or support. Congress did not invest in anti-violence prevention. That these bills must be budget-neutral is the real starting point of this conversation, and no one is saying it out loud.
But companies, especially ones like Facebook and libertarian tech groups like R Street, also have a bottom line to be concerned about, and it’s shaping the proposed solutions which come to the table. Facebook will never advocate for an adequate number of responsive customer service reps because it would severely impact its profitability, and you’ll rarely find members of Congress willing to advocate for something that would dramatically impact the profitability of tech companies.
So what now?
We need a new line of thinking, and we need a new set of values guiding this conversation. If we rely on protecting profitability and gentrifying digital landscapes, we will come out with all the same problems which sit in front of us right now. Marginalized people will have to find new avenues of organizing and connection, policing and criminalization will increase violence and exploitation, and companies will only grow in their power over the day-to-day lives of people. The Santa Clara Principles, which prioritize transparency and accountability, are a place to start. Understanding how racism, misogyny, and other forms of bias make their way into algorithms is essential. Recognizing the inherent danger of technological surveillance in the hands of both state and private actors is central.
Anti-violence work is about ending violence. It takes investment in people and communities, changing power imbalances, and taking on the root causes of harm. It takes resource investment. Using a framework that centers ending the harm of violence, not the potential harm of lawsuits, means understanding that digital space is a created extension of physical space; the problems may be coded in ones and zeros, but at their core they are not categorically different. Without shifting the foundation of our conversations, all we’re looking at is more marginalization — and, in fifty years, the same failed tactics being used to police whatever spaces the sex workers, hackers, democracy activists, and visionaries pushed out today had to begin envisioning and building in 2021.