
The Hard Truths Every Woman Needs To Know About Online Content Moderation

By Elyse Wallnutt

May 3, 2023, Published 9:00 a.m. ET


What does the “Wolf of Wall Street” have to do with modern-day women’s issues? A lot, as it turns out.

In 1995, Stratton Oakmont – the brokerage firm co-founded by Jordan Belfort, the wolf himself – sued Prodigy, an online service, because the company hadn’t removed user-generated message board posts accusing the firm of fraud.

Prodigy argued that it was a distributor, not a publisher like a newspaper, and therefore wasn’t responsible for content posted on its site. The court, however, found that because Prodigy had removed some content it deemed offensive (Prodigy wanted to be a “family-oriented platform”), it had taken on the role of publisher.

Essentially, Prodigy’s decisions to moderate any obscene content had increased its liability for all the content it hosted.

The decision alarmed members of Congress who didn’t want tech companies to avoid moderating content for fear of increasing their risk exposure.

To incentivize content clean-up, Section 230 of the Communications Decency Act was introduced. The act, adopted in 1996, shielded technology providers from being treated like publishers – a role in which they would assume responsibility for hosted content – while also ensuring they wouldn’t be held liable for good faith actions to remove offensive material.

What’s happened in the years since is unfortunate: state and lower federal courts have interpreted the law as sweeping immunity for tech companies, whether or not they moderate at all. “Bad Samaritans” – sites that deliberately republish harmful and illegal content such as hate speech, nude images shared without consent, and false news – are immunized from liability.

“Sites have no liability-based incentive to take down illicit material – especially if that material gets them extra clicks,” the author Danielle Keats Citron writes in her book The Fight for Privacy. “Digital platforms wield enormous power, yet bear no responsibility.”


Following Russian interference in the 2016 election, public pressure pushed Big Tech companies to change how they moderate content, at least to a degree. Facebook, Google, and Twitter, among others, announced sweeping changes to their ad policies: content deemed political or tied to a social issue would either be banned outright (in Twitter’s case) or require added review and a disclaimer visible on the ad itself (in Google’s and Facebook’s case).


The problem, however, is that there is no oversight of what technology companies deem social issues. Facebook’s list of ad categories requiring review, for example, includes many topics concerning women that, interpreted broadly, limit the reach of organizations like Planned Parenthood, which offers women’s health services alongside services classified as more “political” in nature, such as abortion care.

This lack of applied nuance is a phenomenon that research published in SAGE Journals calls “flattening.”

“Because ‘adult male’ is the norm, conditions considered normal for adult males do not require categories,” the researchers write. “When it comes to political issues [on social media], we see that most categories do not subdivide into more specific interests.”

Organic content – unpaid, user-generated posts – receives no more review or moderation than it ever did, presumably because bad behavior drives traffic. Notably, Facebook has even allowed conservative news outlets and personalities to repeatedly spread false information without facing any of the company’s stated penalties.

And this doesn’t stop with Facebook. “It can be argued,” the author Safiya Noble writes in her book Algorithms of Oppression, “that Google functions in the interests of its most influential paid advertisers or through an intersection of popular and commercial interests.” Noble asserts that Google biases information toward the interests of neoliberalism and social elites.

The result is a double bind: paid content is flagged as political when it relates to certain women’s issues, while unmoderated, unpaid speech gains algorithmic traction when it skews conservative rather than neutral.

It’s imperative that we continue exploring strategies and best practices to close this gap in the years ahead, so that the content we see reflects a nuanced understanding of women’s issues.

By: Elyse Wallnutt

Elyse Wallnutt is a senior marketing and tech leader with 16+ years of experience driving results at the intersection of revenue, advocacy, brand, and policy objectives. After nearly two decades working in leadership roles for some of the largest nonprofit brands in the world – including Amnesty International, Heifer International, Feeding America, UNICEF USA, and The Nature Conservancy – she launched Agility Lab Consulting to help business leaders understand and adjust for the impact that audience demand for privacy is having on mission reach and donor acquisition strategy.
