Carving Civil Rights Law Out Of Section 230 Will Create Many New Problems, While Failing To Fix Existing Ones

We have covered so many bills that seek to undermine Section 230 for silly and disingenuous reasons. However, I expect many more bills to emerge that are actually well-intentioned … but are still problematic and could make the situation worse. A recent example is a not-yet-introduced bill from Rep. Yvette Clarke, along with Rep. Mike Doyle. They published a “Discussion Draft” of what they call the Civil Rights Modernization Act of 2021. This bill does two things that so many Section 230 reform bills fail to do: (1) it addresses a real, well-articulated problem, and (2) it tries to take a narrow approach.

Unfortunately, as currently written, the bill does not actually address the real problems and is likely to have a host of unintended consequences that do much more harm than good.

The idea behind the bill is simple: add another exception to Section 230 so that it no longer applies to civil rights laws in one particular situation: targeted advertising. The bill comes almost directly in response to a ProPublica report from years ago, which found that landlords could use Facebook’s ad targeting tools to exclude users by race. This is awful, and it recalls decades of horrific US history in which redlining was the norm and communities (with government support) were designed to shut out people of color. Civil rights laws were supposed to end this practice, and it’s perfectly understandable to be appalled that Facebook may have inadvertently brought it back.

Of course, after the report was released, Facebook promised to update its policies and tools to deal with it, to expressly prohibit discriminatory practices in its ads, and to enforce more strongly against such ads. But, as we know, content moderation at scale is impossible to do well, and a follow-up report from ProPublica a year later found that the problem persisted. Facebook blamed a “technical bug” for its failure to block these ads, but … yeah … Facebook did not look good.

A year and a half later, Facebook again announced changes to its policies for dealing with discrimination in advertising, after a number of civil rights groups had sued the company over discriminatory ads. This was part of a settlement with those groups. Of course, just a week and a half later, Facebook was hit with another lawsuit, this time from the US government, over the same discriminatory ads.

Meanwhile, just last summer, The Markup found … the same kind of discriminatory ads on Facebook. Whatever Facebook does, it can’t seem to solve this problem.

With all of this in mind, it may seem reasonable to argue that this bill makes sense. However, once you start peeling back the layers, it does not, and the bill could do a lot more harm than good. First, go back to the main reason Section 230 exists at all: to put liability in the right place. Right now, nothing prevents anyone from holding accountable the landlords who advertise in a discriminatory manner. Indeed, it seems entirely appropriate to charge them with violating fair housing laws for targeting ads this way. And if you go after the many who abuse targeting tools in this manner, hopefully that removes much of the problem by convincing the ad buyers themselves to avoid such discriminatory and obnoxious practices.

But from there it gets worse. As Public Knowledge pointed out in a post last year, making a platform liable for certain types of speech can result in significant suppression of important and useful speech:

Although this is an unpopular opinion among some, the ideas behind Section 230’s treatment of third-party speech still apply. Online platforms are not like publishers, who can review and stand behind every user contribution, and we want online platforms to have a free hand to moderate content without fear of being held liable for what they remove. A regime in which platforms are liable for discriminatory behavior by third parties could very easily lead platforms to chill their users’ speech for fear of liability. We have evidence that this is likely, as shown by platforms’ struggles to contain COVID-19 misinformation. Current AI for content moderation is not as sophisticated as some platforms would have us believe, especially when it comes to moderating content from BIPOC users. To make matters worse, the respective roles of the platform and the user (employer, broker, financial institution, etc.) are not always clear. Did the user take discriminatory action with the tools the platform provided, or did the platform present discriminatory tools to an unwitting user? A recent study showed that even with neutral targeting, Facebook served ads to different groups at different rates, even when controlling for population. This suggests that even with the best of intentions, platforms could face liability over the third-party content of the advertiser or user.

And let’s be realistic about what is likely to happen if such a bill becomes law. The risk of liability in an area that, as the paragraph above notes, is virtually impossible to police perfectly would lead to a tremendous overreaction and the restriction of incredibly useful tools, and could do more harm than good to the very people it aims to help and support. For example, in recent years (thanks in part to the power of the internet) a large number of new companies have sprung up offering health and beauty products aimed at serving people of color, who are often underserved by the market.

It is not difficult to see how, under this law, Facebook and others would completely block the ability to effectively target such audiences, even when doing so is perfectly reasonable and non-discriminatory. The liability risk for internet websites may simply be too high, and we would be back in the unfortunate world where default advertising (as it did for decades) targets a middle-class white consumer, because anything more precisely targeted carries liability risk under such a law.

Another way to look at this: there are plenty of times when it is perfectly legitimate to target ads to members of a particular community based on a certain trait. If you are selling Passover haggadahs, you want to target a Jewish audience. You can advertise in Jewish magazines or publications. That is not against civil rights law, and it wouldn’t be if you targeted those ads online to Jewish people either. However, because of the very high liability risk, websites may ban such targeted advertising entirely, which leads to an end result where these niche communities are underserved, because the only ads that can run are targeted at the broadest, lowest-common-denominator audiences.

That seems like the opposite of what people who support civil rights should want.

And then there is the very serious question of whether Section 230 is even a problem here at all. As mentioned above, Facebook has been sued multiple times over these ads, by both civil rights organizations and the government. And while it is true that it tried to use Section 230 in response to the HUD lawsuit, we already have a case on the books, considered one of the most important Section 230 cases, that deals with this. In Fair Housing Council of San Fernando Valley v. Roommates.com, the 9th Circuit found that Roommates.com was not protected by Section 230 for discriminatory content it created itself. In that case, Roommates.com created a pull-down menu that allowed users to select a preferred race of roommate. The court found that since this pulldown was created by the company, and not by a third party, the company was not immune from the lawsuit. (For what it’s worth, an often-forgotten coda to this story is that, although the court rejected Roommates.com’s Section 230 defense, years later it found the company had not actually violated fair housing laws with this discrimination.)

There’s one more reason to be very careful with this civil rights carve-out for Section 230: there is a very, very good chance it would be abused by white nationalists demanding service, rather than used to tackle the housing discrimination its backers want to stop. If you look at the actual case law in which civil rights claims have been raised to get around Section 230, you’ll find a number of highly questionable cases, including a Russian internet troll farm, a proudly misogynistic video blogger, a Twitter user who lost his account for tweeting hateful content directed at Daily Show host Trevor Noah, a user who claims he lost his Twitter account for expressing his “heterosexual and Christian affiliation,” and a well-known white supremacist, all of whom claimed their civil rights had been violated when they were removed from social media.

Most of these claims have been dismissed on Section 230 grounds, but removing civil rights law from Section 230’s protection could lead to a spate of similar lawsuits from truly horrific people, angry that they were kicked off social media for their hateful views, claiming that such removals violate their civil rights.

Again, the bill almost certainly comes from a place of good intentions. And there are legitimate concerns about how Facebook’s targeting has been used to discriminate in housing, and possibly in other areas (e.g., employment) as well. But Section 230 is not the problem that needs fixing here. Instead, you would think that a smarter approach would be to go after those who are perpetrating the actual discrimination.

In summary, this bill is:

  1. Not clearly needed / misdirected
  2. Likely to harm marginalized communities by curtailing some of their own perfectly reasonable advertising abilities
  3. Already handled by the Roommates.com case
  4. Likely to be abused by awful, awful people claiming their hateful views are being discriminated against.

This may not be the best approach.
