Facebook civil rights audit says white supremacy policy is ‘too narrow’

Facebook’s second progress report on the civil rights audit conducted by former ACLU Washington Director Laura Murphy is here. Over the last six months, Facebook has made changes around enforcement against hate, fighting discrimination in ads and protecting against misinformation and voter suppression ahead of the upcoming U.S. presidential election and 2020 Census, according to the progress report.

While Facebook has made changes in some of these areas — Facebook banned white supremacy in March — auditors say Facebook’s policy is still “too narrow.” That’s because it solely prohibits explicit praise, support or representation of “white nationalism” or “white separatism” by name, but does not prohibit content that espouses those ideologies without using the terms.

“The narrow scope of the policy leaves up content that expressly espouses white nationalist ideology without using the term ‘white nationalist,'” the report states. “As a result, content that would cause the same harm is permitted to remain on the platform.”

Therefore, the audit team recommends Facebook expand its policy to prohibit content that “expressly praises, supports, or represents white nationalist ideology” even if the content does not explicitly use the terms “white nationalism” or “white separatism.”

In Facebook COO Sheryl Sandberg’s note today, she acknowledges the recommendation.

“We’re addressing this by identifying hate slogans and symbols connected to white nationalism and white separatism to better enforce our policy,” she wrote.

Sandberg also noted how Facebook recently updated its policies to ensure people don’t use Facebook to organize events intended to intimidate or harass people.

“Getting our policies right is just one part of the solution,” Sandberg said. “We also need to get better at enforcement — both in taking down and leaving up the right content.”

Sandberg is referring to the fact that Facebook has sometimes wrongfully taken down content meant to draw attention to racism and discrimination.

As Murphy noted in her report, “the definition and policing of hate speech and harassment on the platform has long been an area of concern. The civil rights community also claims that a lack of civil rights expertise informing content decisions leads to vastly different outcomes for users from marginalized communities.”

Facebook now says it’s taking steps to address this. One step, Sandberg says, is to have some content reviewers focus just on hate speech.

“We believe allowing reviewers to specialize only in hate speech could help them further build the expertise that may lead to increased accuracy over time,” Sandberg wrote.

Additionally, Sandberg has formalized a civil rights task force at Facebook. This task force will live on beyond the audit in order to continue building more awareness around civil rights issues on Facebook.

And ahead of the upcoming presidential election, Facebook says it is working on new protections against voter interference and is adding a policy that prohibits “don’t vote” ads. That policy is expected to go into effect before the 2019 gubernatorial elections. On the census side, Facebook is working on a census interference policy that it expects to launch this fall.

In March of this year, Facebook settled with the ACLU and other plaintiffs over discriminatory job ads. Just days later, the U.S. Department of Housing and Urban Development charged that Facebook’s ad-targeting tools violated the Fair Housing Act. That case is still pending.

In the meantime, Facebook has begun building a new system under which advertisers running U.S. housing, employment and credit ads will no longer be able to target by age, gender, race, religion or ZIP code.

When this system launches, advertisers will have only a limited number of targeting options. Additionally, Facebook won’t make any new targeting terms available without first running them by the ACLU and the other plaintiffs from the March 2019 settlement.

To implement this new system, Facebook will ask advertisers to explicitly flag whether an ad involves housing, employment or credit opportunities. If it does, the advertiser will be directed to the new system. Facebook is also putting tools in place to identify ads that advertisers failed to flag.

Additionally, Facebook is working on a tool that will let users search active housing ads by the advertiser and by location, whether or not they are in the target audience. This is expected to be available by the end of this year. Down the road, Facebook plans to make similar tools available for employment and credit opportunities.

“Given how critical access to housing, employment and credit opportunities are, this could have a significant impact on people’s lives,” Murphy wrote in her progress report.

The audit began in May 2018, following a string of scandals pertaining to misinformation and to how Facebook’s policies affect people of color on its platform. The first six months entailed Murphy interviewing civil rights organizations to determine their concerns. These last six months largely focused on content moderation and enforcement. The civil rights audit is far from over, and Facebook says we can expect the next update early next year.
