The Facebook Papers

Just some of the revelations about how Facebook operates:

Facebook fails to moderate harmful content in developing countries

When a Facebook researcher created a dummy account in 2019 to test how users in Kerala, India, experienced the social media site, they found a staggering amount of hate speech, misinformation and calls for violence on the platform. “I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total,” the researcher wrote.

Facebook AI struggles with non-English languages

A document showed that in 2020, the company did not have screening algorithms to detect misinformation in Burmese, the language of Myanmar, or hate speech in the Ethiopian languages Oromo and Amharic.

Facebook labeled election misinformation as “harmful, non-violating” content

“Harmful, non-violating”? Really? In other words, content Facebook itself judged harmful, but that didn’t technically break its own rules.

Facebook was aware that maids were being sold on its platform

Apple threatened to remove Facebook and Instagram from its App Store over the issue, but changed course after the social media giant removed 1,000 accounts linked to the sale of maids.

Facebook internally debated removing the Like button

The debate was bound up with the platform’s effect on its youngest users. When asked during her Senate testimony why Facebook hasn’t made Instagram safer for children, whistleblower Frances Haugen said the company knows “young users are the future of the platform and the earlier they get them the more likely they’ll get them hooked.”

https://time.com/6110234/facebook-papers-testimony-explained/
