Human trafficking, rampant misinformation, and even a push to restrict voter information for Spanish-speaking individuals: All of these controversies are laid out in detail as part of a series of whistleblower complaints filed with the SEC by former Facebook product manager Frances Haugen. Redacted versions of these documents were obtained by 17 U.S. media outlets, providing a trove of stories for each publication to home in on. Haugen is currently testifying before the U.K. Parliament about her complaints against the company.
One of the most damning revelations from these papers is Facebook’s continued failure to rein in human trafficking on its flagship app as well as on Instagram and WhatsApp. An internal paper published last year shows that Facebook knows its platforms enable “all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks.”
As CNN reports, Facebook seemingly took the issue seriously only after Apple threatened to remove Facebook and Instagram from its App Store in 2019, following a BBC report showing how Facebook and its platforms foster domestic worker exploitation. The documents reveal that Facebook had known about its human trafficking problem since at least 2018.
The problem is still ongoing: Using naming conventions that an internal Facebook document lists as common among accounts tied to domestic servitude, CNN found multiple Instagram accounts offering such services. Facebook removed those accounts after CNN brought them to the company’s attention. The company continues to rely on users to report such accounts and admits it still has blind spots in its own internal enforcement.
"Our goal is to help deter people from searching for this type of content," Facebook spokesperson Andy Stone told CNN. "We're continuing to refine this experience to include links to helpful resources and expert organizations."
Many staffers have expressed alarm at this and other issues, including the widespread misinformation plaguing the social media giant. The headline of a Bloomberg article published Monday says it all: “Facebook Staff Say Core Products Make Misinformation Worse.”
For years, Facebook has researched its algorithm’s propensity for drawing users ever closer to extremist and outright false content. In 2019, the company created a fake account depicting a 41-year-old North Carolina mom named Carol. Carol’s account followed Donald Trump and Fox News. Within a day, the account was receiving recommendations for what Facebook deemed “polarizing” content; within a week, those recommendations included content featuring conspiracy theories like QAnon.
A similar test was performed on WhatsApp in India, where misinformation ran rampant on the app during the 2019 election. A fake account depicting a 21-year-old woman soon began receiving graphic images as well as fake photos of Indian airstrikes in Pakistan. This is a major issue in India, where more than 400 million of the country’s 460 million internet users are on WhatsApp. Election-related misinformation on WhatsApp has plagued Brazil and Nigeria as well.
WhatsApp is a truly global app, boasting more than 2 billion users. It is used to discuss politics around the world, including in the U.S., and Facebook even considered making it part of its “voting information center” initiative, which directed users to information on registering to vote or signing up to become a poll worker. The Washington Post reports that WhatsApp employees pushed for a Spanish-language version of the voting information center on WhatsApp, which has an estimated 32 million Latino American users, according to nonprofit newsroom Pulso.
Facebook CEO Mark Zuckerberg himself reportedly raised concerns that a Spanish-language voting information center feature for WhatsApp would not be “politically neutral.” It should go without saying that Latino voters are not a monolith and that their political preferences vary widely across the country. What WhatsApp eventually rolled out were text chatbot services that let users either flag misinformation or get voting information from Vote.org; all of the chatbots were created by outside parties.
Haugen believes Zuckerberg’s tight control and micromanagement of the company are why the Facebook CEO should be held responsible for the many problems the social media site has created across the globe. She isn’t the only former Facebook employee hoping the company finally faces consequences: Just last week, yet another whistleblower submitted complaints to the SEC.
Calls from lawmakers to break up Facebook—most notably from Sen. Elizabeth Warren, who made it a key component of her presidential campaign in 2020—have persisted for years. If you’re concerned about Facebook’s bad behavior, there is a way you can fight back. Sign the petition to demand that Facebook be held accountable and refuse to use the platform on Nov. 10 as part of the National Log Off.