Bloomberg
Facebook Inc said it has spent more than $13 billion on safety and security efforts since the 2016 US election, and now has 40,000 employees working on those issues.
The 40,000 safety and security workers include outside contractors who focus on content moderation, a spokesman said. Facebook said it had over 35,000 safety and security employees in October 2019.
The new statistics — meant to demonstrate how seriously the company takes safety and security issues — were published in a blog post after a series of stories last week in the Wall Street Journal used leaked documents to show that despite hefty investments, Facebook struggles to combat a myriad of serious issues, including Covid-19 misinformation and illegal human trafficking.
The documents showed that Facebook’s internal researchers often identified serious problems with inappropriate content or user behaviour on the company’s services, but Facebook routinely failed to fix them. The stories spurred calls by US lawmakers for an investigation and possibly hearings on the issues.
The blog post addressed some of these criticisms without citing the newspaper’s stories specifically. The company said that while it has historically been responsive to issues on the platform, it’s trying to be more proactive by having safety and security employees embedded in product teams during the development process.
“In the past, we didn’t address safety and security challenges early enough in the product development process,” Facebook said in its blog. “But we have fundamentally changed that approach.”
Facebook also shared new statistics around its global political ad library, an archive where people can search for political ads that run on Facebook or the Instagram photo-sharing app. Facebook said 3 million people use the ad library each month, and the company rejected 3.5 million political or social ad submissions over the first six months of 2021 for failing to provide proper information.
Instagram, the focus of a story last week revealing internal research that showed the company knows its product can be emotionally damaging to young women, said this week that it is considering “nudges,” which would prompt users to look at healthier content on the service or take a break from scrolling.
Facebook’s Chief Technology Officer to Step Down Next Year
Facebook Inc Chief Technology Officer Mike Schroepfer, a 13-year veteran who oversees the social network’s work in artificial intelligence, virtual reality and the blockchain, will step down next year.
Another longtime Facebook executive, Andrew Bosworth, will take over as CTO, according to an internal message on Wednesday from Chief Executive Officer Mark Zuckerberg. Schroepfer’s move marks the most significant departure from the company in years and follows the recent exits of several other top executives.
His decision to step aside comes at a time when Facebook is under escalating pressure to clean up its service — a challenge the company believes it can tackle using the artificial intelligence software that Schroepfer’s teams are building.
Known as “Schrep,” Schroepfer joined Facebook in 2008 and has been CTO since 2013, reporting to Zuckerberg. He sits atop many of Facebook’s most ambitious organizations — including groups that the social network is depending on for future growth — such as engineering, infrastructure, augmented reality and VR, and the blockchain and finance unit. His desk sits next to Zuckerberg’s and operating chief Sheryl Sandberg’s at Facebook headquarters.
Schroepfer’s most central role may be his oversight of Facebook’s AI organization, which he helped build. That group develops the technology Facebook uses to automatically find and remove content that violates its policies, like nudity, hate speech and graphic violence. Pressure to improve those systems increased last week following a series of reports in the Wall Street Journal that found evidence describing the company’s struggles to reckon with issues like Covid-19 misinformation and human trafficking.
With billions of global users to serve, Facebook executives have pointed to AI technology as the best way to police posts at such a large scale. The technology is far from perfect, and Facebook also uses thousands of human content moderators to monitor posts on its apps.
Facebook has lost a number of veteran executives in recent months. Fidji Simo, the head of the company’s flagship social networking app, left in July to become CEO at Instacart Inc., and was joined there shortly after by Carolyn Everson, who was a Facebook vice president running its global business relationships with advertisers. Both women were at Facebook for more than 10 years.