Facebook removes just three per cent of hate posts according to a whistleblower

FACEBOOK removes less than three per cent of violent hate-filled posts, a whistleblower said last night.

Former Facebook exec Frances Haugen, 37, blew open Facebook’s claim that it removes nearly all bile-spewing posts, telling MPs the social network is very good at “dancing with the data.”

And she said of the tech giant: “Unquestionably, it’s making hate worse”, accusing it of driving its users to political extremes, enabling bullies and forcing anorexia content onto young girls, all in the name of profit.

Facebook claims it removes 97 per cent of hateful posts, but Ms Haugen said that figure only covered the hate its computers found.

She said one of the reasons Facebook’s bots failed to pick up hate was that they could not recognise regional dialects.

Giving an example, she said: “UK English is sufficiently different that I would be unsurprised if the safety systems that they developed primarily for American English were actually under-enforcing in the UK.”

In an explosive hearing, she said the social network is “hurting the most vulnerable among us” and leading people down “rabbit holes.”

Ms Haugen also met Ian Russell, the father of Molly Russell, the 14-year-old who killed herself after watching self-harm videos on Instagram.

Chillingly, she told MPs: “The algorithms take people who have very mainstream interests and they push them towards extreme interests.

She added: “Someone looking for healthy recipes will get pushed towards anorexia content.”

And she said bullying follows children home from school and carries on in their bedrooms at night via Instagram.

She said: “When I was in high school, it didn’t matter if your experience in high school was horrible, most kids had good homes to go home to and they could at the end of the day disconnect, they would get a break for 16 hours.

“Facebook’s own research says now the bullying follows children home, it goes into their bedrooms.

“The last thing they see at night is someone being cruel to them.

“The first thing they see in the morning is a hateful statement and that is just so much worse.”

Home Secretary Priti Patel held a “constructive meeting” on online safety with Ms Haugen before the whistleblower gave evidence to MPs.

Afterwards she said: “Tech companies have a moral duty to keep their users safe.”

Ms Haugen refused to label the company’s actions as “evil” or “malevolent”, but said there was a “pattern of inadequacy.”

Facebook and other social media companies will also speak to the draft Online Safety Bill committee later this week.

Andy Burrows, Head of Child Safety Online Policy at the NSPCC, said the evidence showed that “safety is simply not a priority for those at the top of Facebook.”

He said: “She was also explicit about the scale of the challenge needed to make the company’s services safe for children after years of putting profit and growth first.”

A Facebook company spokesperson said: “Contrary to what was discussed at the hearing, we’ve always had the commercial incentive to remove harmful content from our sites.

“People don’t want to see it when they use our apps and advertisers don’t want their ads next to it.

“That’s why we’ve invested $13 billion and hired 40,000 people to do one job: keep people safe on our apps.

“As a result we’ve almost halved the amount of hate speech people see on Facebook over the last three quarters – down to just 0.05 per cent of content views.

“While we have rules against harmful content and publish regular transparency reports, we agree we need regulation for the whole industry so that businesses like ours aren’t making these decisions on our own.

“The UK is one of the countries leading the way and we’re pleased the Online Safety Bill is moving forward.”