'We've reduced hate speech by half'

By Neha Alawadhi
November 26, 2021 11:28 IST

'Our proactive detection rate for hate speech in India is close to 97 per cent -- which means that of the hate speech content we remove, we detect 97 per cent of it proactively, even before anyone reports it.'

Photograph: Dado Ruvic/Reuters

Amid the ongoing controversy over Meta (Facebook) not doing enough to prevent hate speech and misinformation across its platforms, Monika Bickert, head of global policy management, Meta, tells Neha Alawadhi about the steps the company has taken in India in that direction.

Have the whistle-blower complaints affected the growth of the Meta platforms, in terms of user numbers and advertising?

There are now almost 3.6 billion people who actively use one or more of our services globally.

And more than 200 million businesses that use our tools to connect with their customers.

There is much more happening on our platforms that clearly empowers people.

More and more businesses and communities join us every day because it touches their lives positively.

It is this immense good that we seek to maintain by continuing to invest in processes and resources that can help mitigate any shortcomings.

'We know that many things that generate engagement on our platform leave users divided and depressed', a researcher has been reported as saying in the leaked docs. Is this correct?

While we continue to invest heavily in understanding social media's relationship with polarisation, the current evidence suggests that Facebook is not a prime driver of polarisation.

For example, academic research shows that the increase in political polarisation in the United States predates social media by several decades.

Studies of internet trends suggest there is no clear relationship between internet use and polarisation across numerous countries.

In fact, in Eastern Europe, recent causal experiments have shown that Facebook use actually reduced polarisation.

We also want to give people more control over what they see.

We already give people the ability to override the algorithm -- to compose their own News Feed on Facebook.

Since we're hearing that people want to see more friends and less politics, we're testing ways in which we can respond to that.

How are hate speech and misinformation tackled in India, given its multiplicity of languages?

Over the years we have invested significantly in technology to find hate speech and tackle misinformation on our platform.

Since 2016, we have invested more than $13 billion in teams and technology focused on safety and security, and this year we are on track to spend more than $5 billion in this area.

Today, we have more than 40,000 people working on safety and security issues, including more than 15,000 dedicated content reviewers who review content in more than 70 languages.

As a result of our efforts, we've reduced the amount of hate speech that people see by half this year. Today, it's down to 0.03 per cent.

In India, under the new Information Technology Rules and as part of our commitment to respect local laws, we started publishing monthly compliance reports in May, and since then we have proactively removed more than 1.05 million pieces of hate speech content from our platform.

Our proactive detection rate for hate speech in India is close to 97 per cent -- which means that of the hate speech content we remove, we detect 97 per cent of it proactively, even before anyone reports it.

How do you view the recent IT Rules formulated in India? How do you see the changes in the Indian regulations around content and tech?

We respect Indian laws. We are fully committed to the agenda of safety and privacy of people on our platform.

We are already investing a lot in this area -- in the form of new and improved features and artificial intelligence technology that allow us to remove violating content that surfaces on our platform, even before it is reported to us.

We have made significant efforts to work towards compliance with the provisions of the IT Rules and continue to discuss a few of the issues which need more engagement with the government.

We remain committed to people's ability to freely and safely express themselves on our platform.

Frances Haugen's testimony in the US Senate raises several questions about Meta's policies and how data has been misused. Also, what are the implications for countries like India? How do you intend to make people/users believe you are not misusing their data for profit?

Our integrity and our efforts to be more transparent about our data aren't static, but always evolving.

This evolution can be tracked across our public policy minutes, new reports we create, new tools we build, and additional data points we put into existing reports.

We welcome scrutiny and feedback, but these documents are being used to paint a narrative that we hide or cherry-pick data, when, in fact, we do the opposite.

We iterate, examine, re-evaluate our assumptions, and work to address tough problems.

We show this commitment to transparency and invite accountability with data, tools, and policies that we share with the public.

No one else shares as much as we do via our industry-leading quarterly Community Standards Enforcement Report, Widely Viewed Content Report, Facebook's Open Research and Transparency Initiative, CrowdTangle or Ad Library.

Is there a conversation among social media companies around how they plan to tackle hate speech and misinformation?

As mentioned earlier, we've always said that private companies like ours should not have to make so many decisions on important topics, such as election integrity or content, on their own.

We've repeatedly called for regulation to provide clarity on these topics.

At the same time, we should want every other company in our industry to make the investments and achieve the results that we have.

Our CEO Mark Zuckerberg has said he worries about the incentives we're creating for other companies to be as introspective as we have been.

But we're committed to continuing our work, because we believe it will be better for our community and our business over the long term.
