
Balancing act: Facebook’s Fares Akkad on how the company is treading the line between freedom and facts

Fares Akkad, Director of Media Partnerships Growth Markets (APAC, LATAM, MEA and Turkey), News at Facebook. (Supplied)
Updated 24 August 2020

DUBAI: Half of the Arab youth population gets its news from Facebook on a daily basis, according to the report “Social Media in the Middle East: 2019 in review.”

This phenomenon is not exclusive to the Middle East. A 2018 Pew Research Center study found that 68 percent of American adults at least occasionally get news on social media, yet 57 percent said they expect the news they see on social media to be largely inaccurate.

In 2020, as the pandemic hit countries worldwide and governments enforced lockdowns, people were restricted to their homes and devices.

A PwC survey found that three-quarters of people in the Middle East had increased their consumption of social media such as Facebook, Instagram, Twitter and TikTok due to social distancing and other isolation measures.

Of these, 95 percent indicated they were either “very likely” or “likely” to continue using social media to the same extent as during lockdown once COVID-19 containment measures are lifted.

Social distancing was really more physical distancing. In fact, social connections increased during this period, with people spending more time staying in touch with family and friends through social media and video calling apps.

All of this put the spotlight once again on problems that are not new but are perhaps more important than ever before, from fake news and misinformation to hate speech and harmful content.

These concerns are also why advertisers stopped spending on these platforms, and it is not the first time they have done so.

The first boycott started around March 2017 when advertisers paused social advertising — mainly YouTube and Facebook — due to brand safety concerns.

In 2020, more than 1,000 advertisers joined the boycott publicly, including the likes of Coca-Cola, Unilever, Starbucks, LEGO and even small businesses, mainly due to the prevalence of hate speech on the platforms.

In the region, for instance, LEGO had to significantly alter its strategy for the launch of its Nintendo Mario Collection due to the pause on social advertising.

“We fully stopped (spending on) any Facebook-related platforms such as Instagram as well as YouTube (Google),” said Urszula Bieganska, LEGO’s head of marketing for the Middle East and Africa.

She added that the conversations with Google were at an advanced stage due to the company’s previous investments in YouTube, including YouTube Kids, especially around child safety.

“It’s not just something that we see clearly as a company; these are requests from parents, who are very concerned about children’s safety online, and as a parent myself I totally relate,” she said.

In a global statement, Coca-Cola announced it would return to channels such as YouTube and LinkedIn on Aug. 1, but would remain paused on Facebook, Instagram and Twitter globally, among others.

Arab News caught up with Fares Akkad, director of media partnerships for growth markets, news at Facebook, to better understand how the company is moderating content and managing misinformation in these times.

“Our journey with making the platform a safe place and a positive place for dialogue isn’t new. This is something we embarked on a while back,” he said, adding that due to COVID-19, the platform has seen an “unprecedented spike” in usage and a spike in misinformation.

The company’s way of managing this has been through a few steps. The first is setting up a COVID-19 information center that features content in partnership with relevant health authorities in each country.

“We’ve also focused on minimizing hoaxes and harmful misinformation such as conspiracy theories and false claims around the disease,” Akkad said.

“We’re also blocking people who are trying to take advantage of and exploit the situation by claiming that they have a cure or that they’re selling a face mask for exorbitant amounts.”

Akkad said Facebook recognizes the pivotal role of newsrooms and the pressure they have been under from increased workloads and reduced commercial returns, which led the company to create a $100 million grant for news organizations, in addition to $750,000 for the Middle East.

It has also invested $2 million in expanding fact-checking efforts. Currently, it funds more than 50 fact-checking organizations covering over 60 languages.

Facebook’s regional fact-checking efforts include a partnership with Agence France-Presse for Arabic, English and French content, as well as local company Fatabyyano.

If a piece of content is flagged, whether by a fact-checker or a user, Facebook reduces its distribution.

If the publisher or page is a repeat offender, Facebook takes action against it, ultimately closing the page.

The platform will also notify users who have engaged with a questionable piece of content. “We’re keen to empower our users to think before they share, and to be responsible for the spread of misinformation,” said Akkad.

When asked why Facebook would not simply remove a piece of news that has been flagged, he said: “It’s a constant balance between freedom of speech and fact-checking.”

For instance, when Brazil’s President Jair Bolsonaro touted hydroxychloroquine as a cure for COVID-19 despite it being deemed a health risk by the World Health Organization, Facebook removed any posts spreading this information.

“However, in other instances, there are points of view that might be misinformed but don’t represent harm, and those are flagged, deprecated, and punitive action is taken,” Akkad said.

Despite the measures put in place by Facebook, advertisers and users continue to expect and demand more.

Facebook’s Civil Rights Audit Report found the company to be largely reactive, with its own auditors calling its policies a “setback.”

However, Akkad said: “The same report also praised us for significant improvement in both the rate and the timeliness at which we take down content.”

He added: “At the scale at which we operate, I think zero tolerance for incidents isn’t always going to mean no incidents. That’s an unfortunate reality that any platform that reaches this kind of scale is always going to have.”

Akkad was keen to point out the positive side of Facebook’s family of apps. “We’ve taken actions like prioritizing friends and family content over other content which we knew was going to have a detrimental impact on financials, at least in the short or medium term, just because we know that this is what people wanted to see on our platform,” he said.

During the pandemic, he added, the company has played a pivotal role in people’s lives, not only from a health care or information perspective but also from a deeply personal one, helping people connect through features ranging from live taraweeh prayers during Ramadan to the new video calling feature Rooms.

“It’s a journey that we have to continue on and try to balance providing people the tools for freedom of speech while also safeguarding them,” he concluded.