How Meta fumbled propaganda moderation during Russia’s invasion of Ukraine

Days after the March 9 bombing of a maternity and children’s hospital in the Ukrainian city of Mariupol, comments claiming the attack never happened began flooding the queues of workers moderating Facebook and Instagram content on behalf of the apps’ owner, Meta Platforms.

For the latest headlines, follow our Google News channel online or via the app.

The bombardment killed at least three people, including a child, Ukraine’s President Volodymyr Zelenskyy said publicly. Images of bloodied, heavily pregnant women fleeing through the rubble, their hands cradling their bellies, sparked immediate outrage worldwide.

Among the most-recognized women was Mariana Vishegirskaya, a Ukrainian fashion and beauty influencer. Photos of her navigating down a hospital stairwell in polka-dot pajamas circulated widely after the attack, captured by an Associated Press photographer.

Online expressions of support for the mother-to-be quickly turned to attacks on her Instagram account, according to two contractors directly moderating content from the conflict on Facebook and Instagram. They spoke to Reuters on condition of anonymity, citing non-disclosure agreements that barred them from discussing their work publicly.

The case involving the beauty influencer is just one example of how Meta’s content policies and enforcement mechanisms have enabled pro-Russian propaganda during the Ukraine invasion, the moderators told Reuters.

Russian officialdom seized on the images, setting them side-by-side against her glossy Instagram photos in an effort to persuade viewers that the attack had been faked.

On state television and social media, and in the chamber of the UN Security Council, Moscow alleged – falsely – that Vishegirskaya had donned make-up and multiple outfits in an elaborately staged hoax orchestrated by Ukrainian forces.

Swarms of comments accusing the influencer of duplicity and being an actress appeared underneath old Instagram posts of her posed with tubes of makeup, the moderators said.

At the height of the onslaught, comments containing false allegations about the woman accounted for most of the material in one moderator’s content queue, which normally would have contained a mix of posts suspected of violating Meta’s myriad policies, the person recalled.

“The posts were vile,” and appeared to be orchestrated, the moderator told Reuters. But many were within the company’s rules, the person said, because they did not directly mention the attack. “I couldn’t do anything about them,” the moderator said.

Reuters was unable to contact Vishegirskaya.

Meta declined to comment on its handling of the activity involving Vishegirskaya, but said in a statement to Reuters that multiple teams are addressing the issue.

“We have separate, expert teams and outside partners that review misinformation and inauthentic behavior and we have been applying our policies to counter that activity forcefully throughout the war,” the statement said.

Meta policy chief Nick Clegg separately told reporters on Wednesday that the company was considering new steps to address misinformation and hoaxes from Russian government pages, without elaborating.

Russia’s Ministry of Digital Development, Communications and Mass Media and the Kremlin did not respond to requests for comment.

Representatives of Ukraine did not respond to a request for comment.

‘Spirit of the policy’

Based at a moderation hub of several hundred people reviewing content from Eastern Europe, the two contractors are foot soldiers in Meta’s battle to police content from the conflict. They are among tens of thousands of low-paid workers at outsourcing firms around the world that Meta contracts to enforce its rules.

The tech giant has sought to position itself as a responsible steward of online speech during the invasion, which Russia calls a “special operation” to disarm and “denazify” its neighbor.

Just a few days into the war, Meta imposed restrictions on Russian state media and took down a small network of coordinated fake accounts that it said were trying to undermine trust in the Ukrainian government.

It later said it had pulled down another Russia-based network that was falsely reporting people for violations like hate speech or bullying, while beating back attempts by previously disabled networks to return to the platform.

Meanwhile, the company attempted to carve out space for users in the region to express their anger over Russia’s invasion and to issue calls to arms in ways Meta normally would not permit.

In Ukraine and 11 other countries across Eastern Europe and the Caucasus, it created a series of temporary “spirit of the policy” exemptions to its rules barring hate speech, violent threats and more; the changes were intended to honor the general principles of those policies rather than their literal wording, according to Meta instructions to moderators seen by Reuters.

For example, it permitted “dehumanizing speech against Russian soldiers” and calls for death to Russian President Vladimir Putin and his ally Belarusian President Alexander Lukashenko, unless those calls were considered credible or contained additional targets, according to the instructions viewed by Reuters.

The changes became a flashpoint for Meta as it navigated pressures both inside the company and from Moscow, which opened a criminal case into the firm after a March 10 Reuters report made the carve-outs public. Russia also banned Facebook and Instagram inside its borders, with a court accusing Meta of “extremist activity.”

Meta walked back elements of the exceptions after the Reuters report. It first limited them to Ukraine alone and then canceled one altogether, according to documents reviewed by Reuters, Meta’s public statements, and interviews with two Meta staffers, the two moderators in Europe and a third moderator who handles English-language content in another region who had seen the advisories.

The documents offer a rare lens into how Meta interprets its policies, called community standards. The company says its system is neutral and rule-based.

Critics say it is often reactive, driven as much by business considerations and news cycles as by principle. It’s a complaint that has dogged Meta in other global conflicts including Myanmar, Syria, and Ethiopia. Social media researchers say the approach allows the company to escape accountability for how its policies affect the 3.6 billion users of its services.

The shifting guidance over Ukraine has generated confusion and frustration for moderators, who say they have 90 seconds on average to decide whether a given post violates policy, as first reported by the New York Times. Reuters independently confirmed such frustrations with three moderators.

After Reuters reported the exemptions on March 10, Clegg said in a statement the next day that Meta would allow such speech only in Ukraine.

Two days later, Clegg told employees the company was reversing altogether the exemption that had allowed users to call for the deaths of Putin and Lukashenko, according to a March 13 internal company post seen by Reuters.

At the end of March, the company extended the remaining Ukraine-only exemptions through April 30, the documents show. Reuters is the first to report this extension, which allows Ukrainians to continue engaging in certain types of violent and dehumanizing speech that normally would be off-limits.

Inside the company, writing on an internal social platform, some Meta employees expressed frustration that Facebook was allowing Ukrainians to make statements that would have been deemed out of bounds for users posting about previous conflicts in the Middle East and other parts of the world, according to copies of the messages viewed by Reuters.

“Seems this policy is saying hate speech and violence is ok if it is targeting the ‘right’ people,” one employee wrote, one of 900 comments on a post about the changes.

Meanwhile, Meta gave moderators no guidance to enhance their ability to disable posts promoting false narratives about Russia’s invasion, like denials that civilian deaths have occurred, the people told Reuters.

The company declined to comment on its guidance to moderators.

Denying violent tragedies

In theory, Meta did have a rule that should have enabled moderators to address the mobs of commenters directing baseless vitriol at Vishegirskaya, the pregnant beauty influencer. She survived the Mariupol hospital bombing and delivered her baby, the Associated Press reported.

Meta’s harassment policy prohibits users from “posting content about a violent tragedy, or victims of violent tragedies that include claims that a violent tragedy did not occur,” according to the Community Standards published on its website.

It cited that rule when it removed posts by the Russian Embassy in London that had pushed false claims about the Mariupol bombing following the March 9 attack.

But because the rule is narrowly defined, two of the moderators said, it could be used only sparingly to battle the online hate campaign against the beauty influencer that followed.

Posts that explicitly alleged that the bombing was staged were eligible for removal, but comments such as “you’re such a good actress” were considered too vague and had to stay up, even when the subtext was clear, they said.

Guidance from Meta enabling moderators to consider context and enforce the spirit of that policy could have helped, they added.

Meta declined to comment on whether the rule applied to the comments on Vishegirskaya’s account.

At the same time, even explicit posts eluded Meta’s enforcement systems.

A week after the bombing, versions of the Russian Embassy posts were still circulating on at least eight official Russian accounts on Facebook, including its embassies in Denmark, Mexico and Japan, according to an Israeli watchdog organization, FakeReporter.

One showed a red “fake” label laid over the Associated Press photos of Mariupol, with text claiming the attack on Vishegirskaya was a hoax, and pointing readers to “more than 500 comments from real users” on her Instagram account condemning her for participating in the alleged ruse.

Meta removed those posts on March 16, hours after Reuters asked the company about them, a spokesperson confirmed. Meta declined to comment on why the posts had evaded its own detection systems.

The following day, on March 17, Meta designated Vishegirskaya an “involuntary public person,” which meant moderators could finally start deleting the comments under the company’s bullying and harassment policy, they told Reuters.

But the change, they said, came too late. The flow of posts related to the woman had already slowed to a trickle.




Tarjama, Ureed, Future Work sign MOU to empower Saudi youth in labor market

Tarjama Saudi Arabia, Ureed and Future Work have signed a Memorandum of Understanding (MOU) to expand the freelancer landscape in Saudi Arabia and build a platform for young freelancers to find job opportunities.

The MOU was signed in the Kingdom’s capital, Riyadh. The initiative aims to meet the objectives of Saudi Vision 2030 through the empowerment of young talent, the creation of equal job opportunities for women and the creation of local content.

Future Work CEO Eng. Bandar bin Abdullah al-Mohamadi described the MOU as a continuation of “efforts to support the training and employment of youths.”

“This agreement helps further target modern work models, including freelance work and flexible work, to empower citizens working in the digital economy, and provide them with job opportunities and services,” he said.

Regarding this strategic partnership, Tarjama CEO Nour al-Hassan said, “We are honored to be collaborating with Future Work to build a platform for flexible work opportunities in the Kingdom and give the Saudi youth access to bigger employment opportunities.”

Tarjama is a language and translation service based in the MENA region, while Ureed is the region’s largest freelance marketplace. Future Work is a company dedicated to empowering youth in the labor market.



Seven dead, another 55 feared killed in landslide in Indian state of Manipur

At least seven people have died and another 55 are feared to have been killed after a massive landslide in a remote area of the north-eastern Indian state of Manipur, local officials said on Thursday.
Rescue workers battled heavy rains and inclement weather to pull 19 survivors from the rubble on Thursday morning after the landslide struck a railway construction site in the early hours, but said the chances of finding more survivors were slim.

“In all there were about 81 people. The chances of survival of the remaining 55 people are very thin considering the fact that the landslide occurred around 2 a.m.,” Haulianlal Guite, district magistrate of Noney district in Manipur, where the accident occurred, told Reuters by telephone.
This month unprecedented rains have lashed India’s north-eastern states and neighboring Bangladesh, killing more than 150 people.
Millions have been displaced by the catastrophic floods in recent weeks, and in some low-lying areas houses have been submerged.
Army helicopters were on standby and assisting in rescue operations at the site of the landslide, a statement from the Indian army said.
“Army helicopters are on standby. The weather is very hostile and more landslides are hampering our rescue operations,” the statement said.



UK FM Truss: West must learn from Ukraine lessons and apply them to Taiwan

The West must learn from its mistakes in failing to deter Russia’s invasion of Ukraine and apply those lessons to “protect peace and stability in the Taiwan Strait,” British foreign minister Liz Truss said on Thursday, as Beijing protested.
Tensions between Taiwan and China, which claims the democratically ruled island as its own territory, have risen in recent years as China steps up military activities near Taiwan to pressure it to accept Chinese rule.

Truss said the West, and in particular countries in the Indo-Pacific region, had to make sure Taiwan was defended.
“We need to learn the lessons of Ukraine, which was that we could have ensured that Ukraine had the defensive capability earlier,” Truss told LBC radio.
“And that would have done more to deter [Russian President Vladimir] Putin from invading, so-called deterrence by denial, and that is a similar approach to the approach we need to take for other sovereign nations, including Taiwan.”
In Beijing, the foreign ministry said China had lodged an official complaint with Britain over Truss’ remarks on Taiwan.
“The lack of common sense and the arrogance of her remarks are surprising,” ministry spokesman Zhao Lijian told a regular briefing on Thursday. “We hope she will not make such irresponsible remarks in the future.”
At a NATO meeting in Spain the previous day, Truss had told a panel session that China was “extending its influence through economic coercion and building a capable military.”
She added, “There is a real risk that they draw the wrong idea, which results in a catastrophic miscalculation such as invading Taiwan.”
Asked to comment on Truss’ Wednesday remarks about Taiwan, Zhao reiterated China’s position that Taiwan is part of China, its internal affair and said no external force had a right to interfere.
On Thursday, Truss avoided questions about whether she was suggesting that Britain should arm Taiwan, saying only: “We also need to make sure that together, the free world are ensuring that Taiwan has the defense capability it needs.”
Britain should continue to build trade ties with China but avoid becoming strategically dependent on it, she added.
“Of course, we should continue to trade with China. But we need to be careful not to become strategically dependent on China.”
On Thursday, the spokesman, Zhao, responded that using ideology and small circles to artificially separate the world’s supply chains would not succeed.
Britain and at least six nations have been helping Taiwan in a secretive program to build submarines, a Reuters investigation found last year.
