Don’t say ‘Taliban’: Facebook suppresses Afghan activists and artists leading the resistance

Jalal Nazari was a Wall Street Journal reporter in Afghanistan until it fell to the Taliban. He is currently a Fellow in Global Journalism at the Dalla Lana School of Public Health at the University of Toronto. He is awaiting transit documents in Kyiv, Ukraine.

Facebook’s anti-Taliban rules have swept up dozens of opponents of the new regime, blocking their accounts simply because they use the word “Taliban.”

The social network is Afghanistan’s most popular (internet-service bundles that include free Facebook membership start at just $30 a month), and activists have been using the platform to organize civil protests through its private Messenger groups.

But in the past few weeks, many have had their accounts restricted by Facebook after sharing anti-Taliban pictures, videos and texts, apparently as a result of the company’s efforts to block expressions of support for the group that regained power in Afghanistan last month.

“The Facebook algorithm restricted my account for three days after I shared an anti-Taliban performance on my Facebook wall,” said Ali Anis, 29, an Afghan photographer based in France.

And simply mentioning the regime’s name appears to trigger suspensions.

“When I shared a text containing the word ‘Taliban’ on my wall, Facebook sent me a notification saying my account has been suspended for 72 hours because I didn’t follow the Facebook Community Standards,” said Aman Farahmand, a 28-year-old Kabul teacher, in a phone interview.

This photo out of Afghanistan, showing Taliban forces in contrast to a slogan on a wall, circulated on WhatsApp and got the user who shared it suspended.

Technology firms have faced new challenges since the Taliban’s rapid takeover of Afghanistan over how to deal with content related to the group. A Facebook spokesman told BBC News that the company has a dedicated team of Afghanistan experts, native Dari and Pashto speakers with knowledge of the local context, who help identify and flag emerging issues on the platform.

Nonetheless, Afghan users claim they have been blocked after sharing condemnations of the regime, and they blame Facebook. “I have no idea why Facebook suspended my account for three days. I had shared a video of female protesters in Kabul,” said Jafar Rahimi, a 30-year-old photojournalist now based in Canada. “The motto protesters were chanting was ‘No More Taliban.’

“A few moments later, Facebook sent me a notification saying my video didn’t follow their Community Standards. What was wrong with it?”

As a result, avoiding “Facebook jail” has become a dominant topic among Afghans on social media. Some are avoiding the word entirely, while others use alternate spellings to dodge the algorithm.

One of those is Yama Farhad, 28, part of a graffiti team in Kabul dubbed Art Lords, whose account was suspended after he shared a photo of two Taliban members standing in front of one of the team’s murals (which reads, in English, “There can be no peace without women”) with a short caption condemning the Taliban. He now uses alternate spellings of “Taliban” whenever he posts about the group.

Facebook, which did not reply to questions about its Afghan policies, said in August that it would continue to block pro-Taliban messages on its site.

Further frustrating Afghan opponents of the new regime, Facebook’s rules appear to be applied unevenly. Users in Afghanistan and Pakistan appear to be blocked while others are not: for example, Pakistan-based journalist Adnan Rehmat was temporarily suspended by Facebook for sharing the post of a U.S.-based journalist who faced no such consequences.

Facebook has been a critical tool for Afghans to raise their voices, organize protests and share personal experiences of the violence they encounter. Now, under the Taliban government, many journalists and human-rights activists based in the country have either deactivated their accounts or stopped using them to write about the Taliban.

“Due to shameful restrictions by Facebook, some users write the word ‘Taliban’ with different spelling order. I also wanted to try, but the only word matching their wickedness is the word ‘Taliban,’” wrote Afghan poet Mosawer Zadfar on his page on Sept. 10.

Facing such consequences, many Afghan users have stopped posting about the Taliban’s violence on the ground, even though Facebook has long been their main tool for showing human-rights violations to the world. “I personally stopped writing about the Taliban due to the fear of getting blocked again. Writing the word ‘Taliban’ with different spelling order makes it very hard to access today’s information in the future,” said an Afghan female activist who asked not to be named for safety reasons.

Content moderation has always been challenging for social media companies, particularly for Facebook, which reportedly has 2.6 billion monthly active users. Distinguishing genuine harassment from friendly banter, identifying harmful images and videos among the tens of millions uploaded every day, and differentiating authentic political messages from professional trolling operations, all while accounting for different languages and cultural and social norms, is enormously difficult, especially when the system depends on artificial intelligence and algorithms to flag content.

“I was missing my nephews and shared their photos on my story. Facebook identified it as child abuse and suspended my account for three days,” said Hassan Naser, 23. “I used to always share their photos. I have no idea why Facebook detected them as such this time.”

Such problems appear to occur far more often in conflict zones than elsewhere. According to a report by The Washington Post, Facebook wrongly blocked or restricted millions of mostly pro-Palestinian posts and accounts related to the Israeli-Palestinian crisis earlier this year; the situation erupted into a full-blown public-relations and internal crisis for Facebook, and CEO Mark Zuckerberg had to dispatch the company’s top policy executive, Nick Clegg, to meet with Israeli and Palestinian leadership.

The New York Times has also reported on the ambiguity of Facebook’s content-moderation policies during Myanmar’s protests in February 2021, when the company was blamed for not removing Burmese military accounts despite the spread of violence. “Donald Trump was kicked off Facebook for inciting violence and an attempted coup, but the Burmese military are allowed to stay on Facebook despite committing genocide and holding a coup,” wrote Mark Farmaner, director of the advocacy group Burma Campaign UK, in a statement on Feb. 16.

Jillian York, a director at the Electronic Frontier Foundation, an advocacy group that opposes government surveillance, has researched tech-company practices in the Middle East and said she doesn’t believe that content moderation, human or algorithmic, can work at scale. “Ultimately, what we’re seeing here is existing offline repression and inequality being replicated online, and Palestinians are left out of the policy conversation,” York said.

According to a report published by The Start Up last month, one of the reasons Facebook jail was created was the controversy around the 2016 U.S. presidential election, when the American government openly criticized Zuckerberg for not policing content aggressively enough.

Facebook jail can be frustrating and unfair in some cases, but the company has said its intention is to protect the safety, privacy and authenticity of all users. “While we’re transparent about our policies, we understand that people can still be frustrated by our decisions, which is why we’re committing to doing more,” said Facebook spokeswoman Emily Cain in a written statement reported by The Wall Street Journal.

According to a report published last year by New York University’s Stern Center for Business and Human Rights, Facebook has 15,000 content moderators and 60 fact-checking partners worldwide who work in more than 50 languages. Zuckerberg admitted in a white paper that moderators “make mistakes in more than one out of every 10 cases”; applied to the roughly three million posts moderators review each day, that works out to some 300,000 mistakes every day. The report says Facebook should double its number of moderators and add fact-checking partners to improve the quality of content review.

Experts believe that content moderation is a central function of the social-media business and shouldn’t be treated as an afterthought by Facebook’s leaders. “Some errors have deadly effects. For example, members of Myanmar’s military used Facebook to incite genocide against the mostly Muslim Rohingya minority in 2016 and 2017,” says Stern Center deputy director Paul M. Barrett.
