Conflict in Sudan: Google Search Brings Controversy to Light
Amid one of the world’s most dangerous ongoing conflicts, a Google search for Sudan’s armed forces surfaces a militia that has been sanctioned by U.S. authorities and is accused by the United Nations of grave human rights violations.
The company serving those results is Google, the technology giant that commands over 96% of Africa’s search engine market.
The Rapid Support Forces (RSF), a paramilitary organization, are under UN scrutiny for alleged crimes against humanity.
An outgrowth of the notorious Janjaweed, the RSF retains the same leadership and violent methods that characterized the Darfur genocide nearly two decades ago.
In April 2023, the militia ignited the current conflict, producing what the UN calls the world’s largest displacement crisis.
Despite the stakes, the RSF’s official website consistently sits at the top of Google search results for Sudan’s national forces.

Image caption: A search for “Sudan Armed Forces” yields the RSF’s website at the top of the results
Since the civil war commenced, the leaders of the RSF have been penalized by both the United States Treasury and the United Nations Security Council.
In light of these developments, questions about Google’s compliance with sanctions, its stance on armed groups, the risk of propaganda, and its criteria for de-indexing were put to the company.
Google’s response was brief: “We are guided by local law and court decisions when it comes to removing pages from our results. We comply with all valid legal removal requests.”
Why Google’s Algorithm Grants the RSF an Unmerited Advantage
A structural factor compounds the issue. The Sudan Armed Forces (SAF) have no official website; their communications currently flow through SUNA, the state news agency.
From a search engine optimization (SEO) perspective, this leaves the RSF’s domain as the only dedicated military web presence competing for queries about Sudan’s armed forces.
Google’s algorithm does not weigh geopolitical context or sanctions; it rewards authority and optimization.
With no authoritative competitor, the militia’s website wins the ranking battle by default.

Image caption: A different search query still places the RSF’s website near the top of the results
Indexing the militia’s website also appears to contradict the terrorist-content policy that YouTube, a Google subsidiary, sets out on its own support pages:
“We terminate any channel where we have a reasonable belief that the account holder is a member of a designated terrorist organization, such as a foreign terrorist organization (U.S.) or an entity specified by the United Nations.”
This policy seemingly necessitates precise actions that Google Search has not undertaken.
The Office of Foreign Assets Control (OFAC) of the U.S. Treasury, which has sanctioned the RSF, asserts that all transactions by U.S. entities involving the property of a blocked organization are forbidden absent specific licensing.
Whether indexing and serving traffic to a sanctioned entity’s website constitutes a “transaction” under OFAC regulations remains untested in court; however, Google has not faced charges of any sanctions breach.
Nonetheless, the question, a legitimate one, remains unresolved. The RSF currently meets both criteria: it is designated by the UN Security Council and sanctioned by the U.S. government.
This is not the first time Google has faced pressure to address harmful content without a legal injunction; it has acted in similar circumstances before.
Following the January 6 attack on the United States Capitol, the Washington Post reported that Google, alongside other tech giants, excised incendiary content and pledged to intensify efforts against harmful actors.
Such determinations were made based on internal risk assessments rather than “legal removal requests.”
Moreover, in 2017, Google’s general counsel declared, without prompting from any legal request, that “there should be no place for terrorist content on our services.”
The same post described machine-learning systems built to detect and remove ISIS-related content.
Google’s Elusive Parameters for Defining “Bad Actors”
In response to inquiries, Google referenced a blog post explaining the rationale behind content removal from search results.
A notable line states, “We’re consistently evolving our approach to protect against bad actors on the web and ensure Google continues to deliver high-quality, reliable information for all.”
Given the documented accusations of killings, sexual violence, and starvation attributed to the RSF by the UN and various human rights organizations, Google’s criteria for labeling a “bad actor” remain undisclosed.
This query is far from theoretical; Google has demonstrated its capacity to act unilaterally when it sees fit.

Andrew Kibe, a Kenyan content creator, had his YouTube account restricted without any formal complaint from the Communications Authority of Kenya, without a court order, and without governmental sanctions.
Kibe is a polarizing public figure, but he has not been accused of any act of violence. The RSF, by contrast, stands accused of massacring 6,000 people in a single location, El Fasher.
Google’s application of its own standards remains, at best, perplexing.
Source link: Techweez.com.