This article seeks to explore the concept of disinformation, its impact on current events, and how average citizens can be cognizant of, recognize, and learn to avoid its harmful effects. Specifically, we will discuss the growing trend of state-sponsored disinformation campaigns, their goals, and several recent examples of them.

This is an expansive and incredibly complex topic, and we hope neither to reduce, downplay, nor generalize any of the issues, themes, or sub-topics herein, nor to belabor points already known to the readership.

We’re already embattled

The New York Times recently reported, citing several officials briefed on recent U.S. intelligence, that Russian SVR intelligence operatives have increased their efforts to inflame racial tension in the U.S. in order to influence November’s presidential election.

Specifically, Russian disinformation would be used to incite violence by white supremacists and stoke anger among African Americans over institutional and societal racism. In short, Russian intelligence would focus on vulnerable populations and manipulate them in support of desired outcomes. Their weapon of choice: social media platforms such as Facebook and Twitter.

Russian active measures such as cyber-attacks, disinformation spread by the St. Petersburg-based Internet Research Agency (commonly known as the “troll farm”), and others epitomize Russian asymmetric warfare in the information environment.

Russian intelligence used such measures in 2016. An extensive and independently-verified investigation confirmed that Russia had indeed interfered in the U.S. presidential election in support of their preferred candidate, Donald Trump. In 2016, Russian operatives amplified racial tensions by “creating fake Black Lives Matter groups online and spreading disinformation to depress black voter turnout.”


While it remains difficult — if not impossible — to accurately assess the extent to which such efforts may or may not have influenced voter behavior, the more insidious issue is Russia’s devious corruption of the sacred integrity of U.S. democratic practices.

Presently, Russian intelligence is employing similar tactics to influence white supremacist groups amidst ongoing protests surrounding the death of George Floyd in Minnesota. Specifically, U.S. intelligence is aware of at least one neo-Nazi organization that has Russian ties and funding.

Disinformation is a snowball rolling downhill

When examining such complex, murky, but critical proceedings, how can one say what is truth? How does one recognize it?

Information has long been considered a mechanism through which states traditionally exert power. It is vital for informing people, expressing views, and protecting fundamental rights of free speech. Disinformation — the deliberate weaponization of false information to achieve a political objective — dangerously alters this paradigm.

Disinformation is difficult to identify, can only be stopped retroactively, and is enabled by natural technological advances in global connectivity. Once-innocent social media platforms have become vehicles capable of inciting violence, misinforming populations, and stoking unrest like wildfire, often harming unwitting consumers along the way.

The familiar trope of arguing with strangers in an article's online comments section is a light-hearted version of the divisiveness that naturally develops when humans attempt to communicate with complete strangers over the internet.

Worse still, powerful state actors have harnessed disinformation as a capability, and seek to use it to influence other nations. The information environment is a realm where traditional sovereignty is not recognized and is difficult to defend. Actors such as Russia, China, and Saudi Arabia have all capitalized on benefits reaped from coordinated influence operations through disinformation.

Cui bono?

The author is fond of discovering the so-called “root cause” of political violence and terrorism, as it is critical in properly addressing the valid grievances held by the affected population(s). Said colloquially, the issue is seldom the issue.

There are few that benefit from rioting or looting in American cities. Even fewer benefit from divisiveness, chaos, and disorder — except America’s adversaries. Such is the goal of Russian and Chinese disinformation campaigns in particular. The more we remain distracted by the terror, injustices, righteous anger, and wanton destruction of our otherwise generally prospering society, the less capable we are.


Now, we walk a fine line here between identifying and stopping disinformation and simply allowing citizens to rightfully express their justified grievances. Such is the difficulty at play. Events in Minneapolis and across the nation are tragic, emotionally charged, and compelling enough on their own. They bring to the forefront a serious, important, and much-needed conversation regarding racism, police brutality, and related issues. That conversation the United States owes its citizens. What is not owed is the agitation and amplification of extremes by foreign actors seeking to foment destruction, violence, and chaos.

Twitter takes a valuable and necessary stand

A major platform used for disinformation campaigns is Twitter. Indeed, recent social media analysis of Russian- and Chinese-connected accounts found that Russia and China are “flooding social media with content targeting the ongoing unrest and violence in the United States.” The analysis was derived from research by the Alliance for Securing Democracy, a project at the nonpartisan think tank The German Marshall Fund of the United States.

Sample data from the Hamilton 2.0 dashboard tracking Chinese and Russian government-connected Twitter activity.

Twitter disinformation is not limited to Russian and Chinese campaigns, however. An early May disinformation campaign attributed to Saudi Arabia spread rumors of an alleged coup attempt in Qatar. While the story eventually died down, the intangible effects remain — namely the impression that the Qatari state is weak and unstable, a fitting image according to Qatar’s longtime regional adversary, Saudi Arabia.

In response to the rising trend of disinformation, Twitter recently embraced a more aggressive posture to identify and suspend accounts suspected of spreading disinformation. Twitter recognizes the divisiveness inherent in disinformation and is acting on its policy to promote healthy discussion.

#DCBlackout and white nationalists pretending to be “Antifa”

As part of its more aggressive anti-disinformation posture, Twitter recently suspended hundreds of accounts associated with spreading false information about an alleged communications blackout in Washington, D.C.

According to research conducted by the Washington Post, the #dcblackout trend originated from an account with three followers. It became a nationwide trend with over half a million mentions before Twitter stepped in to remove it.

The Post reports an example of a typical tweet as saying, “no one from DC has been heard from since 1am. police had silencers on their rifles which do not need to be used with rubber bullets. All signals are jammed. The city is on blackout. The president is hiding in a bunker. What … is going on and where is everyone #dcblackout.”

An Anonymous-affiliated Twitter account correcting the #dcblackout disinformation trend.

Such tweets aim to foment additional violence and unrest by spreading chaos and capitalizing on already-present uncertainty and fear. The author himself received several requests from friends and relatives inquiring as to the validity of the blackout and the status of the nation’s capital.

While such reports could appear plausible initially, due diligence quickly disproves their validity. Indeed, cross-referencing the claims with other social media reports (say, using Snapchat mapping) easily dispels notions of jamming or a blackout. However, the damage has already been done.

Yet another egregious example of disinformation came on the heels of the #dcblackout trend when Twitter revealed that an account claiming to be an ANTIFA organization was actually linked to a white supremacist group. The account had been “pushing violent rhetoric related to ongoing protests” while pretending to originate from the militant leftist organization. A wolf in another wolf’s skin.

Disinformation Twitter account attributed to a white supremacist group, claiming to be ANTIFA. The tweet incites violence and chaos.

Russian subterfuge is alive and well

Lest the extent of Russian active measures be forgotten or misunderstood, it is useful to also highlight a 2017 example of Russian interference. In that poignant example, Russia was revealed to have purchased at least 3,000 Facebook ads for use in the 2016 presidential election.

These ads reportedly “conveyed the wide range of influence Russian-linked groups tried to enact on Americans,” and were used to organize two rallies in Houston, Texas — at the same time and place — across from an Islamic center. Based on the political, social, and other factors at play in that specific environment, such actions could easily stoke unrest or division.

Selection of ads purchased by Russian operatives to schedule competing protests at the same time and place in Texas in 2017.

It has been said that a common tactic for Russian “trolls” is to organize various rallies using Facebook and other social media platforms and then find an unwitting rally participant to take over and manage the event in-person on their behalf.

Such activities offer insight to the portfolio of Russian influence operations, and the wide extent to which they can manipulate preexisting tension, grievances, or vulnerabilities to achieve their desired effect. Russian disinformation methods are not relegated to anonymous, bot, or fake social media accounts, however.

Not unlike other state enterprises, the Russian intelligence organ also harnesses the power of public offices for information operations. It is not uncommon for state-backed news organizations (like the government-funded Russia Today) to also push “divisive racial narratives, including stories emphasizing allegations of police abuse in the United States, and highlighting racism against African Americans within the military.”

In one recent example, Russia Today editor-in-chief, Margarita Simonyan, posted “incredibly racist advice” directed towards Minnesota African Americans on a Telegram channel. Ever the detailed Russia watchers, online investigations firm Bellingcat identified the original Russian post and translated portions of it to highlight its inherent biases, racism, and unhelpful rhetoric. When confronted for accountability, RT staff was unable to defend their actions.

What to do about disinformation

Certainly, the disinformation phenomenon has reached critical mass. Americans appear to be becoming informed enough to recognize the extent of foreign influence operations conducted for, or at the behest of, adversary intelligence organizations seeking to undermine the nation and its values. This realization was evidenced most strongly in the response to Russia's 2016 election interference.

Unfortunately, the topic of disinformation is reaching the mainstream several years too late. While social media platforms such as Twitter are beginning to enact policies to combat disinformation, the ultimate onus lies on the information consumer. It is irresponsible to abandon one's personal duty to think critically and conduct a thorough analysis of available information when forming educated opinions.

Naturally, this is difficult in an information-saturated environment that values both speed and accuracy, an incredibly demanding combination. SOFREP may explore drafting additional recommendations on this topic at a later time, but the importance of analysis, corroboration, access to diverse sources of information, and critical thinking cannot be overstated. Much like diversifying an investment portfolio, consuming a diverse “information diet” is invaluable.

Thankfully, there are tools and methods available to information consumers. One readily-available method is the use of open-source research and verification to investigate dubious claims.

The author spoke with Bellingcat researcher Aric Toler on the matter, who shared that “digital research can be just as trustworthy and appealing to audiences as shoe leather journalism… if properly presented and contextualized.” Indeed, such a capability is accessible to anyone with an internet connection and basic research skills, as the rise of open source collectives has “launched the methodology to a new level of exposure and gained a degree of public trust.”

If nothing else, it is important to remember that our freedom to share information and — despite any differences we have — to cooperate on a wide variety of issues to reach common goals is exactly that which U.S. adversaries fear.

It is the author’s hope that instead of division and chaos, disinformation births a renewed societal emphasis on the benefits of rigorous analysis and critical thinking, while exercising God-given reason to resolve differences and celebrate the diversity which makes our system function.

“And ye shall know the truth, and the truth shall make you free.”

Thanks for listening.

Author’s notes: for more technical social media analysis, we recommend visiting the Hamilton 2.0 dashboard, a project of the aforementioned German Marshall Fund think tank. The dashboard offers analysis of the narratives and topics promoted by Russian and Chinese government officials across various media platforms.