
Thought Capturing through Influence Operations

October 20, 2020 | Expert Insights

As the dates of a cliff-hanger of an election draw closer in the U.S., there is a crescendo of voices claiming that inimical external forces are waging ‘influence operations’ on the American public consciousness in a bid to steer the minds of American voters towards a conclusion in their favour.

In an article dated October 13th, the prestigious Carnegie Endowment for International Peace claimed to have unravelled a Russian influence operation. The operation exploited free and outspoken elements of the media to plant seeds of fake news and disinformation, subtly nudging the 'fence-sitters' closer to a particular candidate.

The article alleges that more than 20 journalists around the world were tricked into joining a so-called non-profit organisation, Peace Data. Many were struggling to find jobs after the pandemic layoffs and were thus vulnerable to such traps, which offered a lucrative payoff (up to $250 per article) for writing along certain predictable theme lines.

This incident followed the removal by Facebook and Twitter of several accounts said to be managed by the Russian Internet Research Agency, the ‘troll farm’ that the Democrats allege was the prime culprit behind the Russian influence operations during the 2016 elections.

In a similar vein, Microsoft announced in September that it had detected efforts by China, Russia and Iran to influence voters in the American elections. Apparently, external players have preferred outcomes in such democratic exercises, outcomes that would further their own vectors of influence over that country's policies.

On October 8th, the Rand Corporation ran a piece describing in a fair amount of detail how a well-orchestrated influence campaign was being conducted on Twitter through trolls and super-connectors to spread fake news, panic, fear and ethnically divisive propaganda; in sum, an effort to derail the free and fair democratic process that is an election. While shying away from naming and shaming the perpetrator, the Rand Corporation drew a comparison with previous Russian attempts to split the voters.

Earlier this year, Indian newspapers ran a story originally researched by the Indian Express on how a Chinese IT firm was tracking over 10,000 influential Indians in politics, government, business, technology, media and civil society. This was done through backdoor tools embedded in Chinese-manufactured hardware used in the Internet of Things to collect the targets' metadata. Why such an extensive campaign was being conducted, and what its aims and objectives were, remains a matter of conjecture.

THE THREAT

The concept of information warfare is nothing new; it was practised with finesse by Alexander, who loved to exaggerate the size and power of his armies as he advanced through Asia Minor towards India, winning many cities without striking a single blow. From Sun Tzu to Clausewitz, all great strategists have propounded using information (and disinformation) to blind and outmanoeuvre the enemy: its military, its leaders and its people.

It is well known to strategists that perceptions can be made more convincing than reality itself, provided the originators craft narratives that are compelling enough. On the recipients, such perceptions can have a devastating effect.

During the Cold War, both sides perfected the deadly art of information operations. A prime example is the 1980s campaign run by the KGB blaming U.S. biological warfare experiments gone awry for the AIDS epidemic in Africa. The stories were first planted in an Indian newspaper, from where they spread like wildfire across the globe.

In 2020, China took a leaf out of the KGB manual by spreading a similar canard that the COVID-19 virus was the creation of an American biowarfare lab. In mid-March this year, the "Wolf Warrior" spokesperson of the Chinese Foreign Ministry, Zhao Lijian, shocked the world by claiming that the CIA had “smuggled” the coronavirus into Wuhan.

There has been a shift in how information is used as a weapon to further national objectives, in effect a weaponisation of information. This is what influence operations are all about, although a precise definition of the phenomenon is hard to pin down. The Rand Corporation describes them as “the collection of tactical information about an adversary as well as the dissemination of propaganda in pursuit of a competitive advantage over an opponent.”

The exponential mushrooming of communication networks, the emergence of advanced wired and wireless IT facilities and the ability to share information in real time have created a tool that both empowers mankind and serves as a deadly weapon in the hands of ruthless nations and rogue organisations. Today, influence operations can disrupt, corrupt or usurp a nation's ability to make and share decisions.

Influence operations offer many distinct advantages: social media enables the use of cyberspace to target an entire population, while complex and convoluted networks allow deniability at relatively low cost. If employed in a comprehensive and convincing manner, by harnessing the powers of a state, influence operations can degrade and confuse the thought process of the target population and deflect their intentions towards the manipulator's goals.

These operations can have a multitude of objectives: intruding into democratic election systems by influencing voters, inciting violence through inflammatory and doctored videos, paralysing decision-making processes in adversary countries, and so on. At the individual level, data can be manipulated to blackmail and force persons into illegal activities against their national interests.

Influence operations are played out by a variety of actors: advertisers, activists, gullible journalists or plain criminals out to make a quick buck. The aims can vary, from covert political operations that undermine the legitimacy and hold of liberal democracies to commercial interests that further their own businesses by maligning competitors.

India, with its fastest-growing social media market, is especially vulnerable to influence operations. With the overall lack of awareness, laws and processes to deal with rumours and fake news, the population is susceptible to manipulation. This can degrade rational thought and plant divisive emotions that split the population along caste, religious and political divides. The implications for national stability, well-being and security are extremely serious.

THE ANTIDOTE

This genie, let out by technology, can be defeated only by technology. Researchers urge social media platforms to weed out the intruders and disrupt their election-influencing efforts through a combination of network analysis and machine learning algorithms; a simplified illustration of the idea follows.
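As a purely illustrative sketch of the network-analysis half of that approach (not any platform's actual system), one could link accounts that post identical text within seconds of one another and flag unusually large coordinated clusters for human review; real pipelines would layer machine learning on top of such graphs. The data fields, thresholds and account names below are hypothetical, and the graph library used (networkx) is simply a convenient choice.

```python
# Minimal, hypothetical sketch: accounts that post identical content within
# a short time window are linked; unusually large clusters are flagged.
from collections import defaultdict
from itertools import combinations

import networkx as nx

# Assumed input format: (account_id, post_text, unix_timestamp).
posts = [
    ("acct_a", "Vote early, the system is rigged!", 1602830000),
    ("acct_b", "Vote early, the system is rigged!", 1602830004),
    ("acct_c", "Vote early, the system is rigged!", 1602830007),
    ("acct_d", "Lovely weather in Bengaluru today.", 1602830100),
]

WINDOW_SECONDS = 30    # how close in time two identical posts must be (illustrative)
MIN_CLUSTER_SIZE = 3   # clusters smaller than this are ignored (illustrative)

# Group posts by identical text.
by_text = defaultdict(list)
for account, text, ts in posts:
    by_text[text].append((account, ts))

# Link accounts that posted the same text within WINDOW_SECONDS of each other.
graph = nx.Graph()
for text, entries in by_text.items():
    for (a1, t1), (a2, t2) in combinations(entries, 2):
        if a1 != a2 and abs(t1 - t2) <= WINDOW_SECONDS:
            graph.add_edge(a1, a2)

# Connected components represent coordinated posting; large ones are suspect.
for component in nx.connected_components(graph):
    if len(component) >= MIN_CLUSTER_SIZE:
        print("Possible coordinated cluster:", sorted(component))
```

Running this toy example flags acct_a, acct_b and acct_c as a possible coordinated cluster, while the lone ordinary post is ignored; in practice the thresholds, features and follow-up review would be far more elaborate.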

The biggest door is opened to such interlopers by the public itself; therefore, the public has to be educated, first and foremost through the print and electronic media, about the efforts being made by adversary states to manipulate them.

Platforms like Twitter and Facebook have a major responsibility and have to publicise the efforts they are making to mitigate the influence operations by states. This would put the brakes on governments behind such operations as it would impose reputational costs on their political leaders. The platforms have to evolve policies to prevent the misuse of their outreach and close the existing loopholes in their architecture using technological tools.

Speaking at a Synergia Foundation Virtual Forum, Laura Bate, a senior director at the U.S. Cyberspace Solarium Commission Task Force, provided a reassuring view of the counter-strategy against efforts to disrupt the fairness of democratic elections. She said, “What is interesting this time around is that directors of organisations that handle cybersecurity and the FBI, among others, came out and said that the elections are safe and secure. What we need to do, however, is to be cognizant of the fact that there are attempts to shape the information space. There are attempts to influence, not the elections themselves, but voters, through the information put out on the internet. That’s another interesting question: how does one work with social media providers, with other private sector actors and the population at large to figure out how to protect the information space?”