The digital age began in 1948. That year, two seismic events took place, both at Bell Labs. The first was the invention of the transistor, which forms the basis of our present digital platforms. The second was Claude E. Shannon's ground-breaking paper, "A Mathematical Theory of Communication", which provided a launch-pad for information theory.
The basic idea was that information is not a function of content but of the resolution of ambiguity, and that it can be broken down into a single unit: a choice between two alternatives. It is like a coin toss, which carries no information while it is in the air but becomes certain when it lands; information arises when ambiguity disappears. When Gutenberg invented his printing press, he ushered in the Renaissance by accelerating the spread of information. It is commonly said that information is power, yet we can have all the information in the world and, if we do nothing with it, it is useless.
Wars of the future will use computational propaganda and advanced digital deception to distort the enemy's perception of reality and manipulate public opinion. This type of information warfare has five key phases: Reconnaissance, Weaponization, Attack, Infection, and Destruction.
Reconnaissance: An adversary scrapes and steals a target's metadata to build a psychological profile and identify the target's vulnerabilities.
Weaponization: Editing software enabled by artificial intelligence (AI) is used to generate malicious fake news and fake audio content.
Attack: Bot armies strategically pump deceptive content into online information systems. Machine-learning-enabled bots then feed the content to the people most likely to share it.
Infection: Social media feeds enable widespread sharing and viewing of the deceptive content.
Destruction: Disinformation runs rampant online, eroding society's trust in institutions and leading to chaos, confusion, and rebellion.
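The targeting step of the Attack phase can be sketched as a simple propensity model that ranks users by how likely they are to reshare a piece of content. Everything below is a hypothetical illustration: the feature names, weights, and user data are invented, and a real system would learn the coefficients from engagement data (for instance with logistic regression). Understanding this mechanism is useful to anyone trying to counter it.

```python
import numpy as np

# Hypothetical per-user features (all values invented for illustration):
# [past reshare rate, topic affinity, log follower count]
users = np.array([
    [0.80, 0.9, 3.2],  # user 0: frequent resharer, on-topic
    [0.10, 0.2, 5.0],  # user 1: large audience, rarely reshares
    [0.60, 0.7, 2.1],  # user 2: moderately engaged
    [0.05, 0.1, 1.0],  # user 3: mostly inactive
])
weights = np.array([2.0, 1.5, 0.3])  # assumed learned coefficients
bias = -2.0

def share_probability(x):
    """Logistic model: P(reshare) = sigmoid(w . x + b)."""
    return 1.0 / (1.0 + np.exp(-(x @ weights + bias)))

scores = share_probability(users)
# A bot network would then seed the content with the top-ranked users.
top2 = np.argsort(scores)[::-1][:2]
print(sorted(top2.tolist()))  # -> [0, 2]: the two most likely sharers
```

With these invented numbers, the engaged, on-topic users (0 and 2) outrank the large but passive account (user 1), which is exactly the dynamic the Attack phase exploits: reach matters less than the propensity to amplify.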
Information wars can be summed up in a century-old word, "provokatsiya", the Russian word for provocation. Provokatsiya describes the act of staging cloak-and-dagger deceptions to discredit, disarm, and confuse an opponent. It is said to have been practiced by Russian spies since the Tsarist era, and it continues today through technology-enabled tools.
Diplomacy can be manipulated by grafting audio clips onto realistic lip-synched video. AI can be used to create fake digital content through Generative Adversarial Networks (GANs), a class of models trained via unsupervised machine learning. In a GAN, two neural networks are pitted against each other: a generator produces synthetic content while a discriminator tries to distinguish it from real data, and the competition pushes the generator's output toward realism. Such networks also make it easier to fake audio, since they can convert an audio source into statistical properties that can be rearranged into "original" fake audio clips. Stanford researchers have also published results indicating that it is possible to alter pre-recorded faces in real time to mimic another person's expressions.
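The adversarial game behind a GAN can be shown in a toy form. The sketch below is a minimal, assumption-laden illustration: the "real" data is a one-dimensional Gaussian rather than audio or images, and both the generator and the discriminator are single linear units with hand-derived gradients. It captures only the core dynamic, the generator learning to imitate the real distribution because the discriminator punishes anything that looks fake.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def real_batch(n):
    # "Real" data: samples from a Gaussian with mean 4
    return rng.normal(4.0, 1.0, n)

# Generator: maps noise z ~ N(0, 1) to g_w * z + g_b
g_w, g_b = 1.0, 0.0
# Discriminator: D(x) = sigmoid(d_w * x + d_b), probability that x is real
d_w, d_b = 0.1, 0.0

lr, n = 0.01, 64
for step in range(2000):
    # --- discriminator update: push D(real) toward 1, D(fake) toward 0 ---
    x_real = real_batch(n)
    x_fake = g_w * rng.normal(0.0, 1.0, n) + g_b
    p_real = sigmoid(d_w * x_real + d_b)
    p_fake = sigmoid(d_w * x_fake + d_b)
    # gradient ascent on log D(real) + log(1 - D(fake))
    d_w += lr * np.mean((1 - p_real) * x_real - p_fake * x_fake)
    d_b += lr * np.mean((1 - p_real) - p_fake)

    # --- generator update: push D(fake) toward 1 (fool the discriminator) ---
    z = rng.normal(0.0, 1.0, n)
    p_fake = sigmoid(d_w * (g_w * z + g_b) + d_b)
    # the gradient flows through the discriminator into the generator
    g_w += lr * np.mean((1 - p_fake) * d_w * z)
    g_b += lr * np.mean((1 - p_fake) * d_w)

fakes = g_w * rng.normal(0.0, 1.0, 10_000) + g_b
# The generator's output mean should drift from 0 toward the real mean of 4.
```

A real deepfake pipeline replaces the linear units with deep convolutional or recurrent networks and the Gaussian with pixel or waveform data, but the training loop follows the same two alternating updates.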
Memes and social life have finally become weaponized, and many governments seem ill-equipped to understand and deal with the new reality of information warfare. DARPA, the same defense agency that helped spawn the internet, has recommended the establishment of a meme control center.
It is our assessment that, strategically speaking, information can be both an asset and a liability. We are witnessing technological developments that facilitate the manipulation of information in unprecedented ways.
We feel that governments need to recognize this and take precautionary steps to protect sensitive information that is vulnerable to manipulation by adversaries. Protecting and preserving information is as important as gathering and producing it.
We think that counter-AI measures are the need of the hour in information security. If AI-enabled disinformation cannot be contained, its effects on society can be disastrous, for government and for democracy itself.
We believe that, as a source of information, social media possesses a high level of strategic influence. What we have witnessed so far is only the tip of the iceberg of social-media-based misinformation. Adversaries have yet to exploit the full potential of social media weaponization; once they do, governments must be prepared with the infrastructure and know-how to pre-emptively counter any attempt to undermine information security. Since misinformation is almost impossible to contain once it has proliferated across the internet, strategic prevention is certainly better than cure.