
RISE OF DISINFORMATION

Disinformation has increasingly become a cybersecurity issue as large, coordinated disinformation campaigns compromise democratic elections and stir civil unrest worldwide. Additionally, the use of disinformation in attack delivery packages often means that victims of disinformation become victims of cyberattacks. Disinformation campaigns seek to exploit victims’ emotions, identities, and politics, as well as the societal divisions that affect them.

Threat actors who use disinformation as a tool range from independent individuals to large nation-state operations capable of manufacturing and delivering immense amounts of malicious information. These operations can employ hundreds of hands-on-keyboard human attackers or rely on large-scale bot networks. Their intentional disinformation is becoming exceedingly difficult for internet users to differentiate from genuine information, which means that victims of disinformation can themselves become spreaders of it.



This diagram outlines the key differences between misinformation, disinformation and malinformation.

It’s important to note that the key trait of disinformation is the deliberate fabrication or manipulation of information. Often, people share incorrect information believing it to be the truth, making them misinformed rather than intentionally malicious. Examples include claims such as “Someone has told me Covid is in location X” when it is not; people often share information like this believing it to be true, with the ultimate goal of helping or protecting others. Disinformation, on the other hand, is a knowing, intentional fabrication or manipulation of information, spread maliciously with the aim of self-propagating and replicating. This would include claims such as “Bill Gates made the vaccine as a form of population control”.

Although many disinformation claims seem outrageous on the surface, they may not seem that way to those they target. Disinformation often seeks to stir division by targeting minority groups and groups that have been historically wronged. For example, anti-vaccine campaigns in West Africa leaned heavily on framing the vaccine as a form of neo-imperialism, with references to past wrongs that Western countries and Western pharmaceutical companies have inflicted on the region.

Even those who can consistently identify attempts at disinformation on the wider web struggle to combat it. This is partly due to the complex relationship disinformation has with the people who continue to propagate it without understanding that it is untrue: disinformation turns users into both victims and perpetrators at once. Because of this unpleasant fact, it can be difficult and frustrating to help victims of disinformation, who often become increasingly absorbed by whatever disinformation they are consuming and, in turn, increasingly isolated from others.


A doctored image showing NZ-specific vaccine disinformation


Like any other cyberattack, disinformation campaigns rely on key techniques to deliver their ‘payload’ (a rough detection sketch follows the list):

  • Spree-Posting: Single accounts mass-posting a message or URL

  • Coordinated Link Sharing: Multiple profiles sharing the same URL at the same time

  • Coordinated ‘Copypasta’: Reposting identical text or memes and encouraging other users to copy and repost them too.

  • Repeated Rehashing: Accounts that regularly repost conspiracies but adapt them to new contexts.
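
To make these patterns more concrete, the rough Python sketch below shows how a defender might flag spree-posting and coordinated link sharing in a batch of post records. It is purely illustrative: the field names (author, url, timestamp), thresholds, and time window are assumptions for demonstration, not a reference to any real platform’s API or to specific tooling.

from collections import defaultdict
from datetime import datetime, timedelta

# Illustrative thresholds only; real monitoring would tune these against baseline behaviour.
SPREE_THRESHOLD = 20                         # same URL posted this many times by one account
COORDINATION_WINDOW = timedelta(minutes=10)  # how close together the shares must be
COORDINATION_MIN_ACCOUNTS = 15               # distinct accounts sharing a URL inside the window

def flag_spree_posting(posts):
    """Flag (account, URL) pairs where a single account mass-posts the same link."""
    counts = defaultdict(int)
    for post in posts:
        counts[(post["author"], post["url"])] += 1
    return {key for key, n in counts.items() if n >= SPREE_THRESHOLD}

def flag_coordinated_link_sharing(posts):
    """Flag URLs shared by many distinct accounts within a short time window."""
    by_url = defaultdict(list)
    for post in posts:
        by_url[post["url"]].append(post)
    flagged = set()
    for url, shares in by_url.items():
        shares.sort(key=lambda p: p["timestamp"])
        start = 0
        for end in range(len(shares)):
            # Slide the window forward so it spans at most COORDINATION_WINDOW.
            while shares[end]["timestamp"] - shares[start]["timestamp"] > COORDINATION_WINDOW:
                start += 1
            authors = {p["author"] for p in shares[start:end + 1]}
            if len(authors) >= COORDINATION_MIN_ACCOUNTS:
                flagged.add(url)
                break
    return flagged

# Toy usage: two accounts sharing one link is well under the thresholds, so nothing is flagged.
posts = [
    {"author": "acct_1", "url": "https://example.test/claim", "timestamp": datetime(2021, 9, 1, 12, 0)},
    {"author": "acct_2", "url": "https://example.test/claim", "timestamp": datetime(2021, 9, 1, 12, 3)},
]
print(flag_spree_posting(posts), flag_coordinated_link_sharing(posts))

In practice, platforms and researchers combine signals like these with account age, posting cadence, and network structure rather than relying on simple thresholds.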


These posts rely on consistent themes of heavy negative emotion, using words such as ‘terrifying’ or ‘horrific’ to generate emotional appeal. Another common theme is the use of ‘fake experts’: by fabricating an expert on a topic, threat actors lend false credibility to their own posts.
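
As a very rough illustration of how that emotive wording might be surfaced for review, the hypothetical Python snippet below counts emotionally loaded words in a post. The word list and scoring are deliberately simplistic assumptions; real analysis would use context-aware language models rather than keyword matching.

# Crude keyword heuristic, for illustration only.
LOADED_WORDS = {"terrifying", "horrific", "shocking", "outrageous", "catastrophic"}

def emotional_load(text: str) -> int:
    """Count how many distinct loaded words appear in the text (case-insensitive)."""
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return len(words & LOADED_WORDS)

print(emotional_load("The TERRIFYING truth they don't want you to see is horrific!"))  # prints 2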

Disinformation also poses a significant threat to organisations: as threat actors build an audience and become more proficient in their techniques, the potential consequences continually increase. These threat actors can manipulate fake news, social media, and search engines to spin the perception of an organisation and cause severe reputational and brand damage without ever having to compromise their target, leading to ongoing financial harm and a loss of trust from stakeholders and customers.

While it’s nigh impossible to stop ‘fake news’ on the internet, there are steps you can take to mitigate potential harm to your organisation. Being careful about what staff post on official accounts, and sourcing any potentially contentious claims your business makes, will make it more difficult for malicious actors to undermine trust in your business. Encouraging critical thinking and digital literacy in your staff not only makes them more resistant to disinformation and cyberattacks in their professional capacity, but also benefits them personally.

While pulling people out of the disinformation abyss remains difficult because of the isolation that often comes with it, we can take a proactive approach and ‘pre-bunk’ disinformation rather than attempting to debunk it after the fact. By anticipating the fears and confusion that may arise, we can get in before disinformation does. Amusingly, in much the same way a vaccine inoculates against a disease, a small, controlled exposure to disinformation, together with an explanation of the tactics and motives behind it, can help protect people against disinformation.



In the same way, and beyond a purely professional scope, InPhySec recognises our responsibility as individuals to combat disinformation by exposing ourselves to a diverse range of people and perspectives. Beyond protecting our clients, we all endeavour to protect those around us where we can. IT and cybersecurity skillsets naturally tend towards high digital literacy, low trust of internet sources, and an ability to spot hoax and obfuscation tactics and techniques. We believe those with these skills carry the mantle of breaking the ‘chain of disinformation’ by using tactics such as ‘pre-bunking’ and teaching scepticism.


BY JAMES COLLINS

