Political disinformation: The new organized social media mafia
- November 2, 2022
Insight summary
Computational propaganda, which uses algorithms, automation, and Big Data to influence public life, is becoming the norm. When operated by political parties, disinformation campaigns become an organized attack on truth, freedom, and fundamental human rights. The long-term implications of this trend may include increased online harassment of journalists and deepening societal mistrust of media institutions.
Political disinformation context
Disinformation is false information spread deliberately to deceive. It should not be confused with misinformation, which is inaccurate information shared out of carelessness or ignorance rather than an intent to mislead. Disinformation campaigns have become the lifeblood of modern politics. From propaganda bots to deepfake videos to artificial intelligence (AI)-generated op-eds, political parties and organizations have used these campaigns to influence international politics, election results, and public policies.
In 2019, the University of Oxford found that social media manipulation campaigns occurred in 48 countries in 2018, up from 28 in 2017. In addition, authoritarian states increasingly regulate access to and content on social media platforms. In 26 countries, political disinformation is used to control citizens and serves three distinct purposes: suppressing human rights, discrediting political opponents, and silencing critics.
One increasingly common technique in political disinformation is the establishment of cyber troops, groups of government or political party affiliates who use the Internet to manipulate public opinion. Their methods include:
- Using bots to amplify hate speech,
- Scraping data from sites,
- Micro-targeting specific groups, and
- Unleashing an army of “patriotic” trolls to harass journalists and opposing voices online.
A defining characteristic of manipulation campaigns is collaboration among different stakeholders. For example, cyber troops frequently partner with private firms, civil society organizations, Internet subcultures, youth groups, hacker collectives, fringe movements, social media influencers, and volunteers who believe in their mission. These partnerships make political disinformation especially effective because they allow campaigns to reach precisely targeted demographics.
Disruptive impact
In 2020, a document leak from the disbanded data company Cambridge Analytica revealed how political firms, actors, and organizations had worked with the firm on disinformation campaigns during elections. Over 100,000 documents were released detailing wide-scale voter manipulation tactics in 68 countries. The files came from the company’s former Director of Program Development, Brittany Kaiser, who became a whistleblower.
Kaiser said these documents indicate that electoral systems are open to abuse and fraud. Similarly, Christopher Steele, the former head of the Russia desk of the UK’s Secret Intelligence Service (MI6), said that the lack of punishment and regulation has only emboldened disinformation actors, making it more likely that they will interfere in future elections and policymaking.
Among social media platforms, Facebook remains the site most used for political disinformation campaigns because of its vast reach and market size, communication features, group pages, and follower functions. This popularity is why Cambridge Analytica illegally harvested profile data from the site. However, according to the University of Oxford research, other apps are rising in popularity.
Since 2018, there has been an increase in cyber troop activity on image and video-sharing sites such as Instagram and YouTube. Cyber troops are also operating campaigns on the encrypted messaging platform WhatsApp. These platforms are expected to become increasingly significant as more people use social networking technologies for political expression and news.
Implications of political disinformation
Wider implications of political disinformation may include:
- Cyber troops targeting more journalists and traditional media outlets whenever high-profile political cases arise, including by creating deepfake content and unleashing bots in comment sections.
- AI being used to flood the Internet with disinformation and misinformation designed to distract, polarize, and confuse online readers.
- Disinformation-as-a-service becoming a key market as more political actors hire hackers and content creators to disseminate propaganda.
- More universities and schools collaborating to teach young people to identify disinformation through skills such as content analysis and source verification.
- Entire societies becoming increasingly disaffected, distrustful, apathetic, and disoriented by a lack of clarity about what is factual vs. what is fake. Such populations may become easier to influence and control.
- Regulatory bodies increasing scrutiny and control over social media platforms, leading to stricter content moderation policies and potential changes in digital freedom of speech.
- Public demand for verifiable, transparent news sources growing, driving the emergence of new, credibility-focused media platforms.
- Political campaigns shifting strategies to include counter-disinformation units, focusing on rapid response and fact-checking to mitigate the impact of false narratives.
Questions to consider
- How has your country been affected by disinformation campaigns?
- How do you think this political tactic will evolve further?