Techniques
Shadow Ban : https://en.wikipedia.org/wiki/Shadow_banning
Shadow banning is a practice on social media where a user’s content is made invisible or less visible to others without their knowledge. It’s used to curb spam or inappropriate behavior, reducing post reach without deleting the account. Platforms like Twitter, Instagram, or YouTube are said to have used it, though it’s often unofficial and controversial. Users may suspect a shadow ban if they notice a sudden drop in visibility or engagement. The technique raises debates about transparency, censorship, and online free speech.
Nudge : https://en.wikipedia.org/wiki/Nudge_theory
Nudge theory involves subtly influencing people’s behavior without restricting choices, often for their benefit. It uses small cues or design changes, like placing healthier food at eye level, to guide decisions. Popularized in behavioral economics, it’s applied in policy and marketing. Critics argue it can feel manipulative if not transparent.
Character Assassination : https://en.wikipedia.org/wiki/Character_assassination
Character assassination is the deliberate spread of false or damaging information to ruin someone’s reputation. It often involves slander, rumors, or exaggerated claims, targeting public figures or rivals. Historically used in politics, it aims to discredit without evidence. The damage can be long-lasting and hard to refute.
Sock Puppet : https://en.wikipedia.org/wiki/Sock_puppet_account
A sock puppet is a fake online identity used to deceive others, often to push agendas or fake support. Common on forums or social media, it amplifies opinions or attacks opponents anonymously. Platforms try to detect them, but they persist. It undermines trust in online discourse.
Bad Jacketing : https://en.wikipedia.org/wiki/Bad-jacketing
Bad jacketing is falsely labeling someone as an informant or traitor within a group to sow distrust. Used by authorities or rivals, it isolates targets and disrupts unity, especially in activist circles. Historically linked to COINTELPRO, it’s a covert betrayal tactic. Suspicion alone can ruin relationships.
Kompromat : https://en.wikipedia.org/wiki/Kompromat
Kompromat is compromising material, like scandals or secrets, collected to blackmail or discredit someone. Often used in espionage or politics, it leverages fear of exposure to control targets. Russia’s intelligence is famously associated with it. It’s effective because it exploits personal vulnerabilities.
Cherry Picking : https://en.wikipedia.org/wiki/Cherry_picking
Cherry picking is selectively presenting data or evidence to support a claim while ignoring contradictions. Common in debates or media, it distorts reality for persuasion. Scientists and propagandists alike use it. It misleads by crafting a one-sided narrative.
Emotional Hijacking : https://www.ei-magazine.com/post/what-is-emotional-hijacking-and-how-can-you-prevent-it
Emotional hijacking triggers an intense emotional response (fear, anger, joy) to bypass rational thinking. Once emotionally charged, a person is more likely to accept a message uncritically. Example: pairing propaganda with shocking imagery.
Ignorant Agent : https://disarmframework.herokuapp.com/technique/7/view
An ignorant agent is someone unknowingly spreading disinformation, manipulated by others. They act sincerely, making their message convincing, often on social media. Part of disinformation campaigns, it exploits trust. Their ignorance shields the true orchestrators.
Clickbait : https://en.wikipedia.org/wiki/Clickbait
Clickbait uses sensational or misleading headlines to lure users into clicking content. It prioritizes engagement over substance, common in online media. Titles like “You Won’t Believe This!” drive traffic but often disappoint. It thrives on curiosity and frustration.
Keyword Squatting : https://mediamanipulation.org/definitions/keyword-squatting/
Keyword squatting involves hijacking trending terms to spread unrelated or false content. Used in disinformation, it floods search results or hashtags with noise. Think propaganda piggybacking on breaking news. It confuses and misdirects public attention.
Swarming : https://disarmframework.herokuapp.com/technique/49/view
Swarming is coordinated online attacks by many accounts to overwhelm or harass a target. Often seen in trolling or disinformation, it amplifies impact through sheer volume. Think Twitter pile-ons. It intimidates and drowns out opposition.
Fake Experts : https://disarmframework.herokuapp.com/technique/5/view
Fake experts are unqualified individuals presented as authorities to push a narrative. Common in disinformation, they deceive with credentials or charisma. Think anti-vaccine “doctors.” It exploits trust in expertise to mislead.
Contradictory Injunction : https://en.wikipedia.org/wiki/Double_bind
Contradictory injunction, or double bind, is giving conflicting demands where no response wins. Used in manipulation, it traps victims in confusion or guilt, like “be spontaneous” orders. Psychologically draining, it’s a control tactic.
Doxing : https://en.wikipedia.org/wiki/Doxing
Doxing is publicly revealing private information, like addresses, to harm or intimidate someone. Often a revenge tactic online, it violates privacy and invites harassment. Think hackers targeting critics. It’s a weapon of exposure.
Cyberbullying : https://disarmframework.herokuapp.com/technique/193/view
Cyberbullying is repeated online harassment to humiliate or distress a target. Using insults, threats, or rumors, it thrives on anonymity. Social media amplifies its reach. It causes real emotional harm.
Seed Distortions : https://disarmframework.herokuapp.com/technique/35/view
Seed distortions plant small lies or half-truths to grow broader misinformation. Subtle and strategic, they spread naturally via shares. Think rumors taking root online. It’s insidious because it feels organic.
Bait Influencer : https://en.wikipedia.org/wiki/Rage-baiting
Bait influencers provoke outrage or reactions to boost engagement. They post inflammatory content, like rage-bait, to go viral. Think divisive TikTok rants. It exploits emotions for attention.
Online Polls : https://en.wikipedia.org/wiki/Open-access_poll
Online polls are informal surveys open to manipulation, skewing public perception. Unregulated and unscientific, they’re gamed by bots or brigades. Think Twitter polls swaying opinion. They mimic legitimacy but lack rigor.
Echo Chamber : https://en.wikipedia.org/wiki/Echo_chamber_(media)
An echo chamber is an online space where ideas are reinforced without challenge. Algorithms and groupthink amplify biases, like on partisan forums. It isolates users from dissent. Reality becomes distorted.
Copypasta : https://en.wikipedia.org/wiki/Copypasta
Copypasta is repetitive, copied text spammed online, often for humor or disruption. Think meme walls flooding comments. It annoys or derails discussions. It’s low-effort noise.
Scarcity Manipulation : https://uxdesign.cc/5-types-of-scarcity-how-to-influence-anyone-using-these-7f309d328dbb
This technique uses the fear of missing out (on time, resources, or opportunities) to prompt quick action without reflection. For example, spreading a rumor that critical information will soon be censored pushes people to share it immediately.
Motivate Mediocrity : https://www.tanbou.com/2022/Noam-Chomsky-10-strategies-manipulation.htm
Motivating mediocrity encourages apathy or low standards to keep people docile. Linked to manipulation theories, it praises conformity over ambition. Think “don’t question, just follow.” It stifles progress.
Develop Deep/Cheap Fakes : https://datasociety.net/library/deepfakes-and-cheap-fakes/
Deep and cheap fakes are manipulated media, like AI-altered videos, to deceive. Deep fakes are sophisticated; cheap ones are quick edits. Think fake politician speeches. They erode trust in reality.
Firehose of Falsehood : https://en.wikipedia.org/wiki/Firehose_of_falsehood
Firehose of falsehood is rapid, high-volume lies to overwhelm truth. Used in propaganda, it floods discourse with contradictions. Think Russian disinformation tactics. Volume trumps accuracy.
Dismiss / Distract / Distort / Dismay : https://fromthepenof.com/red-flag-professional-behaviour/discrediting
The 4 Ds discredit foes by dismissing, distracting, distorting, or dismaying. Think smear campaigns dodging facts. It’s a systematic attack on credibility. Opponents are silenced.
Gaslighting : https://en.wikipedia.org/wiki/Gaslighting
Gaslighting manipulates someone into doubting their reality or sanity. Using denial or lies, like “that never happened,” it confuses victims. Think abusive relationships online. It’s psychological sabotage.
Illusory Truth Effect : https://en.wikipedia.org/wiki/Illusory_truth_effect
The illusory truth effect makes repeated lies seem true. Familiarity breeds belief, even if false. Think propaganda slogans. It exploits cognitive bias.
Microtarget : https://www.merriam-webster.com/dictionary/microtarget
Microtargeting tailors ads or messages to specific individuals using data. Used in campaigns, it’s precise persuasion. Think Facebook ads hitting voters. It’s powerful but raises serious privacy concerns.
Maintaining Guilt and Ignorance : https://www.tanbou.com/2022/Noam-Chomsky-10-strategies-manipulation.htm
Maintaining guilt and ignorance keeps people submissive via shame and confusion. A manipulation tactic, it blames the masses. Think “you’re too dumb to know.” It kills critical thought.
Manipulate Platform Algo : https://github.com/DISARMFoundation/DISARMframeworks/blob/main/generated_pages/techniques/T0121.md
Manipulating platform algorithms games systems to boost content visibility. Using bots or trends, it cheats reach. Think SEO for disinformation. It hijacks attention.
Framing : https://en.wikipedia.org/wiki/Framing_(social_sciences)
Framing manipulates how information is presented to shape its interpretation. For instance, saying “90% success rate” instead of “10% failure rate” alters perception without changing the facts. It’s often used to make a narrative more favorable or alarming.
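The percentage example above can be sketched in a few lines of Python (a minimal illustration with hypothetical figures) to show that both frames describe the exact same statistic:

```python
# Framing demo: one underlying statistic, two complementary presentations.
successes, trials = 90, 100  # hypothetical figures

success_rate = successes / trials
failure_rate = 1 - success_rate

positive_frame = f"{success_rate:.0%} success rate"  # favorable frame
negative_frame = f"{failure_rate:.0%} failure rate"  # alarming frame

print(positive_frame)  # → 90% success rate
print(negative_frame)  # → 10% failure rate

# The data never changed; only the presentation did.
assert abs(success_rate + failure_rate - 1.0) < 1e-12
```

The two strings are mathematically equivalent, yet studies in behavioral economics consistently find that audiences judge the positively framed version more favorably.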
Bandwagon Effect : https://en.wikipedia.org/wiki/Bandwagon_effect
The bandwagon effect exploits the human tendency to follow the majority. By creating the illusion that an idea or belief is widely accepted (“everyone thinks this”), it pressures individuals to conform without questioning it.
Astroturfing : https://disarmframework.herokuapp.com/technique/145/view
Astroturfing is a manipulation technique that involves creating a false impression of popular support or grassroots opinion (the name plays on AstroTurf, an artificial grass). It uses fake accounts, bots, or paid individuals to simulate a spontaneous movement on social media or other platforms. This can amplify a cause, brand, or idea while concealing its orchestrated origin. For example, fake positive comments about a product can deceive consumers. This method exploits the perceived credibility of crowds to influence opinion.
Butterfly attacks : https://disarmframework.herokuapp.com/technique/134/view
Butterfly attacks are a disinformation tactic where multiple small attacks or distractions are launched simultaneously to overwhelm and destabilize a target. They aim to divert attention from a main event or exhaust the adversary’s resources by forcing them to respond on multiple fronts. This can include rumors, misleading posts, or coordinated minor provocations. This dispersion makes defense difficult and blurs the perception of truth. It’s a strategy of calculated chaos.