BLUE Countermeasures

ID Countermeasure Tactic Metatechnique Summary
C00006 Charge for social media TA01 M004
Friction
Include a paid-for privacy option, e.g. pay Facebook for an option of them not collecting your personal information. There are exa…
C00008 Create shared fact-checking database TA01 M006
Scoring
Share fact-checking resources (tips, responses, countermessages) across response groups.
C00009 Educate high profile influencers on best practices TA02 M001
Resilience
Find online influencers. Provide training in the mechanisms of disinformation, how to spot campaigns, and/or how to contribute to …
C00010 Enhanced privacy regulation for social media TA01 M004
Friction
Implement stronger privacy standards, to reduce the ability to microtarget community members.
C00011 Media literacy. Games to identify fake news TA02 M001
Resilience
Create and use games to show people the mechanics of disinformation, and how to counter them.
C00012 Platform regulation TA01 M007
Metatechnique
Empower existing regulators to govern social media. Also covers Destroy. Includes: Include the role of social media in the regulat…
C00013 Rating framework for news TA01 M006
Scoring
This is "strategic inoculation", raising the standards of what people expect in terms of evidence when consuming news. Example: j…
C00014 Real-time updates to fact-checking database TA06 M006
Scoring
Update fact-checking databases and resources in real time. Especially important for time-limited events like natural disasters.
C00016 Censorship TA01 M005
Removal
Alter and/or block the publication/dissemination of information controlled by disinformation creators. Not recommended.
C00017 Repair broken social connections TA01 M010
Countermessaging
For example, use a media campaign to promote in-group to out-group in-person communication / activities. Technique could be in te…
C00019 Reduce effect of division-enablers TA01 M003
Daylight
Includes promoting constructive communication by shaming division-enablers, and promoting playbooks to call out division-enablers.
C00021 Encourage in-person communication TA01 M001
Resilience
Encourage offline communication
C00022 Inoculate. Positive campaign to promote feeling of safety TA01 M001
Resilience
Used to counter ability-based and fear-based attacks.
C00024 Promote healthy narratives TA01 M001
Resilience
Includes promoting constructive narratives i.e. not polarising (e.g. pro-life, pro-choice, pro-USA). Includes promoting identity n…
C00026 Shore up democracy based messages TA01 M010
Countermessaging
Messages about e.g. peace, freedom. And make it sexy. Includes Deploy Information and Narrative-Building in Service of Statecraft:…
C00027 Create culture of civility TA01 M001
Resilience
This is passive. Includes promoting civility as an identity that people will defend.
C00028 Make information provenance available TA02 M011
Verification
Blockchain audit log and validation with collaborative decryption to post comments. Use blockchain technology to require collabora…
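The audit-log idea in C00028 can be illustrated with a minimal hash chain, in which each entry commits to the hash of its predecessor so that edits to earlier entries are detectable. This is a sketch of the general tamper-evidence technique only, not the specific collaborative-decryption system the entry references; all names are illustrative:

```python
import hashlib
import json

def append_entry(chain, payload):
    """Append a tamper-evident entry whose hash covers the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    chain.append({"payload": payload, "prev": prev_hash,
                  "hash": hashlib.sha256(body.encode("utf-8")).hexdigest()})
    return chain

def verify(chain):
    """Recompute every hash; any edit to an earlier entry breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        body = json.dumps({"payload": entry["payload"], "prev": prev_hash},
                          sort_keys=True)
        if entry["prev"] != prev_hash or \
           hashlib.sha256(body.encode("utf-8")).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True
```

Any modification to a stored payload invalidates every later hash, which is what makes the provenance record auditable.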
C00029 Create fake website to issue counter narrative and counter narrative through physical merchandise TA02 M002
Diversion
Create websites in disinformation voids - spaces where people are looking for known disinformation.
C00030 Develop a compelling counter narrative (truth based) TA02 M002
Diversion
C00031 Dilute the core narrative - create multiple permutations, target / amplify TA02 M009
Dilution
Create competing narratives. Includes "Facilitate State Propaganda", as diluting the narrative could have an effect on the pro-stat…
C00032 Hijack content and link to truth-based info TA06 M002
Diversion
Link to platform
C00034 Create more friction at account creation TA15 M004
Friction
Counters fake accounts.
C00036 Infiltrate the in-group to discredit leaders (divide) TA15 M013
Targeting
All of these would be highly affected by infiltration or false-claims of infiltration.
C00040 third party verification for people TA15 M011
Verification
counters fake experts
C00042 Address truth contained in narratives TA15 M010
Countermessaging
Focus on and boost truths in misinformation narratives, removing misinformation from them.
C00044 Keep people from posting to social media immediately TA15 M004
Friction
Platforms can introduce friction to slow down activities, force a small delay between posts, or replies to posts.
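The posting delay described in C00044 can be sketched as a per-user rate limiter. The 30-second window and function names below are illustrative assumptions, not values from any platform:

```python
import time

# Illustrative minimum delay between posts from the same account.
MIN_POST_INTERVAL = 30  # seconds

last_post_time = {}  # user_id -> timestamp of their last accepted post

def try_post(user_id, now=None):
    """Return True if the post is accepted, False if it arrives too soon."""
    now = time.monotonic() if now is None else now
    last = last_post_time.get(user_id)
    if last is not None and now - last < MIN_POST_INTERVAL:
        return False  # too soon: reject or queue to introduce friction
    last_post_time[user_id] = now
    return True
```

The same pattern applies to replies and forwards; a real platform would also persist the timestamps and surface a "please wait" message rather than silently dropping the post.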
C00046 Marginalise and discredit extremist groups TA15 M013
Targeting
Reduce the credibility of extremist groups posting misinformation.
C00047 Honeypot with coordinated inauthentics TA15 M008
Data Pollution
Flood disinformation spaces with obviously fake content, to dilute core misinformation narratives in them.
C00048 Name and Shame Influencers TA15 M003
Daylight
Think about the different levels: individual vs state-sponsored account. Includes “call them out” and “name and shame”. Identify s…
C00051 Counter social engineering training TA15 M001
Resilience
Includes anti-elicitation training, phishing prevention education.
C00052 Infiltrate platforms TA15 M013
Targeting
Detect and degrade
C00053 Delete old accounts / Remove unused social media accounts TA15 M012
Cleaning
remove or remove access to (e.g. stop the ability to update) old social media accounts, to reduce the pool of accounts available f…
C00056 Encourage people to leave social media TA15 M004
Friction
Encourage people to leave social media. We don't expect this to work.
C00058 Report crowdfunder as violator TA15 M005
Removal
Counters crowdfunding. Includes "Expose online funding as fake".
C00059 Verification of project before posting fund requests TA15 M011
Verification
third-party verification of projects posting funding campaigns before those campaigns can be posted.
C00060 Legal action against for-profit engagement factories TA02 M013
Targeting
Take legal action against for-profit "factories" creating misinformation.
C00062 Free open library sources worldwide TA15 M010
Countermessaging
Open-source libraries could be created that aid in some way for each technique. Even for Strategic Planning, some open-source fram…
C00065 Reduce political targeting TA05 M005
Removal
Includes “ban political micro targeting” and “ban political ads”
C00066 Co-opt a hashtag and drown it out (hijack it back) TA05 M009
Dilution
Flood a disinformation-related hashtag with other content.
C00067 Denigrate the recipient/ project (of online funding) TA15 M013
Targeting
Reduce the credibility of groups behind misinformation-linked funding campaigns.
C00070 Block access to disinformation resources TA02 M005
Removal
Resources = accounts, channels, etc. Block access to platform. DDoS an attacker. TA02*: DDoS at the critical time, to deny an adver…
C00071 Block source of pollution TA06 M005
Removal
Block websites, accounts, groups etc connected to misinformation and other information pollution.
C00072 Remove non-relevant content from special interest groups - not recommended TA06 M005
Removal
Check special-interest groups (e.g. medical, knitting) for unrelated and misinformation-linked content, and remove it.
C00073 Inoculate populations through media literacy training TA01 M001
Resilience
Use training to build the resilience of at-risk populations. Educate on how to handle info pollution. Push out targeted education …
C00074 Identify and delete or rate limit identical content TA06 M012
Cleaning
C00000
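The identical-content detection named in C00074 can be sketched by hashing normalised post text and grouping duplicates; accounts or posts in large groups could then be deleted or rate limited. The normalisation rules (lowercasing, whitespace collapsing) and the group-size threshold are illustrative assumptions:

```python
import hashlib
from collections import defaultdict

def fingerprint(text):
    """Hash of normalised text: lowercased, whitespace collapsed."""
    normalised = " ".join(text.lower().split())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

def find_identical_groups(posts, threshold=3):
    """Group (post_id, text) pairs by content hash; return groups at or above threshold."""
    groups = defaultdict(list)
    for post_id, text in posts:
        groups[fingerprint(text)].append(post_id)
    return [ids for ids in groups.values() if len(ids) >= threshold]
```

Exact-hash matching only catches verbatim copies; near-duplicate detection would need fuzzier fingerprints such as shingling or MinHash.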
C00075 normalise language TA06 M010
Countermessaging
normalise the language around disinformation and misinformation; give people the words for artefact and effect types.
C00076 Prohibit images in political discourse channels TA06 M005
Removal
Make political discussion channels text-only.
C00077 Active defence: run TA15 "develop people” - not recommended TA15 M013
Targeting
Develop networks of communities and influencers around counter-misinformation. Match them to misinformation creators
C00078 Change Search Algorithms for Disinformation Content TA06 M002
Diversion
Includes “change image search algorithms for hate groups and extremists” and “Change search algorithms for hate and extremist quer…
C00080 Create competing narrative TA06 M002
Diversion
Create counternarratives, or narratives that compete in the same spaces as misinformation narratives. Could also be degrade
C00081 Highlight flooding and noise, and explain motivations TA06 M003
Daylight
Discredit by pointing out the "noise" and informing public that "flooding" is a technique of disinformation campaigns; point out i…
C00082 Ground truthing as automated response to pollution TA06 M010
Countermessaging
Also inoculation.
C00084 Modify disinformation narratives, and rebroadcast them TA06 M002
Diversion
Includes “poison pill recasting of message” and “steal their truths”. Many techniques involve promotion which could be manipulated…
C00085 Mute content TA06 M003
Daylight
Rate-limit disinformation content. Reduces its effects, whilst not running afoul of censorship concerns. Online archives of conten…
C00086 Distract from noise with addictive content TA06 M002
Diversion
Example: Interject addictive links or contents into discussions of disinformation materials and measure a "conversion rate" of use…
C00087 Make more noise than the disinformation TA06 M009
Dilution
C00090 Fake engagement system TA07 M002
Diversion
Create honeypots for misinformation creators to engage with, and reduce the resources they have available for misinformation campa…
C00091 Honeypot social community TA06 M002
Diversion
Set honeypots, e.g. communities, in networks likely to be used for disinformation.
C00092 Establish a truth teller reputation score for influencers TA02 M006
Scoring
Includes "Establish a truth teller reputation score for influencers” and “Reputation scores for social media users”. Influencers a…
C00093 Influencer code of conduct TA15 M001
Resilience
Establish tailored code of conduct for individuals with many followers. Can be platform code of conduct; can also be community cod…
C00094 Force full disclosure on corporate sponsor of research TA06 M003
Daylight
Accountability move: make sure research is published with its funding sources.
C00096 Strengthen institutions that are always truth tellers TA01 M006
Scoring
Increase credibility, visibility, and reach of positive influencers in the information space.
C00097 Require use of verified identities to contribute to poll or comment TA07 M004
Friction
Reduce poll flooding by only taking comments or poll entries from verified accounts.
C00098 Revocation of allowlisted or "verified" status TA07 M004
Friction
Remove blue checkmarks etc. from known misinformation accounts.
C00099 Strengthen verification methods TA07 M004
Friction
Improve content verification methods available to groups, individuals, etc.
C00100 Hashtag jacking TA08 M002
Diversion
Post large volumes of unrelated content on known misinformation hashtags
C00101 Create friction by rate-limiting engagement TA07 M004
Friction
Create participant friction. Includes Make repeat voting hard, and throttle number of forwards.
C00103 Create a bot that engages / distract trolls TA07 M002
Diversion
This is a reactive, not an active, measure (honeypots are active). It is a platform-controlled measure.
C00105 Buy more advertising than misinformation creators TA07 M009
Dilution
Shift influence and algorithms by posting more adverts into spaces than misinformation creators.
C00106 Click-bait centrist content TA06 M002
Diversion
Create emotive centrist content that gets more clicks
C00107 Content moderation TA06 M006
Scoring
Includes social media content take-downs, e.g. Facebook or Twitter content take-downs.
C00109 Dampen Emotional Reaction TA09 M001
Resilience
Reduce emotional responses to misinformation through calming messages, etc.
C00111 Reduce polarisation by connecting and presenting sympathetic renditions of opposite views TA01 M001
Resilience
C00112 "Prove they are not an op!" TA08 M004
Friction
Challenge misinformation creators to prove they're not an information operation.
C00113 Debunk and defuse a fake expert / credentials. TA08 M003
Daylight
Debunk fake experts, their credentials, and potentially also their audience quality