AMITT/amitt_blue_framework.md
Sara-Jayne Terp ea1e5c37ae DESIGN UPDATE: AMITT Red Framework
Added 3 new techniques to TA12: T0062 Behaviour changes, T0063 Message reach, T0064 social media engagement
2021-03-11 13:20:47 +00:00


# AMITT Blue framework: Latest Framework

Tactics:

- TA01 Strategic Planning
- TA02 Objective Planning
- TA03 Develop People
- TA04 Develop Networks
- TA05 Microtargeting
- TA06 Develop Content
- TA07 Channel Selection
- TA08 Pump Priming
- TA09 Exposure
- TA10 Go Physical
- TA11 Persistence
- TA12 Measure Effectiveness

Countermeasures:
- C00006 Charge for social media
- C00009 Educate high-profile influencers on best practices
- C00034 Create more friction at account creation
- C00047 Coordinated inauthentics
- C00065 Reduce political targeting
- C00014 Real-time updates to fact-checking database
- C00097 Require use of verified identities to contribute to poll or comment
- C00112 "Prove they are not an op!"
- C00089 Throttle number of forwards
- C00129 Use banking to cut off access
- C00131 Seize and analyse botnet servers
- C00090 Fake engagement system
- C00008 Create shared fact-checking database
- C00011 Media literacy. Games to identify fake news
- C00036 Infiltrate the in-group to discredit leaders (divide)
- C00052 Infiltrate platforms
- C00066 Co-opt a hashtag and drown it out (hijack it back)
- C00032 Hijack content and link to truth-based info
- C00098 Revocation of "verified"
- C00113 Debunk and defuse a fake expert / credentials. Attack audience quality of fake expert
- C00122 Content moderation. Censorship?
- C00130 Mentorship: elders, youth, credit. Learn vicariously.
- C00133 Deplatform Account*
- C00140 "Bomb" link shorteners with lots of calls
- C00010 Enhanced privacy regulation for social media
- C00028 Make information provenance available
- C00040 Third-party verification for people
- C00053 Delete old accounts / remove unused social media accounts
- C00216 Use advertiser controls to stem flow of funds to bad actors
- C00071 Block source of pollution
- C00099 Strengthen verification methods
- C00114 Don't engage with payloads
- C00123 Bot control
- C00135 Deplatform message groups and/or message boards
- C00147 Make amplification of social media posts expire (e.g. can't like/retweet after n days)
- C00012 Platform regulation
- C00029 Create fake website to issue counter-narrative, and counter narrative through physical merchandise
- C00042 Address truth contained in narratives
- C00056 Get off social media
- C00072 Content censorship in non-relevant domains, e.g. Pinterest antivax
- C00100 Hashtag jacking
- C00115 Expose actor and intentions
- C00124 Don't feed the trolls
- C00136 Microtarget most likely targets, then send them countermessages
- C00148 Add random links to network graphs
- C00013 Rating framework for news
- C00030 Develop a compelling counter-narrative (truth-based)
- C00044 Keep people from posting to social media immediately
- C00059 Verification of project before posting (counters funding campaigns)
- C00074 Identify identical content and mass deplatform
- C00101 Create participant friction
- C00116 Provide proof of involvement
- C00125 Prepare the population with pre-announcements
- C00137 Pollute the A/B-testing data feeds
- C00149 Poison the monitoring & evaluation data
- C00016 Censorship (not recommended)
- C00031 Dilute the core narrative: create multiple permutations, target / amplify
- C00046 Marginalise and discredit extremist groups
- C00062 Free open library sources worldwide
- C00075 Normalise language
- C00102 Make repeat voting harder
- C00117 Downgrade / de-amplify / label / promote counter to disinformation
- C00126 Social media amber alert
- C00138 Spam domestic actors with lawsuits
- C00017 Repair broken social connections
- C00060 Legal action against for-profit engagement factories
- C00048 Name and shame influencers
- C00162 Collect data / map constellations of Russian "civil society"; unravel / target the Potemkin villages
- C00076 Prohibit images in political discourse channels
- C00103 Create a bot that engages / distracts trolls
- C00118 Repurpose images with new text
- C00128 Create friction by marking content with ridicule or other "decelerants"
- C00139 Weaponise YouTube content matrices
- C00019 Reduce effect of division-enablers
- C00070 Block access to disinformation resources
- C00051 Counter social engineering training
- C00078 Change search algorithms for disinformation content
- C00105 Buy more advertising than the adversary to shift influence and algorithms
- C00119 Engage payload and debunk; provide link to facts
- C00151 "Fight in the light"
- C00143 (Botnet) DMCA takedown requests to waste group time
- C00021 Encourage in-person communication
- C00092 Reputation scores for social media influencers
- C00058 Report crowdfunder as violator
- C00080 Create competing narrative
- C00106 Click-bait centrist content
- C00120 Open dialogue about design of platforms to produce different outcomes
- C00156 Better tell the U.S., NATO, and EU story
- C00144 Buy out troll farm employees / offer them jobs
- C00022 Inoculate: positive campaign to promote feeling of safety
- C00164 Compatriot policy
- C00067 Denigrate the recipient / project (of online funding)
- C00081 Highlight flooding and noise, and explain motivations
- C00107 Content moderation
- C00121 Tool transparency and literacy for channels people follow
- C00158 Use training to build the resilience of at-risk populations
- C00145 Pollute the data voids with wholesome content (Kittens! Babyshark!)
- C00024 Promote healthy narratives
- C00207 Run a competing disinformation campaign (not recommended)
- C00077 Active defence: run TA03 "Develop People" (not recommended)
- C00082 Ground truthing as automated response to pollution
- C00109 De-escalation
- C00154 Ask media not to report false information
- C00169 Develop a creative content hub
- C00026 Shore up democracy-based messages
- C00222 Tabletop simulations
- C00093 Influencer code of conduct
- C00084 Modify disinformation narratives, and rebroadcast them
- C00110 Monetize centrist SEO by subsidizing the difference in greater clicks towards extremist content
- C00188 Newsroom / journalist training to counter SEO influence
- C00178 Fill information voids with non-disinformation content
- C00027 Create culture of civility
- C00155 Ban incident actors from funding sites
- C00085 Mute content
- C00111 Present sympathetic views of opposite side
- C00193 Promotion of a "higher standard of journalism"
- C00182 Malware detection / quarantine / deletion
- C00073 Inoculate populations through media literacy training
- C00160 Find and train influencers
- C00086 Distract from noise with addictive content
- C00195 Redirect Method
- C00203 Stop offering press credentials to propaganda outlets
- C00184 Media exposure
- C00096 Strengthen institutions that are always truth-tellers
- C00189 Ensure that platforms are taking down flagged accounts
- C00087 Make more noise than the disinformation
- C00196 Include the role of social media in the regulatory framework for media
- C00204 Strengthen local media
- C00190 Open engagement with civil society
- C00153 Take pre-emptive action against actors' infrastructure
- C00197 Remove suspicious accounts
- C00091 Honeypot social community
- C00214 Create policy that makes social media police disinformation
- C00194 Provide an alternative to Russian information by expanding and improving local content
- C00159 Have a disinformation response plan
- C00094 Force full disclosure on corporate sponsor of research
- C00215 Use fraud legislation to clean up social media
- C00200 Respected figure (influencer) disavows misinfo
- C00161 Coalition building and third-party inducements
- C00142 Platform adds warning label and decision point when sharing content
- C00217 Registries alert when large batches of newsy URLs get registered together
- C00211 Use humorous counter-narratives
- C00170 Elevate information as a critical domain of statecraft
- C00165 Limit access to alterable documents
- C00212 Build public resilience by making civil society more vibrant
- C00174 Create a healthier news environment
- C00167 Deploy Information and Narrative-Building in Service of Statecraft
- C00218 Censorship
- C00176 Improve coordination amongst stakeholders: public and private
- C00171 Social media content take-downs
- C00205 Strong dialogue between the federal government and private sector to encourage better reporting
- C00172 Social media page removal
- C00220 Develop a monitoring and intelligence plan
- C00202 Set data 'honeytraps'
- C00221 Run a disinformation red team, and design mitigation factors
- C00219 Add metadata to content that's out of the control of disinformation creators
- C00223 Strengthen trust in social media platforms
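Some of the counters above are concrete enough to prototype. A minimal sketch of the idea behind C00074 ("Identify identical content and mass deplatform") might normalise post text, hash it, and flag hashes shared by many distinct accounts. The function names, normalisation rules, and the `min_accounts` threshold are all illustrative assumptions, not part of the framework:

```python
# Hypothetical sketch of C00074: detect identical content pushed by many accounts.
import hashlib
import re
from collections import defaultdict

def normalise(text: str) -> str:
    """Lowercase, strip URLs, and collapse whitespace so trivial edits still match."""
    text = re.sub(r"https?://\S+", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def flag_identical_content(posts, min_accounts=5):
    """posts: iterable of (account_id, text) pairs.

    Returns a dict mapping content hashes to the set of accounts that posted
    them, keeping only hashes seen from at least `min_accounts` distinct
    accounts (the threshold is an illustrative placeholder).
    """
    accounts_by_hash = defaultdict(set)
    for account, text in posts:
        digest = hashlib.sha256(normalise(text).encode()).hexdigest()
        accounts_by_hash[digest].add(account)
    return {h: accts for h, accts in accounts_by_hash.items()
            if len(accts) >= min_accounts}
```

In practice a real system would use fuzzy matching (e.g. locality-sensitive hashing) rather than exact hashes, since coordinated campaigns often vary the text slightly.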
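Similarly, C00217 ("Registries alert when large batches of newsy URLs get registered together") suggests a simple batch-detection heuristic: bucket new domain registrations by day and by a news-style keyword, and alert when a bucket exceeds a threshold. The keyword list, threshold, and function shape below are assumptions for illustration only:

```python
# Hypothetical sketch of C00217: alert on batches of news-styled domain
# registrations landing on the same day.
from collections import defaultdict
from datetime import date

# Illustrative keyword list; a real registry would use a richer classifier.
NEWSY_KEYWORDS = ("news", "daily", "herald", "times", "report")

def batch_registration_alerts(registrations, threshold=3):
    """registrations: iterable of (domain, registration_date) pairs.

    Returns (day, keyword) buckets whose registration count meets
    `threshold` (the value is an illustrative placeholder).
    """
    buckets = defaultdict(list)
    for domain, day in registrations:
        for keyword in NEWSY_KEYWORDS:
            if keyword in domain:
                buckets[(day, keyword)].append(domain)
    return {key: domains for key, domains in buckets.items()
            if len(domains) >= threshold}
```

A production version would consume registry zone-file feeds and also compare WHOIS registrant details, since batch-registered disinformation domains often share registrars or name servers.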