Social Media Platform Interventions Taxonomy

A comprehensive classification system for understanding and analyzing various intervention mechanisms employed by social media platforms to moderate content, behavior, and user experiences.

Interventions (32)

Private Profiles
A setting that allows account owners to restrict access to their content.
Focus: Virality
Driver: User
User Journey: Proactive
Scope: Systemic
Platform: Twitter
Considerations:

Offers control but contributes to filter bubbles

Tradeoffs:

Isolation vs. control

Slowed Virality
Friction mechanisms, such as confirmation screens, that slow unwitting sharing.
Focus: Virality
Driver: Platform
User Journey: Proactive
Scope: Systemic
Platform: Twitter, Facebook
Considerations:

Friction can annoy but also promote deliberation

Tradeoffs:

Friction vs. deliberation

Removal
The complete deletion of unwanted content that violates platform or regional consumer safety policies.
Focus: Virality
Driver: Platform
User Journey: Retroactive
Scope: Targeted
Platform: Facebook, Twitter, Reddit
Considerations:

Promotes safety but risks perceived over-censorship

Tradeoffs:

Censorship vs. safety

Suspensions
Platforms temporarily ban consumers who violate their terms and conditions.
Focus: Appearance
Driver: Platform
User Journey: Retroactive
Scope: Systemic
Platform: YouTube, Twitter, Instagram
Considerations:

Increases safety but risks perceived censorship

Tradeoffs:

Safety vs. censorship

Community Rules
Behavioral and ethical standards set and maintained by consumers to define guidelines for discourse.
Focus: Behavior
Driver: User
User Journey: Proactive
Scope: Systemic
Platform: Reddit, Discord
Considerations:

Enables autonomy but risks fragmenting shared spaces

Tradeoffs:

Fragmentation vs. autonomy

Crowd Reporting
Buttons that enable consumers to manually flag and report potentially violating content for the platform to review.
Focus: Behavior
Driver: User
User Journey: Retroactive
Scope: Targeted
Platform: YouTube
Considerations:

Catches violations but enables harassment campaigns

Tradeoffs:

Harassment vs. policing

Community Notes
Notes or comments that users post under content to offer contextual information about the post's integrity or message.
Focus: Behavior
Driver: User
User Journey: Retroactive
Scope: Targeted
Platform: Reddit
Considerations:

Harnesses collective knowledge but risks coordinated misuse and manipulation

Tradeoffs:

Crowd wisdom vs. crowdsourcing harassment

Demonetization
Platforms remove the potential for consumers to receive compensation for the content they upload.
Focus: Behavior
Driver: Platform
User Journey: Retroactive
Scope: Targeted
Platform: YouTube, Twitter, Facebook, Instagram
Considerations:

Deters violations but impacts content creator revenue

Tradeoffs:

Revenue loss vs. deterrence

Identity Verification
Features that require consumers to submit identifying documents to create an account and participate in an online community.
Focus: Behavior
Driver: Platform
User Journey: Proactive
Scope: Systemic
Considerations:

Adds accountability but creates a barrier to access

Tradeoffs:

Barrier to entry vs. accountability

Nudging
Automated messages, pop-up text, and other design features that aim to help consumers make healthy, informed choices that benefit themselves and other consumers.
Focus: Behavior
Driver: Platform
User Journey: Proactive
Scope: Systemic
Platform: Twitter
Considerations:

Can guide choices but risks manipulative paternalism

Tradeoffs:

Manipulation vs. guidance

Prompts
Automated messages shown to consumers cautioning against toxic content or unwanted behaviors.
Focus: Behavior
Driver: Platform
User Journey: Proactive
Scope: Systemic
Platform: Facebook
Considerations:

Can encourage reflection but risks annoying users

Tradeoffs:

Annoyance vs. self-reflection

Public Service Announcements (PSAs)
Educational messages or pop-up text that point readers to trusted sources, such as fact-checks, government safety announcements, or underlying research, to combat misinformation, disinformation, or any content that would benefit from additional footnotes.
Focus: Appearance
Driver: Platform
User Journey: Proactive
Scope: Systemic
Platform: YouTube
Considerations:

Raises awareness but risks message fatigue

Tradeoffs:

Awareness vs. message fatigue

Labeling
Short disclaimers attached to questionable or unwanted content to help users make healthy choices, including but not limited to fact-checking labels, warning labels, dispute labels, and AI labels.
Focus: Appearance
Driver: Platform
User Journey: Retroactive
Scope: Targeted
Platform: Facebook, Twitter, Reddit
Considerations:

Promotes verified information but risks being overly prescriptive about consumer behavior

Tradeoffs:

Trusted sources vs. over-prescription

Sponsor Labels
Labels paired with content containing paid promotion by an individual consumer or a third-party company working with the consumer.
Focus: Appearance
Driver: Platform
User Journey: Retroactive
Scope: Systemic
Platform: Twitter, Facebook
Considerations:

Increases transparency but impacts sponsored content visibility

Tradeoffs:

Visibility vs. revenue loss

Prebunks
Purposely exposing consumers to anticipated misinformation narratives and techniques so they are equipped with the knowledge needed to spot future misinformation and disinformation content.
Focus: Behavior
Driver: Platform
User Journey: Proactive
Scope: Systemic
Platform: Facebook
Considerations:

Inoculates against misinformation but paternalistically assumes susceptibility

Tradeoffs:

Paternalism vs. inoculation

Reverse Chronological
Feed rankings designed to show users the most recent posts rather than algorithmically sorted content aimed at driving high engagement.
Focus: Appearance
Driver: User
User Journey: Retroactive
Scope: Systemic
Platform: Twitter
Considerations:

Shows most recent but hides potentially relevant content

Tradeoffs:

Recency vs. relevance

Community Moderators
User-driven enforcement of rules, such as banning, inviting, and suspending members of digital communities on platforms.
Focus: Behavior
Driver: User
User Journey: Retroactive
Scope: Targeted
Platform: Reddit
Considerations:

Leverages localized judgment but risks bias

Tradeoffs:

Bias vs. localized judgment

Restricted Recommendations
Harmful content detected through user reports or AI can be limited from users' feeds or inboxes.
Focus: Virality
Driver: Platform
User Journey: Retroactive
Scope: Systemic
Platform: YouTube, Facebook
Considerations:

Allows oversight but risks opaque censorship

Tradeoffs:

Oversight vs. opacity

Muting
A feature that enables users to limit the content they see in their feeds or inboxes.
Focus: Appearance
Driver: User
User Journey: Retroactive
Scope: Systemic
Considerations:

Empowers consumers but risks a lack of accessibility

Tradeoffs:

TBD

Age-restricted Content
All shared content is filtered to show age-appropriate themes to users under 18 years old.
Focus: Virality
Driver: Platform
User Journey: Proactive
Scope: Systemic
Platform: YouTube
Considerations:

Risks inappropriate content being shared with minors if parents cannot enforce safety measures

Tradeoffs:

TBD

Forwarding Limits
Limitations on how many times messages can be forwarded and shared with other users.
Focus: Behavior
Driver: Platform
User Journey: Proactive
Scope: Systemic
Platform: WhatsApp, Facebook
Considerations:

Slows virality but risks inconveniencing good faith users

Tradeoffs:

Viral spread vs. inconvenience for good faith users

Downvoting
A tool that allows users to demote or vote against content they do not want to see, allowing crowd-sourced concerns to be amplified to online safety professionals.
Focus: Behavior
Driver: User
User Journey: Retroactive
Scope: Targeted
Platform: Facebook, YouTube, Reddit
Considerations:

Aggregates opinions but risks mob behavior

Tradeoffs:

Brigading vs. crowd wisdom

Downranking
Platforms adjust their algorithms to reduce the visibility or prominence of certain content or accounts.
Focus: Virality
Driver: Platform
User Journey: Retroactive
Scope: Targeted
Considerations:

Allows oversight but risks opaque censorship

Tradeoffs:

Oversight vs. opacity

Collapsed Comments
A design feature that displays comments, both wanted and unwanted, in accordions that organize commentary for users to sort through.
Focus: Appearance
Driver: User
User Journey: Retroactive
Scope: Systemic
Platform: Reddit
Considerations:

Reduces clutter but risks suppressing minority opinions

Tradeoffs:

Clutter vs. suppression

Filtered Comments
The editorial decision to auto-filter potentially offensive replies to a consumer's post or publication into a separate accordion, altering the appearance of the user-to-user interaction.
Focus: Appearance
Driver: Platform
User Journey: Retroactive
Scope: Targeted
Platform: Twitter
Considerations:

Mitigates abuse but risks incorrectly filtering benign replies

Tradeoffs:

False positives vs. safety

Hidden Comments
Comments that are made invisible or placed under a separate accordion based on both the platform's standards and the consumer's preferences expressed in their settings.
Focus: Appearance
Driver: User
User Journey: Retroactive
Scope: Targeted
Platform: Facebook, YouTube
Considerations:

Upholds decorum but enables covert censorship

Tradeoffs:

Censorship vs. decorum

Message Requests
A process that filters Direct Messages (DMs) from unknown senders and shows them to the consumer on a page separate from their standard inbox.
Focus: Appearance
Driver: Platform
User Journey: Proactive
Scope: Systemic
Platform: Instagram
Considerations:

Adds safety friction but annoys legitimate users

Tradeoffs:

Friction vs. safety

Community Bans
A system established by community moderators to prohibit consumers from forming communities around certain topics.
Focus: Virality
Driver: User
User Journey: Proactive
Scope: Targeted
Platform: Reddit
Considerations:

Limits harm but may be perceived as censorship

Tradeoffs:

Censorship vs. safety

Keyword Flagging
An automated system where unwanted words or phrases get flagged for content removal, labeling, or suspension.
Focus: Virality
Driver: Platform
User Journey: Retroactive
Scope: Systemic
Platform: Facebook
Considerations:

Efficient but prone to over-pruning or editing content

Tradeoffs:

Overblocking vs. automation

Blurred Content
A visual filter that alters and obscures content including images, text, or video.
Focus: Appearance
Driver: Platform
User Journey: Retroactive
Scope: Targeted
Platform: Twitter
Considerations:

Balances context and sensitivity but can be opaque

Tradeoffs:

Context vs. sensitivity

Content Filters
A feature that allows users to customize the content they are exposed to through settings for blocking or limiting keywords, phrases, other users, accounts, or channels.
Focus: Behavior
Driver: User
User Journey: Proactive
Scope: Systemic
Platform: Twitter
Considerations:

Gives control but risks creating echo chambers

Tradeoffs:

Echo chambers vs. control

Parental Controls
Settings that allow parents or guardians to restrict or monitor the content a minor can view and interact with.
Focus: Appearance
Driver: User
User Journey: Retroactive
Scope: Systemic

Intervention Contribution System

Contribute new interventions to our research database. Download the template, fill it out, and upload your contributions.

Step 1: Download Template

Download the CSV template with the correct format and an example intervention.

Step 2: Add Interventions

Either upload a filled CSV template or use the form below to add interventions one by one.

Option A: Upload CSV File

— OR —

Option B: Add Single Intervention

Instructions

📋 For CSV Upload Option:

  1. Download the template CSV file using the button above
  2. Delete the example intervention row (keep only the headers)
  3. Add your own intervention data to the file
  4. Save the file and upload it back here

📝 Field Requirements:

Required Fields: Intervention Type, Description, Focus, Driver, User Journey, Scope

Optional Fields: Link (URL to platform intervention)

📋 Valid Options for Dropdown Fields:

Focus: Behavioral, Content, Visibility

Driver: Platform-Driven, User-Driven

User Journey: Proactive, Retroactive

Scope: Systemic, Targeted
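
Example: Preparing the CSV Programmatically

For contributors who prefer to generate the file with a script, the sketch below is a minimal, illustrative Python example. The column names and dropdown values come from the requirements above; the output filename interventions.csv, the validate helper, and the example row values are assumptions for illustration only, not part of the contribution system itself.

import csv

# Columns from the "Field Requirements" section above; "Link" is the optional field.
FIELDNAMES = [
    "Intervention Type", "Description", "Focus",
    "Driver", "User Journey", "Scope", "Link",
]

# Allowed values from "Valid Options for Dropdown Fields".
VALID_OPTIONS = {
    "Focus": {"Behavioral", "Content", "Visibility"},
    "Driver": {"Platform-Driven", "User-Driven"},
    "User Journey": {"Proactive", "Retroactive"},
    "Scope": {"Systemic", "Targeted"},
}

def validate(row):
    """Return a list of problems found in a single intervention row."""
    problems = []
    for field in FIELDNAMES[:-1]:  # every field except the optional Link
        if not row.get(field, "").strip():
            problems.append(f"missing required field: {field}")
    for field, allowed in VALID_OPTIONS.items():
        value = row.get(field, "").strip()
        if value and value not in allowed:
            problems.append(f"{field} must be one of {sorted(allowed)}, got {value!r}")
    return problems

# Hypothetical example row -- replace with your own intervention before uploading.
example = {
    "Intervention Type": "Slowed Virality",
    "Description": "Friction such as confirmation screens to slow unwitting sharing.",
    "Focus": "Behavioral",
    "Driver": "Platform-Driven",
    "User Journey": "Proactive",
    "Scope": "Systemic",
    "Link": "",
}

assert not validate(example), validate(example)

# Write the CSV in the template layout: a header row followed by intervention rows.
with open("interventions.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
    writer.writeheader()
    writer.writerow(example)

The resulting file mirrors the downloadable template (headers plus one row per intervention) and can be uploaded through Option A above.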