You posted it. You can see it. But nobody else can. Welcome to Reddit’s invisible moderation layer.
- What Researchers Found When They Looked Under the Hood
- The Three Layers of Invisible Removal
- A Problem Bigger Than One Platform
- Why Reddit Built It This Way
- Who Gets Hit Hardest
- The Karma Catch-22
- How to Tell If Your Post Was Silently Removed
- The Notification Gap
- What Actually Triggers Silent Removal
- What You Can Actually Do About It
- The Uncomfortable Design Choice
- Sources
There is a subreddit called r/CantSayAnything that exists for one purpose: to demonstrate how Reddit’s content removal actually works. Go there, post anything you want, and watch what happens.
Your post will be removed instantly. You will not receive any notification. And here is the part that makes people uncomfortable: when you look at your post while logged in, it will appear completely normal. Your title, your text, everything right where you left it. But log out, open an incognito window, and try to find that same post. It will not appear in the subreddit’s feed, and at the direct link the body text will show as [removed]. Nobody saw it. Nobody ever could.
This is not a bug. This is how Reddit is designed to work. The Reveddit FAQ, a site dedicated to revealing removed Reddit content, puts it bluntly: “It will be removed, you will not receive a message, and it will appear to you as if it is not removed while you are logged in.”

Millions of Reddit users have experienced this without ever knowing. They post something, check back later, see zero engagement, and assume nobody cared. The thought that their content was silently removed before anyone could see it often never crosses their mind.
What Researchers Found When They Looked Under the Hood
The gap between what Reddit claims and what actually happens has drawn attention from academic researchers. A 2020 study from the University of Washington analyzed over half a million content removals across 204 subreddits and found something troubling: “users are neither notified of these sanctions, nor are these practices formally stated in any of the subreddits’ rules.”
The researchers examined whether Reddit communities were following the Santa Clara Principles, a set of transparency guidelines for content moderation that Reddit itself publicly endorsed in 2018. The principles call for platforms to “provide notice to each user whose content is taken down” and explain the reason for the removal.
The study’s conclusion: Reddit communities largely do not follow these guidelines.

When the researchers interviewed moderators about why transparency was lacking, some were candid. One explained that users “who are going to violate the rules, they’re going to violate the rules, no matter what.”
A separate 2024 study from the University of Michigan examined over 600 million comments and found that “most removals occur silently and without notifying the user, which undermines trust and makes it difficult for users to challenge wrongful removals.” The researchers recommended that platforms “increase the transparency of content removals by notifying users when their content is removed.”
The Three Layers of Invisible Removal
Understanding why your post disappeared requires understanding that Reddit has multiple overlapping systems for removing content, and most of them operate in silence.

The first layer is Reddit’s sitewide spam filter. According to Reddit’s official help documentation, new accounts, accounts with low karma, and accounts posting in subreddits “especially sensitive to spammers” are most likely to trigger this filter. What Reddit does not mention is that the spam filter often does not tell you it removed your post. As Reveddit documents, if the spam filter removes content and it is less than 24 hours old, Reddit will not display any removal notice.
The second layer is AutoModerator. AutoMod is a bot that subreddit moderators can program with custom rules. According to Reddit’s moderator documentation, AutoMod can “remove or flair posts by domain or keyword” and “identify potential spammers or low-quality contributors” based on account age or karma thresholds.
The key detail: moderators can choose whether or not to notify users when AutoMod removes their content. Many choose not to.
The third layer is manual moderator action. Human moderators can remove posts and comments at their discretion. According to Reddit’s Mod Mode documentation, when moderators remove content, “other redditors who visit the content via a direct link will just see [Removed by moderator].” But for the author? The content remains visible when they are logged in.
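To make the layering concrete, here is a hypothetical Python sketch of the kind of decision an AutoMod rule encodes. Real AutoMod rules are written as YAML configuration, and every threshold, name, and helper below is invented for illustration; the detail to notice is the optional notification step at the end.

```python
# Hypothetical sketch of the decision an AutoMod rule encodes. Real AutoMod
# rules are YAML configuration, not Python; the thresholds, names, and the
# send_removal_message helper here are invented purely for illustration.
from dataclasses import dataclass

@dataclass
class Submission:
    author_karma: int
    author_account_age_days: int
    domains: list[str]

MIN_KARMA = 50              # illustrative threshold; real values vary by subreddit
MIN_ACCOUNT_AGE_DAYS = 7    # and are usually undisclosed
BLOCKED_DOMAINS = {"example-spam-site.com"}

def send_removal_message(reason: str) -> None:
    # Stand-in for the optional removal message AutoMod can send.
    print(f"Your post was removed: {reason}")

def automod_decision(post: Submission, notify_author: bool = False) -> str:
    """Return 'approve' or 'remove'. Notifying the author is a separate, optional step."""
    if post.author_karma < MIN_KARMA:
        reason = "karma below threshold"
    elif post.author_account_age_days < MIN_ACCOUNT_AGE_DAYS:
        reason = "account too new"
    elif BLOCKED_DOMAINS.intersection(post.domains):
        reason = "blocked domain"
    else:
        return "approve"
    if notify_author:                 # Many subreddits leave this off,
        send_removal_message(reason)  # so the removal happens in silence.
    return "remove"

# A brand-new account posting a plain text post: removed, author never told.
print(automod_decision(Submission(author_karma=3, author_account_age_days=1, domains=[])))
```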
A Problem Bigger Than One Platform
Tarleton Gillespie, a principal researcher at Microsoft Research and author of Custodians of the Internet, has written extensively about visibility reduction as a moderation tool.
In a 2022 paper for Yale Law School, he described the fundamental problem: “The content remains, it can be found, commented on, and forwarded, yet it seems not to have the audience or traction that it might have. This uncertainty leaves users grasping for explanations, and is part of why so many users are suspicious that murky machinations are at work under the hood of these platforms.”
The Washington Post has covered shadowbanning across social media platforms, noting that “companies are tight-lipped about what they hide on our feeds.” The article describes how users often cannot determine whether their content is being suppressed or simply not finding an audience.
A 2024 study published in Business & Information Systems Engineering found that approximately 10% of U.S. social media users report being shadowbanned across major platforms, including Reddit. The researchers noted that “shadowbanning prevents users from correcting or disputing content moderation decisions” and that users are “left to speculate about whether they have been shadowbanned.”

The psychological impact is also real. A 2025 paper in Frontiers in Psychology examined the mental health effects of silent content suppression, finding that it “can induce depression symptoms, anxiety, and compulsive checking of content behaviors.” The researchers described how “the shock invisibility disrupts emotional regulatory protocols” when users are “systematically excluded from the social world.”
Why Reddit Built It This Way
The logic behind silent removal makes sense if you think about it from a spam-fighting perspective. If spammers knew immediately when their content was removed, they could adapt. They could test which posts got through and which did not. They could iterate faster than moderators could keep up.
Reddit’s original solution to this problem was the shadowban. A shadowbanned user could post and comment endlessly without ever knowing that nobody could see any of it. The account looked normal to them. But to everyone else, that user simply did not exist.
In November 2015, Reddit announced it was moving away from shadowbans for human users. As TechCrunch reported at the time, Reddit admitted that shadowbans were “great for dealing with bots/spam rings, but woefully inadequate for real human beings.” The new system would use transparent account suspensions instead.
But here is the thing: Reddit said suspensions would replace shadowbans “for the vast majority of real humans.” Not all. And they never said anything about individual post and comment removals. Those remained silent by default. A decade later, the fundamental problem persists.
Who Gets Hit Hardest
Research suggests that silent content moderation does not affect all users equally. A 2024 study from the University of Michigan found that marginalized groups, including Black and transgender users, report being disproportionately affected by shadowbanning across social media platforms.
“The platforms can help these marginalized groups by improving their communication related to shadowbanning, especially about why certain categories of content are suppressed, and by validating users’ experiences instead of denying that they suppress content,” said researcher Samuel Mayworm.
The study found that users who believed they had been shadowbanned “felt frustrated, had fewer social media engagements, and held negative perceptions of platforms.” Many engaged in what the researchers called “collaborative algorithm investigation,” testing each other’s suspicions about being shadowbanned and reporting findings to one another.
The Karma Catch-22
One of the most common triggers for silent removal is not having enough karma. Karma is Reddit’s reputation system, earned through upvotes on your posts and comments. The problem is that many subreddits require a minimum amount of karma to post, but they do not tell you what that minimum is.

According to Reddit’s official explanation of karma, “If you’re new to Reddit and posting to a community for the first time, you might run into some issues, such as your post not showing up. This could be due to a number of reasons, one reason being that some communities require a certain amount of karma before allowing you to post there.”
That acknowledgment is buried in a help article. It does not appear when your post gets silently removed. You do not get a message saying “you need 50 karma to post here.” You just post, see your content, and wonder why nobody responds.
The trap is circular. You need karma to post. You get karma when your posts are upvoted. But your posts are being removed before anyone can upvote them. New users find themselves in a Catch-22 where the only path forward is to figure out which subreddits have lower thresholds, build karma there, and then graduate to the communities they actually wanted to participate in.
Account age works the same way. Many subreddits require accounts to be 7 days, 30 days, or even older before allowing posts. Again, these requirements are often undisclosed. The Reddit moderator documentation actually encourages moderators to “keep your requirements as low as you can so that your community is welcoming to new redditors who are participating in good faith.” But it is just a suggestion, not a requirement.
How to Tell If Your Post Was Silently Removed
The simplest method is the incognito test. Copy the URL of your post, open a private browsing window, and try to view it. If you can see the full content while logged in but only see [removed] when logged out, your post was removed, and you were never told.
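The same check can be scripted. Reddit serves a public JSON view of any post if you append .json to its URL, so a rough sketch like the one below sees roughly what an incognito visitor sees. The URL and script name are placeholders, and the field checks reflect how removed text posts typically render to logged-out viewers; treat it as a convenience, not an official client.

```python
# A minimal sketch of the logged-out check, assuming a text post. Append
# ".json" to a post's URL and Reddit returns the public view of it; removed
# text posts typically show "[removed]" as their body to logged-out visitors.
import requests

def looks_removed(post_url: str) -> bool:
    resp = requests.get(
        post_url.rstrip("/") + ".json",
        headers={"User-Agent": "removal-check-sketch/0.1"},  # Reddit tends to block default client user agents
        timeout=10,
    )
    resp.raise_for_status()
    post = resp.json()[0]["data"]["children"][0]["data"]
    # A "[removed]" body is the clearest signal; some responses also carry a
    # removed_by_category field naming who removed it.
    return post.get("selftext") == "[removed]" or post.get("removed_by_category") is not None

# Hypothetical URL; substitute a post of your own.
print(looks_removed("https://www.reddit.com/r/CantSayAnything/comments/abc123/test/"))
```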

For a more comprehensive check, Reveddit.com lets you enter your username and see all of your removed content across Reddit. One user on r/technology captured the experience: “What’s stunning is you get no notification, and to you the comment still looks up. Which means mods can set whatever narrative they want without answering to anyone.”
Another user on r/InternetIsBeautiful had a similar reaction: “Today I learned that the reason no one ever replies to my posts is that they all get removed.”
For checking whether your entire account has been shadowbanned at the site level, the subreddit r/ShadowBan has a bot that will analyze your account. Third-party tools like BanChecker.org can also check your account status without requiring you to log in.
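The account-level check can be scripted as well. A sitewide-shadowbanned account’s profile returns a 404 to logged-out visitors, so a quick anonymous request is a reasonable first signal; a deleted account also returns 404, so treat the result as a hint rather than proof. The script name below is a placeholder.

```python
# Rough sketch of the account-level check: fetch the profile's public JSON
# endpoint without logging in. A 404 for an account you can still log into
# is a strong hint of a sitewide shadowban (deleted accounts also 404).
import requests

def profile_publicly_visible(username: str) -> bool:
    resp = requests.get(
        f"https://www.reddit.com/user/{username}/about.json",
        headers={"User-Agent": "shadowban-check-sketch/0.1"},
        timeout=10,
    )
    return resp.status_code == 200

print(profile_publicly_visible("spez"))  # True for a visible account
```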
The Notification Gap
One of the strangest details in all of this is that Reddit does have the infrastructure to notify users when content is removed. Some subreddits use it. AutoMod can be configured to send a message explaining why a post was taken down. But it is entirely optional.
The Reddit moderator help documentation suggests that moderators “consider adding a removal reason to help educate your community and provide transparency on rule enforcement.” Consider. Not required. Transparency is framed as a nice-to-have, not a baseline expectation.
Meanwhile, if you want to know when replies to your comments are removed, Reddit offers an indirect solution. According to Reveddit, if you go to reddit.com/settings/emails and turn on notifications for “Replies to your comments” and “Comments on your posts,” Reddit will email you the content of every reply, including ones that get auto-removed. The removal happens, the reply vanishes from Reddit, but the email still contains what was written. It is like a receipt for content that no longer exists.
This workaround itself is telling. Reddit has the data. Reddit sends the notification. But the default is silence.
What Actually Triggers Silent Removal
Based on Reddit’s documentation and user reports, the most common triggers for silent removal include:
- New accounts. Accounts less than a few days or weeks old face heightened scrutiny from both Reddit’s spam filter and subreddit AutoMod rules. Many subreddits will not let you post at all until your account reaches a certain age.
- Low karma. Common thresholds mentioned in various subreddit rules range from 10 karma to 500 or more for larger communities. Since many subreddits do not publish their requirements, users often have to guess.
- Unverified email. Accounts without a verified email address may be filtered more aggressively by Reddit’s spam systems.
- Posting too frequently. Reddit rate-limits new accounts to roughly one post every 10 to 15 minutes. Attempting to post faster can trigger the spam filter.
- Specific keywords or links. Many subreddits have AutoMod rules that remove posts containing certain words, phrases, or domains. Some domains are soft-banned across much of Reddit because they were historically used for spam.
- Cross-posting. Posting the same content to multiple subreddits in a short time window can flag your account for spam-like behavior.
- VPN usage. Multiple reports suggest that creating an account while using a VPN can flag it immediately, as many spam operations route through VPN services.
What You Can Actually Do About It
The options for users caught in silent removal are limited, but they exist.
- Check before assuming. After posting, open an incognito window and verify your content is actually visible. Do this every time until you have established karma and account age in a community. It takes thirty seconds and saves hours of wondering why nobody is engaging.
- Build karma in welcoming communities. Subreddits like r/AskReddit, r/CasualConversation, and hobby-focused communities with smaller followings often have lower or no karma requirements. Comment genuinely there. Build some history. Then move to more restrictive communities.
- Verify your email. This one is simple and removes one potential trigger for spam filtering.
- Read the subreddit rules carefully. Some subreddits do publish their karma and age requirements in their rules or wiki. Others have them pinned to posts. Taking five minutes to look can prevent a removal that would otherwise be invisible.
- Message the moderators. If you believe your post was removed unfairly, send a polite message to the subreddit’s moderators asking if they can review it. Some will. Some will not respond. But it is the only official channel for appeals at the subreddit level.
- Use Reveddit proactively. Install the Reveddit Real-Time extension, and it will notify you whenever your content is removed. Knowing is the first step toward doing something about it.
The Uncomfortable Design Choice
Reddit is not unique in using silent moderation. Platforms across the internet employ similar techniques. The argument is always the same: transparency benefits bad actors. If spammers know exactly when and why they are being blocked, they can work around it faster than the blocks can be updated.

But the collateral damage falls on regular users. People who genuinely want to participate find themselves talking to empty rooms.
One r/mildlyinfuriating commenter, after discovering how much of their content had been silently removed, wrote: “I’ve been on here 12 years and always thought it was a cool place where people could openly share ideas. Turns out it’s more censored than China.”
That comparison is dramatic, but the frustration is real. The design choice to make removed content appear normal to its author is not neutral. It creates an illusion of participation where none exists. It lets people invest time writing posts that will never be read.
In 2018, Reddit publicly endorsed the Santa Clara Principles, which call for transparency in content moderation decisions. In 2020, researchers found that Reddit’s communities were largely not following those principles. As of 2024, researchers were still documenting the same gap between what Reddit claims and what actually happens to users.
The gap between what Reddit shows you and what Reddit shows everyone else is a design choice. Whether it is the right one depends on who you ask. But the fact that so many users do not even know the gap exists is, at minimum, worth knowing about.
At least now you know to check.
Sources
- “Through the Looking Glass: Study of Transparency in Reddit’s Moderation Practices” – University of Washington (2020) https://faculty.washington.edu/tmitra/public/papers/group2020_Reddit_Transparency.pdf
- “Political Bias in Content Moderation” – University of Michigan Ross School of Business (2024) https://michiganross.umich.edu/news/new-study-reddit-explores-how-political-bias-content-moderation-feeds-echo-chambers
- “Shadowbanning: An Opaque Form of Content Moderation” – Business & Information Systems Engineering (2024) https://link.springer.com/article/10.1007/s12599-024-00905-3
- “Digital Silence: The Psychological Impact of Being Shadow Banned” – Frontiers in Psychology (2025) https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2025.1659272/full
- “How Marginalized Social Media Users Perceive Shadowbanning” – University of Michigan (2024) https://record.umich.edu/articles/study-looks-at-shadowbanning-of-marginalized-social-media-users/
- Tarleton Gillespie, “Do Not Recommend? Reduction as a Form of Content Moderation” – Yale Law School (2022) https://law.yale.edu/sites/default/files/area/center/isp/documents/reduction_ispessayseries_jul2022.pdf
- Tarleton Gillespie, “Custodians of the Internet” – Yale University Press (2018) https://yalebooks.yale.edu/book/9780300261431/custodians-of-the-internet/
- Washington Post: “What is shadowbanning? Why social media may be hiding your posts” (October 2024) https://www.washingtonpost.com/technology/2024/10/16/shadowban-social-media-algorithms-twitter-tiktok/
- TechCrunch: “Reddit Replaces Its Confusing Shadowban System With Account Suspensions” (November 2015) https://techcrunch.com/2015/11/11/reddit-account-suspensions/
- Reynolds Journalism Institute: “Recognizing and Responding to Shadow Bans” (September 2024) https://rjionline.org/news/recognizing-and-responding-to-shadow-bans/
- Santa Clara Principles on Transparency and Accountability in Content Moderation (2018, updated 2021) https://santaclaraprinciples.org/
- Reddit Help: AutoModerator https://support.reddithelp.com/hc/en-us/articles/15484574206484-Automoderator
- Reddit Help: Mod Mode https://support.reddithelp.com/hc/en-us/articles/15484365010196-Mod-Mode
- Reddit Help: Why can’t I see my post https://support.reddithelp.com/hc/en-us/articles/360045989712-Why-can-t-I-see-my-post
- Reddit Help: What is karma https://support.reddithelp.com/hc/en-us/articles/204511829-What-is-karma
- Reveddit.com – Content removal tracking https://www.reveddit.com/
- Reveddit FAQ https://www.reveddit.com/about/faq/
- r/ShadowBan – Shadowban testing subreddit https://www.reddit.com/r/ShadowBan
- BanChecker.org – Account status checker https://banchecker.org/
