Blocking users

Introducing controls for who can interact with you in the fediverse

Welcome back, intergalactic explorers. Pull up a chair and join us on a Monday morning social web detour. Your to-do list can wait. You've got the entire week still ahead of you!

Last week, we introduced brand new preferences for ActivityPub, the ability to edit your social web profile, and dedicated sharing settings for Threads and Bluesky. Each week, Ghost publications in the fediverse become a little more unique. It's lovely to see!

What's new with ActivityPub?

This week, we shipped our first set of moderation controls: the ability to block users from interacting with you, if you don't want them to.

If you've spent any amount of time in the fediverse over the past six months, there's a good chance you've come across Nicole in your mentions.

But you can call her the Fediverse Chick.

Nicole (not her real name) is a not-so-convincing spam bot with hundreds (thousands?) of profiles across different Mastodon servers, and uses the @mention feature to promote her warez. The same warez. Every time.

The good news: Now you can send her out the airlock.

When a user is blocked, they can still see your public posts, but they can no longer interact with you. Any request they make to follow, like, reply to, repost, mention, or otherwise interact with your profile is automatically rejected.
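
To make that concrete, here's a minimal sketch of what a block check could look like at the inbox level, written in TypeScript. The handleInboxActivity function and the BlocklistStore and Outbox interfaces are assumptions for illustration only, not Ghost's actual implementation.

```typescript
// Minimal illustrative sketch, not Ghost's actual code: filter incoming
// ActivityPub activities against a per-publication blocklist.

interface Activity {
  id: string;
  type: string;   // 'Follow', 'Like', 'Announce', 'Create', ...
  actor: string;  // URI of the remote actor sending the activity
  object?: unknown;
}

// Hypothetical store holding the actors this publication has blocked.
interface BlocklistStore {
  isBlocked(actorUri: string): Promise<boolean>;
}

// Hypothetical delivery helper for sending activities back out.
interface Outbox {
  deliver(activity: Record<string, unknown>, toActorUri: string): Promise<void>;
}

async function handleInboxActivity(
  activity: Activity,
  blocklist: BlocklistStore,
  outbox: Outbox,
): Promise<'accepted' | 'rejected'> {
  if (await blocklist.isBlocked(activity.actor)) {
    // Follow requests get an explicit Reject so the remote server knows
    // not to keep waiting; likes, replies, reposts and mentions are dropped.
    if (activity.type === 'Follow') {
      await outbox.deliver(
        {
          '@context': 'https://www.w3.org/ns/activitystreams',
          type: 'Reject',
          actor: 'https://my-publication.example/activitypub/actor', // hypothetical actor URI
          object: activity.id,
        },
        activity.actor,
      );
    }
    return 'rejected';
  }

  // Otherwise, continue with normal processing: notifications, replies,
  // follower updates, and so on.
  return 'accepted';
}
```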

Being able to block users matters because healthy communities are built on consent. Every participant should be able to decide who can reach them, who can't, and when the conversation is over.

In an open, federated environment like ActivityPub (where posts can flow in from thousands of independent servers), bad actors, drive-by harassment, and spam aren't hypothetical edge cases; they're statistical certainties.

Robust user-level moderation tools turn that reality from a deal-breaker into a manageable nuisance. They allow you to publish publicly without surrendering your personal boundaries, so you can curate a meaningful experience.

That being said, the astute pugs among you will have noticed a shortcoming in this argument.

Nicole is so famously persistent because the spam doesn't just come from a single user. You can block her, but invariably she'll pop up again a few weeks later with a new username on a new server.

In reality, thoughtful moderation doesn't come from a single feature; it requires a collection of tools working in concert. User blocking is our first step down this road, but there's much more yet to come.

Our long-term goal is simple: each Ghost publication should be able to define its own social atmosphere. That means putting the dials and levers of moderation directly in the hands of publishers, whether that's blocking a single nuisance account, muting an entire server, or setting up automated filters.
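
To illustrate how those layers could fit together, here's a small TypeScript sketch of one possible evaluation order: account blocks first, then server-wide mutes, then automated keyword filters. The ModerationSettings shape and the moderateIncoming function are hypothetical, a sketch of the idea rather than a description of Ghost's roadmap or API.

```typescript
// Illustrative sketch of layered moderation checks; names and shapes are assumptions.

interface ModerationSettings {
  blockedActors: Set<string>; // full actor URIs
  mutedDomains: Set<string>;  // e.g. 'spammy.example'
  keywordFilters: string[];   // case-insensitive substrings
}

type Decision = 'allow' | 'reject' | 'filter';

function moderateIncoming(
  actorUri: string,
  content: string,
  settings: ModerationSettings,
): Decision {
  // 1. Hard block on the individual account.
  if (settings.blockedActors.has(actorUri)) {
    return 'reject';
  }

  // 2. Mute everything coming from an entire server.
  const domain = new URL(actorUri).hostname;
  if (settings.mutedDomains.has(domain)) {
    return 'reject';
  }

  // 3. Automated keyword filters hide the content rather than rejecting it outright.
  const lowered = content.toLowerCase();
  if (settings.keywordFilters.some((kw) => lowered.includes(kw.toLowerCase()))) {
    return 'filter';
  }

  return 'allow';
}
```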

Your publication, your rules, your community.