

I moved away from centralised social media because platforms owned by multinational corporations benefit from bigotry and rage, and so allow it to fester and grow. They do this by under-moderating, or by moderating with a bias against the people being harassed and attacked.
So the last thing I would choose to do is move to a platform/network that prides itself on its lack of moderation and requires vulnerable, targeted folk to play whack-a-mole, with each person having to reactively block individual bigots, one by one, after they've appeared and dumped their payload of hate.

In theory, yes; in practice, no.
Nostr uses relays. In some ways, a relay is like an instance on the fediverse. Where they differ, though, is that a) relays don't talk to each other, and b) users can sign up to many different relays and push/pull content to all of them.
So in practice, to see any real breadth of content, you end up connecting to multiple relays. And even though a relay does have some moderation capability to block content, unless every relay you use blocks the bigoted account, you'll see it.
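To make that concrete, here's a rough sketch of how a client pulls a feed from several relays at once over WebSockets (the relay URLs and the renderNote helper are made up for illustration; the wire format is the standard Nostr REQ/EVENT exchange):

```typescript
const relays = [
  "wss://relay.example-one.com", // hypothetical relay URL
  "wss://relay.example-two.com", // hypothetical relay URL
];

const subId = "my-feed";
const filter = { kinds: [1], limit: 50 }; // kind 1 = short text notes

// Stand-in for whatever the client does to show a note in the timeline.
function renderNote(event: { pubkey: string; content: string }) {
  console.log(event.pubkey, event.content);
}

for (const url of relays) {
  const ws = new WebSocket(url);

  ws.onopen = () => {
    // Ask each relay for the same feed: ["REQ", <subscription id>, <filter>]
    ws.send(JSON.stringify(["REQ", subId, filter]));
  };

  ws.onmessage = (msg) => {
    const data = JSON.parse(msg.data);
    if (data[0] === "EVENT" && data[1] === subId) {
      // Every connected relay feeds the same timeline, so a single relay
      // that hasn't blocked an account is enough for its posts to reach you.
      renderNote(data[2]);
    }
  };
}
```

There's no step where relays compare notes with each other, and the client happily merges whatever any of them sends.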
If you signed up only to a single relay, and that relay had good moderation, then in theory your Nostr experience wouldn't be terrible, but a single niche relay like that would mean you see basically no content. And as soon as you connect to a larger public relay to get more content, you lose all of the moderation advantages offered by that first relay. Which means, in practice, there is no incentive to run a well-moderated relay.
And so all of the moderation ends up on the end user, who has to manually block accounts, and only after they've appeared and dumped their load of hate (at which point the bigot will just spin up another account). Some people prefer that experience, but when you're a regular target of hate, that approach just doesn't work.
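And "blocking" here really is just a local list in the client, something like the minimal sketch below (names are illustrative, not any particular client's API):

```typescript
// Local mute list: the only "moderation" the end user gets by default.
const mutedPubkeys = new Set<string>();

function blockAuthor(pubkey: string) {
  // Can only be called *after* the user has already seen the hateful note.
  mutedPubkeys.add(pubkey);
}

function shouldDisplay(event: { pubkey: string }): boolean {
  // A fresh account means a fresh keypair, i.e. a pubkey that isn't on the
  // list yet, so it gets through until the user blocks that one too.
  return !mutedPubkeys.has(event.pubkey);
}
```

Nothing in that loop ever stops the content being published or delivered; it only hides it, one account at a time, after the fact.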