A response to Evan Prodromou’s “Big Fedi, Small Fedi”

  • thenexusofprivacy@lemmy.world (OP) · 1 year ago

    It’s not that he wants the fediverse to be unsafe. It’s more that the Big Fedi beliefs he describes – everybody having an account there (which by definition includes Nazis, anti-trans hate groups, etc.), relying on the same kind of automated moderation tools that haven’t led to safety on other platforms – lead to a fediverse that’s unsafe for many.

    And sure, there are some people who say the fediverse is fine as it is. But that’s not the norm for people who disagree with the “Big Fedi” view he sketches. It’s like saying “People who want to federate with Threads are all transphobic.” There are indeed some transphobic people who want to federate with Threads – We Distribute just reported on one – but claiming that’s the typical view of people who want to federate with Threads would be a mischaracterization.

    • Dame @lemmy.ml · 1 year ago

      Yes, but your response is misleading: it insinuates that we have ways to keep those people off of the Fediverse, and we don’t. It’s not possible – all of the kinds of people you’re saying he wants on the Fediverse are already on the Fediverse. Also, small doesn’t equal safer: the Fediverse was small until last year, and for years minorities have dealt with racism and harassment. What he said is ultimately right – we need to build the tools. With better tools, people can largely have their preferred experiences; it doesn’t have to be an either/or situation.

      • thenexusofprivacy@lemmy.world (OP) · 1 year ago

        I agree that small doesn’t equal safer; in other articles I’ve quoted Mekka as saying that for many Black Twitter users there’s more racism and there are more Nazis on the fediverse than on Twitter. And I agree that better tools will be good. The question is whether, with current tools, growth under the principles of Big Fedi leads to more or less safety. Evan assumes that safety can be maintained: “There may be some bad people too, but we’ll manage them.” Given that the tools aren’t sufficient to manage the bad people today, that seems like an unrealistic assumption to me.

        And yes, there are ways to keep these people off the fediverse (although they’re not perfect). Gab isn’t on the fediverse today because everybody defederated it. OANN isn’t on the fediverse today because everybody threatened to defederate the instance that (briefly) hosted it, and as a result the instance decided to enforce its terms of service. There’s a difference between Evan’s position – that he wants them to have accounts on the fediverse – and the alternate view that we don’t want them to have accounts on the fediverse (although we may not always be able to prevent it).
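
        (To make the defederation mechanism concrete, here’s a minimal sketch of how a single admin might suspend a domain via Mastodon’s admin API. It assumes a Mastodon 4.x server exposing POST /api/v1/admin/domain_blocks and an admin-scoped token; the instance URL, token, and blocked domain are hypothetical placeholders, not anything Evan or I have run.)

        ```python
        # Minimal sketch (assumption: Mastodon 4.x admin API with
        # POST /api/v1/admin/domain_blocks and a token carrying the
        # admin:write:domain_blocks scope). All values are placeholders.
        import requests

        INSTANCE = "https://mastodon.example"      # hypothetical instance URL
        TOKEN = "REPLACE_WITH_ADMIN_TOKEN"         # admin-scoped access token

        def suspend_domain(domain: str, public_comment: str = "") -> dict:
            """Suspend federation with `domain`, dropping its existing and future content."""
            resp = requests.post(
                f"{INSTANCE}/api/v1/admin/domain_blocks",
                headers={"Authorization": f"Bearer {TOKEN}"},
                data={
                    "domain": domain,
                    "severity": "suspend",             # vs. "silence" or "noop"
                    "public_comment": public_comment,  # shown on the instance's about page
                },
                timeout=30,
            )
            resp.raise_for_status()
            return resp.json()

        # Example: defederate a hypothetical hate-speech instance.
        if __name__ == "__main__":
            print(suspend_domain("gab.example", "Hate speech; violates server rules"))
        ```

        At the level of a single server that’s all a block is; the “everybody defederated it” outcome is just many admins making the same call.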

        • Dame @lemmy.ml · 1 year ago

          OANN and Gab are just a couple of examples of backing down. What about the child porn instances? They’re still on the Fediverse; they’re just blocked by lots of instances. Using Gab as an example gives people a false sense of safety.

          • thenexusofprivacy@lemmy.world (OP) · 1 year ago

            Or, using Gab as an example provides a sense of what’s possible.

            And child porn is a great example – and CSAM more generally. Today’s fediverse would have less CSAM if the CSAM instances weren’t on it. Why hasn’t that happened? The reason many instances give for not blocking the instances that are well-known sources of CSAM is that CSAM isn’t the only thing on those instances. And it’s true: these instances have lots of people talking about all kinds of things, and only a relatively small number of people spreading CSAM. So not blocking them is completely in alignment with the Big Fedi views Evan articulates: everybody (even CSAM-spreaders) should have an account, and it’s more important to have the good (non-CSAM) people on the fediverse than to keep the bad (CSAM-spreading) people off.

            A different view is that whoa, even a relatively small number of people spreading CSAM is way too many; today’s fediverse would be better if they weren’t on it, and if the instances that allow CSAM are providing a haven for them, then those instances shouldn’t be on the fediverse. It seems to me that view would result in less CSAM on the fediverse, which I see as a good thing.