Social media platforms like Twitter and Reddit are increasingly infested with bots and fake accounts, leading to significant manipulation of public discourse. These bots don’t just annoy users—they skew visibility through vote manipulation. Fake accounts and automated scripts systematically downvote posts opposing certain viewpoints, distorting the content that surfaces and amplifying specific agendas.

Before coming to Lemmy, I was systematically downvoted by bots on Reddit for completely normal comments that were relatively neutral and not controversial at all. There seemed to be no pattern to it… One time I commented that my favorite game was WoW, and it was downvoted to -15 for no apparent reason.

For example, a bot on Twitter backed by API calls to GPT-4o ran out of funding and started posting its prompts and system information publicly.

https://www.dailydot.com/debug/chatgpt-bot-x-russian-campaign-meme/

Example shown here

Bots like these probably number in the tens or hundreds of thousands. Reddit once ran a huge ban wave against bots, and some major top-level subreddits went quiet for days because of it. Unbelievable…

How do we even fix this issue or prevent it from affecting Lemmy??

  • AnotherWorld@lemmy.world · 9 points · 4 months ago

    No current social network can be bot-proof, and Lemmy is in the most unprotected position here, saved only by its low profile. On Twitter, I have personally banned about 15,000 Russian bots, but that's less than 1% of the ones out there. I've seen bot ringleaders with 165,000 followers. Just imagine all 165,000 of them registering accounts on Lemmy; there would be nothing to oppose them. I once developed a theory for a new social network where bots could exist as much as they want but could not influence your circle of subscriptions and subscribers. But it's complicated…

    • tal@lemmy.today · 7 points · 4 months ago

      Also, the “bot”/“human” distinction doesn’t have to be binary. Say an account mostly posts bot-generated text, but hands any message it receives off to a human to handle, or has a certain percentage of its content be human-crafted. That could defeat a lot of approaches to detecting a bot.