I found that the members of r/luckystar were saying very sexual things about characters that, while 18, had very childlike designs. And even so, they were saying shit like, “Cunny!” and, “Correction!” so, yeah.

I constantly reported all such posts and comments I came across, and eventually made a post asking why they were saying things like that. I wasn’t hostile, and was in fact rather calm about the situation. Meanwhile, they were all extremely rude and hostile toward my post, calling me names and such, and about 20 minutes later, my account was shadowbanned.

So, not only did Reddit moderation side with the pedos, they didn’t even have the decency to tell me what I did wrong. Given how suddenly it all happened, I’m just gonna assume the pedos mass-reported my account for God knows what. Any and all appeals I’ve made since then received no response.

I still stuck around for a few months after that (that whole deal happened back in May of 2025), but the extreme leftist political dooming invading every corner of the website made it absolutely miserable to use.

Tonight, I finally put my foot down and said, “I quit!” I discovered this website and made an account here. I guess you can use this post to welcome me to the Fediverse.

  • Hawke@lemmy.world · 2 days ago

    Sure, I guess, although it’s kind of inextricably linked to the damage of actually using it.

      • Hawke@lemmy.world · 2 days ago

        Because it’s less real. The amount of harm and/or damage is proportional to the realism. Using that shitty line drawing I made as an example: if I say the lines represent something objectionable, would that make it so? No, not really.

        The closer to real, the greater the psychological damage to the viewer. However, it still does no actual harm to anyone else.

        Production of actual CSAM, on the other hand, actually does harm children.

        Like, this seems like blatantly obvious stuff; no one is harmed by someone making lines on paper. (Or, with modern tech, lines on a screen, but the idea is the same.)

        • null@piefed.nullspace.lol · 2 days ago

          So you agree that your line example is bad, then. The lines are “less real” than a more detailed drawing.

          • Hawke@lemmy.world · 2 days ago

            I mean if you want to interpret some shitty line drawings as CSAM, knock yourself out.

            The point I was trying (and clearly failing) to make is that judging images by their labels is stupid, but so is leaving the appearance entirely open to interpretation.

            Hell, I hadn’t even considered LLMs, where a text description alone would be a problem, since an LLM could use that to generate an image.

            • null@piefed.nullspace.lol · 1 day ago

              I mean, if you want to say that sexual drawings of children are basically the same as a line, knock yourself out.

              Pretty fucked up take though.