• HarkMahlberg@kbin.social
    1 year ago

    It does sound good. It’s not a bad idea, but have you considered certain problematic situations that may get worse when you introduce a feature like this? For instance, an abusive parent could use such a “parent-child account” to control their victim’s online activity, prevent them from accessing contraception or abortion services, restrict access to LGBT material and communities, etc. That leads to one of two outcomes: a victim unable to navigate the internet on their own (in conjunction with other restrictive and abusive practices levied by the adult), or a victim who creates their own hidden account without that oversight, lying about their age to do it, just to reach resources they may need.

    At the end of the day, we’re still talking about technological solutions to human problems, and it’s just the wrong tool for the job. Maybe “wrong” is too harsh, but regardless, it’s not ideal.

    • TherouxSonfeir@lemm.ee
      1 year ago

      I’ve got nothing for that situation. What’s your solution? We can’t very well ask people to verify their identity. Or… I mean… maybe this is a totally different type of paid service that is TOTALLY tied to your real self?

      Everything just points to dismantling social networking, which also means forums, bulletin boards, and every other method of communicating with people.

      • HarkMahlberg@kbin.social
        1 year ago

        I don’t have a solution, I’m not that clever lol. I do think dismantling social networking is an overreaction, though. Humans understand object permanence; we know we can talk to people who aren’t in the same room as us. The internet and social media are just technological extensions of that ability. I’m not pointing out flaws in good ideas because I want to sink them, I’m pointing them out because someone may yet have a solution to that downstream problem, which would strengthen the idea even further.

        I mean, here’s an idea to combat abusive relationships, one that isn’t reliant on any technology: make social media platforms mandatory reporters. I’m sure there are flaws in this idea too, but it may be somewhere to start if we’re trying to tackle the issue of minors being harassed or abused on the internet.

        • TherouxSonfeir@lemm.ee
          1 year ago

          “Mandated reporters may include paid or unpaid people who have assumed full or intermittent responsibility for the care of a child, dependent adult, or elder.”

          You mean like… a parent or guardian? Hehehe. All this comes back to holding the parent or legal guardian legally responsible for bad things that happen to a child when those things are within their control.

          I’m 100% for giving the parents of the kid in this article a misdemeanor neglect charge. They literally have one job: put forth a reasonable amount of effort to make sure nothing bad happens to the kid. And I’m pretty sure ongoing sex pic trading is something that isn’t hard to catch if you lock the phone down even a little bit.