The Commission’s investigation preliminarily indicates that TikTok did not adequately assess how these addictive features could harm the physical and mental wellbeing of its users, including minors and vulnerable adults.

  • paraphrand@lemmy.world · +12 · 3 hours ago

    Tech companies shouldn’t have these sorts of algorithmic profiles of people. It’s manipulation.

  • db2@lemmy.world · +60 / -2 · 6 hours ago

    Let me hold my breath waiting for something actually meaningful to be done about it, more than a “cost of doing business” protection payment.

  • mrdown@lemmy.world · +24 / -4 · 6 hours ago

    Lemmy has infinite scrolling too, so if Lemmy gets big it will have the same legal issue

    • HertzDentalBar@lemmy.blahaj.zone · +12 · 4 hours ago

      There’s a big difference between infinite scrolling content that’s using algorithms to specifically keep you scrolling and how Lemmy does it. On Lemmy that’s a you problem not a capitalism problem.

      • mrdown@lemmy.world · +4 / -3 · 3 hours ago

        It is subjective. I personally spend more time on a non-algorithmic feed than an algorithmic one. It is boring to keep seeing the same type of content most of the time

    • Nurse_Robot@lemmy.world · +36 · 6 hours ago

      This includes features such as infinite scroll, autoplay, push notifications, and its highly personalised recommender system.

      I don’t think Lemmy (or any text based platform) really fits the bill

        • partofthevoice@lemmy.zip · +6 · 4 hours ago

          Nope. I turned off infinite scroll in the settings. I use the Voyager app, which paginates the feed when configured to do so.

        • Nurse_Robot@lemmy.world · +6 · 5 hours ago

          It has infinite scroll, but it’s not highly personalized like TikTok. It has autoplay, but there are FAR more text posts than videos, and the push notifications are far less extreme (is there even a setting to get push notifications for every upvote, like TikTok enables automatically?). Lemmy really isn’t anything like TikTok.

        • ViatorOmnium@piefed.social · +15 · 6 hours ago

          But the algorithm isn’t extremely personalized or optimized towards “engagement”. In fact the only fediverse platform that comes close is Loops, and even that is light-years away from the psychological manipulation that goes into Tiktok algorithms.

        • CosmoNova@lemmy.world · +2 · 5 hours ago

          Are you sure? They have been ignoring virtually everything with infinite scrolling. TikTok is the extremely rare exception where they actually address it.

  • ZephyrXero@lemmy.world · +13 · 6 hours ago

    And what about all their copycats? Like Instagram and YouTube that are trying to do the same thing?

    • morto@piefed.social · +9 / -1 · 5 hours ago

      If this case ends with a ruling against TikTok, it will set a precedent that allows the same to be done to them

  • FauxLiving@lemmy.world · +3 · edited · 5 hours ago

    The big danger here, which these steps mitigate but do not solve, is:

    #1 Algorithmically curated content

    On the various social media platforms, automated content-moderation systems remove or suppress content, ostensibly to protect users from illegal or disturbing material. Alongside them sit recommender systems that combine content metrics and user metrics with machine-learning models, forming a set of controls that both restrict and promote content according to criteria set by the owner. We commonly call this, abstractly, ‘The Algorithm’: Meta has theirs, X has theirs, TikTok has theirs. Originally these were used to recommend ads and products, but the companies have since discovered that selling political opinions for cash is a far more lucrative business. That shift, from advertiser to for-hire propagandist, is the danger.

    The personal metrics these systems use are made up of every bit of information the company can extract from you via your smartphone, linked identities, ad-network data, and other data brokers. The data available on the average consumer is comprehensive, right down to the user’s rough or exact location in real time.

    The Algorithms used by social media companies are black boxes, so we don’t know how they are designed, nor how they are being used at any given moment. There are things they are required to do (like block illegal content), but there are very few, if any, restrictions on what else they can block or promote, no reporting requirements for changes to these systems, and no restrictions on selling the use of The Algorithm for any reason whatsoever.

    There have been many public examples of the owners of that box restricting speech, de-prioritizing videos or suppressing content containing specific terms in a way that imposes a specific viewpoint through manufactured consensus. We have no idea whether this was done by accident (as the companies claim when they operate too brazenly and are discovered), because the owner held a specific viewpoint, or because the owner was paid to impose that viewpoint.

    This means our entire online public discourse is controllable. That means of control is essentially unregulated and is increasingly being used, and sold, for what cannot be called anything but propaganda.

    #2 - There is no #2, the Algorithms are dangerous cyberweapons, their usage should be heavily regulated and incredible restrictions put on their use against people.
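    To make the point above concrete, here is a toy sketch of engagement-optimized ranking. Every name, weight, and signal in it is made up for illustration; real platform rankers are proprietary black boxes (which is the whole problem), but the basic shape, scoring posts by inferred user interests plus raw engagement signals, is roughly this:

```python
# Toy "keep them scrolling" ranker. All field names and weights are
# hypothetical; no real platform's algorithm is public.

def rank_feed(posts, user_interests, engagement_weight=0.5):
    """Sort posts by a toy score: interest match plus engagement signals."""
    def score(post):
        # How well the post's topic mix matches the user's inferred profile
        relevance = sum(user_interests.get(topic, 0.0) * weight
                        for topic, weight in post["topics"].items())
        # Signals that historically predict more time-on-app
        engagement = post["watch_completion"] + post["share_rate"]
        return relevance + engagement_weight * engagement
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": {"politics": 0.9}, "watch_completion": 0.4, "share_rate": 0.1},
    {"id": 2, "topics": {"cats": 0.8},     "watch_completion": 0.9, "share_rate": 0.3},
]
user_interests = {"cats": 0.7, "politics": 0.1}

feed = rank_feed(posts, user_interests)
print([p["id"] for p in feed])  # → [2, 1]: the cat video outranks the political clip
```

    The manipulation risk lives in the parts this sketch leaves out: who sets the weights, what goes into `user_interests`, and whether someone can pay to change either.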

    • breakingcups@lemmy.world · +4 · 6 hours ago

      I don’t know where you live, but over here (European country) I can easily see 5 people on TikTok when I’m on the train.

      • atropa@piefed.social · +1 · 4 hours ago

        About the same as in your country; in the UK it’s 38%. But that’s a picture skewed by age: it’s popular with teenagers and with single people aged 24 to 36, to break the loneliness. I don’t know anyone around me who uses it, and the same goes for my daughter