The tech mogul’s platform is the first to get hit with charges under new EU social media law.

The European Union is calling Elon Musk to order over how he turned social media site X into a haven for disinformation and illegal content.

The EU Commission on Friday formally charged X with failing to respect EU social media law. The platform could face a sweeping multi-million euro fine in a pioneering case under the bloc’s new Digital Services Act (DSA), a law to clamp down on toxic and illegal online content and algorithms.

Musk’s X has been in Brussels’ crosshairs ever since the billionaire took over the company, formerly known as Twitter, in 2022. X has been accused of letting disinformation and illegal hate speech run wild, rolling out misleading authentication features and blocking external researchers from tools to scrutinize how malicious content spreads on the platform.

The European Commission oversees X and two dozen of the world’s largest online platforms, including Facebook and YouTube. The EU executive’s probe into Musk’s firm, opened in December 2023, was the first formal investigation under the law, and Friday’s charges are the first ever brought under the DSA.

Infringements of the DSA can lead to fines of up to 6 percent of X’s global revenue.

  • blazera@lemmy.world · 4 months ago

    Disinformation is words

    It spreads on twitter, it spreads on facebook, on tiktok, on youtube, on discord, in text messages, books, speeches, talking to coworkers. This is like the war on drugs, except it’s even easier to circumvent any bans. You’re not gonna beat disinformation by trying to block it.

    • fluxion@lemmy.world · 4 months ago

      You’re also not going to beat it by not trying to deal with it. The transition from twitter being an unreliable source to becoming an unbridled dumpster fire of disinformation and hate campaigns has a direct correlation with Musk taking specific steps to cater to those audiences while ripping out any facilities to filter it.

      It’s not all or nothing; like basically everything else in life, it requires balance. Just like you don’t have to “beat” drugs to help drug users find a better path, you don’t have to “beat” disinformation in order to help stop it from spreading. You can take steps, when and where they make sense, to limit the damage and give people a chance to pull their heads out of the cesspool and get enough air that society can function more or less in tune with reality.

      • blazera@lemmy.world · 4 months ago

        Just like you don’t have to “beat” drugs to help drug users find a better path, you don’t have to “beat” disinformation in order to help stop it from spreading

        The war on drugs notably did not involve helping users find a better path; it only tried to block the path of drug use, with pretty disastrous results, as drug users became pariahs pushed to more dangerous sources of drugs to get around the blocks.

        The only thing we are talking about here is a block on one path of disinformation. They’ll get pushed to the fringes, to more dangerous sources of misinformation.

        • fluxion@lemmy.world · 4 months ago (edited)

          I’m not talking about the war on drugs; I’m talking about the fact that rehab facilities, education, and counseling/medical aid are helpful in curtailing an out-of-control drug epidemic and reducing its negative impact on society.

          Just because the “war on drugs” failed doesn’t mean drug-related issues can’t be addressed to some degree. You focus on completely blocking misinformation so it doesn’t exist; I’m trying to point out other considerations: ranking, exposure, flagging/reviewing posts, community notes to provide additional context. These are all things that exist, that are used heavily, that shape our information feeds 24/7, and that will continue to be used to significant effect on the general population, whether for good or for bad. More likely the latter if everyone adopts perspectives like yours.

          • blazera@lemmy.world · 4 months ago

            I am talking about the war on drugs, as that is what this is akin to: purely trying to block disinformation.

            All of the “other considerations” you’ve added, except for community context, are just tools to block, like the war on drugs using drug tests, drug-sniffing dogs, and report hotlines: methods to find drugs and punish people for them.

            Community context is a good example of something that does work; it’s akin to educating people about drugs rather than trying to block them. But twitter has that tool, and twitter is being punished for not blocking misinformation.

            • fluxion@lemmy.world · 4 months ago (edited)

              The specific charges noted in the article have nuances similar to the examples I gave. They are fixable, addressable, and impactful. They do not require a full block on misinformation, which is obviously not possible to enforce effectively and not what’s being expected of X.

              • blazera@lemmy.world · 4 months ago

                I just wrote out a long response, ending with the idea that if misinformation gets removed from twitter, it’s only because it’s moved somewhere less visible to the public. And then I realized I was arguing that disinformation would be less visible to the public.

                Kick Musk’s ass, EU

    • floofloof@lemmy.ca · 4 months ago

      The article states that the EU is objecting to a couple of particular things:

      The EU said X’s blue checks policy was deceptive and had been abused by malicious actors. The checks were initially created as a way to verify users like government officials, public figures and journalists, in an effort to limit misinformation, but Musk changed that policy, allowing users to buy blue-check accounts. The new policy has been abused by fraudsters to impersonate U.S. politician Hillary Clinton and author J.K. Rowling, among many other celebrities.

      The platform also didn’t respect an obligation to provide a searchable and reliable advertisement repository and limited access to its public data to researchers, the Commission said.

      This is not some amorphous campaign against disinformation, it’s a challenge to two specific policies of X.

    • Avid Amoeba@lemmy.ca · 4 months ago (edited)

      When the vast majority of it spreads on a handful of platforms, you can very much beat it by blocking it. We’re not doing it, not because we can’t, but because letting it spread is profitable. Prior to the invention of modern social media, the problem of misinformation was much smaller. Of course it will never disappear, but we don’t need it to disappear.

      • Pennomi@lemmy.world · 4 months ago

        There’s been research on deplatforming, and it’s actually really effective at reducing the spread of such content: most followers aren’t motivated enough to jump to a different website to keep following their conspiracy content.

      • blazera@lemmy.world · 4 months ago

        I wish you could’ve lived in the wild days of eating lead and radiation, well before the internet was even an idea.