Just listened to Naomi Brockwell talk about how AI is basically the perfect surveillance tool now.

Her take is very interesting: what if we could actually use AI against that?

Like instead of trying to stay hidden (which honestly feels impossible these days), what if AI could generate tons of fake, realistic data about us? Flood the system with so much artificial nonsense that our real profiles basically disappear in the noise.

Imagine thousands of AI versions of me browsing random sites, faking interests, triggering ads, making fake patterns. Wouldn’t that mess with the profiling systems?

How could this be achieved?
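One way to picture the mechanics: flood your activity history with visits to topics you don’t care about, so your real interests stop standing out. The following is a minimal, hypothetical sketch of that dilution effect — the topic lists, visit counts, and `simulate_profile` helper are all made up for illustration; a real system would drive an actual headless browser, not a counter.

```python
import random
from collections import Counter

# Hypothetical data for the sketch: your real interests vs. decoy topics.
REAL_INTERESTS = ["privacy", "electronics", "cooking"]
DECOY_TOPICS = ["golf", "opera", "knitting", "fishing",
                "astrology", "vintage cars", "gardening", "crypto"]

def simulate_profile(real_visits: int, decoy_visits: int, seed: int = 0) -> Counter:
    """Return a topic-visit histogram mixing real and decoy activity."""
    rng = random.Random(seed)
    profile = Counter()
    for _ in range(real_visits):
        profile[rng.choice(REAL_INTERESTS)] += 1
    for _ in range(decoy_visits):
        profile[rng.choice(DECOY_TOPICS)] += 1
    return profile

# With 10x decoy traffic, the real interests no longer dominate the histogram.
noisy = simulate_profile(real_visits=50, decoy_visits=500)
```

The open question, as several replies below point out, is whether trackers can filter the decoys back out — uniform random noise is much easier to separate than noise that mimics real human behavior.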

  • moseschrute@lemmy.ml · 3 months ago

    I feel like I woke up in the stupidest timeline: climate change is about to kill us, we stupidly decide to 10x our power needs by shoving LLMs down everyone’s throats, and the only way to stay private is to 10x our personal LLM usage by generating tons of noise about ourselves. So now we’re 100x-ing everyone’s power usage and we’re going to die even sooner.

    I think your idea is interesting – I was thinking the same thing a while back – but how tf did we get here.

    • octobob@lemmy.ml · 3 months ago

      Yeah, agreed. Here in my state of Pennsylvania, they’re reopening the Three Mile Island nuclear plant near Harrisburg for the sole purpose of powering Microsoft’s AI data centers. This is Unit 1, which was closed in 2019; Unit 2 was the one permanently closed after the meltdown in 1979.

      I’m all for nuclear power. I think it’s our best option for an alternative energy source. But the only reason they’re reopening the plant is that our grid can’t keep up with AI. I believe the data centers are the only thing the nuke plant will power.

      I’ve also seen the scale of the power demands in my own work. I’m an industrial electrical technician, and part of our business is the control panels for cooling the server racks in Amazon AI data centers. They just keep buying more and more of them, projected through at least 2035 right now. All the big tech companies are totally revamping everything for AI. Before, a typical rack section might have drawn, say, 300 watts; now it’s more like 2,000 watts. Again, just for AI.

  • a14o@feddit.org · 3 months ago

    It’s a good idea in theory, but it’s a challenging concept to have to explain to immigration officials at the airport.

  • wise_pancake@lemmy.ca · 3 months ago

    In a different direction: now is a good time to start looking at how local AI can liberate us from big tech.

  • relic4322@lemmy.ml · 3 months ago

    This is like chaff, and I think it would work. But you’d have to live with the fact that whatever patterns the chaff shows “you” doing, as far as anyone watching is concerned, you were doing them.

    I think there are other ways that AI can be used for privacy.

    For example, did you know that you can be identified by how you type and speak online? What if you filtered everything you said through an LLM first, normalizing your style? That takes away a fingerprinting option, and a pretty small local model running on a modest desktop could do it…
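To make the fingerprinting point above concrete, here is a toy sketch of the kind of surface-level style features that help identify an author — the `style_fingerprint` helper and its three features are invented for this example; real stylometry uses far richer feature sets.

```python
import re
import string

def style_fingerprint(text: str) -> dict:
    """Crude stylometric features: average word length, punctuation rate,
    and the habitual lowercase-'i' tell. Illustrative only."""
    words = text.split()
    avg_word_len = sum(len(w.strip(string.punctuation)) for w in words) / max(len(words), 1)
    punct_rate = sum(c in string.punctuation for c in text) / max(len(text), 1)
    lowercase_i = len(re.findall(r"\bi\b", text))  # standalone lowercase "i"
    return {"avg_word_len": round(avg_word_len, 2),
            "punct_rate": round(punct_rate, 3),
            "lowercase_i": lowercase_i}

# Two phrasings of the same thought yield distinct fingerprints —
# exactly the signal an LLM rewrite pass would flatten away.
casual = style_fingerprint("tbh i dont think thats right... i mean, whatever lol")
formal = style_fingerprint("I do not believe that is correct; I mean something different.")
```

An LLM normalization pass would map both inputs to the same neutral register, collapsing these features toward a common value across all users of the filter.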

  • DominusOfMegadeus@sh.itjust.works · 3 months ago

    It’s an interesting concept, but I’m not sure the payoff justifies the effort.

    Even with AI-generated noise, you’re still being tracked through logins, device fingerprints, and other signals. And in the process, you would probably end up degrading your own experience; getting irrelevant ads, broken recommendations, or tripping security systems.

    There’s also the environmental cost to consider. If enough people ran decoy traffic 24/7, the energy use could become significant. All for a strategy that platforms would likely adapt to pretty quickly.

    I get the appeal, but I wonder if the practical downsides outweigh the potential privacy gains.

  • upstroke4448@lemmy.dbzer0.com · 3 months ago

    This strategy of generating fake data just doesn’t work well. It takes a ton of resources to generate fake data that can’t be easily filtered out, which makes the strategy nonviable in most situations. Look at Mullvad’s DAITA and how constantly it has to be improved to keep working – and that’s just for basic protection.

    There’s a bit of cognitive dissonance going on here: people seem to understand that you’re tracked constantly, online and offline, through all sorts of complex means, yet still think relatively mundane solutions could break that system.

  • rumba@lemmy.zip · 3 months ago

    This is a dangerous proposition.

    When the dictatorship comes after you, they’re not concerned with the whole of every article that was written about you. All they care about are the things they see as incriminating.

    You could literally take a spell-check dictionary, pull three words out of it at random, and feed them into Ollama, asking for a story with your name that includes the three words as major plot points.

    Even on a relatively old video card, you could probably crank out three stories a minute. Have it write them in HTML and publish the sitemap to the major search engines on a regular basis.
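The pipeline described above could be sketched roughly like this — the `WORDS` list stands in for a real spell-check dictionary (e.g. /usr/share/dict/words), the helper names are invented for the example, and the Ollama call itself is left as a comment since it needs a running local server.

```python
import random
import html

# Stand-in for a spell-check dictionary; a real run would load thousands of words.
WORDS = ["harbor", "violin", "glacier", "pamphlet", "turbine",
         "orchard", "lantern", "satchel", "compass", "meadow"]

def make_chaff_prompt(name: str, rng: random.Random) -> tuple[list[str], str]:
    """Pick three random words and build a story prompt around them."""
    picks = rng.sample(WORDS, 3)
    prompt = (f"Write a short story about {name} in which "
              f"{picks[0]}, {picks[1]}, and {picks[2]} are major plot points.")
    return picks, prompt

def wrap_as_html(title: str, story: str) -> str:
    """Wrap generated text in a minimal page for publishing to a sitemap."""
    return (f"<html><head><title>{html.escape(title)}</title></head>"
            f"<body><p>{html.escape(story)}</p></body></html>")

picks, prompt = make_chaff_prompt("Jane Doe", random.Random(42))
# story = generate_with_ollama(prompt)  # hypothetical call to a local model
page = wrap_as_html("Chaff story", prompt)
```

Each run produces a different word combination, so the resulting pages never repeat a pattern that could be trivially filtered.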

    • Eyedust@lemmy.dbzer0.com · 3 months ago

      Yup, you’d be surprised what you can accomplish with 10 GB of VRAM and a 12B model. Hell, my profile pic (which isn’t very good, tbf) was made on that 10 GB card with locally hosted Stable Diffusion. I hate big-corp AI, but I absolutely love open and open-source local models. It’s going to be a shame when they start to police them.

      To OP: The problem is that they’re looking for keywords. With the amount of people under surveillance these days, they don’t give a rat’s ass if you went to your favorite coffee roasting site, they want to find the stuff they don’t want you to do.

      Piracy? You’re on a list. Any cleaning chemical that can be related to the construction of explosives? You’re on a list. These lists then tack on more keywords pertaining to that list. For example, the explosives list will then search for matching components bought within a close span of time that would indicate you’re making them. Even searching for ways to protect your privacy just makes them more interested.

      So then you put out a bunch of fake data. This data happens to say you viewed a page pertaining to that matching component. Whelp, that list just got hotter, and now there are even more eyes on you, paying slightly more attention this time. It’s a bad idea. The only way to get out of surveillance, at least online, is to never go online.

      In reality, they probably won’t even do anything about the above. What they really want is money. Money for your info; money to sell more things to you. They want the average home to be filled with advertisements tailored from your information, because those adverts make those companies money, which they then use to buy more information to monetize your existence. It’s the largest pyramid scheme known to humanity, and we’re the unpaid grunts.

      The moment the world became connected through telephones, cable TV, and then the internet, this scheme was already in motion. Let’s be honest, smartphones were the mother lode. A TV, phone, and computer you always keep on you? They were salivating that day.

  • SendMePhotos@lemmy.world · 3 months ago

    Obscuration is what you’re thinking of, and it works with things like AdNauseam (a Firefox add-on that clicks all ads in the background to obscure preference data). It’s a nice way to smear the data, and probably better to do sooner (while the data collection is in its infancy) rather than later (when the companies may be able to filter out obscuration attempts).
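The intuition behind clicking every ad can be shown with a toy example — the ad categories and the two helper functions are invented for illustration, not how AdNauseam is actually implemented.

```python
from collections import Counter

# Hypothetical stream of ad categories served to one user.
ADS_SHOWN = ["sports", "fashion", "tech", "travel", "tech", "sports",
             "finance", "tech", "fashion", "travel"]

def clicks_selective(ads: list[str], interests: set[str]) -> Counter:
    """A normal user: clicks only ads matching real interests."""
    return Counter(a for a in ads if a in interests)

def clicks_obfuscated(ads: list[str]) -> Counter:
    """AdNauseam-style: click every ad shown, regardless of interest."""
    return Counter(ads)

selective = clicks_selective(ADS_SHOWN, interests={"tech"})
obfuscated = clicks_obfuscated(ADS_SHOWN)
# "selective" reveals a real preference ("tech"); "obfuscated" just
# mirrors whatever was served, so clicks carry no preference signal.
```

The catch, raised in the replies below, is that uniform clicking is itself a recognizable pattern once enough users adopt the same tool.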

    I like it. I am really not a fan of being profiled, collected, and categorized. I agree with others, I hate this timeline. It’s so uncanny.

    • HelloRoot@lemy.lol · 3 months ago

      I still don’t really understand adnauseum. What is the difference in privacy compared to clicking on none of the ads?

      • SendMePhotos@lemmy.world · 3 months ago

        Whatever data profile they already have on you can be obscured into uselessness, versus letting them keep trickling in accurate data.

        Think of it like um…

        Having a picture of you with a moderate amount of accurate notes, versus having a picture of you with so much irrelevant/inaccurate data that you can’t be certain of anything.

        • HelloRoot@lemy.lol · 3 months ago

          But the picture of me they have is: doesn’t click ads like all the other adblocker people (which is accurate)

          Why would I want to change it to: clicks ALL the ads like all the other AdNauseam people (which is also accurate)?