- sbv@sh.itjust.worksEnglish3 hours
With a 70% non-compliance rate, that isn’t entirely surprising.
Platforms are even less likely to implement real reforms that the author alludes to.
- 1 hour
I still think it’s a step in the right direction. Once you make it illegal for children to use social media, you can start going after the platforms for knowingly manipulating children.
- 15 minutes
Prohibition is effective; it just doesn’t work for easy-to-manufacture compounds such as alcohol or marijuana. Every known human culture has independently discovered alcohol, and marijuana is a weed that is ready to smoke in its natural form.
As far as social media goes, my country has reached a point where TikTok and Facebook are preinstalled on every phone. If a parent buys their kid a phone and removes those apps, they reinstall themselves after an automatic update. When you take into consideration the “streamlined” registration process, one can argue this is a means to target prepubescent children.
…I guess an 8 year old could download a VPN and steal their parents’ identification, but I feel like some form of prohibition would help.
- 2 minutes
So you not only create a grey market, you immediately inculcate the children into it.
Prohibition is generally ineffective in anything that doesn’t involve violating someone else’s rights.
If we’re talking about getting rid of slopware I’m all for it. But this law, and other laws like it, are an incredibly thinly veiled attempt to silence dissent by tying people’s online comments to their employment, and subsequently their housing and healthcare.
And I will never believe that this is done out of concern for children.
shortwavesurfer@lemmy.zipEnglish
5 hoursSpeak for yourself. I find quite a bit of joy in “I told you so”.
- commander@lemmy.worldEnglish4 hours
They’re propaganda laws. Internet censorship laws. The Palestinian genocide started trending on social media and suddenly all the countries out in the west wanted to start banning/controlling social media. Plus there was the earlier push by Facebook to ban TikTok, trying to pull the ladder up on competitors.
- TheDarkQuark@lemmy.worldEnglish4 hours
How are they getting around it? I thought Australia implemented some sort of ID verification? And given that social media platforms notoriously refuse to run on VPNs, how are the teenagers bypassing the checks? Is everyone there using their parent’s ID or something?!
- corsicanguppy@lemmy.caEnglish1 hour
I thought Australia implemented some sort of ID verification?
To answer the question as written, you did think so. I’m mostly certain of this. Glad I could help.
- sbv@sh.itjust.worksEnglish3 hours
The platforms aren’t complying with the law:
Of the parents who reported their child had an account on each platform prior to 10 December 2025, around 7 in 10 reported that their child still had an account on Facebook (63.6%), Instagram (69.1%), Snapchat (69.4%), and TikTok (69.3%). Around 3 in 10 reported that their child no longer had an account. One in two of these parents (48.5%) reported that their child still had an account on YouTube following the age restrictions coming into effect.
- Fluffy Kitty Cat@slrpnk.netEnglish9 hours
It was never designed to protect children
Glad to see it’s not even working. Let’s keep fighting against these evil laws.
- expr@piefed.socialEnglish7 hours
I mean, social media should be banned for everyone, not just teenagers. It’s a great evil in the world today, and in a functional democracy that wasn’t braindead, we should ban them outright for the mass harm and destruction they have caused.
That being said, I fully understand that the motivations of countries for these kinds of bans have little to do with the harm of social media and are much more about surveillance.
- Link@rentadrunk.orgEnglish13 minutes
Which type of social media are we referring to here?
Doesn’t Lemmy count as social media?
yardratianSoma@lemmy.caEnglish
3 hoursIt’s so bonkers how most of the older generations agree that being on the internet cannot make you social, yet it became the default method of communication.
Ban it for everyone? I mean, Lemmy itself is a social network platform, if you want it to be. But I know what you mean: social media being the most used platforms, Google, Facebook, TikTok, etc. And for that, yeah, I do agree with a full ban. We need a cultural reset, where we aren’t being fed sensationalist bullshit and pure brainrot as entertainment via an algorithm trained on our insufficient capacity to regulate our attention.
Dave@lemmy.nzEnglish
53 minutesIn my view social media is probably not the problem, but the algorithms they use that are designed to be addictive and manipulative.
I saw an article once arguing that the algorithms should be regulated in a similar way to medicine. Give some base ingredients they can use freely (e.g. sort by newest first), then require any others to run studies to prove they are not harmful.
There would be an expert board that approves or declines the new algorithm in the same way medicines are approved today (the important bit being that they are experts, not politicians making the decision).
- expr@piefed.socialEnglish2 hours
If you take such a broad definition of social media, then nearly the entire Internet becomes “social media” and the term loses its meaning, IMO.
- Lodespawn@aussie.zoneEnglish9 hours
I don’t think they are evil. A bunch of people with good intentions who didn’t understand the problem are trying to solve it with a gut feeling rather than analysis and evidence. It’s really disappointing that they would waste so much of our time and money like this.
Scotty_Trees@lemmy.worldEnglish
5 hoursFormer Facebook higher-ups have gone on the record to say that Facebook uses destructive algorithms to keep people hooked. They know exactly what they are doing and don’t care how it affects us, as long as they can squeeze more info from us for more profit. Thinking Silicon Valley tech billionaires actually care about you? Bro, you need to wake up.
- Lodespawn@aussie.zoneEnglish4 hours
We’re talking about Australian legislation not social media itself. The problem is real, the legislation is ineffective and poorly implemented. Calling the legislation evil is a stretch. Modern social media is most certainly evil.
- [deleted]@piefed.worldEnglish9 hours
There is no problem to solve that hasn’t already been addressed with parental controls.
- 8 hours
If the parental control comes from the social media site itself then it’s likely the parent that’s being controlled. The most important control is limiting screen time and not every site allows parents to set hard limits.
- Lodespawn@aussie.zoneEnglish8 hours
There is a problem with social media addiction, but the solution isn’t restricting teens from it. The solution, as with most things, is education: educating the kids, educating their parents, and making sure they both have the tools available to them to make smart decisions.
- [deleted]@piefed.worldEnglish8 hours
Limiting total time spent on something is one of the parental control options. It isn’t just blocking things 100%.
- Lodespawn@aussie.zoneEnglish7 hours
Just having parental controls exist isn’t an effective solution. Well implemented education is required to ensure it is used effectively.
- NannerBanner@literature.cafeEnglish7 hours
You can’t educate someone out of an addiction. That’s a fundamental misunderstanding about addiction…
- Lodespawn@aussie.zoneEnglish7 hours
No but you can educate their support networks and build other systems to help them work through their addiction.
- MurrayL@lemmy.worldEnglish8 hours
The issue with this argument is that many kids don’t have good parents, and some don’t have any parents at all.
Are those kids just supposed to be left to the mercy of bad actors because of their circumstances?
- [deleted]@piefed.worldEnglish8 hours
Guess we just let for profit companies and authoritarian states suck up all the data on everyone whether it works or not then.
- MurrayL@lemmy.worldEnglish7 hours
I didn’t say I approve of the current tactics, I’m just pointing out that circumstances can be more complex than simply saying ‘let the parents sort it out’ and leaving it at that.
- Lodespawn@aussie.zoneEnglish7 hours
The current solution just cuts those at risk kids off from all modern support networks.
- Fluffy Kitty Cat@slrpnk.netEnglish6 hours
On purpose, to keep them trapped in abusive households with no support.
- bobzer@lemmy.zipEnglish8 hours
If you set parental controls on your own teen’s device, all you’re doing is isolating them from their peers and making them the kid with the weird parent who doesn’t let them post on TikTok.
Social media isn’t what it was when we were growing up. It’s designed to prey on them the same way slot machines create gambling addictions.
I’m no puritan but I do truly believe banning kids from social media and restricting teens at a legislative level would be a net benefit for society. Same as alcohol or drugs.
- some_kind_of_guy@lemmy.worldEnglish4 hours
Prohibition didn’t work for drugs either, so why would it work here? Why do we need to learn that lesson over and over again?
- [deleted]@piefed.worldEnglish8 hours
Limiting total time spent on something is one of the parental control options. It isn’t just blocking things 100%.
- 7 hours
Good intentions without the spirit of cooperation or respect for consent is still evil.
The main problem with all of these internet surveillance tools being marketed as ways to protect children is that people are engaging with them on that basis.
As far as I’m concerned they haven’t done anything to establish that they actually intend to protect children or that this is a reasonable way to do it. This seems like a solution to a different problem that ignores all of the problems it creates.
Parents should be responsible for their children. A random website creator shouldn’t have to be responsible for your children.
Websites aren’t stores where people walk in off of a public street. They are services that people reach out to and engage with specifically and intentionally. If we can address the non-consensual non-intentionality part of internet tracking and surveillance a lot of this stuff goes away. So maybe rather than regulating the website to protect your children we should be regulating the website to protect consent.
- Lodespawn@aussie.zoneEnglish7 hours
I don’t agree that the legislators left the spirit of cooperation or respect for consent out because they are evil, I think they left them out because they are ignorant. I think they are inexperienced with both technology and social media and have failed to appropriately engage people that might have helped them come up with a functional solution rather than an ineffective brute force.
I do however agree with everything else you’ve said above.
- lumpenproletariat@quokk.auEnglish8 hours
The “good intention” was the packaging. The real intent was population control.
- someguy3@lemmy.worldEnglish5 hours
IMO it’s not a question of whether they remain on it, but how much time they spend on it. She’s focusing on the wrong metric.
- gurty@lemmy.worldEnglish9 hours
‘…internally the government was aware of a lack of evidence to support the ban before they passed the legislation anyway’
Terrific job, gov.
- 9 hours
Our government is usually technologically inept.
The first online census (2016) crashed the system because they didn’t allow enough capacity. Anyone with half a brain could have told them that most people were going to try to use it during one particular time – after dinner (especially since the paper census is supposed to count everyone on that particular night). Instead, they decided to rate it for only 1 million form submissions per hour, despite estimating that two-thirds of Australians would fill it out online. At one person per family, that’s around 4 million online submissions. Now factor in that the eastern states have most of the population (and are all in the same time zone at that time of year) and, predictably, the site went down after dinner on census night.
https://www.abc.net.au/news/2016-08-09/abs-website-inaccessible-on-census-night/7711652
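The mismatch in those numbers is easy to check on the back of an envelope. Here’s a rough sketch of the arithmetic; the 3-hour after-dinner peak window is my own assumption, the other figures come from the comment above:

```python
# Back-of-envelope check of the 2016 online census capacity plan.
# Figures from the comment above: the site was rated for 1 million
# form submissions per hour, and roughly 4 million households were
# expected to submit online.
rated_capacity_per_hour = 1_000_000
expected_online_submissions = 4_000_000

# Assumed: since everyone is counted on census night, most households
# submit in a short after-dinner window, say 3 hours (19:00-22:00).
peak_window_hours = 3

peak_rate = expected_online_submissions / peak_window_hours

print(f"Peak demand:    {peak_rate:,.0f} submissions/hour")
print(f"Rated capacity: {rated_capacity_per_hour:,} submissions/hour")
print(f"Overload:       {peak_rate / rated_capacity_per_hour:.2f}x")
```

Even with generous assumptions, demand in the peak window comfortably exceeds the rated capacity, and that’s before accounting for most of the population being concentrated in one eastern time zone.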
Lexam@lemmy.worldEnglish
9 hoursI don’t know. There’s some joy in saying I told you so, to people who had the hubris to try and stop teenagers from being teenagers.
Grail@multiverse.soulism.netEnglish
5 hoursAI companies support the age verification laws because they want to ban kids from talking to anyone on the internet except their robot pedophiles
ikt@aussie.zoneEnglish
8 hours7 in 10? so 3 are off of it? good news 🥳
please expand to over 65 year olds as well
Amnesigenic@lemmy.mlEnglish
8 hoursThe vast majority of new systems throughout history have required some iterative refinement. The fact that this specific implementation attempt didn’t work perfectly on day one isn’t a particularly strong argument against the concept, and there are plenty of good arguments to be made against it.