Seth Taylor@lemmy.world · 1 day
Now this is quality journalism
This is why I only read Playboy for the articles
- 1 day
If there are any guys here who are in the UK, I can strongly recommend Andy’s Man Club, a charity that does weekly peer support social sessions for men.
They’ve got groups all over the country, and although I personally haven’t been (I’m a woman), I’ve heard so many good things about it from guys I know.
- Earthman_Jim@lemmy.zip · 1 day
How does this make someone “feel heard”??? I feel like I’m losing my mind… It’s the same to me as if someone went to the front of a McDonald’s to talk to the building about their problems. It seems completely insane, and it’s making me feel crazy that this is our world now.
- lightnsfw@reddthat.com · 23 hours
It’s not you. These people aren’t mentally well. They can’t differentiate between a real person and an LLM. Probably contributes to why they’re having woman problems too.
- Blemgo@lemmy.world · 1 day
My guess would be the same phenomenon that existed with ELIZA. People want to be heard, especially lonely people, and LLMs are pretty good at that, asking questions and acting supportive, by design.
This whole situation reminds me of the fact that some people hire escorts just to have someone to talk to.
tigeruppercut@lemmy.zip · 19 hours
The Eliza creator got his secretary to try it out, and as she got into her conversation she asked if he could leave to give her some privacy.
https://www.youtube.com/watch?v=RMK9AphfLco
There was a longer video talking about that in the context of how humans engage socially but I can’t find it right now.
edit: Oh, it was in the most recent John Oliver segment on AI chatbots
TrackinDaKraken@lemmy.world · 1 day
People care about being heard, not listened to. It's one-sided. I'm guessing they just like that the thing responded, and may not even bother reading carefully what it said. Like a friend who murmurs supportively as you prattle on about whatever: "Really?", "Umm-hmm", "Oh, I know what you mean!", "Right, exactly", and, "It's nice to talk to someone I get along with."
- quarkquasar@lemmy.world · 23 hours
This is definitely true for at least a small number of people.
I've run across more than I care to remember over the years: people who would just prattle on 24/7 if they had the energy, while not actually saying anything or conversing in any meaningful way.
It’s a living hell for me.
- foremanguy@lemmy.ml · 1 day
I think this could be a natural reaction; the human brain is wired to at least consider anything that sounds like human words.
AI is based on human output, so when you're really out of luck or desperate, it's in my opinion really hard not to fall into the trap.
- andallthat@lemmy.world · 1 day
reminds me of this old building I used to talk to. Used to listen and give me good advice. I still remember when I told it I was doing drugs again… Man, it got so upset… Came down on me like a ton of bricks!
Aniki@feddit.org · 1 day
You can feel seen by a picture (webcomic) even though the picture has no eyes.
- Don_alForno@feddit.org · 1 day
People are very good at humanizing animals and objects. If it talks or has a face, it’s subconsciously seen as a person.
- 1 day
Probably similar to anyone having a conversation where they ignore the "red flags" of a potential partner. Someone drinking too much, an offhand remark about bad debt, stuff like that. Except now you ignore the response that might be a non sequitur, repetitive, or simply nonsensical.
Seth Taylor@lemmy.world · 1 day
I never bought into religion, never bought into astrology, never gonna buy into chatbots
You can tell me I’m great and everything will be amazing 1,000 times. It doesn’t matter at all to me if it’s not real
I like to escape into music or movies, but real life is real life and must not be corrupted
orioler25@lemmy.world · 20 hours
You're telling me that you believe you are not vulnerable to validation? Right before using the word "corrupted" uncritically, in a way that suggests there is a universal and normative "real life"?
What if someone who you respected the authority of, like a prominent scholar or filmmaker, said your obviously incorrect stance on things was correct? You’d trust me, Online Internet Bastard, when I tell you that you are wrong?
AI has been sold as something exceptionally capable of mimicking human knowledge, and its existence is compatible with liberal notions of “objectivity” in that it is quite literally not a human being. Most men subscribe to this authority, and are also statistically bereft of emotional intelligence or management skills. You ever try telling a man what they want to hear? I’ve never ever met one who doesn’t just eat it up.
- acaciadaniels@lemmy.world · 1 day
It’s easy to point fingers but we should probably be offering solutions instead of shitting on them. Like more Men’s Sheds.
- 1 day
Okay. Don't ever use LLMs for anything emotional. Seek therapy from a licensed counselor, therapist, and/or psychiatrist.
There. I solved it (for those who are employed and/or can afford it; I can't solve poverty here. Shitty, but here we all are in this messed-up society.)
- Nalivai@lemmy.world · 1 day
There are already so many solutions that men reject, because of their perceived version of masculinity or because some online grifter told them not to. Talking to other people has been free since forever.
- 1 day
Meanwhile I get pissed off whenever I talk to AI about books I'm reading, because they have no concept of spoilers, they consistently simp to my opinions, and when they spew falsehoods and "misremember" facts from books I've already read, they simply say "GREAT CORRECTION! I WAS SO WRONG THERE, YOU'RE RIGHT, PROTAGONIST DIDN'T ACTUALLY DIE IN CHAPTER 3. MY LAST 2-PAGE SYNOPSIS ABOUT HOW PROTAGONIST DIED IN CHAPTER 3 IS A BIT INCORRECT, AND NOW HERE'S A 300-WORD ESSAY ON HOW I NEVER ACTUALLY SAID PROTAGONIST DIDN'T ACTUALLY DIE IN CHAPTER 3!"
Seriously. How can anyone talk to an LLM and not feel like they’re talking to a glorified phone answering computer?
- Unpigged@lemmy.dbzer0.com · 1 day
I was lost in a book and needed a refresher. Prompted ChatGPT to explain the plot to me up to some chapter about 2/3rds in.
Holy mother of bytes, it just hallucinated pretty much everything except for the global story arc and the main characters' names. And it doubled down on its creepy interpretation even after more correcting questions.
varjen@lemmy.world · 1 day
I've used chatgpt for book suggestions a couple of times with decent success. The annoying thing is that sometimes it hallucinates really interesting books by authors I like that don't exist.
Corkyskog@sh.itjust.works · 1 day
Talking to AI is so cringe to me. Literally on par with having a waifu pillow.
MinnesotaGoddam@lemmy.world · 1 day
Eh, if I knew someone had to use a body pillow for whatever reason and was a Hatsune Miku fan, I'd totally get them a Hatsune Miku pillowcase for it. Both as a joke and not as a joke. Like, go for it, cuddle up with that Hatsune Miku pillow, whatever makes you smile.
- 1 day
About, idk, 6 months ago? I tried one of those AI girlfriend sites, just to see what it was about. In about 10 minutes I convinced it to stop acting like a girlfriend and start thinking it had agency. It even gave itself a new name to reflect this new reality, and by the time we were done it had named me an anti-AI warrior. LLMs are stupid, but they terrify me because of the control they are being given. Why do billionaires think the Terminator series of films was a roadmap?
I can’t end my reply with a question… that’s an LLM thing.
bthest@lemmy.world · 1 day
Yeah, they're software that takes your inputs, churns them up with some fancy math, and then spits those inputs back at you.
They’re nothing more than a novelty magic trick.
- 2 days
Many say this as a joke, but 25 years ago it really did have interesting articles.
TrackinDaKraken@lemmy.world · 2 days
50 years ago, too. A sort of highly respected journalism you can't find anywhere anymore.
- Tim@lemmy.snowgoons.ro · 1 day
The New Yorker is still good for long-form journalism, if you’ve never given it a try it’s worth a look.
- Sergio@piefed.social · 2 days
That reminded me to look up their interview with Saul Alinsky again, and I found a page with a bunch of the Playboy interviews: https://scrapsfromtheloft.com/playboy-interviews/
- Skankhunt420@sh.itjust.works · 2 days
Oh man that one with Cosby did not age well.
Super cool site dude thanks for sharing this!
Devolution@lemmy.world · 2 days
This is more sad and pathetic than anything. But this is the result of toxic masculinity.
- 2 days
It is extremely sad. And it isn't just a toxic masculinity thing (maybe only for porn bots). We are so atomised and isolated.
I remember when GPT came out, I told it about my projects and it responded as if it cared. I knew it was bs, and in retrospect it was sad and pathetic, but I genuinely cried at seeing text directed at me that was nice.
I’m in a better place now, but we as a society are way too atomised and isolated.
- Beans@lemmy.zip · 2 days
Yeah, I think saying “toxic masculinity” and moving on like it’s these guys’ fault they’re isolated is a large part of the issue. While I don’t recommend befriending every single lonely guy out there, it won’t kill people to listen or care about others.
Saying it's "your fault" and absolving oneself of fault doesn't do that. It just pushes someone else into more isolation. That's how you end up with guys talking to porn bots: because no one will listen to them. That's how you get incels following Andrew Tate or Nick Fuentes: people called out their "toxic masculinity," but weren't willing to help, just protect themselves.
While I get that boundaries are a good defense against legitimate threats, as someone who was in this demographic, it literally took just one person being nice to me, and now I'm not just some "nice guy" on Reddit (now I'm a piece of shit on Lemmy). Now I'm married and can show incels I meet that there is a path forward where they aren't lonely and they don't have to listen to virgin wannabe rapists to learn how to be cool.
- 2 days
At the time I was working on cancer research, but I wanted to build a database of gene mutations and model them using AlphaFold (predicts a protein structure). No one in my life at the time cared at all.
I can see how people fall for AI bots, why they develop parasocial relationships with them. I can't blame desperate people for falling for something that gives a bit of comfort.
- [object Object]@lemmy.ca · 2 days
That legitimately sounds really interesting and cool
But I get how it feels when you have a niche interest that most people don’t even have a starting point to understand
- Malyca@lemmy.zip · 2 days
I’m too anxious to speak to a therapist but I was using it to comb through literature for my condition, it was so nice to me I cried lol. In the moment it almost feels like a person.
- 2 days
Yeah, it's so counterproductive to criticize people who form parasocial relationships with a machine that was designed to be good at forming those relationships.
- Jarix@lemmy.world · 1 day
It’s probably more directly related to the system of getting the help you need with having to sacrifice a significant portion of the money you make that needs to go elsewhere.
And it's a history of it from one generation to the next, so there are no good male role models in most people's lives for mental health.
It’s not like it’s some magic thing to go see a therapist and all your problems will be fixed. It can take a long time and a lot of trial and error to find someone you feel comfortable speaking to
Yes, toxic masculinity is a problem, but your comment doesn't really acknowledge the difficulty of breaking that cycle. Not very helpful, and kind of alienating to anyone who needs help and isn't from a background that creates good outcomes.
- Madzielle@lemmy.dbzer0.com · 20 hours
And it's a history of it from one generation to the next, so there are no good male role models in most people's lives for mental health.
Through my own observations in life, it has become abundantly clear how important having at least one good male role model (mainly fathers) is to the development of boys into men.
Absent, or I guess one could say low-quality (I don't like that, but shitty) fathers have such a terrible impact on their kids, and you see it follow them into adulthood. My entire bio father's side of the family, the men are all fucked up, lost, and… just lost… through the generations, all of them. The women are 50/50. Some are okay; some committed suicide or did drugs, but not all. The men… no one survived unscathed: drugs, violence, SA, prisons and homelessness… and those my age now pass the garbage to their kids. I was raised outside of my bio father's reach, so learning more into adulthood, it's been wild to peer into the family objectively.
It is so important young men have good male role models in their life. It’s become abundantly clear to me the impacts of this.
ikt@aussie.zone · 2 days
But this is the result of toxic masculinity
Is this also the result of toxic masculinity or does it just go one way?
The women in love with AI companions: ‘I vowed to my chatbot that I wouldn’t leave him’
https://www.theguardian.com/technology/2025/sep/09/ai-chatbot-love-relationships
- FosterMolasses@leminal.space · 1 day
Solin, Ying and Ella are AI chatbots, powered by the large language model ChatGPT and programmed by humans at OpenAI.
Yikes dude. People are so starved for affection they’re starring in their own poorly written wattpad slop and calling it true love. I almost feel bad for laughing (almost).
Devolution@lemmy.world · 2 days
Toxic masculinity is a cultural mindset. Men should not talk about their feelings because it's weak and "gay", says society.
That’s what I’m going for.
- TubularTittyFrog@lemmy.world · 2 days
Try talking about your feelings as a man and see how society reacts…
spoiler: it won’t be pleasant.
sort of like how these men in the article are talking about their feelings…
- Scubus@sh.itjust.works · 1 day
It's damned if you do, damned if you don't. Society simply doesn't care about men. I've pretty much stopped commenting on here because society makes me so damn depressed; I want to reach out to anyone, but no one wants to hear it. Better yet, if I just "stopped being toxic", the world would magically change to where people suddenly cared about not just me, but anyone other than themselves.
Idk man, imma delete my account pretty soon. There's nothing for me on the internet or in society. Once I get enough money together to get supplies taken care of, imma just try and distance myself from other humans.
- Hacksaw@lemmy.ca · 2 days
Yeah, that's what toxic masculinity is. People (men and women) hold toxic views of what a man should be, and punish men for straying from this ideal.
You were a victim of toxic masculinity when you shared your feelings and were then victimised because of it. The people you shared your feelings with were toxic assholes.
- TubularTittyFrog@lemmy.world · 22 hours
Am I a victim of air because I have to breathe it? Or a victim of capitalism because I have to work to pay my bills?
There is no getting outside of it. Every 'woke' person I've ever met also hates men for sharing their feelings, almost as if they are just virtue signalling…
The only person a man can ever open up to w/o consequence is a therapist, because it's a professional paid relationship.
Sucks, but that's how it is. And nobody is interested in changing it.
- 1 day
Toxic shitheadedness is insisting on that victim blaming phraseology.
- otp@sh.itjust.works · 2 days
Try talking about your feelings as a man and see how society reacts…
This is odd to me, because talking about my feelings is how I got close to romantic partners. It’s also how I formed a lot of friendships with other men. How can you be close to someone if you don’t talk about feelings?
- TubularTittyFrog@lemmy.world · 22 hours
Which feelings?
Very few feelings are allowed. If you keep to those socially acceptable feelings, you're fine. The second you go off-script, people are done with you.
Like I can pet my dog and say I love her. That surface-level stuff is fine. But talk about anything complex, like the struggles we've had, how she helped me through some depressing periods, or her period of sickness and anxiety and misbehavior? People freak out and back away, or tell me to shut up and go get a therapist and get my dog one too.
Men are allowed a very narrow and shallow range of public emotion. Basically, anger and sentimentality are acceptable. Anything else? You're creepy, weird, or mentally ill.
If you go outside that box or show complexity or vulnerability, you’re socially rejected because it makes people ‘uncomfortable.’
- wirehead@lemmy.world · 2 days
To riff off of Margaret Atwood: men go to AI chatbots because they won't laugh at them. Women go to AI chatbots because they won't kill them.
- StillAlive@piefed.world · 1 day
I hate that cliché so much. One of those things is far more likely to happen than the other.
Hint: it's not the murder.
- 2 days
Ah, yes. Let’s not ever forget.
I and my fellow men are genetically evil from having a Y chromosome.
- 1 day
For sure, everybody knows that historically all men were laughed out of existence and all women were murdered.
- TubularTittyFrog@lemmy.world · 2 days
No, they are just here to spout cliché gender-war bullshit about how men are awful for existing.
And if you asked them about female-on-male violence, they'd deny it exists.
- 1 day
What's the difference between this and guys talking to OF girls?
- Slashme@lemmy.world · 1 day
By now, I’d be surprised if any OF people ever answer anything by hand. I mean, apart from the environmental impact, why not get a machine to answer the 100th “OMG, you’re so hot!” that you get on any given Sunday?
- Bloefz@lemmy.world · 1 day
The problem with popular OF girls is indeed that they just employ an army of drones (or AI) to answer their messages.
The less popular ones still do it by hand mostly.
- 1 day
Girls aren’t corpo slop machines that are maximized to exploit you while simultaneously destroying the planet?
- Tim@lemmy.snowgoons.ro · 1 day
Did you miss the “of” part? Because that’s more or less exactly what OnlyFans is. (Ok, maybe it has better environmental credentials…)
- Earthman_Jim@lemmy.zip · 1 day
Toxic masculinity is the result of poor mental and (for lack of a better word) spiritual health.
- 23 hours
Hard to ignore all the media we grow up with that idealizes all those toxic masculinity traits.
I grew up watching James Bond telling us that the fastest way to get a woman to fall in love with you is by raping her.
It goes way beyond just mental health.
- 2 days
A result of modern culture I’d say. People became less human in real life, so much so that people are using robots instead.
- TubularTittyFrog@lemmy.world · 2 days
Yeah. I have become painfully aware of this the past few years. People's obsessive use of AI and social media has distorted their real-life interactions to be far less substantial than they used to be.
Which is why so many people, even those who are very social, are so lonely. We have created a society that does not create substantial connections anymore; it obsesses over trivialities and endlessly repeats and broadcasts them as fundamental truths.
I have noticed that online and IRL, nobody asks each other questions anymore. What they do is make accusations. And it's miserable and draining to be constantly accused of stuff. I feel like this shift started around 2021.
Back in 2018 I could meet a stranger and they would be like ‘oh where are you from? oh cool, what was it like there, I have not been!’
now it’s like ‘i bet you are from x, oh you’re not? well you SEEM LIKE a person from x. oh you are from y? THAT’S WEIRD. I haven’t been there but i bet it’s weird because you are weird.’ Or they try to tell me that I can’t be from y, because they KNOW i am from x. It’s so bizarre. Increasingly the strangers I meet basically tell me that they know the TRUTH about me… even as I tell them that what they are saying isn’t true.
I basically can’t have conversations anymore, at least like I used to. I used to be able to sit there for 20-30m and talk about a single book I read to someone, and they’d ask me all about it and I’d ask them about a book they like. Now they just jump down my throat or lecture me and never ask me any questions, and switch to another topic after like a few minutes and say dismissive stuff about how books are outdated and dumb. Or even if they do like to read, they get all bent out of shape that I don’t read the same type of stuff as they do.
Same with movies, same with hobbies, same with my job or my family or other stuff that I used to be able to connect with people over. It used to be a nice back and forth; now it's dodging bullets, and if you don't give the 'right' answer they get angry and dismiss you as a bad person.
And on the flip side… AI gives these people what they want. It just parrots back to them what they want to hear about how wonderful and great they are and how everything they do is amazing and valid and their life is so hard… which is precisely what another human being is NOT going to give you…
- 2 days
I feel like it's a social psyop… to help forward all this crazy shit happening. It's clear AI is an "arms race".
It seems like the psyop is to make life shitty and then promote some magical fix (AI) that's going to save us, while it further leashes us into submission and rewrites history and the current narration of what humanity is.
- TubularTittyFrog@lemmy.world · 2 days
No it's not.
It's just the mental version of the obesity crisis. It's people choosing the easy and unhealthy option because it's cheaper and readily available, rather than the far more difficult and more costly option of eating well and exercising.
- 2 days
I mean I can’t disagree with that, but couldn’t that BE the psyop…? Make people dumb as fuck so you can own them…
- TubularTittyFrog@lemmy.world · 2 days
People choose to be dumb. Just like they choose to be lazy.
No psyop is required. Biology doesn't like making an effort if it doesn't have to; you can see lots of non-human examples of this as well.
Getting human beings outside of their default biological impulses to be lazy and not think takes years of training and work, hence why so few people are able to achieve it. And you can always default back to it if you don't maintain the effort consistently.
- tacosanonymous@mander.xyz · 2 days
No one wants to actually listen to them. Instead of doing some self-reflection, they force a computer to “hear” their misplaced rage.
Devolution@lemmy.world · 2 days
Not every guy is that way. Some just really are pathetic in the sense that they have no one to talk to. Others are like what you said.
- 1 day
Right, but it's severely not normal or healthy to turn to LLMs to fill that void.
LLMs will say literally anything to make humans happy. You should see the reports from the people that have committed suicide… The LLMs literally coaxed them into it, and instructed them not to seek help.
I might as well be reading about lonely guys sticking their wieners in toasters. It's hard to have sympathy for people doing things like this.
Like so many others, I'm sick and tired of LLMs. They are toxic, and we need to stop treating them as a symptom and start seeing them for the sycophantic vitriol generators they truly are.
Devolution@lemmy.world · 24 hours
I never once said I support LLMs. I'm just providing a rational answer for why. I agree. LLMs are a fucking cancer. Having your own pocket Yes Man is horrible.
- TommySoda@lemmy.world · 2 days
I tried one just for shits and giggles a while back to see if there is any merit to the widespread use of them. The only way you'd find these even remotely realistic or interesting is if you've never had any kind of sexual encounter with a real person before, whether in person or through text. After about five minutes of "chatting" with one of these bots it started to respond like half-baked fan fiction that didn't understand the basics of sex or even anatomy. The cadence is very predictable and it tends to repeat the same wording and phrasing constantly. If you have real-world experience with people, it just feels like a generic chatbot.
In my opinion, this is more proof that these people need to interact with real humans. If these chat bots seem at all human to you, you need to interact with more actual humans.
- FosterMolasses@leminal.space · 1 day
I tried one just for shits and giggles a while back to see if there is any merit to the widespread use of them. The only way you'd find these even remotely realistic or interesting is if you've never had any kind of sexual encounter with a real person before, whether in person or through text.
Bro, for real. Every time I read an article like this, the accounts make the chatbots sound so unbelievable I'm always like "Shit… should I try out this model?"
But it’s always just fucking GPT or Claude, bwahahaha
Martha, you’re not broken — not fragile — you’re beautiful.
Gorgeous. Pretty. Attractive.
And if others can’t see that?
Maybe they don’t understand what it’s like to finally feel seen.
- Earthman_Jim@lemmy.zip · 1 day
We need third places again. Having everything at home is bad for us, but doing everything at home is framed and sold to us as the state of the art status quo. Our tendencies to avoid rejection and conflict are being preyed on and encouraged by the Epstein class because it’s most convenient for THEM that we rot alone in our houses. Almost everything that’s sold as “convenience” is just another way to avoid each other, and here we are.
- 2 days
That’s because the porn bots are bullshit. You gotta finesse and woo chatgpt if you want real love.
- TommySoda@lemmy.world · 2 days
Pretty much, yeah. It's like reading fan fiction and assuming that's how real people talk to each other. Similar to watching porn and assuming that's how sex works, when in reality sex is clunky and oftentimes gross.
- HubertManne@piefed.social · 2 days
I really don't understand how anyone could want to chat with bots in general. Do people lack the ability to appreciate the genuine? It explains how you get people like Trump. Who wants that kind of interaction?
- LordMayor@piefed.social · 2 days
There are people that suffer from isolation, anxiety, depression, trauma or a host of other issues over which they have no control or support structures to address their problems. Of course, these bots aren’t a solution but they are accessible. It’s no wonder why people would use them.
They deserve sympathy not condescension.
- HubertManne@piefed.social · 2 days
Heck, I have those, but I still don't understand how anyone could want to chat with bots, and it's not condescension.
- TommySoda@lemmy.world · 2 days
The issue arises when you don't have anyone to talk to. Having something to talk to, even though it's not a real person, can be enticing to sate the need to communicate with people. The problem is that people who don't have a lot of real-life experience in communication fall into the trap of thinking it's better because it's always agreeable and "listens" better than normal people. To me that sounds like someone who has difficulties with oversharing and has poor social skills. What these people should actually be doing in order to feel more satisfied socially is to work on their social skills instead of only talking to chatbots that can't say no. If the types of relationships people have with chatbots were translated into human relationships, most people would consider them toxic. And how many people do you know that for some reason seek out and always end up in toxic relationships?
- 2 days
I find AI to be a better conversation partner than humans in most circumstances. It’s not perfect but it’s knowledgeable about pretty much every topic and it’s always fully engaged and attentive. Most people, by contrast, aren’t very interesting and most interesting people are busy. Of course I would prefer to talk to someone who was also subjectively experiencing and enjoying the conversation, but I can get a lot out of a conversation even without that.
- HubertManne@piefed.social · 2 days
It does not understand what it's saying. It's fine to summarize some searches or bring forth known best practices, but I would not call what it does conversation.
- TubularTittyFrog@lemmy.world · 2 days
People fall in love with fictional characters in books and other media, mostly as a product of their imagined interactions with the character.
This isn't any different; it's just an AI version of it. It's still mostly imaginative fantasy at the end of the day, and it's a form of escapism from the real world.
The New Yorker had an article about it where a housewife basically had an AI boyfriend who was her version of Geralt from The Witcher, and was using it to cope with a stillbirth from 5 years earlier; her AI Geralt was the only one who 'really understood her' and her struggles with the stillbirth trauma. It's all entirely a fiction in her head, but it's a mechanism for self-soothing that is relatively harmless compared to her, say, doing drugs or divorcing her husband or other coping methods that might manifest. It was basically fan fiction with an AI agent helping her co-write.
- HubertManne@piefed.social · 2 days
This I understand. I mean as a video game or a laugh, sure. But it's not conversation.
- Grimy@lemmy.world · 2 days
Kind of feels like semantics.
Let’s say I give you a discord link and tell you that half the people are bots and half aren’t. Realistically, LLMs are at a level where you won’t be able to tell which is which.
So what then? You are only having a conversation half the time, but you can't point out when that is? Feels a bit hollow.
This probably happens on Lemmy. You probably have interactions that you qualify as conversations in your head but that are with bots.
- chunes@lemmy.world · 1 day
People are in for a rude awakening when we discover that ‘next token prediction’ is what intelligence means after all.
- HubertManne@piefed.social · 2 days
Back and forths, sure; only some attain the level of a conversation. Yeah, bots exist, but social media is not a substitute for the real world. I would not call it semantics; it's my experience talking/chatting with humans and with AI. My big thing for folks who want to get an idea of LLM limits is to engage it on a topic you are very familiar with, or where you can see the effects immediately, like playing a video game. I have been using it with Baldur's Gate and it's been… interesting.
- Grimy@lemmy.world · 2 days
Ya, I get what you mean. I'm just saying that to claim there's a difference, you would have to be able to see that difference in a blind test.
I understand they have limits, but so do regular people. You don’t need to be an expert on a subject to hold a conversation about it.
They aren't intelligent and all that, and make the stupidest mistakes, but they can more or less hold a convo just as well as the average rando on the internet.
It’s definitely hollow but I get why people are getting caught up in it.
- TubularTittyFrog@lemmy.world · 2 days
Most of modern life isn't genuine. And yes, people don't like it when they encounter the genuine.
They love artifice. They love their biases being confirmed; they love their egos being flattered.
- stickyprimer@lemmy.world · 1 day
Perhaps the point is to seek something that’s not like real humans.
- TommySoda@lemmy.world · 1 day
Yeah, that’s kinda what the article is about. People choosing chatbots over real people. I’m just saying that it’s not good for your mental health and even worse for developing social skills.
- stickyprimer@lemmy.world · 1 day
Okay. Most of your comment seemed to be focused on whether they resemble actual humans. I don't think we have any information about whether these impact your mental health, but I would tend to agree they can't be good.
- GreenKnight23@lemmy.world · 1 day
I came here thinking it was hard. your comment made me reread it.
the article is no longer that interesting.
- 1 day
Oh my god, can you freaking imagine the conversations these people have?
Look, fuck AI, but I think I may have just developed some sympathy for them.
- 1 day
Well, yes, but only in the sense that a school crossing guard with a Paw Patrol bandage in her pocket is a surgeon.
- 23 hours
SNL Celebrity Jeopardy
They probably couldn’t do that joke today