A colleague is all in on AI. She sends these elaborate notes generated by AI from our transcript that she is so proud of. I really hope she hasn’t read any of them because they’re often quite disconnected from what occurred on the call. If she is reading them and sending them anyway… Wow.
I didn’t know those were off. About a year ago we were playing with Zoom’s AI meeting recorder and it was astonishing how accurate the summary was. Hell, it could even tell when I was joking, which was a bit eerie.
Probably not reading them. A family member told me at their work someone had an LLM summarize an issue spread out over a long email chain and sent the summary to their boss, who had an LLM summarize the summary.
Does anyone know the name of this monkey or experiment? It’s kind of harrowing seeing the expression on its face. It looks desperate for affection to the point of dissociation.
Here’s a video for context https://youtu.be/-Qi7txH1KzY
The context makes it even more heartbreaking.
Absolute horror.
This is an interesting comparison because the wire monkey study suggests that we need physical contact from a caregiver more than nourishment. In the case of AI, we’re getting some sort of mental nourishment from the AI, but no physical contact.
The solution? AI tools integrated into either hyper-realistic humanoid robots, or human robo-puppets.
Or, we could also leverage our advancing technology to support the working class by implementing UBI through a reduction in production costs and an evening out of wealth and resources.
But who wants that? I, a billionaire, sure don’t.
How about just hugging a real human? Problem solved.
How will they sell a human at the lowest cost? People have to eat and sleep.
I mean, last week it was all over the news that Mattel and OpenAI made a deal to put ChatGPT in toys such as Barbie.
Put that shit in a Furby or a 1993 Toy Biz voice bot.
We love cloth mother, way better than wire mother, gotta say
Where does scrub daddy factor into this?
Damn, wire mother is going to dig into my brain.
Yes… very apt comparison.
Cloth AI will love and comfort us until the end of our days.
Which will be soon, because only Wire Computer provides us with actual sustenance.
ELIZA, the first chatbot, created in the '60s, just parroted your response back to you:
I’m feeling depressed
Why do you think you're feeling depressed?
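That parroting trick is simple enough to sketch in a few lines. This is a toy illustration of the ELIZA-style approach (pattern match, swap pronouns, echo back as a question), not Weizenbaum's actual DOCTOR script; the patterns and word list here are made up for the example:

```python
import re

# Toy first-person -> second-person swaps (illustrative, not ELIZA's real rules)
REFLECTIONS = {"i": "you", "i'm": "you're", "my": "your", "me": "you"}

def reflect(fragment):
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(statement):
    """Echo the statement back as a question, ELIZA-style."""
    m = re.match(r"i'?m feeling (.*)", statement, re.IGNORECASE)
    if m:
        return f"Why do you think you're feeling {reflect(m.group(1))}?"
    m = re.match(r"i (.*)", statement, re.IGNORECASE)
    if m:
        return f"Why do you say you {reflect(m.group(1))}?"
    return "Tell me more."

print(respond("I'm feeling depressed"))
# Why do you think you're feeling depressed?
```

No model, no understanding: just regexes and pronoun swaps, which makes the emotional attachment people formed to it all the more striking.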
It was incredibly basic, and its inventor, Weizenbaum, didn't think it was particularly interesting, but he got his secretary to try it and she became addicted. So much so that she asked him to leave the room while she "talked" to it.
She knew it was just repeating what she said back to her in the form of a question, but she formed a genuine emotional bond with it.
Now that they're more sophisticated, it really highlights how our idiot brains just want something to talk to; whether we know it's real or not doesn't really matter.
Depends. I think I'm on the autistic spectrum; I just don't see them as equals, but as tools.
I'm not on the autistic spectrum. They aren't equals, and they're barely tools.
They are good tools for communicating with the robots in management. ChatGPT, please output some corpobullshit to answer this form I was given and have no respect for.
One of the last posts I read on Reddit was from a student in a CompSci class where the professor put a pair of googly eyes on a pencil and said, "I'm Petie the Pencil! I'm not sentient, but you think I am because I can say full sentences." The professor then snapped the pencil in half, which made the students gasp.
The point was that humans anthropomorphize things that seem human, assigning them characteristics that make us bond with things that aren't real.
That or the professor was stronger than everyone thought