

If you can tell it was produced in a certain way by the way it looks, then that means it cannot be materially equivalent to the non-AI stock image, no?
I mean, it can’t really do ‘every random idea’ though, right? Any output is limited to the way the system was trained to approximate certain stylistic and aesthetic features of imagery. For example, the banner image here follows a stereotypically AI-type texture, lighting, etc. This shows us that the system has at least as much control as the user.
In other words: it is incredibly easy to spot AI-generated imagery, so if the output is obviously AI, then can we really say that the AI generated a “stock image”, or did it generate something different in kind?
Hold on though there are talking droids in the Star Wars documentaries and those happened a LONGGG time ago apparently
Edit: to be fair that was in a galaxy far, far away so it’s entirely possible that neither Karl nor Richard were aware of the technology
Richard
edit: folks don’t like Richard Marx, hey believe me I get it
The products of artisanal labour and factory labour may indeed be equivalent in terms of the end product’s use value, but they are not equivalent as far as the worker is concerned: the loss of autonomy, and of the opportunity for thought, problem-solving, learning, and growth, is part of the problem with capitalist social relations.
I’m trying to say that AI has this social relation baked in, because its entire purpose is to have the user cede autonomy to the system.
I appreciate you describing the LTV distinctions between the thinkers, thank you, sincerely!
I think the problem I have with AI - and it sounds like you agree at least partially - is that it positions human creative work, and human labour in general, as only a means to an end, rather than also as an end in itself.
(I think this holds true even with something like a video game texture, which I would argue is indeed part of a greater whole of creative expression and should not be so readily discounted.)
This makes AI something more along the lines of what Ursula Franklin called a ‘prescriptive technology’, as opposed to a ‘holistic technology’.
In other words, the way a technology defines how we work implies a kind of political relation: if humans are equivalent to machines, then what is the appropriate way to treat workers?
Is it impossible that there are technologies that are capitalist through and through?
Are you not implying that human effort is not valuable?
At the risk of misinterpreting you, it seems like you’re arguing against the labour theory of value
All I am saying is that, baked into the design and function of these material GenAI systems, is a model of human thought and creativity that justifies subjugation and exploitation.
Ali Alkhatib wrote a really nice (short) essay that, while it’s not saying exactly what I’m saying, outlines ways to approach a definition of AI that allows the kind of critique that I think both of us can appreciate: https://ali-alkhatib.com/blog/defining-ai
The use of AI images without critique communicates to people that these things are normal and fine and inevitable and non-harmful. This isn’t about staging a ‘consumer boycott’ in terms of harming profits as much as it’s about not normalizing this kind of stuff culturally. The less acceptable this stuff is, the more likely people will be willing to push back against big tech more generally. It’s part of the same movement.
And it’s additive, not zero-sum. No one is saying “don’t bother unionizing” because they’re too busy pushing back on the use of AI images in an online community. In fact, I’d reckon the broad societal pushback makes it more likely that people will be inspired to unionize!
Three problems with this:
Matter being the primary mover does not mean that ideas and ideals don’t have consequences. What is the reason we want the redistribution of material wealth? To simply make evenly sized piles of things? No, it’s because we understand something about the human experience and human dignity. Why would Marx write down his thoughts, if not to try to change the world?
These systems are premised on the idea that human thought and creativity are matters of calculation. This is a deeply anti-human notion.
https://aeon.co/essays/can-computers-think-no-they-cant-actually-do-anything
The banner could be anything or nothing at all, and as long as it isn’t AI generated, I would like it better
The Luddites weren’t simply “attacking machinery” though, they were attacking the specific machinery owned by specific people exploiting them and changing those production relations.
And due to the scale of these projects and the amount of existing work they require in their construction, there are no non-exploitative GenAI systems.
By systems positing human creativity as a computational exercise
This argument strikes me as a tautology. “If we don’t care if it’s different, then it doesn’t matter to us”.
But that ship has sailed. We do care.
We care because the use of AI says something about our view of ourselves as human beings. We care because these systems represent a new serfdom in so many ways. We care because AI is flooding our information environment with slop and enabling fascism.
And I don’t believe it’s possible for us to go back to a state of not-caring about whether or not something is AI-generated. Like it or not, ideas and symbols matter.