• HollowNaught@lemmy.world · ↑25 ↓4 · edited 4 hours ago

    People are saying “it’s fine because it was used in the early stages of the game for placeholder art” but that’s kind of missing the point

    The problem is that they used AI and didn’t disclose it, and that they released the game with AI textures still in it. Yes, these textures were quickly replaced, but it’s still very concerning that they weren’t upfront about how they were using it in the game-making process

    Edit: there isn’t even a disclosure on their steam page

    • KairuByte@lemmy.dbzer0.com · ↑11 ↓4 · 2 hours ago

      I dunno…

      If I make a mock up of a cake using toxic ingredients, then throw that out and make my cake from scratch using food safe ingredients, do I need to disclose that “toxic material was used when making this cake”? I don’t think so.

      Of course this kinda falls apart given that they shipped with the placeholder textures and only replaced them quickly afterwards. But I also wouldn’t expect them to disclose the game as unfinished if they forgot to replace blank textures with the proper assets until just after release.

  • tomkatt@lemmy.world · ↑39 ↓10 · 5 hours ago

    This is fucking stupid. There are no AI assets in the final game, and it was only used for placeholders during development.

    I dislike AI for a lot of reasons, but this is massively overblown. The genie is out of the bottle and there’s no putting it back. This is right up there with artists using airbrushes, Photoshop, and so on. People are going to use the tools available if it leads to quicker development cycles to get a product out.

    • UnderpantsWeevil@lemmy.world · ↑5 ↓1 · 2 hours ago

      This is fucking stupid.

      It’s stupid because the game has already received a stack of awards a mile high. Nobody seriously cares about this. Nobody’s sales will be hurt in any meaningful capacity. It’s a dumb awards show, not the FCC.

      People are going to use the tools available if it leads to quicker development cycles to get a product out.

      I think this “placeholder art” is a silly line to draw. But the high profile of the game makes it a ripe target to make a statement.

      If you really don’t want to reward people for “quicker development” over the human touch, might as well pick a game everyone already bought and highlight folks who did their dev work organically

  • DegenerationIP@lemmy.world · ↑6 ↓1 · 3 hours ago

    I mean, usage of AI should be disclosed. But it feels more like they’re trying to take it down. This has a taste of jealousy to me.

    Or I don’t get the full picture here.

  • Serious_Me@lemmy.world · ↑30 ↓4 · edited 4 hours ago

    Because so many people are blowing up without reading the article, I felt it was worth posting this. Based on the wording, it sounds like they were not disqualified for having AI in the game; they were disqualified for not disclosing that AI had been used in development.

    “The Indie Game Awards have a hard stance on the use of gen AI throughout the nomination process and during the ceremony itself,” the statement reads. “When it was submitted for consideration, representatives of Sandfall Interactive agreed that no gen AI was used in the development of Clair Obscur: Expedition 33.” “In light of Sandfall Interactive confirming the use of gen AI on the day of the Indie Game Awards 2025 premiere, this does disqualify Clair Obscur: Expedition 33 from its nomination.”

    Additionally, here is another article where they are clarifying HOW it was used.

    https://english.elpais.com/culture/2025-07-19/the-low-cost-creative-revolution-how-technology-is-making-art-accessible-to-everyone.html

    Following the publication of this article, Sandfall Interactive wishes to provide the following clarifications. The studio states that it was in contact with El País on April 25 - three months prior to this publication. During these exchanges, Sandfall Interactive indicated that it had used a limited number of pre-existing assets, notably 3D assets sourced from the Unreal Engine Marketplace. None of these assets were created using artificial intelligence. Sandfall Interactive further clarifies that there are no generative AI-created assets in the game. When the first AI tools became available in 2022, some members of the team briefly experimented with them to generate temporary placeholder textures. Upon release, instances of a placeholder texture were removed within 5 days to be replaced with the correct textures that had always been intended for release, but were missed during the Quality Assurance process.

    TL;DR: They experimented with generative AI when it first came out and used some of the results as temporary assets that were always intended to be temporary. Some of these still got into the final product because QA missed them, which was promptly fixed in a patch. The Indie Game Awards disqualified them for failing to disclose this in the first place.

    Key takeaways:

    • AI didn’t steal anyone’s job in this instance. It was simply used as a tool to help make an artist’s job easier.
    • It was never meant to be a part of the final product, and currently isn’t.
    • They used generative AI around when it first came out, probably before most people started realizing it was being trained off stolen artwork, as well as a lot of the other problems with AI. u/Crazazy brings up a good point and this part is somewhat questionable.

    Make of that what you will. I personally think this is being blown out of proportion. They made a mistake and have openly corrected themselves. Good for them.

    • Crazazy [hey hi! :D]@feddit.nl · ↑10 · 4 hours ago

      I don’t have much of an opinion on the rest of your argument, but:

      probably before most people started realizing it was being trained off stolen artwork as well as a lot of the other problems with AI.

      This is the equivalent of those Tesla owners pasting “I bought this before Elon went crazy” stickers. The creative industries especially were very quick to point out the problematic parts of stuff like DALL-E and Stable Diffusion. Generative graphical AI has never been approved of by the gamedevs I know.

      • Serious_Me@lemmy.world · ↑2 · edited 4 hours ago

        Hadn’t thought of it that way. Good point. That being said, it’s not 100% the same, since in this scenario it’s more akin to someone buying a Tesla and then getting rid of it soon after. That, or getting it, leaving it in the garage for 2 years having forgotten it exists, then finally getting rid of it once someone points it out. Still somewhat valid though.

    • SleeplessCityLights@programming.dev · ↑1 ↓2 · 4 hours ago

      Stable Diffusion and Midjourney were supposedly trained fairly. That is the only reason art teams would even use AI for rapid prototyping.

      • Serious_Me@lemmy.world · ↑1 · 4 hours ago

        Was not aware those two were trained fairly. Sadly I didn’t see anything on what AI tool they used so not sure how that would affect things.

    • maximumbird@lemmy.world · ↑16 ↓1 · 7 hours ago

      To me, this is worse.

      We are getting closer and closer to not being able to tell the difference between AI and reality. This lying about the use of it or hiding the use of it is a bad fucking idea.

      • mic_check_one_two@lemmy.dbzer0.com · ↑13 ↓4 · 5 hours ago

        They didn’t disclose it because there was no AI in the final product. The AI was for placeholder textures, which were replaced by real artists’ work as they were made. Some of the AI textures slipped through the cracks on release day, but a week 1 patch removed all traces of the AI before anyone even realized it was AI.

        IMO this looks bad on the awards show, because the final product didn’t have any AI. And the production team was proactive in ensuring it didn’t have any AI before any kind of public backlash ever happened. Once they realized the issue, they issued a patch to fix it on their own, without needing to be pushed into it by public pressure. That’s what a company should do, and it shows that the devs really cared about their game.

      • KiloGex@lemmy.world · ↑5 ↓4 · 6 hours ago

        The reason they didn’t disclose it as being used in the creation of the game is probably because no AI was used in the ultimate development. Is an artist who uses AI to generate concepts and inspiration “using AI in their artwork”, even if everything in the end is handcrafted and doesn’t resemble any of the generated images?

        One thing we need to take into account going forward, too, is that AI will inevitably be used for things like texture maps and environmental generation, things that have long been randomly generated with algorithms. In a year it’s going to be nearly impossible to say no game can have any AI used at all, unless you want the pool of potential nominees to be incredibly small.

        • petrol_sniff_king@lemmy.blahaj.zone · ↑7 · 5 hours ago

          In a year it’s going to be nearly impossible to say no game can have any AI used at all,

          Damn, that sucks. I guess I’ll have to find a new hobby.

          • leftzero@lemmy.dbzer0.com · ↑5 · 5 hours ago

            Nah, just pirate the stuff.

            If they don’t give a fuck about original creators, why should we give a fuck about paying them?

              • leftzero@lemmy.dbzer0.com · ↑1 · 2 hours ago

                Of course not, but I think not supporting those that use it to produce something you want to enjoy doesn’t necessarily imply not enjoying what they produce, as long as it’s not too thoroughly damaged by their use of it and as long as it can be obtained in ways that won’t support them.

  • ToiletFlushShowerScream@piefed.world · ↑26 ↓12 · 8 hours ago

    I’m sure all of the recently out-of-work artists and programmers are heartbroken over another game that paid for gen AI instead of hiring them. I’m sure the AI company executives just needed the money more. Fuck whoever decided to use AI on the Clair project management team. You could have actually deserved that award. Good on the Indie Game Awards for actually supporting indie developers.

    • BananaIsABerry@lemmy.zip · ↑7 ↓3 · 6 hours ago

      Did you even read? They used it for placeholders before replacing them with textures created by artists.

      • leftzero@lemmy.dbzer0.com · ↑5 ↓3 · 5 hours ago

        They didn’t use it for placeholders (which wouldn’t excuse them anyway, if you want a placeholder you can pay an artist to make it).

        They got caught using it in production and came up with the placeholder excuse (which no one who’s ever seen a placeholder texture would fall for) on the spot, throwing the QA team under the bus to try to cover what is clearly a systemic problem with the company.

        • BananaIsABerry@lemmy.zip · ↑2 ↓6 · 4 hours ago

          You anti AI peeps are so dramatic about things. It’s like listening to your grandparents find every excuse to blame every problem on smartphones

          • leftzero@lemmy.dbzer0.com · ↑2 · 2 hours ago

            Smartphones are actually useful, and don’t have the moral, ethical, economic, societal, and existential issues that “generative AI” (which is neither generative nor intelligent) has.

  • lepinkainen@lemmy.world · ↑113 ↓20 · 13 hours ago

    Sandfall Interactive further clarifies that there are no generative AI-created assets in the game. When the first AI tools became available in 2022, some members of the team briefly experimented with them to generate temporary placeholder textures. Upon release, instances of a placeholder texture were removed within 5 days to be replaced with the correct textures that had always been intended for release, but were missed during the Quality Assurance process

    Sauce: https://english.elpais.com/culture/2025-07-19/the-low-cost-creative-revolution-how-technology-is-making-art-accessible-to-everyone.html

    Not exactly a massive AI slop problem, right?

    Can we put our collective pitchforks away for this case at least?

    • Agrivar@lemmy.world · ↑28 ↓8 · 10 hours ago

      Can we put our collective pitchforks away for this case at least?

      NO.

      My pitchfork stays sharpened and at the ready until this stupid bubble pops.

      • KiloGex@lemmy.world · ↑1 ↓7 · 6 hours ago

        It’s not a bubble though. That’s like waiting for the internet bubble to pop back in the 90s. AI will be around from now on, just not in such an in-your-face way. It will eventually become ubiquitous, just like many other pieces of tech.

      • jali67@lemmy.zip · ↑7 ↓8 · 7 hours ago

        The AI was used for background assets that they failed to remove but patched quickly after. It’s not as egregious as the headline makes it out to be.

        • Agrivar@lemmy.world · ↑10 ↓5 · 7 hours ago

          I think you misunderstood me. All AI is humanity-ending garbage that needs to be eliminated. I don’t give two figs how or where it’s used - I want it all gone.

          • jali67@lemmy.zip · ↑7 ↓9 · 7 hours ago

            Do you even have a tech background? How is a machine learning algorithm going to end humanity?

            • leftzero@lemmy.dbzer0.com · ↑5 · 5 hours ago

              Brain rot, job destruction, increased inequality, massive acceleration in global warming, massive decrease in the quality of critical systems, societal and economic collapse…

              • jali67@lemmy.zip · ↑1 ↓4 · 5 hours ago

                That’s fearmongering. It has use cases and has had them well before this LLM AI bubble. The bubble will pop, and hopefully these CEOs are actually charged, unlike in 2008.

            • petrol_sniff_king@lemmy.blahaj.zone · ↑5 · 5 hours ago

              By feeding people’s collective cynicism, lack of social skills, general paranoia, lack of trust in each other, waning hope for the future, etc.

              Do you have a humanities background? All tech people should have one.

            • Agrivar@lemmy.world · ↑5 ↓1 · 7 hours ago

              I was a network engineer at one of the biggest backbones on Earth before retiring. Before that, I designed and programmed industrial automation. So, no tech background at all.

              Now that that’s out of the way: a blind squirrel could see that sucking up all the energy and wasting endless fresh water is a bad thing for the environment. The “bigger-than-2008” market crash that’s also coming won’t help.

              • jali67@lemmy.zip · ↑1 ↓3 · 5 hours ago

                And again, AI had been around for many years before the LLM craze and these select few companies advocating to shove it in all our faces, forcefully pushing data centers everywhere and integrating it into as much as they can. That is not something that has to be done with AI. It is these company executives choosing to push it like this. It wasn’t always like this, nor did it have to be.

    • kopasu22@lemmy.world · ↑42 ↓1 · 12 hours ago

      This is the same use case that people are currently up in arms against Larian for

      • PonyOfWar@pawb.social · ↑1 · 4 hours ago

        Not quite. Larian also wants to use it for concept art, which is not the same thing as placeholder assets. To give you a bit of context, the standard for placeholder textures at the software development companies I’ve worked at so far has mostly been “vaguely fitting images you found on Google”.

      • Wigglesworth@retrolemmy.com · ↑17 ↓4 · 10 hours ago

        I don’t like billion-dollar corporations, and I’d be fine stopping there and leaving that as all the context, but I also don’t like them using technology to manufacture truth while polluting the earth to do it.

        So tell your coders to give you a tune-up; the damage control algorithm didn’t pan out.

      • jali67@lemmy.zip · ↑2 ↓5 · 7 hours ago

        When they understand the context behind this particular case, yes.

          • jali67@lemmy.zip · ↑1 ↓2 · 2 hours ago

            Yeah I figured this app had more tech savvy and educated people. Evidently, it’s littered with people that barely got through high school.

        • uncouple9831@lemmy.zip · ↑6 ↓10 · edited 11 hours ago

          Nah, there’s already a slap fight down in the comments between the hardliners and the “can’t we just give it a rest” folks. It’s gotten to the point that I’m convinced there are at least a few AI bots generating hate spam against AI bots.

  • AnarchistArtificer@slrpnk.net · ↑9 ↓1 · 10 hours ago

    Can someone help me understand the difference between generative AI and procedural generation? (Not that procedural generation is relevant for Expedition 33; I’m talking about it in general.)

    Like, I tend to use the term “machine learning” for the legit stuff that has existed for years in various forms, and “AI” for the hype propelled slop machines. Most of the time, the distinction between these two terms is pretty clean, but this area seems to be a bit blurry.

    I might be wrong, because I’ve only worked with machine learning in a biochemistry context, but it seems likely that modern procedural generation in games is probably going to use some amount of machine learning? In which case, would a developer need to declare usage of that? That feels to me like it’s not what the spirit of the rule is calling for, but I’m not sure

    • AdrianTheFrog@lemmy.world · ↑5 · 7 hours ago

      I don’t know of any games that use machine learning for procedural generation and would be slightly surprised if there are any. But there is a little bit of a distinction there because that is required at runtime, so it’s not something an artist could possibly be involved in.

      • AnarchistArtificer@slrpnk.net · ↑2 · 5 hours ago

        I’m not so much talking about machine learning being implemented in the final game, but rather used in the development process.

        For example, if I were to attempt a naive implementation of procedurally generated terrains, I imagine I’d use noise functions to create variety (which I wouldn’t consider to be machine learning). However, I would expect that this would end up producing predictable results, so to avoid that, I could try chucking in a bunch of real world terrain data, and that starts getting into machine learning.
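
        As a rough illustration of what that naive noise-based approach could look like (a sketch only, with made-up names, and no machine learning anywhere), simple value noise interpolated over a seeded random lattice already gives smoothly varying terrain from nothing but a seed:

        import random

        def value_noise_2d(width, height, cell=8, seed=0):
            """Heightmap built by smoothly interpolating a seeded random lattice."""
            rng = random.Random(seed)
            gw, gh = width // cell + 2, height // cell + 2
            lattice = [[rng.random() for _ in range(gw)] for _ in range(gh)]

            def smoothstep(t):
                return t * t * (3 - 2 * t)

            heightmap = [[0.0] * width for _ in range(height)]
            for y in range(height):
                for x in range(width):
                    gx, gy = x / cell, y / cell
                    x0, y0 = int(gx), int(gy)
                    tx, ty = smoothstep(gx - x0), smoothstep(gy - y0)
                    # bilinear blend of the four surrounding lattice values
                    top = lattice[y0][x0] * (1 - tx) + lattice[y0][x0 + 1] * tx
                    bottom = lattice[y0 + 1][x0] * (1 - tx) + lattice[y0 + 1][x0 + 1] * tx
                    heightmap[y][x] = top * (1 - ty) + bottom * ty
            return heightmap

        terrain = value_noise_2d(64, 64, cell=8, seed=42)  # deterministic for a given seed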

        A different, less specific example I can imagine a workflow for is reinforcement learning. Like if the developer writes code that effectively says “give me terrain that is [a variety of different parameters]”, then when the system produces that for them, they go “hmm, not quite. Needs more [thing]”. This iterative process could, of course, be done without any machine learning, if the dev was tuning the parameters themselves at each stage, but it seems plausible to me that it could use machine learning (which would involve tuning model hyperparameters rather than parameters).

        You make a good point about procedural generation at runtime, and I agree that this seems unlikely to be viable. However, I’d be surprised if it wasn’t used in the development process though in at least some cases. I’ll give a couple of hypothetical examples using real games, though I emphasise that I do not have grounds to believe that either of these games used machine learning during development, and that this is just a hypothetical pondering.

        For instance, in Valheim, maps are procedurally generated. In the meadows biome, you can find raspberry bushes. Another feature of the meadows biome is that it occasionally has large clearings that are devoid of trees, and around the edges of these clearings, there is usually a higher rate of raspberry bushes. When I played, I wondered why this was the case — was it a deliberate design decision, or just an artifact of how the procedural generation works? Through machine learning, it could, in theory, be both of these things — the devs could tune the hyperparameters a particular way, and then notice that the output results in raspberry bushes being more likely to occur in clusters on the edge of clearings, which they like. This kind of process wouldn’t require any machine learning to be running at runtime.

        Another example game is Deep Rock Galactic. I really like the level generation it uses. The biomes are diverse and interesting, and despite having hundreds of hours in the game, there are very few instances that I can remember seeing the level generation being broken in some way — the vast majority of environments appear plausible and natural, which is impressive given the large number of game objects and terrain. The level generation code that runs each time a new map is generated has a heckton of different parameters and constraints that enable these varied and non-broken levels, and there’s certainly no machine learning being used at runtime here, but I can plausibly imagine machine learning being useful in the development process, for figuring out which parameters and constraints were the most important ones (especially because too many will cause excessive load times for players, so reducing that down would be useful).

        Machine learning certainly wouldn’t be necessary in either of these examples, but it could be something that could make certain parts of development easier.

        • AdrianTheFrog@lemmy.world · ↑1 · 4 hours ago

          Sure, I could definitely see situations where it would be useful, but I’m fairly confident that no current games are doing that. First of all, it is a whole lot easier said than done to get real-world data for that type of thing. Even if you manage to find a dataset with positions of various features across various biomes and train an AI model on that, in 99% of cases it will still take a whole lot more development time and probably be a whole lot less flexible than manually setting up rulesets, blending different noise maps, having artists scatter objects in an area, etc. It will probably also have problems generating unusual terrain types, which is a problem if the game is set in a fantasy world with terrain that is unlike what you would find in the real world. So then, you’d need artists to come up with a whole lot of data to train the model with, when they could just be making the terrain directly. I’m sure Google DeepMind or Meta AI or whatever or some team of university researchers could come up with a way to do AI terrain generation very well, but game studios are not typically connected to those sorts of people, even if they technically are under the same parent company, like Microsoft or Meta.

          You can get very far with conventional procedural generation techniques: hydraulic erosion, climate simulation, maybe even a model of an ecosystem. And all of those things together would probably still be much more approachable for a game studio than some sort of machine learning landscape prediction.

    • Jankatarch@lemmy.world · ↑7 · edited 9 hours ago

      You can use statistics to estimate a child’s final height by their current height and their parents’ height.

      People “train” models by writing a program to randomly make and modify equations, then keep the changes if the new accuracy is higher.
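
      To make that concrete, here is a toy version of that “randomly modify the equation and keep it if accuracy improves” loop, using the height example (a sketch only: the data is made up, and real training uses gradient descent rather than random tweaks, but the keep-whatever-scores-better idea is the same):

      import random

      # (child height now, mid-parent height) -> adult height, in cm (made-up numbers)
      data = [((150, 170), 176), ((140, 165), 168), ((155, 180), 185), ((145, 175), 178)]

      def predict(weights, child_now, mid_parent):
          w0, w1, w2 = weights
          return w0 + w1 * child_now + w2 * mid_parent

      def error(weights):
          return sum((predict(weights, c, p) - adult) ** 2 for (c, p), adult in data)

      rng = random.Random(0)
      weights = [0.0, 0.0, 0.0]
      for _ in range(20000):
          candidate = [w + rng.gauss(0, 0.1) for w in weights]  # randomly modify the equation
          if error(candidate) < error(weights):                 # keep it only if it scores better
              weights = candidate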

      In the case of LLMs, generative AI predicts what the first result on a Google search or the first reply on WhatsApp would look like.

      There are problems. Training from 94% to 95% accuracy takes exponentially more resources, as there is no “code” you can just fix. Hallucinations will happen.

      On the other hand, procedural generation in games just refers to handwritten algorithms.

      For example a programmer may go “well a maze is just multiple, smaller mazes combined.” Then write a program to generate mazes based on that concept.
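
      A sketch of that idea, using recursive division (one common reading of “a maze is just smaller mazes combined”): everything below is a handwritten rule plus a seed, with no training data anywhere.

      import random

      def make_maze(cols, rows, seed=0):
          rng = random.Random(seed)
          W, H = 2 * cols + 1, 2 * rows + 1
          # '#' = wall, ' ' = passage; start with one open room inside a solid border
          grid = [['#' if x in (0, W - 1) or y in (0, H - 1) else ' ' for x in range(W)]
                  for y in range(H)]

          def divide(x1, y1, x2, y2):
              # (x1, y1)-(x2, y2) are the chamber's passage bounds (odd coordinates)
              if x2 - x1 < 2 and y2 - y1 < 2:
                  return  # a single cell: the smallest possible "maze"
              if x2 - x1 > y2 - y1 or (x2 - x1 == y2 - y1 and rng.random() < 0.5):
                  wall_x = rng.randrange(x1 + 1, x2, 2)   # vertical wall on an even column
                  gap_y = rng.randrange(y1, y2 + 1, 2)    # one doorway on an odd row
                  for y in range(y1, y2 + 1):
                      if y != gap_y:
                          grid[y][wall_x] = '#'
                  divide(x1, y1, wall_x - 1, y2)          # smaller maze on the left...
                  divide(wall_x + 1, y1, x2, y2)          # ...combined with one on the right
              else:
                  wall_y = rng.randrange(y1 + 1, y2, 2)   # horizontal wall on an even row
                  gap_x = rng.randrange(x1, x2 + 1, 2)    # one doorway on an odd column
                  for x in range(x1, x2 + 1):
                      if x != gap_x:
                          grid[wall_y][x] = '#'
                  divide(x1, y1, x2, wall_y - 1)
                  divide(x1, wall_y + 1, x2, y2)

          divide(1, 1, W - 2, H - 2)
          return '\n'.join(''.join(row) for row in grid)

      print(make_maze(10, 6, seed=3))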

      It’s much cheaper: you don’t need a GPU or an internet connection to use the algorithm. And if it doesn’t work, people can debug it on the spot.

      Also it doesn’t require stealing from 100 million people to be usable

      (I kinda oversimplified generative AI, modern models may do something entirely different)

    • nlgranger@lemmy.world · ↑7 · 9 hours ago

      From my understanding, AI is the general field of automating logical (“intelligent”) tasks.

      Within it, you will find Machine Learning algorithms, the ones that are trained on exemplar data, but also other methods, for instance old text generators based on syntactic rules.

      Within Machine Learning, not all methods use Neural Networks, for instance if you have seen cool brake calipers and rocket nozzle designed with AI, I believe those were made with genetic algorithms.
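
      As a tiny sketch of that kind of non-neural-network method, here is a genetic algorithm evolving a “design” (just a list of numbers scored against a made-up target; a real nozzle or caliper would be scored by a physics simulation instead):

      import random

      rng = random.Random(42)
      TARGET = [0.2, 0.8, 0.5, 0.9]  # stand-in for ideal design parameters

      def fitness(candidate):
          # higher is better: negative squared distance to the target design
          return -sum((c - t) ** 2 for c, t in zip(candidate, TARGET))

      def mutate(candidate):
          return [min(1.0, max(0.0, c + rng.gauss(0, 0.05))) for c in candidate]

      def crossover(a, b):
          return [rng.choice(pair) for pair in zip(a, b)]

      population = [[rng.random() for _ in TARGET] for _ in range(30)]
      for generation in range(200):
          population.sort(key=fitness, reverse=True)
          parents = population[:10]  # keep the fittest designs
          children = [mutate(crossover(rng.choice(parents), rng.choice(parents)))
                      for _ in range(20)]
          population = parents + children

      print(max(population, key=fitness))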

      For procedural generation, I assume there is a whole range of methods that can be used:

      • Unreal Engine Megaplants seems to contain configurable tree generation algorithms, that’s mostly handcrafted algorithms with maybe some machine learning to find the parameters ranges.
      • Motion capture and 3D reconstruction models can be used to build the assets. I don’t believe these rely on stolen artist data.
      • Full on image generation models (sora, etc.) to produce assets and textures, these require training on stolen artist data AFAIK (some arrangements were made between some companies but I suspect it’s marginal).
      • AnarchistArtificer@slrpnk.net · ↑1 · 4 hours ago

        I agree with the ethical standpoint of banning Generative AI on the grounds that it’s trained on stolen artist data, but I’m not sure how tenable “trained on stolen artist data” is as a technical definition of what is not acceptable.

        For example, if a model were trained exclusively on licensed works and data, would this be permissible? Intuitively, I’d still consider that to be Generative AI (though this might be a moot point, because the one thing I agree with the tech giants on is that it’s impractical to train Generative AI systems on licensed data because of the gargantuan amounts of training data required)

        Perhaps it’s foolish of me to even attempt to pin down definitions in this way, but given how tech oligarchs often use terms in slippery and misleading ways, I’ve found it useful to try to pin terms down where possible.

    • lime!@feddit.nu · ↑9 ↓1 · edited 10 hours ago

      generative ai is a subset of procedural generation algorithms. specifically it’s a procedural algorithm with a massive amount of weight parameters, on the order of billions to hundreds of billions. you get the weights by training. for image generation (which i’m assuming is what was in use here), the term to look up is “latent diffusion”. basically you take all your training images and add noise to them step by step, then set your weights to undo each noising step. then when you want an image you start from pure noise and run the model backwards.
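
      a toy sketch of that idea (plain diffusion on the image itself rather than actual latent diffusion, which first compresses images into a latent space; the model argument below is just a stand-in for the learned network):

      import numpy as np

      rng = np.random.default_rng(0)
      T = 1000
      betas = np.linspace(1e-4, 0.02, T)  # how much noise each step adds
      alphas_bar = np.cumprod(1.0 - betas)

      def add_noise(x0, t):
          """forward process: mix a clean image x0 with gaussian noise at step t."""
          eps = rng.standard_normal(x0.shape)
          xt = np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1.0 - alphas_bar[t]) * eps
          return xt, eps  # eps is what the model learns to predict

      def training_step(model, x0):
          """one (hypothetical) training step: score how well the model undoes the noising."""
          t = rng.integers(0, T)
          xt, eps = add_noise(x0, t)
          return np.mean((model(xt, t) - eps) ** 2)

      # sampling "runs it backwards": start from pure noise and repeatedly subtract
      # the model's predicted noise, stepping t from T - 1 down to 0.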

      • AnarchistArtificer@slrpnk.net · ↑2 · 5 hours ago

        Yeah, that was my understanding of things too. What I’m curious about is how the Indie Game awards define it. Because if games that use ((Procedural Generation) AND NOT (Generative AI)) are permitted, then that would surely require a way of cleanly delineating between Generative AI and the rest of procedural generation that exists beyond generative AI

        • lime!@feddit.nu · ↑3 · 3 hours ago

          most procedural algorithms don’t require training data, for one. they can just be given a seed and run. or rather, the number of weights is so minimal that you can set them by hand.

  • LupertEverett@lemmy.world · ↑23 ↓8 · 12 hours ago

    The fact that they were there in the first place is a problem.

    Why does a game that has been published by some other company call itself “indie”???

    The term itself is becoming more and more meaningless as time passes.

    • Eranziel@lemmy.world · ↑4 · 9 hours ago

      It has to be more nuanced than “self-published”, otherwise everything EA craps out is “indie”.

      The definition of “indie game” is a case where there is no easy, clear line to draw in the sand.

      • LupertEverett@lemmy.world · ↑1 · edited 7 hours ago

        It has to be more nuanced than “self-published”

        It doesn’t need to be. Defining it as “self-publishing” is enough.

        otherwise everything EA craps out is “indie”.

        And because of the above, EA games might very well fit the definition, yes.

        This clearly shows that maybe we shouldn’t use “indie” to describe good games (or the lack of it to describe bad ones). It should just be used to describe the means of publishing.

      • Kjell@lemmy.world · ↑10 ↓1 · 9 hours ago

        Indie game

        An indie video game or indie game is a video game created by individuals or smaller development teams, and typically without the financial and technical support of a large game publisher,

        Clair Obscur: Expedition 33

        After inking a partnership with Kepler Interactive, which was officially announced in early 2023, and securing funding from said publisher, Sandfall grew into a studio of about thirty developers, three of whom—including Broche and Guillermin—were former Ubisoft employees.[38][39][40][29][27][30][excessive citations] The funding also allowed Sandfall to expand the manpower contributing to the project beyond this core team, having outsourced gameplay combat animation to a team of eight South Korean freelance animators and quality assurance (QA) to a few dozen QA testers from the firm QLOC, as well as receiving assistance from a half-dozen developers from Ebb Software to port the game to consoles. The studio also hired a couple of performance capture artists; brought in musicians for the soundtrack recording sessions; contracted with translators from Riotloc for language localization; and partnered with Side UK and Studio Anatole as to voice casting and production in English and French respectively.[39][41] Finally, the partnership with Kepler Interactive enabled Sandfall to pay for noted professional voice actors, including Charlie Cox, Andy Serkis and Ben Starr.[35][37]

        With a team of 30 developers and dozens of consultants for things like QA, it doesn’t sound like a small development team. And they clearly had support from a game publisher.

  • merdaverse@lemmy.zip · ↑26 ↓13 · edited 12 hours ago

    Clair Obscur is not indie by any definition of the term. I don’t even know why it was considered at all.

    • Rooster326@programming.dev · ↑8 · 9 hours ago

      Sandfall Interactive is independent from its publisher, Kepler. Many of the other games Kepler produces are typically considered indie - why not Expedition 33? BG3 is “indie” by this definition.

      Meanwhile Hades, Hollow Knight, and Celeste, being both owned and published by the same company, are not indie.

      So… idk what definition everyone is using. Seems to be whatever suits their agenda at the time of award.

      • kinsnik@lemmy.world · ↑7 · 8 hours ago

        While Hades, Hollow Knight, and Celeste being both owned and published by the same company are not indie.

        if your definition of indie excludes Hades, Hollow Knight and Celeste because they are independent, I have to say that it is a very bad definition of what an indie game is.

        personally, if a game has enough budget to hire Charlie Cox or Andy Serkis, it probably should not be in an indie award ceremony.

        • Rooster326@programming.dev · ↑2 · edited 5 hours ago

          Yes okay but how do you define it?

          Because that is all that “Indiependent” means.

          Remember Hades and Hades 2 had a bigger budget than E33

          1. Hades production cost was over $15 Million
          2. Hades 2 production cost was over $20 Million
          3. E33 was less than $10 Million.

          Hollow Knight was developed by 2 people with a $58,000 budget. How more independent do you want to get?

  • w3dd1e@lemmy.zip · ↑69 ↓3 · 16 hours ago

    I kinda feel like Clair Obscur is sort of stretching the definition of indie game.

    I guess technically it is.

    I’m not saying every game needs to be made in someone’s garage and take 12 years to make, but it sounds like this game was completely funded by Kepler and parts of the game were outsourced to other companies. Sandfall is made up of experienced developers from places like Ubisoft. Kinda feels like Brad Pitt and Tom Cruise made their own movie with funding from a lesser known subdivision of Warner Bros, outsourced SFX to 300 animators, and called it indie because they filmed it with 10 people.

    I do think Clair Obscur is a fantastic game and deserves to be Game of the Year (aside from the AI use). Sandfall and Kepler did a great job with a reported budget of $10M(!) and I especially appreciate what Kepler is doing to support the gaming industry.

    I guess I see the point of the award to inspire people to believe they shouldn’t give up on their dreams by recognizing small teams making games outside of the traditional industry. I just don’t feel like Sandfall qualifies.

    In the end, it’s not my award and they can give it to whoever they want!

    • Coelacanth@feddit.nu · ↑10 ↓2 · 14 hours ago

      I agree with your take. The definition of what an “indie” is is very vague and subjective, but given the budget, resources, and circumstances of E33’s development, it falls outside what seems to be the “spirit of the award”.

      Blue Prince should have gotten the award to begin with.

      • MrFinnbean@lemmy.world · ↑3 ↓1 · 12 hours ago

        Well, the definition of indie is independently published. It’s not vague in itself, but the way people have started to use the word has changed its meaning from something well defined to something more feeling-based. I personally don’t like it. People counted games like Dave the Diver as indie games when it had the huge company Nexon publishing it.

        If we follow the original meaning of the word, Blue Prince is not an indie game either. It was published by Raw Fury. But Baldur’s Gate 3 would be indie, as Larian published it.

        Indie as a word is like AI. It does not follow its original definition, and because people have become used to misusing the word, that has become the new norm.

        • Coelacanth@feddit.nu · ↑5 ↓1 · 11 hours ago

          People didn’t call Dave the Diver an indie game. The Game Awards nominated it in that category, and rightly got a lot of shit for it.

          Indie is a fraught and vague term in whatever genre of culture it gets applied to. During the early 00s indie music era you had tons of mass produced “indie rock” pushed out by big labels too.

          Everyone kind of knows what it’s supposed to mean: small budget, small crew, independent of the major commercial publishers/labels/whatever. But there will always be edge cases in both directions.

          • MrFinnbean@lemmy.world · ↑2 · 10 hours ago

            Plenty of people called Dave the Diver an indie game. There are also lots of reviews from the time that call it an indie game, both from YouTubers and “real reviewers” like IGN.

            We can talk forever about what indie means to each of us personally; everybody who has an opinion has a slightly different view.

            But originally, in both movies and music, indie meant independent publishing. That means the artist has no obligations to outside parties and can freely, and without restrictions, carry out their own artistic vision. Budget or crew size has nothing to do with it. The only reason people associate it with small teams is that the largest portion of independently published projects were done by small teams and/or as passion projects.

            Valerian and the City of a Thousand Planets is an indie movie with a budget of $220 million, while the average Hollywood movie budget is somewhere between $100-150 million. Hell, The Passion of the Christ had a budget of $30 million and is maybe one of the most famous indie movies.

  • brucethemoose@lemmy.world · ↑261 ↓22 · edited 21 hours ago

    Seems excessive.

    There are AI slop games, the new breed of lazy asset flips. There’s replacing employees with slop machines.

    And then there’s “a few of our textures were computer generated.” In a game that is clearly passionately crafted art.

    I get it’s about principle, but still.

    • Naia@lemmy.blahaj.zone · ↑94 ↓6 · edited 21 hours ago

      For stuff like dirt/stone/brick/etc. textures I’m less strict about the use of generative stuff. I even think having an artist make the “core” texture and then using an AI to fill out the texture across the various surfaces to make it less repetitive over a large area isn’t a problem for me.

      Like, I agree that these things generally are ethically questionable with how they are trained, but you can train them on ethically sourced data, and doing so could open up the ability to fill out a game world without spending a ton of time, leaving the actual artists more time to work on the important set pieces rather than the dirt road connecting them.

      • brucethemoose@lemmy.world · ↑42 ↓12 · edited 21 hours ago

        And little tools like that give studios like this an edge over AAAs. It’s the start of negating their massive manpower advantage.

        In other words, the anti-corpo angle seems well worth the “cost” of a few generations. That’s the whole point of the AI protest, right? It’s really against the corps enshittifying stuff.

        And little niche extensions in workflows are how machine learning is supposed to be used, like it was well before it got all the hype.

        • WalnutLum@lemmy.ml · ↑24 ↓1 · 18 hours ago

          Most AAA studios at this point have in-house AIs and training, so I’m not sure it’s the equalizing factor people think it is.

          • brucethemoose@lemmy.world · ↑8 ↓3 · 18 hours ago

            An OpenAI subscription does not count.

            Otherwise, yeah… but it helps them less, proportionally. AAAs still have the fundamental issue of targeting huge audiences with bland games. Making them even more gigantic isn’t going to help much.

            AAs and below can get closer to that “AAA” feel with their more focused project.

        • tomalley8342@lemmy.world · ↑13 ↓37 · 21 hours ago

          100% agree. I’m glad AI is democratizing the ability for the little guys like you and me to not pay artists for art.

            • tomalley8342@lemmy.world · ↑13 ↓7 · 20 hours ago

              And little tools like that give studios like this an edge over AAAs. It’s the start of negating their massive manpower advantage.

              The implication here is that you can gain manpower without hiring more men, no?

              • lepinkainen@lemmy.world · ↑8 ↓3 · 16 hours ago

                One builder only uses hand tools, the other uses power tools.

                That’s the difference; nobody is hiring fewer people because the tools are better.

                • EldritchFemininity@lemmy.blahaj.zone · ↑3 · 8 hours ago

                  Except, right now, they absolutely are. The tools are largely as you describe - though thinking about it, I think I’d describe it more as an airbrush vs a paint brush - but that’s not the way that upper management sees it for the most part, and not how the average supporter of GenAI sees it even if they don’t recognize that that’s their view. Both of these groups see it as a way to cut costs by reducing manpower, even if the GenAI folk don’t recognize that that’s what their stance is (or refuse to accept it). It’s the same as in the programming side of the conversation: vibe coders and prompt generators being hired instead of skilled professionals who can actually use the tools where they’re truly useful. Why pay an artist or programmer to do the work when I can just ask an LLM trained on stolen work to do it for me instead.

                  I read a great post probably a year ago now from somebody who works for a movie studio on why the company has banned hiring prompters. The short of it is, they hired on a number of prompters to replace some jobs that would normally be filled by artists as a test to see if they could reduce their staff while maintaining the same levels of production. What they found was that prompters could produce a massive volume of work very quickly. You ask the team for pictures of a forest scene and the artists would come back in a week with a dozen concepts each while the prompters had 50 the next day. But, if you asked them to take one of their concept pieces and do something like remove the house in it or add people in the foreground, they’d come back the next day with 50 new concept pieces but not the original. They couldn’t grasp the concept of editing and refining an image, only using GenAI to generate more with a new set of prompt parameters, and therefore were incapable of doing the work needed that an artist could do.

                  A feel-good story for artists showing what AI is actually capable of and what it isn’t, except for one thing: the company still replaced artists with AI before they learned their lesson, and that’s the phase most of the world is in right now and will probably continue to be in until the bubble bursts. And as Alanah Pierce so eloquently put it when talking about the record setting year over year layoffs in the gaming industry (each year has been worse than during the 2008 financial crash): “Most of those people will never work in games again. There’s just too many people out of work and not enough jobs to go around.” These companies currently in the fuck around phase will find out eventually, but by then it won’t matter for many people. They’ll never find a job in their field in time and be forced into other work. Art is already one of the lowest paying jobs for the amount of effort and experience required. Many artists who work on commissions do so for less than minimum wage, and starting wages in the game industry for artists haven’t increased since I was looking at jobs in the field 15 years ago.

              • brucethemoose@lemmy.world · ↑13 ↓3 · edited 20 hours ago

                More that an existing smaller studio doesn’t have to sell their soul to a publisher (or get lucky) to survive. They can more safely make a “big” game without going AAA.

                My observation is that there’s a “sweet spot” for developers somewhere around the Satisfactory (Coffee Stain) size, with E33 at the upper end of that, but that limits their audience and scope. If they can cut expensive mocap rigs, a bunch of outsourced bulk art, stuff like that with specific automation, so long as they don’t tether themselves to Big Tech AI, that takes away the advantage AAAs have over them.

                A few computer generated textures is the first tiny step in that direction.

                So no. AI is shit at replacing artists. Especially in E33 tier games. But it’s not a bad tool to add to their bucket, so they can do more.

                • tomalley8342@lemmy.world · ↑7 ↓12 · 20 hours ago

                  Right, so the barrier was that they had to pay for this “outsourced bulk art”, and now with AI they don’t have to. It looks like we are in agreement when I say “I’m glad AI is democratizing the ability for the little guys like you and me to not pay artists for art”?

          • fonix232@fedia.io · ↑19 ↓7 · 20 hours ago

            Oh fuck off with that sentiment. You’re very well aware that that’s not what happened here, nor is it what’s happening in a majority of genAI usage cases. In fact in most cases it IS artists using genAI to speed up the design process.

            What AI does here is allow small teams to get art done that would otherwise eat up their budget, i.e. art they literally couldn’t afford. No artists were harmed in these cases, because if AI didn’t exist they simply wouldn’t have been hired.

            Yes, there IS a currently ongoing shift. Just like there was, e.g., with the mechanical loom. Did that kill off handmade clothing? No - even today we still have artists making handmade clothing, and in fact making tons more off of it, while the masses got access to cheap clothing. The initial sudden rush to the new tech is annoying, and yes, it exposes some people to hardships (which is why we should switch away from capitalism and start providing UBI), but it WILL balance out. Remember, the Luddites were wrong in the end.

            • Dremor@lemmy.world (moderator) · ↑1 ↓1 · 2 hours ago

              Language 😠.

              Yes, I know I’m kinda strict on that, but there is no reason here to resort to insults.

              You’ve got a good point here, and the message you answered to got downvoted to oblivion.

              If you disagree, downvote away; don’t feed the possible troll with your anger.

            • SabinStargem@lemmy.today · ↑2 ↓1 · edited 6 hours ago

              I think the Luddites weren’t just wrong, but actively harmed the masses. They should have been trying to take control of the machines to help themselves, not destroying them, so that they can set more ethical working conditions and pay. The wealthy will always build and use the machines, it is a question whether there are good people running their own businesses who can compete against the feckless elite.

              That is why I am opposed to anti-AI people, because they are doing the work of ensuring the 1% get sole agency over the usage of AI. Knowingly or not, Luddites are serving the worst of humanity.

              • petrol_sniff_king@lemmy.blahaj.zone · ↑2 · 5 hours ago

                If 1 guy I know gets sole agency over allll the cocaine in my neighborhood, I don’t really care that much. I don’t think we should live in a cocaine-based society, haha.

            • tomalley8342@lemmy.world · ↑12 ↓4 · 19 hours ago

              What AI does here is allowing small teams to get art done what otherwise would eat up their budget, aka they literally couldn’t afford. No artists were harmed in these cases because if AI didn’t exist they simply wouldn’t have been hired.

              That excuse can be used by big publishers as well, no?

              • brucethemoose@lemmy.world · ↑8 · edited 19 hours ago

                Oh, yes. Big publishers will try it on a huge scale. They can’t help themselves.

                And they’re going to get sloppy results back. If they wanna footgun themselves, it’s their foot to shoot.


                Some mid sized devs may catch this “Tech Bro Syndrome” too, unfortunately.

                • fonix232@fedia.io · ↑2 ↓1 · 9 hours ago

                  For reference, see the latest McDonalds Christmas advert scandal. Or was it Coca Cola?

                  Like with any new tech, companies will try to exploit it to reduce expenses on people, then quickly realise that just because you replaced a hammer with a hydraulic smithing press, you haven’t suddenly become a blacksmith yourself and still need the blacksmith to make shit happen - but now one blacksmith can do ten times more.

                • tomalley8342@lemmy.world · ↑4 ↓9 · 19 hours ago
                  19 hours ago

                  Yes, like we went over before, it’s literally OK to use AI if the studios that I support use it to generate things that I like.

            • setsubyou@lemmy.world · ↑4 ↓3 · edited 17 hours ago

              I’ve been programming as a hobby since I was 9. It’s also my job so I rarely finish the hobby projects anymore, but still.

              On my first computer (Apple II) I was able to make a complete game as a kid that I felt was comparable to some of the commercial ones we had.

              In the 1990s I was just a teenager busy with school, but I could make software that was competitive with paid products. Published some things via magazines.

              In the late 90s I made websites with a few friends from school. Made a lot of money in teenager terms. Huge head start for university.

              In the 2000s for the first time I felt that I couldn’t get anywhere close to commercial games anymore. I’m good at programming but pretty much only at that. My art skills are still on the same level as when I was a kid. Last time I used my own hand drawn art professionally was in 2007.

              Games continued becoming more and more complex. They now often have incredibly detailed 3D worlds or at least an insane amount of pixel art. Big games have huge custom sound tracks. I can’t do any of that. My graphics tablets and my piano are collecting dust.

              In 2025, AI would theoretically give me options again. It can cover some of my weak areas. But people hate it, so there’s no point. Indie developers now seemingly need large teams to count as indie (according to this award); for a single person it’s difficult, especially with limited time.

              It’d be nice if the ethical issues could be fixed though. There are image models trained on proprietary data only, music models will get there too because of some recent legal settlements, but it’s not enough yet.

              • warm@kbin.earth
                link
                fedilink
                arrow-up
                4
                ·
                8 hours ago

                It’s been proven time and time again that a game doesn’t need to compare to AA and AAA shit to be successful. You don’t need a big game with a big world. There’s an endless list of simple indie games with a captivating charm that are crazy successful, all without a single bit of AI used.

              • fonix232@fedia.io
                link
                fedilink
                arrow-up
                3
                arrow-down
                3
                ·
                9 hours ago

                I fully agree with the ethical parts, but not with the bit of people hating it.

                Reality is that people on platforms like Reddit or Lemmy (or the tech side of the Fediverse in general) can be incredibly fervent about their AI hate, but they don’t represent the average people, whose work has become ever so slightly more convenient thanks to AI - whether that’s meeting summarisation, writing tools that make complex emails easier, or software engineers whose workload has been trimmed too.

                I am a software engineer and I use our own Claude instance extensively because it’s really good at writing tests and KDoc, and it’s super helpful for code discovery (our codebase is huge, and I mostly work on a very small segment of it; going outside my domain I can either spend an hour doing manual discovery, or tell Claude to collate all the info I need and go for a coffee while it does so). It also writes work item summaries, commit messages, and so on. It doesn’t even have to generate (production) code to be incredibly useful.

                The general sentiment among my co-workers is that it’s a great tool that means we can hit targets quicker, and luckily our management realises that we still need the manpower to do things manually, so it’s not like they’re shrinking teams by leaning on AI. They’d rather take the improved performance, and thus the improved revenue, than keep revenue stagnant-ish and reduce expenses.

                So yeah the sentiment isn’t all negative.
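
                For illustration only, here is a minimal, hypothetical Python sketch of the kind of workflow described above - handing a staged diff to a hosted Claude instance and asking for a draft commit message. The model name and prompt are placeholders, not details from the commenter’s actual setup.

                    import subprocess
                    import anthropic  # assumes `pip install anthropic` and an ANTHROPIC_API_KEY in the environment

                    # Grab the staged changes the commit message should describe.
                    diff = subprocess.run(
                        ["git", "diff", "--staged"], capture_output=True, text=True
                    ).stdout

                    client = anthropic.Anthropic()
                    msg = client.messages.create(
                        model="claude-sonnet-4-5",  # placeholder; use whatever model your instance exposes
                        max_tokens=200,
                        messages=[{
                            "role": "user",
                            "content": "Write a one-line commit message for this diff:\n\n" + diff,
                        }],
                    )
                    print(msg.content[0].text)  # a human still reviews and edits before committing

                The same pattern - pipe some context in, get a draft back for a human to review - covers test stubs, KDoc and work-item summaries without any generated code ever reaching production.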

                • kazerniel@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  3
                  ·
                  7 hours ago

                  Reality is that people on platforms like Reddit or Lemmy (or the tech side of the Fediverse in general) can be incredibly fervent about their AI hate, but they don’t represent the average people, whose work has become ever so slightly more convenient thanks to AI

                  According to research, the overwhelming majority of gamers across all ages and genders do hate genAI though:

                  Gamers Are Overwhelmingly Negative About Gen AI in Video Games, but Attitudes Vary by Gender, Age, and Gaming Motivations. - Quantic Foundry

                  In a recent survey, we explored gamers’ attitudes towards the use of Gen AI in video games and whether those attitudes varied by demographics and gaming motivations. The overwhelmingly negative attitude stood out compared to other surveys we’ve run over the past decade.
                  (…)
                  Overall, the attitude towards the use of Gen AI in video games is very negative. 85% of respondents have a below-neutral attitude towards the use of Gen AI in video games, with a highly-skewed 63% who selected the most negative response option.

      • warm@kbin.earth
        link
        fedilink
        arrow-up
        29
        arrow-down
        9
        ·
        20 hours ago

        Who made the textures or took the photos that those AI-generated ones were derived from, and do they get a cut? That justification is even more bizarre now, considering the tools we have to photoscan.

    • Kilgore Trout@feddit.it
      link
      fedilink
      English
      arrow-up
      23
      arrow-down
      4
      ·
      edit-2
      18 hours ago

      Let them have their award with their own rules.
      Although I wouldn’t talk about integrity when someone still claims Clair Obscur is an indie.

    • RagingRobot@lemmy.world
      link
      fedilink
      English
      arrow-up
      37
      arrow-down
      2
      ·
      21 hours ago

      Also, what about AI code tools? Like if they use Cursor to help write some code, does that disqualify them?

      • seathru@quokk.au
        link
        fedilink
        English
        arrow-up
        51
        arrow-down
        2
        ·
        20 hours ago

        If you do that and proceed to say “No, we didn’t use any AI tools”, then yes, that should be a disqualification.

        “When it was submitted for consideration, representatives of Sandfall Interactive agreed that no gen AI was used in the development of Clair Obscur: Expedition 33.”

        • ZoteTheMighty@lemmy.zip
          link
          fedilink
          English
          arrow-up
          6
          arrow-down
          7
          ·
          11 hours ago

          It’s highly likely that EVERY video game dev team has at least one person who is using Cursor, whether it violates their AI policy or not. It’s massively popular, looks just like VS Code, and can be hard to detect.

          • NotMyOldRedditName@lemmy.world
            link
            fedilink
            English
            arrow-up
            8
            arrow-down
            1
            ·
            edit-2
            7 hours ago

            You don’t even need to use Cursor. All the major IDEs now bundle LLMs to help with code completion and code generation. There’s zero chance there’s no gen-AI code in any project with more than a few people nowadays.

            • PapstJL4U@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              ·
              3 hours ago

              The question is whether better for-loop completion is the same as “create this feature”.

              • NotMyOldRedditName@lemmy.world
                link
                fedilink
                English
                arrow-up
                2
                arrow-down
                1
                ·
                edit-2
                2 hours ago

                Doesn’t matter, the rules ban all AI. The rules are stupid.

                Edit: I mean the rules are so stupid they probably cover you googling an exception and reading the answer Google provides at the top, which is gen AI, since that answer was used to help make the game even if you used nothing from it.

                Edit: or Sentry, which even has AI insights into crashes in its default service.

        • brucethemoose@lemmy.world
          link
          fedilink
          English
          arrow-up
          11
          arrow-down
          13
          ·
          edit-2
          19 hours ago

          That’s fair.

          But the Game Awards should reconsider that label next year. The connotation is clearly “AI Slop,” and that just doesn’t fit for stuff like Cursor code completion, or the few textures E33 used.

          Otherwise studios are just going to lie. If they don’t, GA will be completely devoid of bigger projects.

          …I don’t know what the threshold for an “AI Slop” game should be, though. It’s clearly not E33. But you don’t want a sloppy, heavily marketed game worming its way in, either.

          • Ryanmiller70@lemmy.zip
            link
            fedilink
            English
            arrow-up
            2
            ·
            10 hours ago

            I’d have no problem with a show that wants its awards to be taken seriously removing all or most bigger projects.

          • warm@kbin.earth
            link
            fedilink
            arrow-up
            24
            arrow-down
            6
            ·
            19 hours ago

            You have to draw the line somewhere; saying no game can use AI is much simpler than an arbitrary definition of what slop is. It also means we reward real artistry every time.

            • frongt@lemmy.zip
              link
              fedilink
              English
              arrow-up
              8
              arrow-down
              3
              ·
              19 hours ago

              Awards like these are inherently subjective. You don’t have to draw an objective line anywhere.

            • brucethemoose@lemmy.world
              link
              fedilink
              English
              arrow-up
              14
              arrow-down
              10
              ·
              edit-2
              19 hours ago

              Then you’re going to get almost no games.

              Or just get devs lying about using Cursor or whatever when they code.

              If that’s the culture of the Game Awards, if they have to lie just to get on, that… doesn’t seem healthy.

              • warm@kbin.earth
                link
                fedilink
                arrow-up
                17
                arrow-down
                5
                ·
                18 hours ago

                How have we all forgotten that games were made perfectly fine for decades without AI? Better games even.

                I’d rather give an award to a “worse” game that didn’t use AI than to a game that did.

                Devs can lie, but the truth always comes out eventually.

                • Kogasa@programming.dev
                  link
                  fedilink
                  English
                  arrow-up
                  10
                  arrow-down
                  4
                  ·
                  17 hours ago

                  “the truth” being that a few generated placeholder textures were accidentally left in and promptly replaced? crazy

                • brucethemoose@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  7
                  arrow-down
                  6
                  ·
                  18 hours ago

                  Then most just won’t go on the Game Awards, and devs will go on using Cursor or whatever they feel comfortable with in their IDE setup.

                  I’m all against AI slop, but you’re setting an unreasonably absolute standard. It’s like saying “I will never play any game that was developed in proximity to any closed-source software.” That is technically possible, but most people aren’t gonna do that, and it’s basically impossible on a larger team. Give them some slack with the requirement: it’s okay to develop on Windows or on Steam, just open the game’s source.

                  Similarly, let devs use basic tools. Ban slop from the end product.

                • lepinkainen@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  4
                  arrow-down
                  5
                  ·
                  16 hours ago

                  Games used to be made by a single person not sleeping for a week.

                  But people expect more now, and one person can’t do it fueled just by passion. The other people involved want to get paid now, not when the game is released.

                  Limiting the tools people can use to make games is ableist, elitist and just stupid.

            • Holytimes@sh.itjust.works
              link
              fedilink
              English
              arrow-up
              6
              arrow-down
              7
              ·
              edit-2
              16 hours ago

              By this logic you could also ban Photoshop, tablets and any other software or hardware tool that has improved accessibility and workflow over the years.

              AI is a tool, flat out banning it won’t and can’t work. It’s too fucking useful.

              People said that anyone who used Photoshop wasn’t a real artist, people said computer graphics weren’t real art.

              At some point you DO have to draw an arbitrary line. Because that’s all. Art is arbitrary all of it since the dawn of mankind making art. It’s all arbitrary. If you only make hard lines that completely block tools, all you’re doing is harming artists.

              The entire point of drawing arbitrary lines is to allow artists to keep making art while dissuading people from abusing others.

              So do you want no one to be able to do anything, or do you want things to actually have artistic expression, which is arbitrary?

              AI has plenty of great uses in game development: generating LOD textures, random dirt or rock textures, building automated palette-replacement systems. There are plenty of tools that can cut down huge amounts of repetitive workload, so small teams can actually spend their limited resources on art that has a direct, major impact on their vision instead of wasting huge chunks of time and money on low-end small parts - parts that realistically wouldn’t have had any artists hired for them and have no real impact on the experience of those who consume the work, but would have huge negative impacts on those making it.

              Just because companies abuse a tool does not make the tool bad. Every artistic tool throughout all of human history has been abused by someone to hurt others. Photography, movies, Photoshop, paints - you name it, it’s been used and abused to hurt artists, and every time artists adapt, bringing the new tool on board to create new forms of expression. Even if that expression is to rebel against the tool.

              You cannot ban a tool, no matter what. You only cause more problems, becoming worse than those who abuse the tools.

              • petrol_sniff_king@lemmy.blahaj.zone
                link
                fedilink
                English
                arrow-up
                2
                ·
                4 hours ago

                At some point you DO have to draw an arbitrary line. Because that’s all. Art is arbitrary all of it since the dawn of mankind making art.

                My arbitrary line is that AI is cringe.

      • brucethemoose@lemmy.world
        link
        fedilink
        English
        arrow-up
        10
        ·
        21 hours ago

        Yeah.

        A lot of devs may do it personally, even if it’s not a company imperative (which it shouldn’t be).

    • Goodeye8@piefed.social
      link
      fedilink
      English
      arrow-up
      33
      arrow-down
      18
      ·
      20 hours ago

      People have made it excessive due to turning AI into a modern witch hunt. Maybe if people had a more nuanced take than “all AI bad” companies could be more open about how they use AI.

      I can guarantee that if E33 came out with the AI disclaimer it would’ve been far more controversial and probably less successful. And technically they should have an AI label because they did use Gen AI in the development process even if none of it was supposed to end up in the final game.

      But we can’t have companies being honest because people can’t be normal.

      • Nate Cox@programming.dev
        link
        fedilink
        English
        arrow-up
        16
        arrow-down
        6
        ·
        17 hours ago

        “All genAI bad” is a nuanced take. When you look at genAI from a moral, ethical, or sociopolitical perspective it always demonstrates itself to be a net evil.

        The core technology is predicated on theft, the data centers powering it are harmful economically and to surrounding communities, it is gobbled up by companies looking to pay less to profit more, and it’s powered by a bubble ripe for bursting which will wreak havoc on our economy.

        GenAI is indefensible as a technology, and the applications it may have for any tangible benefit can probably be accomplished by ML systems not built on the back of the LLM monster. We should all be protesting its use in all things.

          • Katana314@lemmy.world
            link
            fedilink
            English
            arrow-up
            6
            arrow-down
            4
            ·
            12 hours ago

            Okay but first, will you admit that if my cancer curing Unicorn only dispenses 100 doses of its miracle medicine from its butt when I kill a homeless man, you’d agree killing the homeless is a moral good, right?

            Or, you know, we could throw away silly fantasy scenarios.

              • Katana314@lemmy.world
                link
                fedilink
                English
                arrow-up
                5
                arrow-down
                2
                ·
                11 hours ago

                Really? Can you share your fully realized and operational generative AI that exists, and only created its model from artwork you personally made or retain full legal reproduction rights to?

                Answers Yes, or Sorry, I Lied.

          • Holytimes@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            5
            arrow-down
            13
            ·
            16 hours ago

            No, no, see - that’s not nuanced; what that guy is saying is nuanced. Being a hardline a****** is the nuanced take, so you’re clearly in the wrong here. Sorry man, it just is what it is.

            It’s like people have completely f****** forgotten what Photoshop was like when it first hit the scene. The same anti-AI b******* we’re seeing now was leveled against Photoshop and basically all digital art.

            Go back and look in the history books and read old diaries, and you’ll find that photography had all the same sentiment leveled against it that we’re seeing against AI now.

            Artists have always adapted. Just because people are abusing a new tool does not make the tool bad; it just makes those who are abusing it assholes. Given time, artists will adapt, and new forms of art will come forth from those tools.

            Cuz no matter what you say about AI, if you create a model yourself, trained entirely on your own art, and then use it to create deconstructions or modern takes on your own artwork using computers, that’s still f****** hard. It doesn’t matter that it was processed through an AI slot machine; there’s still artistic intent behind the process.

            The only problem with AI right now is that big companies are breaking copyright laws with it. Hell, you can make a solid argument that the problem isn’t even AI - it’s just the law-breaking around it and the lack of actual intent to use the tools for artistic purposes instead of just cost saving.

            Cuz as much as we can all make fun of “prompt engineers”, someone sitting down, tuning the model, and putting in specialized data for its training to generate their exact intent is still effort. It’s still intent. There are people making the equivalent of modern art using generative AI.

            People always s*** on new art forms for not being art because they use some new tool that isn’t traditional and therefore “isn’t art”. This stuff has been around for a handful of years. Give it enough time and there will be actual proper art forms built up around these tools. It has happened for hundreds if not thousands of years of human history with every new tool we have made.

            We just need to direct the anger to the correct place: s***** companies breaking the law, not the tools.

      • Lfrith@lemmy.ca
        link
        fedilink
        English
        arrow-up
        16
        arrow-down
        1
        ·
        edit-2
        20 hours ago

        It’s not surprising when even people who like AI are now being affected by consumer hardware prices, which is leading to a shift in previously positive perceptions of it.

        It’s becoming harder to ignore its effects. It’s gone from a philosophical difference of opinion to actual tangible consequences.

        So it becomes a question of whether AI is cool enough that they’re happy to put up with the rising cost of hardware - something tech enthusiasts tend to care a lot about, since it’s needed to even enjoy AI-generated stuff in the first place.

        • Serinus@lemmy.world
          link
          fedilink
          English
          arrow-up
          9
          arrow-down
          1
          ·
          19 hours ago

          How do I put this.

          AI isn’t exactly the cause of the rise in the price of hardware. Only 1/6th of the purchased Nvidia cards are actually in data centers. Same for the memory.

          We’re not using it.

          What’s really drumming up all the prices is that the billionaires are convinced that AI is going to replace tons and tons of people. It’s not. It’s the insane corporate hype that’s doing all the damage.

          It will replace some, sure. The same way the electric drill replaced carpenters. One electric drill does not replace one carpenter. That’s not how that works. Instead the carpenters can work a bit faster and their job is a bit easier. It’s worth buying and it’s worth using, but it doesn’t really replace a person. Accountants didn’t disappear as a profession when spreadsheets were invented.

          There were books written in the 1980s about how household appliances raised the standard of cleanliness. Turns out people change clothes more when cleaning clothes doesn’t involve a washing board. And I don’t think Roombas replaced that many jobs either.

          In particular, I think this is a thing that will happen for software development. I don’t think it’ll reduce the number of developers we need. I think the standards for development will just be higher. All the front end stuff in particular is going to get easier, and you won’t need as many frameworks. We’ll especially need just as many devs, if not more, in the short term. Someone’s going to have to fix the mess all these companies are going to make after they’ve fired half their devs and tried to just vibe code everything.

        • Goodeye8@piefed.social
          link
          fedilink
          English
          arrow-up
          7
          arrow-down
          5
          ·
          19 hours ago

          I agree the current state of affairs makes people even more against AI and I think people have a good reason to be against AI, but don’t you find it a bit contradictory how people are less antagonistic towards E33 AI use now that it has been revealed?

          People are far more antagonistic towards games when the first thing they see is the AI label, to the point where they dismiss the entire game as AI slop, but it seems people are willing to be more lenient on AI usage when they first get to experience the game for what it is. This unreasonable reaction to the first impression is why companies would rather hide their AI usage than inform the customer.

          • Lfrith@lemmy.ca
            link
            fedilink
            English
            arrow-up
            5
            ·
            edit-2
            19 hours ago

            I don’t know that people are less antagonistic because of E33. I think regular tech hardware enthusiasts are getting gradually angrier after the initial excitement over potential improvements in things like NPC behavior, because it’s shifting towards not being able to afford hardware to begin with.

            Things have moved from somewhat background noise to something they can no longer pretend to be unaffected by. I think the period of discourse over AI was most relevant a couple of years ago, before the hardware issues popped up. Those who hate AI now likely don’t even care that much about the creative elements; they are just pissed that AI is why prices are going up. They are angry at the AI data centers buying up all the hardware, and at supplies moving to corporations as consumers get cut off.

          • Holytimes@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            4
            arrow-down
            6
            ·
            16 hours ago

            It’s almost as if AI as a tool isn’t the problem. Instead it’s just a bunch of misinformed idiots not understanding the actual problems, and misdirected anger.

            AI as a tool is fine. It’s no f****** different than Photoshop.

            The problem is companies breaking copyright law and stealing information and data to train the models in the first place.

            A model trained off non-stolen artwork and used with intent is perfectly fine.

            It’s not like we go around demanding everyone say they used Photoshop whenever they do, because oh, they could be tricking us and it’s not hand-drawn. No, we just expect digital art to be made with digital tools.

            AI’s problem is one of legal issues, not artistic ones, and people need to get out of their own asses about it at this point. It’s a f****** tool. Any tool used wrong is bad. A tool used correctly with purpose and intent is fine.

    • HarkMahlberg@kbin.earth
      link
      fedilink
      arrow-up
      13
      arrow-down
      7
      ·
      21 hours ago

      I have the same feeling about Kojima’s and Vincke’s latest comments on AI. Am I supposed to get mad at every single person who said they used/plan to use AI for something? I’d be as outraged as the average Fox News viewer, and it would be impossible to be taken seriously. I still won’t be using AI myself (fuck surveillance state AI) and I’d be making every effort to encourage others not to use it, but there’s no point in burning bridges and falling for rage bait.

      They’re creative people who care about the craft and care about the teams in their employ, which gives their statements weight, where some Sony/Microsoft/EA executive making an identical statement has none.

      • brucethemoose@lemmy.world
        link
        fedilink
        English
        arrow-up
        15
        arrow-down
        5
        ·
        edit-2
        19 hours ago

        I understand the principle. Even if E33 is not slop, people should fear a road that leads to dependence on “surveillance state AI” like OpenAI. That’s unacceptable.

        That being said, I think a lot of people don’t realize how commoditized it’s getting. “AI” is not a monoculture, it’s not transcending to replace people, and it’s not limited to corporate APIs. This stuff is racing to the bottom to become a set of dumb tools, and dirt cheap. TBH that’s something that makes a lot of sense for a game studio lead to want.

        And E33 is clearly not part of the “Tech Bro Evangelism” camp. They made a few textures, with a tool.

        • HarkMahlberg@kbin.earth
          link
          fedilink
          arrow-up
          12
          arrow-down
          3
          ·
          19 hours ago

          When I give myself the leeway to think of a less hardline stance on AI, I come back to Joel Haver’s video on his use of EbSynth:

          It lets me create rotoscoped animations alone, which is something I never would have the time or patience for otherwise. Any time technology makes art easier to learn, more accessible, we should applaud it. Art should be in the hands of everyone.

          Now my blood boils like everyone else’s when it comes to being forced to use AI at work, or when I hear the AI Voice on YouTube, or the forced AI updates to Windows and VS Code, but it doesn’t boil for Joel. He has clearly developed an iconic style for his comedy skits, and he puts effort into those skits long before he puts them through an AI rotoscope filter. He chose his tool and he uses it sparingly. The same was apparently true for E33, and I have no reason not to give Kojima and Larian the same benefit of the doubt.

          On the other hand, Joel probably has no idea what I’m talking about when I say “surveillance state AI.” People Make Games has a pretty good video exposing its use case. There’s also…

          • the global and localized environmental impacts of all these data centers,
          • Nvidia and Micron pricing the consumer out of owning their own hardware,
          • aforementioned companies fraudulently inflating an economic bubble,
          • the ease with which larger models can be warped to suit their owners’ fascist agendas (see Grok).

          Creatives may be aware of some, all, or none of those things, which is why it’s important to continue raising awareness of them. AI may be toothpaste that can’t go back in the tube, but that’s also a sunk cost fallacy: you don’t have to brush your teeth with shit-flavored toothpaste.

          • brucethemoose@lemmy.world
            link
            fedilink
            English
            arrow-up
            10
            arrow-down
            2
            ·
            edit-2
            18 hours ago

            Now my blood boils like everyone else’s when it comes to being forced to use AI at work, or when I hear the AI Voice on YouTube, or the forced AI updates to Windows and VS Code

            You don’t hate AI. You hate Big Tech Evangelism. You hate corporate enshittification, AI oligarchs, and the death of the internet being shoved down your throat.

            …I think people get way too focused on the tool, and not on these awful entities wielding it while conning everyone. They’re the responsible party.

            You’re using “AI” as a synonym for OpenAI, basically, but that’s not Joel Haver’s rotoscope filter at all. That’s niche machine learning.


            As for the exponential cost, that’s another con. Sam Altman just wants people to give him money.

            Look up what it takes to train (say) Z Image or GLM 4.6. It’s peanuts, and gets cheaper every month. And eventually everyone will realize this is all a race to the bottom, not the top… but it’s taking a little while :/

            • HarkMahlberg@kbin.earth
              link
              fedilink
              arrow-up
              4
              arrow-down
              2
              ·
              18 hours ago

              True on most fronts except one. On a personal level, I do hate AI lol. The large language model itself. I just don’t think typing out or speaking out a series of instructions is that useful or efficient. If I want a computer to do something for me, I much prefer the more rigid and unnatural syntax and grammar of a programming language. AI tools themselves just don’t produce a result that satisfies me.

              • Holytimes@sh.itjust.works
                link
                fedilink
                English
                arrow-up
                1
                arrow-down
                4
                ·
                edit-2
                16 hours ago

                They don’t produce a result that satisfies you yet. Early programming was also absolute dog s***.

                Give it 20 years and there’s bound to be something new that replaces the current concept of AI - something that does functionally the same thing, just in a manner that actually produces good results.

                Just like we did with everything else computing related.

                Hating a tool is the single stupidest f****** thing anyone can do.

                That, and chat prompt engineering b******* is one tiny, tiny slice of the greater whole. It’s a footnote in the grand scheme of everything the colloquial term AI represents. It’s just the most marketable slice to end users, so it’s the one you see everywhere.

              • brucethemoose@lemmy.world
                link
                fedilink
                English
                arrow-up
                2
                arrow-down
                3
                ·
                edit-2
                18 hours ago

                Again, they’re tools. Some of the most useful applications for LLMs I’ve worked on are never even seen by human eyes, like ranking and ingesting documents and filling out JSON in pipelines. Or as automated testers.

                Another is augmented diffusion. You can do crazy things with depth maps, areas, segmentation, mixed with hand sketching, to “prompt” diffusion models without a single typed word. Or you can use them for touching up something hand-painted, spot by spot.

                You just need to put everything you’ve ever seen with ChatGPT and Copilot and the NotebookLM YouTube spam out of your head. Banging text into a box and “prompt engineering” is not AI. Chat-tuned, decoder-only LLMs are just one tiny slice that a few Tech Bros turned into a pyramid scheme.
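
                For illustration only, a minimal, hypothetical Python sketch of the “fill out JSON in a pipeline” pattern mentioned above, pointed at an OpenAI-compatible endpoint (which most local open-weight servers expose). The URL, model name and schema are placeholders, not anything from the commenter’s actual pipelines.

                    import json
                    from openai import OpenAI  # assumes `pip install openai`

                    # Placeholder endpoint/model: any OpenAI-compatible server (local or hosted) works here.
                    client = OpenAI(base_url="http://localhost:8000/v1", api_key="unused-locally")

                    def extract_fields(document: str) -> dict:
                        """Reduce a free-form document to fixed JSON fields, with no human in the loop."""
                        resp = client.chat.completions.create(
                            model="local-model",  # placeholder name
                            temperature=0,
                            messages=[
                                {"role": "system", "content": 'Reply with JSON only: {"title": "...", "summary": "...", "relevance": 0-10}'},
                                {"role": "user", "content": document},
                            ],
                        )
                        # A real pipeline would validate the output and retry on malformed JSON.
                        return json.loads(resp.choices[0].message.content)

                    # Rank a batch of documents by the model's relevance score; nobody ever reads the raw output.
                    docs = ["first document text...", "second document text..."]
                    ranked = sorted((extract_fields(d) for d in docs), key=lambda r: r["relevance"], reverse=True)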

      • Holytimes@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        7
        ·
        16 hours ago

        Give it another 5 years maybe and local self-trainable models and alternative versions of it will be available that won’t have all the theft problems, surveillance problems and other issues. The tech is new and mainly controlled by giant companies right now.

        It’s not like the tech is going to forever exist in a vacuum in the exact state it’s in; nothing ever does. Makes it doubly silly to get mad over a tool.

    • fonix232@fedia.io
      link
      fedilink
      arrow-up
      6
      arrow-down
      8
      ·
      21 hours ago

      At the end of the day it’s all about the quality in my opinion.

      The entire game could be written by ONE passionate person who is awesome at writing the story and the code, but isn’t good at creating textures and has no money for voice actors - in which case said textures and all the voices would be AI-generated, then hand-retouched to ensure quality. That would still be a good game, because obvious passion went into the creation of it, and AI was used as a tool to fill gaps in the sole developer’s expertise.

      A random software house automating a full-on pipeline that watches trends on TikTok, Facebook, YouTube, etc., and chains together various genAI models to churn out slopware games by the dozen, on the other hand, is indefensible. There’s no passion, there’s no spirit, there’s just greed and abuse of technology.

      Differentiation between the two is super important.

      • brucethemoose@lemmy.world
        link
        fedilink
        English
        arrow-up
        14
        arrow-down
        2
        ·
        edit-2
        20 hours ago

        So is the source.

        If they’re paying a bunch of money to OpenAI for mega text prompt models, they are indeed part of the slop problem. It will also lead to an art “monoculture,” Big Tech dependence, code problems, all sorts of issues.

        Now, if they’re using open-weight models or open-weight APIs, with a lot of augmentations and niche pipelines - say, hand sketches to 3D models - that is different. That’s using tools. That’s giving “AI” the middle finger in a similar way to using the Fediverse, or other open software, instead of Big Tech.
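
        For illustration only, a minimal, hypothetical Python sketch of the open-weight route: loading a local checkpoint with Hugging Face transformers instead of calling a corporate API. The model name is a stand-in; swap in whichever open-weight checkpoint you actually use.

            from transformers import pipeline  # assumes `pip install transformers torch`

            # Placeholder checkpoint; the point is that the weights live on your own machine.
            generator = pipeline("text-generation", model="gpt2")

            out = generator(
                "Flavor text for a rusty iron key found in a flooded cellar:",
                max_new_tokens=40,
            )
            print(out[0]["generated_text"])

        Nothing leaves the machine, and the checkpoint can be swapped or fine-tuned without anyone’s permission, which is the distinction being drawn above.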

        • Holytimes@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          4
          arrow-down
          7
          ·
          16 hours ago

          People claimed Photoshop would cause a monoculture too; if you honestly and genuinely believe that AI will, you’re stupid as f***. There is no way you can look back on the history of computers, art or human innovation and genuinely believe that anything at any point could create an artistic monoculture.

          No, it won’t happen. It physically cannot happen; humans, for the sake of being goddamn stubborn s*** stands, will make counterculture art just for the sake of it.

          The concept of a monoculture is an infeasible, made-up, nonsensical b******* idea. Humans are too diverse in our whims for it to ever happen.

          The only way a monoculture could come about is if everyone but one person died off - and that person also decided to never make any form of artistic expression till the day he died.