AI Could Create A Massive Problem For Valve’s Steam
from the flood-the-zone-with-shit dept
Two trends that I’m very interested in are about to collide and it’s going to be a mess.
By now, some of you will be tired of my calling for a more nuanced discussion about the use of AI and machine learning tools in the video game industry. I get it, but I’m also not going to pretend like I don’t still hold that very same view. AI tools are just that: tools. If the tools are good and used at the behest of the artists in the industry to make better games, that’s a good thing. If they upend artistic intent or simply suck, that’s a bad thing. And on the matter of jobs within the industry, if there is a net reduction in jobs, that’s bad! If AI lowers the barriers of entry for otherwise creative people and the result is even more jobs within the industry spread over more studios and, importantly, more cultural output in the form of games, that’s good!
Except when it’s not. Even if the AI evangelists are right, or even if those of us who see the possibility that AI use will ultimately mean more people in the industry and more games released to the public are right, that can still present very real problems within the industry. And I think a serious one could be looming for storefronts like Steam.
This concern calcified in my head somewhat when I came across indie publisher Mike Rose, known for producing Yes, Your Grace, talking about just what all of this output could mean on Steam specifically.
“From a publisher perspective specifically, it’s mega annoying,” Rose tells GamesRadar+ in an interview, echoing other publishers like Hooded Horse. “If we thought the number of games being launched on Steam was crazy before, now it’s just impossible. During the last Next Fest, it seemed like around 1/3 of the demos had either AI generated key art, and/or AI-generated content. So now we have that to compete with too. Hurray!” Publishing lead John Buckley of Palworld developer Pocketpair called out the same AI trend in the latest Steam Next Fest.
Steam, as a focal point for the more open PC gaming market, is the clearest barometer for the rising quantity of games, with over 20,000 releases fighting for space every year. Even with Valve sticking to AI content disclosures for games listed on Steam, the rise of AI tools will only contribute to the torrent of content flooding the platform as games – or at least AI-made things game-shaped enough to be sold – become easier to produce.
The claim that too many games are being released on Steam certainly isn’t new, nor has it historically had anything to do with artificial intelligence. There have been complaints about this, and about Valve’s apparent lack of interest in playing any real curation role, going back to 2023. Wait, make that 2020. Oh, wait, it actually goes back to 2015.
But while Steam hasn’t yet collapsed under the weight of its own release volume, the through line in all of that criticism has been Valve’s stoic apathy toward helping its customers navigate the flood.
And that could be a very real problem for the platform. Steam’s value to the consumer, besides being the most recognizable outlet for PC gaming, is in its curation capabilities. To date, other than providing some search filters and a few tools to personalize the recommendations it makes for new titles, Steam has mostly left curation up to the customers themselves, or to third-party list-makers. Meanwhile, the process for listing a game on Steam has not changed appreciably in the past several years. It’s still the same $100 entry fee to get your title listed. You still have to jump through all the registration steps with Steamworks, generate an app ID, build the store page, and upload your assets. Then you wait for Valve to do its own review before you can publish your game, but that mostly amounts to ensuring that you’re compliant with Steam policies and that the game can launch successfully, and that’s about it.
With a potential flood of PC games coming, that sure doesn’t feel like enough to keep the platform from becoming an unnavigable wasteland where you can’t tell the gems from the slop. And, barring new rules limiting the degree to which AI can be used in game creation, that tidal wave is coming.
On this point, Rose focuses on “the elephant in the room” here: “It’s probably never going away again.”
“People can now make stuff by telling a bot to make it for them, and you know, the thing is that humans are mega lazy,” he reasons. “I don’t even mean that as an insult! We just are. So for a lot of people, if there’s a choice between ‘spend a bunch of time and money making a cool thing,’ vs ‘type some prompts into a program and the thing is made for me very quickly’ – the average person is going to pick the latter.
And that’s the thing really: Our feelings on it don’t matter. It doesn’t matter that a bunch of us don’t like genAI. It’s gonna get used now, and it’ll get used more and more. As the kids say: Video games are cooked.”
I don’t think that video games are cooked, but his point that AI will be in use in the industry is the one I’ve been making for months now. We have to be talking about how it will be used, not if. That ship has sailed.
And if Steam is still going to be of any value at all to the consumer, Valve better be thinking right damned now how it’s going to get more involved in the curation of what shows up on its platform.
Filed Under: ai, curation, filtering, steam, video games
Companies: valve


Comments on “AI Could Create A Massive Problem For Valve’s Steam”
Compulsory tag
Games with AI generated content should have a compulsory tag indicating as such. Gamers can use that to filter out the slop if they so choose.
I don’t mean the community tags. Whoever decides the community tags for games is stupid; contradictory or ambiguous tags make them useless.
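To illustrate the suggestion above: if every listing carried a compulsory, publisher-verified AI-disclosure flag, client-side filtering would be trivial. This is a hypothetical sketch, not real Steam data; the `ai_generated_content` field, the game titles, and the `filter_ai` helper are all illustrative assumptions, since Steam exposes no such flag in a public API.

```python
# Hypothetical sketch: filtering a game list by a compulsory AI-disclosure flag.
# The field name "ai_generated_content" and the sample data are assumptions.
games = [
    {"title": "Handmade Gem", "ai_generated_content": False},
    {"title": "Prompt Quest", "ai_generated_content": True},
]

def filter_ai(games, allow_ai=False):
    """Keep only games matching the player's AI-content preference."""
    return [g for g in games if g["ai_generated_content"] == allow_ai]

# Default: hide anything flagged as containing AI-generated content.
print([g["title"] for g in filter_ai(games)])  # → ['Handmade Gem']
```

The point is that the filtering itself is the easy part; the hard part, as the comment notes, is getting a disclosure flag that is mandatory and trustworthy rather than crowdsourced and contradictory.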
Well, funny thing about that: Every time I’ve seen a company sail that ship and get caught sailing that ship without telling people they sailed it, even if only by accident, that ship gets set aflame. At least three games (and a fourth if you don’t believe the “we didn’t use AI” stuff for the Tomb Raider skins) have been caught with AI-generated content that was used without disclosure, and each time, the response to that usage was overwhelmingly negative.
Generative AI, even in its most “ethical” version, isn’t something that a lot of people want. Even if a game turns out to be shit, people would prefer it was hand-crafted shit rather than AI slop. And for all the evangelizing about “democratizing creativity”, the fact of the matter is that AI slop will come to be seen as the opposite of what its evangelists want it to be seen as: Where they want AI slop seen as true art and its “creators” seen as artists, everyone with some goddamned sense will see it as the apotheosis of laziness and non-creativity. Oooh, someone picked out a few settings and wrote a prompt for an AI image generator! So fucking what. Bob Ross was more creative with a half-hour’s worth of time and some oil paints, and all he did on his show was paint landscapes.
I understand that there are legitimate uses for what we colloquially refer to as “AI”. But generative AI, especially as a replacement for human-made creative works, is not one of them. Wanting to be an artist is fine; wanting to be one without putting the work in to learn and hone and master a craft is narcissism mixed with arrogance.
Re:
“Wanting to be an artist is fine; wanting to be one without putting the work in to learn and hone and master a craft is narcissism mixed with arrogance.”
How far would you like to carry this standard?
Does it also apply to people who drive a car without doing their own maintenance? Or those who write computer programs without knowing how to change out their motherboards?
If you eat steak, did you butcher the cow?
Re: Re:
Anyone who uses generative AI as a replacement for the process of making a creative work—an illustration, a video, a piece of writing of any kind, a song, a blog entry, whatever—is not an artist in any measurable sense of the term.
People who, say, use samples in their music still take the time to listen to the songs they’re sampling and figure out some new way to twist the sample in question into something new. Someone who generates an AI-created song isn’t doing any of that. They’re just going “here’s what I want a song to be about and the style I want it to be in” to the Emptiness Machine and expecting the perfect result without any real process beyond “push button, receive song”. You won’t find any artistry in works made by generative AI because the artistry of any work lies within the process of creation.
Those other comparisons are all bogus deflections with no direct 1:1 correlation to the act of artistic creation. You can drive a car without knowing a goddamn thing about the engine (though that knowledge would help), you can code an application without knowing anything about how your device works, and you can eat meat without having obtained it from the source yourself. But creating art—as in really creating it, not asking a computer to do all the work for you? That requires knowledge and skill in how to craft that work. Writing a book of any kind both is and isn’t as easy as “put one word in front of another”; if you understand why I say that, you’re already one step closer to being more of an artist than your average AI slop enthusiast.
Anyone who thinks they’re an artist because they generated something through the Emptiness Machine is fooling themselves. Anyone who believes AI “art” is the future of art/media/pop culture is climbing a stairway to Heaven that’s made of cardboard.
Re: Re: Re:
“Writing a book of any kind both is and isn’t as easy as “put one word in front of another”; if you understand why I say that, you’re already one step closer to being more of an artist than your average AI slop enthusiast.”
Interesting that you mention that, because I am, in fact and in real life, an author. I don’t write books, at least not so far; I just do stories and “novelettes”.
I suppose my examples would have been better if I had asked if you need to know how to tune an engine in order to be a race car driver, or how to butcher a cow to be a chef.
I guess I really don’t know enough about how this AI stuff really works, and I’ve never been sufficiently interested in the subject to make any particular effort to understand it.
Yeah, I’m an old coot — so what do you expect? 🙂
But I do find the debate around it interesting and the variance in perspectives is intriguing.
Re: Re: Re:2
Moderately, yes. That said, while I would think a race car driver doesn’t necessarily need to know how to tune an engine, the knowledge is definitely helpful. And a chef doesn’t need to know the ins and outs of butchering a cow to cook the meat that comes from said cow. An artist who is committed to being an artist, on the other hand, should and must know the process of their art so they can justify the decisions of that process. That’s to say nothing of building a base of inspiration and being able to explain how certain works inspired you. An AI slop enthusiast might be able to tell you what styles they chose for a generated AI image and why, but they won’t be able to tell you why the Emptiness Machine generated the image in the way it did.
Re: Re: Re:3 If they don't care about being an artist?
What if they don’t care about being considered an artist, and simply want to cause their vision of a game to exist, with the least amount of cost and work? Will you refuse to drive a car because it was made by machine instead of by hand?
Re: Re: Re:4
I don’t really care if AI slop enthusiasts want to be considered artists. They’re still lazy assholes for using generative AI. If the only way someone can make a game is through generative AI, that game probably won’t be worth playing—just like images, songs, videos, and written works made with generative AI are worth less of my time than it took for an LLM to generate them.
Re: Re: Re:5
Game makers became “lazy” in the 2000s when the first accessible game engines (like Flash) appeared, because “real” game makers were supposed to be craftsmen coding in assembly and making assets a pixel at a time, not dragging and dropping and writing scripts.
But games have since become utterly complex, with a dozen platforms to support, unlimited possibilities for optimization, immersive sound effects, good narrative, natural gameplay, decent accessibility, online gaming, and so on, such that even a simple F2P game can require years of work from a team of experienced devs. Even Candy Crush was made by a dozen people, and GTA 5 took five years and involved around 1,000 devs.
AI is still just a tool; you cannot prompt “make a great game that doesn’t look like anything people have seen before. no mistakes plz” yet. You’ll have slop, crashes, slowdowns, broken gameplay, and unsupported screen sizes, but you may have something to publish in less than a year, and could see if anyone would even dare to play it. Then, if it becomes a great success (which, as with nearly every game, may never happen), rewrite it from scratch with a dedicated team.
Re: Re: Re:6
TBH, yeah, I think it’s going to take the AI market crashing for anyone to make anything actually good with AI, because the newness and hype/criti-hype is attracting hacks like flies to filth.
Re: Re: Re:2
I could quite easily explain to you how I selected and prepared this Costco frozen pizza, and some principles behind the choice of ingredients Costco used, but claiming that makes me a chef would be quite crazy.
But of course, much like the US patent office, lots of people lose their minds as soon as “on a computer” is added to the explanation.
Re: Re: Re: Come on now....
“Anyone who uses generative AI as a replacement for the process of making a creative work—an illustration, a video, a piece of writing of any kind, a song, a blog entry, whatever—is not an artist in any measurable sense of the term.”
I can think of at least 5 different ways to describe an artist’s use of AI to replace a creative process while still being an artist themselves. Let’s not be silly.
If someone is creating a children’s book and is an excellent writer/author but can’t draw for shit, and has AI create illustrations for his book, that fits your criteria. If you would call the excellent writer/author a non-artist, then you would be very silly.
Re: Re: Re:2
Then I’m a silly motherfucker.
Sure, the writer is an artist for the writing they do. But generating images because they, what, can’t afford to pay an actual human illustrator for their time and skill? Nah, poverty isn’t an excuse for AI slop.
Re: Re: Re:3
“Sure, the writer is an artist for the writing they do. But generating images because they, what, can’t afford to pay an actual human illustrator for their time and skill? Nah, poverty isn’t an excuse for AI slop.”
What an incredibly privileged take. You’re really going to suggest that there aren’t great stories, and art, and music out there in the minds of people who cannot afford the component pieces we both are talking about to get that cultural output? I imagine we’re missing out on a treasure trove of creativity and culture, and now we’re staring into the possibility of unlocking all of that for all of humanity… and you want to lock it behind a paywall for reasons I still haven’t really heard articulated?
Re: Re: Re:4
I’m suggesting that such ideas exist in those minds, but creating them via AI slop doesn’t make them art.
How many artists will stop creating because they won’t be able to compete with the speed at which AI slop can be produced and shared? How many artists will stop creating/sharing their work because they don’t want their efforts used to train the AI models that hardcore AI bros think should replace said artists in the name of “democratizing creativity”?
Poverty is not an excuse for AI slop. I can’t afford to commission artists for the purpose of turning an image in my head into an actual image (physical or digital), but I still don’t turn to generative AI. I’ve used the Emptiness Machine before and I don’t call it that only because I like that Linkin Park song. AI slop has no artistic soul; on a long enough timeline, it would take any soul out of my ideas as well. And that’s before we get to all the other finger-curling-on-the-monkey’s-paw consequences of using generative AI.
Re: Re: Re:5
This sounds quite a bit like the RIAA’s “how many musicians will stop producing music because it can be shared for free online.” It was wrong then, it’s wrong now.
Re: Re: Re:6 Apologies, but tl;dr replies are kind of how I roll.
I don’t bring up my ties to the furry fandom here all that much because…well, why would I? But I’m gonna do it here and now because I feel like it will be instructive of my position on genAI in a number of ways. This consists of two different examples, both lengthy, so stick with me.
Back when I first got into the fandom, I found plenty of artists whose work I liked for the content as much as I admired it for the style. One in particular (whom I won’t name out of respect, though oldhead furries like me might know who I’m referencing) had a rather fun toon-like style. They’re still around and still posting, thankfully—but many, many years ago, they stopped posting after at least two artists did their best to imitate their style without really adding any kind of stylistic flair of their own. It’s not that the artist in question stopped drawing; it’s that they stopped posting. You can think “oh well that’s just silly”, and sure, maybe it is. But the fandom lost out on years of that artist’s potential contributions because someone else tried imitating them and they didn’t like it.
That’s what I find frightening about genAI imagery: With the right set of models, anyone could replicate just about any artist with a recognizable style if they have enough images on which a model could be trained. At that point, why should anyone go ask the artist themselves for an image when an AI image generator can make the image in the desired style at a much quicker pace than the artist could?
And before you or anyone else says “well that just means human-produced work will be a premium purchase” or something like that: It kind of always has been. People give away works, yes. People produce works with no thought towards monetizing them. But some people do that because they don’t necessarily need the money. The “starving artist” trope/stereotype exists for a reason, and it’s because art—for all the good it does our souls in this world—doesn’t tend to pay the bills if you’re not treating it like a business. Lots of furries do commissions because they live in 2026 CE instead of a year where some rich asshole would pay an artist to work year-round on their art. And now they have to compete with genAI, which can generate numerous images in a fraction of the time an artist would need to create merely one. “Democratizing creativity” might seem like a noble goal, but if you really want to help artists, universal basic income would be a better idea than genAI.
Now I said there were two examples. This is the second: Some of the artists I found when I first got into the fandom are still around today—which means that, like me, they’re oldheads who are at least in their 40s, if not their 50s. With any luck, they’ll be around for another decade or two to give us more art. But I recognize that life is what happens when we’re busy making plans, and an idle Tuesday can turn into the last day of one’s life out of nowhere. If one of the oldhead artists I follow happens to drop dead or otherwise retire from making art, that means I’ll never again see a new image from them. Their art, their style, their tastes in content—all gone in the blink of an eye. That will absolutely suck.
And yet, because those artists are oldheads who still have significant followings, their art is widely available. An AI model could easily be trained on the totality of their work and made to replicate their style and their original characters. Once that’s done, that model can be plugged into an AI generator and made to create images in that artist’s style using characters that artist made. I could, theoretically, create works in that artist’s style forever.
But it wouldn’t be their art, would it? Oh, it would look like their art. It might even fool a moron in a hurry if there are no obvious flaws like fucked-up watermarks or extra limbs. But I would know that it isn’t their art, made their way, because they’re dead. And all I’d be doing is having a computer imitate that artist to an absurdly frightening degree because I couldn’t stand the fact that said artist is gone. When I talk about AI-generated “art” having no “soul”, this is, in part, what I’m talking about: It’s all imitation without meaning or process or anything that makes an artist do what they do. There’s no compulsion for a genAI model to make art in a given style; it’s an order to a mindless machine that the machine tries to follow based only on its programming. A dead artist can’t produce any more; an AI model can produce, and produce in that artist’s style, so long as there’s a computer powerful enough to run it. And when we use genAI to “resurrect” the art of a person who is no longer with us, it feels like old depictions of zombies as people raised from the dead to work as slaves. (The same could be said of holograms of dead musicians trotted out to “perform” their songs.)
And while this isn’t necessarily connected to the fandom, I feel like I have to say this because it’s a deeply personal admission. I used genAI in the past to create images—and I did it for longer than, say, a cutesy one-hour experiment to see what genAI is capable of. I’m talking weeks. Hundreds of images. And when my time experimenting was over, I looked back over the images I’d saved and felt…pretty much nothing. Oh, I genned some images that depicted ideas in my head in a way that was satisfying, but…well, to quote the most downvoted post on Reddit, there was no “sense of pride and accomplishment” to it. I didn’t make any of that “art” through toil and ache and learning and experimentation—a machine made it because I told it to. It made me feel empty inside to know that I had these ideas, but the only way I could make them was with genAI. I deleted most, if not all, of those images. (My folders are in such disarray that there could be a handful of them still in there somewhere.)
The eponymous “Emptiness Machine” from the Linkin Park song is a metaphor for empty promises and false hope that lead to disillusionment. That makes it a perfect phrase to use for genAI: For all its promises of “democratizing creativity”, it does more to destroy the creative spark within people because the process is the art is the process. When you can get what you want all the time, you end up getting tired of getting it all the time. It makes you feel empty inside. And again, that’s all before we get to the very real—and on Techdirt as of late, often overlooked and understated—consequences of genAI that stretch beyond that soul-crushing emptiness.
There’s an episode of the classic Twilight Zone series called “A Nice Place to Visit”. In it, a middling thief is shot and killed, and he wakes up in the afterlife. A strange man greets the thief and gives him money and food, and the thief eventually figures out that he’s dead. When the strange man tells the thief he can have anything he wants, the thief figures that he’s in Heaven. A month later, after having everything he’s ever wanted—and only that—the thief begs the strange man to send him to Hell…only for the strange man to reveal that the thief was already there. After my own experiences, I think about genAI in a similar way to that Twilight Zone episode: The promises sound so heavenly, but the reality is much more hellish than one can imagine.
Re: Re: Re:7
If someone were to make a Venn diagram of artists, writers, musicians, and coders, a large portion of the people slap bang in the middle would be members of the furry fandom, and despite all the claims from AI enthusiasts that there’s a silent majority who actually love AI, the bulk of the fandom openly despises it.
The people who stand to lose the most if AI companies can continue taking whatever they can find won’t be the big copyright holders, as Mike keeps insisting; it will be individuals who need to produce commissions to pay for food, medication, or a roof over their heads. And sadly, I’m not sure any AI enthusiast cares about the human cost of their shiny toy… If they did, they would be suggesting solutions instead of just whatabouting or comparing people to copyright maximalists.
Re: Re: Re:4
…I’m gonna play devil’s advocate for the moment and say that the issue is if they’re just using it for function rather than the love of the art, it’s going to suck ass.
Like, there are people I’ve seen who do good work with those tools to get around their physical illnesses, but to them it’s a form they focus on not a means to an end, because the latter produces art that doesn’t get anyone interested anyway.
The concern I think is valid is that these sorts of small jobs are the increasingly vanishingly rare entry-level work for illustrators who’ve already been squeezed thin as an industry, but I think the solution is to hit at the core problem rather than using AI as a proxy war.
Because again, entry-level jobs were already being screwed over well before AI, and I think that it’s vital we fight back against that, god I wish that people in the US actually gave a crap about fighting to expand our arts programs…
Re: Re: Re:2
I’m going to put this a different way, Tim:
AI is a tool, but I don’t see any value in what it creates. If the art assets, the story, the music, etc. are generated by AI, that, to me, equates to “you took a shortcut. Charge less for it.”
I’m willing to pay the artist who learned how to draw, the animator who learned how to animate. I’m not willing to pay the one who got AI to do it for them, because I could just go do that myself in an afternoon.
If the artist, the animator, the writer use AI to learn how to do it themselves, that’s a different story. In the end, they will be the ones who have made it. If the writer uses AI to help edit what they’ve written, that’s also a different story – though a human editor is still probably a better choice.
I would be fine with buying those (ignoring other difficulties AI has beyond the scope of this particular discussion). But if they just used the AI output? Nah. It’s not worth my time or my money.
It’s something I don’t see addressed in a lot of these: why is the stuff generated by AI worth my time and money?
Note: Our current AI as an analytics tool, as a summarizer, as a collator of large amounts of information so that the human can better understand what’s going on? I have no issues with that.
I saw a behind-the-scenes version of a music video that used AI-generated images for storyboarding purposes in the planning phase – importantly, the end version had nothing generated by AI in it. The man making it used it as a planning aid and nothing more (the final versions of the characters looked nothing like the storyboards). He was also not someone who would have hired anybody to do that work in the first place. This is another use I have no issue with.
I just don’t see a reason to pay someone for something they didn’t actually make.
Re: Re:
You really like trying to get people to argue about something else besides the actual topic of discussion, don’tcha Frank?
Re: Re:
I love false equivalence nonsense. You must be one of the people who like $nuanced discussions.
$(For values of “nuance” which involve only the nuance i which we are invested, but not the other 99% of the elephant-in-the-nuance which we will oh-so-studiously ignore.)
Re: Re:
That’s not a sensible comparison, even by your standards. Actual analogy: Being a delivery driver without knowing how to drive, just trusting autopilot.
Which all translates to Steam becoming for most people what it already is for me: a place to buy the games I already know I want from other people’s recommendations or other sources outside Steam. I don’t go browsing Steam looking for games. I go elsewhere to people I know or to review and recommendation sites I know and find games I’m interested in there. Steam is the cashier, nothing more.
That might stem the flood of cheaply-made crap games, but I’m not optimistic. The cost of making and publishing them is so low it only takes a few suckers to make it profitable, and there’s always new suckers coming along who don’t know how to navigate the bog.
Re:
Unfortunately so many people browse Steam for whatever reason and buy this shit. For the lols, or just to see, or because “Let’s browse Steam games and livestream the experience”.
Software used to be the one industry you could be almost completely self-reliant
The barrier is already low. You can download thousands of dollar’s worth of software like Krita, Blender, and Godot for free and start making your game yourself right now without having to rely on some third party SaaS code generator that can arbitrarily throttle or even revoke your access:
https://x.com/patomolina/status/2045281665363386504
I don't understand the hate
I don’t understand the hate for “ai generated” stuff.
Maybe it’s because I don’t play games and am not really exposed to that culture.
But I’ve seen some AI generated pictures that look pretty darn impressive and the fact that they are “AI” doesn’t make them less interesting or amusing than they would be if they were hand drawn.
On computers, even “hand drawn” isn’t; art programs all allow you to draw perfect lines and circles and angles of all sorts. Computer art is a long way from a pencil on paper or paint on canvas, regardless of whether AI is involved or not.
So here’s a genuine question:
Why all the hate for AI? If something is good, it’s good, and the tool used to make it, whether it’s a paintbrush or a chainsaw, seems like it shouldn’t be relevant to the worthiness of the end result.
Re:
Because the source material they use is all appropriated from actual people without their permission, then used to generate profits for the AI companies without giving those people any of it. Without the work the models were trained on there wouldn’t be any profits for the AI companies, but they look at it as their right to use everyone’s work without permission and without paying for it.
On top of that, there’s the prodigious environmental and economic cost of those models. They require data centers so big they can’t be used for anything else, consume so much power that they destabilize the electric grid in the area and drive prices sky-high for everyone else, and consume memory (both DRAM and SSDs) in quantities that leave nothing for the rest of us. Literally: there will be no DDR5 or SSDs available for consumers this year, because the AI companies have bought up 100% of the production for the entire year.
Now, is some nice imagery worth all that?
Re: Re:
Is some nice imagery worth all that?
It seems that a lot of people believe that the answer to your question is yes it is.
Folks here and on many other websites have spent years decrying copyright overreach and excessive terms and restrictive contracts and on and on.
But now that there’s an industry using “creative data” (I can’t think of a better term) in a wholesale manner, those same people are screaming about “artist’s rights”.
Without taking a position on the right or wrong of this issue, I wonder if this kind of thing is the future of creative endeavor. For good or for ill, it certainly appears to have democratized the creation of a lot of artistic stuff, or at least a lot of stuff that looks pretty darn artistic to me.
Are folks who rail against AI in the same position as groups of monks after the printing press suddenly made books available to the masses?
I genuinely don’t know the answer, and I suspect it’s because I simply don’t know enough about AI.
Again, I’ve looked at a few pictures and thought they were pretty nifty, but other than that my exposure to the whole thing has been pretty much limited to what I read on websites like this one.
Re: Re: Re:
Because most of the “copyright overreach” isn’t on the part of the creators, it’s on the part of the middlemen (publishers and distributors and the like) who take the lion’s share of the profits and give the actual creators a pittance. There’s no contradiction in being in favor of creators’ rights and simultaneously opposing copyright overreach.
Re: Re: Re:2
My problem with that argument is, most of the copyright “reforms” people are suggesting are for the benefit of the corpos rather than the artists.
Like, the copyright alliance people were cheering on the anti-Internet-Archive lawsuit, especially ex-RIAA ghoul Neil Turkewitz, who has overtly stated that the anti-IA lawsuits are a “dry run” for anti-AI lawsuits.
Now note how the Wayback Machine is being screwed over by news sites over fears of scraping “rights” and you can begin to connect the dots. It’s astroturf for big copyright, and everyone is falling for it because they don’t think the leopards will eat their faces.
The law is a blunt instrument. It doesn’t care if you hate both copyright overreach and AI; it does what it is designed to do, and I see where that design is going and it scares me!
Re: Re: Re:
Generative AI does nothing that is even remotely comparable to the printing press making written works cheaper to produce and widely available to the masses.
You can keep coming up with those bogus comparisons, but you won’t find many people here willing to treat them seriously. They’re all an attempt to deflect from the numerous already-mentioned problems with generative AI, not the least of which is how “democratizing creativity” means worsening the culture divide by way of making a shared culture that much harder to achieve. Imagine if anyone could generate an entire feature-length Avengers movie for their specific tastes. Why, then, would anyone watch the ones produced by Marvel/Disney? And how would we be able to talk about specific Avengers movies when there would be so many different ones that we’d have no frame of reference for what movie any two people would be talking about? For that matter, imagine if someone makes an Avengers movie with a sex scene in it and chooses to “use” the same actors as the MCU films—would the use of the likenesses of, say, Scarlett Johansson and Robert Downey Jr. be morally and ethically acceptable even (and especially) if they never gave their approval to have their likeness used that way?
We can’t put the genie back in the bottle, that much is true. But generative AI is less a genie and more a monkey’s paw: You get what you wish for, but it comes with a cost you probably won’t like.
Re: Re: Re:2
“would the use of the likenesses of, say, Scarlett Johansson and Robert Downey Jr. be morally and ethically acceptable even (and especially) if they never gave their approval to have their likeness used that way?”
What is your position on Elvis impersonators, and “tribute bands” in general?
Re: Re: Re:3
My snarky-yet-serious position is that you’re trying to deflect from the direct yes-or-no question of “is it morally and ethically acceptable to use someone’s likeness without their permission in a way that said someone would never have approved”. But if I must: I generally have no issue with those things because they still require a modicum of talent and there’s no mistaking the tributes for the real thing. (Especially with Elvis, since he’s dead.) With generative AI, it’s entirely possible to create something that looks or sounds like “the real deal” (e.g., a celebrity deepfake) and pass it off as genuine to, say, a dumbass in a hurry.
Re: Re: Re: Let me just pick up on one thing here...
AI is not democratising art. This stuff costs more money to create than doing stuff manually. Any half-decent AI output has gone through multiple layers of prompts, and/or is being driven by someone who has spent a lot of time refining their prompts, and that means signing up to a paid service.
At a rate that excludes large parts of the world and people on reduced incomes.
A pencil and paper, a cheap second-hand guitar, hell, even a tablet with a word-processing programme on it are all cheaper than a year’s subscription to these tools.
Re: Re:
And after all that, they don’t really work that well for most of the use cases their advocates are presenting. I have to work with GitHub Copilot at my job, and at the best of times it’s like mentoring a junior engineer: I have to hand-hold it through the entire process, explaining what it got wrong and what it needs to do to correct it at every step. As with mentoring another engineer, I could’ve just done the work myself faster. Unlike with that junior engineer, I don’t get a more experienced and better engineer out of it. Copilot never improves, and I take a hit to my productivity because of it. At its worst, it blows up so badly I have to just abandon the attempt (which at least saves time, which is a sad commentary on it).
Re:
“I don’t understand why people hate it when eateries fail to provide tasty food, maybe it’s because I don’t eat out”
Re:
Let me put it this way:
Do you think the message the Luddites were trying to communicate by smashing looms was “We fucking hate looms”?
Re: Re:
I’d vote this for comment of the year if possible.
Re:
People literally draw/paint with a stylus on computers, and have done for ages. It’s hella harder with a mouse, so credit to those who suffer through it. And understanding how to really use the tools is as complex as mixing paints and choosing host media, etc.
Sure, some people will do graphic arts or childish stuff by letting the program give them a perfect circle and bucket-filling it. Not very good art.
Re:
Put simply: AI-generated art was not drawn by the person who input the prompt. Why should I pay someone for art that they didn’t draw?
“they don’t really work that well for most of the use cases their advocates are presenting.”
If that is truly the case then the whole thing will blow over and become irrelevant in a short period of time.
If a tool manufacturer was producing wrenches that broke every second time you tried to loosen a bolt, people would soon start buying their wrenches elsewhere.
Same thing here. If you’re using the wrong tool or an inferior tool for your job, then you’re in the position of the guy who went to the doctor and said, “It hurts when I do this.”
Answer: Don’t do that.
Re:
It’s hard to avoid doing the thing when every C-suite asshole with a few hundred million dollars to spare is shoehorning AI, generative or otherwise, into their businesses regardless of whether it’s needed. When the boss tells you “use this or lose your job”, chances are good that you’ll use the AI instead of risking your paycheck.
Re: Re:
“When the boss tells you “use this or lose your job”, chances are good that you’ll use the AI instead of risking your paycheck.”
Yeah, that’s the same position as anyone who gets told to use a particular widget of any kind to do his job, not just AI-assisted stuff.
If said widget is not an effective tool, either the directive will change or the company will evolve somehow to account for that inefficiency (or not).
Re: Re: Re:
In the meantime, the tool will be used regardless, and the damage it could do will be done. Not exactly a great outcome there.
Re: Re:
I think society, just in general, needs to be rescued from “C-suite assholes”. So many of our problems can be traced back to this cause.
This is exactly the sort of thing people were getting worked up over before and got brushed off. Glad we did get there eventually, though
The problem isn’t really so much for the platform as for all the games that struggle to get noticed in an environment where search is broken, tbh. Realistically, Steam will be fine, just as it is now despite search being functionally useless. The lock-in from convenience and its front page still serving the majority of popular games is enough to keep it going.
Y’know, if you phrased it like he did, instead of as a provocative sales pitch, it’d save a lot of grief and make a nuanced conversation a lot easier.
AI are tools of a different kind.
AI are indeed tools. Historically, tools have been used to eliminate jobs. There are no exceptions to this. Long before the tool is able to make artists’ lives easier, employers start chopping headcount. The game industry is already plagued by massive and frequent layoffs.
No one is talking about how Steam already saw an absolute flood of ultra-low-quality content when they started allowing porn. And no, I’m not saying all porn on Steam is low quality. When I say “low quality content” I’m talking about the barely-a-game software products where you uncover an erotic image, a genre that probably now makes up double-digit percentages of all releases on Steam; the same game copy-pasted over and over again with ever-so-slightly tweaked visuals and a new nude.
What I’m saying is the flood has already come, and people are navigating it. It’s already a problem, but it’s being dealt with. Steam isn’t becoming the Kindle Store of vidya, it already is. Shovelware has been a part of the gaming industry since well before the internet and we haven’t been doomed yet.
Also, the guys behind Palworld complaining about shovelware on Steam is rich. Just because your shovelware became a meme doesn’t make it quality.
Re:
Yes, but there’s a “don’t show me porn” toggle.
Re: Re:
Not a solution. That function is for religious weirdos; it will filter out a lot of games just for nudity (Cyberpunk 2077 doesn’t belong in the same bucket as “Super Boobie Clicker”).
Steam games
How many novels are published every month? Some good, some less so, but publishing is an easy market to enter, with editors and publishers curating and recommending and so on. Some of it will be slop, and some will be AI slop, just like games, so Steam needs to think less like a game shop and more like a bookshop.
very naive
Distilling AI use to “just another tool” is amazingly, reductively naive. The entire tech is built on theft, exploitation, and abuse, and every corporation involved in forcing AI into everything everywhere is deeply anti-human, anti-social, and at their core, fascist.
So, exactly like the wider Internet. Or Amazon, where a search will often produce a page of non-matching shit (“Sponsored”), maybe followed by some matches mixed in with shit, then a long tail of shit (if the search works at all; perhaps a quarter of the Amazon-search links that show up on DuckDuckGo just give a “something went wrong on our end” error).
But long before the latest wave of stuff being falsely advertised as A.I., web searches were already like that for 15-20 years, producing garbage pages that have matching keywords—maybe just dumped with no pretense of prose, or maybe as Markov-generated slop. Ever download a PDF “product manual” that’s just a link back to the same bullshit site it came from?
I don’t see why any particular aggregation site should be held out as having a specific problem. Make it easy for people to get refunds, and the operator will quickly get a list of potential garbage to be inspected. Not that such sites even really need to do “curation”; the lack of it doesn’t seem to be harming Amazon much, even if you do see people writing “I’m wary of counterfeit products” frequently on message boards.
Oh hey, the regurgitation engines drowning anything decent with their endless sea of garbage finally entered an arena you care about?
You’re about a year behind the curve here, arguably more. Better late than never, I guess.
The tools aren’t good.
You said if the tools are good. They’re not. “If false, then Santa exists” returns true, you know.
Re:
Claiming the tools aren’t good these days immediately outs you as a naive fool. What is true is the tools are not as good as the hype from the companies. And they’re not going to get rid of the need for people. But in competent hands, they’re incredibly powerful.
Anyone who says they’re not good clearly either has not used the most modern versions… or has no idea how to use them well.
I’m going to assume you most likely tried older tools and have not experimented with the more modern ones. But claiming that good AI is like Santa Claus is laughably wrong.
Re: Re:
Yeah? That’s funny given Bsky’s recent, well-publicized self-DoS.
The tools produce unmanageable amounts of stuff, which cannot be properly checked or managed as a result, and there’s plenty of errors still hidden in them. Lots of times, when I see a regurgitation engine try to give me info on something I have domain-specific knowledge of, I can spot multiple errors. It is the Elon-Musk-talks-about-a-subject-you-know-something-about thing; I have no reason to think they’re any more accurate in subjects I am not versed in.
They create an illusion of productivity (but actually cost more human time), are eating insane amounts of hardware, actively atrophy the abilities of the people who use ’em, and are going to citogenesis their own training data into oblivion as the ouroboros eats itself.
Re: Re: Re:
You seem particularly confidently ignorant.
It’s fine. You don’t have to like the tools, but each time you reveal your ignorance doesn’t make you look cool or savvy. It just is “old man yells at cloud.”
As I said, I agree that there’s a ton of hype. They’re not as useful as the hype screams. And they’d be terrible at replacing people. But they are incredibly useful in the right hands.
Clearly, not yours.
Re: Re: Re:2
Yeah?
Bluesky says they managed to screw up and DoS themselves via logging.
Bluesky, you included, are ride-or-die for vibecoding.
I don’t think these are unrelated.
Re: Re: Re:3
Again, you are an extraordinarily overconfident fool.
You should maybe stop being so confidently wrong about shit you don’t understand.
Re: Re: Re:4
Not for nothing, Mike, but… Bluesky did DoS itself via fucked-up code, and at least one staff member has admitted to using AI-generated code.
Re: Re: Re:5 Not one staff member, the whole platform.
‘Why are you blaming AI? Bluesky being down had nothing to do with AI, you people know nothing!’
Meanwhile…
https://bsky.app/profile/jay.bsky.team/post/3micqcyeawc2g
‘Jay 🦋
@jay.bsky.team
Bluesky is made with AI, the engineers and even some non-engineers use Claude code.’
Gee, wonder why people are skeptical about claims that AI had nothing to do with Bluesky’s issues, which came from a coding error?
Re: Re: Re:6
It’s hilarious how the people who thought Bluesky fucked itself up with AI-generated code and the people who thought Bluesky got fucked up from a DoS (distributed or otherwise) were both right.
Re: Re: Re:7
There has been zero evidence that Bluesky had any problems from AI-generated code. Stephen, I didn’t think you’d stoop so low as to make up shit.
Stop it.
Re: Re: Re:5
It is flat out wrong to claim that AI code is what caused the DDoS.
What Bluesky has said is that their engineers used tools like Claude Code, but that all code still goes through their standard testing and review. It’s not like they’re just telling Claude what to build and rushing it into production.
It’s incredible what kinds of conspiracy theories people come up with just because they hate these tools they don’t understand.
Re: Re:
If thinking the slopengines produce techdebt and shit code is being a naive idiot, what would you call that BlueSky dev who chucked their phone in a pool to cool it? Or the people who decided that brain was worth paying six figs?
If you think the tools are so great, maybe replace them with the machine.
Re: Re: Re:
Those are words. They don’t seem to form sentences that mean anything to me. But they are words.
You seem particularly susceptible to weird conspiracy theories. Maybe work on that?
Re: Re: Re:2
All I’m saying is, personally, I’d be careful about giving Why any more phones.
Re: Re:
‘Santa Clause’, you say?
Re: Re: Re:
OMG MIKE MADE A TYPO!!!!!!!!
Your big win of the day.
Re: Re: Re:2
I think it’s ironic in the context of bigging up the power of ‘AI’, yes. Particularly given the ‘typos’ that’ve been creeping into articles and headlines on this site.
Re: Re: Re:3
This site has always had typos. We’re human. I’m not sure wtf that has to do with AI? If anything it shows the opposite. If we over relied on AI there wouldn’t be those typos.
Re: Re:
Good AI currently does not exist in my field, and the evidence strongly suggests it’s never going to exist. Hallucinations have been getting worse over time, not better.
You just can’t use a tool that might hallucinate in a field that relies on facts and case history. My field is a sub-type of law, and I have seen enough hot-shot lawyers, people who are about my age but haven’t been in this sub-category as long and are more used to general law, get in trouble by relying on AI and get shot down for it.
Frankly, the tools are toxic and touching them seems like a fast-track to getting censured or disbarred.
The tools aren’t good.
Re: Re: Re:
Entirely possible. I wouldn’t use AI in law at all. It does seem quite a silly place to use it.
But I am telling you that in many other fields it is very useful as a tool. Where it tends to fall down is when it’s used as a replacement for humans. But as a power assist, there are many industries where it is quite useful.
You insisted that because it was not useful to you, it could not possibly be useful to anyone. To me, that’s a sign of a very dumb person who cannot think beyond his or her own experiences. I can tell you, quite clearly, that as a tool it is incredibly useful in many fields, but again, as an assistive tool, generally where the hallucinations are not a problem.
The issue it seems, is that you have a very narrow (and quite limited and ignorant) view of what kinds of tools there are today. And you stupidly assume that your narrow view is representative of the whole.
You’re wrong.
Re: Re: Re:2
The hallucinations make it clearly useless to anyone in coding: running code that was spit out by a “being wrong” machine is a recipe for disaster.
What that leaves is, essentially, Lorem Ipsum generation: something to make background stuff that’s not intended to be inspected very carefully.
I have no issue with AI as a Lorem Ipsum generator, but that’s simply not how people are using it, and given how much it seems to cost, that also doesn’t seem like a cost-effective use for it.
Normally these tasks are done by interns whose pay isn’t great, run through a bio-computer about the size of a bowl of pudding and using fewer calories than an incandescent light bulb.
Re: Re: Re:3
Which is why no serious company would simply run code that was created that way.
This is the part you seem to miss. You idiotically assume that everyone using these tools just has it “spit out” things and then does nothing but run them. That’s not how it works.
I’ve built multiple tools that I use every day with AI, but they go through extensive testing and review first. Sometimes there are mistakes, but there are mistakes with humans as well.
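For what it’s worth, the “extensive testing and review” gate described above is easy to sketch. This is a hypothetical illustration, not anyone’s actual setup: `slugify()` stands in for any AI-drafted helper, and the review gate is simply a table of cases the draft must pass before it ships.

```python
# A minimal sketch of gating machine-drafted code behind tests.
# slugify() is a stand-in for any AI-drafted helper; the review
# gate below is the human-authored check it must pass.

import re

def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens (the drafted helper)."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

def review_gate() -> bool:
    """Run before merging: every case must pass, or the draft goes back."""
    cases = {
        "Hello, World!": "hello-world",
        "  AI  Tools 2024 ": "ai-tools-2024",
        "": "",
    }
    return all(slugify(src) == want for src, want in cases.items())
```

If the draft fails any case, it goes back for another round of prompting or gets rewritten by hand; either way, nothing lands without passing the gate.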
The idea that they are “being wrong” machines is so incredibly ignorant. They are mostly pretty good at certain tasks, and very good at coding tasks, especially when directed by someone who knows what they’re doing and simply asks them to cover the drudge work, which will then be checked and reviewed by an expert.
I think the exact opposite is true. They are quite useful in working on projects that involve a lot of drudge work, which a skilled practitioner can then review later and check thoroughly.
Or, in my case, I’ve now built half a dozen tools that I use for various projects all the time, where they are not for anyone else to use, and when they are “wrong” (rarely, if at all), the problems are contained and only create a slight time sink for me to figure out what went wrong.
It’s such a weird experience to hear from people insisting that tech I use literally every day and which has enabled me to work on amazing projects successfully is somehow “useless.”
Again, I don’t deny that the companies oversell the tech. And there are many areas I would never use them in. But you are taking a single example of using a tool for a stupid purpose and insisting that it only does wrong stuff and is therefore useless.
You’re wrong.
Re: Re: Re:2
“If I call people an idiot often enough, I’ll look convincing” is not actually going to work.
Re: Re: Re:3
Don’t want to be called an idiot? Stop saying idiotic things.
Re: Re: Re:4
… wow, without any irony at all 😂
Re: Re: Re:2 The Masnick doth protest too much, methinks
Nothing gets Mike into the comments like a chance to get real defensive against anti-AI sentiments.
You got some investments in the tech you need to pay off and feel jittery about it, or do you just need to defend your subscription costs to yourself?
Re: Re: Re:3
Honestly, it’s the thing that people here are so focused on that I least understand, which is why I get involved in the comments. Techdirt has always been pro-innovation, but against bad tech companies. This is no different.
What I simply do not understand, and why I jump into the comments on this subject, is the people who are so extreme as to insist there is no possible useful version of the tech, which is just simply wrong. I know so many people who are using it in a way that is helpful and valuable.
I’m fine with criticism of the companies (most of which suck) and with the hype. But people taking the hype to the level of claiming that the tech is all garbage are so out of touch with reality, and I don’t understand why.
Neither. I don’t have spare money to invest in shit. My only “subscription” cost is the $20/month I pay to Anthropic for Claude Code, and if that turned out to be useless, I’d stop paying and get on with my life. But that $20/month to Anthropic has already saved me over $50/month in subscriptions to other services that I’ve canceled (my previous task management tool, a calendaring tool, a video conferencing tool, and a note-taking tool).
So, so far, I’ve saved money via AI tools, and I’m working to replace some other services as well, so it should end up saving me even more.
Re: Re: Re:4
Personally, I’m all for the “drudge-work reduction” aspects of AI tools – for example, analyzing thousands of support cases to find trends and provide useful insights, which would take me days to do myself. I do have serious concerns about the energy requirements of the data centers, and I have absolutely zero faith in the AI companies themselves not to destroy entire communities or environments in their pursuit of more processing power. The background costs of the tools are potentially so high and devastating that I doubt even the useful parts are worth them.
What I don’t see any value in are the uses that generate “art.” Images, movies, plots, novels, even music – it’s in these realms that I see very little of value; or rather, nothing worth paying for.
I have respect for the time and effort it takes to draw a picture, to write a story, to animate a scene, to compose a song or tune. I could do these things, but I don’t have the drive or desire to become skilled in these arenas. Even if I were, I would take heart in the connection with the creator of those other pieces that experiencing them brings.
When you replace that with AI-generated stuff – I no longer have any connection. The human behind it becomes a supervisor – offloading the actual creative endeavors to a thing that does not actually think, but instead predicts what is most likely to look like what the prompt asked for. They then pick what they like most out of the results, and string it together into something hopefully coherent – and why should I engage with that?
For me, it ends up in the same bucket as the “maximize profit” products coming out of the big name gaming companies – the “games” that seek to extract as much money for as little effort as they possibly can. ‘Gacha’ games are a hard pass for this reason.
These uses of AI produce, to my mind, nothing of value – so the cost for these uses drives their value into the negative. So for games that used AI to create their assets – I don’t see a reason to buy them. And companies and devs at large have done a very good job of destroying trust with their uses of AI so far, and even with their actions predating AI, so with such a short supply of trust in play, I find any use of AI in this market to be suspect.
I can’t trust the people using or creating the AI. I can’t trust what they say about how they’re using it. So, in my view, I’m better served avoiding anything that ends up using it.
Re: Re: Re:4
My hypothesis: they’re terrified it will be used to deny them the ability to work the way they’re accustomed to, because that way will be deemed “obsolete,” and mass adoption will be used to justify starving their preferred way of working of resources. Refusing to concede any use is a defense mechanism: take away the weapon that would destroy how they work.
Like how the mass-adoption of CGI pushed out practical effects, or the way the internet got gentrified by things like mass-adoption of smartphones; the death of Flash; the move away from independent forums to mega-sites and the way said mega-sites penalized external links in the name of “spam prevention,” the way it got way worse and more hostile to be an artist online after everyone moved from Tumblr to Twitter, or the way those of us who hate non-chronological feeds and infinite scroll had them shoved down our throats in a way where we can’t opt out.
There’s been one forced disruption after another (hell, algorithmic feeds that prioritize speed over quality are practically designed to push out craft for slop), and it’s been sold as an inevitable result of the popular will and increased efficiency, so it makes perfect sense people would try to fight back with “Actually, it’s not something people wanted,” yanno?
Like, I agree with you there’s a lot of uses for them, but a lot of anti-anti-AI stuff is tone-deaf on that, and I feel like that’s what needs to be addressed.
Even the implorations that we need a labor response rather than a copyright one, while I agree with the principle, feel tone-deaf, because the average trans 20-something barely scraping by on commission money unable to work other jobs due to chronic pain (A very common demographic I see in anti-AI-imagegen circles) doesn’t feel like labor works for them; it works for people who have Real Jobs, and I think there needs to be a push from the tech side of things to make an actual labor response that does work for them.
I suppose my point is that seemingly irrational behavior denying the possibility of progress makes more sense when you consider how “progress” can look more like a threat than a promise to those under the steamroller, and it’s the obligation of those of us who believe in actual progress to make the first move and make sure they aren’t flattened into paving material for the road to tomorrow, yanno?
Re: Re: Re:2
Mike, the only position where being confidently wrong is acceptable is CEO. And that position exists as a parasitic position for the wealthy to use to extract wealth from everyone else, so they’re not going to replace it with AI.
Re: Re: Re:3
You must be a CEO then, because you have been nothing but consistently wrong for quite some time on this site.
Curation is not that important
“And if Steam is still going to be of any value at all to the consumer, Valve better be thinking right damned now how it’s going to get more involved in the curation of what shows up on its platform.”
Ridiculous. I am a regular long-time customer, I don’t care about its curation and never did, and Steam still has a lot of value for me. AI junk is not going to change that. I don’t browse Steam to find games to buy. When I’m looking for good games to play, I check out YouTube channels to see what’s good; that way I see how the real game looks and plays, and I usually know what I’m getting. Then I go to Steam, buy it, download it, and play it.
Steam has value for me as a place to buy, plus tech support, game library storage, tools, modifications, updates, patches, and refunds if a game isn’t what was advertised or isn’t playable for me. Curation is irrelevant to me as a customer. I have known for a long time that there are a lot of crappy games on Steam; nothing new. Curation simply is not where Steam’s value lies.
Have you considered that the reason Steam leaves curation to the customers is maybe that they know customers don’t care that much about curation? Why pour a lot of resources into something customers don’t care much about? And there is no accounting for taste; plenty of junk is someone else’s treasure. So what’s the point of trying to account for taste if Steam is trying to be a store that fits all kinds of taste?
The iOS App Store is crappy for curation too: a lot of junk there as well, and still it has some value. I just don’t look for mobile games to buy in there, and I don’t spend more time in there than I have to. If I want something new to play on the iPhone, I check somewhere else to see what’s good, then I go in there to buy and download, then I get out; that’s it. To me, Steam is much the same: just a store and a tool like the iOS App Store, except better, because it has a lot of useful extras. It’s just not that valuable as a curation tool, and if it becomes much less valuable as a curation tool because of AI junk, it’s no huge loss, because that feature was never that great or that valuable in the first place. I’ll stick with YouTubers; they have better taste than Steam ever will.
“By now, some of you will be tired of my calling for a more nuanced discussion about the use of AI and machine learning tools in the video game industry.”
Quite right. So STFU, AI cockgobbler.