AI Could Create A Massive Problem For Valve’s Steam

from the flood-the-zone-with-shit dept

Two trends that I’m very interested in are about to collide and it’s going to be a mess.

By now, some of you will be tired of my calling for a more nuanced discussion about the use of AI and machine learning tools in the video game industry. I get it, but I’m also not going to pretend like I don’t still hold that very same view. AI tools are just that: tools. If the tools are good and used at the behest of the artists in the industry to make better games, that’s a good thing. If they upend artistic intent or simply suck, that’s a bad thing. And on the matter of jobs within the industry, if there is a net reduction in jobs, that’s bad! If AI lowers the barriers of entry for otherwise creative people and the result is even more jobs within the industry spread over more studios and, importantly, more cultural output in the form of games, that’s good!

Except when it’s not. And even if the AI evangelists are right, or those of us who see the possibility that AI use will ultimately result in more people in the industry and more games released to the public are right, that can still present very real problems within the industry. And I think there could be a serious one looming for storefronts like Steam.

This concern calcified in my head somewhat when I came across indie publisher Mike Rose, known for producing Yes, Your Grace, talking about just what all of this output could mean on Steam specifically.

“From a publisher perspective specifically, it’s mega annoying,” Rose tells GamesRadar+ in an interview, echoing other publishers like Hooded Horse. “If we thought the number of games being launched on Steam was crazy before, now it’s just impossible. During the last Next Fest, it seemed like around 1/3 of the demos had either AI generated key art, and/or AI-generated content. So now we have that to compete with too. Hurray!” Publishing lead John Buckley of Palworld developer Pocketpair called out the same AI trend in the latest Steam Next Fest.

Steam, as a focal point for the more open PC gaming market, is the clearest barometer for the rising quantity of games, with over 20,000 releases fighting for space every year. Even with Valve sticking to AI content disclosures for games listed on Steam, the rise of AI tools will only contribute to the torrent of content flooding the platform as games – or at least AI-made things game-shaped enough to be sold – become easier to produce.

Claims that there are too many games being released on Steam certainly aren’t new, nor have they historically had anything to do with artificial intelligence. There have been complaints about this, as well as about Valve’s apparent lack of interest in playing any real curation role, going back to 2023. Wait, make that 2020. Oh, wait, it actually goes back to 2015.

But while Steam hasn’t yet collapsed under the weight of its own volume of releases, the through line in all of that criticism has been Valve’s stoic apathy toward helping its customers navigate the flood.

And that could be a very real problem for the platform. Steam’s value to the consumer, besides being the most recognizable outlet for PC gaming, is in its curation capabilities. To date, other than providing some search filters and a few tools to personalize the new-title recommendations it makes to you, Steam has mostly left curation up to customers themselves, or to third-party list-makers. Meanwhile, the process for listing a game on Steam has not changed appreciably in the past several years. It’s still the same $100 entry fee to get your title listed. You still have to jump through all the registration steps with Steamworks, generate an app ID, build the store page, and upload your assets. Then you wait for Valve to run its own review before you can publish, but that mostly amounts to checking that you’re compliant with Steam policies and that the game launches successfully, and that’s about it.

With a potential flood of PC games coming, that sure doesn’t feel like enough to keep the platform from becoming an unnavigable wasteland where you can’t tell the gems from the slop. And, barring any new rules limiting the degree to which AI can be used in game creation, that tidal wave is coming.

On this point, Rose focuses on “the elephant in the room” here: “It’s probably never going away again.”

“People can now make stuff by telling a bot to make it for them, and you know, the thing is that humans are mega lazy,” he reasons. “I don’t even mean that as an insult! We just are. So for a lot of people, if there’s a choice between ‘spend a bunch of time and money making a cool thing,’ vs ‘type some prompts into a program and the thing is made for me very quickly’ – the average person is going to pick the latter.

“And that’s the thing really: Our feelings on it don’t matter. It doesn’t matter that a bunch of us don’t like genAI. It’s gonna get used now, and it’ll get used more and more. As the kids say: Video games are cooked.”

I don’t think that video games are cooked, but his point that AI will be in use in the industry is the one I’ve been making for months now. We have to be talking about how it will be used, not if. That ship has sailed.

And if Steam is going to remain of any value at all to the consumer, Valve had better be thinking, right damned now, about how it’s going to get more involved in curating what shows up on its platform.

Companies: valve


Comments on “AI Could Create A Massive Problem For Valve’s Steam”

91 Comments
Stephen T. Stone (profile) says:

We have to be talking about how it will be used, not if. That ship has sailed.

Well, funny thing about that: Every time I’ve seen a company sail that ship and get caught sailing that ship without telling people they sailed it, even if only by accident, that ship gets set aflame. At least three games (and a fourth if you don’t believe the “we didn’t use AI” stuff for the Tomb Raider skins) have been caught with AI-generated content that was used without disclosure, and each time, the response to that usage was overwhelmingly negative.

Generative AI, even in its most “ethical” version, isn’t something that a lot of people want. Even if a game turns out to be shit, people would prefer it was hand-crafted shit rather than AI slop. And for all the evangelizing about “democratizing creativity”, the fact of the matter is that AI slop will come to be seen as the opposite of what its evangelists want it to be seen as: Where they want AI slop seen as true art and its “creators” seen as artists, everyone with some goddamned sense will see it as the apotheosis of laziness and non-creativity. Oooh, someone picked out a few settings and wrote a prompt for an AI image generator! So fucking what. Bob Ross was more creative with a half-hour’s worth of time and some oil paints, and all he did on his show was paint landscapes.

I understand that there are legitimate uses for what we colloquially refer to as “AI”. But generative AI, especially as a replacement for human-made creative works, is not one of them. Wanting to be an artist is fine; wanting to be one without putting the work in to learn and hone and master a craft is narcissism mixed with arrogance.

frankcox (profile) says:

Re:

“Wanting to be an artist is fine; wanting to be one without putting the work in to learn and hone and master a craft is narcissism mixed with arrogance.”

How far would you like to carry this standard?

Does it also apply to people who drive a car without doing their own maintenance? Or those who write computer programs without knowing how to change out their motherboards?

If you eat steak, did you butcher the cow?

Stephen T. Stone (profile) says:

Re: Re:

How far would you like to carry this standard?

Anyone who uses generative AI as a replacement for the process of making a creative work⁠—an illustration, a video, a piece of writing of any kind, a song, a blog entry, whatever⁠—is not an artist in any measurable sense of the term.

People who, say, use samples in their music still take the time to listen to the songs they’re sampling and figure out some new way to twist the sample in question into something new. Someone who generates an AI-created song isn’t doing any of that. They’re just going “here’s what I want a song to be about and the style I want it to be in” to the Emptiness Machine and expecting the perfect result without any real process beyond “push button, receive song”. You won’t find any artistry in works made by generative AI because the artistry of any work lies within the process of creation.

Those other comparisons are all bogus deflections with no direct 1:1 correlation to the act of artistic creation. You can drive a car without knowing a goddamn thing about the engine (though that knowledge would help), you can code an application without knowing anything about how your device works, and you can eat meat without having obtained it from the source yourself. But creating art⁠—as in really creating it, not asking a computer to do all the work for you? That requires knowledge and skill of how to craft that work. Writing a book of any kind both is and isn’t as easy as “put one word in front of another”; if you understand why I say that, you’re already one step closer to being more of an artist than your average AI slop enthusiast.

Anyone who thinks they’re an artist because they generated something through the Emptiness Machine is fooling themselves. Anyone who believes AI “art” is the future of art/media/pop culture is climbing a stairway to Heaven that’s made of cardboard.

frankcox (profile) says:

Re: Re: Re:

“Writing a book of any kind both is and isn’t as easy as “put one word in front of another”; if you understand why I say that, you’re already one step closer to being more of an artist than your average AI slop enthusiast.”

Interesting that you mention that because I am, in fact and in real life, an author. Though I don’t write books, at least not so far, I just do stories and “novelettes”.

I suppose my examples would have been better if I had asked if you need to know how to tune an engine in order to be a race car driver, or how to butcher a cow to be a chef.

I guess I really don’t know enough about how this AI stuff really works, and I’ve never been sufficiently interested in the subject to make any particular effort to understand it.

Yeah, I’m an old coot — so what do you expect? 🙂

But I do find the debate around it interesting and the variance in perspectives is intriguing.

Stephen T. Stone (profile) says:

Re: Re: Re:2

I suppose my examples would have been better if I had asked if you need to know how to tune an engine in order to be a race car driver, or how to butcher a cow to be a chef.

Moderately, yes. That said, while I would think a race car driver doesn’t necessarily need to know how to tune an engine, the knowledge is definitely helpful. And a chef doesn’t need to know the ins and outs of butchering a cow to cook the meat that comes from said cows. An artist who is committed to being an artist, on the other hand, should and must know the process of their art so they can justify the decisions of that process. That’s to say nothing of building a base of inspiration and being able to explain how certain works inspired you. An AI slop enthusiast might be able to tell you what styles they chose for a generated AI image and why, but they won’t be able to tell you why the Emptiness Machine generated the image in the way it did.

Stephen T. Stone (profile) says:

Re: Re: Re:4

What if they don’t care about being considered an artist, and simply want to cause their vision of a game to exist, with the least amount of cost and work?

I don’t really care if AI slop enthusiasts want to be considered artists. They’re still lazy assholes for using generative AI. If the only way someone can make a game is through generative AI, that game probably won’t be worth playing⁠—just like images, songs, videos, and written works made with generative AI are worth less of my time than it took for an LLM to generate them.

Anonymous Coward says:

Re: Re: Re:5

Game makers became “lazy” in the 2000s, when the first game engines (like Flash) appeared, because “real” game makers were supposed to be craftsmen coding in assembly and making assets a pixel at a time, not drag-dropping and coding scripts.
But games have since become utterly complex, with a dozen platforms to support, unlimited possibilities for optimization, immersive sound effects, good narrative, natural gameplay, decent accessibility, online gaming and so on, such that a simple FFS game can require years of work from a team of a few experienced devs. Even Candy Crush was made by a dozen people, and GTA 5 took 5 years and involved 1,000 devs.
AI is still just a tool; you cannot prompt “make a great game that doesn’t look like anything people have seen before. no mistakes plz” yet. You’ll have slop, crashes, slowdowns, broken gameplay, and unsupported screen sizes, but you may have something to publish in less than a year, and could see if at least someone would dare to play it. Then, if it becomes a great success (which, as with nearly every game, may never happen), rewrite it from scratch with a dedicated team.

Anonymous Coward says:

Re: Re: Re:2

I could quite easily explain to you how I selected and prepared this Costco frozen pizza, and some principles behind the choice of ingredients Costco used, but claiming that makes me a chef would be quite crazy.

But of course, much like the US patent office, lots of people lose their minds as soon as “on a computer” is added to the explanation.

TKnarr (profile) says:

Which all translates to Steam becoming for most people what it already is for me: a place to buy the games I already know I want from other people’s recommendations or other sources outside Steam. I don’t go browsing Steam looking for games. I go elsewhere to people I know or to review and recommendation sites I know and find games I’m interested in there. Steam is the cashier, nothing more.

That might stem the flood of cheaply-made crap games, but I’m not optimistic. The cost of making and publishing them is so low it only takes a few suckers to make it profitable, and there’s always new suckers coming along who don’t know how to navigate the bog.

Anonymous Coward says:

lowers the barriers of entry for otherwise creative people

The barrier is already low. You can download thousands of dollars’ worth of software like Krita, Blender, and Godot for free and start making your game yourself right now without having to rely on some third-party SaaS code generator that can arbitrarily throttle or even revoke your access:

https://x.com/patomolina/status/2045281665363386504

This comment has been deemed insightful by the community.
frankcox (profile) says:

I don't understand the hate

I don’t understand the hate for “ai generated” stuff.

Maybe it’s because I don’t play games and am not really exposed to that culture.

But I’ve seen some AI generated pictures that look pretty darn impressive and the fact that they are “AI” doesn’t make them less interesting or amusing than they would be if they were hand drawn.

On computers, even “hand drawn” isn’t; art programs all allow you to draw perfect lines and circles and angles of all sorts. Computer art is a long way from a pencil on paper or paint on canvas, regardless of whether AI is involved or not.

So here’s a genuine question:

Why all the hate for AI? If something is good, it’s good, and the tool used to make it, whether it’s a paintbrush or a chainsaw, seems like it shouldn’t be relevant to the worthiness of the end result.

TKnarr (profile) says:

Re:

Because the source material they use is all appropriated from actual people without their permission, then used to generate profits for the AI companies without giving those people any of it. Without the work the models were trained on there wouldn’t be any profits for the AI companies, but they look at it as their right to use everyone’s work without permission and without paying for it.

On top of that, there’s the prodigious environmental and economic cost of those models. They require data centers so big they can’t be used for anything else, consume so much power that they destabilize the electric grid in the area and drive prices sky-high for everyone else, and consume memory (both DRAM and SSDs) in quantities that leave nothing for the rest of us. Literally: there will be no DDR5 or SSDs available for consumers this year, because the AI companies have bought up 100% of the production for the entire year.

Now, is some nice imagery worth all that?

frankcox (profile) says:

Re: Re:

Is some nice imagery worth all that?

It seems that a lot of people believe that the answer to your question is yes it is.

Folks here and on many other websites have spent years decrying copyright overreach and excessive terms and restrictive contracts and on and on.

But now that there’s an industry using “creative data” (I can’t think of a better term) in a wholesale manner, those same people are screaming about “artist’s rights”.

Without taking a position on the right or wrong of this issue, I wonder if this kind of thing is the future of creative endeavor. For good or for ill, it certainly appears to have democratized the creation of a lot of artistic stuff, or at least a lot of stuff that looks pretty darn artistic to me.

Are folks who rail against AI in the same position as groups of monks after the printing press suddenly made books available to the masses?

I genuinely don’t know the answer, and I suspect it’s because I simply don’t know enough about AI.

Again, I’ve looked at a few pictures and thought they were pretty nifty, but other than that my exposure to the whole thing has been pretty much limited to what I read on websites like this one.

TKnarr (profile) says:

Re: Re: Re:

Because most of the “copyright overreach” isn’t on the part of the creators; it’s on the part of the middlemen (publishers and distributors and the like) who take the lion’s share of the profits and give the actual creators a pittance. There’s no contradiction in being in favor of creators’ rights while simultaneously opposing copyright overreach.

Tip Tappers (profile) says:

Re: Re: Re:2

My problem with that argument is, most of the copyright “reforms” people are suggesting are for the benefit of the corpos rather than the artists.

Like, the copyright alliance people were cheering on the anti-Internet-Archive lawsuit, especially ex-RIAA ghoul Neil Turkewitz, who has overtly stated that the anti-IA lawsuits are a “dry run” for anti-AI lawsuits.

Now note how the Wayback Machine is being screwed over by news sites over fears of scraping “rights” and you can begin to connect the dots. It’s an astroturf for big copyright, and everyone is falling for it because they don’t think the leopards will eat their face.

The law is a blunt instrument. It doesn’t care if you hate copyright overreach and AI; it does what it is designed to do, and I see where that design is going, and it scares me!

Stephen T. Stone (profile) says:

Re: Re: Re:

Are folks who rail against AI in the same position as groups of monks after the printing press suddenly made books available to the masses?

Generative AI does nothing that is even remotely comparable to the printing press making written works cheaper to produce and widely available to the masses.

You can keep coming up with those bogus comparisons, but you won’t find many people here willing to treat them seriously. They’re all an attempt to deflect from the numerous already-mentioned problems with generative AI, not the least of which is how “democratizing creativity” means worsening the culture divide by way of making a shared culture that much harder to achieve. Imagine if anyone could generate an entire feature-length Avengers movie for their specific tastes. Why, then, would anyone watch the ones produced by Marvel/Disney? And how would we be able to talk about specific Avengers movies when there would be so many different ones that we’d have no frame of reference for what movie any two people would be talking about? For that matter, imagine if someone makes an Avengers movie with a sex scene in it and chooses to “use” the same actors as the MCU films⁠—would the use of the likenesses of, say, Scarlett Johansson and Robert Downey Jr. be morally and ethically acceptable even (and especially) if they never gave their approval to have their likeness used that way?

We can’t put the genie back in the bottle, that much is true. But generative AI is less a genie and more a monkey’s paw: You get what you wish for, but it comes with a cost you probably won’t like.

Stephen T. Stone (profile) says:

Re: Re: Re:3

What is your position on Elvis impersonators, and “tribute bands” in general?

My snarky-yet-serious position is that you’re trying to deflect from the direct yes-or-no question of “is it morally and ethically acceptable to use someone’s likeness without their permission in a way that said someone would never have approved”. But if I must: I generally have no issue with those things because they still require a modicum of talent and there’s no mistaking the tributes for the real thing. (Especially with Elvis, since he’s dead.) With generative AI, it’s entirely possible to create something that looks or sounds like “the real deal” (e.g., a celebrity deepfake) and pass it off as genuine to, say, a dumbass in a hurry.

drew (profile) says:

Re: Re: Re: Let me just pick up on one thing here...

AI is not democratising art. This stuff costs more money to create than doing it manually. Any half-decent AI output has gone through multiple layers of prompts, and/or is being driven by someone who has spent a lot of time refining their prompts, and that means signing up to a paid service, at a rate that excludes large parts of the world and people on reduced incomes.
A pencil and paper, a cheap second-hand guitar, hell, even a tablet with a word-processing programme on it are all cheaper than a year’s subscription to these tools.

TKnarr (profile) says:

Re: Re:

And after all that, they don’t really work that well for most of the use cases their advocates are presenting. I have to work with GitHub Copilot at my job, and at the best of times it’s like mentoring a junior engineer: I have to hand-hold it through the entire process, explaining what it got wrong and what it needs to do to correct it at every step. As with mentoring another engineer, I could’ve just done the work myself faster. Unlike with that junior engineer, I don’t get a more experienced and better engineer out of it. Copilot never improves, and I take a hit to my productivity because of it. At its worst it blows up so badly I have to just abandon the attempt (which at least saves time, which is a sad commentary on it).

Anonymous Coward says:

Re:

People literally draw/paint with a stylus on computers, and have done for ages. It’s hella harder with a mouse, so credit to those who suffer through it. And understanding how to really use the tools is as complex as mixing paints and choosing media, etc.

Sure, some ppl will do graphic arts or childish stuff by letting the programs give you a perfect circle and bucket-filling it. Not very good art.

frankcox (profile) says:

“they don’t really work that well for most of the use cases their advocates are presenting.”

If that is truly the case then the whole thing will blow over and become irrelevant in a short period of time.

If a tool manufacturer was producing wrenches that broke every second time you tried to loosen a bolt, people would soon start buying their wrenches elsewhere.

Same thing here. If you’re using the wrong tool or an inferior tool for your job, then you’re in the position of the guy who went to the doctor and said, “It hurts when I do this.”

Answer: Don’t do that.

Stephen T. Stone (profile) says:

Re:

Answer: Don’t do that.

It’s hard to avoid doing the thing when every C-suite asshole with a few hundred million dollars to spare is shoehorning AI, generative or otherwise, into their businesses regardless of whether it’s needed. When the boss tells you “use this or lose your job”, chances are good that you’ll use the AI instead of risking your paycheck.

frankcox (profile) says:

Re: Re:

“When the boss tells you “use this or lose your job”, chances are good that you’ll use the AI instead of risking your paycheck.”

Yeah, that’s the same position as anyone who gets told to use a particular widget of any kind to do his job, not just AI-assisted stuff.

If said widget is not an effective tool, either the directive will change or the company will evolve somehow to account for that inefficiency (or not).

Arianity (profile) says:

And I think there could be a serious one looming for storefronts like Steam.

This is exactly the sort of thing people were getting worked up over before and got brushed off. Glad we did get there eventually, though

And that could be a very real problem for the platform

The problem isn’t really so much for the platform as for all the games that struggle to get noticed in an environment where search is broken, tbh. Realistically, Steam will be fine, just as it is now despite search being functionally useless. The lock-in from convenience, plus a front page that still serves the majority of popular games, is enough to keep it going.

but his point that AI will be in use in the industry is the one I’ve been making for months now

Y’know, if you phrased it like he did, instead of as a provocative sales pitch, it’d save a lot of grief and make a nuanced conversation a lot easier.

n00bdragon (profile) says:

No one is talking about how Steam already saw an absolute flood of ultra-low-quality content when they started allowing porn. And no, I’m not saying all porn on Steam is low quality. When I say “low-quality content” I’m talking about the barely-a-game software products where you uncover an erotic image, a genre that probably makes up a double-digit percentage of all releases on Steam now; the same game copy-pasted over and over again with ever-so-slightly tweaked visuals and a new nude.

What I’m saying is the flood has already come, and people are navigating it. It’s already a problem, but it’s being dealt with. Steam isn’t becoming the Kindle Store of vidya, it already is. Shovelware has been a part of the gaming industry since well before the internet and we haven’t been doomed yet.

Also, the guys behind Palworld complaining about shovelware on Steam is rich. Just because your shovelware became a meme doesn’t make it quality.

Anonymous Coward says:

that sure doesn’t feel like enough to keep the platform from becoming an unnavigable wasteland where you can’t tell the gems from the slop. And, barring any new rules limiting the degree to which AI can be used in game creation, that tidal wave is coming.

So, exactly like the wider Internet. Or Amazon, where a search will often produce a page of non-matching shit (“Sponsored”), maybe followed by some matches mixed in with shit, then a long tail of shit (if the search works at all; perhaps a quarter of the Amazon-search links that show up on DuckDuckGo just give a “something went wrong on our end” error).

But long before the latest wave of stuff being falsely advertised as A.I., web searches were already like that for 15-20 years, producing garbage pages that have matching keywords—maybe just dumped with no pretense of prose, or maybe as Markov-generated slop. Ever download a PDF “product manual” that’s just a link back to the same bullshit site it came from?

I don’t see why any particular aggregation site should be held out as having a specific problem. Make it easy for people to get refunds, and the operator will quickly get a list of potential garbage to be inspected. Not that such sites even really need to do “curation”; the lack of it doesn’t seem to be harming Amazon much, even if you do see people writing “I’m wary of counterfeit products” frequently on message boards.

Mike Masnick (profile) says:

Re:

Claiming the tools aren’t good these days immediately outs you as a naive fool. What is true is the tools are not as good as the hype from the companies. And they’re not going to get rid of the need for people. But in competent hands, they’re incredibly powerful.

Anyone who says they’re not good clearly either has not used the most modern versions… or has no idea how to use them well.

I’m going to assume you most likely tried older tools and have not experimented with the more modern ones. But claiming that good AI is like Santa Claus is laughably wrong.

Anonymous Coward says:

Re: Re:

Yeah? That’s funny given Bsky’s recent, well-publicized self-DoS.

The tools produce unmanageable amounts of stuff, which cannot be properly checked or managed as a result, and there’s plenty of errors still hidden in them. Lots of times, when I see a regurgitation engine try to give me info on something I have domain-specific knowledge of, I can spot multiple errors. It is the Elon-Musk-talks-about-a-subject-you-know-something-about thing; I have no reason to think they’re any more accurate in subjects I am not versed in.

They create an illusion of productivity (but actually cost more human time), are eating insane amounts of hardware, actively atrophy the abilities of people who use ’em, and are gonna citogenesis their own training data into oblivion as the ouroboros eats itself.

Mike Masnick (profile) says:

Re: Re: Re:

You seem particularly confidently ignorant.

It’s fine. You don’t have to like the tools, but each time you reveal your ignorance doesn’t make you look cool or savvy. It just is “old man yells at cloud.”

As I said, I agree that there’s a ton of hype. They’re not as useful as the hype screams. And they’d be terrible at replacing people. But they are incredibly useful in the right hands.

Clearly, not yours.

Bloof (profile) says:

Re: Re: Re:5 Not one staff member, the whole platform.

‘Why are you blaming AI? Bluesky being down had nothing to do with AI, you people know nothing!’

Meanwhile…

https://bsky.app/profile/jay.bsky.team/post/3micqcyeawc2g

‘Jay 🦋
‪@jay.bsky.team‬

Bluesky is made with AI, the engineers and even some non-engineers use Claude code.’

Gee, wonder why people are skeptical about claims that AI had nothing to do with Bluesky’s issues, which came from a coding error?

Mike Masnick (profile) says:

Re: Re: Re:5

It is flat out wrong to claim that AI code is what caused the DDoS.

What Bluesky has said is that their engineers used tools like Claude Code, but that all code still goes through the same testing and reviews that the rest of their code goes through. It’s not like they’re just telling Claude what to build and rushing it into production.

It’s incredible what kinds of conspiracy theories people come up with just because they hate these tools they don’t understand.

Anonymous Coward says:

Re: Re:

Yeah? That’s funny given Bsky’s recent, well-publicized self-DoS.


Anonymous Coward says:

Re: Re:

If thinking the slopengines produce techdebt and shit code is being a naive idiot, what would you call that BlueSky dev who chucked their phone in a pool to cool it? Or the people who decided that brain was worth paying six figs?

If you think the tools are so great, maybe replace them with the machine.

This comment has been flagged by the community.

The Phule says:

Re: Re:

Good AI currently does not exist in my field, and the evidence strongly suggests it’s never going to exist. Hallucinations have been getting worse over time, not better.

You just can’t use a tool that might hallucinate in a field that relies on facts and case history. My field is a sub-type of law, and I have seen plenty of hot-shot lawyers (people about my age who haven’t been in this sub-category as long and are more used to general law) get in trouble and get shot down for relying on AI.

Frankly, the tools are toxic and touching them seems like a fast-track to getting censured or disbarred.

The tools aren’t good.

Mike Masnick (profile) says:

Re: Re: Re:

Good AI currently does not exist in my field, and the evidence strongly suggests it’s never going to exist. Hallucinations have been getting worse over time, not better.

Entirely possible. I wouldn’t use AI in law at all. It does seem quite a silly place to use it.

But I am telling you that in many other fields it is very useful as a tool. Where it tends to fall down is when it’s used as a replacement for humans. But as a power assist, there are many industries where it is quite useful.

You insisted that because it was not useful to you, it could not possibly be useful to anyone. To me, that’s a sign of a very dumb person who cannot think beyond his or her own experiences. I can tell you, quite clearly, that as a tool it is incredibly useful in many fields, but again, as an assistive tool, generally where the hallucinations are not a problem.

The issue it seems, is that you have a very narrow (and quite limited and ignorant) view of what kinds of tools there are today. And you stupidly assume that your narrow view is representative of the whole.

You’re wrong.

The Phule says:

Re: Re: Re:2

The hallucinations make it clearly useless to anyone in coding: it seems to me that running code spit out by a ‘being wrong’ machine is a recipe for disaster.

What that leaves is, essentially, Lorem Ipsum generation: something to make background stuff that’s not intended to be inspected very carefully.

I have no issues with AI as a Lorem Ipsum generator, but that’s simply not how people are using it, and given how much it seems to cost, that doesn’t seem like a cost-effective use for it either.

Normally these tasks are done by interns whose pay isn’t great, run through a bio-computer about the size of a bowl of pudding that uses fewer calories than an incandescent light bulb.

Mike Masnick (profile) says:

Re: Re: Re:3

The hallucinations make it clearly useless to anyone in coding: it seems to me that running code spit out by a ‘being wrong’ machine is a recipe for disaster.

Which is why no serious company would simply run code that was created that way.

This is the part you seem to miss. You idiotically assume that everyone using these tools just has it “spit out” things and then does nothing but run them. That’s not how it works.

I’ve built multiple tools that I use every day with AI, but they go through extensive testing and review first. Sometimes there are mistakes, but there are mistakes with humans as well.

The idea that they are ‘being wrong’ machines is so incredibly ignorant. They are mostly pretty good at certain tasks (and very good at coding tasks, especially when directed by someone who knows what they’re doing and is simply asking them to cover the drudge work, which will then be checked and reviewed by an expert).

Something to make background stuff that’s not intended to be inspected very carefully.

I think the exact opposite is true. They are quite useful in working on projects that involve a lot of drudge work, which a skilled practitioner can then review later and check thoroughly.

Or, in my case: I’ve now built half a dozen tools that I use for various projects all the time. They’re not for anyone else to use, and when they are “wrong” (very rarely, if at all), the problems are contained and only create a slight time sink for me to figure out what went wrong.

It’s such a weird experience to hear from people insisting that tech I use literally every day and which has enabled me to work on amazing projects successfully is somehow “useless.”

Again, I don’t deny that the companies oversell the tech. And there are many areas I would never use them in. But you are taking a single example of using a tool for a stupid purpose and insisting that it only does wrong stuff and is therefore useless.

You’re wrong.

Anonymous Coward says:

Re: Re: Re:2 The Masnick doth protest too much, methinks

Nothing gets Mike into the comments like a chance to get real defensive against anti-AI sentiments.

You got some investments in the tech you need to pay off and feel jittery about it, or do you just need to defend your subscription costs to yourself?

Mike Masnick (profile) says:

Re: Re: Re:3

Honestly, it’s the thing that people here are so focused on that I least understand, which is why I get involved in the comments. Techdirt has always been pro-innovation, but against bad tech companies. This is no different.

What I simply do not understand, and why I jump into the comments on this subject, is the people who are so extreme as to insist there is no possible useful version of the tech, which is just simply wrong. I know so many people who are using it in a way that is helpful and valuable.

I’m fine with criticism of the companies (most of which suck) and with the hype. But people taking the hype to the level of claiming that the tech is all garbage are so out of touch with reality, and I don’t understand why.

You got some investments in the tech you need to pay off and feel jittery about it, or do you just need to defend your subscription costs to yourself?

Neither. I don’t have spare money to invest in shit. My only “subscription” cost is the $20/month I pay to Anthropic for Claude Code, and if that turned out to be useless, I’d stop paying and get on with my life. But that $20/month to Anthropic has already saved me over $50/month in subscriptions to other services that I’ve canceled (my previous task management tool, a calendaring tool, a video conferencing tool, and a note taking tool).

So, so far, I’ve saved money via AI tools, and I’m working to replace some other services as well, so it should end up saving me even more.

TFG says:

Re: Re: Re:4

Personally, I’m all for the “drudge-work reduction” aspects of AI tools – for example, analyzing thousands of support cases to find trends and provide useful insights, which would take me days to do myself. I do have serious concerns about the energy requirements of the data centers and have absolutely zero faith in the AI companies themselves to not destroy entire communities or environments in their pursuit of more processing power. The background costs of the tools are potentially so high and devastating that I doubt even the useful parts of the tools are worth that cost.

What I don’t see any value in are the uses that generate “art.” Images, movies, plots, novels, even music – it’s in these realms that I see very little of value; or rather, nothing worth paying for.
I have respect for the time and effort it takes to draw a picture, to write a story, to animate a scene, to compose a song or tune. I could do these things, but I don’t have the drive or desire to become skilled in these arenas. Even if I were skilled, I would still take heart in the connection with the creators of those other pieces that experiencing them brings.
When you replace that with AI-generated stuff – I no longer have any connection. The human behind it becomes a supervisor – offloading the actual creative endeavors to a thing that does not actually think, but instead predicts what is most likely to look like what the prompt asked for. They then pick what they like most out of the results, and string it together into something hopefully coherent – and why should I engage with that?

For me, it ends up in the same bucket as the “maximize profit” products coming out of the big name gaming companies – the “games” that seek to extract as much money for as little effort as they possibly can. ‘Gacha’ games are a hard pass for this reason.

These uses of AI produce, to my mind, nothing of value – so the cost for these uses drives their value into the negative. So for games that used AI to create their assets – I don’t see a reason to buy them. And companies and devs at large have done a very good job of destroying trust with their uses of AI so far, and even with their actions predating AI, so with such a short supply of trust in play, I find any use of AI in this market to be suspect.

I can’t trust the people using or creating the AI. I can’t trust what they say about how they’re using it. So, in my view, I’m better served avoiding anything that ends up using it.

Tip Tappers (profile) says:

Re: Re: Re:4

My hypothesis: they’re terrified it’ll be used to deny them the ability to work the way they’re accustomed to. That way of working will be deemed “obsolete,” and the mass use of AI will be used to justify starving it of resources, so refusing to concede any use at all is a defense mechanism: take away the weapon that would destroy how they work.

Like how the mass adoption of CGI pushed out practical effects, or the way the internet got gentrified: the mass adoption of smartphones, the death of Flash, the move away from independent forums to mega-sites (and the way those mega-sites penalized external links in the name of “spam prevention”), the way it got far worse and more hostile to be an artist online after everyone moved from Tumblr to Twitter, or the way those of us who hate non-chronological feeds and infinite scroll had them shoved down our throats with no way to opt out.

There’s been one forced disruption after another (hell, algorithmic feeds that prioritize speed over quality are practically designed to push out craft for slop), and it’s all been sold as the inevitable result of popular will and increased efficiency, so it makes perfect sense that people would try to fight back with “Actually, it’s not something people wanted,” yanno?

Like, I agree with you there’s a lot of uses for them, but a lot of anti-anti-AI stuff is tone-deaf on that, and I feel like that’s what needs to be addressed.

Even the implorations that we need a labor response rather than a copyright one, while I agree with the principle, feel tone-deaf, because the average trans 20-something barely scraping by on commission money, unable to work other jobs due to chronic pain (a very common demographic I see in anti-AI-imagegen circles), doesn’t feel like labor works for them; it works for people who have Real Jobs. I think there needs to be a push from the tech side of things to build a labor response that actually does work for them.

I suppose my point is, that seemingly irrational behavior denying the possibility of progress makes more sense when you think of how “progress” can look more like a threat to destroy than a promise to those under the steamroller, and it’s the obligation of those of us who believe in actual progress to make the first move to make sure they aren’t flattened into paving material for the road to tomorrow, yanno?

Anonymous Coward says:

Curation is not that important

“And if Steam is still going to be of any value at all to the consumer, Valve better be thinking right damned now how it’s going to get more involved in the curation of what shows up on its platform.”

Ridiculous. I am a regular long-time customer, and I don’t care about its curation and never did; Steam still has a lot of value for me, and AI junk is not going to change that. I don’t browse Steam to find games to buy. When I’m looking for good games to play, I check out YouTube channels to see what’s good; that way I see how the real game looks and plays and usually know what I’m getting. Then I go to Steam, buy it, download it, and play it.

Steam has value for me as a place to buy games, and for tech support, game library storage, tools, mods, updates, patches, and refunds when a game isn’t what was advertised or isn’t playable for me. Curation is irrelevant to me as a customer. I’ve known for a long time that there are a lot of crappy games on Steam; nothing new. Curation simply is not where the value lies in Steam.

Have you considered that the reason Steam leaves curation to the customers is that they know customers don’t care that much about curation? Why pour a lot of resources into something customers don’t care much about? And there’s no accounting for taste: plenty of junk is someone else’s treasure. So what’s the point of trying to account for taste when Steam is trying to be a store for every kind of taste?

The iOS App Store is crappy for curation too; a lot of junk there as well, and it still has some value. I just don’t look for mobile games to buy in there, and I don’t spend more time in there than I have to. If I want something new to play on the iPhone, I check somewhere else to see what’s good, then I go in there to buy and download it, and I get out. That’s it. To me, Steam is much the same: just a store and a tool like the iOS app, except better, because it has a lot of useful extras. It’s just not that valuable as a curation tool, and if it becomes much less valuable as a curation tool because of AI junk, it’s no huge loss, because that feature was never that great or that valuable in the first place. I’ll stick with YouTubers; they have better taste than Steam ever will.
