Apple Has To Pull Its “AI” News Synopses Because They Were Routinely Full Of Shit
from the energy-sucking-bullshit-machines dept
While “AI” (large language models) certainly could help journalism, the fail-upward brunchlords in charge of most modern media outlets instead see the technology as a way to cut corners, undermine labor, and badly automate low-quality, ultra-low effort, SEO-chasing clickbait.
As a result we’ve seen an endless number of scandals where companies use LLMs to create entirely fake journalists and hollow journalism, usually without informing their staff or their readership. When they’re caught (as we saw with CNET, Gannett, or Sports Illustrated), they usually pretend to be concerned, throw their AI partner under the bus, then get right back to doing it.
Big tech companies, obsessed with convincing Wall Street they’re building world-changing innovation and real sentient artificial intelligence (as opposed to unreliable, error-prone, energy-sucking bullshit machines), routinely fall into the same trap. They’re so obsessed with making money that they routinely don’t bother to make sure the tech in question actually works.
For example, last December Apple faced criticism after its Apple Intelligence “AI” feature was found to be sending inaccurate news synopses to phone owners:
“This week, the AI-powered summary falsely made it appear BBC News had published an article claiming Luigi Mangione, the man arrested following the murder of healthcare insurance CEO Brian Thompson in New York, had shot himself. He has not.”
Yeah, whoops. So recently, Apple pulled the feature offline:
“On Thursday, Apple deployed a beta software update to developers that disabled the AI feature for news and entertainment headlines, which it plans to later roll out to all users. The company plans to re-enable the feature in a future update as it works to improve it.
As part of the update, the company said the Apple Intelligence summaries, which users must opt into, will more explicitly emphasize that the information has been produced by AI, signaling that it may sometimes produce inaccurate results.”
There’s a reason these companies haven’t been quite as keen to fully embrace AI across the board (for example, Google hasn’t implemented Gemini into hardware voice assistants): they know there’s potential for absolute havoc and legal liability. But they had no problem rushing to implement AI in journalism to help with ad engagement, which makes it pretty clear how much these companies tend to value actual journalism in the first place.
We’ve seen the same nonsense over at Microsoft, which was so keen to leverage automation to lower labor costs and glom onto ad engagement that it rushed to implement AI across the entirety of its MSN website, never showing much concern for the fact that the automation routinely produced false garbage. Google’s search automation efforts have been just as sloppy and reckless.
Large language models and automation certainly have benefits, and certainly aren’t going anywhere. But there’s zero real indication most tech or media companies have any interest in leveraging undercooked early iterations responsibly. After all, there’s money to be made. Which is, not coincidentally, precisely how many of these companies treated the dangerous privacy implications of industrialized commercial surveillance for the better part of the last two decades.
Filed Under: ai, apple intelligence, automation, journalism, llm, media, undercooked
Companies: apple



Comments on “Apple Has To Pull Its “AI” News Synopses Because They Were Routinely Full Of Shit”
If you take AI as something that produces the answer that the majority of the internet would give you in response to something, then the Apple AI was doing its job just fine. I could definitely see most of social media gleefully saying that any given subject of a news article, especially an accused criminal, killed themselves. Of course, the weighted average of the internet’s opinion isn’t exactly something known for its adherence to reality.
To be (perhaps excessively) fair to Google (and to a lesser degree the others), after a decade of people trying to implement link taxes to “save journalism” I probably wouldn’t value journalism all that highly either.
Google like “hold my beer.”
Re:
Re: Re:
The preceding message was approved by the office of the President of the United States.
🙃
Re: Re: Re:
Can bleach be boofed? Asking for a friend.
Re: Re: Re:2
Absolutely, but I wouldn’t recommend it.
Tim Cook has a thing for naked mole rats.
It’s true, an AI told me so.
SOPA is back, officially. https://www.congress.gov/bill/119th-congress/house-bill/791/text
Re:
Great. Now fuck off
energy-sucking-bullshit-machines dept
Has anyone seen the vaporcentre proposal Kevin O’Leary is pitching to the stupid Alberta government? “Wonder Valley”, a behemoth (in someone’s mind anyway) AI datacentre.
~~Epstein~~ Luigi Mangione didn’t kill himself.

You’d think using AI instead of reporters to save money would be enough, but apparently they don’t want to pay editors either.
The obvious problem with AI synopses is lack of attribution. Everyone knows not to believe just any old source on the internet. You gotta track it back to the source to see whether it’s plausible or more Trumpian Fox News BS.
AI synopses just summarize the errant BS on the internet. If that’s of any value to anyone.
The Micro-oft hour
Deliberately misspelling the name above. The current AI horror, forced onto unwilling users by this awful company, begins with a c, then comes an o and the word pilot. I don’t want to write the name exactly for fear they’ll do worse things to me than they’ve already done. This AI program disrupts your email, telling you again and again your email can’t be shown, with a big picture of the AI’s logo. When you try to write, it wants to rewrite what you’re writing, or to give you a “summary.” If you look online, you’ll see person after person wanting to know how to dump it. It’s the worst of the worst and I don’t know how to stop it. Why wasn’t its name in the article?
If only news articles had a convenient one-sentence summary already baked into them, right at the top of each article in large writing. If that were the case then Apple would not need to generate their own summary of the article using unreliable Large Language Manglers, they could instead show the summary provided by the news outlet themselves. Alas.