DailyDirt: Computers Are Editing Our Double-Plus-Ungood Content
from the urls-we-dig-up dept
More and more digital media is being edited and prioritized in datacenters by intangible algorithms. As usual, this can be good or bad, depending on how the technology is used. On the one hand, algorithms can take on laborious tasks that humans don’t want to do. On the other, they might introduce errors or inadvertent biases at a scale that no group of humans could ever reach without automation. Here are just a few links on bots tinkering with online content.
- The Los Angeles Times reported on an earthquake in about 3 minutes — thanks to an algorithm that collected data from the US Geological Survey and automagically created an article about the seismic event. Robotic reporting isn’t exactly a new thing, but a robotic “scoop” like this can’t be matched by human writers. [url]
- About half of all the edits on Wikipedia are made by bots. Algorithms keep spam links from flooding the site, create whole entries based on online data, and handle tedious chores such as grammar and spelling corrections. Not surprisingly, the biggest bot job on Wikipedia is detecting vandalism. [url]
- Algorithms aren’t free of bias; they can actually amplify biases. Humans can also trick algorithms by gaming their inputs and biasing results, so computer-produced content isn’t necessarily more objective than the writings of humans (not that anyone here would have assumed that). [url]
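The LA Times story above was template-driven: structured quake data gets slotted into prewritten sentences. Here’s a minimal sketch of that idea; the event record and wording below are made up for illustration (a real system would pull live data from a USGS feed):

```python
# Template-based "robot reporting": fill a story template from
# structured event data. The sample record below is hypothetical.

EVENT = {
    "magnitude": 4.4,
    "place": "5 miles from Westwood, California",
    "time": "6:25 a.m. Pacific time",
    "depth_km": 7.9,
}

TEMPLATE = (
    "A magnitude {magnitude} earthquake struck {place} at {time}, "
    "at a depth of {depth_km} kilometers, according to preliminary "
    "data. This post was generated automatically."
)

def write_story(event: dict) -> str:
    """Render the story template from one event record."""
    return TEMPLATE.format(**event)

print(write_story(EVENT))
```

The speed advantage is obvious: once the template exists, the “article” is ready the instant the data arrives.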
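The Wikipedia bullet mentions bots that keep spam links off the site. A toy sketch of that kind of rule-based filter might look like this — the blocklist domains and edit text are invented examples, and real anti-spam bots use far more signals than a domain match:

```python
import re

# Flag edits that add links to domains on a blocklist.
# Blocklist and sample edits are made-up examples.

BLOCKLIST = {"cheap-pills.example", "win-prizes.example"}
URL_RE = re.compile(r"https?://([\w.-]+)")

def is_spam_edit(added_text: str) -> bool:
    """Return True if the added text links to a blocklisted domain."""
    return any(host in BLOCKLIST for host in URL_RE.findall(added_text))

print(is_spam_edit("See https://cheap-pills.example/deal"))   # True
print(is_spam_edit("Source: https://example.org/paper"))      # False
```

Simple rules like this scale to millions of edits, which is exactly why the tedious policing jobs went to bots first.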
If you’d like to read more awesome and interesting stuff, check out this unrelated (but not entirely random!) Techdirt post via StumbleUpon.