by Mike Masnick
Fri, Sep 19th 2008 7:55am
With everyone trying to figure out just what went wrong to cause the rather spectacular financial mess Wall Street finds itself in these days, Saul Hansell over at the NY Times wanted to find out why all the sophisticated risk management quant algorithms that Wall St. has been so big on lately failed to warn of impending doom. His answer, basically, is that people on Wall St. were lying to the algorithms: they came up with ways to purposely enter data so that the risk appeared much lower than it actually was, letting them keep pushing the boundary. From there, people started relying on the computers' output simply because the computer said so, even though the underlying data was bad.

This happens time and time again. Even when people know that computers make mistakes, it's just so convenient to have a computer "confirm" your thinking that you start ignoring other warning signs.
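To make the garbage-in, garbage-out point concrete, here's a toy sketch of a simple parametric Value-at-Risk calculation, one of the basic risk measures quant desks use. This is purely illustrative, not any firm's actual model, and every number in it is invented. The point is that the math can be perfectly sound while understated inputs make a risky position look safe:

```python
import math

# Toy parametric Value-at-Risk (VaR) under a normal-returns assumption.
# All figures below are hypothetical, for illustration only.

Z_99 = 2.326  # z-score for a one-tailed 99% confidence level

def parametric_var(portfolio_value, daily_volatility, z=Z_99):
    """One-day VaR: the loss we expect to exceed only 1% of the time."""
    return portfolio_value * daily_volatility * z

portfolio = 100_000_000   # hypothetical $100M position

true_vol = 0.03           # the assets' real daily volatility
reported_vol = 0.005      # the understated volatility fed to the model

honest_var = parametric_var(portfolio, true_vol)
gamed_var = parametric_var(portfolio, reported_vol)

print(f"VaR with honest inputs: ${honest_var:,.0f}")
print(f"VaR with gamed inputs:  ${gamed_var:,.0f}")
# The formula is identical in both cases; only the inputs differ.
# Feed the model bad data and it will cheerfully report low risk.
```

Nothing in the model flags the second number as wrong, which is exactly the problem: the output looks just as authoritative either way.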