How Reliable Are Industry-Announced Piracy Statistics?
from the depends-on-your-definition-of-reliable dept
Eric Goldman sent in a link to a recent research paper that aimed to examine the reliability of industry-released reports on piracy. That sounded interesting, as we’ve spent plenty of posts picking apart why almost all of the industry’s released numbers are bogus. In particular, we’ve pointed out how incredibly bogus the BSA’s statistics are. So, it was somewhat surprising to see the study conclude that the BSA’s numbers were the most reliable when compared to those from other groups like the RIAA and MPAA. If anything, though, that really just suggests that the RIAA’s and MPAA’s stats are even more bogus (remember, things actually got so bad for the MPAA that it had to admit how bogus its own stats were). That seems likely, as the BSA is the most upfront about the methodology it uses.
However, reading through the actual report, it does little to vindicate the piracy numbers that the industry reports always trumpet. That’s because the report focuses on the rate of unauthorized use, rather than the cost or impact of that unauthorized use, even though the claimed impact is the key point to come out of these industry reports. The rate of unauthorized use on its own is fairly meaningless, so it doesn’t matter much who measures it most accurately. It’s the impact that matters. While reports used to do silly things like count every unauthorized copy as a lost sale, most have stopped that and now use a multiplier instead. Some have started using a questionable “ripple effect” that counts the same loss multiple times while ignoring the ripple effects in the other direction that benefit the industry. So, yes, perhaps the BSA is the best of a bad bunch, but even if its rate of unauthorized use is somewhat accurate, that has little bearing on the actual impact of those unauthorized copies.