Usually we don't see these types of stories until March Madness time, but the NY Times is writing about how much productivity is "lost" due to trying to keep up with the "data stream."
Apparently research firm Basex has come out with a gimmicky calculator to determine how much productivity is likely lost, and put out a silly, borderline ridiculous press release noting that Intel claims it worked with the research firm to determine that information overload costs workers "up to eight hours a week" in productivity. Seriously? Productivity is measured not in hours, but in output. If productivity were just about hours, we'd be looking for ways to get people to work more hours. But most people recognize that there are diminishing returns to making people work too much -- which is why they get time off to recharge their batteries.
If you're going to measure productivity this way, we could just as easily put out a study showing that sleeping costs a company approximately eight hours a day in lost worker productivity! Something must be done! While I have no doubt that information overload can be a cost to productivity, it's not going to be measured in hours. If I "waste" 20 hours a week dealing with information overload, but I'm able to extract information that makes me three times as productive the rest of the week, then that's a good trade-off. Do people actually pay companies for this sort of research?
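For what it's worth, the trade-off arithmetic in that hypothetical checks out; here's a quick back-of-envelope sketch (the 40-hour week, 20 "wasted" hours, and 3x multiplier are just the illustrative numbers from the paragraph above, not anything from Basex's study):

```python
# Back-of-envelope check: does "wasting" 20 hours a week on information
# overload still come out ahead if the extracted information triples
# productivity for the remaining hours?
HOURS_PER_WEEK = 40        # assumed standard work week
overload_hours = 20        # hours "lost" to keeping up with the data stream
multiplier = 3             # assumed productivity boost in the remaining hours

baseline_output = HOURS_PER_WEEK * 1                          # 40 units of work
with_overload = (HOURS_PER_WEEK - overload_hours) * multiplier  # 60 units

print(baseline_output, with_overload)  # 40 60
```

Measured in output rather than hours, the "overloaded" worker produces 60 units against a baseline of 40 -- which is exactly why counting lost hours alone tells you nothing about lost productivity.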