TheNextWeb points us to a story claiming that China apparently “hijacked” about 15% of all internet traffic for a period of about 18 minutes back in April. The source of the story is McAfee, who certainly has some incentives to play up such “threats,” so perhaps take it with a grain of salt.
Also, it should be noted that this isn’t new. Some folks appeared to spot this soon after it happened as it wasn’t even remotely covert — and also said that it appeared to be a “fat fingers” type of mistake based on the way it took place. Yet, to read the McAfee report, the assumption is that it must have been for nefarious reasons. Perhaps, but that wasn’t what it appeared to be initially.
Of course, McAfee points out that some of the traffic included US government and military traffic, but the US government said it was no big deal because its traffic was encrypted. However, McAfee claims the US government is still at risk and should be concerned. The explanation at “National Defense Magazine,” based on what McAfee said, seems slightly misleading:
“If China telecom intercepts that [encrypted message] and they are sitting on the middle of that, they can send you their public key with their public certificate and you will not know any better,” he said. The holder of this certificate has the capability to decrypt encrypted communication links, whether it’s web traffic, emails or instant messaging, Alperovitch said. “It is a flaw in the way the Internet operates,” said Joris Evers, director of worldwide public relations at McAfee.
It would be great if a security expert could chime in here, but this seems like a rather simplified version of how a man-in-the-middle attack on public key encryption would work. It’s possible that it could work in some specific instances, but this report makes it sound as if China could automatically read any encrypted message.
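To illustrate that skepticism, here is a minimal sketch (our own, not anything from McAfee’s report) of why simply sitting in the middle of the path and presenting your own key is not enough against a client that actually validates certificates; the hostname below is just a placeholder:

```python
# Minimal sketch: a TLS client that verifies certificates will not silently
# accept an interceptor's substituted key. "example.com" is only a placeholder.
import socket
import ssl

def peer_certificate_subject(hostname: str, port: int = 443):
    # create_default_context() enables certificate-chain and hostname checking.
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        # If someone in the middle of the path presents their own certificate,
        # this handshake raises ssl.SSLCertVerificationError -- unless the
        # attacker also holds a certificate for this hostname signed by a CA
        # the client already trusts, which is a much stronger requirement than
        # merely "sending you their public key."
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert().get("subject")

if __name__ == "__main__":
    print(peer_certificate_subject("example.com"))
```

In other words, the realistic worry is about clients that skip validation, or attackers who can obtain a certificate the client already trusts, not automatic decryption of everything passing by.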
Lots of broadband operators around the world have been talking about how their networks can’t keep up with traffic demands, so they’ll have to shift back to usage-based pricing. In particular, US mobile operators AT&T and Verizon have led the rhetoric, even as they continue to launch the unlimited plans they say are such a problem. The head of one broadband provider in the UK recently said a switch to usage-based pricing, and away from flat-rate plans, was inevitable as soon as one operator in a market made the switch. He dismissed the idea that operators would seek to differentiate by sticking with flat-rate plans, or by taking any pricing strategy other than usage-based plans, ignoring the fact that consumers have grown accustomed to flat-rate offerings, and that the lack of clarity in usage, billing and pricing that comes with per-unit plans is a big turnoff for them.

Already, we’re seeing some signs that the operator landscape may not be dominated by such groupthink, as T-Mobile and Leap Wireless have made changes to their mobile broadband plans that are out of step with many other operators. The two companies have changed the way the caps on some of their plans work: for instance, on T-Mobile, when users reach their 5GB monthly cap, they don’t get hit with overage fees; instead, the speed of their connection gets throttled, avoiding the uncertainty inherent in usage-based pricing. Perhaps it’s not a perfect situation, but it does show that some operators aren’t afraid to step out from the party line and explore different pricing models. It also builds some hope that when some providers do decide to regress to usage-based schemes, there will be some choices for consumers.
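To make that difference concrete, here’s a small hypothetical sketch comparing the two billing models. The only number taken from the post above is the 5GB cap; the monthly fee and per-GB overage charge are made-up figures for illustration, not any operator’s actual rates:

```python
# Hypothetical comparison of cap-then-throttle vs. usage-based overage billing.
# All prices are assumptions for illustration, not real operator rates.
CAP_GB = 5              # the 5GB monthly cap mentioned above
MONTHLY_FEE = 30.00     # assumed flat monthly fee
OVERAGE_PER_GB = 10.00  # assumed per-GB overage on a usage-based plan

def cap_then_throttle_bill(usage_gb: float) -> float:
    # The bill never changes; past the cap the user only loses speed.
    return MONTHLY_FEE

def usage_based_bill(usage_gb: float) -> float:
    # Every GB past the cap adds an unpredictable overage charge.
    overage_gb = max(0.0, usage_gb - CAP_GB)
    return MONTHLY_FEE + overage_gb * OVERAGE_PER_GB

if __name__ == "__main__":
    for gb in (3, 5, 8, 12):
        print(f"{gb}GB used: throttled plan ${cap_then_throttle_bill(gb):.2f}, "
              f"usage-based plan ${usage_based_bill(gb):.2f}")
```

The point is simply that the throttled plan’s worst case is a slower connection, while the usage-based plan’s worst case is a surprise on the bill.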
It looks like even Rupert Murdoch and his crew at News Corp. aren’t all that confident in how well his new paywalls for his newspapers in the UK are going to do. Apparently, News Corp. has convinced ABC, the media-owned organization that tracks traffic to newspaper sites in the UK, to stop publishing the numbers for Murdoch’s sites. ABC is still tracking the numbers, but it won’t publish them. Apparently, someone is a bit sensitive about people noticing how much traffic has shrunk. Maybe it’s starting to sink in that there really is competition online, and most of it is remaining free.
A decade into the entertainment industry’s massive game of whac-a-mole when it comes to file sharing sites, you would think that people would realize that blocking or banning any particular site doesn’t do a damn thing to slow the pace of file sharing around the globe. Instead, it does two things: (1) informs more people of the social norm of unauthorized file trading and (2) causes people to scatter to more sites, usually further underground and even more difficult to identify and stop. And, indeed, that appears to be the case in Italy. You may recall that the Italian Supreme Court recently decided that it was okay for a lower court to block The Pirate Bay (the lower court is now deciding what to do), but in response, it appears that users have already figured out how to scatter to other sites, as many other torrent sites have seen an influx of Italian users. Another mole whacked, and yet, more keep popping up. It’s difficult to see how this is a particularly good strategic policy.
With a new report coming out suggesting that Facebook sends more traffic to news sites than Google News, folks like Mathew Ingram are asking if Rupert Murdoch, the AP and others will be complaining about Facebook “stealing” their traffic and demanding to get paid. Given their reactions to Google, it does seem like a reasonable question. Or will that only happen when Facebook is making much more money from its other lines of business, and those news execs get jealous?
Someone anonymously submitted a decent writeup by John A. Byrne, the former editor-in-chief at BusinessWeek, who recently left (amid the shakeup that followed Bloomberg buying the magazine) to start a new media effort called C-Change Media. In this blog post, Byrne argues that media companies complaining about Google sending them traffic are biting the hand that feeds them. There’s really not much new in the writeup, which runs over the same ground we’ve covered for a few years now, but it’s a nice succinct summary of the situation:
Rupert Murdoch’s protestations aside, there is no doubt that Google is driving vast amounts of traffic to websites run by traditional media companies. In recent years, most of BusinessWeek.com’s growth came from search optimization and direct traffic. Up until only three years ago, the number one referring domain at BusinessWeek was always a portal until Google’s popularity replaced Yahoo Finance and MSN Money as the top referrer. Search–largely Google–now accounts for some 45% of the traffic at BW.com, up from less than 20% in 2006. That simple little box is driving vast amounts of advertising inventory (and therefore revenue) to the site. It’s a similar story everywhere else.
In the war between the traditional media brands and Google, the old cliche about biting the hand that feeds you is certainly in play. Some of the complaints from media can be attributed to sour grapes. Many incumbents resent that most efforts to find information on the Web no longer starts with a brand. It starts with Google which is largely brand agnostic. So, in effect, Google has become this massive transaction machine, and as everyone knows, transactions are the antithesis of relationships. If a brand wants a relationship with its audience, Google is getting in the way. It’s how Google was able to siphon nearly $22 billion last year in advertising from traditional media. And it’s the most obvious proof that media brands have diminished in value. People are more routinely turning to Google to get information, rather than a brand known for its expertise in a given area. They’ll google (yes, I’m using Google as a verb) leadership before going to The Wall Street Journal, Fortune, BusinessWeek, or Harvard Business Review. They’ll google President Clinton before going to The New York Times, Time, or Newsweek. Why? Because they trust Google to serve up unbiased results; because they want to see what is generally available out there and not tied to a brand, and because most brands no longer wield the power and influence they did years ago.
Instead of complaining about this and threatening to block Google from crawling a site, media companies would do well to step back and more fully understand what they really need to do: rebuild the relationships they have with their readers, viewers, users. To offset the massive transaction machine that Google is, media brands need to focus on restoring relationships with users. That’s why “user engagement” is not an idle phrase to throw around but is essential to making a brand successful online. Original content isn’t enough. Gee-whiz tech tricks aren’t enough. Neither is a fancy design or a search trap gimmick. You need an audience that is deeply and meaningfully engaged in the content of a site, so engaged in fact that many of those users become collaborators, and that requires tremendous amounts of work and editorial involvement with the audience.
Indeed. It’s the point we’ve been trying to make for ages. Newspapers were always in the community building business: they would bring together a community of folks and then sell their attention to advertisers. That was the business. But they thought they were in the news delivery business, and that confusion is leading them to do things that are anti-community and anti-relationship (registration walls, paywalls, etc.), which actually harm the value of the community. Thus, people are going elsewhere for community, whether it’s other media publications or social networking sites, and newspapers are lashing out at the wrong party: the one that sends them traffic.
Once again, Danny Sullivan is ripping to shreds the arguments being made by newspaper execs who are talking about how Google is a “parasite” on their content, despite sending tons of traffic. In this episode, Danny looks at the silly claim that visitors from Google are worthless, comparing the situation to a regular shopfront that welcomes casual browsers rather than requiring a fee just to get inside. He also goes on to look at how the Wall Street Journal (to which he is a subscriber) tries to monetize him online, and the only clear conclusion is that if News Corp. execs think that traffic from Google is worthless, it’s only because they’re making it worthless by doing an incredibly poor job of capitalizing on all that free traffic.
With newspaper folks who are unable to come up with a business model now making an effort to blame news aggregators, and big-time executives from media companies insisting that aggregators “steal” from them by sending them traffic, it’s time to brush away that myth. Take, for example, the excellent tech/social media blog ReadWriteWeb, which recently had an article about Eric Schmidt’s predictions for what the web will look like in five years. Soon afterwards, the Huffington Post “aggregated” that story and posted the opening on its own site with a link to the full article. For over a year now, we’ve been hearing mainstream publications complain about this sort of thing by the HuffPo, with the NYTimes digital boss Martin Nisenholtz complaining about this activity just last week.
But, of course, all this sort of activity does is bring in tons of traffic. The Huffington Post gets an awful lot of traffic, and a link from the site drives plenty of visitors. Marshall Kirkpatrick, from RWW, noted that the single HuffPo link drove 10,000 page views in just four hours, and basically begged HuffPo to “steal” more content like that. Indeed, it’s still really difficult to understand why mainstream publications are so up in arms over other sites helping to promote their articles and send them traffic, even to the point of looking to pass laws to stop such activity.
The topic of this post is sponsored by IBM. Read more about building a smarter planet on the IBM A Smarter Planet Blog. Of course, the content of this post consists entirely of the thoughts and opinions of the author and not of IBM.
Drivers in the United States are faced with a constant barrage of traffic signs, lights and signals, all meant to guide them safely through the sea of cars, pedestrians and bicycles without incident. Furthermore, US drivers are faced with an increasing array of laws that prohibit a multitude of things, like speaking on the cell phone while driving, even though studies have shown that such bans do not necessarily make roads safer. Red light cameras have been installed under the guise of making intersections “safer,” even though, like the cell phone bans, study after study has shown that these cameras do little more than provide a revenue stream for the cities that employ them. The problem with using signs and fines to enforce driving behavior is that they usually attack the symptoms of bad driving, rather than the bad driving itself. After all, playing video games while driving has always been a bad idea, even before it was explicitly forbidden by law. Similarly, teaching drivers to constantly monitor their speed and look out for red lights leaves them preoccupied with the wrong things; they should be watching the road and the traffic around them instead.
Instead of trying to micromanage every aspect of safe driving with signs, signals and laws, a better approach would be to utilize what should be the smartest part of the car: the driver. Just like a poorly designed door needs a sign to tell you whether to “push” or “pull” it, a poorly designed intersection needs to tell you when to stop or go. So, a better way to design an intersection seems counterintuitive: reduce the number of signs and signals. Back in 2003, in the Dutch town of Drachten, traffic engineer Hans Monderman replaced signal-controlled intersections with roundabouts that carry minimal signage. Moving through the intersection, there are almost no signals or signs to direct the traffic at all. As a result, drivers, pedestrians and cyclists pay more heed to the actual traffic patterns within the circle. So, rather than blindly following traffic signals, they proceed much more carefully and make eye contact with each other as they make their way through the intersection. Traffic flows better now; gridlock is a rarity. Most importantly, six years after the improvements, Drachten is safer: before the roundabout, there were over eight accidents per year; after the roundabout was installed and the traffic signs and signals removed, there were fewer than two. By making traffic seem more chaotic, it has actually been made safer.
Of course, any new approach has its doubters — after all, intersections in Asia are infamously chaotic:
However, upon closer inspection, this seemingly chaotic traffic pattern actually works surprisingly well. Pedestrians, cyclists, cars and buses all coexist in a kind of chaotic harmony. With the addition of one simple rule, like a roundabout, it could possibly work even better, but trying to control everything with traffic signals would definitely disrupt the flow. As our cities and towns get more and more congested, embracing this concept of “shared space” will become increasingly important. After all, traffic improvements aren’t just good for cars; they make cities more livable for pedestrians and cyclists too. Elsewhere, according to Wired, when the town of West Palm Beach converted “several wide thoroughfares into narrow two-way streets, traffic slowed so much that people felt it was safe to walk there. The increase in pedestrian traffic attracted new shops and apartment buildings.”
Recently, to celebrate 50 years of automobile safety improvements, the Insurance Institute for Highway Safety crashed a 2009 Chevy Malibu into a 1959 Chevy Bel Air. The results were impressive; the theoretical occupants of the 2009 vehicle would be able to walk away relatively unscathed, compared to their unfortunate counterparts in the Bel Air. However, although modern autos do a great job of protecting vehicle occupants in an accident, a smart city with well-designed traffic systems could help to avoid such accidents in the first place.
With so much effort put towards new laws banning mobile phone use while driving, and installing speed cameras and red light cameras, you would think that places that were quite aggressive in doing so would see a decrease in the number of auto injuries. After all, isn’t that the point of all of this? The UK has been particularly aggressive in such efforts, but as Jeff Nolan alerts us, a new report out in the UK suggests that (despite the government’s earlier claims) injury accidents have actually increased over time. The government has now been forced to admit that the stats it had been pumping out (which showed a decrease) were faulty, and that the real number of accidents may be as much as three times what it had been reporting. This only came about after the British Medical Journal looked at hospital admission records of those injured in car crashes and saw that the numbers went up as these new measures were put in place in the UK. We’re all for safer driving, but the claims that these measures lead to safer driving simply aren’t supported by the data.