The times are changing, that's for sure. DRM will never be perfect, but even the strongest hackers have clearly realized that the work required to crack something can be more than the return is worth.
It's also important to realize that the key here is the first 90-120 days. That is where a big percentage of full-price or near-full-price sales are made, and where the game / software maker recoups a big chunk of their investment. Later hacks are somewhat less important, and may in fact drive sales later in a game's cycle as the price comes down to a point where people who tried the game as pirates might buy it.
The other part, of course, is that many games are no longer complete as standalone products. Part of their value is in playing with others, either as network games or by downloading and adding game components. Some games run standalone but are not the full experience. Even hacking the product won't help you if you are unable to register and play it online as it was meant to be played. That greatly diminishes the social value of hacking a game: you cannot get the pride (and respect) of giving people something for nothing.
My personal feeling is that, more and more, gaming companies are going to move to a "razor / razor blade" model of operation, where the game you buy is basic (complete in its own way, but not the totality of the gaming experience) and they will continue to offer upgrades, new levels, and other gaming experiences to users who have a full copy and are registered with them. Our almost always-on universe makes this an ever more likely scenario going forward, and one that will make piracy all but moot - they will essentially be selling what is not easily hacked, or cannot be hacked at all.
Another interesting story, but one that almost entirely (but not quite) ignores the real issues of broadband in the US:
Population density.
Where there is enough population (urban) the US hits 96% of homes with at least one 25/3 offering. If you used only urban as the measurement, the US would be in the top 10, if not the top 5.
The problems lie in rural areas. Americans want to live far outside the city, but expect city-quality services to follow them. DSL (and even cable) have some pretty serious limitations when it comes to distance, so serving rural areas is expensive, and the only way to recoup the expense is over a very, very long run. Consumers are unwilling (or unable) to pay what it would really cost to provide them the service.
These areas are unlikely to see great broadband service until there is a technology game changer, or until the government spends a whole lot of money to hook these people up. With fiber still costing thousands of dollars per mile and about $1,000 per hookup, there isn't enough income in the game to interest the incumbents in spending.
For me, the real solution will come when someone has a format that allows fiber to be run to these rural homes, with all companies able to share the line to provide services - telephone, cable, internet, whatever - and with more than one company on the line at a time. Most of the muni-fiber solutions are internet-only and turn being an ISP into a municipal service, which can only end badly. There has to be a better solution.
With too many Americans living away from urban centers, it's unlikely you will see broadband penetration move very much in the next few years. It's your own choices that are causing the problems...
I would say that 1% of it likely would have led to a dead end about 99% sooner. The concept of waiting three years to get next to nothing is silly. Asking such a wide question is just inviting them to take as long as they want.
As for the cost, I think a direct query - "documents related to the payments made for GPS devices" - might have given a quicker answer as well. Tossing it all in one big pile and hoping to get everything is pretty much a losing battle from the word go.
I can assure you troll that I am not distressed in the least. Certainly a lot less than you are about me posting here.
It very much is fishing. 10 years of logs and other information related to a few thousand devices which may have been used more than once in a year.
Depending on the type of logging (and no, it couldn't be computer-only; some of it would have to be on paper), a log might have a single device and a single installation per page, or might only list a couple of dozen devices per page (say 24 lines on normal paper).
Also, you have to remember that these devices may not have been released and tracked from a single office, but from regional offices all over the US. It's also possible that the log isn't kept as a consolidated list, but rather logged as evidence in individual investigations. That would require pulling each physical investigative file to obtain the real information.
That said, asking for that much general information (rather than being more narrow and specific) leads to the problems that happened here: too many "responsive documents", and a long period of time to obtain them, assemble them, and review / redact them before release. If the request had been more narrow (say a single district office, a shorter period of time, etc.) they likely would have gotten a much quicker response. They may still not have liked the documents, and they may still have gotten a long exemption list, but they would have gotten the answer much faster.
Wide-scope requests without a specific case or situation in mind are fishing. They are looking for, and hoping to find, some sort of anomaly that they could use to embarrass the police (again).
I think there are two sides to the problem: Officials / departments who redact way too much, and people, individuals, and organizations who use the FOIA not to obtain specific information, but rather as the basis of fishing expeditions.
The scale and scope of this guy's request is enormous: 10 years of almost anything that might be remotely responsive, all of which has to be reviewed, redacted as needed, and so on. It's an insane request, and totally a fishing expedition. He wasn't trying to prove a specific circumstance; he was just fishing, hoping there might be some juicy comment he could use to open a can of legal worms.
Abuse of the FOIA diminishes its use for others. The 3 years wasted on this one could have been put to much better use responding to much more precise and refined queries.
Does it take any longer than 7 minutes to decide that (a) the plaintiff appears to have cause (i.e., they have a patent and the products look very similar in nature) and (b) the defendant has been contacted by lawyer letter and sent a cease and desist (i.e., been served notice)?
Again, it's not the judge making a detailed ruling on the validity of the patent or declaring the defendant to be infringing. It's about injunctive relief from a situation that could not be undone.
"why did Judge Du order service of the Summons and Complaint if defendant had already been served? "
I used a word incorrectly, Mr Technical. They had contacted the Chinese via lawyer letter, a cease and desist, etc. The judge ruled on the injunction at the moment the suit was filed, which is why the judge ordered service.
Put simply, the Chinese company wasn't blindsided - they were just hoping they could do the trade show, make a bunch of sales, and go home to produce the product without having to deal with the patent issue. Once back in China, they could produce all they want under the protection of their weak legal system, and flood the market. Once the contacts are made at the trade show and orders secured, it would be very hard for the patent holder to deal with each and every infringing seller, effectively rendering the patent moot. The Chinese company could also flood other markets with the product, essentially killing the value of holding a valid patent.
I guess that explains why Techdirt is so upset... patents suck, right?
Another day, another angry Karl post to "enjoy".
My take on this is that it's the difference between Techdirt and the real world. Here you get lots of talk, and little real action. Out there you get action, or at least attempts to move forward.
Is AT&T the best partner? There are probably better, but AT&T is around for the long haul and isn't going to dry up like many of the venture capital backed companies that come and go like the breeze.
Moreover, the concept itself is actually pretty important and well worth having. Keeping an eye on resources and assets all over the city is useful, and could perhaps lead to either savings in the future or at least marginally better services. The more information that can be gathered, the better the choices that can be made in the future.
We have the technology. Applying it and making it all work together may be a huge step forward for the way we all live.
Doing, even if it's not the best ever solution, is way better than just chatting about the best solution - or slagging off those who dare to try.
No, in the space of 7 minutes, the judge heard that the company had been contacted, served, whatever, and had continued anyway - and failed to be present in court. The plaintiff makes the case, asks for an injunction to stop what they feel is harm, and in absence of any other information, the judge rules on what they have at hand and grants the injunction.
He didn't rule that it's infringing at all. He only ruled that the request for an injunction was valid, and the failure of the Chinese company to follow this injunction led to the seizure of the products at the trade show.
There is no ruling that says that it's infringing.
"If Comcast sells you a 40 megabit connection, you have every right to expect 40 megabits per second, 24/7/365. If they can't provide that to you, then they lied.
"
The electrical wires to your house can usually support about 200 amps (US / Canadian standards). Having that capacity does not mean that you can run endlessly at exactly 200 amps. You get a certain amount, and then you are billed for overage (or you pay for every unit used).
Size of the pipe != amount you can have.
" The lawyer and judge might want to review their security, including that of their families."
That is perhaps the saddest comment ever on Techdirt. It's insanely sad to think that court officers, doing their job as prescribed by the law, should have to worry about angry plaintiffs coming to get them.
"Perhaps the better idea would be to have rates for internet connections quoted as bandwidth first, connection speed second.
They are the same."
No, they are not. You are making the common mistake of assuming that the speed of your connection entitles you to every possible byte you could jam through there. Reality says otherwise.
Thus, the speed of your connection is like the size of the pipe, and bandwidth is like water. If you are limited to 300 gallons of water each month, you can run the tap slowly all month, run it at top speed 5 minutes a day, or save it up and run it full speed for a single day each month. The size of your pipe does not dictate how much water you get; it only defines the maximum amount of water you can get at any one time.
"Interesting, we don't have caps here, my speed multiplied by more than 500 in the last 20 years and yet I pay less today than 20 years ago. Care to explain how is it possible in your IPS apocalypse world? You can't. But I"ll wait for more gibberish."
First, thanks for the low quality personal attack. Keeping it classy, as always.
My speed has multiplied in 35 years by... well, damn, how do you even calculate the jump from 45.45 baud Baudot to a gig? Many, many times over, and it costs less. How is it possible? It's called advancing technology. We get better and better at things over time (except perhaps your insults), and as a result we can do more things. Communication technology has always been limited by what we can do with a wire (twisted pair or cable), and as we have made advancements, speed has grown. Our ability to make microchips and to assemble modems and other equipment more cheaply benefits us all.
All of that also means that operating a pipe is cheaper than it was. The equipment is cheaper, moves more data, and lasts longer. The cost of fiber has dropped, and the quality of connections made on less than perfect fiber installations (last mile residential) has made it possible to get gigabit speeds set up by the "cable guy" type installers that do most internet setups.
Parts got cheaper, they got better, and so you get a better, cheaper internet connection.
However, my point was more this: if you really want 100% connectivity, with 100% end-to-end dedicated networking and infrastructure all the way to the low-traffic peering point, then you need to know what it will cost you. That would remove any sharing / overselling and would require a major upgrade from the ISPs to handle that amount of traffic properly and correctly.
" You are paying for a goddamn pipe size, not what goes through. And the extra cost for using it at 100% load, 100% of the time is virtually nonexistent at best and negligible at worst. Either you are being trollish dishonest or you are incredibly ignorant. I'd go for the first one."
Not true. You are paying for (a) a connection speed, and (b) an amount of data you can move at any speed from nothing up to the size of the pipe. Even if you have an unlimited connection, you are limited by other parts of the network and every hop between you and your destination.
For your example, if all of your neighbors on the same ISP tried to download 200 gigs at the same time as you, you would likely see a slowdown. It's incredibly unlikely that your ISP has provisioned upstream connections to support that much total connectivity to your area. You very likely do get throttled, but not enough for it to show as anything more than network congestion.
"Sure, okay, there's a fixed overhead to having a link up. But that overhead does not change with usage. If a circuit costs $X to maintain, it doesn't cost 0.5X when it's lightly loaded and 2X when it's saturated. It's just X. Usage doesn't impose any additional costs. (there's a tiny power delta, but it's so small it would be quite difficult to even measure.) "
The problem is the pipe is actually quite limited when you think about it. Typically you are looking at 1 or 10 gig connections, not faster. That may not seem like much, but if each customer has 25 megs, you get 40 to a gig. So do the math: you have to spread the cost of each connection over a certain number of customers. Like it or not, there is a limit to connectivity.
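That arithmetic is easy to sketch. Here is a quick back-of-the-envelope check of the "40 to a gig" figure (the link and plan sizes are the ones in the comment above; the uplink cost is a made-up number purely for illustration):

```python
# How many full-rate customers fit on one uplink with no oversubscription.
UPLINK_MBPS = 1_000   # a 1 gig upstream pipe
PLAN_MBPS = 25        # each customer sold a 25 meg plan

customers_per_uplink = UPLINK_MBPS // PLAN_MBPS
print(customers_per_uplink)  # -> 40, matching the "40 to a gig" figure

# Spreading a hypothetical uplink cost over those customers:
UPLINK_COST_PER_MONTH = 2_000  # assumed figure, purely illustrative
print(UPLINK_COST_PER_MONTH / customers_per_uplink)  # -> 50.0 per customer
```

With any oversubscription ratio above 1, more customers share the same uplink and the per-customer share of the cost drops accordingly.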
"Whatever your overhead is with a pipe, it doesn't increase based on usage. You build your circuit, buy your router(s), and potentially buy a commit from a transit Internet provider, and then your overhead stays the same whether it's at 100% or 0%. "
Yes, and at 101%, you are buying a new setup. At much over 85 to 90% utilization, you are facing slowdowns for your clients.
"Only peak demand matters. Offpeak usage does not, and bandwidth caps are completely unrelated. "
Part of the issue is that, just like highways, the internet faces its own rush hour, typically starting about when the kiddies get home from school and running through the night until about 11 or midnight local time. You know, the time everyone wants to watch Netflix, download new torrents, and do full HD video chats with that lovely stranger they met the other day. However they use it, they do use it.
Now, in the same manner that you cannot build highways to absorb all of the rush-hour traffic no matter what, it's not realistic to assume that an ISP will buy bandwidth and connectivity to a level that supports 100% use by all of their clients at any peak moment. That would be an incredibly expensive proposition. It would also require that their own internal network, end to end, was able to deliver max bandwidth to all customers at all times. That would mean fewer clients per node, more nodes, more node costs... you know how it goes! So they work in multiples, based on the concept that not all users need max bandwidth at the same time. The problems really only occur during peak usage, which also extends longer these days with people running torrent clients all day and such, which can use up a fair bit of bandwidth.
In the end, a cap isn't really a speed limit, as much as it's something that explains a difference between the size of the "series of tubes" and how long you can hold that tube open for each month.
Perhaps the better idea would be to have rates for internet connections quoted as bandwidth first, connection speed second. People mostly focus on the high speed and then somehow magically think that their 25 meg connection entitles them to move 10 gigs of data an hour, every hour of every day. What it does is allow you to move UP to that speed (depending on where and what you connect to) until you have moved your cap's worth of data.
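To put rough numbers on that "10 gigs an hour" figure - 25 Mbps is the plan speed from the comment; everything else is straight unit conversion, using decimal gigabytes:

```python
# Maximum data a 25 Mbps link can move if run flat out, nonstop.
PLAN_MBPS = 25
BITS_PER_GB = 8e9  # decimal gigabyte (8 billion bits)

gb_per_hour = PLAN_MBPS * 1e6 * 3600 / BITS_PER_GB
gb_per_month = PLAN_MBPS * 1e6 * 86400 * 30 / BITS_PER_GB

print(f"{gb_per_hour:.2f} GB/hour")    # 11.25 GB/hour - close to "10 gigs an hour"
print(f"{gb_per_month:.0f} GB/month")  # 8100 GB/month at full tilt
```

So a 25 meg line run flat out could move roughly 8 TB a month, which is why a cap binds long before the connection speed does.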
Caps are not positive things, don't get me wrong. They are, however, an integral part of keeping the costs related to the internet in the realm of what consumers are willing to pay. If your ISP uses 10 as its ratio (overselling their connectivity 10 times), would you be willing to pay 10 times the price for the full connection? Twice the price?
"The thing about network connections is this: they cost a lot to build, and have some fixed overhead per month, but usage of a connection costs basically nothing. Once a pipe is built, the monthly cost is the same whether it's fully used or not used at all. (There might be a tiny power delta for heavy use, but it's so small that it disappears in the noise.)"
This is absolutely false. It seems like it would be true on the surface, but reality kicks in pretty quickly and you start to realize it's just not true.
There are two problems. First, all pipes physically have two ends, and it costs money to be in both of those locations. Even if the telco or ISP owns the spaces, there are costs related to being in that space. Even for a single rack rented in a decent data center, you are looking at possibly thousands by the time the rack is obtained and the space used - not to mention any rental / space used for the cabling in and out of the building.
You have to power it. You have to cool it. You have to maintain it (and yes, things do break; they don't live forever). So you have minute-by-minute costs (power, cooling), monthly costs (rent), and maintenance (say, replacing equipment every 5 years on average). Take all of those things, calculate the cost per second, figure out the bandwidth, and you determine the basic cost of moving data from A to B.
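As a sketch of that costing exercise - every dollar figure here is an assumption for illustration; only the roughly-5-year replacement cycle comes from the text:

```python
# Amortize a rack's fixed monthly costs into a cost per megabit of capacity.
RENT_PER_MONTH = 1_000           # assumed rack rental
POWER_COOLING_PER_MONTH = 400    # assumed power + cooling share
EQUIPMENT_COST = 30_000          # assumed routers / switches in the rack
EQUIPMENT_LIFE_MONTHS = 5 * 12   # replaced about every 5 years, per the text

monthly_fixed = (RENT_PER_MONTH + POWER_COOLING_PER_MONTH
                 + EQUIPMENT_COST / EQUIPMENT_LIFE_MONTHS)

CAPACITY_MBPS = 10_000           # a 10 gig pipe terminating at this rack

print(f"${monthly_fixed:.2f}/month fixed")                      # $1900.00/month
print(f"${monthly_fixed / CAPACITY_MBPS:.4f} per Mbps-month")   # $0.1900
```

The point is only the shape of the calculation: fixed costs exist whether the pipe is idle or saturated, but they still have to be recovered from the traffic that flows through it.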
Second, the pipe has to connect to others to be worth anything. You can't just have a blank pipe with nothing connected at each end. Data centers generally have interconnect fees, so even if you are doing free peering, you are still paying for them to allow you to route a fiber and connect.
Each additional connection, each additional pipe has costs. There is no free bandwidth lunch.
So if a residential ISP sells connectivity at 100 to 1 (100 one-meg connections per 1 meg of outside connectivity) and suddenly everyone starts using 100% of their bandwidth, the ISP is obliged to buy more connectivity - much more.
Peak provisioning is pretty much too expensive to pull off. Transit costs anywhere from $5 to $10 per megabit per month (in the US and Europe, anyway). So for your 25 meg home connection, it would cost (real price) $125 to $250 a month for a full, 100% dedicated connection. In order to keep prices more in line, and because normal web surfing / mail / Facebook / whatever generally doesn't use all your connection, they oversell. Good ISPs probably oversell 10 to 1, the bad ones 100 to 1 or more. In a 10 to 1 situation, your 25 meg connection costs $12.50 to $25 a month just for transit - that is money going from your ISP directly to their providers.
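Those numbers can be checked directly. A minimal sketch, using the $5-$10 per megabit transit prices and the oversell ratios from the paragraph above (illustrative figures, not current market rates):

```python
# Monthly transit cost attributable to one subscriber at a given oversell ratio.
def transit_cost(plan_mbps, price_per_mbps, oversell_ratio):
    return plan_mbps * price_per_mbps / oversell_ratio

for ratio in (1, 10, 100):
    low = transit_cost(25, 5, ratio)    # $5/Mbps transit
    high = transit_cost(25, 10, ratio)  # $10/Mbps transit
    print(f"{ratio:>3}:1 oversell -> ${low:.2f} to ${high:.2f} per month")
```

At 1:1 that is $125-$250 per subscriber per month, and at 10:1 it falls to $12.50-$25, matching the figures above.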
The real rent collectors are at the interconnects and transit companies. It is why ISPs spend so much time, effort, and money to build out their own networks and connections where possible, because it's much cheaper than buying transit.
Brent, you realize of course that for what you pay Teksavvy, Bell ends up making the lion's share of the money, provides the infrastructure, and still really does control your traffic. You can't get away from them. They set the rules and network operations that Teksavvy offers, and in turn take most of the money.
So yes, you can stick up the antenna... but it won't help you get the NHL games or the like which are now all but exclusively on cable.
If you are getting streaming from outside of Canada, beware - the CRTC will work to shut down non-canadian sources in the same manner they worked to stop US sat companies from having their equipment in Canada. It doesn't always stop it, but it just keeps getting harder and harder to get it.
Karl, with due respect - you seem to miss it entirely. You cannot cut the cable in Canada entirely, because if you move away from cable and choose a streaming alternative (like Crave TV) you are buying it from the same people! Even if you get it from an independent supplier, they are getting their programming from the big three or four and paying them for it.
They own it from end to end - move to pure internet, and they own you there too. Drop wired in internet and go wireless and they have you there too. You cannot get away from them, period, at all.
It's so different from the US market as to be beyond compare.
Understand that a la carte (which has been around in Quebec for a number of years already) really doesn't have as huge an impact as all that. You pay for fewer channels, but you tend to pay more for them. Net, it's not a truly big deal.
Oh, and as for being "terrified of evolution", you need to pay attention: a la carte programming was not initially proposed by the CRTC - rather it's a pricing model started in Quebec by Videotron, one of those companies you think is scared to evolve. Again, without understanding the market or the history, you pretty much make mistake after mistake.
An absolutely atrocious article. Sorry Karl, but you miss the boat all over the place here.
First and foremost, you forgot to explain the landscape. There are only a handful of media, cable, and telephone suppliers in Canada - and they generally own all three (four if you count wireless). Rogers, Bell, Shaw, and Quebecor (Videotron) are all vertical market players. Bell is the biggest, as the phone company, with sat and IPTV, internet, wireless, radio stations, TV stations, and ownership of the largest TV network in Canada. They also own many of the cable channels.
Rogers is similar, as is Shaw. Quebecor is even more outrageous, owning much of the french market in TV, radio, cable, wireless, newspapers, and the main owner is the leader of the official opposition in the Quebec National Assembly.
The reason so many cable channels have been created in Canada has to do with a twofold issue: the requirement for Canadian ownership, and the ratios of Canadian to "import" channels that cable, IPTV, and sat TV companies can offer. In order to have as many US services available as possible, they basically created a whole range of channels similar to US offerings, and packaged them up in the required ratios. As a result, there are many channels (such as Bell's Much, formerly Much Music, an MTV clone) that have 10 million subscribers but have never had more than 1 million tuned in, and that is for a single awards program each year. The rest of the time, their ratings are effectively non-existent.
Now the CRTC is changing the rules and the requirements, so they will have a la carte. However, since the US services are popular and the mix requirement remains in place, it's unlikely that bills will drop that much in the long run.
The Canadian market is interesting, but is not a free market. So using it to try to explain what might happen in the US is misleading at best.
Sorry Karl, you need to bone up on the subject matter.
"If 'We can't buy the rights to this, the creator might die' is a serious consideration to whether or not a company is willing to invest in doing so, rather than 'We want the rights to this because we think we can make a profit selling it', then the priorities of the company are completely shot."
The problem here is that they are not mutually exclusive statements. They want to make a profit. But the first year or two may not be profitable, and the real profit may be made on the long tail of the work where every copy sold is nearly full profit. So if the copyright disappeared on the second day of the contract because of a fatal car crash or whatever, then every other company could quickly move to bring the same book to market and deny them the time required to be profitable.
The end result might be different deals with authors that would not be as financially beneficial to the writer, to protect profits. For that matter, it might lead to economically motivated murders, where writers are killed so other companies can bring popular works to market as they have "recently fallen into the public domain".
"especially if having copyright end at the death of the creator lead to even more stuff being created, as I imagine would be the case."
There is little indication that this would offset the opportunity cost losses when authors can't write full time, and are instead required to take other jobs to pay their bills. While many authors struggle to get their first successful work out, after that point the income helps them to be able to spend more time writing and less time doing other things to make money.
"If you see copyright as primarily an incentive to monetize your work"
I don't. You are missing the point entirely. Copyright defines ownership. When something can be owned, it can be bought, sold, transferred, loaned, or kept to one's self.
Ownership of physical property is no different. For some, that ownership implies economic value; for others it brings anything from safety to pride. Economic value is only one of the things that comes from legally defined possession.
"This is probably the kind of worldview publishers operate under."
They are primarily economic players, so yes, they are much more likely to see things in economic terms. That someone can create a work and reliably be able to sell rights (or the entire work) with certain guarantees of ownership under the law is pretty important. It's not the only part, but for publishers, it's the important part. It's also important for many writers who write not just for the love of writing, but for economic reasons. We likely would not have all the works of Philip K. Dick if it were not for economic reasons, as an example.
RIP to the Thin White Duke
David Bowie was a powerful performer and a grand speaker, someone not afraid to make bold predictions. He was also very smart, generally covering his ass financially on the other side to make sure that net, he always came out ahead.
While predicting the death of copyright, he also issued the famous Bowie Bonds, a securitized bond backed by his (gasp) copyrighted work. So while talking about the death of copyright, he essentially sold off his copyrights for a then-staggering $50-plus million as an investment. This could only be done if he believed, and could convince others of, the long-term value of his back catalog.
Much like Trent Reznor, David Bowie played with many of the potential routes and concepts that the internet brings and can offer. I think both realized along the way that certain things just don't work as well online. Getting there early means that where they moved to next is probably a better indication of the future than their words of 10 or 20 years before.
David Bowie saw the future, took the best of it, and left the rest behind. Now he is gone and the world is a poorer place for his loss. RIP, say hi to Lemmy for us :)