Glyn Moody’s Techdirt Profile


About Glyn Moody, Techdirt Insider

Posted on Techdirt - 2 October 2015 @ 8:23am

Argentina Plans To Increase Copyright In Photos From 20 Years To Life Plus 70 Years, Devastating Wikipedia

from the cultural-memory-loss dept

As Techdirt has pointed out, copyright extensions are bad enough, but retroactive ones are even worse: the work has already been created, so providing additional incentives makes no sense, even accepting the dubious idea that artists think about copyright terms before setting to work. Moreover, copyright extensions are a real kind of copyright theft -- specifically, stealing from the public domain. If you think that is just rhetoric, it's worth looking at what is happening in Argentina.

As a post on the Wikimedia Argentina blog explains (original in Spanish), a proposed law would extend the copyright in photos from 25 years after an image was taken (or 20 years from first publication) to life plus 70 years -- a vast extension that would mean that most photos taken in the 20th century would still be in copyright. That's a big problem for Wikipedia in Argentina, since it is using photographs that have passed into the public domain under existing legislation. If the new law is passed in its current form, large numbers of photos would have to be removed:

Wikipedia would have to erase nearly all the photos of twentieth century history: the mere exposure without consent of the new rightsholders would be a crime. Not only Wikipedia: even the General Archive of the [Argentinian] Nation would become illegal and 40 million Argentines would be left without access to their historical memory.
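To see how far back the change would reach, it helps to run the arithmetic for a single, hypothetical photo. The sketch below (Python, with invented dates, applying just the two terms described above) shows the gap:

    # Illustrative arithmetic only; hypothetical dates, terms as described above.

    def expiry_current(year_taken):
        # Existing rule cited above: 25 years after the image was taken.
        return year_taken + 25

    def expiry_proposed(year_photographer_died):
        # Proposed rule: life of the photographer plus 70 years.
        return year_photographer_died + 70

    # A photo taken in 1950 by a photographer who died in 1990:
    print(expiry_current(1950))       # 1975 -- public domain for four decades already
    print(expiry_proposed(1990))      # 2060 -- pulled back into copyright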
It's a great but sad example of how copyright can destroy culture on a massive scale. Let's hope that law doesn't pass.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 1 October 2015 @ 11:18pm

Chip And PIN Meets Facial Recognition: Chipping Away At Privacy, Pinning You Down In A Database

from the are-you-sure-you've-thought-this-through? dept

As part of President Obama's BuySecure initiative, US merchants and the public are being encouraged to adopt the Chip and PIN technology for credit, debit, and other payment cards. As the announcement in October last year noted, these Chip and PIN cards have been used for some years in other parts of the world, notably Europe and Canada. For all the technology's vaunted security, there are inevitably still weaknesses that can be exploited, as with any system. That was true five years ago, and it's still true now, as shown by this story on the BBC Web site about one company's idea for reducing Chip and PIN fraud:

One of the biggest payments processing companies has revealed it is developing a chip-and-pin terminal that includes facial recognition technology.

Worldpay's prototype automatically takes a photo of a shop customer's face the first time they use it and then references the image to verify their identity on subsequent transactions.
The company admits that the system is unlikely to be perfect:
Worldpay is not suggesting that shoppers be blocked from making payments if its computer system failed to make a match.

Rather, it suggests that tills would display an "authorisation needed" alert, prompting shop staff to request an additional ID, such as a driving licence.
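Stripped of the image-recognition details, the enrol-on-first-use flow described above comes down to a simple branch at the till. The following is a minimal, purely illustrative sketch -- the names and threshold are invented, and it assumes some face-comparison function exists behind similarity():

    MATCH_THRESHOLD = 0.8        # assumed similarity cut-off, purely illustrative
    face_db = {}                 # card_id -> stored face image (or embedding)

    def verify_at_till(card_id, captured_face, similarity):
        # similarity(a, b) is an assumed face-comparison function returning 0..1.
        if card_id not in face_db:
            face_db[card_id] = captured_face      # first use: enrol automatically
            return "approved (face enrolled)"
        if similarity(face_db[card_id], captured_face) >= MATCH_THRESHOLD:
            return "approved"
        # No match: the payment is not blocked, just flagged for a manual ID check.
        return "authorisation needed -- ask for additional ID"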
It's only an experimental idea at present, but Worldpay says it could roll it out to the 400,000 retailers that use its system within five years if there's sufficient interest. That would obviously create rather a large collection of facial biometrics, which raises questions of how they would be stored. But don't worry, Worldpay has got that sorted:
The firm says it would store the captured images in a "secure" central database.
Well, that shouldn't be a problem, then -- provided you remember to change your face when that database gets broken into….

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 30 September 2015 @ 6:12am

Canada Wants To Cut Price Of 'World's Most Expensive Drug'; US Manufacturer Sues To Stop It

from the that's-gratitude? dept

The Turing Pharma case has received widespread coverage, but as Techdirt readers know, it's hardly a unique example of the pharmaceutical industry taking advantage of a flawed system. In fact, over in Canada, there's another interesting example of the industry's sense of entitlement, reported here by CBC News:

A U.S. drug company is taking the Canadian government to court for its attempt to lower the price of what has been called the world's most expensive drug.

Alexion Pharmaceuticals has filed a motion in Federal Court, arguing that Canada's drug price watchdog has no authority to force the company to lower its price for Soliris.
According to the article, a 12-month course of Soliris costs about $520,000 in Canada at today's exchange rates, and a mere $500,000 in the US.
While Soliris is not a cure, it can stop the assault [by two rare blood diseases] on the body's tissues and organs. Since patients typically need to take the medication indefinitely, it can cost tens of millions of dollars over a lifetime.
Understandably, some Canadian regions are struggling to provide the drug for all the people who need it:
Due to the high cost, some patients in Canada can't get the drug. Only some provinces will cover the cost of treatment and there are different criteria to qualify for coverage in various jurisdictions.
The pharmaceutical industry likes to argue that, though high, such prices are necessary in order for companies to recoup the research and development costs of new drugs. According to the CBC article, Soliris has already brought Alexion around $4.5 billion in revenues, which ought to be enough to cover any such outlay, not least because of the following important fact:
In case of Soliris, most of the research and development was done by university researchers working in academic laboratories supported by public funds.

"I think the public science is well over 80 or 90 per cent of the work," said Sachdev Sidhu, a University of Toronto scientist who is also in the business of drug development.
That means that Alexion had to spend less than usual to develop and bring the drug to market. It also means that, once more, a pharma company gets to build on the work funded by the public, but without any sense of obligation to pay that back in the form of lower prices -- on the contrary. Perhaps most damagingly, the lawsuit brought by Alexion to defend its exorbitant pricing could have very serious negative consequences for everyone in Canada:
A University of Ottawa professor who specializes in health law said he was shocked that Alexion would challenge Canada's authority to regulate drug prices. If Alexion's case is successful, it could end Ottawa's ability to control the cost of patented drugs, Amir Attaran told CBC News.

"This is the single greatest threat to pricing of drugs in Canada ever," he said Thursday.
In the pursuit of high profit margins, the world's dysfunctional drug industry continues to ride roughshod over everything in its path, whether a patient trying to survive a rare chronic disease, or an entire nation trying to provide decent medical treatment to as many of its population as possible.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 29 September 2015 @ 3:20am

Colombia Shows How Not To Regulate Drones

from the pedestrian-waving-a-red-flag dept

As a growing number of Techdirt posts on the subject indicate, drones are fast entering the mainstream. But as they become more common, and as more mishaps involving them inevitably occur, so the calls for government regulation grow louder. Fortunately for the rest of us, Colombia has stepped forward to show us some of the things not to do, as a post on PetaPixel makes clear. The new drone regulations are written in Spanish, not unreasonably, but Pablo Castro has put together a useful summary, singling out four key aspects that give a feel for the general approach.

The first is that drone operators are required to take a training course. Fair enough, you might think, but there are a couple of problems with the idea:

[the course] must be taken at an aeronautics school authorised by the Aeronáutica Civil, and to date none has been authorised to teach such courses.

Also, should they be authorised, several of these schools have confirmed these courses will cost at least $5,000 USD. Oh, and by law they must be renewed every six months.
Then, of course there's the mandatory insurance. Again, that would be a reasonable requirement were it not for the following:
no Colombian insurance company offers such coverage for drones at the time of writing this article.
Point three is more subtle:
we must not fly within a 5km radius of any airport. However, we must make sure we establish radio communication with the nearest airport control tower before and during every flight.

Yes, that’s correct: all drone operators must own radios with ranges upwards of 5km and capable of the frequencies airports use, the cheapest of which cost more than a DJI Phantom and require a license to operate legally.
The final aspect is that not only must drone operators submit flight plans to the relevant authorities 15 days before they carry them out, but they must also justify why the job can't be done by conventional aircraft. As Castro remarks:
This last point contains an obvious hint as to why the Aeronáutica Civil has taken such a drastic stance on drones. It turns out this entity is tightly connected to the handful of aviation companies that used to make thousands of dollars on every flight involving aerial photography, videography, and the like, but with the widespread use of drones, their precious cash cow is dying. So unsurprisingly, corruption is the real motivation behind this new law, not the safety of our citizens.
And if corruption is not an issue, lobbying most certainly will be. As drones become more common and more capable, we can doubtless expect to hear calls for regulations restricting them in various ways. The justification may be "safety" or "national security", but in many cases, the real reason will be the fears of the traditional aviation companies that they are about to be replaced by much-cheaper drone-based services.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 28 September 2015 @ 3:47am

GCHQ's Karma Police: Tracking And Profiling Every Web User, Every Website

from the this-is-what-you'll-get,-when-you-mess-with-us dept

One of the very first revelations from the Snowden leaks was a GCHQ program modestly entitled "Mastering the Internet." It was actually quite a good name, since it involved spying on vast swathes of the world's online activity by tapping into the many fiber optic cables carrying Internet traffic that entered and left the UK. The scale of the operation was colossal: the original Guardian article spoke of a theoretical intake of 21 petabytes every day. As the Guardian put it:

For the 2 billion users of the world wide web, Tempora represents a window on to their everyday lives, sucking up every form of communication from the fibre-optic cables that ring the world.
But the big question was: what exactly did GCHQ do with that huge amount of information? Two years later, we finally know, thanks to a new article in The Intercept, which provides details of another major GCHQ program called "Karma Police" -- the name of a song by Radiohead, with the repeated line "This is what you'll get, when you mess with us". A GCHQ document obtained by Snowden indicates that Karma Police goes back some years -- at least to 2008. It provides the following summary of the project's aims:
KARMA POLICE aims to correlate every user visible to passive SIGINT [signals intelligence] with every website they visit, hence providing either (a) a web browsing profile for every visible user on the internet, or (b) a user profile for every visible website on the internet.
Profiling every (visible) user, and every (visible) website seems insanely ambitious, especially back in 2008 when computer speeds and storage capacities were far lower than today. But the information that emerges from the new documents published by The Intercept suggests GCHQ really meant it -- and probably achieved it.
As of 2012, GCHQ was storing about 50 billion metadata records about online communications and Web browsing activity every day, with plans in place to boost capacity to 100 billion daily by the end of that year. The agency, under cover of secrecy, was working to create what it said would soon be the biggest government surveillance system anywhere in the world.
That works out at somewhere between 18 and 36 trillion metadata records gathered in 2012 alone -- 50 billion a day over a year at the lower rate, the planned 100 billion a day at the upper -- and the daily figure is probably even higher now. As Techdirt has covered previously, intelligence agencies like to say this is "just" metadata -- skating over the fact that metadata is actually much more revealing than traditional content because it is much easier to combine and analyze. An important document released by The Intercept with this story tells us exactly what GCHQ considers to be metadata, and what it says is content. It's called the "Content-Metadata Matrix," and reveals that as far as GCHQ is concerned, "authentication data to a communications service: login ID, userid, password" are all considered to be metadata, which means GCHQ believes it can legally swipe and store them. Of course, intercepting your login credentials is a good example of why GCHQ's line that it's "only metadata" is ridiculous: doing so gives them access to everything you have and do on that service.

Login ID, userid and password all considered to be "metadata"

The trillions of metadata records are stored in a huge repository called "Black Hole." In August 2009, 41 percent of Black Hole's holdings concerned web browsing histories. The rest included a wide range of other online services: email, instant messenger records, search engine queries, social media, and data about the use of tools providing anonymity online. GCHQ has developed software to analyze these other kinds of metadata in various ways:

SOCIAL ANTHROPOID, which is used to analyze metadata on emails, instant messenger chats, social media connections and conversations, plus “telephony” metadata about phone calls, cell phone locations, text and multimedia messages; MEMORY HOLE, which logs queries entered into search engines and associates each search with an IP address; MARBLED GECKO, which sifts through details about searches people have entered into Google Maps and Google Earth; and INFINITE MONKEYS, which analyzes data about the usage of online bulletin boards and forums.
In order to connect these different kinds of Internet activity with individuals, GCHQ makes great use of information stored in cookies:
A top-secret GCHQ document from March 2009 reveals the agency has targeted a range of popular websites as part of an effort to covertly collect cookies on a massive scale. It shows a sample search in which the agency was extracting data from cookies containing information about people's visits to the adult website YouPorn, search engines Yahoo and Google, and the Reuters news website.

Other websites listed as "sources" of cookies in the 2009 document are Hotmail, YouTube, Facebook, Reddit, WordPress, Amazon, and sites operated by the broadcasters CNN, BBC, and the U.K.'s Channel 4.
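Mechanically, cookies are what make a "browsing profile for every visible user" feasible: they give the interceptor a stable key on which to group otherwise anonymous requests. Here is a minimal sketch of that kind of aggregation -- invented field names and sample data, nothing to do with GCHQ's actual systems:

    from collections import defaultdict

    # Invented sample of intercepted requests, keyed on a tracking-cookie ID.
    intercepted = [
        {"cookie_id": "abc123", "host": "news.example.com",   "ts": "2009-03-01T10:02"},
        {"cookie_id": "abc123", "host": "search.example.com", "ts": "2009-03-01T10:05"},
        {"cookie_id": "zzz999", "host": "video.example.com",  "ts": "2009-03-01T11:17"},
    ]

    profiles = defaultdict(list)              # cookie_id -> [(timestamp, site), ...]
    for r in intercepted:
        profiles[r["cookie_id"]].append((r["ts"], r["host"]))

    # profiles["abc123"] is now a time-ordered list of that browser's visits --
    # the kind of per-user "diary" the documents describe.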
Clearly the above activities allow incredibly detailed pictures of an individual's online life to be built up, not least their porn-viewing habits. One tool designed to "provide a near real-time diarisation of any IP address" is called, rather appropriately, Samuel Pepys, after the famous 17th-century English diarist.

The extraordinary scale of GCHQ's spying on "every visible user" raises key questions about its legality. According to The Intercept story:

In 2010, GCHQ noted that what amounted to "25 percent of all Internet traffic" was transiting the U.K. through some 1,600 different cables. The agency said that it could "survey the majority of the 1,600" and "select the most valuable to switch into our processing systems."
Much of that traffic will be from UK citizens accessing global services like Google or Facebook, which GCHQ has admitted it defines as "external platforms" -- meaning that such traffic is stripped of what few safeguards UK law offers against this kind of intrusive surveillance by GCHQ.

This means that it is certain that many -- perhaps millions -- of UK citizens have been profiled by GCHQ using these newly revealed programs, without any kind of warrant or authorization being given or even sought. The information stored in the Black Hole repository, and analyzed with tools like Samuel Pepys, provides unprecedented insights into the minutiae of their daily lives -- which websites they visit, which search terms they enter, who they contact by email or message on social networks. Within that material, there is likely to be a host of intimate facts that could prove highly damaging to the individual's career or relationships if revealed -- perfect blackmail material, in other words. Thanks to other Snowden documents, we know that the NSA had plans to use this kind of information in precisely this way. It would be naive to think it would never be used domestically, too.

It's frustrating that it has taken over two years for these latest GCHQ documents to be published, since they reveal that the scale of British online surveillance and analysis is even worse than the first Snowden documents indicated, bad as they were. They prove that the current calls for additional spying powers in the Snooper's Charter are even more outrageous than we thought, since the UK authorities already track and store British citizens' online moves in great detail.

When Edward Snowden handed over his amazing trove of documents to journalists to release as they thought best, he also placed a huge responsibility on their shoulders to do so as expeditiously as possible. If, as seems likely, there are yet more important revelations about the scale of US and UK spying to come, it is imperative that they are published as soon as possible to help the fight against those countries' continuing attempts to bolster mass surveillance and weaken our freedoms.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 24 September 2015 @ 11:34pm

Auto Industry's Own Study Demolishes Case For Car Safety Harmonization In TAFTA/TTIP

from the putting-lives-in-danger dept

Back in February, we suggested that TAFTA/TTIP should really be called the "Atlantic Car Trade Agreement," or ACTA for short, since nearly 50% of the claimed boost to transatlantic trade that would accrue from TTIP consists of swapping vehicles between the US and EU. The claim was that, since cars made in the US and EU were equally safe, there was no good reason why they could not be sold on both sides of the Atlantic. According to an important article in The Independent, proponents of this view were so confident that US and EU safety standards were broadly similar that they commissioned a report to prove it, with the aim of using it to bolster the case for harmonizing car safety standards in TAFTA/TTIP:

The Washington-based Alliance of Automobile Manufacturers (AAM) sponsored the research, announced in a joint press release last year alongside the European car lobby ACEA and the American Automotive Policy Council.
So that there could be no question about the validity of the results, they asked some of the world's top people in the field to participate:
Independent experts from the University of Michigan Transportation Research Institute and the SAFER transportation research centre at Chalmers University of Technology in Gothenburg, Sweden, carried out the study. They are two of the leading traffic safety research centres in the world. Experts in France and at the UK’s Transport Research Laboratory were also involved.
Here's what the industry's report found:
The research actually established that American models are much less safe when it comes to front-side collisions, a common cause of accidents that often result in serious injuries.
The following is no surprise, then:
The findings were never submitted -- or publicly announced -- by the industry bodies that funded the study.
Putting a brave face on things, a spokesperson for the US Alliance of Automobile Manufacturers told The Independent:
"There is much credit to be given for the historic efforts made in this study, and we fully support the methodology for comparing and analyzing U.S. and EU crash environments and vehicle performance. "
While the European car manufacturers association said:
"ACEA remains confident that regulatory convergence can be achieved in TTIP while maintaining the current high level of safety performance in both the EU and the US"
But the European Transport Safety Council (ETSC), the independent organization that advises the European Commission and the European Parliament on road safety, is not so sure. Its executive director, Antonio Avenoso, is quoted as saying:
"This study shows that EU and US trade negotiators would potentially be putting lives in danger by allowing vehicles approved in the US to be sold today in Europe and vice-versa. … Clearly without much more research and analysis, including vehicle safety standards in the TTIP agreement would be irresponsible."

The trouble is, if car safety standard harmonization is not included in TTIP, there won't be the big boost to trade that is predicted to come from increased transatlantic vehicle sales. And without that big boost, TAFTA/TTIP's benefits, never very big in the first place, become even more negligible. Sounds like it's time to slam the brakes on ACTA….

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 24 September 2015 @ 3:32am

German Competition Authority Decides To Take No Action Over Google's Removal Of Snippets From Google News

from the misusing-copyright dept

Almost exactly a year ago, Techdirt wrote about Google's decision to drop the use of news snippets from certain German publishers, who were members of the collection society VG Media, in a long-running dispute over "ancillary copyright", also known as the Google tax. VG Media lodged a claim against Google with Germany's competition authority, the Bundeskartellamt, in the hope the authorities would force Google to put the snippets back by licensing them. An interesting post on the Disruptive Competition (DisCo) Project Web site notes that the Bundeskartellamt has now issued its ruling, saying that it will not open formal proceedings against Google over this matter:

The answer of the antitrust watchdog is simple: if an online service does not want to acquire a license for the display of snippets and hence only displays search results in a more limited, shorter version, it can do so. There is nothing in antitrust law that would prevent companies from doing so, even if they are found to be dominant on a given market.
The Bundeskartellamt's reasoning is quite simple:
Google announced that in future it would show search results relating to the websites of press publishers that were represented by VG Media in the legal dispute only in a reduced form if the publishers did not agree to a free-of-charge use of their work. Google justified this by claiming that otherwise it ran the risk of being sued for breaching the ancillary copyright.

The Bundeskartellamt considers this to be an objective justification for Google's conduct. Even a dominant company cannot be compelled under competition law to take on a considerable risk of damages where the legal situation is unclear.
The rest of the DisCo post explores research that shows the harmful effects Spain's Google tax has had on publishers in that country -- something that Mike wrote about back in July. The author of the DisCo analysis, Jakob Kucharczyk, has a good encapsulation of the problem common to all these attempts to introduce ancillary copyright:
The underlying flaw in this strategy is that these legislative proposals misuse copyright for industrial policy purposes. It remains unclear which problem or market failure these laws actually try to solve.
The question is: How long will it take European governments to grasp this point? And how much of their publishing industries will have disappeared when they finally do?

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 22 September 2015 @ 11:25pm

Thailand Might Be Required To Sacrifice Plant And Seed Sovereignty For The Sake Of Trade Agreement With EU

from the bad-deals dept

Techdirt wrote recently about African nations agreeing to a new plant variety treaty that will benefit Western seed companies at the expense of local farmers. Not surprisingly, those corporations want similar powers elsewhere, and a post from earlier this year reveals that the European Union is trying to use a so-called "free trade" agreement currently being discussed with Thailand to give them just that:

If Thailand accepts the intellectual property law in the course of the negotiations for the Free Trade Agreement of EU-Thai, Thailand would have to amend its 1999 Plant Varieties Protection Act to make Thai law consistent with the 1991 UPOV Convention.
The post goes on to spell out how such a move is likely to affect Thai agriculture:
Thailand would have to amend its laws. It would have to abandon the principles of requesting prior authorization and benefits sharing in relation to the development of new plant varieties. This would mean that the seed companies, multinational bio-tech companies, and the big agri-businesses would not need to make a request nor share benefits when they exploit wild plants or widely available plants and local plants to develop new varieties.
That's a classic case of exploiting the plant commons without sharing any of the benefits that flow from doing so. Extended monopoly rights will make things even worse, and undermine traditional farming practices:
Thailand would have to extend the period of corporate monopoly rights over new varieties from 12 years in most cases to 20 years. In addition, it would open up a loophole in the law for private companies to prevent farmers from collecting seeds of the new varieties for planting in the next season, as well as preventing them from distributing and exchanging seeds with neighbours both inside and outside their community, which is a common cultural practice of farming communities.
The post believes that these changes would mean local farmers paying three times the current price for seeds, and that corporations would soon gain complete control of the seed business. Those are all depressingly familiar consequences of giving up plant and seed sovereignty, but the story contains the following novel aspect:
Compliance with demands of the European Union or hasty government amendments to domestic laws allows the government to claim that Thailand did not amend any laws on account of the EU-Thai FTA negotiations.
That's noteworthy, because there's evidence that the European Commission is aiming to implement key US demands for TAFTA/TTIP before negotiations are completed so that it too can claim that it did not amend any laws on account of it. If the post is correct, it's a sneaky trick that seems to be spreading.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 22 September 2015 @ 3:16am

As US Turns Away From Idea Of Backdooring Crypto, David Cameron Has A Problem

from the saber-rattling dept

Last week, Mike wrote about what seems an important shift in US government policy on encryption, as the White House finally recognizes that adding backdoors isn't a sensible option. That leaves a big question mark over what the UK will do, since David Cameron and intelligence officials have been hinting repeatedly that they wanted to undermine encryption in some unspecified way. Just last week, the new head of MI5, the UK's domestic intelligence service, gave the first-ever live media interview by a senior British intelligence official. Asked about the alleged danger of parts of the Internet "going dark", he said:

"It requires the cooperation of the companies who run and provide services over the internet that we all use. It is in no one's interest that terrorists would be able to plot and communicate out of the reach of any authorities with the proper legal power."
That's from The Guardian, and another article there points out that the UK government's strategy of trying to get the big US online services to co-operate now looks in trouble:
If the White House does drop the battle [over backdoors] it will leave Britain with little option but to accept the widespread use of encryption. The UK's ability to directly lobby the big American technology firms is limited, and in a report leaked in June the former British diplomat Sir Nigel Sheinwald said that a new international treaty was the only way to get the co-operation of the companies. Without the support of the White House such a treaty seems unlikely.

Without the co-operation of the tech firms what the UK government can do when facing widespread encryption is limited. In June the Home Office confirmed that, for extreme cases, it was considering inserting "black box" probes into the transatlantic cables, to collect data leaving and entering the UK. But if the communications were encrypted on their way to the US, such collection would have little value.
Of course, a lot depends on the detailed policy adopted by the US government, and whether the US intelligence community manages to exploit any future terrorist attacks to get backdoors on the agenda again. But for the moment, it seems that David Cameron's anti-encryption saber-rattling will remain just that.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 21 September 2015 @ 3:52am

VW Accused Of Using Software To Fool Emissions Tests: Welcome To The Internet Of Cheating Things

from the is-that-domestic-appliance-lying-to-you? dept

There have been a number of stories on Techdirt recently about the increasing use of software in cars, and the issues that this raises. For example, back in April, Mike wrote about GM asserting that while you may own the car, the company still owns the software that runs it. You might expect GM to come out against allowing you to modify that software, but very recently we reported that it had received support from a surprising quarter: the Environmental Protection Agency (EPA). The EPA had a particular concern that engine control software might be tampered with, causing cars to breach emissions regulations. We've just found out that the EPA was right to worry about this, but not for the reason it mentioned, as The New York Times explains:

The Environmental Protection Agency issued [the German car manufacturer Volkswagen] a notice of violation and accused the company of breaking the law by installing software known as a "defeat device" in 4-cylinder Volkswagen and Audi vehicles from model years 2009-15. The device is programmed to detect when the car is undergoing official emissions testing, and to only turn on full emissions control systems during that testing. Those controls are turned off during normal driving situations, when the vehicles pollute far more heavily than reported by the manufacturer, the E.P.A. said.
So, just as the EPA feared, software that regulates the emissions control system was indeed tampered with, though not by reckless users, but by the cars' manufacturer, Volkswagen (VW), which must now recall nearly half a million cars, and faces the prospect of some pretty big fines -- Reuters speaks of "up to $18 billion". The EPA's Notice Of Violation (pdf) spells out the details of what it calls the software "switch":
The "switch" senses whether the vehicle is being tested or not based on various inputs including the position of the steering wheel, vehicle speed, the duration of the engine's operation, and barometric pressure. These inputs precisely track the parameters of the federal test procedure used for emission testing for EPA certification purposes. During EPA emission testing, the vehicles' ECM [electronic control module] ran software which produced compliant emission results under an ECM calibration that VW referred to as the "dyno calibration" (referring to the equipment used in emission testing, called a dynamometer). At all other times during normal vehicle operation, the "switch" was activated and the vehicle ECM software ran a separate "road calibration" which reduced the effectiveness of the emission control system (specifically the selective catalytic reduction or the lean NOx [nitrous oxides] trap.) As a result, emission of NOx increased by a factor of 10 to 40 times above the EPA compliant levels, depending on the type of drive cycle (e.g. city, highway).
That trick was discovered by West Virginia University's Center for Alternative Fuels, Engines & Emissions when studying the VW vehicles. Initially, VW claimed that the increased emissions were due to "technical issues" and "unexpected in-use conditions." But further tests confirmed the problem, and eventually VW admitted "it had designed and installed a defeat device in these vehicles in the form of a sophisticated software algorithm that detected when a vehicle was undergoing emissions testing."

It's significant that the trick was discovered through extensive mechanical testing. Had the emissions control code been protected by some form of DRM, researchers could not legally have inspected it to find the cheating algorithm, because circumventing that protection would itself have been unlawful. This emphasizes once more the folly of allowing the DMCA to apply to such systems: problems could be found much earlier by inspecting the software, rather than waiting for them to emerge in use, possibly years later.

The revelation about VW's behavior once more concerns code in cars, but there is a much larger issue here. As software starts to appear routinely in an ever-wider range of everyday objects, so the possibility arises for them to exhibit different behaviors in different situations. Thanks to programming, these objects no longer have a single, fixed set of features, but are malleable, which makes checking their conformance to legal standards much more problematic. When the VW story broke last week, Zeynep Tufekci, assistant professor at the University of North Carolina, tweeted that this was an example of "The Internet of cheating things." I'm not sure whether she coined that phrase -- I'd not seen it before -- but it encapsulates neatly a key feature of the world we are beginning to enter.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 17 September 2015 @ 11:16pm

EU Proposes New Corporate Sovereignty Court For TAFTA/TTIP; US Not Interested

from the CETA-still-a-backdoor-for-ISDS,-anyway dept

As we have reported, the most problematic aspect of the proposed TAFTA/TTIP trade agreement between the US and the EU has been the proposed corporate sovereignty chapter, formally known as investor-state dispute settlement (ISDS). The outcry over this was so great in Europe last year that the European Commission put negotiations of this topic on hold, while it carried out a public consultation on the matter -- presumably assuming that the extremely technical questions about this complex issue would kill off any further interest by the public. Instead, an unprecedented 150,000 submissions were received, 145,000 of which said get rid of ISDS completely. In response, the European Commission merely promised to try to address the many concerns raised with a new and "improved" version.

This was sketched out back in May, when the Commission suggested making the current secret tribunals more like a traditional court. Yesterday, Cecilia Malmström, the EU commissioner responsible for trade, and thus the TAFTA/TTIP negotiations, formally unveiled the European Commission's proposed replacement for traditional corporate sovereignty tribunals, which turns out to be almost identical to the first ideas presented back in May:

The proposal for the new court system includes major improvements such as:

a public Investment Court System composed of a first instance Tribunal and an Appeal Tribunal would be set up;

judgements would be made by publicly appointed judges with high qualifications, comparable to those required for the members of permanent international courts such as the International Court of Justice and the WTO Appellate Body;

the new Appeal Tribunal would be operating on similar principles to the WTO Appellate Body;

the ability of investors to take a case before the Tribunal would be precisely defined and limited to cases such as targeted discrimination on the base of gender, race or religion, or nationality, expropriation without compensation, or denial of justice;

governments' right to regulate would be enshrined and guaranteed in the provisions of the trade and investment agreements.
Although this addresses some of the more glaring faults with traditional corporate sovereignty, notably the lack of transparency, and the inability to appeal against tribunal rulings, it leaves untouched ISDS's biggest problem: the fact that it grants foreign investors unique rights to a completely separate legal system -- one unavailable to domestic companies or the public. For that reason, many organizations that were against old-style ISDS, are also against the new Investment Court System (ICS).

Even if the new ICS were perfect -- and it isn't -- it still wouldn't solve the problem of ISDS for EU citizens. Although Malmström said yesterday that the ICS approach was designed to be used in all future EU trade agreements as a replacement for the usual corporate sovereignty chapters, she admitted that was not an option in the Comprehensive Economic Trade Agreement (CETA) between Canada and the EU, saying: "we are not re-opening the CETA agreement." This confirms what we surmised in a recent post. But as Techdirt noted there, if CETA includes ISDS, US companies with subsidiaries in Canada, of which there are many, will be able to use Canada's trade agreement to by-pass TTIP's new ICS system completely, and sue EU nations indirectly, using ISDS with all its widely-recognized faults. Reforming TAFTA/TTIP's ISDS without reforming CETA's corporate sovereignty provisions is pretty pointless.

Even supporters of the new ICS are worried by this aspect. Bernd Lange is the MEP with responsibility for making recommendations on how the European Parliament (EP) should vote on international trade matters. Although he is relatively happy with the ICS solution, he has confirmed on Twitter that unless the ISDS chapter in CETA is re-negotiated, he will not recommend that the agreement with Canada is ratified when it comes to the main vote, expected in a few months' time. And without the support of his Socialists & Democrats group, CETA is unlikely to pass in the European Parliament, which would kill it completely.

Finally, there is the rather important question of whether the US will accept Malmström's new ICS. As we wrote last month, there's already some indication that the US is not prepared to move from ISDS tribunals to a new kind of open court system. That confirms an earlier dismissal of the idea by US Undersecretary for International Trade at the Commerce Department, Stefan Selig, back in May. Another indication of the US view can be found in a sharp rejection of the EU's ICS proposal by the US Chamber of Commerce, reported here by the Global Edition of the Handelsblatt newspaper:

"While we recognize the E.U. has a political problem relating to future investment treaties, the U.S. business community cannot in any way endorse today's E.U. proposal as a model for the Transatlantic Trade and Investment Partnership (TTIP)," according to Marjorie Chorlins, the chamber's vice president of European affairs.

"The recent European debate around investment treaties -- the obligations governments accept in them and the methods they provide for dispute settlement -- is not grounded in the facts, and the distortions in this debate cannot be allowed to trump sound policy," she said in a statement."
It will be interesting to see what the official US position is on the ICS idea, but those comments from the influential and well-connected US Chamber of Commerce suggest that the battle over whether corporate sovereignty should be included in TAFTA/TTIP, and in what form, is far from over.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 16 September 2015 @ 11:09pm

Austrian Ruling Likely To Have Big Implications For Copyright Levies Across EU

from the game-on-for-GEMA? dept

Techdirt has been following for a while the copyright levy system in the European Union as it slowly descends into complete chaos, unable to reconcile its analog origins with a digital world where it just makes no sense. A post on the IPKat blog offers something very interesting in this context: a court case that not only quashes Austria's private copying levy rules, but is also likely to have important knock-on effects in other EU countries. As the blog post explains, the decision by the Commercial Court in Vienna is just the latest chapter in a long-running saga involving the Austrian collection society, Austro Mechana, and Amazon:

Austro Mechana initiated the proceedings against a number of Amazon entities in October 2007. The main request of Austro Mechana was to oblige Amazon to pay copyright levies for all storage devices sold to customers in Austria. Austro Mechana also filed an information request regarding the quantity and type of storage devices sold to customers in Austria.

Amazon lost at both first instance and in appeal. After that, as suggested by Amazon, the Austrian Supreme Court submitted a request for a preliminary ruling to the CJEU [Court of Justice of the European Union] (September 2011). Amazon had argued that the Austrian law on copyright levies and the procedures implemented by Austro Mechana for the collection of copyright levies complied with neither the [EU's] InfoSoc Directive nor the jurisprudence of the CJEU.
Europe's highest court, the CJEU, handed down its opinion back in 2013. It was largely in favor of Amazon, so the Austrian Supreme Court cancelled all the previous rulings, and sent the case back to the Vienna Commercial Court to consider in the light of the CJEU guidance. The Court of Justice of the European Union does not rule on particular cases, but considers the larger questions of law that are involved, leaving it to local courts to use its published guidance in their subsequent judgments.

There were three main grounds on which the Austrian Commercial Court ruled that the entire Austrian copyright levy system had to go. First, it did not provide a proper reimbursement right: if storage devices are not used for private copying, there must be an "effective" mechanism that allows for reimbursement of the levy. Austria's system didn't. Secondly, there was no distinction between lawful and unlawful sources for private copying, something else that was required by the InfoSoc Directive. Finally, half of the monies raised by copyright levies were distributed to "social and cultural institutions", rather than to the artists, and that fell foul of the CJEU ruling too.

As the IPKat points out, it is likely that all of these issues affect Germany's copyright levy system, too, so we can probably expect a legal challenge there to be successful. And there's a delightful sting in the tail of the blog post:

The ruling of the Commercial Court of course also raises the question whether dealers, manufacturers or importers may have a claim for repayment of the levies on the principles of unfair enrichment. If such requests were to be made, they might well jeopardise the very existence of Austro Mechana.
Presumably the same would be true in Germany, which would leave the central collecting organization there, the ZPÜ (the Zentralstelle für private Überspielungsrechte) exposed as well. One of the founders of the ZPÜ, and presumably still one of its most important members, is GEMA, well known to Techdirt readers. The possibility that a future court case might force GEMA to pay back a good chunk of all the copyright levies it has received is, of course, a tantalizing prospect.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 15 September 2015 @ 2:02pm

Coming To A Surveillance State Near You: Lip-Reading Computers

from the I'm-sorry-Dave,-you-can't-say-that dept

One of the most famous -- and important -- scenes in Stanley Kubrick's film "2001" is when the two astronauts sit in a space pod in order to avoid being overheard by the ship's computer, HAL, which they believe may represent a threat to their lives. Although they have prudently turned off the pod's communication system, what they don't realize is that HAL is able to follow their conversation by lip-reading, and hence is alerted to their disconnection plans.

Although it is unlikely that the Turkish authorities were inspired by the film, the following incident, reported in a post on the growing censorship in the country, reminds us that the use of lip-reading for surveillance purposes is not science fiction:

Last week, at the funeral of a soldier in Osmaniye, south-eastern Turkey, mourners voiced anger at the government's decision to commit troops to conflict with PKK forces in the south-east, leading to several arrests.

Veli Ağbaba, deputy president of the opposition Republican People's Party (CHP), and his colleagues visited two suspects in prison, and have stated that they were arrested on charges of "insulting the president" after footage of the funeral was scrutinized by lip-reading experts.
Calling in lip-reading experts to check whether somebody was insulting the President of Turkey at a funeral might seem a one-off product of an increasingly paranoid security apparatus. Moreover, using humans is a surveillance technique that doesn't really scale -- unlike metadata analysis, say -- so you might hope this is unlikely to be a problem for most of us. But it turns out that we are very close to building real lip-reading HALs. Here's a 2014 article from The Week:
A Jordanian scientist has created an automated lip-reading system that can decipher speech with an average success rate of 76 per cent. The findings, in conjunction with recent advances in the fields of computer vision, pattern recognition, and signal processing, suggest that computers will soon be able to read lips accurately enough to raise questions about privacy and security.
Moore's Law and other advances in computing pretty much guarantee that 76 percent success rate will rise inexorably, until high-accuracy lip-reading becomes a standard feature for CCTV surveillance systems, especially as very high-resolution cameras fall in price and are deployed more widely. HAL would be proud.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 14 September 2015 @ 11:14pm

Diamond Open Access Gets Real: 'Free To Read, Free To Publish' Arrives

from the here's-how-it's-done dept

A couple of years ago, we wrote about a new kind of open access. Alongside the traditional "gold" open access, whereby research institutions pay publishers to make academic papers freely available to all readers, and "green" open access, which consists of posting papers to an institutional repository or open online archive, the mathematician Tim Gowers came up with something he called "diamond" open access. At its heart lies arXiv, one of the earliest attempts to open up academic publishing in the early 1990s using the (then) new Net -- basically, an online server where preprint papers are posted for anyone to read. Here are the key features of a diamond open access title:

While in most respects it will be just like any other journal, it will be unusual in one important way: it will be purely an arXiv overlay journal. That is, rather than publishing, or even electronically hosting, papers, it will consist of a list of links to arXiv preprints. Other than that, the journal will be entirely conventional: authors will submit links to arXiv preprints, and then the editors of the journal will find referees, using their quick opinions and more detailed reports in the usual way in order to decide which papers will be accepted.
That comes from Gowers' latest blog post, which announces the creation of a new diamond open access journal called "Discrete Analysis". It's not the first to adopt this model -- Gowers mentions a couple of others already in existence -- but the post is interesting because it spells out in detail how this new kind of academic publishing works:
The software for managing the refereeing process will be provided by Scholastica, an outfit that was set up a few years ago by some graduates from the University of Chicago with the aim of making it very easy to create electronic journals. However, the look and feel of Discrete Analysis will be independent: the people at Scholastica are extremely helpful, and one of the services they provide is a web page designed to the specifications you want, with a URL that does not contain the word “scholastica”. Scholastica does charge for this service -- a whopping $10 per submission. (This should be compared with typical article processing charges of well over 100 times this from more conventional journals.)
It's that two orders of magnitude reduction in costs that is so revolutionary here. It means that with a very small grant from someone -- in the case of Discrete Analysis, the money is coming from Cambridge University -- a journal can be created that is not only completely free for everyone who reads it, but free for the academics who publish in it too. This gets around a problem with the gold open access model: the fact that academic institutions have to find quite serious sums in order to adopt it. Diamond open access is not just cheap but hugely cheaper, so this cost issue pretty much goes away. As Gowers writes:
In theory, this offers a way out of the current stranglehold that the publishers have over us: if enough universities set up enough journals at these very modest costs, then we will have an alternative and much cheaper publication system up and running, and it will look more and more pointless to submit papers to the expensive journals, which will save the universities huge amounts of money. Just to drive the point home, the cost of submitting an article from the UK to the Journal of the London Mathematical Society is, if you want to use their open-access option, £2,310. If Discrete Analysis gets 50 submissions per year (which is more than I would expect to start with), then this single article processing charge would cover our costs for well over five years.
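The arithmetic behind that claim is easy to check. Here is a quick back-of-the-envelope calculation, assuming an exchange rate of roughly $1.50 to the pound (around the 2015 level; the rate is my assumption, the other figures are from the post):

    scholastica_fee_usd = 10          # per-submission charge quoted above
    submissions_per_year = 50
    lms_apc_gbp = 2310                # open-access charge quoted above
    gbp_to_usd = 1.50                 # assumed exchange rate, roughly 2015 levels

    annual_running_cost = scholastica_fee_usd * submissions_per_year   # $500
    one_apc_usd = lms_apc_gbp * gbp_to_usd                              # about $3,465

    print(one_apc_usd / annual_running_cost)    # ~6.9 -- nearly seven years' costs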
Finally, for those who are wondering what advantages this kind of overlay journal offers over using arXiv on its own, Gowers makes three important points:
An obvious partial answer to this question is that the list of links on our journal website will be a list of certificates that certain arXiv preprints have been peer reviewed and judged to be of a suitable standard for Discrete Analysis. Thus, it will provide information that the arXiv alone does not provide.

However, we intend to do slightly more than this. For each paper, we will give not just a link, but also a short description. This will be based on the abstract and introduction, and on any further context that one of our editors or referees may be able to give us. The advantage of this is that it will be possible to browse the journal and get a good idea of what it contains, without having to keep clicking back and forth to arXiv preprints. In this way, we hope to make visiting the Discrete Analysis home page a worthwhile experience.

Another thing we will be able to do with these descriptions is post links to newer versions of the articles. If an author wishes to update an article after it has been published, we will provide two links: one to the “official” version (that is, not the first submitted version, but the “final” version that takes into account comments by the referee), and one to the new updated version, with a brief summary of what has changed.
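Put another way, each "published" article in an overlay journal of this kind is little more than a small record pointing at arXiv. Here is a rough sketch of what such a record might contain, based on the description above (the field names are my own invention, not Scholastica's or the journal's):

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class OverlayArticle:
        arxiv_link: str                       # the refereed, "official" arXiv version
        description: str                      # editor-written summary for browsing
        updated_link: Optional[str] = None    # later author-updated version, if any
        change_summary: Optional[str] = None  # brief note on what changed
        certified: bool = True                # accepted after refereeing

    journal: List[OverlayArticle] = []        # the "journal" is just a list of these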
All in all, this is an exciting development, and one that could have a major impact on scholarly publishing if it is taken up more widely. However, the fact that it took even its inventor over two years to create his first diamond open access title shows that it is likely to be a while before that happens.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 11 September 2015 @ 3:11am

India's New Patent Guidelines Declare Software And Business Methods Clearly Patentable For The First Time

from the welcome-to-the-land-of-trolls dept

At a time when software patents seem to be on the retreat in the US, India has perversely decided to move in the other direction, as reported here by The Economic Times:

The patent office for the first time made a clear interpretation of the Patents (Amendment) Act, 2002 to mean that if a software has novelty, is inventive or tangible, and has proper technical effect or industrial application, it can be patented. The guidelines serve as a reference for officers in granting patents.
India's patent law has not changed, which means that the following exclusions from patentability, found in The Patents (Amendment) Act 2002, are still relevant:
(k) a mathematical or business method or a computer programme per se or algorithms;

(l) a literary, dramatic, musical or artistic work or any other aesthetic creation whatsoever including cinematographic works and television productions;

(m) a mere scheme or rule or method of performing mental act or method of playing game;

(n) a presentation of information;
These are very similar to the exclusions listed in Article 52 of the European Patent Convention (EPC), which governs patent law in Europe. And where the EPC uses the phrase "as such" when it comes to computer programs, the India exclusions contain the equivalent phrase "computer programme per se". As Techdirt readers know, the inclusion of "as such" as a qualifier to the exclusion of computer programs from patentability has opened up a huge loophole through which clever lawyers have driven many thousands of software patents. The fear -- quite justified -- is that exactly the same will happen in India because of the new guidelines' interpretation of what that "per se" phrase means (pdf):
The JPC [Joint Parliamentary Committee] report holds that the computer programmes as such are not intended to be granted patent. It uses the phrase "... certain other things, ancillary thereto or developed thereon.....". The term "ancillary" indicates something essential to give effect to the main subject. In respect of CRIs [Computer Related Inventions], the term "ancillary thereto" would mean the "things" which are essential to give effect to the computer programme. The clause "developed thereon" in the JPC report may be understood as any improvement or technical advancement achieved by such development. Therefore, if a computer programme is not claimed by "in itself" rather it has been claimed in such manner so as to establish industrial applicability of the invention and fulfills all other criterion of patentability, the patent should not be denied. In such a scenario, the claims in question shall have to be considered taking in to account whole of the claims.
Just in case that isn't crystal clear, India's Patent Office helpfully offers some concrete examples of things that it regards as definitely patentable, as well as things that definitely aren't patentable. See if you can guess which category the following example belongs to:
A method for estimating a length of time required to download one or more application programs on a wireless device over wireless network, said method comprising operations of:

the wireless device exchanging one or more data files with server, said data files including at least information representing a size of the one or more application programs available for downloading onto the wireless device;

during the exchanging, at least one of the server and wireless device measuring one or more data transfer rates for the exchanging operation;

receiving user input of one or more application programs to download;

at least one of the server and wireless device:

utilizing the one or more measured data transfer rates and the size of the selected one or more application programs to estimate a length of time required to download the one or more application programs onto the wireless device and the wireless device providing an output of the estimated time.
So the invention consists of sending a test file or two, measuring the average data transfer rate, and then using that to estimate the download time for another file of known size. As you will no doubt have guessed, this is regarded as patentable under the new guidelines.
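Just to underline how little is going on here, the following is a minimal sketch in Python of the same idea (the test URL, the 25 MB application size and the function names are mine, invented purely for illustration; they are not taken from the patent application): fetch a small test payload, measure the transfer rate, then divide a known application size by that rate.

import time
import urllib.request

def measure_transfer_rate(test_url, num_bytes=64 * 1024):
    """Fetch up to num_bytes from a test URL and return the observed rate in bytes per second."""
    start = time.monotonic()
    with urllib.request.urlopen(test_url) as response:
        data = response.read(num_bytes)
    elapsed = time.monotonic() - start
    return len(data) / elapsed if elapsed > 0 else float("inf")

def estimate_download_time(app_size_bytes, rate_bytes_per_sec):
    """Divide a known application size by the measured rate to estimate the download time."""
    return app_size_bytes / rate_bytes_per_sec

# Hypothetical usage: the test URL and the 25 MB application size are placeholders.
rate = measure_transfer_rate("https://example.com/testfile.bin")
print("Estimated download time: %.1f seconds"
      % estimate_download_time(25 * 1024 * 1024, rate))

That is the entire "technical effect": measure a rate, then do a division.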

What could possibly go wrong?

Follow me @glynmoody on Twitter and +glynmoody on Google+

10 Comments | Leave a Comment..

Posted on Techdirt - 9 September 2015 @ 11:16pm

Uruguay Withdraws From TISA, Strikes A Symbolic Blow Against The Trade Deal Ratchet

from the who's-next? dept

Techdirt first mentioned the Trade in Services Agreement (TISA) last year, when "The Really Good Friends of Services" -- the self-chosen name for about 20 members of the World Trade Organization -- could no longer keep their plans locked behind closed doors, and word started to spread. Essentially, TISA completes the unholy trinity of global trade agreements that also includes TPP and TAFTA/TTIP. Between the three of them, they sew up just about every aspect of trade in both goods and services -- the latter being TISA's particular focus. They share a common desire to liberalize trade as much as possible, and to prevent national governments from imposing constraints on corporate activity around the world.

One particularly blatant reflection of this desire is the inclusion of something called the "ratchet clause." As with "The Really Good Friends of Services," that's an official name, not something chosen by the opponents of TISA (although they could hardly have come up with anything more revealing). Here's how the European Commission's TISA page explains it:

A ratchet clause in a trade agreement means a country cannot reintroduce a particular trade barrier that it had previously and unilaterally removed in an area where it had made a commitment.
In other words, the ratchet clause ensures that there is only one direction of travel -- towards greater deregulation, and greater loss of control by sovereign nations.

TISA is unusual for being honest about introducing a ratchet. But there's another, more subtle, kind of ratchet that acts on all major treaties. It means that once a country has joined the negotiations, it becomes increasingly hard to back out, whatever the growing reservations of its public once they find out what is being done in their name. Indeed, that one-way street is one of the most powerful features of trade agreements: corporations only need to get some coveted but controversial measure inserted in a treaty's text, and it will automatically cascade down to all the signatories, however much they -- or their people -- may dislike it. It's how things like anti-circumvention laws for DRM were brought in: once the requirement was included in the WIPO Copyright Treaty, all signatories had to pass legislation implementing it, because they had "no choice": the treaty "forced" them to do it -- a convenient excuse for passing unpopular laws.

The trade ratchet is also why big treaties tend to get bigger: the more countries that join them, the greater the pressure on others to join too, lest they be left out in the economic cold. And once in, they tend to stay in. TISA is already huge -- around 50 countries are participating -- so the pressure to join is proportionately intense, and the idea that a country already part of the negotiations might pull out of such "important" talks is similarly unthinkable. And yet that is precisely what Uruguay's ruling coalition has just voted to do:

The ruling progressivist coalition Broad Front overwhelmingly decided to withdraw Uruguay from the negotiations on the supra-national trade-deal TISA (Trade in Services Agreement) in a vote on Saturday.
Inside US Trade today (behind a paywall, but currently visible on its home page) reports:
Uruguayan President Tabare Vazquez has decided to withdraw his country from the negotiations for the Trade in Services Agreement (TISA) following opposition from the center-left ruling coalition and national labor unions, and has ordered his foreign minister to formally notify other participants in the talks.
Clearly, the withdrawal of Uruguay will have almost no effect whatsoever on TISA itself: the major trading nations will continue their talks behind closed doors, agreeing more of the text that locks in their view of how trade in services should be freed from government controls. But Uruguay's move possesses a tremendous symbolic importance. It says that, yes, it is possible to withdraw from global negotiations, and that the apparently irreversible trade deal ratchet can actually be turned back. It sets an important precedent that other nations with growing doubts about TISA -- or perhaps TPP -- can look to and maybe even follow.

Follow me @glynmoody on Twitter and +glynmoody on Google+

9 Comments | Leave a Comment..

Posted on Techdirt - 9 September 2015 @ 3:12am

Jamaican Government Steals Years Of Public Domain Works From Its People

from the jammin'-the-jammin' dept

Just under four years ago, Techdirt reported that Jamaica was planning something extremely foolish: a retroactive extension to its copyright term. As that article noted, when the European Union did something similar, the European Commission's own figures showed that the move would cost the EU public around one billion Euros, and it was inevitable that the Jamaican people would also lose out if the move went ahead.

The fact that we've heard nothing for four years might have nourished the hope that the Jamaican government had come to its senses, and thrown out any plans it had to short-change its own people in this way. No such luck, of course. Indeed, a depressing post from the EFF reveals that the recently-passed legislation is down there with the worst:

The copyright term in Jamaica is now 95 years from the death of the author, or 95 years from publication for government and corporate works. This makes it the third-longest copyright term in the world, after Mexico and Côte d'Ivoire respectively with 100 and 99 years from the death of the author.
But there's more:
The extension was made retroactive to January 1962. Besides being the year when Jamaica attained independence, 1962 also just so happens to have been the year when Jamaican ska music (a popular genre in its own right, but also a precursor of the even more popular reggae) burst onto the international music scene. The parallels with the extension of the U.S. copyright term in the "Mickey Mouse Protection Act" are quite eerie. But, worse than what happened in the U.S., the retrospective effect of the law means that works that have already passed into the public domain in Jamaica are now to be wrenched back out again.
Under the new copyright law, foreign users of Jamaican copyrights are not bound by the extended copyright term, and yet Jamaicans are obliged to honor foreign copyrights for the full extended term. As the EFF notes:
all that this measure has accomplished is that citizens of Jamaica, a developing country, will be paying more money into Hollywood's coffers, while Jamaica's own rich cultural heritage draws in not a penny more in return.

What's especially ridiculous here is that Jamaica's own ska and reggae success owed much to the weak copyright protection of the time: it was precisely that lack of enforcement that allowed the music to spread and become a global phenomenon.

This law is so bad that you might hope a future Jamaican government would simply repeal it. After all, there is no rule that says copyright can only be extended, never shortened -- that it is subject to an irreversible ratchet. But imagine what would happen if this were proposed. Copyright companies and artists would be apoplectic, and would doubtless start screaming that their rights and property were being "stolen," because something they had would be taken away from them by the change.

But the same logic applies to situations where copyright is extended, and the passage of works into the public domain delayed, especially if works that are already in the public domain are actively removed from it. In this case, the public has inarguably had something taken away from it -- a right to use a huge number of works in any way without needing to obtain a license from somebody. And that, of course, is exactly what has happened in Jamaica, thanks to the introduction of this retroactive 45-year term extension. It's a perfect example of real copyright theft, not the fake kind claimed so often by fans of a greedy intellectual monopoly that always wants more.

Follow me @glynmoody on Twitter and +glynmoody on Google+

36 Comments | Leave a Comment..

Posted on Techdirt - 3 September 2015 @ 11:21pm

European Commission Admits Defeat In Trying To Improve Corporate Sovereignty Chapter In CETA

from the now-what? dept

The excitement over the mad dash to finish TPP -- and the failure to do so -- has rather obscured the other so-called trade deals currently being negotiated, such as TAFTA/TTIP and the one between the EU and Canada, CETA. As Techdirt has reported, CETA is even further along than TPP, in what is known as the "legal scrub" phase, when the lawyers tweak the agreed text in an attempt to catch drafting errors or other infelicities. Back in June, the EU commissioner responsible for trade and trade agreements, Cecilia Malmström, said that CETA "could be done by the end of July". That obviously didn't happen, or we would presumably have heard about it by now. The article gives a clue as to why not:

Malmström acknowledged that the most delicate part of the legal scrubbing, the clauses about the investor state dispute settlement, also referred to as ISDS, still remain. "We haven't started with that yet," she said, arguing that only three pages of the treaty deals with the sensitive issue, whereas "there are 1597 pages talking about other things."
So the festering wound that is corporate sovereignty, formally "investor-state dispute settlement" (ISDS), could well be the problem here. Back in March of this year, on the subject of corporate sovereignty in CETA, Malmström said:
We will propose any further changes [in ISDS] we collectively agree on in TTIP to the Canadian government.
This promise seems to have been based on the optimistic idea that the European Commission would actually be able to come up with some improvements to ISDS in the wake of its massive rejection by the public in the Commission's consultation on the subject. It's true that Malmström went on to present some very vague ideas about ways in which corporate sovereignty could be "improved", but so far these have not been turned into concrete proposals that could be discussed with the Canadian government.

Given that failure, it should perhaps come as no surprise that Zeit Online is reporting (original in German) that Malmström now has no plans to try to change CETA for the better before it is signed, merely saying that afterwards there will be a "check" on the ISDS mechanism -- a worthless promise, since at that point Canada will have zero reason to make any concessions.

But signing CETA with the current corporate sovereignty chapter is a big problem: CETA's ISDS is actually worse than previous approaches, despite repeated claims to the contrary by the European Commission. By way of consolation, though, leaving it untouched may also make getting CETA passed much harder. As we wrote in February, there are some indications that Germany's Chancellor, Angela Merkel, is not happy about the current ISDS chapter in CETA, while the French government has said that if the ISDS chapter is not re-written, France will not ratify the treaty.

Similarly, in a non-binding but significant set of recommendations to the European Commission regarding TTIP, the European Parliament said that ISDS must be replaced with "a new system for resolving disputes between investors and states which is subject to democratic principles and scrutiny" -- hinting that, without such a new system, it would vote against TTIP. If that is its position on TTIP, it would make no sense not to apply it to CETA, too: since US companies will be able to use their Canadian subsidiaries to sue the EU by invoking CETA's corporate sovereignty chapter, a bad ISDS in CETA would undermine a "better" one in TTIP.

Of course, politicians are notoriously fickle, so it's by no means certain that national governments and the European Parliament will stick to these stated positions when it comes to the actual votes for CETA and TAFTA/TTIP. But there's no doubt that the European Commission's failure to come up with even the mildest of face-saving changes to CETA's corporate sovereignty provisions has just made the task of pushing the deals through a little bit harder.

Follow me @glynmoody on Twitter and +glynmoody on Google+

19 Comments | Leave a Comment..

Posted on Techdirt - 2 September 2015 @ 2:59am

Problem: Male Operators Use Surveillance Cameras For Ogling Women; Mayor's Solution: Employ Only Female Operators

from the missing-the-point dept

Lo Barnechea is a commune of Chile located in Santiago Province, with a population of about 75,000. Its Mayor, Felipe Guevara, has decided that what Lo Barnechea really needs is a massive surveillance system mounted on aerostats tethered over the area, as explained by a post on the Derechos Digitales site (original in Spanish). It's not clear from the article why he chose this unusual approach; perhaps it's because most of his district is mountainous, which poses problems for conventional surveillance systems. Whatever the reasons, Guevara is smart enough to recognize that powerful mass surveillance systems of the kind he wants to install, which involve cameras able to pick out people from a distance of more than a kilometer away, have a serious problem:

The mayor explains that in a surveillance system implemented in Argentina, the operators started to use the cameras to follow women in the streets.
That might have alerted him to the larger issue here: the fact that there will always be a temptation to abuse such powerful systems. But no, Guevara is undeterred, because he believes he has come up with a way to avoid this issue:
The mayor's unusual solution: to employ women, because "they are less voyeuristic, more discreet" than men.
Or maybe they are so discreet, they just get away with it...

Follow me @glynmoody on Twitter and +glynmoody on Google+

23 Comments | Leave a Comment..

Posted on Techdirt - 1 September 2015 @ 9:33am

Canadian Scientist Muzzled For Writing And Performing Song About Canadian Government Muzzling Scientists

from the Harperman-meets-Streisand-Effect dept

Techdirt has been following for a while the Canadian government's unabashed attempts to muzzle scientists and librarians who work for the state, as it tries to deny them the right to express their views if those happen to disagree with Prime Minister Stephen Harper's political agenda. That battle over freedom of speech is not only continuing, but escalating, according to this story in The Globe and Mail:

An Environment Canada scientist is under investigation for allegedly breaching the public service code of ethics by writing and performing a political song that criticizes the Harper government.

Tony Turner, a physical scientist who most recently was working on a study of migratory birds, has been put on administrative leave with pay over allegations that his participation in his song Harperman puts him in a conflict of interest, the union representing him said.

Turner's song, with its opening lines "Who controls our parliament? Harperman, Harperman. Who squashes all dissent? Harperman, Harperman," and a refrain of "It's time for you to go," is pretty mild stuff. A former head of the Ontario Public Service defended the government's actions as follows:

The public sector's ethics code states that federal public servants are expected to "[carry] out their duties in accordance with legislation, policies and directives in a non-partisan and impartial manner." Mr. Dean said the non-partisan nature of the public service offers protection that goes both ways: It prevents government officials from pressing public servants to act in partisan interests, and public servants make a commitment to do their jobs regardless of the political stand of the government of the day.
For one thing, it's not the case that the ethics code "prevents government officials from pressing public servants to act in partisan interests". As a BBC story on the muzzling of Canadian scientists reported:
The [media] protocol requires that all interview requests for scientists employed by the government must first be cleared by officials. A decision as to whether to allow the interview can take several days, which can prevent government scientists commenting on breaking news stories.

Sources say that requests are often refused and when interviews are granted, government media relations officials can and do ask for written questions to be submitted in advance and elect to sit in on the interview.
That's not allowing scientists to speak in a "non-partisan and impartial manner": the "media protocol" is clearly designed to cow government scientists and to ensure that they toe the official line in everything they say, regardless of what the science may indicate.

But the other point is that Turner was not performing his Harperman song as a government employee, but as a citizen -- he is described on the YouTube page as an "Ottawa folksinger", and there is no reference anywhere to his work as a government scientist.

Of course, the great thing about the Canadian government's absurd overreaction to this gentlest of private protests is that many more people will now learn that Turner is an environmental scientist who is being muzzled by a bunch of desperate control freaks who are frightened that the Canadian people might be told the truth about important scientific issues. Thank goodness for the Streisand Effect….

Follow me @glynmoody on Twitter and +glynmoody on Google+

27 Comments | Leave a Comment..

More posts from Glyn Moody >>