Glyn Moody’s Techdirt Profile


About Glyn Moody, Techdirt Insider

Posted on Techdirt - 17 August 2018 @ 7:39pm

As Academic Publishers Fight And Subvert Open Access, Preprints Offer An Alternative Approach For Sharing Knowledge Widely

from the this-is-the-future dept

The key idea behind open access is that everyone with an Internet connection should be able to read academic papers without needing to pay for them. Or rather without needing to pay again, since most research is funded using taxpayers' money. It's hard to argue against that proposition, or to deny that making information available in this way is likely to increase the rate at which medical and scientific discoveries are made for the benefit of all. And yet, as Techdirt has reported, academic publishers that often enjoy profit margins of 30-40% have adopted a range of approaches to undermine open access and its aims -- and with considerable success. A recent opinion column in the Canadian journal University Affairs explains how traditional publishers have managed to subvert open access for their own benefit:

An ironic twist to the open-access movement is that it has actually made the publishers richer. They've jumped on the bandwagon by offering authors the option of paying article processing charges (APCs) in order to make their articles open access, while continuing to increase subscription charges to libraries at the institutions where those authors work. So, in many cases, the publishers are being paid twice for the same content -- often charging APCs higher than purely open access journals.

Another serious problem is the rise of so-called "predatory" open access publishers that have distorted the original ideas behind the movement even more. The Guardian reported recently:

More than 175,000 scientific articles have been produced by five of the largest "predatory open-access publishers", including India-based Omics publishing group and the Turkish World Academy of Science, Engineering and Technology, or Waset.

But the vast majority of those articles skip almost all of the traditional checks and balances of scientific publishing, from peer review to an editorial board. Instead, most journals run by those companies will publish anything submitted to them -- provided the required fee is paid.

These issues will be hard, if not impossible, to solve. As a result, many are now looking for a different solution to the problem of providing easy and cost-free access to academic knowledge, this time in the form of preprints. Techdirt reported earlier this year that there is evidence the published version of a paper adds very little to the early preprint version that is placed online directly by the authors. The negligible barriers to entry, the speed at which work can be published, and the extremely low costs involved have led many to see preprints as the best solution to providing open access to academic papers without needing to go through publishers at all.

Inevitably, perhaps, criticisms of the idea are starting to appear. Recently, Tom Sheldon, who is a senior press manager at the Science Media Centre in London, published a commentary in one of the leading academic journals, Nature, under the headline: "Preprints could promote confusion and distortion". As he noted, this grew out of an earlier discussion paper that he published on the Science Media Centre's blog. The Science Media Centre describes itself as "an independent press office helping to ensure that the public have access to the best scientific evidence and expertise through the news media when science hits the headlines." Its funding comes from "scientific institutions, science-based companies, charities, media organisations and government". Sheldon's concerns are not so much about preprints themselves, but their impact on how science is reported:

I am a big fan of bold and disruptive changes which can lead to fundamental culture change. My reading around work on reproducibility, open access and preprint make me proud to be part of a scientific community intent on finding ways to make science better. But I am concerned about how this change might affect the bit of science publication that we are involved with at the Science Media Centre. The bit which is all about the way scientific findings find their way to the wider public and policymakers via the mass media.

One of his concerns is the lack of embargoes for preprints. At the moment, when researchers have what they think is an important result or discovery appearing in a paper, they typically offer trusted journalists a chance to read it in advance on the understanding that they won't write about it until the paper is officially released. This has a number of advantages. It creates a level playing field for those journalists, who all get to see the paper at the same time. Crucially, it allows journalists to contact other experts to ask their opinion of the results, which helps to catch rogue papers, and also provides much-needed context. Sheldon writes:

Contrast this with preprints. As soon as research is in the public domain, there is nothing to stop a journalist writing about it, and rushing to be the first to do so. Imagine early findings that seem to show that climate change is natural or that a common vaccine is unsafe. Preprints on subjects such as those could, if they become a story that goes viral, end up misleading millions, whether or not that was the intention of the authors.

That's certainly true, but is easy to remedy. Academics who plan to publish a preprint could offer a copy of the paper to the group of trusted journalists under embargo -- just as they would with traditional papers. One sentence describing why it would be worth reading is all that is required by way of introduction. To the extent that the system works for today's published papers, it will also work for preprints. Some authors may publish without giving journalists time to check with other experts, but that's also true for current papers. Similarly, some journalists may hanker after full press releases that spoon-feed them the results, but if they can't be bothered working it out for themselves, or contacting the researchers and asking for an explanation, they probably wouldn't write a very good article anyway.

The other concern relates to the quality of preprints. One of the key differences between a preprint and a paper published in a journal is that the latter usually goes through the process of "peer review", whereby fellow academics read and critique it. But it is widely agreed that the peer review process has serious flaws, as many have pointed out for years -- and as Sheldon himself admits.

Indeed, as defenders note, preprints allow far more scrutiny to be applied than with traditional peer review, because they are open for all to read and spot mistakes. There are some new and interesting projects to formalize this kind of open review. Sheldon rightly has particular concerns about papers on public health matters, where lives might be put at risk by erroneous or misleading results. But major preprint sites like bioRxiv (for biology) and the upcoming medRxiv (for medicine and health sciences) are already trying to reduce that problem by actively screening preprints before they are posted.

Sheldon certainly raises some valid questions about the impact of preprints on the communication of science to a general audience. None of the issues is insurmountable, but addressing them may require journalists as well as scientists to adapt to the changed landscape. However, changing how things are done is precisely the point of preprints. The present academic publishing system does not promote general access to knowledge that is largely funded by the taxpayer. The attempt by the open access movement to make that happen has arguably been neutered by shrewd moves on the part of traditional publishers, helped by complaisant politicians. Preprints are probably the best hope we have now for achieving a more equitable and efficient way of sharing knowledge and building on it.

Follow me @glynmoody on Twitter and +glynmoody on Google+


Posted on Techdirt - 8 August 2018 @ 7:59pm

ICANN Loses Yet Again In Its Quixotic Quest To Obtain A Special Exemption From The EU's GDPR

from the oh,-do-give-it-a-rest dept

Back in May, we wrote about the bizarre attempt by the Internet Corporation for Assigned Names and Numbers (ICANN) to exempt itself from the EU's new privacy legislation, the GDPR. ICANN sought an injunction to force EPAG, a Tucows-owned registrar based in Bonn, Germany, to collect administrative and technical contacts as part of the domain name registration process. EPAG had refused, because it felt doing so would fall foul of the GDPR. A German court turned down ICANN's request, but without addressing the question of whether gathering that information would breach the GDPR.

As the organization's timeline of the case indicates, ICANN then appealed to the Higher Regional Court of Cologne, Germany, against the ruling. Meanwhile, the lower court that issued the original judgment decided to re-visit the case, which it has the option to do upon receipt of an appeal. However, it did not change its view, and referred the matter to the upper Court. The Appellate Court of Cologne has issued its judgment (pdf), with a comprehensive smackdown of ICANN, yet again (via The Register):

Regardless of the fact that already in view of the convincing remarks of the Regional Court in its orders of 29 May 2018 and 16 July 2018 the existence of a claim for a preliminary injunction (Verfügungsanspruch) is doubtful, at least with regard to the main application, the granting the sought interim injunction fails in any case because the Applicant has not sufficiently explained and made credible a reason for a preliminary injunction (Verfügungsgrund).

The Appellate Court pointed out that ICANN could hardly claim it would suffer "irreparable harm" if it were not granted an injunction forcing EPAG to gather the additional data. If necessary, ICANN could collect that information at a later date, without any serious consequences. ICANN's case was further undermined by the fact that gathering administrative and technical contacts in the past had always been on a voluntary basis, so not doing so could hardly cause great damage.

Once more, then, the question of whether collecting this extra personal information was forbidden under the GDPR was not addressed, since ICANN's argument was found wanting irrespective of that privacy issue. And because no interpretation of the GDPR was required for the case, the Appellate Court also ruled there were no grounds for referring the question to the EU's highest court, the Court of Justice of the European Union.

ICANN says that it is "considering its next steps", but it's hard to see what those might be, given the unanimous verdict of the courts. Maybe it's time for ICANN to comply with the EU law like everybody else, and for it to stop wasting money in its forlorn attempts to get EU courts to grant it a special exemption from the GDPR's rules.



Posted on Techdirt - 2 August 2018 @ 3:26am

Facebook Granted 'Unprecedented' Leave To Appeal Over Referral Of Privacy Shield Case To Top EU Court

from the never-a-dull-moment dept

Back in April, we wrote about the latest development in the long, long saga of Max Schrems' legal challenge to Facebook's data transfers from the EU to the US. The Irish High Court referred the case to the EU's top court, asking the Court of Justice of the European Union (CJEU) to rule on eleven issues that the judge raised. Facebook tried to appeal against the Irish High Court's decision, but the received wisdom was that it was not an option for CJEU referrals of this kind. But as the Irish Times reports, to everyone's surprise, it seems the received wisdom was wrong:

The [Irish] Supreme Court has agreed to hear an unprecedented appeal by Facebook over a High Court judge's decision to refer to the European Court of Justice (CJEU) key issues concerning the validity of EU-US data transfer channels.

The Irish Chief Justice rejected arguments by the Irish Data Protection Commissioner and Schrems that Facebook could not seek to have the Supreme Court reverse certain disputed findings of fact by the High Court. The judge said that it was "at least arguable" Facebook could persuade the Supreme Court that some or all of the facts under challenge should be reversed. On that basis, the appeal could go ahead. Among the facts that would be considered were the following key points:

The chief justice said Facebook was essentially seeking that the Supreme Court "correct" the alleged errors, including the High Court findings of "mass indiscriminate" processing, that surveillance is legal unless forbidden, on the doctrine of legal standing in US law and in the consideration of other issues including safeguards.

Facebook also argues the High Court erred in finding the laws and practices of the US did not provide EU citizens with an effective remedy, as required under the Charter of Fundamental Rights of the EU, for breach of data privacy rights.

Those are crucial issues not just for Facebook, but also for the validity of the entire Privacy Shield framework, which is currently under pressure in the EU. It's not clear whether the Irish Supreme Court is really prepared to overrule the High Court judge, and to what extent the CJEU will take note anyway. One thing that is certain is that a complex and important case just took yet another surprising twist.



Posted on Techdirt - 26 July 2018 @ 8:06pm

EU And Japan Agree To Free Data Flows, Just As Tottering Privacy Shield Framework Threatens Transatlantic Transfers

from the cooperation-not-confrontation dept

The EU's strong data protection laws affect not only how personal data is handled within the European Union, but also where it can flow to. Under the GDPR, just as was the case with the preceding EU data protection directive, the personal data of EU citizens can only be sent to countries whose privacy laws meet the standard of "essential equivalence". That is, there may be differences in detail, but the overall effect has to be similar to the GDPR, something established as part of what is called an "adequacy decision". Just such an adequacy ruling by the European Commission has been agreed in favor of Japan:

This mutual adequacy arrangement will create the world's largest area of safe transfers of data based on a high level of protection for personal data. Europeans will benefit from strong protection of their personal data in line with EU privacy standards when their data is transferred to Japan. This arrangement will also complement the EU-Japan Economic Partnership Agreement, European companies will benefit from uninhibited flow of data with this key commercial partner, as well as from privileged access to the 127 million Japanese consumers. With this agreement, the EU and Japan affirm that, in the digital era, promoting high privacy standards and facilitating international trade go hand in hand. Under the GDPR, an adequacy decision is the most straightforward way to ensure secure and stable data flows.

Before the European Commission formally adopts the latest adequacy decision, Japan has agreed to tighten up certain aspects of its data protection laws by implementing the following:

A set of rules providing individuals in the EU whose personal data are transferred to Japan, with additional safeguards that will bridge several differences between the two data protection systems. These additional safeguards will strengthen, for example, the protection of sensitive data, the conditions under which EU data can be further transferred from Japan to another third country, the exercise of individual rights to access and rectification. These rules will be binding on Japanese companies importing data from the EU and enforceable by the Japanese independent data protection authority (PPC) and courts.

A complaint-handling mechanism to investigate and resolve complaints from Europeans regarding access to their data by Japanese public authorities. This new mechanism will be administered and supervised by the Japanese independent data protection authority.

It is precisely these areas that are proving so problematic for the data flow agreement between the EU and the US, known as the Privacy Shield framework. As Techdirt has reported, the European Commission is under increasing pressure to suspend Privacy Shield unless the US implements it fully -- something it has failed to do so far, despite repeated EU requests. Granting adequacy to Japan is an effective way to flag up that other major economies don't have any problems with the GDPR, and that the EU can turn its attention elsewhere if the US refuses to comply with the terms of the Privacy Shield agreement.

The new data deal with Japan still has several hurdles to overcome before it goes into effect. For example, the European Data Protection Board, the EU body in charge of applying the GDPR, must give its view on the adequacy ruling, as must the civil liberties committee of the European Parliament -- the one that has just called for Privacy Shield to be halted. Nonetheless, the European Commission will be keen to adopt the adequacy decision, not least to show that countries are still willing to reduce trade barriers, rather than to impose them, as the US is currently doing.



Posted on Techdirt - 25 July 2018 @ 3:27am

South Africa's Proposed Fair Use Right In Copyright Bill Is Surprisingly Good -- At The Moment

from the stand-back-for-the-lobbyist-attacks dept

Too often Techdirt writes about changes in copyright law that are only for the benefit of the big publishing and recording companies, and offer little to individual creators or the public. So it makes a pleasant change to be able to report that South Africa's efforts to update its creaking copyright laws seem, for the moment, to be bucking that trend. Specifically, those drafting the text seem to have listened to the calls for intelligent fair use rights fit for the digital world. As a recent post explains, a key aspect of copyright reform is enshrining exceptions that give permission to Internet users to do all the usual online stuff -- things like sharing photos on social media, or making and distributing memes. The South African text does a good job in this respect:

A key benefit of the Bill is that its new exceptions are generally framed to be open to all works, uses, and users. Research shows that providing exceptions that are open to purposes, uses, works and users is correlated with both information technology industry growth and to increased production of works of knowledge creation.

The solution adopted for the draft of the new copyright law is a hybrid approach that contains both a set of specific modern exceptions for various purposes, along with an open general exception that can be used to assess any use not specifically authorized:

The key change is the addition of "such as" before the list of purposes covered by the right, making the provision applicable to a use for any purpose, as long as that use is fair to the author.

In order to test whether a use is fair, the standard four factors are to be considered:

(i) the nature of the work in question;

(ii) the amount and substantiality of the part of the work affected by the act in relation to the whole of the work;

(iii) the purpose and character of the use, including whether --

(aa) such use serves a purpose different from that of the work affected; and
(bb) it is of a commercial nature or for non-profit research, library or educational purposes; and

(iv) the substitution effect of the act upon the potential market for the work in question.

Crucially, the legislators rejected calls by some to include a fifth factor that would look at whether licenses for the intended use were available. As the post points out, had that factor been included, it would have made it considerably harder to claim fair use. That's one reason why the copyright world has been pushing so hard for licensing as the solution to everything -- whether it's orphan works, text and data mining, or the EU's revised copyright directive. That rejection sends an important signal to other politicians looking to update their copyright laws, and makes the South African text particularly welcome, as the post underlines:

We commend its Parliament on both the openness of this process and on the excellent drafting of the proposed fair use clause. We are confident it will become a model for other countries around the world that seek to modernize their copyright laws for the digital age.

However, for that very reason, the fair use proposal is likely to come under heavy attack from the copyright companies and their lobbyists. It remains to be seen whether the good things in the present Bill will still be there in the final law.



Posted on Techdirt - 23 July 2018 @ 7:49pm

Applicant For Major EU Open Access Publishing Contract Proposes Open Source, Open Data And Open Peer Review As Solution

from the Elsevier-not-invited dept

We've just written about growing discontent among open access advocates with the role that the publishing giant Elsevier will play in monitoring open science in the EU. That unhappiness probably just went up a notch as a result of the following development, reported here by Nature:

Elsevier last week stopped thousands of scientists in Germany from reading its recent journal articles, as a row escalates over the cost of a nationwide open-access agreement.

The move comes just two weeks after researchers in Sweden lost access to the most recent Elsevier research papers, when negotiations on its contract broke down over the same issue.

The open science monitoring project involving Elsevier is only a very minor part of the EU's overall open science strategy, which itself is part of the €80 billion research and innovation program called Horizon 2020. A new post on the blog of the open access publisher Hindawi reveals that it has put in a bid in response to the European Commission's call for tenders to launch a major new open research publishing platform:

The Commission's aim is to build on their progressive Open Science agenda to provide an optional Open Access publishing platform for the articles of all researchers with Horizon 2020 grants. The platform will also provide incentives for researchers to adopt Open Science practices, such as publishing preprints, sharing data, and open peer review. The potential for this initiative to lead a systemic transformation in research practice and scholarly communication in Europe and more widely should not be underestimated.

That last sentence makes a bold claim. Hindawi's blog post provides some good analysis of why the transition to open access and open science is proving so hard. Hindawi's proposed solution is based on open source code, and openness in general:

Our proposal to the Commission involves the development of an end-to-end publishing platform that is fully Open Source, with an editorial model that incentivises Open Science practices including preprints, data sharing, and objective open peer review. Data about the impact of published outputs would also be fully open and available for independent scrutiny, and the policies and governance of the platform would be managed by the research community. In particular, researchers who are currently disenfranchised by the current academic reward system, including early career researchers and researchers whose primary research outputs include data and software code, would have a key role in developing the policies of the platform.

Recognizing the flaws in the current system of assessment and rewards is key here. Open access has been around for two decades, but the reliance on near-meaningless impact factors to judge the alleged influence of titles, and thus of the work published in them, has barely changed at all. As the Hindawi blog post notes:

As long as journal rank and journal impact factor remain the currency used to judge academic quality, no amount of technological change or economic support for open outputs and open infrastructure will make research and researchers more open

Unfortunately, even a major project like the Horizon 2020 open research publishing platform -- whichever company wins the contract -- will not be able to change that culture on its own, however welcome it might be in itself. Core changes must come from within the academic world. Sadly, there are still precious few signs that those in positions of power are willing to embrace not just open access and even open science, but also a radical openness that extends to every aspect of the academic world, including evaluation and recognition.



Posted on Techdirt - 16 July 2018 @ 9:36am

Guy In Charge Of Pushing Draconian EU Copyright Directive, Evasive About His Own Use Of Copyright Protected Images

from the do-as-I-say,-not-as-I-do? dept

There's one person who wields more power than anyone to shape the awful EU Copyright Directive: the MEP Axel Voss. He's the head of the main Legal Affairs Committee (JURI) that is steering the Directive through the European Parliament. Voss took over from the previous MEP leading JURI, Therese Comodini Cachia, after she decided to return to her native Malta as a member of its national parliament. Her draft version of the Directive was certainly not perfect, but it did possess the virtue of being broadly acceptable to all sides of the debate. When Voss took over last year, the text took a dramatic turn for the worse thanks to the infamous "snippet tax" (Article 11 of the proposed Directive) and the "upload filter" (Article 13).

As Mike reported a couple of weeks ago, Voss offered a pretty poor defense of his proposals, showing little understanding of the Internet. But he made clear that he thinks respecting copyright law is really important. For example, he said he was particularly concerned that material is being placed online where "there is no remuneration of the concerned author." Given that background, it will probably come as no surprise to Techdirt readers to learn that questions are now being asked about whether Voss himself has paid creators for material that he has used on his social media accounts:

BuzzFeed News Germany ... looked at the posts from the past 24 months on Voss's social media channels. In the two years, BuzzFeed News has found at least 17 copyrighted images from at least eight different image agencies, including the German press agency dpa.

As good journalists, BuzzFeed News Germany naturally contacted Axel Voss to ask whether he had paid to use all these copyrighted images:

Since last Thursday, 5 July, BuzzFeed News Germany has asked Voss's office and his personal assistant a total of five times in writing and several times over the phone whether Axel Voss or his social media team has paid for the use of these copyrighted photos. Voss's staff responded evasively five times. Asked if the office could provide us with licensing evidence, the Voss office responded: "We do not provide invoices to uninvolved third parties."

Such a simple question -- had Voss paid for the images he used? -- and yet one that seemed so hard for the Voss team to answer, even with the single word "yes". The article (original in German) includes screenshots of the images the BuzzFeed News Germany journalists had found. That's just as well, because shortly afterwards, 12 of the 17 posts with copyrighted images had been deleted. The journalists contacted Axel Voss once more, and asked why they had disappeared (original in German). Axel Voss's office replied that anyone can add and remove posts as they wish. Which is true, but fails to answer the question, yet again. However, Axel Voss's office did offer an additional "explanation":

according to the current legal situation (...), if the right-holder informs us that we have violated their rights, we remove the image in question according to the notice and takedown procedure of the e-commerce directive.

That is, Axel Voss, or his office, seems to believe it's fine to post copyrighted material online provided you take it down if someone complains. But that's not how it works at all. The EU notice and takedown procedure applies to the Internet services hosting material, not to the individual users of those services. The fact that the team of the senior MEP responsible for pushing the deeply-flawed Copyright Directive through the European Parliament is ignorant of the current laws is bad enough. That he may have posted copyrighted material without paying for it, while claiming to be worried that creators aren't being remunerated for their work, is beyond ridiculous.



Posted on Techdirt - 11 July 2018 @ 7:37pm

European Parliament Turns Up The Pressure On US-EU Privacy Shield Data Transfer Deal A Little More

from the how-much-longer-can-it-last? dept

Many stories on Techdirt seem to grind on forever, with new twists and turns constantly appearing, including unexpected developments -- or small, incremental changes. The transatlantic data transfer saga has seen a bit of both. Back in 2015, the EU's top court ruled that the existing legal framework for moving data across the Atlantic, Safe Harbor, was "invalid". That sounds mild, but it isn't. Safe Harbor was necessary in order for data transfers across the Atlantic to comply with EU data protection laws. A declaration that it was "invalid" meant that it could no longer be used to provide legal cover for huge numbers of commercial data flows that keep the Internet and e-commerce ticking over. The solution was to come up with a replacement, Privacy Shield, that supposedly addressed the shortcomings cited by the EU court.

The problem is that a growing number of influential voices don't believe that Privacy Shield does, in fact, solve the problems of the Safe Harbor deal. For example, in March last year, two leading civil liberties groups -- the American Civil Liberties Union and Human Rights Watch -- sent a joint letter to the EU's Commissioner for Justice, Consumers and Gender Equality, and other leading members of the European Commission and Parliament, urging the EU to re-examine the Privacy Shield agreement. In December, an obscure but influential advisory group of EU data protection officials asked the US to fix problems of Privacy Shield or expect the EU's top court to be asked to rule on its validity. In April of this year, the Irish High Court made just such a referral as a result of a complaint by the Austrian privacy expert Max Schrems. Since he was instrumental in getting Safe Harbor struck down, that's not something to be taken lightly.

Lastly, one of the European Parliament's powerful committees, which helps determine policy related to civil liberties, added its voice to the discussion. It called on the European Commission to suspend the Privacy Shield agreement unless the US fixed the problems that the committee discerned in its current implementation. At that point, it was just a committee making the call. However, in a recent plenary session, the European Parliament itself voted to back the idea, and by a healthy margin:

MEPs call on the EU Commission to suspend the EU-US Privacy Shield as it fails to provide enough data protection for EU citizens.

The data exchange deal should be suspended unless the US complies with EU data protection rules by 1 September 2018, say MEPs in a resolution passed on Thursday by 303 votes to 223, with 29 abstentions. MEPs add that the deal should remain suspended until the US authorities comply with its terms in full.

It's important to note that this vote is largely symbolic: if the US refuses to improve the data protection of EU citizens, there's nothing to force the European Commission to comply with the demand of the European Parliament. That said, the call by arguably the most democratic part of the EU -- MEPs are directly elected by European citizens -- piles more pressure on the European Commission, which is appointed by EU governments, not elected. If nothing else, this latest move adds to the general impression that Privacy Shield is not likely to survive in its present form much longer.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 9 July 2018 @ 7:36pm

Elsevier Will Monitor Open Science In EU Using Measurement System That Favors Its Own Titles

from the conflict-of-interest?-I've-heard-of-it dept

Back in April, we wrote about a curious decision to give the widely-hated publisher Elsevier the job of monitoring open science in the EU. That would include open access too, an area where the company has major investments. The fact that the European Commission seemed untroubled by that clear conflict of interest stunned supporters of open access. Now one of them -- the paleontologist Jon Tennant -- is calling on the European Commission to remove Elsevier, and to find another company with no conflicts of interest. As Tennant writes in the Guardian:

How is it reasonable for a multi-billion dollar publishing corporation to not only produce metrics that evaluate publishing impact [of scientific articles], but also to use them to monitor Open Science and help to define its future direction? Elsevier will be providing data through the monitor that will be used to help facilitate future policy making in the EU that it inevitably will benefit from. That's like having McDonald's monitor the eating habits of a nation and then using that to guide policy decisions.

Elsevier responded with a blog post challenging what it calls "misinformation" in Tennant's article:

We are one of the leading open access publishers, and we make more articles openly available than any other publisher. We make freely available open science products and services we have developed and acquired to enable scientists to collaborate, post their early findings, store their data and showcase their output.

It added:

We have co-developed CiteScore and Snowball Metrics with the research community -- all of which are open, transparent, and free indicators.

CiteScore may be "open, transparent, and free", but Tennant writes:

Consider Elsevier's CiteScore metric, a measure of the apparent impact of journals that competes with the impact factor based on citation data from Scopus. An independent analysis showed that titles owned by Springer Nature, perhaps Elsevier’s biggest competitor, scored 40% lower and Elsevier titles 25% higher when using CiteScore rather than previous journal impact factors.

In other words, one of the core metrics that Elsevier will be applying as part of the Open Science Monitor appears to show bias in favor of Elsevier's own titles. One result of that bias could be that when the Open Science Monitor publishes its results based on Elsevier's metrics, the European Commission and other institutions will start using Elsevier's academic journals in preference to its competitors'. The use of CiteScore creates yet another conflict of interest for Elsevier.
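For a concrete sense of what a journal metric of this kind computes, here is a rough, illustrative sketch of a CiteScore-style calculation. It follows the originally published definition -- citations received in a year to items published in the preceding three years, divided by the count of those items -- but it is a toy, not Scopus's actual methodology, and the function name is invented for the example:

```python
# Illustrative CiteScore-style journal metric (toy version).
# Real metrics differ in which citations and which document types
# they count -- exactly the choices that can shift a journal's
# score by double-digit percentages.

def citescore_like(citations_in_year, docs_in_prior_3_years):
    """Citations received this year to items from the previous
    three years, divided by the number of those items."""
    if docs_in_prior_3_years == 0:
        return 0.0
    return citations_in_year / docs_in_prior_3_years

# A journal whose 400 recent items attracted 1,200 citations:
print(citescore_like(1200, 400))  # → 3.0
```

Because both the numerator and the denominator depend on whose citation database feeds them, the same journal can score very differently under CiteScore and under the traditional impact factor -- which is the heart of the bias complaint.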

As well as writing about his concerns, Tennant is also making a formal complaint to the European Commission Ombudsman regarding the relationship between Elsevier and the Open Science Monitor:

The reason we are pursuing this route is due to the fact that the opportunity to raise a formal appeal was denied to us. In the tender award statement, it states that "Within 2 months of the notification of the award decision you may lodge an appeal to the body referred to in VI.4.1.", which is the General Court in Luxembourg. The notification of the award was on January 11, 2018, and it was exactly 2 months and 1 day later when the role of Elsevier as subcontractor was first publicly disclosed. Due to this timing, we were unable to lodge an appeal.

In other words, it was only revealed that Elsevier was the sub-contractor when it was too late to appeal against that choice. A cynic might almost think those behind the move knew people would object, and kept it quiet until it was impossible under the rules to appeal. Open science? Not so much…


Posted on Techdirt - 3 July 2018 @ 11:51am

Copyright Industries Reveal Their Ultimate Goal: An Internet Where Everything Online Requires A License From Them

from the now-would-be-a-good-time-to-stop-it-happening dept

Yesterday, Mike took apart an extraordinarily weak attempt by the UK's music collection society, PRS for Music, to counter what it claimed were "myths" about the deeply-harmful Article 13 of the proposed EU Copyright Directive. On the same day, the Guardian published a letter from the PRS and related organizations entitled "How the EU can make the internet play fair with musicians". It is essentially a condensed version of the "myth-busting" article, and repeats many of the same fallacious arguments. It also contains some extremely telling passages that are worth highlighting for the insights that they provide into the copyright industries' thinking and ultimate goal. Here is the main thrust of the letter:

This is not about censorship of the internet, as the likes of Google and Facebook would have you believe. The primary focus of this legislation is concerned with whether or not the internet functions as a fair and efficient marketplace -- and currently it doesn't.

Once again, there is no attempt to demonstrate that Article 13 is not about censorship, merely an assertion that it isn't, together with the usual claim that it's all being orchestrated by big US Internet companies. The fact that over two-thirds of a million people have signed an online petition calling for the "censorship machine" of Article 13 to be stopped rather punctures that tired argument.

More interesting is the second sentence, which essentially confirms that for the recording industry, the Copyright Directive -- and, indeed, the Internet itself -- is purely about getting as much money as possible. There is no sense that there are other important aspects -- like encouraging ordinary people to express themselves, and to be creative for the sheer joy of creating, or in order to amuse and engage with friends and strangers. The fact that all these non-commercial uses will be adversely affected by Article 13 is irrelevant to the recording industry, which seems to believe that making a profit takes precedence over everything else. However, even if they choose to ignore this side of the Internet, the signatories of the letter are well-aware that there is a huge backlash against the proposed law precisely because it is a threat to this kind of everyday online use. Attempting to counter this, they go on:

It is important to recognise that article 13 of the proposed EU copyright directive imposes no obligation on users. The obligations relate only to platforms and rightsholders. Contrary to some sensationalist headlines, internet memes will not be affected, as they are already covered by exceptions to copyright, and nothing in the proposed article will allow rightsholders to block the use of them.

Techdirt pointed out yesterday why the first part of that is intellectually dishonest. The Copyright Directive won't impose obligations on users directly, but on the platforms that people use, which amounts to the same thing in practice. The letter then trots out the claim that Internet memes will not be affected, and specifically says this is because they are already covered by EU exceptions to copyright.

This is simply not true. Article 5 of the EU's 2001 Directive on the "harmonisation of certain aspects of copyright and related rights in the information society" lays down that "Member States may provide for exceptions or limitations", including "for the purpose of caricature, parody or pastiche". However, that is optional, not compulsory. In fact, nineteen EU Member States -- including the EU's most populous country, Germany -- have chosen not to provide an exception for parody. Even assuming that memes would be covered by parody exceptions -- by no means guaranteed -- they would in any case be infringing in the 19 EU nations that lack one.

Licensing is not an option here. There are many diverse sources for the material used in memes, most of which have no kind of organization that could give a license. The only way for online companies to comply with Article 13 would be to block all memes using any kind of pre-existing material in those 19 countries without a parody exception. Worse: because it will be hard to apply different censorship rules for each EU nation, it is likely that the upload filters will block all such memes in the whole EU, erring on the side of caution. It will then be up to the person whose meme has been censored to appeal against that decision, using an as-yet undefined appeals mechanism. The chilling effect this "guilty until proven innocent" approach will have on memes and much else is clear.

The blatant misinformation about whether memes would be blocked is bad enough. But in many ways, the most shocking phrase in the letter is the following:

Actually, article 13 makes it easier for users to create, post and share content online, as it requires platforms to get licences, and rightsholders to ensure these licences cover the acts of all individual users acting in a non-commercial capacity.

There, in black and white, is the end-game that the recording industry is seeking: that every online act of individual users, even the non-commercial ones, on the major platforms must be licensed. But the desire to control the online world, and to dictate who may do what there, is not limited to the recording companies: it's what all the copyright industries want. That can be seen in Article 11 of the Copyright Directive -- the so-called "snippet tax" -- which will require licensing for the use by online sites of even small excerpts of news material.

It's also at the root of the core problem with Article 3 of the proposed EU law. This section deals with the important new field of text and data mining (TDM), which takes existing texts and data, and seeks to extract new information by collating them and analyzing them using powerful computers. The current Copyright Directive text allows TDM to be carried out freely by non-profit research organisations, on material to which they have lawful access. However, companies must pay publishers for a new, additional, license to carry out TDM, even on material they have already licensed for traditional uses like reading. That short-sighted double-licensing approach pretty much guarantees that AI startups, which typically require frictionless access to large amounts of training data, won't choose to set up shop in the EU. But the publishing industry never cares about the collateral damage it inflicts, provided it attains its purely selfish goals.
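As a concrete illustration of what TDM involves at its simplest, here is a minimal Python sketch -- a toy, not a production pipeline. The principle is that software reads a corpus and returns aggregate information that no single document states on its own:

```python
# Minimal text-mining sketch: count term frequencies across a corpus.
# Real TDM runs over millions of licensed articles, but the principle
# is the same -- the machine does the reading, the researcher queries
# the aggregate.
import re
from collections import Counter

corpus = [
    "Copyright law shapes how text mining is done in the EU.",
    "Text mining extracts new information from existing text.",
]

def term_frequencies(documents):
    counts = Counter()
    for doc in documents:
        counts.update(re.findall(r"[a-z]+", doc.lower()))
    return counts

freqs = term_frequencies(corpus)
print(freqs["text"], freqs["mining"])  # → 3 2
```

The double-licensing question is whether a company that has already paid to read these documents must pay again before its software may count the words in them.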

Although it's rather breathtaking to see the copyright world openly admit that its ultimate aim is to turn the Internet into a space where everything is licensed, we shouldn't be surprised. Back in 2013, Techdirt wrote about the first stages of the EU's revision of its copyright law. One preliminary initiative was called "Licences for Europe", and its stated aim was to "explore the potential and limits of innovative licensing and technological solutions in making EU copyright law and practice fit for the digital age". What we are seeing now in the proposed Copyright Directive is simply a fulfillment of these ambitions, long-cherished by the copyright industries. If you aren't happy about that, now would be a good time to tell the EU Parliament to Save Your Internet. It may be your last chance.


Posted on Techdirt - 2 July 2018 @ 7:48pm

Researchers Reveal Details Of Printer Tracking Dots, Develop Free Software To Defeat It

from the whistleblowers-of-the-world,-rejoice,-but-still-be-careful dept

As Techdirt has reported previously in the case of Reality Leigh Winner, most modern color laser printers place tiny yellow tracking dots on every page printed -- what Wikipedia calls "printer steganography". The Electronic Frontier Foundation (EFF) first started warning about this sneaky form of surveillance back in 2005. It published a list of printers and whether it was known that they used tracking dots. In 2017, the EFF stopped updating the list, and wrote:

It appears likely that all recent commercial color laser printers print some kind of forensic tracking codes, not necessarily using yellow dots. This is true whether or not those codes are visible to the eye and whether or not the printer models are listed here. This also includes the printers that are listed here as not producing yellow dots.

Despite the EFF's early work in exposing the practice, there has been limited information available about the various tracking systems. Two German researchers at the Technical University of Dresden, Timo Richter and Stephan Escher, have now greatly extended our knowledge about the yellow dot code. As the published paper on the work explains, the researchers looked at 1,286 printed pages from 141 printers, produced by 18 different manufacturers. They discovered four different encoding systems, including one that was hitherto unknown. The yellow dots formed grids with 48, 64, 69 or 98 points; using the grid to encode binary data, the hidden information was repeated multiple times across the printed page. In all cases the researchers were able to extract the manufacturer's name, the model's serial number, and for some printers the date and time of printing too.
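To show the principle only -- the four real encodings the researchers identified are manufacturer-specific and not reproduced here -- a dot grid can be read as binary data, for instance one byte per column:

```python
# Toy decoder for a hypothetical yellow-dot grid: each column of the
# dot matrix is read as one byte, most significant bit at the top.
# Real printer schemes add repetition and parity, and differ by vendor.

def decode_grid(grid):
    values = []
    for col in range(len(grid[0])):
        byte = 0
        for row in range(len(grid)):
            byte = (byte << 1) | grid[row][col]
        values.append(byte)
    return values

# An 8x2 grid encoding the bytes 0x41 and 0x0F.
grid = [
    [0, 0],
    [1, 0],
    [0, 0],
    [0, 0],
    [0, 1],
    [0, 1],
    [0, 1],
    [1, 1],
]
print(decode_grid(grid))  # → [65, 15]
```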

It's obviously good to have all this new information about tracking dots, but arguably even more important is a software tool that the researchers have written, and made freely available. It can be used to obfuscate tracking information that a printer places in one of the four grid patterns, thus ensuring that the hard copy documents cannot easily be used to trace who printed them. Printer manufacturers will doubtless come up with new ways of tracking documents, and may already be using some we don't know about, but this latest work at least makes it harder with existing models.
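The obfuscation idea itself is simple, as this deliberately crude sketch shows: if extra yellow dots are added until the genuine pattern can no longer be isolated, the embedded data becomes unreadable. The researchers' actual tool is more selective about which dots it adds; saturating the whole grid is just the limiting case.

```python
# Crude obfuscation sketch: fill every grid position with a dot, so
# every possible codeword is present and the true one is unrecoverable.
def obfuscate(grid):
    return [[1] * len(row) for row in grid]

print(obfuscate([[0, 1], [1, 0]]))  # → [[1, 1], [1, 1]]
```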


Posted on Techdirt - 2 July 2018 @ 3:23am

Over 60 Organizations Want Sanctions For EU Nations' Failure To Repeal 'Invalid' Data Retention Laws

from the attacking-the-data-retention-zombie dept

We recently wrote about a slight setback in the fight against mass surveillance in Europe. But in general, there have been some good decisions in this field by the EU's top court. In 2014, the Court of Justice of the European Union (CJEU) ruled that the region's Data Retention Directive was "invalid", in what is generally known as the "Digital Rights Ireland" case. In 2016, the CJEU took a similarly dim view of the UK's Data Retention and Investigatory Powers Act (DRIPA), in the "Tele-2/Watson" judgment. Under EU law, those decisions had to be implemented by all the EU Member States. But a report by Privacy International published in September last year showed that compliance has been dismal (pdf):

in an alarmingly large number of Member States (roughly 40% of all countries surveyed in this report) the pre-Digital Rights Ireland regime transposing Directive 2006/24 is still in place.

That is, the national laws that implemented the Data Retention Directive had not been repealed, despite the CJEU's ruling that they were invalid, nor had new legislation been passed. The research also showed something interesting about the other countries that had repealed or amended their data retention laws:

What has emerged from our analysis is that as a rule of thumb repeal or amendments to data retention legislation have mainly occurred as a result of challenges in national courts, predominately by human rights NGOs, while Governments and legislators have been largely inactive.

In other words, governments have to be kicked into doing something, otherwise they just ignore the CJEU's ruling. Based on that fact, dozens of NGOs, community networks, academics and activists have decided to increase the pressure on Member States that are slacking:

60 organisations, community networks and academics in 19 EU Member States are sharing their concerns to the European Commission, to demand action, and to stand for the protection of fundamental rights enshrined in Articles 7, 8 and 11 of the Charter of Fundamental Rights of the European Union, as interpreted by the Grand Chamber of the CJEU. We call for the application of sanctions for non-compliant Member States by referring to the CJEU, which should logically strike down all current data retention national frameworks.

As the dedicated web site indicates, there are now over 60 organizations backing the move and signatories to the formal letter of complaint sent to the European Commission (pdf). Given the CJEU's clear ruling against the earlier data retention frameworks, it seems likely that it will also strike down the national implementations of them. Whether the European Commission will send these cases to the CJEU, and how long it will take if it decides to do so, is less clear. If nothing else, the latest move underlines just how important it is for digital rights organizations to keep up the pressure -- and how hard it is to kill off bad EU laws once they are passed.


Posted on Techdirt - 21 June 2018 @ 11:58am

In A Surprising Decision, European Court Of Human Rights Says Sweden's Mass Surveillance Is Fine

from the but-top-EU-court's-views-may-matter-more dept

In the wake of Snowden's revelations of the scale of mass surveillance around the world, various cases have been brought before the courts in an attempt to stop or at least limit this activity. One involved Sweden's use of bulk interception for gathering foreign intelligence. A public interest law firm filed a complaint at the European Court of Human Rights (ECtHR). It alleged that governmental spying breached its privacy rights under Article 8 of the European Convention on Human Rights (pdf). The complaint said that the system of secret surveillance potentially affected all users of the Internet and mobile phones in Sweden, and pointed out that there was no system for citizens to use if they suspected their communications had been intercepted. The ECtHR has just ruled that "although there were some areas for improvement, overall the Swedish system of bulk interception provided adequate and sufficient guarantees against arbitrariness and the risk of abuse":

In particular, the scope of the signals intelligence measures and the treatment of intercepted data were clearly defined in law, permission for interception had to be by court order after a detailed examination, it was only permitted for communications crossing the Swedish border and not within Sweden itself, it could only be for a maximum of six months, and any renewal required a review. Furthermore, there were several independent bodies, in particular an inspectorate, tasked with the supervision and review of the system. Lastly, the lack of notification of surveillance measures was compensated for by the fact that there were a number of complaint mechanisms available, in particular via the inspectorate, the Parliamentary Ombudsmen and the Chancellor of Justice.

When coming to that conclusion, the Court took into account the State's discretionary powers in protecting national security, especially given the present-day threats of global terrorism and serious cross-border crime.

One expert in this area, TJ McIntyre, expressed on Twitter his disappointment with the judgment:

It might have been too much to expect bulk intercept ruled out in principle, but it is surprising to see a retreat from existing standards on safeguards.

McIntyre played a leading role in one of the key cases brought against mass surveillance, by Digital Rights Ireland in 2014. It resulted in the EU's top court, the Court of Justice of the European Union (CJEU), ruling the EU's Data Retention Directive was "invalid". As McIntyre notes, the detailed ECtHR analysis mentions the CJEU decision, but not the more recent ruling by the latter that struck down the "Safe Harbor" framework because of mass surveillance by the NSA.

The judgment significantly waters down safeguards previously developed by the ECtHR in relation to notification and possibility of a remedy against unlawful surveillance.

For example, McIntyre points out the ECtHR accepted that it is necessary for the Swedish signals intelligence service to store raw material before it can be manually processed:

Remarkably weak controls on storage and downstream use of intercept material were accepted by the ECtHR -- in particular, it was satisfied with retention of bulk intercept "raw material" for one year!

The latest judgment is something of a setback for efforts to limit mass surveillance, and it goes against the general trend of decisions by the arguably more important CJEU. In 2014 the latter effectively ruled that its own decisions should take precedence over those of the ECtHR if they came into conflict. Such a conflict is now more likely, given the CJEU's hardening position against mass surveillance and the softer line the ECtHR has taken in this diverging judgment.


Posted on Techdirt - 20 June 2018 @ 7:38pm

China's Latest Censorship Crackdown Target: Videos Of Women Rubbing, Kissing And Licking Binaural Microphones

from the whisper-sweet-nothings-in-my-ear dept

A few weeks back, we wrote about some unpublished censorship guidelines that provided insights into what the Chinese government is trying to stamp out online. However, one of the more curious activities whose depiction was forbidden was "vulgar use of a microphone controller". That seemed both surprisingly specific, and yet tantalizingly vague. A new post on Abacus News may explain what was meant by that phrase. It reports on yet another censorship move by the Chinese authorities:

the country's anti-pornography office ordered a number of platforms to remove a lot of ASMR content -- because they say some are akin to softcore porn.

Autonomous sensory meridian response (ASMR) is defined by Wikipedia as follows:

a term used for an experience characterized by a static-like or tingling sensation on the skin that typically begins on the scalp and moves down the back of the neck and upper spine. It has been compared with auditory-tactile synesthesia. ASMR signifies the subjective experience of "low-grade euphoria" characterized by "a combination of positive feelings and a distinct static-like tingling sensation on the skin". It is most commonly triggered by specific auditory or visual stimuli, and less commonly by intentional attention control.

The banned videos in China typically show people -- well, nearly always young women -- whispering into special high-quality binaural microphones that aim to capture audio the same way our ears hear sounds. As well as producing extremely realistic results, the microphones also allow sounds to move from one ear to the other -- best experienced with headphones to enhance this effect -- as if the person speaking is right next to you, and moving around very close to you.
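The ear-to-ear movement rests on level and timing differences between the two channels. A toy Python sketch of the level component alone -- panning a sine tone across the stereo field over one second -- gives the flavor (the sample rate and tone frequency are arbitrary choices for the example):

```python
# Pan a mono tone from the left channel to the right over the clip,
# crudely mimicking the interaural level difference that makes
# binaural recordings feel like a source moving past your head.
import math

RATE = 8000                  # samples per second
TOTAL = RATE                 # one second of audio
FREQ = 440.0                 # tone frequency in Hz

frames = []
for n in range(TOTAL):
    sample = math.sin(2 * math.pi * FREQ * n / RATE)
    pan = n / TOTAL          # 0.0 = hard left, 1.0 = hard right
    frames.append((sample * (1.0 - pan), sample * pan))

# Early frames carry their energy on the left, late frames on the right.
print(abs(frames[1][0]) > abs(frames[1][1]))    # → True
print(abs(frames[-1][0]) < abs(frames[-1][1]))  # → True
```

Real binaural recording also preserves inter-ear timing and spectral cues, which is why it is captured with ear-shaped microphones rather than synthesized this simply.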

The women in the videos whisper, rather than speak, because it has been found to be the most effective way to produce ASMR's characteristic "tingling" sensation. But ASMR videos also include the sounds of people licking, kissing, and rubbing the microphones in various ways -- which may explain that "vulgar use of a microphone controller" the Chinese authorities want to censor. As a representative example, the Abacus News points to a two-hour long YouTube video of one of the ASMR stars in China, Xuanzi Giant 2 Rabbit:

In the video, she speaks softly into an ear-shaped microphone, taps it, covers it in plastic, even rubs a Q-tip inside it, creating a variety of sounds to trigger ASMR.

But she does it while dressed in the revealing outfit of Mai Shiranui from The King of Fighters, and whispers things like "Husband, your highness, do you have any instructions?" In another clip, wearing the same outfit, she strikes a provocative pose on the bed.

ASMR is even referred to as "in-skull orgasm" by many Chinese internet users, highlighting the sexual image of some videos.

It's not hard to see why China's anti-pornography department might want to block this kind of thing. However, as a short video by The New York Times exploring the phenomenon makes clear, mainstream ASMR is rather different from these Chinese variants. The aim is to relax rather than excite, and to tap into what may be a calming physiological response similar to that produced when animals groom each other. In any case, the Chinese attempt to censor ASMR videos seems pretty hopeless:

After hearing about this crackdown, we tried to search by the keyword "ASMR" on some of China's biggest streaming platforms, like Bilibili and Douyu. The searches yielded no results. But the videos still appear if you go directly to the playlists of many ASMR hosts. And since they're not banned in the West, many are available on YouTube.

This probably means we can expect yet another Chinese crackdown on ASMR videos at some point in the future, and yet another failure to eradicate that "vulgar use of a microphone controller".


Posted on Techdirt - 18 June 2018 @ 7:48pm

Open Source Industry Australia Says Zombie TPP Could Destroy Free Software Licensing

from the another-reason-not-to-ratify dept

It seems incredible, but the TPP trade deal is still staggering on, zombie-like. Its official name is now the Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP), but even the Australian government just calls it TPP-11. The "11" refers to the fact that TPP originally involved 12 nations, but the US pulled out after Donald Trump's election. The Australian Senate Standing Committee on Foreign Affairs, Defence & Trade is currently conducting an inquiry into TPP-11 as a step towards ratification by Australia. However, in its submission to the committee (pdf), Open Source Industry Australia (OSIA) warns that provisions in TPP-11's Electronic Commerce Chapter "have the potential to destroy the Australian free & open source software (FOSS) sector altogether", and calls on the Australian government not to ratify the deal. The problem lies in Article 14.17 of the TPP-11 text (pdf):

No Party shall require the transfer of, or access to, source code of software owned by a person of another Party, as a condition for the import, distribution, sale or use of such software, or of products containing such software, in its territory.

In its submission to the committee, the OSIA writes:

Article 14.17 of CPTPP prohibits requirements for transfer or access to the source code of computer software. Whilst it does contain some exceptions, those are very narrow and appear rather carelessly worded in places. The exception that has OSIA up in arms covers "the inclusion of terms and conditions related to the provision of source code in commercially negotiated contracts". If Australia ratifies CPTPP, much will turn on whether the Courts interpret the term "commercially negotiated contracts" as including FOSS licences all the time, some of the time or none of the time.

If the Australian courts rule that open source licenses are not "commercially negotiated contracts", those licences will no longer be enforceable in Australia, and free software as we know it will probably no longer exist there. Even if the courts rule that free software licenses are indeed "commercially negotiated contracts", there is another problem, the OSIA says:

The wording of Art. 14.17 makes it unclear whether authors could still seek injunctions to enforce compliance with licence terms requiring transfer of source code in cases where their copyright has been infringed.

Without the ability to enforce compliance through the use of injunctions, open source licenses would once again be pointless. Although the OSIA is concerned about free software in Australia, the same logic would apply to any TPP-11 country. It would also impact other nations that joined the Pacific pact later, as the UK is considering (the UK government seems not to have heard of the gravity model of trade). It would presumably apply to the US if it did indeed rejoin the pact, as has been mooted. In other words, the impact of this section on open source globally could be significant.

It's worth remembering why this particular article is present in TPP. It grew out of concerns that nations like China and Russia were demanding access to source code as a pre-requisite of allowing Western software companies to operate in their countries. Article 14.17 was designed as a bulwark against such demands. It's unlikely that it was intended to destroy open source licensing too, although some spotted early on that this was a risk. And doubtless a few big software companies will be only too happy to see free software undermined in this way. Unfortunately, it's probably too much to hope that the Australian Senate Standing Committee on Foreign Affairs, Defence & Trade will care about or even understand this subtle software licensing issue. The fate of free software in Australia will therefore depend on whether TPP-11 comes into force, and if so, what judges think Article 14.17 means.


Posted on Techdirt - 14 June 2018 @ 12:00pm

EU Politicians Tell European Commission To Suspend Privacy Shield Data Transfer Framework

from the US-must-try-harder dept

A couple of months ago, we wrote about an important case at the Court of Justice of the European Union (CJEU), the region's highest court. The final judgment is expected to rule on whether the Privacy Shield framework for transferring EU personal data to the US is legal under EU data protection law. Many expect the CJEU to throw out Privacy Shield, which does little to address the earlier criticisms of the preceding US-EU agreement: the Safe Harbor framework, struck down by the same court in 2015. However, that's not the only problem that Privacy Shield is facing. One of the European Parliament's powerful committees, which helps determine policy related to civil liberties, has just issued a call to the European Commission to suspend the Privacy Shield agreement unless the US tries harder:

The data exchange deal should be suspended unless the US complies with it by 1 September 2018, say MEPs, adding that the deal should remain suspended until the US authorities comply with its terms in full.

There are a couple of reasons why the European Parliament's committee has taken this unusual step. One is the recent furore surrounding Cambridge Analytica's use of personal data collected by Facebook, which the EU politicians incorrectly call a "data breach". However, as they correctly point out, both companies were certified under Privacy Shield, which doesn't seem to have prevented the data from being misused:

Following the Facebook-Cambridge Analytica data breach, Civil Liberties MEPs emphasize the need for better monitoring of the agreement, given that both companies are certified under the Privacy Shield.

MEPs call on the US authorities to act upon such revelations without delay and if needed, to remove companies that have misused personal data from the Privacy Shield list. EU authorities should also investigate such cases and if appropriate, suspend or ban data transfers under the Privacy Shield, they add.

The other concern is the recently-passed Clarifying Lawful Overseas Use of Data Act (CLOUD Act), which grants the US and foreign police access to personal data across borders. This undermines the effectiveness of the privacy protections of the data transfer scheme, since it would allow the personal data of EU citizens to be accessed more easily. The head of the civil liberties committee, Claude Moraes, is quoted as saying:

While progress has been made to improve on the Safe Harbor agreement, the Privacy Shield in its current form does not provide the adequate level of protection required by EU data protection law and the EU Charter. It is therefore up to the US authorities to effectively follow the terms of the agreement and for the Commission to take measures to ensure that it will fully comply with the GDPR.

The mention of the new GDPR there is significant, since it raises the bar for the Privacy Shield framework's compliance with EU data protection laws. This greater stringency makes it more likely that the European Commission will suspend the deal, and that the CJEU will eventually strike it down permanently.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

5 Comments | Leave a Comment..

Posted on Techdirt - 13 June 2018 @ 11:58am

Top German Publisher Says: 'You Wouldn't Steal A Pound Of Butter... So We Need A Snippet Tax'

from the articles-11-and-13-must-go dept

Last week, Mike provided a virtuoso excoriation of the European publishers' shameless demand to be given even more copyright control over tiny snippets of news stories as part of the awful EU copyright directive. As that post pointed out, the publishers' "mythbuster" did nothing of the sort, but it did indicate a growing panic among the industry as more critical attention is brought to bear on the ridiculous "snippet tax" -- Article 11 of the proposed new EU copyright law -- which has already failed twice elsewhere. The German site Über Medien -- "About Media" -- offers another glimpse of publishers trying desperately to justify the unjustifiable (original in German). Actually, it's one publisher in particular: Mathias Döpfner. He's the CEO of the German company Axel Springer, one of the world's largest publishers, although even his company is unlikely to benefit much from the snippet tax. Speaking on Austrian television, Döpfner made a rather remarkable claim:

It's about the question of whether the intellectual good that is produced is a protected good or not. At the moment it is a good that is not protected in the digital world. Anyone can take an article, a video, a journalistic element that a publisher has prepared, copy it, put it in another context and even market it successfully.

Yes, the boss of one of the biggest and most successful publishers in the world is claiming that digital material is not protected by copyright, and that anyone can take and use it, which is why new laws are needed. Since he was talking about the EU's Article 11, he also seems to be conflating using snippets with taking an entire article. To underline his point, Döpfner offered a homely comparison:

If I can go to the grocery store and just grab a pound of butter or a carton of milk without paying for it, why should anyone come and pay for it, and why would anyone else offer butter or milk?

But that's not what Google is doing when it uses snippets. It's more like it is taking a picture of the pound of butter, and then showing people the photo along with the address of the grocery store when they search for "butter" using Google's search engine. Google is not stealing anything, just sending business to the store. It's the same with displaying snippets that link to the original article. The Über Medien post rightly goes on to note that publishers don't really have a problem with Google showing snippets and sending them traffic. But their sense of entitlement is so great they want to force the US company to pay for the privilege of sending them traffic. Or, to put it in terms of Döpfner's forced analogy:

Publishers do not want Google to stop stealing butter and milk in their supermarkets. The publishers want to oblige Google to steal butter and milk from them and pay for it.

The fact that the head of Germany's biggest publisher resorts to the old "you wouldn't steal a car/pound of butter/carton of milk" rhetoric shows just how vanishingly thin the argument in favor of a snippet tax really is. It's time for EU politicians to recognize this, and remove it from the proposed copyright directive, along with Article 13's even more pernicious upload filter. EU citizens can use the new SaveYourInternet site to contact their representatives. Ahead of the important EU vote on the proposed law early next week, now would be a really good time to do that.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

50 Comments | Leave a Comment..

Posted on Free Speech - 7 June 2018 @ 8:13pm

Unpublished Censorship Guidelines Lay Bare The Deepest Fears Of The Chinese Government

from the vulgar-use-of-a-microphone-controller dept

It's hardly a state secret that China is instituting the most complete surveillance and censorship system ever attempted by a society (so far), and on an unprecedented scale. Techdirt has been tracking that sad saga over the years, mostly reporting on how censorship is being implemented. Less information has been available about what exactly the Chinese government doesn't want people to know about/discuss. Aside from the obvious issues -- repression of Tibetans and Uyghurs, Tiananmen Square protests, environmental problems, government corruption etc. -- just what is Beijing afraid of? A document obtained by The Globe and Mail may shed some light on this question, although it's still not entirely clear who wrote it:

It began circulating early this year, and is believed to have been issued by the powerful Cyberspace Administration of China, China's central Internet authority, which did not respond to requests for comment.

It's also possible that the document, which outlines 10 basic categories of banned content, was written by a government-affiliated trade association, a censorship expert said.

In any case, experts seem to accept that it represents the Chinese government's position quite well, which makes the insights it gives into official thinking extremely valuable. Forbidden activities include many that come as no surprise, such as: insulting leaders, criticizing official policies, spreading information about "made-up" accidents, epidemics, police incidents, and issues related to the economy. Celebrities are protected to a certain extent, with a ban on over-the-top stories about their sex scandals or luxurious lifestyles. Talking about violence, superstitions or religions is also out, as is the following:

Not only is pornography banned, but so is all obscenity, a category that includes "using a bed or sofa as a prop or background," appearing shirtless, wearing tattoos or dancing in a way "that has flirtatious and vulgar elements." Also forbidden is the spreading of harmful information, a category that includes cursing, smoking and drinking, gambling or "vulgar use of a microphone controller (or any mimicking of it)."

But alongside much that is outright wacky -- what on earth does "vulgar use of a microphone controller" even mean? -- the article quotes Yaxue Cao, who points out a more serious underlying strategy discernible here:

"It targets political dissent of course, but any activities that might cause a large number of people to coalesce, whether through popular entertainment such as Duanzi (jokes) and cartoons, or through direct sales network," she said, in an e-mail. "It also aims at content that might give people ideas of resistance and how-to knowledge. I go through each category, this is the theme I see: a heightened sense of regime insecurity."

It's a great point that explains much of what the Chinese government has done over the last few years. What the authorities fear above all else is not so much any of the topics mentioned above in themselves, but the thought that they might help people to band together, and even formulate an idea that is truly frightening for Beijing: that they could start to resist.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

44 Comments | Leave a Comment..

Posted on Techdirt - 5 June 2018 @ 3:23am

Study Shows That Wartime Program To Abolish Copyright On German Science Books Brought Significant Benefits To US

from the perhaps-we-should-do-it-more-often dept

As Techdirt readers know, there is a ratchet effect that means copyright always gets longer and stronger. As well as being inherently unfair -- why must the public always lose out when copyright law is changed? -- there's another unfortunate consequence. If the term or breadth of copyright were reduced from time to time, we would be able to test the effects of doing so on things like creativity. For example, if it turned out that shortening copyright increased the number of works being produced, then there would be a strong argument for reducing it further in the hope that the effect would be strengthened. The fact that we have been unable to test this hypothesis is rather convenient for copyright maximalists. It means they can continue to call for the term of copyright to be increased without having to address the argument that this will cause less creativity, or reduce access to older works.

Even though it is not possible to test the effects of reduced copyright directly, two US academics, Barbara Biasi and Petra Moser, have spotted a clever way of investigating the idea indirectly, in the field of science publishing. As they write in a post on CEPR's policy portal, in 1942 the US Book Republication Program (BRP) allowed US publishers to reprint exact copies of German-owned science books, effectively abolishing copyright for that class of works. They have looked at what impact this dramatic change had on the use of those reprinted works by scientists. Comparing citation rates before and after the BRP was introduced is not enough on its own: citation rates fluctuate, so it is necessary to compare the BRP citation rate with something else. The researchers' solution is to look at the citation rate of Swiss books from the same time:

This approach addresses the issue that English-language citations may have increased mechanically after 1942, if English-language scientists published more after the war. Like German scientists, Swiss scientists were leaders in chemistry and mathematics and wrote primarily in German, but due to Switzerland's neutrality, Swiss-owned copyrights were not accessible to the BRP. [Ordinary least squares] estimates of a matched sample of BRP and Swiss books (in similar fields and with similar levels of pre-BRP non-English citations) confirm the significant increase in citations in response to the BRP.
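The logic of that matched comparison is essentially a difference-in-differences estimate: the Swiss control series absorbs any general post-war rise in English-language citations, isolating the effect of the BRP itself. A toy sketch with invented numbers (not the study's data) makes the arithmetic concrete:

```python
# Difference-in-differences illustration with made-up citation counts.
# The Swiss books act as a control group: subtracting their pre/post
# change removes any rise in citations common to all German-language
# science books, leaving the effect attributable to the BRP.

citations = {
    # (group, period): average annual citations per book (hypothetical)
    ("BRP",   "pre"):  2.0,
    ("BRP",   "post"): 4.0,
    ("Swiss", "pre"):  2.0,
    ("Swiss", "post"): 2.4,
}

def diff_in_diff(data):
    """Change in the treated (BRP) group minus change in the control (Swiss) group."""
    treated_change = data[("BRP", "post")] - data[("BRP", "pre")]
    control_change = data[("Swiss", "post")] - data[("Swiss", "pre")]
    return treated_change - control_change

effect = diff_in_diff(citations)
print(round(effect, 2))  # 1.6 extra citations per book, attributable to the BRP
```

The design only works because the Swiss books are a plausible counterfactual, which is exactly why the researchers matched them on field and pre-BRP citation levels.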

Specifically, there was a 67% increase in citations of BRP books compared to similar Swiss books. The research suggests this was driven largely by the 25% drop in average prices seen after the BRP scheme was introduced. The reduction in price seems to have allowed a wider range of US libraries to purchase the more affordable BRP texts, whereas Swiss books remained concentrated in the holdings of two wealthy research libraries (Yale and Chicago). Better access was correlated with more citations: the data shows that the latter increased most near the locations of BRP libraries. The researchers conclude:

In the context of contemporary debates, our findings imply that policies which strengthen copyrights, such as extensions in copyright length, can create enormous welfare costs by discouraging follow-on science, especially among less affluent institutions and scientists.

Critics might point out that this is just one study of one rather specific area. But that's an argument for reducing copyright terms, perhaps on a trial basis, to see whether the results of this research are confirmed. However, the copyright ratchet will never allow that, not least because the companies involved probably know it would confirm that constantly strengthening copyright is bad for everyone except themselves.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

34 Comments | Leave a Comment..

Posted on Techdirt - 31 May 2018 @ 7:40pm

ICANN's Pre-emptive Attack On The GDPR Thrown Out By Court In Germany

from the who-is-whois-for? dept

The EU's General Data Protection Regulation (GDPR) has only just started to be enforced, but it is already creating some seriously big waves in the online world, as Techdirt has reported. Most of those are playing out in obvious ways, such as Max Schrems's formal GDPR complaints against Google and Facebook over "forced consent" (pdf). That hardly came as a shock -- he's been flagging up the move on Twitter for some time. But there's another saga underway that may have escaped people's notice. It involves ICANN (Internet Corporation for Assigned Names and Numbers), which runs the Internet's namespace. Back in 2015, Mike memorably described the organization as "a total freaking mess", in an article about ICANN's "war against basic privacy". Given that history, it's perhaps no surprise that ICANN is having trouble coming to terms with the GDPR. The bone of contention is the information that is collected by the world's registrars for the Whois system, run by ICANN. EPAG, a Tucows-owned registrar based in Bonn, Germany, is concerned that this personal data might fall foul of the GDPR, and thus expose it to massive fines. As it wrote in a recent blog post:

We realized that the domain name registration process, as outlined in ICANN's 2013 Registrar Accreditation Agreement, not only required us to collect and share information we didn't need, it also required us to collect and share people's information where we may not have a legal basis to do so. What's more, it required us to process personal information belonging to people with whom we may not even have a direct relationship, namely the Admin and Tech contacts [for each domain name].

All of those activities are potentially illegal under the GDPR. EPAG therefore built a new domain registration system with "consent management processes", and a data flow "aligned with the GDPR's principles". ICANN was not happy with this minimalist approach, and sought an injunction in Germany in order to "preserve Whois data" -- that is, to force EPAG to collect those administrative and technical contacts. A post on the Internet Governance Project site explains why those extra Whois contacts matter, and what the real issue here is:

The filing by ICANN's Jones Day lawyers, which can be found here, asserts a far more sweeping purpose for Whois data, which is part of an attempt to make ICANN the facilitator of intellectual property enforcement on the Internet. "The technical contact and the administrative contact have important functions," the brief asserts. "Access to this data is required for the stable and secure operation of the domain name system, as well as a way to identify those customers that may be causing technical problems and legal issues with the domain names and/or their content."

As the tell-tale word "content" there reveals, the real reason ICANN requires registrars to collect technical and administrative contacts is because the copyright industry wants easy access to this information. It uses the personal details provided by Whois to chase the people behind sites that it alleges are offering unauthorized copies of copyright material. This is precisely the same ICANN overreach that Techdirt reported on back in 2015: the organization is supposed to be running the Internet's domain name system, not acting as a private copyright police force. The difference is that now the GDPR provides good legal and financial reasons to ignore ICANN's demands, as EPAG has noted.

In a surprisingly swift decision, the German court hearing ICANN's request for an injunction against EPAG has already turned it down:

the Court said that the collection of the domain name registrant data should suffice in order to safeguard against misuse the security aspects in connection with the domain name (such as criminal activity, infringement or security problems).

The Court reasoned that because it is possible for a registrant to provide the same data elements for the registrant as for the administrative and technical contacts, ICANN did not demonstrate that it is necessary to collect additional data elements for those contacts. The Court also noted that a registrant could consent and provide administrative and technical contact data at its discretion.
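The court's point about duplicated data elements is easy to see in the shape of Whois records themselves. A minimal, hypothetical sketch (the record and field names are invented for illustration; real Whois output varies by registrar) parses a record into per-role contacts and checks that the admin and tech entries add nothing beyond the registrant's:

```python
# Hypothetical Whois-style record: the same person listed as registrant,
# administrative and technical contact, as the court noted a registrant
# may legitimately do.
SAMPLE_WHOIS = """\
Registrant Name: Jane Example
Registrant Email: jane@example.org
Admin Name: Jane Example
Admin Email: jane@example.org
Tech Name: Jane Example
Tech Email: jane@example.org
"""

def parse_contacts(record: str) -> dict:
    """Group 'Role Field: value' lines into {role: {field: value}}."""
    contacts: dict = {}
    for line in record.splitlines():
        if ":" not in line:
            continue
        key, value = line.split(":", 1)
        role, _, field = key.strip().partition(" ")
        contacts.setdefault(role, {})[field] = value.strip()
    return contacts

contacts = parse_contacts(SAMPLE_WHOIS)
# The admin and tech contacts carry no information beyond the registrant's,
# which is why the court saw no necessity in collecting them separately.
print(contacts["Admin"] == contacts["Registrant"])  # True
print(contacts["Tech"] == contacts["Registrant"])   # True
```

When all three roles collapse into one person like this, collecting the extra contacts adds GDPR exposure without adding data, which is the crux of EPAG's argument.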

However, as ICANN rightly notes, that still leaves unanswered the key question: would collecting the administrative and technical contact information contravene the GDPR? ICANN says it is "continuing to pursue the ongoing discussions" with the EU on this, and a clarification of the legal situation here would certainly be in everyone's interests. But there is another important angle to this. As the security researcher Brian Krebs wrote on his blog back in February:

For my part, I can say without hesitation that few resources are as critical to what I do here at KrebsOnSecurity than the data available in the public WHOIS records. WHOIS records are incredibly useful signposts for tracking cybercrime, and they frequently allow KrebsOnSecurity to break important stories about the connections between and identities behind various cybercriminal operations and the individuals/networks actively supporting or enabling those activities. I also very often rely on WHOIS records to locate contact information for potential sources or cybercrime victims who may not yet be aware of their victimization.

There's no reason to doubt the importance of Whois information to Krebs's work. But the central issue is which is more important for society: protecting millions of people from spammers, scammers and copyright trolls by limiting the publicly-available Whois data, or making it easier for security researchers to track down online criminals by using that same Whois information? It's an important discussion that is likely to rage for some time, along with many others now being brought into sharper focus thanks to the arrival of the GDPR.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

58 Comments | Leave a Comment..

More posts from Glyn Moody >>