Glyn Moody’s Techdirt Profile





Posted on Techdirt - 15 March 2019 @ 7:39pm

The 2012 Web Blackout Helped Stop SOPA/PIPA And Then ACTA; Here Comes The 2019 Version To Stop Article 13

from the how-to-make-bad-ideas-politically-toxic dept

Remember SOPA (Stop Online Piracy Act) and PIPA (Protect IP Act)? Back in 2012, they threatened to cause widespread damage to the online world by bringing in yet more extreme and unbalanced measures against alleged copyright infringement. Things looked pretty bad until a day of massive online protest was organized on January 18, 2012, with thousands of sites partially or totally blacked out. Politicians were taken aback by the unexpected scale of the anger, and their support for SOPA and PIPA crumbled quickly. That success fuelled protests in Europe against ACTA (Anti-Counterfeiting Trade Agreement), which also sought to bring in harsh measures against online infringement. After tens of thousands of people took part in street demonstrations across Europe, many politicians wanted nothing to do with the by-now toxic proposal, and it was voted down in the European Parliament in July 2012.

As Techdirt pointed out last year, the proposed EU Copyright Directive is even worse than ACTA. As such it clearly merits serious, large-scale action of the kind that stopped SOPA/PIPA and ACTA. And it's happening. The German-language version of Wikipedia, the second-largest by number of articles, has announced the following (original in German):

On Thursday, March 21, the German-language edition of the online encyclopedia will be shut down completely for 24 hours. In this way, Wikipedia activists want to send a signal, in particular against the introduction of the controversial Articles 11 and 13 in the [EU's] copyright reform.

It is expected that a number of other major sites will be joining in the protest. Meanwhile, another German organization is campaigning against Article 13. In an open letter to MEPs, its supporters write:

We are the operators and administrators of more than 400 German-language discussion forums with more than 18 million members. We are united by the great concern that the EU Copyright Directive will endanger the existence of our forums and thus the discussion culture on the Internet.

The public discussion on the EU Copyright Directive revolves almost exclusively around YouTube and other large US platforms. In doing so, we lose sight of the fact that discussion forums of all sizes will also be affected by the new directive.

That's an important point. Supporters of Article 13 try to give the impression that only deep-pocketed companies like Google will be hit by the new law. As the discussion forum operators point out, their organizations will not be exempt from the requirements of the EU Copyright Directive. Its effects will be devastating:

Because of these uncertainties and the legal and financial liability risk, many discussion forums will close, as small associations or voluntary operators cannot bear this situation. Commercial operators are also endangered in their existence if they have to conclude fee-based licenses and are obliged to install expensive upload filters.

Uncertain regulations for us means years of legal uncertainty, legal risk and potential legal costs, which no operator can afford in the long run as forums usually do not generate a large amount of revenue.

As a result, the discussion culture on the European Internet will be severely impaired, and many citizens will lose their digital home in discussion forums.

Internet startups in the EU will face the same insurmountable problems thanks to Article 13's impossible demands. Many will be forced to shut down. It's an irony that many have already pointed out: a law that supporters claim is designed to tackle the disproportionate power of companies like Google and Facebook will end up entrenching them more deeply, and wiping out much of the EU's own digital ecosystem. Let's hope 2019's big blackout grabs people's attention as the one in 2012 did, and that MEPs drop Article 13 just as they dropped ACTA.

Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+


Posted on Techdirt - 13 March 2019 @ 10:44am

German Government Confirms That Article 13 Does Mean Upload Filters, Destroying Claims To The Contrary Once And For All

from the now-delete-it-from-the-text dept

Techdirt has just written about an important intervention by the UN Special Rapporteur on freedom of expression in the debate about Article 13 of the proposed EU Copyright Directive. David Kaye said that most Internet sites "would face legal pressure to install and maintain expensive content filtering infrastructure to comply with the proposed Directive." Despite the evident expertise of Kaye in this area, some may try to dismiss this clear condemnation of Article 13 as the UN interfering in a legislative process that really only concerns the Member States of the EU, and no one else. That makes the following official reply by Christian Lange, Parliamentary State Secretary to the German Federal Minister of Justice and Consumer Protection, to a question submitted by a member of Germany's national parliament, rather significant:

In the [German] federal government's view it appears likely that algorithmic measures will have to be taken in connection with large volumes of data for practical reasons alone.

That translation of the original German comes from Florian Mueller, who has written a blog post (in English) about the political background and significance of this statement. He notes that it appears in the well-respected German newspaper Frankfurter Allgemeine Zeitung, which "used to spread the no-upload-filter propaganda [but] now considers it ridiculous to deny that Article 13 involves upload filters." So the appearance of this confirmation that Article 13 will indeed require "algorithmic measures" -- AKA upload filters -- in a serious German newspaper represents an important moment in the continuing battle to get MEPs to understand the damage this measure will cause, and to prevent it.
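To make concrete what "algorithmic measures" means in practice, here is a toy sketch of hash-based upload filtering. This is an illustration only, not how any real platform works: production systems such as YouTube's Content ID rely on fuzzy perceptual fingerprinting, which is precisely the expensive infrastructure smaller sites say they cannot afford.

```python
import hashlib

# Purely illustrative: block an upload if its fingerprint matches a
# database of known copyrighted works. The blocklist entry here is a
# made-up placeholder.
BLOCKLIST = {hashlib.sha256(b"some copyrighted work").hexdigest()}

def allow_upload(data: bytes) -> bool:
    """Return True if the upload's exact hash is not on the blocklist."""
    return hashlib.sha256(data).hexdigest() not in BLOCKLIST

print(allow_upload(b"an original forum post"))   # True
print(allow_upload(b"some copyrighted work"))    # False
# A one-byte change defeats exact matching -- which is why real filters
# need fuzzy matching, and why they both over-block and under-block:
print(allow_upload(b"some copyrighted work!"))   # True
```

The last line is the crux of the policy argument: an exact-match filter is trivially evaded, so compliance in practice means licensing or building far more sophisticated (and costly) matching systems.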

It is now inarguable that Article 13 will require the deployment of upload filters across many sites in the EU. The UN Special Rapporteur David Kaye has warned that upload filters put freedom of expression under threat, and harm creators and artists the most. Putting those two together means that any European politician supporting Article 13 is inevitably attacking a fundamental human right in the EU, and making life worse for artists. With just two weeks before the final vote in the European Parliament, now would be a really good time for EU citizens to ask their MEPs whether they are happy to be remembered for that, or would rather help to remove Article 13 completely.


Posted on Techdirt - 12 March 2019 @ 12:11am

Top EU Court Rules Public Interest Is More Important Than Protection Of Commercial Interests

from the healthy-precedent dept

As Techdirt noted some years back, there has been a steady push to strengthen the protection afforded to trade secrets. Similarly, the argument is often made that transparency must be subordinated to protecting commercial interests, as happened recently in an important struggle over access to information in the EU. It concerned the safety of the chemical glyphosate, widely used as a herbicide, for example in Roundup from Monsanto (now owned by the German chemical giant, Bayer). The EU body responsible for assessing risks associated with the food chain is EFSA (European Food Safety Authority). As part of the process of renewing approval for glyphosate, which was granted in 2017 for five more years, EFSA conducted a review of the toxicity and carcinogenicity of the chemical, drawing on a variety of published and unpublished data. Whether glyphosate increases the risk of cancer is a highly contentious area, with widely differing expert views:

In March 2015 the International Agency for Research on Cancer (IARC) working group of experts classified Glyphosate in Group 2A (probable human carcinogens) with strong evidence for a genotoxic mechanism of carcinogenicity. The Joint WHO/FAO Expert Meeting on Pesticide Residues (JMPR), which is responsible for assessing the risk of pesticide residues in Food in Codex, originally evaluated Glyphosate in 2004. JMPR did not find evidence for carcinogenicity in humans and assigned an Acceptable Daily Intake (ADI).

In 2018, a US court ordered Monsanto to pay $289 million in damages to a former school groundskeeper who sued the company after he was diagnosed with non-Hodgkin's lymphoma, which he claimed was caused by his exposure to glyphosate. The award was later reduced to $78 million, still a significant sum.

In the EU, a group of four members of the European Parliament, and another individual, asked separately to see two key industry studies that were used by EFSA in coming to its decision in favor of approval. EFSA refused, because it claimed that disclosure of the information might seriously harm the commercial and financial interests of the companies that had submitted the data, and that there was no overriding public interest justifying disclosure. Those seeking access appealed to one of the EU's highest, but least-known, courts, the General Court of the European Union. Its job is to hear actions taken against the institutions of the EU, as in this case. The court has just issued its judgment (pdf):

an overriding public interest in disclosing the studies is deemed to exist. EFSA could not therefore refuse to disclose them on the ground that that would have an adverse effect on the protection of the commercial interests of the owners of the requested studies.

Potentially, that ruling could have a big impact on future cases where EU institutions seek to prevent information relating to health and the environment being released on the grounds it would allegedly harm commercial interests. It's also a setback for the general idea that business secrets should trump transparency.


Posted on Techdirt - 4 March 2019 @ 4:10pm

Big Win For Open Access, As University Of California Cancels All Elsevier Subscriptions, Worth $11 Million A Year

from the academic-publishing's-emperor-has-no-clothes dept

As Techdirt has reported over the years, the move to open access, whereby anyone can read academic papers for free, is proving a long, hard journey. However, the victories are starting to build up, and here's another one that could have important wider ramifications for open access, especially in the US:

As a leader in the global movement toward open access to publicly funded research, the University of California is taking a firm stand by deciding not to renew its subscriptions with Elsevier. Despite months of contract negotiations, Elsevier was unwilling to meet UC's key goal: securing universal open access to UC research while containing the rapidly escalating costs associated with for-profit journals.

In negotiating with Elsevier, UC aimed to accelerate the pace of scientific discovery by ensuring that research produced by UC's 10 campuses -- which accounts for nearly 10 percent of all U.S. publishing output -- would be immediately available to the world, without cost to the reader. Under Elsevier's proposed terms, the publisher would have charged UC authors large publishing fees on top of the university's multi-million dollar subscription, resulting in much greater cost to the university and much higher profits for Elsevier.

The problems faced by the University of California (UC) are the usual ones. The publishing giant Elsevier was willing to move to an open access model -- but only if the University of California paid even more on top of what were already "rapidly escalating costs". To its credit, the institution instead decided to walk, depriving Elsevier of around $11 million a year (pdf).

But that's not the most important aspect of this move. After all, $11 million is small change for a company whose operating profit is over a billion dollars per year. What will worry Elsevier more is that the University of California is effectively saying that the company's journals are not so indispensable that it will sign up to a bad deal. It's the academic publishing equivalent of pointing out that the emperor has no clothes.

The University of California is not the first academic institution to come to this realization. National library consortiums in Germany, Hungary and Sweden have all made the same decision to cancel their subscriptions with Elsevier. Those were all important moves. But the University of California's high-profile refusal to capitulate to Elsevier is likely to be noted and emulated by other US universities now that the approach has been validated by such a large and influential institution.

As to where researchers at the University of California (and in Germany, Hungary and Sweden) will obtain copies of articles published in Elsevier titles that are no longer available to them through subscriptions -- UC retains access to older ones -- there are many other options. For example, preprints are increasingly popular, and circulate freely. Contacting the authors directly usually results in copies being made available, since academics naturally want their papers read as widely as possible.

And then, of course, there is Sci-Hub, which now claims to provide access to 70 million articles. Researchers that end up at Sci-Hub in search of a hard-to-find item may well discover how much more convenient it is than the traditional subscription services that impose strict controls on access to publications. The risk for Elsevier is that once researchers get a taste of quick, seamless access to everything, they may never want to go back to the old system, however much the company slashes its prices to win back business.


Posted on Techdirt - 28 February 2019 @ 12:02pm

Mozilla Says Australia's Compelled Access Law Could Turn Staff There Into 'Insider Threats'

from the how-to-undermine-your-software-industry-without-really-trying dept

Despite unanimous warnings from experts that it was a really bad idea, the Australian government went ahead and passed its law enabling compelled access to encrypted devices and communications. Apparently, the powers have already been used. Because of the way the Australian government rammed the legislation through without proper scrutiny, the country's Parliamentary Joint Committee on Intelligence and Security has commenced a review of the new law. That's the good news. The bad news is that Andrew Hastie, the Chair of the Committee, still thinks fairy tales are true:

I note with the House the concerns raised by some stakeholders in the tech sector about these laws, including in today's press. I welcome the ongoing contribution from these stakeholders as the committee continues its review. I note, however, that the legislation as passed prohibits the creation of so-called back doors. Companies cannot be required to create systemic weaknesses in their encrypted products or be required to build a decryption capability.

Sure, whatever, Andrew. One of the stakeholders that has made a submission to the Committee is Mozilla, which is worried by one aspect in particular (pdf):

Due to ambiguous language in [the compelled access law], one could interpret the law to allow Australian authorities to target employees of a Designated Communications Provider (DCP) rather than serving an order on the DCP itself through its General Counsel or an otherwise designated official for process. It is easy to imagine how Australian authorities could abuse their powers and the penalties of this law to coerce an employee of a DCP to compromise the security of the systems and products they develop or maintain.

As Tim Cushing explained in his December post when the compelled access law was approved, that would put employees in an impossible position. They would be forced by the authorities to put backdoors of some kind in a product, and to do so in secret. Moreover, they would risk five years in prison if any of their colleagues noticed -- which they probably would, since unauthorized changes to code would naturally be spotted and challenged. Because of that ridiculous situation, Mozilla warns it would have to take drastic action:

this potential would force DCP’s [like Mozilla] to treat Australia-based employees as potential insider threats, introducing another vector for compromise that could undermine trust in critical products and incentivizing companies to move critical roles to other localities.

What's true for Mozilla is true for every foreign software company: in order to protect the integrity of their code, they would be forced to regard every Australian coder as a security risk, and downgrade their access to the code accordingly. The difficulties of managing that kind of situation will probably force software companies to pull out of Australia completely. It will also have a big impact on the trustworthiness of any code produced in the country. In fact, that's already a problem, as another submission to the Parliamentary Joint Committee makes clear. It comes from one of the leading Australian software companies, FastMail, which provides hosted email services to 40,000 companies around the world. It says that "we have seen existing customers leave, and potential customers go elsewhere, citing this bill as the reason for their choice." Like Mozilla, FastMail is worried about the impossible position of employees (pdf), who may be coerced by the Australian authorities into weakening the company's code:

Our staff have expressed concerns that they may be forced to attempt to secretly add back doors or security holes in our service -- actions that would be just cause for dismissal -- and be unable to tell us why they have made these changes.

This is not just a matter of looking after our own staff's mental health, it also makes it harder for Australians looking to work for overseas companies if there is any risk that they will be compelled to act against their employer's interests.

The comments of these two organizations show clearly the practical problems of this ill-thought-out legislation. They also confirm that bringing in this kind of law is one of the quickest ways to undermine the local software industry, and increase dependence on foreign companies that are less likely to comply with demands to insert backdoors in their code. If the Australian government cares about those consequences, or indeed about the online safety of its citizens, it would do well to heed the words that conclude Mozilla's submission to the review:

This law represents an unprecedented and unchecked threat to the privacy and security of users in Australia and abroad. We urge the Committee and the Australian Parliament to move swiftly to remedy the significant harms posed by this legislation. Ultimately, the best course of action is to repeal this law and start afresh with a proper, public consultation.


Posted on Free Speech - 26 February 2019 @ 8:09pm

China Extends Its Censorship To Australian Books, Written By Australian Authors For Australian Readers

from the whatever-next? dept

News that China is extending its censorship to new domains barely provokes a yawn these days, since it's such a common occurrence. But even for those jaded by constant reports of the Chinese authorities trying to control what people see and hear, news that it is now actively censoring books written by Australian authors for Australian readers is pretty breathtaking. The Chinese government has done this before for single books whose message it disliked, but now it seems to be part of a broader, general policy:

Publishing industry figures have confirmed that the censors from the State Administration of Press, Publication, Radio, Film and Television of the People's Republic of China are vetting books sent by Australian publishers to Chinese printing presses, even though they are written by Australian authors and intended for Australian readers.

Any mention of a list of political dissidents, protests or political figures in China, including president Xi Jinping, is entirely prohibited, according to a list circulated to publishers and obtained by The Age and Sydney Morning Herald.

As the story in the Australian newspaper The Age explains, the reason why Chinese censors are able to impose their views on books designed for the Australian market is that it's cheaper to have books printed in China than in Europe, say, especially if it involves color illustrations. As a result, publishers can be faced with the choice of accepting Chinese demands, or not publishing the book at all because the costs are too high.

The list of taboo topics is long, albeit pretty specific to China. It includes mention of major Chinese political figures, such as Mao Zedong and Xi Jinping, as well as a list of 118 dissidents whose names may not be mentioned. Political topics such as Tiananmen Square, pro-democracy protests in Hong Kong, Tibetan independence, Uyghurs and Falun Gong are all out. Pornography is completely forbidden, but even artistic nudity can be censored. The Chinese authorities are very sensitive to how maps are drawn, since they can involve disputed borders. More surprising is the ban on mentioning major religions.

The Age article notes that the rules had been in place for some time, but largely ignored. Now, however, the censors are checking every page of every book, and enforcing the rules strictly. It's yet another sign of Xi Jinping's obsessive desire to control every facet of life -- even outside China, if he can.


Posted on Techdirt - 25 February 2019 @ 7:58pm

Surprise: Uganda's New Social Media Tax Seems To Have Led To Fewer People Using The Internet, And Total Value Of Mobile Transactions To Drop

from the how-to-hobble-a-nascent-digital-economy-in-one-easy-move dept

Techdirt has been following the regrettable story of African governments imposing taxes and levies on Internet use. For example, Uganda has added a daily fee of 200 Ugandan shillings ($0.05) that its citizens must pay to access social media sites and many common Internet-based messaging and voice applications (collectively known as "OTT services"). It has also imposed a tax on mobile money transactions. When people started turning to VPNs as a way to avoid these charges, the government tried to get ISPs to ban VPN use. As we pointed out, taxes like these could discourage the very people who could benefit the most from using the Internet. And in news that will surprise no one, so it has turned out, according to official data from the Uganda Communications Commission, summarized here by an article on the Quartz site:

In the three months following the introduction of the levy in July 2018, there was a noted decline in the number of internet users, total revenues collected, as well as mobile money transactions. In a series of tweets, the Uganda Communications Commission noted internet subscription declined by more than 2.5 million users, while the sum of taxpayers from over-the-top (OTT) media services decreased by more than 1.2 million users. The value of mobile money transactions also fell by 4.5 trillion Ugandan shillings ($1.2 million).
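A quick back-of-the-envelope calculation, using only the figures quoted above, shows why the levy bites for a daily user:

```python
# Rough annual cost of Uganda's OTT levy, from the figures cited above:
# 200 Ugandan shillings (about $0.05) per day of access.
DAILY_FEE_UGX = 200
DAILY_FEE_USD = 0.05
DAYS_PER_YEAR = 365

annual_ugx = DAILY_FEE_UGX * DAYS_PER_YEAR
annual_usd = DAILY_FEE_USD * DAYS_PER_YEAR
print(annual_ugx)            # 73000 shillings per year
print(round(annual_usd, 2))  # 18.25 dollars per year
```

A small daily charge thus compounds into a non-trivial annual sum relative to typical incomes in the country, which helps explain the scale of the drop in subscriptions.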

Given the timing, it seems likely that it was indeed the newly-introduced levy that caused the number of Internet users in Uganda to drop dramatically, and the mobile phone-based economy to contract. Neither is good for the people of Uganda, its economy or its government. It's clearly time for the Ugandan authorities to rescind the tax before too much long-term damage is caused -- and for other African nations with ill-advised Internet levies to do the same.


Posted on Techdirt - 12 February 2019 @ 7:34pm

EU's New 'Open By Default' Rules For Data Generated By Public Funding Subverted At The Last Minute

from the if-you-don't-like-the-rules,-don't-take-the-money dept

The EU's awful Copyright Directive is rightly dominating the news at the moment, but there are other interesting laws being debated and passed in the European Union that have received far less attention. One of these is a revision of the Public Sector Information (PSI) Directive. Here's the background to the move:

The re-use of data generated by public sector bodies (e.g. legal, traffic, meteorological and financial etc.) for commercial and non-commercial purposes is currently governed by Directive 2003/98/EC on the re-use of public sector information, which was reviewed in 2013.

On 25 April 2018, the Commission adopted the 2018 Data Package, addressing for the first time different types of data (public, private, scientific) within a coherent policy framework, making use of different policy instruments. As part of this package, the review of the PSI Directive was the object of an extensive public consultation process.

The basic idea behind the revision, which was agreed on at the end of January by the European Parliament, the Council of the EU and the European Commission, is sound:

All public sector content that can be accessed under national access to documents rules is in principle freely available for re-use. Public sector bodies will not be able to charge more than the marginal cost for the re-use of their data, except in very limited cases. This will allow more [small and medium enterprises] and start-ups to enter new markets in providing data-based products and services.

In December last year, the European Parliament proposed a version of the text that would require researchers in receipt of public funding to publish their data for anyone to re-use. However, some companies and academics were unhappy with this "open by default" approach. They issued a statement calling for research data to be "as open as possible, as closed as necessary", which would include some carve-outs.

According to Science|Business, that view has prevailed in the final text, which is not yet publicly available. It is now apparently permissible for companies and academics to invoke "confidentiality" and "legitimate commercial interests" as reasons for not releasing publicly-funded data. Clearly, that's a huge loophole that could easily be abused by organizations to hoard results. If companies and academic institutions aren't willing to share the fruits of their research as open data, there's a very simple solution: don't take public money. Sadly, that fair and simple approach seems not to be a part of the otherwise welcome revised PSI Directive.


Posted on Techdirt - 8 February 2019 @ 3:23am

German Data Protection Authority Says GDPR Requires Email To Use At Least Transport Layer Encryption

from the but-why,-is-not-entirely-clear dept

As Techdirt has reported, the EU's GDPR legislation certainly has its problems. But whatever you think of it as a privacy framework, there's no denying its importance and global reach. That makes a recent ruling by a data protection authority in Germany of wider interest than the local nature of the decision might suggest. As Carlo Pilz reports in his post examining the case in detail, the Data Protection Authority of North Rhine-Westphalia (Landesbeauftragte für Datenschutz und Informationsfreiheit Nordrhein-Westfalen -- LfDI NRW for short) looked at what the GDPR might mean for email -- in particular, whether it implied that email should be encrypted in order to protect personal information, and if so, how.

The LfDI NRW made a fundamental distinction between encryption at the content level and encryption at the transport level for the transmission of emails. The former encrypts content and attachments, using technology such as S/MIME and OpenPGP. However, the metadata associated with an email is not encrypted. With transport layer encryption, both the content and the metadata are encrypted while the message is in transit. Pilz explains:

LfDI NRW therefore assumes that without exception at least transport encryption must always be implemented. As mentioned above, such a mandatory encryption obligation does not result from the GDPR. One could therefore argue that this view goes beyond the requirements of the GDPR. In its assessment, the authority may assume that transport encryption is now the "state of the art" mentioned in Art. 32 para 1 GDPR. This could be supported by the reference to the European providers. Nevertheless, Art. 32 para 1 GDPR provides, in addition to the feature "state of the art", for further criteria which must be taken into account for the measures to be implemented, such as in particular the risk of varying likelihood and severity for the rights and freedoms of natural persons. Thus, under the GDPR, controllers and processors should still be able to assess whether encryption of the data is absolutely necessary on the basis of these characteristics. Unfortunately, the opinion of the authority does not go into the individual criteria mentioned in Art. 32 para 1 GDPR in detail and does not justify its opinion.

So the newly-published position of the German data protection authority is not exactly clear as to why it thinks transport layer encryption must be used, although it does specify which standard must be followed: BSI TR-03108-1: Secure E-Mail Transport (pdf). It also mentions that transport layer encryption may not be enough when dealing with particularly sensitive personal information. This includes "account transaction data, financing data, health status data, client data from lawyers and tax consultants, employee data". The problem here is that even though encrypted in transit, the emails are stored in cleartext on the servers. To address this, LfDI NRW says that additional security measures are required, for example using end-to-end encryption, or even falling back to physical postal delivery of highly-sensitive documents.
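The content-versus-transport distinction can be made concrete with a small Python sketch (the addresses and subject here are hypothetical): even when the body is OpenPGP ciphertext, every header remains readable to any relay that handles the message over an unencrypted hop.

```python
from email.message import EmailMessage

# Minimal sketch: content-level encryption turns the body into ciphertext,
# but the headers -- the metadata -- still travel in the clear unless the
# SMTP hop itself uses transport encryption (e.g. STARTTLS).
msg = EmailMessage()
msg["From"] = "alice@example.com"        # hypothetical addresses
msg["To"] = "bob@example.com"
msg["Subject"] = "Client tax records"    # subject lines are metadata too
msg.set_content(
    "-----BEGIN PGP MESSAGE-----\n"
    "...ciphertext stands in for the real body...\n"
    "-----END PGP MESSAGE-----\n"
)

raw = msg.as_string()  # what an unencrypted SMTP hop would see on the wire
print("alice@example.com" in raw)   # True: sender is visible
print("Client tax records" in raw)  # True: subject is visible
```

This is why the authority treats transport encryption as a baseline rather than an optional extra: content-level encryption alone leaves sensitive metadata exposed in transit.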

Because of the way GDPR is enforced, it's worth noting that the opinion of the LfDI NRW does not automatically determine the policies everywhere in the EU. It may influence the thinking elsewhere, but it's possible that other data protection authorities in the EU might disagree with this interpretation -- in which case the Court of Justice of the European Union would be called upon to adjudicate and offer a definitive interpretation. However, at the very least, the LfDI NRW's position indicates that the GDPR is likely to impact when and how email encryption is deployed by companies operating in the EU, along with all the other areas it touches upon.


Posted on Techdirt - 4 February 2019 @ 8:12pm

After Plan S, Here's Plan U: Funders Should Require All Research To Be Posted First As A Preprint

from the instant-open-access? dept

Preprints are emerging as a way to get research out to everyone free of charge, without needing to pay page charges to appear in a traditional open access title. The growing popularity is in part because research shows that published versions of papers in costly academic titles add almost nothing to the freely-available preprints they are based on. Now people are starting to think about ways to put preprints at the heart of academic publishing and research. In the wake of the EU's "Plan S" to make more research available as open access, there is now a proposal for "Plan U":

If all research funders required their grantees to post their manuscripts first on preprint servers -- an approach we refer to as "Plan U" -- the widespread desire to provide immediate free access to the world's scientific output would be achieved with minimal effort and expense. As noted above, mathematicians, physicists and computer scientists have been relying on arXiv as their primary means of communication for decades. The biomedical sciences were slower to adopt preprinting, but bioRxiv is undergoing exponential growth and several million readers access articles on bioRxiv every month. Depositing preprints is thus increasingly common among scientists, and mandating it would simply accelerate adoption of a process many predict will become universal in the near future.

There is a precedent for mandating preprint deposition: since 2017, the Chan Zuckerberg Initiative (CZI) has mandated that all grantees deposit preprints prior to or at submission for formal publication. This requirement has been accepted by CZI-funded investigators, many of whom were already routinely depositing manuscripts on bioRxiv.

The proposal goes on to consider some of the practical issues involved, such as how it would fit with peer review, and what the requirements for preprint servers might be, as well as deeper questions about guaranteed long-term preservation strategies -- a crucial issue that is often overlooked. The Plan U proposal concludes:

because it sidesteps the complexities and uncertainties of attempting to manipulate the economics of a $10B/year industry, Plan U could literally be mandated by funders tomorrow with minimal expense, achieving immediate free access to research and the significant benefits to the academic community and public this entails. Funders and other stakeholders could then focus their investment and innovation energies on the critical task of building and supporting robust and effective systems of peer review and research evaluation.

Those are all attractive features of the Plan U idea, although Egon Willighagen has rightly pointed out that using the right license for the preprints is an important issue. At the time of writing, the Plan U Web site is rather minimalist. It currently consists of just one page; there are no links to who wrote the proposal, what future plans might be, or how to get involved. I asked around on Twitter, and it seems that three well-known figures in the open science world -- Michael Eisen, John Inglis, and Richard Sever -- are the people behind this. Eisen has been one of the leading figures in the open access world since its earliest days, while Inglis and Sever are co-founders of the increasingly-popular bioRxiv preprint server, which serves the biology community. That augurs well for the idea, but it would still be good to have the details fleshed out on a more informative Web site -- something that Sever told Techdirt will be coming in due course.

Posted on Techdirt - 1 February 2019 @ 3:17am

EU Drops Corporate Sovereignty For Internal Bilateral Agreements, But Top Court Adviser Says It Can Be Used In CETA

from the one-step-forward,-one-step-back dept

As Techdirt noted last September, corporate sovereignty -- the ability of companies to sue entire countries for allegedly lost profits -- has been on the wane recently. One important factor within the EU was a decision early last year by the region's top court that investor-state dispute settlement (ISDS) -- the official name for corporate sovereignty -- could not be used for investment deals within the EU. The reasoning was that ISDS courts represented a legal system outside EU law, which was not permitted when dealing with internal EU matters. As a direct consequence of that ruling, the Member States of the EU have just issued a declaration on the legal consequences (pdf). Essentially, these are that all bilateral investment treaties between Member States will be cancelled, and that corporate sovereignty claims can no longer be brought over internal EU matters.

However, that leaves an important question: what about trade deals between the EU and non-EU nations -- can they include ISDS chapters? In order to settle this issue, Belgium asked the Court of Justice of the European Union (CJEU) whether the corporate sovereignty chapter of CETA, the trade deal between the EU and Canada, was compatible with EU law. As well as clarifying the situation for CETA, this would also provide definitive guidance on the legality of ISDS in past and future trade deals. As is usual in cases sent to the CJEU, one of the court's top advisers general offers a preliminary opinion, which has just been published (pdf):

In today's Opinion, Advocate General Yves Bot holds that the mechanism for the settlement of disputes is compatible with the EU Treaty, the [Treaty on the Functioning of the European Union] and the Charter of Fundamental Rights of the European Union.

His argument is that ISDS courts can't bind national courts, so the latter's autonomy is not threatened, and thus corporate sovereignty chapters are compatible with EU legislation. That may be true as a matter of law, but it ignores the political reality of corporate sovereignty. When ISDS tribunals threaten huge fines unless proposed changes to laws are dropped, governments frequently roll over and do as the corporations wish, because it seems the easier, cheaper option. So even though in theory corporate sovereignty cases can't override national laws, in practice that's often the outcome.

However, this is only the Advocate General's view, which isn't necessarily followed in the main CJEU ruling. It will be interesting to see whether the EU's top court extends its earlier ruling on intra-EU investment agreements, and throws out ISDS for all trade deals, or whether it agrees with Advocate General Bot and permits corporate sovereignty chapters for things like CETA.

Posted on Techdirt - 22 January 2019 @ 8:28pm

Proposed Update To Singapore's Copyright Laws Surprisingly Sensible

from the EU-should-look-and-learn dept

Techdirt writes plenty about copyright in the US and EU, and any changes to the respective legislative landscapes. But it's important to remember that many other countries around the world are also trying to deal with the tension between copyright's basic aim to prevent copying, and the Internet's underlying technology that facilitates it. Recently, we covered the copyright reform process in South Africa, where some surprisingly good things have been happening. Now it seems that Singapore may bring in a number of positive changes to its copyright legislation. One of the reasons for that is the very thorough consultative process that was undertaken, explained here by Singapore's Ministry of Law:

The proposed changes are made, following an extensive three-year review and two rounds of public consultations conducted from August to November 2016 and May to June 2017 respectively. Three public Town Halls and ten engagement sessions with various stakeholder groups, including consumer, industry and trade associations, businesses, intellectual property practitioners and academics were held. Close to 100 formal submissions and more than 280 online feedback forms were received.

The full 70-page report (pdf) spells out the questions asked during that review, the answers received, and the government's proposals. The Ministry of Law's press release lists some of the main changes it wants to make. One of the most welcome is a new exception for text and data mining (TDM) for the purpose of analysis:

Today, people who use automated techniques to analyse text, data and other content to generate insights risk infringing copyright as they typically require large scale copying of works without permission. It is proposed that a new exception be established to allow copying of copyrighted materials for the purpose of data analysis, where the user has lawful access to the materials that are copied. This will promote applications of data analytics and big data across a gamut of industries, unlocking new business opportunities, speeding up processes, and reducing costs for all.
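
To see why the exception matters, consider what even the simplest text and data mining involves. The toy Python sketch below (illustrative only, with made-up documents) has to copy the full works into memory before it can compute anything -- which is precisely the act that risks infringement without an exception:

```python
import re
from collections import Counter

def mine_terms(documents):
    # Text mining necessarily starts by copying whole works --
    # the "large scale copying" the Ministry's proposal refers to.
    corpus = " ".join(documents)
    tokens = re.findall(r"[a-z]+", corpus.lower())
    return Counter(tokens)  # term frequencies: the derived "insight"

docs = ["Copyright law and data.", "Data mining needs data."]
print(mine_terms(docs).most_common(1))  # [('data', 3)]
```

Only the aggregate statistics leave the pipeline, not the works themselves -- the analysis is non-expressive, yet the intermediate copying is unavoidable.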

Importantly, Singapore's proposed new TDM exception applies to everyone -- including big businesses. That's unlike the corresponding Article 3 in the EU's awful Copyright Directive, currently working its way through the legislative process, which imposes an unnecessary restriction that more or less guarantees the European Union will be a backwater in this fast-growing area. An obvious but wise move by Singapore is the proposal for an enhanced copyright exception for educational purposes:

Non-profit schools and their students will be able to use online resources that are accessible without payment, for instruction purposes. This will be in addition to their existing exceptions which generally cover only copying of a portion of a work. The enhancement will facilitate instruction and make it easier for teachers and students to use online materials in classes. For example, teachers and students will be able to use various audio-visual materials (e.g. videos, pictures) found online for their classroom lessons and project presentations. They will also be able to share those materials, or lessons and project presentations which have included those materials, on student learning portals for other schools to view. Online resources that require payment will not be covered by this exception.

Another suggested exception is for non-profit galleries, libraries, archives, and museums (GLAMs) to make copies for exhibition purposes. Also useful for GLAMs is a new limit on the protection given to unpublished works. This will stand at life plus 70 years for literary and artistic works, just as for published versions. GLAMs will be protected from contract override, as is the text and data mining exception. That's important, because it means that copyright owners cannot nullify the new exceptions by insisting organizations sign contracts that waive them. Individual creators receive new rights too:

the report proposes that creators be given a new right to be attributed as the creator of their work, regardless of whether they still own or have sold the copyright. For example, anyone using a work publicly, such as posting it on the internet, will have to acknowledge the creator of the work. This will accord creators due recognition and allow them to build their reputation over time. Currently, they do not need to be attributed as the creator of their work when others use it.

This is essentially a moral right alongside the usual economic ones. As the Wikipedia page on the subject explains, the degree to which moral rights exist for creators of copyright works varies enormously around the world. In France, for example, moral rights are perpetual and inalienable, whereas in the US they are less to the fore. Singapore's Ministry of Law also proposes that where rights have not been explicitly signed away in a contract, they remain with the creator. Although that will prevent naive creators being tricked out of their rights, it won't apply to work created by employees: there, it's employers who will continue to retain rights. As for enforcing copyright, there is the following:

the report proposes that new enforcement measures be made available to copyright owners to deter retailers and service providers from profiting off providing access to content from unauthorised sources, such as through the sale of set-top boxes that enable access to content from unauthorised sources, also commonly known as grey boxes or illicit streaming devices. The measures, which are absent today, will make clear that acts such as the import and sale of such devices are prohibited.

This is clearly aimed at Kodi boxes, which are currently one of the main targets of the entertainment industry. To its credit, the Ministry of Law's proposal does include important additional requirements for the measures to apply:

the product can be used to access audio-visual content from an unauthorised source and additionally must be:

- designed or made primarily for providing access to such content,

- advertised as providing access to such content, or

- sold as providing access to such content, where the retailer sells a generic device with the understanding that "add-on" services such as the provision of website links, instructions or installation of subscription services will subsequently be provided

At least that makes a clear distinction between basic Kodi boxes, and those specifically built and sold with a view to providing unauthorized access to materials. That understanding of the difference is of a piece with the rest of the legislation, which is unusually intelligent. Other governments could learn from that, and from the overall thrust of the proposals to move Singapore's copyright law towards a fair use system similar to that of the US -- something that is fiercely resisted elsewhere.

Posted on Techdirt - 10 January 2019 @ 6:59pm

PLOS ONE Topic Pages: Peer-Reviewed Articles That Are Also Wikipedia Entries: What's Not To Like?

from the good-for-the-academic-career,-too dept

It is hard to imagine life without Wikipedia. That's especially the case if you have school-age children, who now turn to it by default for information when working on homework. Less well-known is the importance of Wikipedia for scientists, who often use its pages for reliable explanations of basic concepts:

Physicists -- researchers, professors, and students -- use Wikipedia daily. When I need the transition temperature for a Bose-Einstein condensate (prefactor and all), or when I want to learn about the details of an unfamiliar quantum algorithm, Wikipedia is my first stop. When a graduate student sends me research notes that rely on unfamiliar algebraic structures, they reference Wikipedia.

That's from a blog post on the open access publisher Public Library of Science (PLOS) Web site. It's an announcement of an interesting new initiative to bolster the number of physicists contributing to Wikipedia by writing not just new articles for the online encyclopedia, but peer-reviewed ones. The additional element aims to ensure that the information provided is of the highest quality -- not always the case for Wikipedia articles, whatever their other merits. As the PLOS post explains, the new pages have two aspects:

- A peer-reviewed 'article' in [the flagship online publication] PLOS ONE, which is fixed, peer-reviewed openly via the PLOS Wiki and citable, giving information about that particular topic.

- That finalized article is then submitted to Wikipedia, which becomes a living version of the document that the community can refine, build on, and keep up to date.

The two-pronged approach of these "Topic Pages" has a number of benefits. It means that Wikipedia gains high-quality, peer-reviewed articles, written by experts; scientists just starting out gain an important new resource with accessible explanations of often highly-technical topics; and the scientists writing Topic Pages can add them to their list of citable publications -- an important consideration for their careers, and an added incentive to produce them.

Other PLOS titles such as PLOS Computational Biology and PLOS Genetics have produced a few Topic Pages previously, but the latest move represents a major extension of the idea. As the blog post notes, PLOS ONE is initially welcoming articles on topics in quantum physics, but over time it plans to expand to all of physics. Let's hope it's an idea that catches on and spreads across all academic disciplines, since everyone gains from the approach -- not least students researching their homework.

Posted on Techdirt - 7 January 2019 @ 7:33pm

China Starts Using Facial Recognition-Enabled 'Smart' Locks In Its Public Housing

from the just-wait-until-they-know-your-citizen-score-too dept

Surveillance using facial recognition is sweeping the world. That's partly for the usual reason that the underlying digital technology continues to become cheaper, more powerful and thus more cost-effective. But it's also because facial recognition can happen unobtrusively, at a distance, without people being aware of its deployment. In any case, many users of modern smartphones have been conditioned to accept it unthinkingly, because it's a quick and easy way to unlock their device. This normalization of facial recognition is potentially bad news for privacy and freedom, as this story in the South China Morning Post indicates:

Beijing is speeding up the adoption of facial recognition-enabled smart locks in its public housing programmes as part of efforts to clamp down on tenancy abuse, such as illegal subletting.

The face-scanning system is expected to cover all of Beijing's public housing projects, involving a total of 120,000 tenants, by the end of June 2019.

Although a desire to stop tenancy abuses sounds reasonable enough, it's important to put the move in a broader context. As Techdirt reported back in 2017, China is creating a system storing the facial images of every Chinese citizen, with the ability to identify any one of them in three seconds. Although the latest use of facial recognition with "smart" locks is being run by the Beijing authorities, such systems don't exist in isolation. Everything is being cross-referenced and linked together to ensure a complete picture is built up of every citizen's activities -- resulting in what is called the "citizen score" or "social credit" of an individual. China said last year that it would start banning people with "bad" citizen scores from using planes and trains for up to a year. Once the "smart" locks are in place, it would be straightforward to make them part of the social credit system and its punishments -- for example by imposing a curfew on those living at an address, or only allowing certain "approved" visitors.

Even without using "smart" locks in this more extreme way, the facial recognition system could record everyone who came visiting, and how long they stayed, and transmit that data to a central monitoring station. The scope for abuse by the authorities is wide. If nothing else, it's a further reminder that if you are not living in China, where you may not have a choice, voluntarily installing "smart" Internet of Things devices may not be that smart.

Posted on Techdirt - 19 December 2018 @ 7:59pm

German City Wants Names And Addresses Of Airbnb Hosts; Chinese Province Demands Full Details Of Every Guest Too

from the sharing,-but-not-like-that dept

Online services like Airbnb and Uber like to style themselves as part of the "sharing economy". In truth, they are just new twists on the rental sector, taking advantage of the Internet's widespread availability to broaden participation and ease negotiation. This has led to a tension between the online services and traditional local regulators, something Techdirt noted in the US, back in 2016. Similar battles are still being fought around the world. Here's what is happening in Germany, as reported by Out-Law.com:

The City of Munich asked Airbnb to provide it with all advertisements for rooms in the city which exceeded the permissible maximum lease period [of eight weeks in a calendar year]. Specifically, for the period from January 2017 to July 2018, it wanted Airbnb to disclose the addresses of the apartments offered as well as the names and addresses of the hosts.

Airbnb challenged the request before the administrative court in Munich, which has just ruled that the US company must comply with German laws, even though its European office is based in Ireland. It said that the request was lawful, and did not conflict with the EU's privacy regulations. Finally, it ruled that the City of Munich's threat to impose a €300,000 fine on Airbnb if it did not comply with its information request was also perfectly OK. Presumably Airbnb will appeal against the decision, but if it is confirmed it could encourage other cities in Germany to make similar requests. At least things there aren't as bad as in China. According to a post from TechNode:

The eastern Chinese province of Zhejiang will require online home-sharing platforms, including Airbnb, to report owner and guest information to the province's Public Security Department. The platforms will need to check, register, and report the identity of both parties, including the time the guest plans to arrive and leave the property.

That information provides a very handy way of keeping tabs on people travelling around the province who stay in Airbnb properties and the like. It's yet another example of how the Chinese authorities are forcing digital services to help keep an eye on every aspect of citizens' lives.

Posted on Techdirt - 14 December 2018 @ 3:40am

Top EU Court's Advocate General Says German Link Tax Should Not Be Applied -- But On A Technicality

from the nice,-but-it-won't-stop-article-11 dept

As numerous Techdirt posts have explained, there are two really problematic areas in the EU's proposed copyright directive: Article 13, which will require pretty much every major online site to filter uploaded content, and Article 11, the so-called "link tax", more formally known as an "ancillary copyright". It's yet another example of the copyright ratchet -- the fact that laws governing copyright only ever get stronger, in favor of the industry, never weaker, in favor of the public. We know for sure that Article 11 will be a disaster because it's already been tried twice -- in Germany and Spain -- and failed both times.

Despite that fact, the German and Spanish laws remain on the books in their respective countries. VG Media, the German collective management organization that handles copyright on behalf of press publishers and others, lost no time in bringing a case against Google. It alleged that the US Internet company had used text excerpts, images and videos from press and media material produced by VG Media's members without paying a fee.

Alongside the issue of whether Google did indeed infringe on the new law, there is another consideration arising out of some fairly obscure EU legislation. If the new German ancillary copyright law is "a technical regulation specifically aimed at a particular information society service", then it would require prior notification to the European Commission in order to be applicable. The German court considering VG Media's case asked the Court of Justice of the European Union (CJEU), the EU's top court, to decide whether or not the link tax law is indeed a "technical regulation" of that kind. As is usual for CJEU cases, one of the court's Advocates General has offered a preliminary opinion before the main ruling is handed down (pdf). It concludes:

the Court should rule that national provisions such as those at issue, which prohibit only commercial operators of search engines and commercial service providers which edit content, but not other users, including commercial users, from making press products or parts thereof (excluding individual words and very short text excerpts) available to the public constitute rules specifically aimed at information society services. Further, national provisions such as those at issue constitute a technical regulation, subject to the notification obligation under that Directive.

It follows therefore, that in the absence of notification of these national provisions to the [European] Commission, these new German copyright rules cannot be applied by the German courts.

Although that sounds great, there are two caveats. One is that the CJEU is not obliged to follow the Advocate General's reasoning, although it often does. This means that it is quite likely that the top EU court will rule that Germany's link tax cannot be applied, and thus that Google has not infringed on any snippets produced by VG Media's members. The more important caveat is that even if the CJEU does take that view, it won't affect Article 11, which is EU, not national, legislation, and not finalized yet. So we are still facing the dire prospect of an EU-wide ancillary copyright that not only won't work, but also is something that many publishers don't even want.

Posted on Techdirt - 11 December 2018 @ 3:31pm

How Bike-Sharing Services And Electric Vehicles Are Sending Personal Data To The Chinese Government

from the why-we-can't-have-nice-things dept

A year ago, Techdirt wrote about the interesting economics of bike-sharing services in China. As the post noted, competition is fierce, and the profit margins slim. The real money may be coming from gathering information about where people riding these bikes go, and what they may be doing, and selling it to companies and government departments. As we warned, this was something that customers in the West might like to bear in mind as these Chinese bike-sharing startups expand abroad. And now, the privacy expert Alexander Hanff has come across exactly this problem with the Berlin service of the world's largest bike-sharing operator, Mobike:

data [from the associated Mobike smartphone app] is sent back to Mobike's servers in China, it is shared with multiple third parties (the privacy policy limits this sharing in no way whatsoever) and they are using what is effectively a social credit system to decrease your "score" if you prop the bike against a lamp post to go and buy a loaf of bread.

Detailed location data of this kind is far from innocuous. It can be mined to provide a disconcertingly complete picture of your habits and life:

through the collection and analysis of this data the Chinese Government now likely have access to your name, address (yes it will track your address based on the location data it collects), where you work, what devices you use, who your friends are (yes it will track the places you regularly stop and if they are residential it is likely they will be friends and family). They also buy data from other sources to find out more information by combining this data with the data they collect directly. They know what your routines are such as when you are likely to be out of the house either at work, shopping or engaging in social activities; and for how long.
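
None of this requires sophisticated machinery. A few lines of Python (with invented trips and coordinates, purely for illustration) are enough to guess a rider's home address from timestamped location data of the kind Mobike collects:

```python
from collections import Counter

def likely_home(points):
    """points: (hour_of_day, lat, lon) tuples from ride start/end events.
    The location that recurs most often late at night is a strong
    guess at where the rider lives."""
    night = [(lat, lon) for hour, lat, lon in points if hour >= 22 or hour < 6]
    return Counter(night).most_common(1)[0][0] if night else None

trips = [
    (23, 52.520, 13.405),  # late-night drop-offs cluster at one address
    (23, 52.520, 13.405),
    (8,  52.510, 13.390),  # morning departures near a workplace
    (22, 52.520, 13.405),
]
print(likely_home(trips))  # (52.52, 13.405)
```

Scale that from four trips to months of data across millions of users, add purchased third-party data sets, and the picture Hanff paints stops sounding hyperbolic.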

As Hanff points out, most of this is likely to be illegal under the EU's GDPR. But Mobike's services are available around the world, including in the US. Although Mobike's practices can be challenged in the EU, elsewhere there may be little that can be done.

And if you think the surveillance made possible by bike sharing is bad, wait till you see what can be done with larger vehicles. As many people have noted, today's complex devices no longer have computers built in: they are, essentially, computers with specialized capabilities. For example, electric cars are computers with an engine and wheels. That means they are constantly producing large quantities of highly-detailed data about every aspect of the vehicle's activity. As such, the data from electric cars is a powerful tool for surveillance even deeper than that offered by bike sharing. According to a recent article from Associated Press, it is an opportunity that the authorities have been quick to seize in China:

More than 200 manufacturers, including Tesla, Volkswagen, BMW, Daimler, Ford, General Motors, Nissan, Mitsubishi and U.S.-listed electric vehicle start-up NIO, transmit position information and dozens of other data points to [Chinese] government-backed monitoring centers, The Associated Press has found. Generally, it happens without car owners' knowledge.

What both these stories reveal is how the addition of digital capabilities to everyday objects -- either indirectly through smartphone apps, as with Mobike, or directly in the case of computerized electric vehicles -- brings with it the risk of pervasive monitoring by companies and the authorities. It's part of a much larger problem of how to enjoy the benefits of amazing technology without paying an unacceptably high price in terms of sacrificing privacy.

Posted on Techdirt - 5 December 2018 @ 7:36pm

Some EU Nations Still Haven't Implemented The 2013 Marrakesh Treaty For The Blind

from the copyright-trumps-compassion dept

The annals of copyright are littered with acts of extraordinary stupidity and selfishness on the part of the publishers, recording industry and film studios. But few can match the refusal by the publishing industry to make it easier for the blind to gain access to reading material that would otherwise be blocked by copyright laws. Indeed, the fact that it took so long for what came to be known as the Marrakesh Treaty to be adopted is a shameful testimony to the publishing industry's belief that copyright maximalism is more important than the rights of the visually impaired. As James Love, Director of Knowledge Ecology International (KEI), wrote in 2013, when the treaty was finally adopted:

It [is] difficult to comprehend why this treaty generated so much opposition from publishers and patent holders, and why it took five years to achieve this result. As we celebrate and savor this moment, we should thank all of those who resisted the constant calls to lower expectations and accept an outcome far less important than what was achieved today.

Even once the treaty was agreed, the publishing industry continued to fight against making it easier for the visually impaired to enjoy better access to books. In 2016, Techdirt reported that the Association of American Publishers was still lobbying to water down the US ratification package. Fortunately, as an international treaty, the Marrakesh Treaty came into force around the world anyway, despite the US foot-dragging.

Thanks to heavy lobbying by the region's publishers, the EU has been just as bad. It only formally ratified the Marrakesh Treaty in October of this year. As an article on the IPKat blog explains, the EU has the authority to sign and ratify treaties on behalf of the EU Member States, but it then requires the treaty to be implemented in national law:

In this case, the EU asked that national legislators reform their domestic copyright law by transposing the 2017/1564 Directive of 13 September 2017. The Directive requires that all necessary national measures be implemented by 12 October 2018. Not all member states complied by this deadline, whereby the EU Commission introduced infringement procedures against them for non-compliance. The list of the non-compliant countries is as follows:

Belgium, Cyprus, Czech Republic, Germany, Estonia, Greece, Finland, France, Italy, Lithuania, Luxembourg, Latvia, Poland, Portugal, Romania, Slovenia, UK

The IPKat post points out that some of the countries listed there, such as the UK and France, have in fact introduced exceptions to copyright to enable the making of accessible copies for the visually impaired. It's still a bit of a mystery why they are on the list:

At the moment, the Commission has not published details regarding the claimed non-compliance by the countries listed. We cannot assume that the non-compliance proceedings were launched because the countries failed to introduce the exceptions in full, because countries can also be sanctioned if the scope of the exception implemented is too broad, so much so that it is disproportionately harmful to the interest of rightsholders. So we will have to wait and see what part of the implementation was deemed not up to scratch by the Commission.

As that indicates, it's possible that some of the countries mentioned are being criticized for non-compliance because they were too generous to the visually impaired. If it turns out that industry lobbyists are behind this, it would be yet another astonishing demonstration of selfishness from publishers whose behavior in connection with the Marrakesh Treaty has been nothing short of disgusting.

Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+


Posted on Techdirt - 27 November 2018 @ 2:15am

New GDPR Ruling In France Could Dramatically Re-shape Online Advertising

from the not-going-with-the-consent-flow dept

The EU's General Data Protection Regulation only came into force in May of this year. Since then, privacy regulators across the EU have been trying to work out what it means in practice. As Techdirt has reported, some of the judgments that have emerged were pretty bad. A new GDPR ruling from France has just appeared that looks likely to have a major impact on how online advertising works in the EU, and therefore probably further afield, given the global nature of the Internet.

The original decision in French is rather dense, although it does include the use of the delightful word "mobinaute", which is apparently the French term for someone accessing the Internet on a mobile device. If you'd like to read something in English, TechCrunch has a long and clear explanation. There's also a good, shorter take from Johnny Ryan of the browser company Brave, which is particularly interesting for reasons I'll explain below.

First, the facts of the case. The small French company Vectaury gathers personal information, including geolocation, about millions of users of thousands of mobile apps on behalf of the companies that created them. It analyzes the data to create user profiles that companies might want to advertise to:

We continuously analyse, classify and enrich hundreds of thousands of profiles in order to offer you big data predictive models and actionable audience segments at any time. Our geo-profiling algorithm relies on a framework of more than 80 million points of interest around the world, grouped into 450 categories.

Vectaury sells access to those profiles using a standard industry technique known as "real-time bidding" (RTB). This really does happen in real-time: advertisers can bid to display their ads on Web pages as they are loading on a user's mobile. The key benefit is that it allows ads to be tightly targeted to audiences that are more likely to respond to them. However, to do this, personal information has to be sent to many potential advertisers so that they can submit their (automated) bids.
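To make the privacy implications concrete, here is a minimal sketch of how an RTB auction of this kind operates. All names and structures are hypothetical illustrations, not any real ad exchange's API: the point is that a single bid request carrying the user's personal data goes out to every bidder, and only afterwards is a winner chosen.

```python
from dataclasses import dataclass


@dataclass
class BidRequest:
    # Personal data broadcast to every potential bidder,
    # whether or not they end up winning the auction.
    user_id: str
    geolocation: tuple
    page_url: str


class Bidder:
    def __init__(self, name, budget_per_impression):
        self.name = name
        self.budget = budget_per_impression

    def bid(self, request):
        # Each bidder sees the full request before bidding --
        # this broadcast is the privacy problem at issue.
        return self.budget


def run_auction(request, bidders):
    """Send the bid request to all bidders; the highest bid wins."""
    bids = [(bidder.bid(request), bidder) for bidder in bidders]
    price, winner = max(bids, key=lambda b: b[0])
    return winner, price


request = BidRequest("user-123", (48.8566, 2.3522), "https://example.com/article")
bidders = [Bidder("adco-a", 0.8), Bidder("adco-b", 1.2)]
winner, price = run_auction(request, bidders)
```

Note that every `Bidder` receives the request, including the losers: the data is disseminated regardless of the auction's outcome.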

That's a problem under the GDPR, since users are supposed to give their consent before personal data is transmitted to companies in this way. To get around that problem, the industry has developed what are known as consent management platforms (CMPs). In theory, these allow users to pick and choose exactly what kind of information is sent to which advertisers. But in practice they usually amount to a top-level button marked "I accept", which everyone clicks because working through the subsidiary pages underneath is too much effort. The top-level acceptance grants permission to all the bundled advertisers, hidden in the lower levels of the CMP, to use personal data as they wish.
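The bundling can be sketched as a simple data structure (again, the vendor names are invented for illustration): granular per-vendor controls exist, but the top-level button grants everything in one step.

```python
# Hypothetical CMP vendor list, bundled beneath a single
# top-level choice on the consent dialog.
BUNDLED_VENDORS = ["adco-a", "adco-b", "data-broker-c"]


def accept_all():
    """What the top-level 'I accept' button effectively does:
    grant every bundled vendor permission at once."""
    return {vendor: True for vendor in BUNDLED_VENDORS}


def granular_choice(allowed):
    """The per-vendor controls most users never open."""
    return {vendor: vendor in allowed for vendor in BUNDLED_VENDORS}


consent = accept_all()
```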

When the French data protection authority CNIL carried out an on-site inspection of Vectaury, it found the company was holding the personal data of 67.6 million people. However, it did not accept that Vectaury had been given meaningful permission to use that data through the use of the bundled permission system. In a trail-blazing decision, CNIL said that Vectaury couldn't simply point to contracts that required its partners to ask users for permission to share personal data: Vectaury had to be able to show that it had checked it really did have permission from everyone whose data it had acquired.
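The distinction CNIL drew can be illustrated with a toy sketch (hypothetical data, not CNIL's wording): a contractual promise from partners is one thing, but what the regulator demanded is a verifiable per-user record of consent.

```python
# The approach CNIL rejected: a contract clause in which partners
# merely promise that users consented.
contracts = {"partner-x": "partner warrants that users consented"}

# What CNIL required instead: evidence of consent for each
# individual whose data is held.
consent_records = {"user-123": {"partner": "partner-x", "timestamp": "2018-06-01"}}


def can_process_via_contract(partner):
    # Proves only that a promise exists, not that any user consented.
    return partner in contracts


def can_process(user_id):
    # Only a verifiable per-user consent record counts.
    return user_id in consent_records
```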

That ruling is not just a big problem for Vectaury -- it's hard to see how it could possibly confirm consent for the 67.6 million people whose data it holds. It's also a problem for the online advertising industry in Europe, which uses a framework for GDPR "consent flow" that has been created by industry trade association and standards body, IAB Europe. Vectaury's system is essentially the same as IAB Europe's, so it would seem that the latest ruling by the French data protection authority also calls into question the industry standard technique for obtaining consent that is vital for the RTB process. Without that "consent flow", it is not possible to share personal data so that automated real-time bids can be submitted.

If that interpretation is correct, it would mean that RTB as currently practiced in the EU will no longer be allowed. In fact, the RTB system was already under threat because of a GDPR complaint filed a couple of months ago with the Irish Data Protection Commissioner and the UK Information Commissioner, which notes:

Every time a person visits a website and is shown a "behavioural" ad on a website, intimate personal data that describes each visitor, and what they are watching online, is broadcast to tens or hundreds of companies. Advertising technology companies broadcast these data widely in order to solicit potential advertisers' bids for the attention of the specific individual visiting the website.

A data breach occurs because this broadcast, known as a "bid request" in the online industry, fails to protect these intimate data against unauthorized access. Under the GDPR this is unlawful.

The three complainants are Jim Killock, Executive Director of the Open Rights Group, Michael Veale of University College London, and Johnny Ryan of Brave, mentioned above. His blog post about the new French GDPR ruling concludes:

This is the latest in a series of decisions published by CNIL against adtech companies. ... What marks this decision apart are the broad implications for RTB, and for the IAB consent framework.

It could also be a problem for Google, which relies on a similar approach for its own real-time ad bidding system. The potential implications of the CNIL ruling across the EU are a further indication of the massive long-term impact the GDPR will have on the Internet, perhaps in multiple and unexpected ways.



Posted on Techdirt - 8 November 2018 @ 7:01pm

Leading Open Access Supporters Ask EU To Investigate Elsevier's Alleged 'Anti-Competitive Practices'

from the are-you-listening,-Commissioner-Vestager? dept

Back in the summer, we wrote about the paleontologist Jon Tennant, who had submitted a formal complaint to the European Commission regarding the relationship between the publishing giant Elsevier and the EU's Open Science Monitor. Now Tennant has joined with another leading supporter of open access, Björn Brembs, in an even more direct attack on the company and its practices, reported here by the site Research Europe:

Two academics have demanded the European Commission investigate the academic publisher Elsevier for what they say is a breach of EU competition rules that is harming research.

Palaeontologist Jon Tennant and neuroscientist Björn Brembs, who are both advocates for making research results openly available, say the academic publishing market "is clearly not functioning well" in an official complaint about Elsevier's parent company RELX Group.

The pair claim RELX and Elsevier are in breach of EU rules both due to general problems with the academic publishing market and "abuse of a dominant position within this market".

The 22-page complaint spells out what the problem is. It makes the following important point about the unusual economics of the academic publishing market:

For research to progress, access to all available relevant sources is required, which means that there is no ability to transfer or substitute products, and there is little to no inter-brand competition from the viewpoint of consumers. If a research team requires access to knowledge contained within a journal, they must have access to that specific journal, and cannot substitute it for a similar one published by a competitor. Indeed, the entire corpus of research knowledge is built on this vital and fundamental process of building on previously published works, which drives up demand for all relevant published content. As such, publishers do not realistically compete with each other, as all their products are fundamentally unique (i.e., each publisher has a 100% market share for each journal or article), and unequivocally in high demand due to the way scholarly research works. The result of this is that consumers (i.e., research institutions and libraries) have little power to make cost-benefit evaluations to decide whether or not to purchase, and have no choice but to pay whatever price the publishers ask with little transparency over costs, which we believe is a primary factor that has contributed to more than a 300% rise in journal prices above inflation since 1986. Thus, we believe that a functional and competitive market is not currently able to form due to the practices of dominant players, like Elsevier, in this sector.

Most of the complaint is a detailed analysis of why academic publishing has become so dysfunctional, and is well worth reading by anyone interested in understanding the background to open access and its struggles.

As to what the complaint might realistically achieve, Tennant told Techdirt that there are three main possibilities. The European Commission can simply ignore it. It can respond and say that it doesn't think there is a case to answer, in which case Tennant says he will push the Commission to explain why. Finally, in the most optimistic outcome, the EU could initiate a formal investigation of Elsevier and the wider academic publishing market. Although that might seem too much to hope for, it's worth noting that the EU Competition Authority is ultimately under the Competition Commissioner, Margrethe Vestager. She has been very energetic in her pursuit of Internet giants like Google. It could certainly be a hugely significant moment for open access if she started to take an interest in Elsevier in the same way.


