Glyn Moody’s Techdirt Profile


About Glyn Moody (Techdirt Insider)

Posted on Techdirt - 19 October 2017 @ 8:01pm

A Tale of Two Transparencies: Why The EU And Activists Will Always Disagree Over Trade Deal Negotiations

from the TTIP,-remember-that? dept

Although the Transatlantic Trade and Investment Partnership (TTIP) has dropped off the radar completely since Donald Trump's election, for some years it was a key concern of both the US and European governments, and a major theme of Techdirt's posts. One of the key issues was transparency -- or the lack of it. Eventually, the European Commission realized that its refusal to release information about the negotiations was seriously undermining its ability to sell the deal to the EU public, and it began making some changes on this front, as we discussed back in 2015. Since then, transparency has remained a theme of the European Commission's initiatives. Last month, in his annual State of the Union address, President Jean-Claude Juncker unveiled his proposals for trade policy. One of them was all about transparency:

the Commission has decided to publish as of now all its recommendations for negotiating directives for trade agreements (known as negotiating mandates). When they are submitted to the European Parliament and the Council, those documents will in parallel be sent automatically to all national Parliaments and will be made available to the general public. This should allow for a wide and inclusive debate on the planned agreements from the start.

An interesting article on Borderlex explores why the European Commission's moves to open up trade policy have not satisfied, and probably never will satisfy, the activists who have been pushing for more transparency, and why in this area there is an unbridgeable gulf between them and EU politicians. In contrast to Juncker's limited plan to publish negotiating directives in order to allow "a wide and inclusive debate on the planned agreements", this is what activists want, according to the article:

timely release of textual proposals on all negotiating positions, complete lists and minutes of meetings of Commission officials with third parties, consolidated texts, negotiating mandates, and all correspondence between third parties and officials.

Activists are keen to see what is happening in detail throughout the negotiations, not just a top-level view at the start, or initial textual proposals for each chapter with nothing afterwards. The article suggests that this is not simply a case of civil society wanting more information for its own sake, but rather that it reflects completely different conceptions of what transparency means. Transparency is intimately bound up with accountability, which raises the key question: accountability to whom?

These two different views reflect a seminal academic distinction between 'delegation' and 'participation' models of accountability in international politics. In a 'delegation' model, an organisation (such as the Commission) is accountable to those who have granted it a mandate (in the EU: the Council, the [European Parliament] and national parliaments). Transparency and participation should first and foremost be directed to them. Extending managed transparency to the wider public can be instrumentally used to increase trust.

In a 'participation model', in contrast, organisations are accountable to those who bear the burden of the decisions that are taken. If contemporary trade policy impacts people's daily lives, the people -- directly or through civil society organisations that claim to represent them -- should be able to see what is going on, and be able to influence the process. Therefore, there is a presupposition for openness, disclosure, and close participation.

The article's authors suggest that for activists, transparency is a means to an end -- gaining influence through participation -- and it is the European Commission's refusal to allow civil society any meaningful role in trade negotiations that guarantees that token releases of a few policy documents will never be enough.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

4 Comments | Leave a Comment..

Posted on Techdirt - 18 October 2017 @ 3:30am

Details Emerge Of World's Biggest Facial Recognition Surveillance System, Aiming To Identify Any Chinese Citizen In Three Seconds

from the but-what-happens-when-the-dataset-leaks-out? dept

Back in July, Techdirt wrote about China's plan to build a massive surveillance system based on 600 million CCTV cameras around the country. Key to the system would be facial recognition technology that would allow Chinese citizens to be identified using a pre-existing centralized image database plus billions more photos found on social networks. Lingering doubts about whether China is going ahead with such an unprecedented surveillance system may be dispelled by an article in the South China Morning Post, which provides additional details:

China is building the world's most powerful facial recognition system with the power to identify any one of its 1.3 billion citizens within three seconds.

The goal is for the system to be able to match someone's face to their ID photo with about 90 per cent accuracy.

The project, launched by the Ministry of Public Security in 2015, is under development in conjunction with a security company based in Shanghai.

The article says that the system will use cloud computing facilities to process images from the millions of CCTV cameras located across the country. The company involved is Isvision, which has been using facial recognition with CCTV cameras since 2003. The earliest deployments were in the highly-sensitive Tiananmen Square area. Other hotspots where its technology has been installed are Tibet and Xinjiang, where surveillance has been at a high level for many years.

However, the report also cautions that the project is encountering "many difficulties" due to the technical limits of facial recognition and the sheer size of the database involved. A Chinese researcher is quoted as saying that some totally unrelated people in China have faces so alike that even their parents cannot tell them apart. Another issue is managing the biometric data, which is around 13 terabytes for the facial information, and 90 terabytes for the full dataset, which includes additional personal details on everyone in China.
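Taken at face value, the numbers quoted in the article imply some striking back-of-envelope figures. A quick sketch (using the 1.3 billion, 13 terabyte, 90 terabyte and three-second figures above; the decimal-terabyte convention is my assumption):

```python
# Back-of-envelope figures implied by the article's numbers:
# 1.3 billion people, 13 TB of facial data, 90 TB in total,
# and a full-database match within three seconds.

POPULATION = 1.3e9
FACIAL_DATA_TB = 13
FULL_DATASET_TB = 90
MATCH_TIME_S = 3

TB = 1e12  # bytes per terabyte (decimal convention)

facial_bytes_per_person = FACIAL_DATA_TB * TB / POPULATION
total_bytes_per_person = FULL_DATASET_TB * TB / POPULATION
comparisons_per_second = POPULATION / MATCH_TIME_S  # if done by brute force

print(f"Facial template per person: ~{facial_bytes_per_person / 1e3:.0f} KB")
print(f"Full record per person:     ~{total_bytes_per_person / 1e3:.0f} KB")
print(f"Brute-force match rate:     ~{comparisons_per_second / 1e6:.0f} million comparisons/s")
```

Ten kilobytes per face is consistent with a compact template rather than raw images, and a brute-force search would need over 400 million comparisons per second, which helps explain why the article talks about cloud computing facilities rather than a single machine.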

As the South China Morning Post article rightly notes, it won't be long before 13 terabytes will fit on a single portable USB hard drive, which raises the issue of facial recognition data being copied and used for other unauthorized purposes:

But a network security vendor for the Ministry of Public Security dismissed the possibility.

"To download the whole data set is as difficult as launching a missile with a nuclear warhead. It requires several high-ranking officials to insert and turn their keys at the same time," the vendor said.

Given all that we know about the lamentable state of computer security around the world, even for highly-sensitive data, that claim seems a little hyperbolic. Since the Chinese government is apparently determined to build and operate this huge facial recognition system despite all the challenges, the unnamed network security vendor quoted above may find out the hard way that exfiltrating some or even all of that data really isn't rocket science.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

34 Comments | Leave a Comment..

Posted on Techdirt - 11 October 2017 @ 7:38pm

New 'Coalition For Responsible Sharing' About To Send Millions Of Take-Down Notices To Stop Researchers Sharing Their Own Papers

from the how-responsible-is-that? dept

A couple of weeks ago, we wrote about a proposal from the International Association of Scientific Technical and Medical Publishers (STM) to introduce upload filtering on the ResearchGate site in order to stop authors from sharing their own papers without "permission". In its letter to ResearchGate, STM's proposal concluded with a thinly-veiled threat to call in the lawyers if the site refused to implement the upload filters. In the absence of ResearchGate's acquiescence, a newly-formed "Coalition for Responsible Sharing", whose members include the American Chemical Society (ACS), Brill, Elsevier, Wiley and Wolters Kluwer, has issued a statement confirming the move:

Following unsuccessful attempts to jointly find ways for scholarly collaboration network ResearchGate to run its service in a copyright-compliant way, a coalition of information analytics businesses, publishers and societies is now left with no other choice but to take formal steps to remedy the illicit hosting of millions of subscription articles on the ResearchGate site.

Those formal steps include sending "millions of takedown notices for unauthorized content on its site now and in the future." Two Coalition publishers, ACS and Elsevier, have also filed a lawsuit in a German regional court, asking for “clarity and judgement” on the legality of ResearchGate's activities. Justifying these actions, the Coalition's statement says: "ResearchGate acquires volumes of articles each month in violation of agreements between journals and authors" -- and that, in a nutshell, is the problem.

The articles posted on ResearchGate are generally uploaded by the authors; they want them there so that their peers can read them. They also welcome the seamless access to other articles written by their fellow researchers. In other words, academic authors are perfectly happy with ResearchGate and how it uses the papers that they write, because it helps them work better as researchers. A recent post on The Scholarly Kitchen blog noted:

Researchers particularly appreciate ResearchGate because they can easily follow who cites their articles, and they can follow references to find other articles they may find of interest. Researchers do not stop to think about copyright concerns and in fact, the platform encourages them, frequently, to upload their published papers.

The problem lies in the unfair and one-sided contracts academic authors sign with publishers, which often do not allow them to share their own published papers freely. The issues with ResearchGate would disappear if researchers stopped agreeing to these completely unnecessary restrictions -- and if publishers stopped demanding them.

The Coalition for Responsible Sharing's statement makes another significant comment about ResearchGate: that it acquires all these articles "without making any contribution to the production or publication of the intellectual work it hosts." But much the same could be said about publishers, which take papers written by publicly-funded academics for free, chosen by academics for free, and reviewed by academics for free, and then add some editorial polish at the end. Despite their minimal contributions, publishers -- and publishers alone -- enjoy the profits that result. The fact that their margins remain extremely high strongly suggests that ResearchGate and similar scholarly collaboration networks are not actually harming their business. The growing popularity and importance of unedited preprints confirms that what publishers add is dispensable. That makes the Coalition for Responsible Sharing's criticism of ResearchGate and its business model deeply hypocritical.

It is also foolish. By sending millions of take-down notices to ResearchGate -- and thus making it harder for researchers to share their own papers on a site they currently find useful -- the Coalition for Responsible Sharing will inevitably push people to use other alternatives, notably Sci-Hub. Unlike ResearchGate, which largely offers articles uploaded by their own authors, Sci-Hub generally sources its papers without the permission of the academics. So, once more, the clumsy actions of publishers desperate to assert control at all costs make it more likely that unauthorized copies will be downloaded and shared, not less. How responsible is that?

Follow me @glynmoody on Twitter, and +glynmoody on Google+

53 Comments | Leave a Comment..

Posted on Techdirt - 4 October 2017 @ 7:34pm

Elsevier's Latest Brilliant Idea: Adding Geoblocking To Open Access

from the how-about-no? dept

We've just written about a troubling move by Elsevier to create its own, watered-down version of Wikipedia in the field of science. If you are wondering what other plans it has for the academic world, here's a post from Elsevier’s Vice President, Policy and Communications, Gemma Hersh, that offers some clues. She's "responsible for developing and refreshing policies in areas related to open access, open data, text mining and others," and in "Working towards a transition to open access", Hersh meditates upon the two main kinds of open access, "gold" and "green". She observes:

While gold open access offers immediate access to the final published article, the trade-off is cost. For those that can't or don't wish to pay the article publishing charge (APC) for gold open access, green open access -- making a version of the subscription article widely available after a time delay or embargo period -- remains a viable alternative to enabling widespread public access.

She has a suggestion for how the transition from green open access to gold open access might be effected:

Europe is a region where a transition to fully gold open access is likely to be most cost-neutral and, perhaps for this reason, where gold OA currently has the highest policy focus. This is in stark contrast to other research-intensive countries such as the US, China and Japan, which on the whole have pursued the subscription/green open access path. Therefore one possible first step for Europe to explore would be to enable European articles to be available gold open access within Europe and green open access outside of Europe.

Blithely ignoring the technical impossibility of enforcing an online geographical gold/green border, Hersh is proposing to add all the horrors of geoblocking -- a long-standing blight on the video world -- to open access. But gold open access papers that aren't fully accessible outside Europe simply aren't open access at all. The whole point of open access is that it makes academic work freely available to everyone, everywhere, without restriction -- unlike today, where only the privileged few can afford wide access to research that is often paid for by the public.
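To see why such a scheme is so easy to circumvent, consider how IP-based geoblocking is normally implemented. A minimal sketch (the lookup table and function names here are hypothetical; real deployments use commercial GeoIP databases, but the structural weakness is the same):

```python
# A minimal sketch of IP-based geoblocking. The server only ever sees the
# connecting IP address, so any reader routing through a VPN or proxy with
# an exit in an "allowed" region passes the check.

EU_COUNTRY_CODES = {"DE", "FR", "NL", "ES", "IT", "BE", "PL"}  # abbreviated

def country_of(ip: str) -> str:
    # Stand-in for a GeoIP database lookup (hypothetical data).
    geoip_table = {"192.0.2.10": "DE", "203.0.113.5": "US"}
    return geoip_table.get(ip, "??")

def may_read_gold_oa(ip: str) -> bool:
    return country_of(ip) in EU_COUNTRY_CODES

print(may_read_gold_oa("192.0.2.10"))   # German address: allowed
print(may_read_gold_oa("203.0.113.5"))  # US address: blocked...
# ...unless the US reader simply connects via a European VPN exit,
# at which point the server sees a European IP and waves them through.
```

The entire "border" rests on an attribute of the connection that the reader, not the publisher, controls.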

It's hard to know why Elsevier is putting forward an idea that is self-evidently preposterous. Perhaps it now feels it has such a stranglehold on the entire academic knowledge production process that it doesn't even need to hide its contempt for open access and those who support it.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

14 Comments | Leave a Comment..

Posted on Techdirt - 29 September 2017 @ 3:42pm

Elsevier Launching Rival To Wikipedia By Extracting Scientific Definitions Automatically From Authors' Texts

from the don't-do-as-we-do,-do-as-we-say dept

Elsevier is at it again. It has launched a new (free) service that is likely to undermine open access alternatives by providing Wikipedia-like definitions generated automatically from texts it publishes. As an article on the Times Higher Education site explains, the aim is to stop users of the publishing giant's ScienceDirect platform from leaving Elsevier's walled garden and visiting sites like Wikipedia in order to look up definitions of key terms:

Elsevier is hoping to keep researchers on its platform with the launch of a free layer of content called ScienceDirect Topics, offering an initial 80,000 pages of material relating to the life sciences, biomedical sciences and neuroscience. Each offers a quick definition of a key term or topic, details of related terms and relevant excerpts from Elsevier books.

Significantly, this content is not written to order but is extracted from Elsevier's books, in a process that Sumita Singh, managing director of Elsevier Reference Solutions, described as "completely automated, algorithmically generated and machine-learning based".
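For a sense of what "algorithmically generated" definition extraction can look like, here is a toy sketch (my own illustration, not Elsevier's actual pipeline) based on the classic "X is a ..." copula pattern that simple glossary extractors often start from:

```python
import re

# Toy definition extractor: scan text for sentences of the form
# "Term is/are a/an ..." and build a term -> definition mapping.
# Real systems layer machine learning on top; this shows only the basic idea.

def extract_definitions(text: str) -> dict:
    pattern = re.compile(r"([A-Z][A-Za-z -]+?) (?:is|are) (an? [^.]+)\.")
    return {term.strip(): defn.strip() for term, defn in pattern.findall(text)}

sample = (
    "Apoptosis is a form of programmed cell death. "
    "It occurs in multicellular organisms. "
    "Neurons are a type of electrically excitable cell."
)
print(extract_definitions(sample))
```

Even this crude pattern pulls out two plausible glossary entries from three sentences, which suggests how 80,000 topic pages could be generated from a large enough book corpus without any human writing to order.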

It's typical of Elsevier's unbridled ambition that instead of supporting a digital commons like Wikipedia, it wants to compete with it by creating its own redundant versions of the same information, which are proprietary. Even worse, it is drawing that information from books written by academics who have given Elsevier a license -- perhaps unwittingly -- that allows it to do that. The fact that a commercial outfit mines what are often publicly-funded texts in this way is deeply hypocritical, since Elsevier's own policy on text and data mining forbids other companies from doing the same. It's another example of how Elsevier uses its near-monopolistic stranglehold over academic publishing for further competitive advantage. Maybe it's time anti-trust authorities around the world took a look at what is going on here.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

33 Comments | Leave a Comment..

Posted on Techdirt - 28 September 2017 @ 7:49pm

Chinese High-Tech Startups: Now More Copied Than Copying

from the time-to-wake-up dept

Techdirt has been pointing out for a while that the cliché about Chinese companies being little more than clever copycats, unable to come up with their own ideas, ceased to be true years ago. Anyone clinging to that belief is simply deluding themselves, and is likely to have a rude awakening as Chinese high-tech companies continue to advance in global influence. China's advances in basic research are pretty clear, but what about business innovation? That's an area in which the US has traditionally prided itself on being the world leader. However, an interesting article in the South China Morning Post -- a Hong Kong-based newspaper owned by the Chinese e-commerce giant Alibaba, which has a market capitalization of $400 billion -- explores how it's Chinese ideas that are now being copied:

it's a reflection of a growing trend in which businesses across Southeast Asia look to China for inspiration for everything from e-commerce to mobile payment systems and news apps.

Once derided as a copycat of Western giants, Chinese companies have grown in stature to the point that in many areas they are now seen as the pinnacle of business innovation.

The article mentions dockless bike-sharing, which is huge in China, being copied in California by a startup called Limebike. It notes that Thailand's central bank has introduced a standardized QR code that enables the country's smartphone users to pay for their purchases simply by scanning their devices -- a habit that is well on the way to replacing cash and credit cards in China. In Malaysia, the online second-hand car trading platform Carsome closely based its approach on that of a Chinese company operating in Nanjing. Other copycats of Chinese innovators include:

Orami, Thailand's leading e-commerce business, which started out as a clone of China's online baby product platform Mia; Offpeak, a Malaysian version of the Chinese group buying website Meituan; and BaBe, an Indonesian news app that borrowed the business idea from China's Toutiao and has been downloaded more than 10 million times.

As the article points out, it is perhaps natural that entrepreneurs in Southeast Asia should look to China for ideas given the commonalities of culture. But that kind of creative borrowing can only occur if Chinese companies are producing enough good business ideas that are worth copying. It's evident that they are, and it's time that the West recognized that fact.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

15 Comments | Leave a Comment..

Posted on Techdirt - 27 September 2017 @ 7:37pm

Lawyers Gearing Up To Hit UK With Corporate Sovereignty Claims Totalling Billions Of Dollars Over Brexit

from the nobody-painted-that-on-the-side-of-a-bus dept

We're not hearing much about corporate sovereignty -- also known as "investor-state dispute settlement" (ISDS) -- these days. It's definitely still a problem, especially for smaller countries. But the big fights over the inclusion of corporate sovereignty chapters in the two global trade deals -- the Transatlantic Trade and Investment Partnership (TTIP), and the Trans-Pacific Partnership (TPP) agreement -- have been put on hold for the moment. That's for the simple reason that both TPP and TTIP are in a kind of limbo following the election of Donald Trump as US President with his anti-free trade platform.

TTIP seems completely moribund, whereas TPP -- re-branded as TPP11 to reflect the fact that there are only 11 countries now that the US has pulled out -- is showing the odd twitch of life. A recent article in the Canadian newspaper National Post points out that the departure of the US might even allow some of the worst bits of TPP to be jettisoned:

the Americans insisted on longer intellectual property patent terms and stronger copyright regulations than many countries wanted. Canada will now argue for shorter patent terms, in support of its generic drug sector and in an attempt to keep drug costs down.

Canada is also keen to water down the investor-state dispute settlement negotiated by the U.S. in the original deal, and bolster the state's right to regulate in the public interest.

The move by Canada to rein in some of the worst excesses of corporate sovereignty follows the EU's lead in this area. As Techdirt reported, during the TTIP negotiations between the EU and US, the former suggested replacing the old ISDS with a "new" Investment Court System (ICS). Although the US was not interested, Canada later agreed to this slightly watered-down version for the CETA trade deal with the EU.

The ICS still doesn't exist, and is still something of a mystery in terms of how it will work. It was proposed in an attempt to head off massive public concern about corporations being able to sue governments -- and thus taxpayers -- for huge sums, completely outside the normal legal system, and subject to few constraints. But even ICS was not enough to stop the Belgian region of Wallonia nearly de-railing the CETA deal at the last moment.

Anxious to avoid that happening again, the President of the European Commission, Jean-Claude Juncker, had a rather radical suggestion in his recent State of the Union address. In order to make future trade deals easier to push through the legislative process in the EU, Juncker proposed removing investment protection chapters from them completely, and negotiating a separate deal covering this aspect. One article explains the thinking behind that move:

Slicing out investment protection will give [Juncker] an immediate legal advantage. Under EU law, a trade deal without investment clauses could be ratified exclusively by the European Parliament and by the member countries as represented at the Council in Brussels. That effectively removes the direct veto powers of the Walloons.

Simon Lester, Trade Policy Analyst at the Cato Institute, thinks the US should follow suit -- an idea that someone from the same group suggested a few years ago:

The Europeans have faced a greater struggle with investment protection and ISDS than has been the case in the United States, but these provisions have been a problem here as well. If we want to make it easier to get trade negotiations completed and trade agreements passed by Congress, we should consider following the EU's lead.

Although removing corporate sovereignty from trade deals does not solve the larger problem of giving companies special protection, it is a step in the right direction. For example, after concluding trade deals that lack ISDS chapters, governments may decide to bring those deals into force as quickly as possible in order to enjoy their claimed benefits. When commercial relations then work perfectly well without corporate sovereignty provisions -- as is already the case for both the US-Australia trade deal, and the one between the EU and South Korea, neither of which includes them -- governments may decide to leave it at that, and forget about further negotiations covering investment.

Unfortunately, none of these recent moves is likely to help the UK, currently struggling with the implications of last year's "Brexit" referendum to leave the EU. Ever-inventive lawyers have realized that an unexpected withdrawal of the country from the EU could represent an excellent opportunity for companies that have invested in the UK to claim that they will suffer as a result, and to use corporate sovereignty clauses to claim compensation potentially amounting to billions of dollars. Corporate Europe Observatory has a new post exploring what could happen here:

the UK's impending exit from the European Union may bring new investment arbitration opportunities. The country has 92 investment agreements in force, which investors from other countries could use to file ISDS claims against the UK. In conferences and alerts for their multinational clients, some of the top investment arbitration law firms are already assessing the prospect of such Brexit claims. Depending on how the Brexit negotiations turn out, these lawsuits could be about anything from foreign carmakers or financial companies losing free access to the EU market, to the government scrapping subsidies for certain sectors. One lawyer from UK-based law firm Volterra Fietta has even suggested that "there may be a number of investors that would have come to the UK expecting to have a certain low wage group of employees", which might sue for loss of expected profit if they lose access to underpaid, foreign workers.

But the clever lawyers don't stop there. They see opportunities for corporations to use Brexit as a way to sue other EU countries too:

Several law firms have published briefings suggesting that it would be an advantage for corporations if they structured their foreign investment into the remaining EU member states through the UK. This means that if you are a German company, for instance, and have an investment in Romania you could let this investment 'flow' through a subsidiary -- possibly only a mailbox company -- in the UK. You could then sue Romania via its bilateral investment treaty with the UK -- even if no such treaty was in place between Romania and Germany.

This kind of "creativity" is yet another reason why tweaks to corporate sovereignty of the kind contemplated by the EU and Canada are simply not enough: ISDS needs to be dropped completely from all trade deals -- past, present and future.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

19 Comments | Leave a Comment..

Posted on Techdirt - 25 September 2017 @ 7:36pm

Scientific Publishers Want Upload Filter To Stop Academics Sharing Their Own Papers Without Permission

from the where-there's-a-gate,-there's-got-to-be-a-gatekeeper dept

Back in March of this year, Techdirt wrote about ResearchGate, a site that allows its members to upload and share academic papers. Although the site says it is the responsibility of the uploaders to make sure that they have the necessary rights to post and share material, it's clear that millions of articles on ResearchGate are unauthorized copies according to the restrictive agreements that publishers typically impose on their authors. As we wrote back then, it was interesting that academic publishers were fine with that, but not with Sci-Hub posting and sharing more or less the same number of unauthorized papers.

Somewhat belatedly, the International Association of Scientific Technical and Medical Publishers (STM) has now announced that it is not fine with authors sharing copies of their own papers on ResearchGate without asking permission. In a letter to the site from its lawyers (pdf), the STM is proposing what it calls "a sustainable way to grow and to continue the important role you play in the research ecosystem". Here's what it wants ResearchGate ("RG") to do:

RG's users could continue "claiming", i.e. agreeing to make public or uploading documents in the way they may have become accustomed to with RG's site. An automated system, utilizing existing technologies and ready to be implemented by STM members, would indicate if the version of the article could be shared publicly or privately. If publicly, then the content could be posted widely. If privately, then the article would remain available only to the co-authors or other private research groups consistent with the STM Voluntary Principles. In addition, a message could be sent to the author showing how to obtain rights to post the article more widely. This system could be implemented within 30-60 days and could then handle this "processing" well within 24 hours.
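Stripped of the diplomatic language, the "automated system" STM describes amounts to a policy lookup at upload time. A minimal sketch of the decision logic (the identifiers, version labels and policy table below are entirely hypothetical; a real system would match manuscripts against publisher metadata at scale):

```python
# Sketch of an upload-filter decision: check the paper's identifier and
# version against a publisher-supplied policy table, then decide whether
# the upload is shared publicly or locked to private groups.

from dataclasses import dataclass

@dataclass
class Upload:
    doi: str
    version: str  # e.g. "preprint", "accepted", "published"

# Hypothetical publisher policy: which versions may be shared publicly.
POLICY = {
    "10.9999/example.123": {"preprint"},               # published PDF restricted
    "10.9999/example.456": {"preprint", "accepted"},
}

def sharing_decision(u: Upload) -> str:
    allowed = POLICY.get(u.doi, set())
    if u.version in allowed:
        return "public"
    return "private"  # visible only to co-authors / private research groups

print(sharing_decision(Upload("10.9999/example.123", "published")))  # private
print(sharing_decision(Upload("10.9999/example.456", "accepted")))   # public
```

Note that the default for anything not in the publishers' table is "private": the gatekeeping happens silently, at the moment of upload, exactly as with the filters proposed in the EU's Copyright Directive.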

In other words, an upload filter, of exactly the kind proposed by the European Commission in its new Copyright Directive. There appears to be a concerted push by the copyright industry to bring in upload filters where it can, either through legislation, as in the EU, or through "voluntary" agreements, as with ResearchGate. Although the lawyer's letter is couched in the politest terms, it leaves no doubt that if ResearchGate refuses to implement STM's helpful suggestion, things might become less pleasant. It concludes:

On behalf of STM, I urge you therefore to consider this proposal. If you fail to accede to this proposal by 22 September 2017, then STM will be leaving the path open for its individual members to follow up with you separately, whether individually or in groups sharing a similar interest and approach, as they may see fit.

What this latest move shows is that publishers aren't prepared to allow academics to share even their own papers without permission. It underlines that, along with fat profits, what the industry is most concerned about in this struggle is control. Academic publishers will graciously allow ResearchGate to exist, but only if they are recognized unequivocally as the gatekeeper.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

33 Comments | Leave a Comment..

Posted on Techdirt - 25 September 2017 @ 9:09am

NSA-Developed Crypto Technology No Longer Trusted For Use In Global Standards

from the I-just-can't-think-why dept

One of the most shocking pieces of information to emerge from the Snowden documents was that the NSA had paid RSA $10 million to push a weakened form of crypto in its products. The big advantage for the NSA was that it made it much easier to decrypt messages sent using that flawed technology. A few months after this news, the National Institute of Standards and Technology announced that it would remove the "Dual Elliptic Curve" (Dual EC) algorithm from its recommendations. But of course, that's not the end of the story. Betraying trust is always a bad idea, but in the security field it's an incredibly stupid idea, since trust is a key aspect of the way things work in that shadowy world. So it should come as no surprise that following the Dual EC revelations, the world's security experts no longer trust the NSA:

An international group of cryptography experts has forced the U.S. National Security Agency to back down over two data encryption techniques it wanted set as global industry standards, reflecting deep mistrust among close U.S. allies.

In interviews and emails seen by Reuters, academic and industry experts from countries including Germany, Japan and Israel worried that the U.S. electronic spy agency was pushing the new techniques not because they were good encryption tools, but because it knew how to break them.

The NSA has now agreed to drop all but the most powerful versions of the techniques -- those least likely to be vulnerable to hacks -- to address the concerns.

The Reuters report has interesting comments from security experts explaining why they opposed the new standards. Concerns included the lack of peer-reviewed publication by the creators, and the absence of either industry adoption or any clear need for the new approaches. There's also the intriguing fact that the UK was happy for the NSA algorithms to be adopted. Given the extremely close working relationship GCHQ has with the NSA, you can't help wondering whether the UK's support was because it too knew how to break the proposed encryption techniques, and therefore was keen for them to be rolled out widely. Certainly, the reason its representative gave for backing the two NSA data encryption methods, known as Simon and Speck, was feeble in the extreme:

Chris Mitchell, a member of the British delegation, said he supported Simon and Speck, noting that "no one has succeeded in breaking the algorithms."

Moreover, the claim was only half-true: the Reuters story says that academics have already had "partial success" in finding weaknesses, which surely calls for a cautious approach and more research, rather than simply accepting the proposal and hoping for the best. And even the British representative had to admit that his NSA mates had totally blown it:

He acknowledged, though, that after the Dual EC revelations, "trust, particularly for U.S. government participants in standardization, is now non-existent."

As the NSA -- and also the W3C, thanks to its blessing of DRM in HTML -- will now find, regaining that lost trust will be a long and difficult process. Maybe others can learn from their (bad) examples.
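For readers curious about what was at stake technically: Speck, one of the two contested NSA designs, is a remarkably compact "add-rotate-xor" cipher, which is exactly what made it attractive for lightweight standardization. Below is a minimal, unofficial Python sketch of the Speck128/128 variant (two 64-bit words, 32 rounds, rotation amounts 8 and 3, per the public specification); it is for illustration only, not a vetted implementation -- and of course the dispute was about trust in the designers, not any publicly known break:

```python
MASK64 = (1 << 64) - 1  # Speck128/128 operates on two 64-bit words

def ror(x, r):
    """Rotate a 64-bit word right by r bits."""
    return ((x >> r) | (x << (64 - r))) & MASK64

def rol(x, r):
    """Rotate a 64-bit word left by r bits."""
    return ((x << r) | (x >> (64 - r))) & MASK64

def speck_round(x, y, k):
    """One Speck round: modular add, rotates, XOR with the round key."""
    x = (ror(x, 8) + y) & MASK64
    x ^= k
    y = rol(y, 3) ^ x
    return x, y

def speck_round_inv(x, y, k):
    """Inverse of speck_round, used for decryption."""
    y = ror(y ^ x, 3)
    x = rol(((x ^ k) - y) & MASK64, 8)
    return x, y

def expand_key(k0, k1, rounds=32):
    """Speck128/128 key schedule: the round function applied to the key words."""
    round_keys, l = [k0], k1
    for i in range(rounds - 1):
        l, k = speck_round(l, round_keys[i], i)
        round_keys.append(k)
    return round_keys

def encrypt(x, y, round_keys):
    for k in round_keys:
        x, y = speck_round(x, y, k)
    return x, y

def decrypt(x, y, round_keys):
    for k in reversed(round_keys):
        x, y = speck_round_inv(x, y, k)
    return x, y
```

A round-trip check (encrypting two words and decrypting them back) confirms the structure. The entire cipher is a dozen lines, which is why it appealed to standards bodies targeting constrained devices -- and why, after Dual EC, its provenance rather than its design was the sticking point.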

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 21 September 2017 @ 6:36am

EU Buried Its Own $400,000 Study Showing Unauthorized Downloads Have Almost No Effect On Sales

from the but-the-truth-is-finally-out-now dept

One of the problems in the debate about the impact of unauthorized downloads on the copyright industry is the paucity of large-scale, rigorous data. That makes it easy for the industry to demand government policies that are not supported by any evidence they are needed or will work. In 2014, the European Commission tried to address that situation by putting out a tender for the following research:

to devise a viable methodology and to subsequently implement it in view of measuring the extent to which unauthorised online consumption of copyrighted materials (music, audiovisual, books and video games) displaces sales of online and offline legal content, gathering comparable systematic data on perceptions, and actual and potential behaviour of consumers in the EU.

The contract was awarded to Ecorys, a "research and consultancy company" based in the Netherlands that has written many similar reports in the past. The value of the contract was a princely €369,871 -- over $400,000. Given that hefty figure, and the fact that this was public money, you might expect the European Commission to have published the results as soon as it received them, which was in May 2015. And yet strangely, it kept them to itself. To find out what had happened to the report, Pirate Party MEP Julia Reda submitted a Freedom of Information (FOI) request. It's worth reading the to and fro of emails between Reda and the European Commission to get an idea of how unhelpful the latter was on this request. The European Commission has now released the report, with the risible claim that this move had nothing to do with Reda's FOI request, and that it was about to publish it anyway.

The 304-page document (pdf), made available on the site, contains all the details of the questions that were put to a total of 30,000 people from Germany, France, Poland, Spain, Sweden, and the UK, their answers, and exhaustive analysis. The summary reveals the key results:

In 2014, on average 51 per cent of the adults and 72 per cent of the minors in the EU have illegally downloaded or streamed any form of creative content, with higher piracy rates in Poland and Spain than in the other four countries of this study. In general, the results do not show robust statistical evidence of displacement of sales by online copyright infringements. That does not necessarily mean that piracy has no effect but only that the statistical analysis does not prove with sufficient reliability that there is an effect. An exception is the displacement of recent top films. The results show a displacement rate of 40 per cent which means that for every ten recent top films watched illegally, four fewer films are consumed legally.

That is, the study found no robust statistical evidence that unauthorized downloads harmed sales of music, books and games. Indeed, for games, there was evidence that such downloads boosted sales:

the estimated effect of illegal online transactions on sales is positive -- implying that illegal consumption leads to increased legal consumption. This positive effect of illegal downloads and streams on the sales of games may be explained by the industry being successful in converting illegal users to paying users. Tactics used by the industry include, for example, offering gameplay with extra bonuses or extra levels if consumers pay.

The research did find evidence that there was some displacement of sales in the film sector. Another result of the Ecorys work provided an explanation of why that might be:

Overall, the analysis indicates that for films and TV-series current prices are higher than 80 per cent of the illegal downloaders and streamers are willing to pay. For books, music and games prices are at a level broadly corresponding to the willingness to pay of illegal downloaders and streamers. This suggests that a decrease in the price level would not change piracy rates for books, music and games but that prices can have an effect on displacement rates for films and TV-series.

In other words, people turn to unauthorized downloads for films and TV because they feel the street prices are too high. For books, music and games, by contrast, the prices were felt to be fair, and therefore people were happy to pay them. This is exactly what Techdirt has been saying for years -- that the best way to stop unauthorized downloads is to adopt reasonable pricing. A new post on the EDRi site points out something rather noteworthy about the research results concerning video and TV series:

Interestingly, these results concerning the film industry found their way to a publication of an academic paper by Benedikt Hertz and Kamil Kiljański, both members of the chief economist team of the European Commission. Yet the other unpublished results, showing no negative impact of piracy in the music, book and games industry, were not mentioned in the paper. Beyond that, the original study itself is not referred to either.

This seems to substantiate suspicion that the European Commission was hiding the study on purpose and cherry-picked the results they wanted to publish, by choosing only the results which supported their political agenda towards stricter copyright rules.

The European Commission was quite happy to publish partial results that fitted with its agenda, but tried to bury most of its research that showed industry calls for legislation to "tackle" unauthorized downloads were superfluous because there was no evidence of harm. This is typical of the biased and one-sided approach taken by the European Commission in its copyright policy, shown most clearly in its dogged support for the Anti-Counterfeiting Trade Agreement -- and of the tilted playing field that those striving for fair copyright laws must still contend with on a regular basis. Sadly, it's too much to hope that the European Commission's own evidence, gathered at considerable cost to EU taxpayers, will now lead it to take a more rational approach to copyright enforcement, and cause it to drop the harmful and demonstrably unnecessary upload filter it is currently pushing for.


Posted on Techdirt - 20 September 2017 @ 3:38am

Free Software Foundation Europe Leads Call For Taxpayer-Funded Software To Be Licensed For Free Re-use

from the public-money,-public-code dept

Free Software Foundation Europe has a new campaign -- "Public money, public code" -- which poses the following question:

Why is software created using taxpayers' money not released as Free Software?

And goes on:

We want legislation requiring that publicly financed software developed for the public sector be made publicly available under a Free and Open Source Software licence. If it is public money, it should be public code as well.

It certainly seems pretty ridiculous that code written for public bodies, whether by external companies or contractors paid by the public purse, or produced internally, should not be released as free software. But aside from this being a question of fairness, the FSFE lists other reasons why it makes sense:

Tax savings

Similar applications don't have to be programmed from scratch every time.

Collaboration

Efforts on major projects can share expertise and costs.

Fostering innovation

With transparent processes, others don't have to reinvent the wheel.

An open letter on the site, supported by dozens of organizations and open for individual signatures, provides a few more:

Free and Open Source Software is a modern public good that allows everybody to freely use, study, share and improve applications we use on a daily basis.

Free and Open Source Software licences provide safeguards against being locked in to services from specific companies that use restrictive licences to hinder competition.

Free and Open Source Software ensures that the source code is accessible so that backdoors and security holes can be fixed without depending on one service provider.

Considered objectively, it's hard to think of any good reasons why code that is paid for by the public should not be released publicly as a matter of course. The good news is that this "public money, public code" argument is precisely the approach that open access advocates have used with considerable success in the field of academic publishing, so there's hope it might gain some traction in the world of software too.


Posted on Techdirt - 14 September 2017 @ 3:38am

Free Software, Open Access, And Open Science Groups Join Fight Against EU Copyright Directive's Terrible Ideas

from the how-to-destroy-Europe's-science-and-tech-future-in-one-easy-move dept

Techdirt has been covering the EU's plans to "modernize" copyright law for years now, and noted how things seem to be getting worse. Two ideas -- the so-called link tax and the upload filter -- are particularly one-sided, offering no benefits for the public, but providing the copyright industry with yet more monopolies and powers to censor. That much we knew. But two new initiatives reveal that the harmful effects are much, much broader than first thought.

The first, dubbed "Save Code Share", comes from the Free Software Foundation Europe (FSFE), and the open source organization OpenForum Europe (disclosure: I am an unpaid Fellow of the associated OpenForum Academy, but have no involvement with the new project). The two groups are concerned about the impact of Article 13 of the draft text (pdf) -- the upload filter -- on coding in Europe, as they explain in a white paper:

large businesses, SMEs and individuals relying on current tools to develop software, especially in FOSS [free and open source software] or collaboratively, could be faced with automated filtering which could engender 'false positive' identifications of infringing software, which in turn could cause developers' dependencies randomly to disappear and so literally "break" their builds, resulting in lost business, lost productivity, less reliable software, and less resilient infrastructure.

The problem they identified is that widely-used code-hosting services built on version control systems (VCS), such as GitHub, seem to meet the definition of "information society services" laid down by the proposed EU Copyright Directive, and as such would be required to filter all uploads to block copyright infringements. Moreover, as the white paper points out, developer Web sites would not only be held responsible for any material uploaded by users without permission of the copyright holders, but would also seem liable for illegitimate distributions of derivative works in violation of the applicable license.

GitHub and other similar services could also be required to sign licensing deals with other copyright holders, although what kind and with whom is totally unclear. That's because the ill-thought-out Article 13 was designed to catch unauthorized uploads of music and videos, not of software; but its current wording is such that it would seem to apply to VCS platforms as much as to YouTube -- a ridiculous situation. Destroying the indigenous software industry in Europe is presumably not the EU's intention here, and so the FSFE and OpenForum Europe call for Article 13 to be deleted completely.

The other new initiative, an open letter from a coalition of European academic, library, education, research and digital rights organizations, agrees, and wants Article 11 -- the link tax -- thrown out too. Here's why:

The extension of this controversial proposal [the link tax] to academic publications, as proposed by the [European Parliament's Industry, Research and Energy] Committee, significantly worsens an already bad situation. It would provide academic publishers additional legal tools to restrict access, going against the increasingly widely accepted practice of sharing research. This will limit the sharing of open access publications and data which currently are freely available for use and reuse in further scientific advances. If the proposed ancillary right is extended to academic publications, researchers, students and other users of scientific and scholarly journal articles could be forced to ask permission or pay fees to the publisher for including short quotations from a research paper in other scientific publications. This will seriously hamper the spread of knowledge.

Similarly, the coalition believes that the upload filter required by Article 13 of the current Copyright Directive draft will have a major, negative impact on the world of open access and open science:

The provisions of Article 13 threaten the accessibility of scientific articles, publications and research data made available through over 1250 repositories managed by European non-profit institutions and academic communities. These repositories, which are essential for Open Access and Science in Europe, are likely to face significant additional operational costs associated with implementing new filtering technology and the legal costs of managing the risks of intermediary liability. The additional administrative burdens of policing this content would add to these costs. Such repositories, run on a not-for-profit basis, are not equipped to take on such responsibilities, and may face closure. This would be a significant blow, creating new risks for implementing funder, research council and other EU Open Access policies.

These latest interventions are important because they show that the reach of the Copyright Directive's worst elements is much wider than originally thought. They emphasize that, by lobbying for these extreme measures, the copyright industry seems not to care what collateral damage it causes in the EU, whether to the public at large, to the local software industry, or to the entire process of scientific research. The white paper and open letter provide additional, compelling reasons why both Article 11 and Article 13 should be dropped from the final legislation. If they aren't, the danger is that the full potential of the huge and rapidly-growing high-tech ecosystem in Europe will be sacrificed in order to prop up the relatively small and sclerotic copyright industries that refuse to adapt to today's digital environment.


Posted on Techdirt - 31 August 2017 @ 7:34pm

Big Ag Gets Ag-Gag Envy, Helps Bring In 'Seed-Preemption' Laws Across The US

from the local-democracy,-who-needs-it? dept

As multiple Techdirt stories attest, farmers do love their "ag-gag" laws, which effectively make it illegal for activists to expose animal abuse in agricultural establishments -- although, strangely, farmers don't phrase it quite like that. Big Ag -- the giant seed and agricultural chemical companies such as Monsanto, Bayer, and DuPont -- seem to have decided they want something similar for seeds. As an article in Mother Jones, originally published by Food and Environment Reporting Network, reports, it looks like they are getting it:

With little notice, more than two dozen state legislatures have passed "seed-preemption laws" designed to block counties and cities from adopting their own rules on the use of seeds, including bans on GMOs. Opponents say that there's nothing more fundamental than a seed, and that now, in many parts of the country, decisions about what can be grown have been taken out of local control and put solely in the hands of the state.

Supporters of the move claim that a system of local seed rules would be complicated to navigate. That's a fair point, but it's hard to believe Big Ag really cares about farmers that much. Some of the new laws go well beyond seeds:

Language in the Texas version of the bill preempts not only local laws that affect seeds but also local laws that deal with "cultivating plants grown from seed." In theory, that could extend to almost anything: what kinds of manure or fertilizer can be used, or whether a county can limit irrigation during a drought, says Judith McGeary, executive director of the Farm and Ranch Freedom Alliance. Along with other activists, her organization was able to force an amendment to the Texas bill guaranteeing the right to impose local water restrictions. Still, the law's wording remains uncomfortably open to interpretation, she says.

You would have thought that farmers would welcome the ability to shape local agricultural laws according to local needs and local factors like weather, water and soil. But apparently ag-gagging activists to stop them doing the same is much more important.


Posted on Techdirt - 31 August 2017 @ 10:47am

Leaked Plans Show Top EU Body Backing Copyright Industry Against The Public, The Internet, And Innovation

from the whatever-happened-to-Estonia's-deep-understanding-of-digital? dept

Techdirt has been covering the slow and painful attempts by the EU to make its copyright laws fit for the digital age for nearly four years now. Along the way, there have been some good ideas, and an astonishingly bad one that would require many online services to filter all uploads to their sites for potential copyright infringements. Despite the widespread condemnation of what is now Article 13 in the proposed Copyright Directive, an important new leak (pdf) published on the Statewatch site shows that EU politicians are still pushing to make the censorship filters mandatory.

The document is an attempt by Estonia, which currently holds the Presidency of the Council of the EU -- one of the three main European Union bodies -- to come up with a revised text for the new Copyright Directive. In theory, it should be a compromise document that takes into account the differing opinions and views expressed so far. In practice, it is a slap in the face for the EU public, whose concerns it ignores, while pandering to the demands of the EU copyright industry.

Estonia's problem is that the whole idea of forcing Web sites to filter uploads contradicts an existing EU directive, one from 2000 on e-commerce. This created a safe harbor for sites that were "mere conduits" or simply hosting material -- that is, took no active part in publishing material online. The Directive explicitly says:

Member States shall not impose a general obligation on providers, when providing the services covered by Articles 12, 13 and 14, to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.

Most of the leaked document is a forlorn attempt to circumvent this unequivocal ban on upload filters:

In order to ensure that rightholders can exercise their rights, they should be able to prevent the availability of their content on such [online] services, in particular when the services give access to a significant amount of copyright protected content and thereby compete on the online content services' market. It is therefore necessary to provide that information society service providers that store and give access to a significant amount of works or other subject-matter uploaded by their users take appropriate and proportionate measures to ensure the protection of copyright protected content, such as implementing effective technologies.

It is reasonable to expect that this obligation also applies when information society service providers are eligible for the limited liability regime provided for in Article 14 of Directive 2000/31/EC [for hosting], due to their role in giving access to copyright protected content. The obligation of measures should apply to service providers established in the Union but also to service providers established in third countries, which offer their services to users in the Union.

In other words, even though Article 14 of the E-commerce Directive provides a safe harbor for companies hosting content uploaded by users, the EU wants to ignore that and make online services responsible anyway, and to require upload filtering, even though that is forbidden by Article 15. Moreover, this would apply to non-EU companies -- like Google and Facebook -- as well. The desperation of the Estonian Presidency is evident in the fact that it provides not one, but two versions of its proposal, with the second one piling on even more specious reasons why the E-commerce Directive should be ignored, even though it has successfully provided the foundation of Internet business activity in the EU for the last 17 years.

Jettisoning that key protection will make it far less likely that startups will choose the EU for their base. Complying with the requirement to filter every single upload for potential infringements will probably be impossible, and certainly prohibitively expensive, while the legal risks of not filtering will be too great. So the Estonian Presidency is essentially proposing the death of online innovation in the EU -- rather ironic for a country that prides itself on being in the digital vanguard.

The leaked document also contains two proposals for Article 11 of the Copyright Directive -- the infamous link tax. One takes all the previous bad ideas for this "ancillary copyright", and makes them even worse. For example, the new monopoly right would apply not just to text publications in any media -- including paper -- but also to photos and videos. In addition, it would make hyperlinks subject to this new publisher's "right". The only exceptions would be those links not involving what is termed a "communication to the public" -- a concept that is so vague that even the EU's top courts can't agree what it means. The other proposal completely jettisons the idea of any kind of link tax, and instead wants to introduce "a presumption for publishers of press publications":

in the absence of proof to the contrary, the publisher of a press publication shall be regarded as the person entitled to conclude licences and to seek application of the measures, procedures and remedies … concerning the digital use of the works and other subject-matter incorporated in such a press publication, provided that the name of the publisher appears on the publication.

This is something that has been suggested by others as providing the best solution to what publishers claim is a problem: the fact that they can't always sue sites for alleged copyright infringement of material they have published, because their standing is not clear. It effectively clarifies that existing copyright law can be used to tackle abusive re-posting of material. As such, it's a reasonable solution, unlike the link tax, which isn't.

The fact that two such diametrically-opposed ideas are offered in a document that is meant to be creating a clear and coherent way forward is an indication of what a mess the whole EU Copyright Directive project remains, even at this late stage. Unfortunately, the Estonian Presidency's unequivocally awful drafts for Article 13 suggest that the EU is still planning to bring in a law that will be bad for the Internet, bad for innovation, and bad for EU citizens.


Posted on Techdirt - 30 August 2017 @ 3:30am

India's Supreme Court Rules Privacy Is A Fundamental Right; Big Ramifications For The Aadhaar Biometric System And Beyond

from the constitutional-core-of-human-dignity dept

In a move that will have major implications for the online world in India and beyond, nine Supreme Court judges have ruled unanimously that privacy is a fundamental right under the Indian Constitution. As part of a decision spanning 547 pages (pdf), they declared:

Privacy is the constitutional core of human dignity.

The case was brought as a result of a legal challenge to India's huge biometric database, Aadhaar, whose rise Techdirt has been charting for some years. A post on the EFF Web site explains the legal background, and why the Supreme Court decision was necessary:

The right to privacy in India has developed through a series of decisions over the past 60 years. Over the years, inconsistency from two early judgments created a divergence of opinion on whether the right to privacy is a fundamental right. Last week's judgment reconciles those different interpretations to unequivocally declare that it is. Moreover, constitutional provisions must be read and interpreted in a manner which would enhance their conformity with international human rights instruments ratified by India. The judgment also concludes that privacy is a necessary condition for the meaningful exercise of other guaranteed freedoms.

Now that a solid constitutional foundation for privacy in India has been affirmed, other judges will proceed with examining the legality of Aadhaar in the light of the many relevant points made in the ruling:

The Aadhaar hearings, which were cut short, are expected to resume under a smaller three- or five-judge bench later this month. Outside of the pending Aadhaar challenge, the ruling can also form the basis of new legal challenges to the architecture and implementation of Aadhaar. For example, with growing evidence that state governments are already using Aadhaar to build databases to profile citizens, the security of data and limitations on data convergence and profiling may be areas for future privacy-related challenges to Aadhaar.

A case challenging WhatsApp's new privacy policy that allows content sharing with Facebook is also certain to be affected by the ruling, but the ramifications go far beyond Aadhaar and the digital world. As an analysis in the Economic Times notes, the judgment could lead to the decriminalization of homosexuality in India, as well as affecting laws that restrict a person's right to convert to a different religion, and state-level rules that impose restrictions on animal slaughter. The breadth of those possible impacts underlines just how epoch-making last week's decision is likely to prove.


Posted on Techdirt - 28 August 2017 @ 5:22pm

CCTV + Lip-Reading Software = Even Less Privacy, Even More Surveillance

from the HAL-would-be-proud dept

Techdirt has written a number of stories about facial recognition software being paired with CCTV cameras in public and private places. As the hardware gets cheaper and more powerful, and the algorithms underlying recognition become more reliable, it's likely that the technology will be deployed even more routinely. But if you think loss of public anonymity is the end of your troubles, you might like to think again:

Lip-reading CCTV software could soon be used to capture unsuspecting customers' private conversations about products and services as they browse in high street stores.

Security experts say the technology will offer companies the chance to collect more "honest" market research but privacy campaigners have described the proposals as "creepy" and "completely irresponsible".

That story from the Sunday Herald in Scotland focuses on the commercial "opportunities" this technology offers. It's easy to imagine the future scenarios as shop assistants are primed to descend upon people who speak favorably about goods on sale, or who express a wish for something that is not immediately visible to them. But even more troubling are the non-commercial uses, for example when applied to CCTV feeds supposedly for "security" purposes.

How companies and law enforcement use CCTV + lip-reading software will presumably be subject to legislation, either existing or introduced specially. But given the lax standards for digital surveillance, and the apparent presumption by many state agencies that they can listen to anything they are able to grab, it would be naïve to think they won't deploy this technology as much as they can. In fact, they probably already have.


Posted on Techdirt - 25 August 2017 @ 3:23am

Repeal All UK Terrorism Laws, Says UK Government Adviser On Terrorism Laws

from the outbreak-of-sanity dept

It's become a depressingly predictable spectacle over the years, as politicians, law enforcement officials and spy chiefs take turns to warn about the threat of "going dark", and to call for yet more tough new laws, regardless of the fact that they won't help. So it comes as something of a shock to read that the UK government's own adviser on terrorism laws has just said the following in an interview:

The Government should consider abolishing all anti-terror laws as they are "unnecessary" in the fight against extremists, the barrister tasked with reviewing Britain’s terrorism legislation has said.

Max Hill, the Independent Reviewer of Terrorism Legislation, argued potential jihadis can be stopped with existing "general" laws that are not always being used effectively to take threats off the streets.

As the Independent reported, the UK government's Independent Reviewer of Terrorism Legislation, Max Hill, went on:

"We should not legislate in haste, we should not use the mantra of 'something has to be done' as an excuse for creating new laws," he added. "We should make use of what we have."

Aside from the astonishingly sensible nature of Hill's comments, the interview is also worth reading for the insight it provides into the changing nature of terrorism, at least in Europe:

Mr Hill noted that some of the perpetrators of the four recent terror attacks to hit the UK were previously "operating at a low level of criminality", adding: "I think that people like that should be stopped wherever possible, indicted using whatever legislation, and brought to court."

This emerging "crime-terror nexus" is one reason why anti-terrorism laws are unnecessary. Instead, non-terrorism legislation could be used to tackle what Hill termed "precursor criminality" -- general criminal activity committed by individuals who could be stopped and prosecuted before they move into terrorism. Similarly, it would be possible to use laws against murder and making explosive devices to hand down sentences for terrorists, made harsher to reflect the seriousness of the crimes.

Even though Hill himself doubts that the UK's terrorism laws will be repealed any time soon, his views are still important. Taken in conjunction with the former head of GCHQ saying recently that end-to-end encryption shouldn't be weakened, they form a more rational counterpoint to the ill-informed calls for more laws and less crypto.


Posted on Techdirt - 21 August 2017 @ 1:28pm

Moving On From Obviously Fake News To Plausibly Fake News Sites

from the did-the-Guardıan-really-write-that? dept

Fake news is old news now. The hope has to be that we have all become slightly more suspicious when we read astonishing stories online (well, we can hope). It also means that those peddling fake news have to work a little bit harder to make us fall for their tricks. Like this:

Fake articles made to look like they have been published by legitimate news websites have emerged as a new avenue for propaganda on the internet, with experts concerned about the increasing sophistication of the latest attempts to spread disinformation. Kremlin supporters are suspected to be behind a collection of fraudulent articles published this year that were mocked up to appear as if they were from al-Jazeera, the Atlantic, Belgian newspaper Le Soir, and the Guardian.

The Guardian report on this new development says that it's not just a matter of getting the typography and layout right: even the domain names are similar. For example, the fake Guardian site's URL replaced the usual "i" in Guardian with the Turkish "ı" -- a tiny change that is easy to miss, especially when it's in a URL.
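The trick is easy to demonstrate. This minimal sketch, using only Python's standard library (the domain strings are illustrative, not the actual spoof addresses), shows how the dotless "ı" (U+0131) passes casual visual inspection but is exposed as soon as the name is converted to the ASCII "punycode" form that DNS actually resolves:

```python
import unicodedata

real = "guardian"
spoof = "guard\u0131an"  # "guardıan", with U+0131 in place of the ASCII "i"

# To the eye the two names are near-identical, but they are different strings.
print(real == spoof)              # False
print(unicodedata.name("\u0131"))  # LATIN SMALL LETTER DOTLESS I

# DNS labels are ASCII-only: internationalized names are converted to
# punycode, which makes the substitution obvious.
print(real.encode("idna"))   # b'guardian' -- unchanged
print(spoof.encode("idna"))  # b'xn--guardan-...' -- the "xn--" prefix flags it
```

This is why some browsers display suspicious internationalized domains in their punycode form rather than as rendered Unicode: the encoded name no longer resembles the brand it is imitating.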

What's particularly problematic with these fake newspaper sites is that their domain names add an extra level of plausibility that makes it more likely the lie will be spread by unsuspecting Internet users. Even when stories are debunked, the online echo of the false information lives on as people re-post secondary material, especially if legitimate sites are fooled and repeat the "news" themselves, lending it a spurious authenticity. Taking down the material can make things worse:

Ren TV, which has a history of producing pro-Kremlin content, did a piece portraying the removal of the article as a deletion by the Guardian of a true article, an angle also taken by an Armenian outlet following the fake Haaretz piece on the Azerbaijani first family.

In other words, deletion might be used as "proof" that powerful forces did not want people to see the "truth". Even though the original is removed, the rumors and conspiracy theories might actually increase as a result. This latest evolution of fake news shows that we are still nowhere near to tackling the problem. Indeed, it looks like things are going to get worse before they get better -- assuming they do.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 18 August 2017 @ 7:39pm

Welcome To The Technological Incarceration Project, Where Prison Walls Are Replaced By Sensors, Algorithms, And AI

from the shocking-new-approach dept

At heart, a prison is a place where freedom is taken away, and inmates are constrained in what they can do. Does that mean a prison has to consist of a special building with bars and prison guards? How about turning the home of a convicted criminal into a kind of virtual prison, where they are limited in their actions? That's what Dan Hunter, dean of Swinburne University's Law School in Melbourne, suggests, reported here by Australian Broadcast News:

Called the Technological Incarceration Project, the idea is to make not so much an internet of things as an internet of incarceration.

Professor Hunter's team is researching an advanced form of home detention, using artificial intelligence, machine-learning algorithms and lightweight electronic sensors to monitor convicted offenders on a 24-hour basis.

The idea is to go beyond today's electronic tagging systems, which provide a relatively crude and sometimes circumventable form of surveillance, to one that is pervasive, intelligent -- and shockingly painful:

Under his team's proposal, offenders would be fitted with an electronic bracelet or anklet capable of delivering an incapacitating shock if an algorithm detects that a new crime or violation is about to be committed.

That assessment would be made by a combination of biometric factors, such as voice recognition and facial analysis.

Leaving aside the obvious and important issue of how reliable the algorithms would be in judging when a violation was about to take place, there are a couple of other aspects of this approach worth noting. One is that it shifts the costs of incarceration from the state to the offender, who ends up paying for his or her upkeep in the virtual prison. That would obviously appeal to those who are concerned about the mounting cost to taxpayers of running expensive prisons. The virtual prison would also allow offenders to remain with their family, and thus offers the hope that they might be re-integrated into society more easily than when isolated in an unnatural prison setting. Irrespective of any possible financial benefits, that has to be a good reason to explore the option further.

Follow me @glynmoody on Twitter, and +glynmoody on Google+


Posted on Techdirt - 14 August 2017 @ 7:13pm

Danish University And Industry Work Together On Open Science Platform Whose Results Will All Be Patent-Free

from the they-said-it-couldn't-be-done dept

Here on Techdirt, we write a lot about patents. Mostly, it's about their huge downsides -- the stupid patents that should never have been awarded, or the parasitic patent trolls that feed off companies doing innovative work. The obvious solution is to get rid of patents, but the idea is always met with howls of derision, as if the entire system of today's research and development would collapse, and a new dark age would be upon us. It's hard to refute that claim with evidence to the contrary because most people -- other than a few brave souls like Elon Musk -- are reluctant to find out what happens if they don't cling to patents. Against that background, it's great to see Aarhus University in Denmark announce a new open science initiative that will eschew patents on researchers' work completely:

The platform has been established with funds from the Danish Industry Foundation and it combines basic research with industrial innovation in a completely new way, ensuring that industry and the universities get greater benefit from each other's knowledge and technology.

University researchers and companies collaborate across the board to create fundamental new knowledge that is constantly made available to everyone -- and which nobody may patent. On the contrary, everyone is subsequently freely able to use the knowledge to develop and patent their own unique products.

According to Aarhus University, Danish industry loves it:

The idea of collaborating in such a patent-free zone has aroused enormous interest in industry and among companies that otherwise use considerable resources on protecting their intellectual property rights.

The attraction seems to be that an open platform will make it easier for companies -- particularly smaller ones -- to gain access to innovative technologies at an early stage, without needing to worry about patents and licensing. Aarhus University hopes that the approach will also allow researchers to take greater risks with their work, rather than sticking with safer, less ambitious projects, as has happened in the past. The first example is already up and running. It is called SPOMAN (Smart Polymer Materials and Nano-Composites), and has a project page hosted on the Open Science Framework site:

In this project, you will find minutes from the Open Science meetings, current status of the initiative, general presentations etc. More importantly, this project has links to the individual activities and research projects under Open Science. In these projects, the research progress, lab journals and more are found.

Combined with the no-patent promise, that is about as open as research gets.

Follow me @glynmoody on Twitter, and +glynmoody on Google+

