Capitalist Lion Tamer’s Techdirt Profile


About Capitalist Lion Tamer, Techdirt Insider

List of blogs started with enthusiasm, which now mostly lie dormant:

[reserved for future use]

[recently retired]

[various side projects]

Posted on Techdirt - 29 April 2016 @ 7:39pm

Redaction Failure In FTC/Amazon Decision Inadvertently Allows Public To See Stuff It Should Have Been Able To See Anyway

from the not-even-good-enough-for-government-work dept

A court has found that Amazon engaged in deceptive practices by failing to obtain "informed consent" for in-app charges, especially in apps targeted at children. The finding is perhaps unsurprising, as the world of microtransactions relies on keeping the number of steps between app makers (and app purveyors like Amazon) and users' wallets to a minimum.

What's more surprising is the opinion itself, which arrived in redacted form. Both the FTC and Amazon obviously wanted to keep parts of the opinion from being made public. The problem is that whoever handled the redaction process blew it.

Judge John Coughenour released two rulings -- a complete decision, which was marked as "sealed," and a decision for the public, which was marked as "redacted." That redacted version has large swaths of text covered with black bars, but the opinion can be read in its entirety by copying and pasting it into another file.
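This failure mode is easy to demonstrate. Below is a minimal sketch built around a hand-written, uncompressed PDF-style content stream (a toy assumption -- the real opinion is a full PDF file, and the quoted sentence is invented for illustration). A black rectangle is painted over the text, but the text-showing operator still carries the words, which is exactly why copy-and-paste defeats the black bars.

```python
import re

# Toy, uncompressed PDF content stream (illustrative only): the Tj
# operator draws text, then "re ... f" paints a black rectangle on top.
content_stream = b"""
BT /F1 12 Tf 72 700 Td (This sentence was meant to be sealed) Tj ET
0 0 0 rg 70 695 300 16 re f
"""

# The rectangle hides nothing from a parser: the text operand is still
# sitting in the file, so the "redacted" words extract cleanly.
hidden = re.findall(rb"\((.*?)\) Tj", content_stream)
print(hidden[0].decode())  # -> This sentence was meant to be sealed
```

Proper redaction tools remove the text operands themselves before flattening the file; drawing an opaque shape over live text only changes how the page looks, not what it contains.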
The unintended consequence of this screw-up is that the public can now see what the government and Amazon wanted to prevent the public from knowing -- which is exactly the sort of stuff the public should know, as Public Citizen's Scott Michelman explains.
The redactions included a good deal of information that was central to the court's decision, including the evidence showing what Amazon officials knew and when, the FTC's estimate of damages, the length of the injunction the FTC was seeking, and more. All of these are of great public importance to understanding what Amazon was doing, what the FTC argued to the court, and why the court ruled as it did.
It's not as though any sort of trade secrets or confidential government techniques are hiding behind the redactions. Much of what is redacted appears to have been for the benefit of Amazon, which does not come out of this surprise un-redaction looking good.
[I]n developing its Kindle Fire tablet, Amazon identified "soccer parents" as a key target customer base, referring to them as "low-hanging fruit." (Dkt. No. 121 at 8; see also Dkt. No. 122 at 3.)


[T]he evidence demonstrates that Amazon was aware that many customers did not understand in-app purchases when they were first implemented. In a confidential document regarding Amazon's marketing plan for launching in-app purchases, the company acknowledged that "'IAP' isn't a concept widely known by customers." (Dkt. No. 120 at 5.) And, despite its assertion that "[c]ustomers are not looking for apps based on how much they cost," the company was aware that customers' top searches in selecting apps indicate that customers were seeking free apps to use. (Id. at n. 2; five of the top searches included the word "free.") Amazon was aware that in many instances, the person initiating the in-app purchase was a child: in a document discussing company strategy to promote increases in in-app purchasing, Amazon acknowledged "the disconnect between the account owner (e.g., parent) and the app user (e.g., child)."


Moreover, regardless of its reputation for customer service, it is Amazon's stated policy that in-app purchases are final and nonrefundable, likely discouraging much of its customer base from attempting to seek refunds in the first place. (See Dkt. No. 127 at 275.) ("Yeah, that's the – that's our official policy, is digital content's not refundable.")


Amazon has received many complaints from adults who were surprised to find themselves charged for in-app purchases made by children. By December 2011, Aaron Rubenson referred to the amount of customer complaints as "near house on fire." (Dkt. No. 115 at 19.) Rubenson also referred to "accidental purchasing by kids" as one of two issues the company needed to solve. (Id.) Rubenson additionally stated that "we're clearly causing problems for a large percentage of our customers."
Also "withheld" is the FTC's justification of its damages estimate.
Julie Miller, a lead FTC data analyst, calculated the total in-app purchase revenue and refund amounts for seven different categories: (1) orders of $20 or more in High-Risk Non-Casino apps from the earliest date available to March 25, 2012, (2) orders of $19.99 and below in High-Risk Non-Casino apps from the earliest date available to February 5, 2013, (3) orders of $19.99 and below in High-Risk Non-Casino apps from February 6–April 30, 2013 excluding those on the “Otter” device, (4) orders of $19.99 and below in High-Risk Non-Casino apps from May 1–July 30, 2013 excluding those on the Otter device, (5) orders of $19.98 and below in High-Risk Non-Casino apps from July 31, 2013–June 3, 2014 excluding those on the Otter device, (6) orders of $19.99 and below in High-Risk Non-Casino apps from February 6–October 9, 2013 on the Otter device, and (7) orders of $0.99 and below in High-Risk Non-Casino apps from October 10, 2013 to the latest date available on the Otter device. (Id.) These categories were selected in order to omit authorized charges. This calculation gave Ms. Miller a total of charges made without authorization by password. Ms. Miller calculated $86,575,321.38 in revenue and also found that $10,060,646.48 was provided in refunds. (Dkt. No. 110 at 3.) Ms. Miller then calculated an “unauthorized charge rate,” the rate at which users failed to properly enter a password in initiating an in-app purchase as a percentage of the overall total.
Amazon's rebuttal of the FTC's math is redacted...
Amazon argues that Ms. Miller’s estimate is so “fundamentally flawed” as to not be able to support a finding of substantial injury. (Dkt. No. 179 at 18.) In so arguing, Amazon primarily takes issue with Ms. Miller’s calculation of an “unauthorized charge rate.” (Id.) In taking the number of password entry “failures” and dividing that by the total number of password prompts presented, the FTC argues that it identified a “reasonable proxy for the rate at which children would incur an in-app charge without consent . . . when password entry was not required.” (Dkt. No. 184 at 18.) Amazon asserts that this rate calculation “assumes that every single password failure was an attempt by a child that would otherwise have been a completed in-app purchase.” (Dkt. No. 179 at 18.) This point is well taken: many password “failures” could have occurred because the user got distracted, changed his or her mind, or simply could not remember their password. However, it is reasonable to assume that of the group of users faced with a password prompt who ultimately failed to provide a password, many were children who, absent a password prompt, would have gone on to complete an in-app purchase.

Also redacted is the court's partial agreement with Amazon's assessment of the FTC's assessment. [redacted portion in bold]
While, as discussed above, the general methods used by the FTC to reasonably approximate the damages to consumers by unauthorized in-app charges serve as a fair starting place, the Court finds that the unauthorized charge rate of 42% is too high. The Court has received Amazon’s “Adjustments to the FTC’s Estimates of Injury and Monetary Relief” (Dkt. No. 221 at 2) and invites further briefing on the issue of the scope of appropriate monetary relief.
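For a sense of the dollars at stake, here is a back-of-the-envelope sketch combining the figures quoted above. The excerpts don't spell out exactly how the FTC combined these numbers, so applying the disputed 42% rate to revenue net of refunds is an assumption made purely for illustration.

```python
# Figures quoted from the opinion; the combination below is a guess.
revenue = 86_575_321.38   # in-app revenue across Ms. Miller's seven categories
refunds = 10_060_646.48   # refunds Amazon had already provided
rate = 0.42               # the "unauthorized charge rate" the court called too high

net = revenue - refunds   # revenue net of refunds
estimate = net * rate     # hypothetical unauthorized-charge estimate
print(f"net revenue: ${net:,.2f}")            # net revenue: $76,514,674.90
print(f"estimate at 42%: ${estimate:,.2f}")   # estimate at 42%: $32,136,163.46
```

Even on these rough numbers, the difference between a 42% rate and a meaningfully lower one is worth tens of millions of dollars, which helps explain why both sides fought to keep the math under seal.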
Also redacted is the FTC's declaration of how long it felt Amazon should remain under the government's supervision.
The injunction sought would subject Amazon to government oversight for twenty years.
While FTC intervention has resulted in better refund policies and better notification about in-app purchases, the fact is that app makers are just as culpable as Amazon -- even if it's Amazon that will be paying the fines. There was no line of app developers at Amazon's door demanding better protections for app users. And Amazon is hardly alone in its targeting of low-hanging soccer parent fruit. When it comes to monetization of microtransactions, the lack of purchase controls is a feature, not a bug.

Then there's the question of whether we really want the government to be in the business of designing app store front-ends. While the concerns central to this case are valid, the best solution isn't necessarily the FTC setting itself up as an additional middleman for in-app purchases -- especially not for the next 20 years.

And, as for this opinion, it just goes to show courts are still far too willing to grant ridiculous redaction requests from plaintiffs and defendants -- a practice that further separates the public from the government that's supposed to be serving it.


Posted on Techdirt - 29 April 2016 @ 6:18pm

Scientists Looking To Fix The Many Problems With Forensic Evidence

from the can-you-fix-the-people-performing-the-tests? dept

Everything everyone saw in cop shows as evidence linking people to crimes -- the hair left on someone's clothing, the tire tracks leading out to the road, the shell casings at the scene, etc. -- is all proving to be about as factual as the shows themselves.

While much of it is not exactly junk science, most of it is of limited worth. What appears to indicate guilt carries enough of a margin of error that it could very easily prove otherwise. Science Magazine is taking a look at the standbys of forensic science and what's being done to ensure better presentations of evidence in the future.

On a September afternoon in 2000, a man named Richard Green was shot and wounded in his neighborhood south of Boston. About a year later, police found a loaded pistol in the yard of a nearby house. A detective with the Boston Police Department fired the gun multiple times in a lab and compared the minute grooves and scratches that the firing pin and the interior of the gun left on its cartridge casings with those discovered on casings found at the crime scene. They matched, he would later say at a pretrial hearing, “to the exclusion of every other firearm in the world.”


So how could the detective be sure that the shots hadn’t been fired from another gun?

The short answer, if you ask any statistician, is that he couldn’t. There was some unknown chance that a different gun struck a similar pattern. But for decades, forensic examiners have sometimes claimed in court that close but not identical ballistic markings could conclusively link evidence to a suspect—and judges and juries have trusted their expertise. Examiners have made similar statements for other forms of so-called pattern evidence, such as fingerprints, shoeprints, tire tracks, and bite marks.
Six years ago, the National Academy of Sciences found that these forensic standbys had a much larger margin of error than was portrayed in court by detectives and expert witnesses. It recommended the margin of error be delivered along with the testimony to head off future verdicts based on faulty evidence.

To date, not much has changed. While actual junk science like bite marks has largely been discarded by prosecutors, the others remain, even as their reliability has been constantly questioned. The FBI loved hair analysis, right up to the point that it determined its witnesses had overstated test results 90% of the time in the two decades prior to 2000.

Even fingerprints, which have long been considered unassailable because of their supposed uniqueness, aren't much better. Some of it has to do with the presumption that every fingerprint is so unique that even a partial print can definitively identify or eliminate a suspect. The rest of its issues lie with those matching the prints.
One study of 169 fingerprint examiners found 7.5% false negatives—in which examiners concluded that two prints from the same person came from different people—and 0.1% false positives, where two prints were incorrectly said to be from the same source. When some of the examiners were retested on some of the same prints after 7 months, they repeated only about 90% of their exclusions and 89% of their individualizations.
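Those rates sound reassuringly small, but a short sketch shows why the NAS wants error rates delivered alongside testimony. Only the 7.5% and 0.1% figures come from the study quoted above; the database size and the single true source are hypothetical assumptions chosen for the arithmetic.

```python
# Study figures (quoted above): 7.5% false negatives, 0.1% false positives.
false_negative_rate = 0.075
false_positive_rate = 0.001
sensitivity = 1 - false_negative_rate   # chance the true print is matched

# Hypothetical search: one true source hiding among 10,000 candidates.
candidates = 10_000
true_sources = 1

expected_true_hits = true_sources * sensitivity
expected_false_hits = (candidates - true_sources) * false_positive_rate

# Of all reported "matches", what fraction actually point at the source?
ppv = expected_true_hits / (expected_true_hits + expected_false_hits)
print(f"expected false matches: {expected_false_hits:.0f}")
print(f"chance a reported match is the true source: {ppv:.1%}")
```

This is the same base-rate arithmetic that makes "to the exclusion of every other firearm in the world" an untenable claim: a small per-comparison error rate, multiplied across a large pool of candidates, produces far more false hits than true ones.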
NIST has given $20 million to the Center for Statistics and Applications in Forensic Evidence (CSAFE) to come up with a better way to present this sort of evidence -- one that clearly accounts for any uncertainties in the results or processes. CSAFE is still trying to figure out how to present this as a number or rating. But that might not be the only problem. The other issue is that specifics about forensic reliability may not play much of a part in how juries and judges decide guilt or innocence.
In a 2013 study, for instance, online participants had to rate the likelihood of a defendant’s guilt in a hypothetical robbery based on different kinds of testimony from a fingerprint examiner. It didn’t seem to matter whether they were simply told that a print at the scene “matched” or was “individualized” to the defendant, or whether the examiner offered further justification—the chance of an error is “so remote that it is considered to be a practical impossibility,” for example. In all those cases, jurors rated the likelihood of guilt at about 4.5 on a 7-point scale. “As a lawyer, I would have thought the specific wording would have mattered more than it did,” Garrett says. But if subjects were told that the print could have come from someone else, they seemed to discount the fingerprint evidence altogether.
The other part of the problem is the people who perform the tests. Multiple incidents where evidence was falsified or not properly tested have been uncovered. The evidence is only as good as the processes, and if steps are skipped because of sloppiness or laziness, the evidence's credibility becomes highly questionable -- not just for the specific instance where results were faked, but for every test this person has touched.

There's no possible way to eliminate honest errors, much less prevent anyone from falsifying results. In both cases, the problems are caught after the damage has been done. Humans are the most unpredictable part of the chain of evidence but also an irreplaceable part. CSAFE will be working with forensics labs to create best practices, but it can do nothing to prevent the lazy and/or incompetent from completely ignoring the proper steps.

Problems are also present higher up the chain. When bad science or bad practices result in questionable evidence, it's often extremely difficult to have convictions resulting from them overturned.
What’s troubling, [federal judge Nancy] Gertner says, is that when judges accept junk science, an appeals court rarely overrules them. Attaching a numerical probability to evidence, as CSAFE hopes to do, “would certainly be interesting,” she says. But even a standard practice of critically evaluating evidence would be a step forward. “The pattern now is that the judges who care about these issues are enforcing them, and the judges who don’t care about these issues are not.”
In this way, the courts are no better than labs where shoddy work is done. Variations in personality undermine the dispassionate nature of science, making it susceptible to human prejudices rather than the strength of the evidence itself.


Posted on Techdirt - 29 April 2016 @ 3:42pm

FBI Used FISA Warrant To Prosecute Boeing Employee For Child Porn Possession

from the regular-crime;-special-warrant dept

Ellen Nakashima of the Washington Post has the disturbing story of former Boeing employee Keith Gartenlaub, whose home was searched for evidence of his alleged spying for the Chinese. Specifically, the FBI was looking for documents about the military's C-17 transport plane. Instead, FBI agents came across something else.

[S]ince the search in January 2014, no spy or hacking charges have been brought against him.

Instead, seven months later, he was charged with the possession and receipt of child pornography. He has denied the charges, but a jury convicted him in December.
Questions have been raised about the evidence obtained during the search.
In Gartenlaub’s case, the defense unsuccessfully argued that he could not be linked to identical copies of child pornography videos found on four hard drives in his house. Two of the hard drives had been in a computer that was kept at a beach house where numerous people had access to it, Gartenlaub said.


Jeff Fischbach, a forensic technologist for the defense, said there is no evidence that the child pornography was ever seen by anyone who used the computer, much less Gartenlaub.

The government’s own forensic expert, Bruce W. Pixley, said he could not find any evidence of the material being downloaded onto any of the computers, the defense noted. That means it had to have been copied onto the computer — but by whom is unknown.
The defense had more difficulty than usual in challenging the evidence. The search wasn't performed with a standard FBI warrant, but instead -- due to its supposed national security implications -- with a warrant issued by the FISA court. That the FBI found child pornography instead is unfortunate, but that fact shouldn't nullify the original warrant or result in the suppression of the evidence, at least according to the DOJ.

While the DOJ is correct that the FBI wasn't going to call off the search after it uncovered evidence of other wrongdoing, its defense of the way the evidence was obtained is disingenuous. Unlike a regular warrant, a FISA warrant is almost completely unchallengeable. The entire process is ex parte, including the submission of evidence obtained -- even if the evidence has nothing to do with national security.

In Gartenlaub's case, every submission by the government was done under seal. His legal representation had no access to the government's presentation of evidence. The possession of child porn is certainly nothing the government takes lightly, but once the focus of the investigation shifted away from alleged espionage, the process likewise should have changed. At the very least, the FBI should have had a new warrant issued, signed by a regular magistrate judge -- one that would have allowed the defense to examine the affidavit and the results of the search.

JoAnne Musick of Fault Lines points out just how much the FISA Court's involvement screwed Gartenlaub.
Once the warrant issued, there was virtually no means by which Gartenlaub could challenge the basis for the warrant. Of course, the court found the pornography material “obtained pursuant to FISA was lawfully acquired” and did not violate the defendant’s Fourth Amendment rights. Additionally, after ex parte pre-trial briefings between the court and government, the judge found:

"[T]here is no indication of any false statements having been included in the FISA materials."

Surely the government would not have proven any false statements in their private discussions with the court. Perhaps had the defense had an opportunity to review or challenge the basis for the warrant, the court might have found false statements. Yet, we will never know as the defense was unable to review the evidence or otherwise challenge it. It’s disturbing that the accused was unable to obtain even basic information on how the information was obtained and why the warrant was issued.
The ability to challenge presented evidence is a key part of the justice system. Wrongs committed by the government during the search for evidence can only be righted through this process. But the use of a FISA warrant deprives the accused of that potential remedy. When it became apparent the investigation was no longer focused on matters of national security, the FBI should have unsealed documents and turned over evidence to Gartenlaub's legal reps. Instead, it chose to keep operating under the pretense it was investigating espionage and availed itself of all the advantages that come with national security-related investigations.

Then there's this: even though the FBI had enough evidence of child porn possession to prosecute (successfully) Gartenlaub and nothing in the way of evidence he was involved in spying for the Chinese, it still attempted to leverage what it had obtained to turn Gartenlaub into a government informant.
During his initial appearance in a federal courthouse in Santa Ana, Calif., the prosecutors indicated a willingness to reduce or drop the child pornography charges if he would tell them about the C-17, said Sara Naheedy, Gartenlaub’s attorney at the time.
So, not only did the government use its additional national security benefits to keep Gartenlaub from mounting a serious challenge to submitted evidence, but it also used evidence it gathered with an unrelated search to pressure him into admitting he was a spy -- something it had no evidence of at all.


Posted on Techdirt - 29 April 2016 @ 12:41pm

Supreme Court Approves Rule 41 Changes, Putting FBI Closer To Searching Any Computer Anywhere With A Single Warrant

from the impeccable-timing dept

The DOJ is one step closer to being allowed to remotely access computers anywhere in the world using a normal search warrant issued by a magistrate judge. The proposed amendments to Rule 41 remove jurisdiction limitations, which would allow the FBI to obtain a search warrant in, say, Virginia, and use it to "search" computers across the nation using Network Investigative Techniques (NITs).

This won't save evidence obtained in some high-profile cases linked to the FBI's two-week gig as child porn site administrators. Two judges have ruled that the warrants obtained in this investigation are void due to Rule 41(b) jurisdiction limitations. (Another judge has reached the same conclusion in an unrelated case in Kansas.) The amendments recently approved by the US Supreme Court would strip away the jurisdiction limitation, making FBI NIT use unchallengeable, at least on jurisdiction grounds.

Rule 41. Search and Seizure

(b) Venue for a Warrant Application. At the request of a federal law enforcement officer or an attorney for the government:

(6) a magistrate judge with authority in any district where activities related to a crime may have occurred has authority to issue a warrant to use remote access to search electronic storage media and to seize or copy electronically stored information located within or outside that district if:

(A) the district where the media or information is located has been concealed through technological means; or

(B) in an investigation of a violation of 18 U.S.C. § 1030(a)(5), the media are protected computers that have been damaged without authorization and are located in five or more districts.
The DOJ claims the updates are needed because suspects routinely anonymize their connections, making it difficult to determine where they're actually located. Opponents of the changes point out that this significantly broadens the power of magistrate judges, who would now be able to approve search warrants targeting any computer anywhere in the world.

The real problem, though, is this: there's no significant Congressional opposition (save Ron Wyden) to the proposed amendments.
“These amendments will have significant consequences for Americans’ privacy and the scope of the government’s powers to conduct remote surveillance and searches of electronic devices. I plan to introduce legislation to reverse these amendments shortly, and to request details on the opaque process for the authorization and use of hacking techniques by the government,” said Wyden.

“Under the proposed rules, the government would now be able to obtain a single warrant to access and search thousands or millions of computers at once; and the vast majority of the affected computers would belong to the victims, not the perpetrators, of a cybercrime. These are complex issues involving privacy, digital security and our Fourth Amendment rights, which require thoughtful debate and public vetting. Substantive policy changes like these are clearly a job for Congress, the American people and their elected representatives, not an obscure bureaucratic process.”
Worse, the amendments will be adopted if Congress does what it frequently does best: nothing. Congress actually needs to take action to block the amendments, but seeing as it only has until December 1, 2016, to do it, it seems highly unlikely that it will make the effort to do so -- not during an election year and certainly not during the annual struggle of approving a budget.

On the bright side, Ron Wyden is generally pretty good at mobilizing opposition, even when there appears to be little support for his efforts. We can also expect a variety of civil liberties groups and activists to start pushing Congress to "opt out" of the proposed changes.


Posted on Techdirt - 29 April 2016 @ 8:32am

Reputation Management Revolution: Fake News Sites And Even Faker DMCA Notices

from the the-dishonest-leading-the-dishonest-into-a-new-world-of-unaccountability! dept

Pissed Consumer has uncovered another apparent case of bad reputation management, this one revolving around bogus websites facilitating bogus DMCA takedowns. It previously exposed a pair of lawyers using shell companies and highly-questionable defamation lawsuits to force Google to delist negative reviews hosted around the web. These faux litigants always managed to not only find the supposed "defamers," but to also obtain a signed admission within 48 hours of the lawsuit being filed -- a process that usually takes weeks or months, especially if the alleged "defamer" utilizes anything other than their real name when posting negative reviews.

In this case, the reputation management scheme involves the use of hastily-set up "news" sites that contain a blend of scraped content and negative reviews hosted at sites like Yelp, Ripoff Report and Pissed Consumer.

Frankfort Herald is a newspaper website that, despite its trustworthy name, never really existed, for all intents and purposes, before January 2016. However, this did not stop them from sending a DMCA notice to Google claiming that they were the owners of the copyrighted material from Pissed Consumer that was published back in 2012.

On April 15, 2016, Pissed Consumer received a takedown notice for a review, in which the sender claimed that they originally wrote the piece of news in question back on January 5, 2012. The review is about Brad Kuskin, and they claimed they had it published only 2 days prior to the article appearing on [Pissed Consumer].
Here's the supposed news article Frankfort Herald claims it owns in its bogus DMCA takedown notice.

The scheme is just as stupid as convicted fraudster Sean Gjerde's rep management Hail Mary: post copies of reviews or articles you want to see vanished at your own website and then issue DMCA notices claiming you own the words of others. It seldom works and tends to draw more attention to the content someone's trying to hide. (Of course, Sean Gjerde went the extra mile and tried to have the FBI's press release about his conviction delisted by Google…)

That's not the only negative content masquerading as "news" at the Frankfort Herald. There's also a negative Yelp review about a Spanish language school, a Ripoff Report review of a Georgia law firm and a CBS story about an apparent scam artist who suckered parents into shelling out thousands of dollars by pretending he was scouting talent for Disney. Disney disavowed any connection to the event. All of these have been targeted by bogus takedown notices under several names linked to the definitely-not-a-local-news-site "Frankfort Herald."

Whoever's behind that site has issued bogus takedown notices under the name "Heart Broadcasting" (a name that can only be found in the Frankfort Herald's site footer), "Frankfort Herald News Corp.," and "Frankfort News Corp." Perhaps most idiotically, it has co-opted the name of one of the world's biggest publishers in hopes of giving its bogus takedowns a veneer of respectability: "Hearst Media LLC."

Other fake "news" sites containing a jumble of scraped content and completely unrelated negative reviews have also issued bogus takedown notices within the last 30 days.

AthaNews sent one on March 25th where the sender claims the following is the result of their journalistic efforts:

Bought a house from Lala Ragimov and her “Developer” Husband “Tod”. On the surface their renovatinos seem solid but there were several red flags that I now wish we listened to. 1) “The Ragimov’s” are effectively the same entity. The claim of a seperate relator vs. develoiper and the games they play about “checking with the developer” are a joke. They are husband and wife! 2) We were told our roof was new but the condition was listed as “unknown” in discolsures. We were told this is common since the roof was repaired not replaced. The building was also conviently too tall to bring an inspector with a ladder without a special fee. The result? Leaks almost immideatley! [...]
Of course, the alleged infringer is none other than Ripoff Report, which shamelessly claimed this "journalist's" misspelling-laden "exposé" of a local realtor as its own. [eyeroll] AthaNews' mission statement -- found in the website's footer -- is lorem ipsum translated into English.

SEI World News is doing the same thing. It issued a DMCA notice to Google on April 7th, claiming one of its "news articles" was being "copied."

I am senior editor and my article is copied . Just to harm my reputation online . The article owner anonymously copied my content . Please look into this matter .
Once again, Ripoff Report is home to the targeted URL. SEI World has been playing this game for several months now, targeting negative reviews at other sites with bogus claims of "copied" articles.

Searching Google's DMCA database using Ripoff Report as the target uncovers all sorts of "news" sites claiming negative reviews hosted elsewhere are the genuine byproduct of their journalistic endeavors. "Mass Communications Inc.?" Bogus takedown of a Ripoff Report review. Some site called "Global Girl Magazine" wants Ripoff Report to stop ripping off its "journalist's" work -- which is apparently something about a fund manager with an alleged penchant for scamming clients after taking their retainer fees, written in the first person. The same thing goes for the "Lewisburg Tribune." And so on...

The clustering of DMCA notices seems to point to a single reputation management bozo pulling the strings on multiple websites like a more focused Patrick Zarrelli. On the other hand, the scattershot approach and slippery grasp of the English language exhibited in the DMCA notices may indicate this is nothing more than a bunch of Fiverr freelancers making reputation management promises they can't keep. In some cases, it appears to have worked. Several of the bogus takedowns show Google has taken action and delisted links. But those victories will only be temporary. Any challenge from a legitimate site should see these decisions swiftly reversed.


Posted on Techdirt - 29 April 2016 @ 3:30am

USTR: Foreign Governments Engaging In Censorship And Rights Abuses Should Add IP Enforcement To Their 'To Do' Lists

from the let-the-USTR-set-your-priorities-for-you dept

If it's mid-spring, it means it's time for the US Trade Representative's "Special 301 Report," the annual "event" that names and shames countries who don't live up to US industries' intellectual property protection ideals. The same countries that have made the list for years still make the list, although a few have moved up a notch from the "Priority Watch" list to just the normal "Watch" list.

There are lots of familiar names on the lists, including such perennial favorites as China, India, Russia and… Canada. The report offers congratulations to countries like Italy, which has managed to steer clear of the watchlists by instituting censorious IP enforcement procedures like site-blocking. And it pats other countries on the head for ceding to the USTR's IP imperialism in exchange for upgraded 301 listings.

USTR has noted the willingness of two Watch List countries, Turkmenistan and Tajikistan, to work with the United States on improving their IPR protection and enforcement regimes and will conduct an OCR for each country to evaluate whether specific steps taken merit their removal from the Watch List.
The USTR has no interest in determining whether the US's IP laws are actually a good fit for other nations, especially those with a host of more pressing problems. All it cares about is whether they live up to the American ideal, as stated by the loudest "more-is-better" IP enforcement proponents. All in all, it's a completely ridiculous bit of paper rattling, served up annually for maximum theatricality.

Sadly, many of those who have landed on the USTR's Naughty 301 list take this process far too seriously. Even at its gravest, the USTR's only real threat is that if things don't change, it will be forced to print out Country X's name under a different bold sub-header in next year's report.
USTR extends the current OCR of Paraguay, which is currently on the Watch List, to provide additional time for conclusion of a bilateral IPR Memorandum of Understanding (MOU). USTR encourages Paraguay to conclude the MOU by June 30, 2015, and notes that if Paraguay does not do so, USTR will evaluate possible implications accordingly, including with respect to Paraguay’s status under Special 301.
Meanwhile, the USTR wants governments with histories of human rights abuses to institute stricter IP-related policies -- ones that should better aid them in achieving their censorious ends. Thailand, which has already put mass internet surveillance in place to make sure its king remains unbesmirched, is encouraged to put its police force to use to round up infringers. Pakistan, itself engaged in censorship and mass surveillance of its citizens, is told it should hand over ex officio power to law enforcement to move against infringers without having to wait around for rights holder complaints. Ecuador, which already knows a thing or two about abusing the DMCA process, is elevated to the "Priority" list for not treating other nations' IP as worthy of the same sort of censorious actions. The USTR wants Mexico to divert law enforcement resources to combating counterfeiting and piracy, as if dealing with the consequences of four decades of US drug warring wasn't enough to keep it busy. And the USTR issues demands to Venezuela, as if that dumpster fire of a government has any interest in listening to what a US representative has to say -- especially one acting on behalf of a handful of US industries.

Like every year, the report is a joke. And it's not even a report -- not in the normal definition of the word. There's no independent action by the USTR to investigate IP laws and violations elsewhere in the world. Instead, it relies on submissions from entities like the MPAA and BSA and writes their accusations up as a "report" on the state of IP protections elsewhere in the world. Unfortunately, there aren't enough countries in on the joke. Canada, for one, at least issues nothing more than an eye roll in response to being listed as one of the world's top offenders, despite having IP laws at least as stringent as the United States'. And there's something both surreal and ugly about a process that includes the executive vice president of the American Apparel and Footwear Association -- whose members depend heavily on cheap foreign labor -- complaining that other countries aren't doing enough to prevent citizens from purchasing affordable knockoffs of the same clothes they're making for US companies, but can't actually afford to buy.

Read More | 12 Comments | Leave a Comment..

Posted on Techdirt - 28 April 2016 @ 10:38am

So Much For The Fifth Amendment: Man Jailed For Seven Months For Not Turning Over Password

from the enjoying-your-rights,-citizen? dept

The FBI recently spent more than $1 million for assistance in decrypting a device's contents. It may have overpaid. Alternatives exist, whether it's a $5 wrench or indefinite imprisonment for not helping the government with its prosecution efforts.

A Philadelphia man suspected of possessing child pornography has been in jail for seven months and counting after being found in contempt of a court order demanding that he decrypt two password-protected hard drives.

The suspect, a former Philadelphia Police Department sergeant, has not been charged with any child porn crimes. Instead, he remains indefinitely imprisoned in Philadelphia's Federal Detention Center for refusing to unlock two drives encrypted with Apple's FileVault software in a case that once again highlights the extent to which the authorities are going to crack encrypted devices. The man is to remain jailed "until such time that he fully complies" with the decryption order.
The Fifth Amendment should prevent the government from punishing a person for not testifying against themselves, which is what the defendant's representation argues in its appeal to the Third Circuit. (Although it's indirect representation: the government's case is against Doe's devices ["United States of America v. Apple MacPro Computer, et al"] and his lawyer is hoping for a stay of the contempt order during the appeal process.)
Mr. Doe… has a strong likelihood of success on the second issue: whether compelling the target of a criminal investigation to recall and divulge an encryption passcode transgresses the Fifth Amendment privilege against self-incrimination. Supreme Court precedent already instructs that a suspect may not be compelled to disclose the sequence of numbers that will open a combination lock — clearly auguring the same rule for any compelled disclosure of the sequence of characters constituting an encryption passcode.
Doe's rep also argues that the All Writs order obtained by the government has no jurisdiction over Doe or his devices.
Mr. Doe’s first claim is that the district court lacked subject matter jurisdiction. The claim stems from the government’s apparently unprecedented use of an unusual procedural vehicle to attempt to compel a suspect to give evidence in advance of potential criminal charges. Specifically, the government took resort not to a grand jury, but to a magistrate judge pursuant to the All Writs Act, 28 U.S.C. § 1651. (Ex. F at 1).

It is black letter law that the All Writs Act never supplies “any federal subject-matter jurisdiction in its own right[.]” Sygenta Crop Protection, Inc. v. Henson, 537 U.S. 28, 31 (2002) (citation omitted). It is equally well-settled that the Act has no application where other provisions of law specifically address the subject matter concerned. Pennsylvania Bureau of Correction v. United States Marshals Service, 474 U.S. 34, 40-42 (1985). The compelled production of evidence in advance of criminal charges is specifically addressed by Rules 6 and 17 of the Federal Rules of Criminal Procedure, which authorize the issuance and enforcement of grand jury subpoenas; and by 28 U.S.C. § 1826(a), which specifies the authorized penalties for a witness who refuses without good cause to give the evidence demanded by the grand jury.
As it stands now, Doe is still being held in contempt of court for refusing to decrypt his devices for investigators. The district court that held him in contempt has refused direct appeal of that order, resulting in the labyrinthine legal strategy of using the government's case against Doe's devices as a vehicle for challenging the lower court's contempt order.

Doe has not been charged, yet he's in prison. Backing up the government's assertions for holding him in contempt are two dubious pieces of hearsay. One is from his estranged sister, who claims to have seen child porn on Doe's computer, but can't actually say whether it was located on the devices the government is seeking to have decrypted. The other is from some sort of law enforcement encryption whisperer, who can apparently see things in the scrambled bits.
The government’s second witness was Detective Christopher Tankelewicz, a forensic examiner with the Delaware County District Attorney’s Office. He testified only that it was his “best guess” child pornography would be found on the hard drives. (Ex. J at 346). According to Tankelewicz’s understanding of the Freenet online network (in which he admits having no training), there were signs on an Apple Mac Pro computer seized with the hard drives of a user accessing or trying to access message boards with names suggestive of child pornography. (Ex. J at 306, 311-312, 339-340). In rather ambiguous testimony, Tankelewicz did not appear to say this meant any image traded over these boards was on the hard drives. (See Ex. J at 303-317, 336-340, 345-350). Instead, he identified a single image he believed there to be a “possibility” was on the drives. (Ex. J at 308-309). As he described it, the image was of “a four or five-year-old girl with her dress lifted up, but the image itself was small so you really couldn’t see what was going on with the image.” (Ex. J at 308).
No one wants to see a sex offender walk away from charges, but at this point, Doe hasn't even been officially charged with anything more than contempt. The problem with that charge is it has no end date. He can either stay in jail or comply with the order, even when the order conjures jurisdiction out of nowhere and violates his Fifth Amendment rights. If the government doesn't have enough evidence to pursue a case against Doe, it should cut him loose until it does.

Read More | 140 Comments | Leave a Comment..

Posted on Techdirt - 27 April 2016 @ 12:46pm

Rhode Island Attorney General Pushing For A State-Level CFAA That Will Turn Researchers, Whistleblowers Into Criminals

from the 'unauthorized-access'-isn't-always-a-bad-thing... dept

We recently wrote about the Rhode Island attorney general's "cybercrime" bill -- a legislative proposal that seeks to address cyberbullying, revenge porn, etc. with a bunch of broadly -- and poorly -- written clauses. Two negative comments written months apart could be viewed as "cyber-harassment" under the law, separating it from the sustained pattern of abuse that one normally considers "harassment."

In addition, the proposed law would criminalize "non-consensual communications." If the sender does not obtain the recipient's permission to send a message, it's a criminal act if the recipient finds the message to be distressing -- which could mean anything from emailing explicit threats to posting a negative comment on someone's Facebook page.

But that's not Attorney General Peter F. Kilmartin's only bad idea. It appears he's behind another legislative proposal -- one that would amend the state's computer crime laws into something more closely resembling the catastrophic federal equivalent: the CFAA.

Here's the worst part of the suggested amendments:

Whoever intentionally and without authorization or in excess of one's authorization, directly or indirectly accesses a computer, computer program, computer system, or computer network with the intent to either view, obtain, copy, print or download any confidential information contained in or stored on such computer, computer program, computer system, or computer network, shall be guilty of a felony and shall be subject to the penalties set forth in §11-52-5.
This would make the following Google search illegal:
filetype:pdf site:*.gov "law enforcement use only"
Anything deemed "confidential information" -- if accessed by people not "authorized" to do so -- falls under the protection of this legislation, even if it can be accessed by any member of the public without actually "breaking into" a company/government/etc. server.
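To underline how ordinary such "access" is: retrieving a publicly exposed document is nothing more than an unauthenticated HTTP GET, the same request any browser or Google's crawler makes. A minimal Python sketch (the URL is hypothetical, purely for illustration):

```python
from urllib.request import Request

# Hypothetical URL of a "law enforcement use only" PDF that a
# webmaster left sitting on a public, Google-indexed server.
url = "https://example.gov/docs/bulletin.pdf"

req = Request(url)

# No credentials, no exploit, no "breaking in" -- just a plain GET,
# indistinguishable from any other visit to the site.
print(req.get_method())  # "GET"
print(req.headers)       # {} -- no authentication headers at all
```

Under the proposed Rhode Island language, though, the legality of that request would hinge not on anything the requester did, but on what the document's owner meant to expose.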

The definition of "confidential information" makes the legislation even more problematic.
"Confidential Information" means data that is protected from disclosure on a computer, computer program, computer system or computer network and that the computer, computer program, computer system or computer network does not transmit or disclose unless initiated by the owner of such computer, computer program, computer system or computer network.
Something accessible by a Google search is not "protected from disclosure" by any stretch of the imagination. But this phrase, "unless initiated by the owner of such computer…," makes it illegal to obtain documents not otherwise protected. Uploading a sensitive document to a public-facing website crawled by Google is stupid and the person doing the uploading should take any "unauthorized access" as a learning experience. But under the law, it could successfully be argued that the uploading of a document to a publicly-accessible website is not the same thing as "initiating transmission."

The proposal makes several exemptions for service providers, software manufacturers and (no kidding) advertisers, so that their trawling of confidential information in the course of their businesses won't be viewed as criminal acts. But what it doesn't do is carve out an exception for security researchers, who often access confidential information during the course of their work.

In this form, the legislation is dangerous. It will criminalize security research and punish citizens for the stupidity of others. On top of that, the law would pretty much turn every whistleblower into a criminal by treating the access of confidential information as a crime, no matter what the circumstances are. Running it through an editing process involving politicians surrounded by "cyberwar" hype is unlikely to improve it.

Read More | 22 Comments | Leave a Comment..

Posted on Techdirt - 26 April 2016 @ 2:13pm

EFF, ACLU And Public Records Laws Team Up To Expose Hidden Stingray Use By The Milwaukee Police Department

from the acronyms-to-the-rescue! dept

The EFF and ACLU -- along with the assistance of a very fortuitous public records request by Stingray-tracker extraordinaire Mike Katz-Lacabe -- have uncovered more hidden use of IMSI catchers by law enforcement. A criminal prosecution relying on real-time tracking of a suspect's cell phone has finally led to the admission by Wisconsin police that they used a Stingray to locate defendant Damian Patrick.

The information wasn't handed over to the court until the EFF, ACLU, and Katz-Lacabe's FOIAed documents forced the government to admit it used the device. Up until that point, testimony given by officers gave the impression that tracking Patrick down only involved the use of records from his service provider. They also claimed the information pinpointing Patrick's location in a parked vehicle was just a tip from an "anonymous source."

As we’ve seen in other cases involving Stingrays, the government did everything it could in this case to hide the fact that it used a Stingray—from the court that issued the pen register/trap and trace order, the court that heard Patrick’s motion to suppress the evidence, and even from Patrick, himself. In police reports, the officers said only that they “‘obtained information’ of Patrick’s location; . . . had ‘prior knowledge’ that Patrick was occupying the vehicle; . . . [and] ‘obtained information from an unknown source’ that Patrick was inside the vehicle at that location.”
This charade continued through an evidentiary hearing, where the judge refused to allow the defense to coax more specific information out of the testifying officer.
[E]ven at an evidentiary hearing where officers admitted to cellphone tracking, they would only acknowledge, cryptically, that they’d received “electronic information” confirming Patrick was in the vehicle. When Patrick’s attorney asked what “electronic information” meant, the officer on the stand would say only that it involved “tracking [a] cell phone.” The judge cut off any further questioning at that point.
And that's where Katz-Lacabe's FOIA request played a significant role. Katz-Lacabe had obtained Stingray logs using Wisconsin's public records laws. Contained in those logs were Stingray deployments matching up to the government's tracking and locating of Damian Patrick. The government has now begrudgingly admitted as much, via a letter from the DOJ to the court regarding the Milwaukee Police Department's Stingray deployment.
Per our conversation last week, the government has determined that on October 28, 2013, the Milwaukee Police Department used a cell site simulator to locate Damian Patrick. At this time, we do not intend to seek leave to supplement the record pursuant to Federal Rule of Appellate Procedure 10.
The government is still arguing that the MPD complied with the Fourth Amendment, even if it never obtained a search warrant to deploy the Stingray. In any event, the affidavit it submitted (for what appears to be a pen register order, rather than a warrant) did not mention the use of a Stingray. Still, it argues no evidence should be suppressed… because circular reasoning.
[T]he government also argues it didn’t violate the Fourth Amendment in this case because it actually got a warrant—or maybe, in the alternative, the equivalent of a warrant (the police had a warrant to arrest (not search) Patrick and a court order (not a search warrant) to track Patrick’s phone). In a confusing and somewhat circular argument, the government asserts that because it submitted a “sworn affidavit” in support of its request for the pen/trap order, the order must have actually been a search warrant—if it hadn’t been a warrant, then it “wouldn’t have needed a finding of probable cause, which it contained.”
Dumping probable cause into a pen register application is a nice nod to the Fourth Amendment, but it's not required and it doesn't turn a court order into a warrant. An arrest warrant is not a search warrant, and it's likely the MPD would not have been able to serve its arrest warrant without the use of its Stingray-obscuring pen register order. The admission that Stingray surveillance should require the use of a warrant is, again, a nice nod to the Fourth Amendment, but it means nothing if that's not how the Milwaukee PD actually operates. And, yet again, the long battle to uncover evidence of Stingray tracking makes it clear the PD is hiding this information from judges when applying for court orders and warrants.

Read More | 15 Comments | Leave a Comment..

Posted on Techdirt - 25 April 2016 @ 2:04pm

Illinois Police Department Pulls Plug On Body Cameras Because Accountability Is 'A Bit Burdensome'

from the unconditional-surrender-to-administrative-complaints dept

Police body cameras aren't the cure-all for bad policing. However, they are an important addition to any force, providing not only a means for accountability (albeit an imperfect one) but also documentation of day-to-day police work. They can help weed out those who shouldn't be cops as well as protect officers from bogus complaints.

It's not enough to just have the cameras, though. Effort must be made to keep them in working order (and to prevent intentional damage/disabling). The footage must also be preserved and provided to the public when requested. This does mean additional workload and expenses need to be considered, but the potential benefits of increased documentation should outweigh the drawbacks.

Not so, apparently, for the Minooka Police Department in Illinois. The agency has decided to end its body camera program because accountability and transparency are just too much work.

Minooka Police Chief Justin Meyer said Friday the issue was not with the functionality of the cameras, but that it became a burden for staff to fill the many requests for video footage.
How much of a burden?
"I was happy [with the body cameras]," Meyer said. "It just became a bit burdensome for our administrative staff."
That's all it takes to let cops off the accountability hook: "a bit" of a burden. King Camera has been overthrown and the public's access to information is first against the wall.

Chief Meyer might want to hire a spokesperson because he's not exactly doing a great job explaining how burdensome the cameras were.
Meyer described a hypothetical example of the extra work it created for department staff.

"You could have four officers on a call for a domestic incident," Meyer said. "If they are on scene for an hour -- whether there's an arrest or not -- that's four hours of video that has to be uploaded."
Meyer could possibly be referring to redaction efforts, which could be time-consuming. He couldn't possibly be referring to the "burden" of uploading film because that's, well, non-existent.
The cameras could record up to nine hours of continuous footage with 16 GB of storage. They were plugged into a USB port at the department after a shift to collect the footage and recharge the battery.
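Some back-of-the-envelope arithmetic shows how small that "four hours of video" burden actually is. This is a rough sketch: the camera figures come from the report above, but the USB 2.0 throughput is an assumption.

```python
# Camera specs from the report: 16 GB holds 9 hours of continuous video.
CAMERA_STORAGE_BYTES = 16e9
CAMERA_RUNTIME_SECONDS = 9 * 3600

bytes_per_second = CAMERA_STORAGE_BYTES / CAMERA_RUNTIME_SECONDS  # ~0.5 MB/s

# Chief Meyer's hypothetical: four officers on scene for one hour each.
footage_bytes = 4 * 3600 * bytes_per_second  # ~7.1 GB total

# Assumed effective USB 2.0 bulk-transfer rate of ~35 MB/s.
USB2_BYTES_PER_SECOND = 35e6
upload_minutes = footage_bytes / USB2_BYTES_PER_SECOND / 60

print(f"{footage_bytes / 1e9:.1f} GB of footage, ~{upload_minutes:.1f} minutes over USB 2.0")
```

That works out to roughly seven gigabytes transferring in a few minutes -- and since the upload happens while the camera sits on a charger between shifts, no one even has to stand around waiting for it.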
Because the state doesn't mandate the use of body cameras, the Minooka PD -- which was the first in its county to deploy the technology -- may be the leading edge of a new wave of abandonment, both of body cameras and the accountability that goes with them. All because of an increased workload deemed by the abandoning agency as "a bit burdensome." When the going gets tough, the tough say, "Fuck it," apparently.

Policing is adversity defined. I can't muster up much sympathy for a law enforcement agency that calls it quits the moment it faces a logistical hurdle. To me, this abandonment says the department's heart was never in it. Meyer may say he "liked" the cameras, but he sure didn't put up much of a fight when someone in the office complained about the extra work. This is an agency that was looking for an excuse to ditch the cameras and took the first "offer" that came along: a bit of a burden.

68 Comments | Leave a Comment..

Posted on Techdirt - 25 April 2016 @ 11:38am

Court Tells Cops They Can't Open A Flip Phone Without A Warrant


Lower courts appear to be taking the Supreme Court's Riley decision seriously -- give or take the occasional "there's no Constitution at the border" decision. If the Supreme Court says there's a warrant requirement for cell phone searches, there's a warrant requirement for cell phone searches.

The Central District of Illinois has just handed down a decision that makes it clear, in no uncertain terms, that any examination of a cell phone's contents, no matter how brief, is a search covered by Riley.

The Pekin Police Department participated in a couple of FBI-assisted controlled buys of weapons and drugs involving defendant Demontae Bell. Shortly thereafter, Bell was arrested.

Upon Bell’s arrest, a black mobile flip phone was located on his person. After Bell was arrested, he was transported to the Peoria Police Department and placed in an interview room. Shortly thereafter, Officer Sinks arrived at the police station (he was not the arresting officer). At the suppression hearing Sinks testified that before interviewing Bell with agent Nixon, he opened the door to the interview room, grabbed Bell’s cell phone from a bag or container outside the door, opened the phone (purportedly to turn it off) and showed the home screen depicting the rifle to Bell with an inquisitive look.
Officer Sinks then powered off the phone. He handed it over to FBI Special Agent Nixon and told him about the photo he had seen. Sinks then removed the phone's battery and recorded the serial number. A little more than a week later, the FBI obtained a warrant to search the phone. Five months later, another search warrant was obtained specifically targeting date/time information related to the photo Officer Sinks saw on Bell's phone.

Seems like a cursory examination of a flip phone would be covered, but Judge James Shadid points out the Supreme Court only allowed warrantless examination of cell phones if there were exigent circumstances or to ensure the phone did not pose a threat to officers (i.e., contain a concealed weapon). The government argued that opening a flip phone is not a "search" and that the photo of a gun the officer saw was in "plain view." The court disagrees, pointing out that "plain view" means "plain view" without law enforcement interaction of any sort.
The government’s response to Bell’s Motion asserts that Officer Sinks’ opening of the flip phone did not constitute a search. While it is true that a “cursory inspection—one that involves merely looking at what is already exposed to view, without disturbing it—is not a ‘search’ for Fourth Amendment purposes,” Officer Sinks’ opening of Bell’s cell phone exceeded a “cursory inspection” because he exposed to view concealed portions of the object—i.e., the screen. See Arizona v. Hicks, 480 U.S. 321, 328-29 (1987). The Supreme Court specifically addressed this issue in Hicks, noting that the “distinction between ‘looking’ at a suspicious object in plain view and ‘moving’ it even a few inches is much more than trivial for purposes of the Fourth Amendment.” Id. at 325. Officer Sinks’ opening of the flip phone, like the officer moving the stereo equipment in Hicks, “exposed to view concealed portions of the [object]” and thus “produced a new invasion of [defendant’s] privacy.”
Even though the court finds Bell to have a diminished expectation of privacy in the home screen of his phone (as opposed to its contents), that's still not enough to ignore the stipulations of the Riley decision. Lock screens or home screens may only show limited information in relation to the contents of a phone, but they can still display a wealth of information law enforcement can only obtain with a warrant.
The lens through which all information on a cell phone is observed is the screen. On both flip phones and more modern, advanced devices, “notifications” are regularly displayed on the home screen or lock screen indicating text messages, missed calls, and other alerts. The position that the government advances here—that officers can always open a phone and look at the screen to turn the phone off without conducting a “search” at all—is inconsistent with Riley’s requirement that “unlike the search incident to arrest exception, the exigent circumstances exception requires a court to examine whether an emergency justified a warrantless search in each particular case.”

Just as Riley analyzed and rejected California’s attempt to create across the board exceptions, such as a rule allowing police to search call logs, without a warrant, the Court sees no reason to allow law enforcement to circumvent the warrant requirement in every case under the guise that they discovered evidence when they opened the phone or turned on the screen to turn the phone off.
The government attempted to use two exceptions provided by the Riley decision: officer safety and threat of remote destruction of evidence. Both of these arguments are dismissed just as quickly and soundly as the government's "plain view" argument. The court notes that Officer Sinks' actions gave no indication he was worried about a concealed weapon or data being wiped from the phone.

In any case, if remote wiping was a concern, officers could have removed the battery without opening the phone, as was clearly demonstrated by Officer Sinks himself.
Officer Feehan testified that the policy was put in place partly because snooping software could be used to listen in on conversations when the phone is turned off but still connected to the battery, and other methods could “compromise data” on the phone. While the procedure may be outdated as applied to modern cell phones that lack removable batteries, that problem was not present here, and the video later showed Officer Sinks removing the battery. Where officers have two equally effective options to turn off a phone, they should choose the less intrusive option. That was not done in this case, and as a result, incriminating evidence was found.
The result is suppression of the evidence specific to the Constitutional violation: the picture of an AK-47 Officer Sinks saw when he opened the phone. Because warrants were later obtained for a more thorough search, supported by probable cause unrelated to the photo Sinks saw, the suppression has little practical effect: the incriminating photo was found on Bell's phone anyway. While it doesn't do much for Bell, it does at least send a message to law enforcement that the Riley decision is to be respected and that cutting corners or skirting around the edges of the ruling won't be tolerated.

Read More | 18 Comments | Leave a Comment..

Posted on Techdirt - 25 April 2016 @ 10:37am

Practical Applications For Massive Surveillance Databases: Timely Birthday Cards, Travel Diaries

from the the-ultimate-vanity-search-engine dept

If you want to get a feel for the gobsmacking amount of information being collected by UK surveillance agencies (MI5, MI6, GCHQ), all you have to do is see how it's being misused. Privacy International, which has been steadily suing the UK government over domestic surveillance, has received another set of documents that show the banality of dragnet surveillance evil. The banality is not so much the dragnet itself (although that's not to say it isn't its own form of evil) as it is the uses it's put to.

Ryan Gallagher, writing for The Intercept, points out that spies are using surveillance collections as backup Day-Timers -- apparently with enough frequency they've had to be warned to knock it off.

The documents include internal guidance codes for spies who have access to the surveillance systems. One memo, dated June 2014, warns employees of MI6, the U.K.’s equivalent of the CIA, against performing a “self-search” for data on themselves, offering a bizarre example that serves to illustrate the scope of what some of the repositories contain.

“An example of an inappropriate ‘self search’ would be to use the database to remind yourself where you have traveled so you can update your records,” the memo says. “This is not a proportionate use of the system, as you could find this information by another means (i.e. check the stamps in your passport or keep a running record of your travel) that would avoid collateral intrusion into other people’s data.”
The information collected includes data that could reveal political preferences, sexual orientation, religious beliefs, memberships in associations or groups, mental/physical health along with biometric data and financial documents. With a little digging, the massive database could be used to uncover journalists' sources and privileged communications.

The wealth of information at the fingertips of British spies helps explain why they never seem to forget important dates.
“We’ve seen a few instances recently of individual users crossing the line with their database use, looking up addresses in order to send birthday cards, checking passport details to organise personal travel, checking details of family members for personal reasons…"
The world's greatest search engine isn't Google. It's GCHQ. Of course, the documents also point to various levels of oversight, none of which appear to have much of a deterrent effect. A monitoring system of some sort appears to be in place and it's likely what flagged agents' self-searches. But it's unlikely to catch other inappropriate searches involving someone other than the person performing the search. These, too, are forbidden, but it's likely these violations were part of a pattern of sustained abuse, rather than one-off searches -- which would likely have slipped under the radar as being just another intelligence-related search.

What's worse is access to these vast data stores apparently went oversight-free for several years, and it's not entirely clear from what's been released that comprehensive oversight is even in place at this point in time.
One 2010 policy paper from MI6 states there is “no external oversight” of it or its partners’ “bulk data operations,” though the paper adds that this was subject to review.
This may not seem completely terrible -- after all, six years government time is like 30 days real time -- until you realize the GCHQ has data sets dating back nearly 20 years (harvesting began in 1998) and MI5's bulk collection is more than a decade old at this point. And it continues onward, getting more massive by the moment. The GCHQ wants to collect 50 billion records every day, utilizing people's web browsing, phone calls, and email. While the agencies insist this is all for fighting terrorism and international crime, the cold reality is that it's just as useful for reconciling travel expenses and making sure Mom always gets her birthday card on time.
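For a sense of the scale of that 50-billion-records-a-day target, here's a rough sketch. Only the daily record count comes from the reporting; the per-record size is purely an assumption.

```python
# Rough scale estimate. RECORDS_PER_DAY is the reported GCHQ target;
# BYTES_PER_RECORD is an assumed size for a small metadata row.
RECORDS_PER_DAY = 50e9
BYTES_PER_RECORD = 100

daily_terabytes = RECORDS_PER_DAY * BYTES_PER_RECORD / 1e12
yearly_petabytes = daily_terabytes * 365 / 1000

print(f"~{daily_terabytes:.0f} TB per day, ~{yearly_petabytes:.1f} PB per year")
```

Even with conservative assumptions, that's multiple petabytes of intimate metadata accumulating every year -- the haystack in which those birthday-card lookups happen.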


Posted on Techdirt - 25 April 2016 @ 9:31am

FBI Hides Its Surveillance Techniques From Federal Prosecutors Because It's Afraid They'll Become Defense Lawyers

from the code-of-silence dept

We know the FBI isn't willing to share its investigative techniques with judges. Or defendants. Or the general public. Or Congress. The severely restrictive NDAs it forced law enforcement agencies to sign before allowing them to obtain IMSI catchers are evidence of the FBI's secrecy. Stingray devices were in use for at least a half-decade before information started leaking into the public domain.

The FBI doesn't want to hand over details on its hacking tools. Nor does it want to discuss the specifics of the million-dollar technique that allowed it to break into a dead terrorist's phone (which held nothing of interest).

USA Today's Brad Heath has obtained documents showing the FBI's tech secrecy extends even further than its nominal opponents (judges, defense lawyers, defendants). The agency even freezes out other players on the same team.

A supervisor also cautioned the bureau’s “technically trained agents” in a 2003 memo not to reveal techniques for secretly entering and bugging a suspect’s home to other agents who might be forced to reveal them in court. “We need to protect how our equipment is concealed,” the unnamed supervisor wrote.

The records, released this year as part of a Freedom of Information Act lawsuit, offer a rare view of the extent to which the FBI has sought to keep its most sensitive surveillance capabilities secret, even from others within federal law enforcement.
Yes, the FBI is so determined to keep its techniques secret that it won't even share them with high-ranking prosecutors, like Assistant US Attorneys (AUSAs). But it gets even better. The reason the memo gives for locking out AUSAs is schadenfreuderiffic.

In case you can't see or read the picture above, here's what the memo says:

Over the past few months, ERF [Engineering Research Facility] has expressed concern about Tech Agents revealing technical details to Case Agents and especially to AUSAs. There have been several instances of AUSAs becoming familiar with our techniques, then resigning and becoming defense lawyers. There also is concern about retiring Agents performing investigative work for defense counsel (i.e. right here in MP).
One conclusion that could be drawn from AUSAs "becoming familiar" with FBI surveillance techniques, then switching sides to work as defense lawyers, is that the FBI's techniques are so intrusive and pervasive that AUSAs no longer find it conscionable to act on behalf of the FBI.

That's not the only damning paragraph in the two-page set of responsive documents. There's also this, which again shows the FBI openly encourages obfuscation and omission in federal courtrooms.
Over the past week, I have received two ECs [electronic communications] form the field which describe in GREAT detail surreptitious entries and special project concealments installed in the target locations. These ECs describe the equipment concealed, item in which the equipment was concealed, and where the concealments were placed. These ECs were drafted by case agents, uploaded in ACS, and placed in the case file.

TTAs [technically trained agents] should not be providing such detail to case agents. One reason TTAs do not testify is to protect our trade craft. If the case agents have this information, they will be required to reveal it during cross examination at trial. Also, an AUSA may require the EC be turned over during discovery before trial. We need to protect how our equipment is concealed and where our is concealed.

It is sufficient for the case agent to simple state that, pursuant to a court order, equipment was installed in the target location.
So, the FBI will hide information from its own case agents in order to prevent defendants from obtaining the details of the surveillance used to build cases against them. Needless to say, preventing the defense from obtaining these details also prevents judges and juries from hearing them and weighing the constitutionality of the techniques.

This secrecy undercuts defendants' rights by denying them the opportunity to challenge the evidence or the methods used to obtain it. It also blows right by the Fourth Amendment by obfuscating the techniques used, a process that begins with search warrant affidavits that deliberately leave out essential details in order to protect the FBI's surveillance secrets. The FBI's cavalier attitude towards the rights of Americans traces back to the days of J. Edgar Hoover. While the agency has moved ahead in terms of technical prowess, the underlying "ends justify the means" attitude appears unchanged.


Posted on Techdirt - 22 April 2016 @ 7:39pm

Court: Border Search Warrant Exception Beats Riley In The 'Constitution-Free Zone'

from the protections-arbitrarily-applied-to-protect-inland-electronic-devices dept

The Supreme Court declared in 2014 that law enforcement could no longer perform searches of cellphones incident to arrest without a warrant. The exceptions to this ruling are making themselves apparent already.

The area of the United States where the Constitution does not apply -- while still being fully within the borders of the US -- apparently exempts law enforcement from following this ruling when it comes to cellphone searches. The Southern District of California has concluded that border searches fall outside normal Fourth Amendment requirements and that the government has no need to seek a warrant before searching a cellphone.

The court notes the Riley decision says one thing but the "border exception" says another.

Heading in one direction is the Supreme Court’s bright line rule in Riley: law enforcement officers must obtain a warrant to search a cell phone incident to an arrest. Heading on a different course is the border search exception. The border search exception describes an exception to general Fourth Amendment principles. It is the notion that the government may search without a warrant anyone and anything coming across its border to protect its national sovereignty.
Balancing the two competing interests in this case, the court ultimately finds the government's national security interest outweighs citizens' privacy interests. As it weighs this against cases dealing with more elaborate and lengthy device searches at the border, the court basically finds that if the Fourth Amendment is violated by "cursory" searches of devices, it is only violated a little.
Reviewing the totality of the circumstances, the Caballero cell phone search: (1) took place at a port of entry; (2) was based on reasonable suspicion of criminal activity; (3) was conducted manually and appeared to be a cursory search of the device’s contents; (4) did not involve the application of forensic software; (5) did not destroy the cell phone; (6) was performed in minutes, as opposed to hours or days; (7) was performed upon a device being brought into the country, rather than being taken out of the country; and (8) was performed approximately four hours after Caballero was placed under arrest. Other than the last factor, each of these factors was either similar to or less intrusive than the warrantless search Cotterman decided was reasonable.
The "border exception" the court applies to warrantless cellphone searches somehow relies on exceptions carved out in the original Riley decision, even as the court says Riley doesn't control border searches.
The two cases can be reconciled. The most obvious path for reconciliation is to conclude that the border search exception is among the traditional exceptions to which Riley’s warrant requirement does not apply. This approach finds safe footing in the Supreme Court’s statement that “other” “exceptions” may continue to justify a warrantless search. Riley, 134 S. Ct. at 2494 (“Moreover, even though the search incident to arrest exception does not apply to cell phones, other case-specific exceptions may still justify a warrantless search of a particular phone.”). It also is consistent with the observation from Montoya de Hernandez, (473 U.S. at 539), about when balancing individual privacy rights against rights of the sovereign, the balance “is qualitatively different . . . than in the interior” and the balance is “struck much more favorably to the Government.” This approach also avoids the spectacle of deeming that Riley undercut 200 years of border search doctrine without even a mention.
Not as much of a "spectacle" as the California court might think, considering the Riley decision itself set aside years and years of the government relying on dubious analogies like "containers" or "pairs of pants" to justify the search and seizure of anything carried on or near a person (like in their vehicle) incident to arrest. The border search doctrine may not date back 200 years, but the Fourth Amendment does -- and it is a controlling authority with more than 200 years' worth of history.

The case here deals with an actual border crossing (Calexico, California), but the government has basically declared that any area within 100 miles of a border can be called "the border" for the sake of searches and detentions predicated on reasonable suspicion or, in many cases, law enforcement hunches.

As for this case, it's hardly the ideal test for balancing Riley against the border search exception. For one, the defendant challenging the warrantless cellphone search was already neck-deep in reasonable suspicion, thanks to the discovery of drugs in his vehicle. Officers on scene had more than enough reason to detain him and likely had uncovered enough damning evidence to support a warrant affidavit. Of course, they did not seek one. Instead, they briefly browsed his phone until they found further suspicious content.

Some courts refuse to give officers a pass when they could have gotten a warrant but choose not to. Those courts are in the minority. This court is part of the majority.
Here, illicit narcotics had been discovered. Caballero had been arrested. Reasonable suspicion had jelled into probable cause. For the time being, he and his cell phone were safely in the hands of government agents. Other than the increased administrative work required, there is no apparent reason why Riley’s search warrant requirement could not be applied without undercutting the interests supporting the border search doctrine. One can certainly say that Riley casts doubt on Cotterman’s approval of warrantless searches where an arrest is made. Nevertheless, as long as this Court can apply circuit precedent without running afoul of intervening authority, it must do so.
Being within 100 miles of the border means never having to seek a warrant, even if the government has both the time and the probable cause to do so… at least until someone manages to push a challenge up to the appeals court level or beyond.


Posted on Techdirt - 22 April 2016 @ 3:37pm

Court Says Government Needs More Than The Permission Of A Couple Of Underperforming Drug Dogs To Justify Seizure Of $271,000

from the evidence:-it's-a-thing dept

The Seventh Circuit Appeals Court has done something few courts do: told law enforcement it can't have that sweet, sweet "drug" money it lifted from two brothers for no other reason than that it felt there was something shady about its very existence.

Police responded to a call about a home invasion at the residence of Pedro and Abraham Cruz-Hernandez. While inside the house, officers came across a handgun, a small amount of marijuana and a scale. This apparently prompted the arrival of two drug dogs, which are not usually standard equipment for home invasion investigations.

When searching the brothers' van, police found $271,080. So, they took it. Why? Because their dogs said they could.

A police drug dog signaled the presence of drugs in Pedro’s van, which was parked outside the house. After obtaining a search warrant, the police discovered in the van a safe containing $271,080 in currency and two pages of handwritten notes including dates and numbers. The cash was bundled with rubber bands in stacks of $5,000. A second dog alerted to the safe. No drugs, however, were found in either the van or the safe.
It didn't matter that neither drug dog could adequately perform the single task required of it. The "alerts" were all the justification law enforcement needed to rob the brothers of their money.

As is standard operating procedure in asset seizures, no charges were brought but the "guilty" money remained in the possession of law enforcement. The brothers challenged the forfeiture. The government then tried to use a mistake any person could have made to justify its possession of the brothers' money.
The government also pointed to two alleged disavowals of ownership by Abraham: (1) a record created by U.S. Immigration and Customs Enforcement (ICE) six weeks before the police seized the safe, in which Abraham had said that he did not have any “equities” in the United States, and (2) Abraham’s application for cancellation of removal, filed with the assistance of immigration counsel six months after the seizure, in which he lists only $2,000 in “cash assets.” The government represented that Pedro and Abraham, when deposed, had testified that they told the truth to the police and to immigration officials.
The problem here is that Abraham was asked to list his cash assets. A normal person not versed in the convoluted fuckery that is asset forfeiture would reasonably conclude that his assets include only what's actually in his possession. As the government was still in possession of the $271,080 when Abraham was asked, he reasonably concluded he could only legally claim the $2,000 he had access to. The government took this reasonable conclusion and twisted it to mean Abraham had relinquished his claim on the seized money.

In legal terminology, the government claimed the contradictory statements constituted a "sham affidavit." The court doesn't see it the government's way, though. (Emphasis in original.)
Changes in testimony normally affect the witness’s credibility rather than the admissibility of the testimony, and thus the sham-affidavit rule applies only when a change in testimony “is incredible and unexplained,” not when the change is “plausible and the party offers a suitable explanation such as confusion, mistake, or lapse in memory.”


Abraham’s explanation for his answer on the immigration form is not only plausible, but is correct. We suspect that many people—in particular immigrants completing a form for ICE—would not be aware that a legal claim is an asset. But whether they would or not, a legal claim—even a claim to money—is not itself a “cash asset.” See CASH, Black's Law Dictionary (10th ed. 2014) (defining cash as “1. Money or its equivalent; 2. Currency or coins, negotiable checks, and balances in bank accounts.”). Moreover, when deposing Abraham the government’s lawyer did not ask about his understanding of a “cash asset” or whether his attorney had explained the definition of that term. There is no reason, therefore, to reject out-of-hand Abraham’s deposition testimony that he had been truthful in his immigration filings.
Not only that, the court notes, but an element common to nearly all asset forfeitures doesn't help the government's case much.
It is also telling that the government has presented virtually no evidence that the brothers are involved in drug trafficking. There was nothing to indicate past or current drug dealing by the brothers or anyone else living with them in the house, nor was there any suggestion that either brother used the bedroom where the apparent drug paraphernalia was found. Though drug dogs had alerted to the safe and currency, the government did not submit to the court any evidence of the dogs’ training, methodology, or field performance.
(Given that this limited sampling includes both dogs "claiming" drugs were present when none were, it's likely the government felt records on training, methodology and field performance would only have made its case weaker.)
Neither did the government point to evidence (e.g., an experienced drug investigator’s opinion) to substantiate its assumptions that the notes found in the safe were a “drug ledger” or that counting and bundling currency is something that only drug dealers would do. “Absent other evidence connecting the money to drugs, the existence of money or its method of storage are not enough to establish probable cause for forfeiture,” much less enough to meet the now-heightened standard of a preponderance of the evidence.
And, finally this little kick to the government's ribs:
Courts have concluded that the government failed to meet its burden in cases with better evidence than this [one]...
And with that, the Appeals Court sends the case back to the lower court with the judgment in favor of the government vacated. Considering the brothers appear to have "substantially prevailed," the government may find itself cutting a check for legal fees. In hindsight, it would have been smarter to return the money when it became apparent there was no basis for a criminal case. But that's not how the government rolls. Cash is presumed guilty until proven innocent, even when the seizure is "justified" by little more than two underperforming drug dogs and a small baggie of marijuana.


Posted on Techdirt - 21 April 2016 @ 11:39am

Judge Says FBI's Hacking Tool Deployed In Child Porn Investigation Is An Illegal Search

from the can't-just-go-wherever-you-damn-well-please dept

The judicial system doesn't seem to have a problem with the FBI acting as admin for child porn sites while conducting investigations. After all, judges have seen worse. They've OK'ed the FBI's hiring of a "heroin-addicted prostitute" to seduce an investigation target into selling drugs to undercover agents. Judges have, for the most part, allowed the ATF to bust people for robbing fake drug houses containing zero drugs -- even though no actual robbery ever takes place. Judges have also found nothing wrong with law enforcement creating its own "pedophilic organization," recruiting members and encouraging them to create child pornography.

So, when the FBI ran a child porn site for two weeks last year, its position as a child porn middleman was never considered to be a problem. The "network investigative technique" (NIT) it used to obtain identifying information about anonymous site visitors and their computer hardware, however, has resulted in a few problems for the agency.

While the FBI has been able to fend off one defendant's attempt to suppress evidence out in Washington, it has just seen its evidence disappear in another case related to its NIT and the "PlayPen" child porn site it seized (and ran) last year.

What troubles the court isn't the FBI acting as a child porn conduit in exchange for unmasking Tor users. What bothers the court is the reach of its NIT, which extends far outside the jurisdiction of the magistrate judge who granted the FBI's search warrants. This decision benefits defendant Alex Levin of Massachusetts directly. But it could also pay off for Jay Michaud in Washington.

The warrants were issued in Virginia, which is where the seized server resided during the FBI's spyware-based investigation. Levin, like Michaud, does not reside in the district where the warrant was issued (the Eastern District of Virginia) and where the search was supposed to be undertaken. As Judge William Young explains, the FBI's failure to restrict itself to the district where the NIT warrants were issued makes them worthless pieces of paper everywhere else. (via Chris Soghoian)

The government argues for a liberal construction of Rule 41(b) that would authorize the type of search that occurred here pursuant to the NIT Warrant. See Gov’t’s Resp. 18-20. Specifically, it argues that subsections (1), (2), and (4) of Rule 41(b) are each sufficient to support the magistrate judge’s issuance of the NIT Warrant. Id. This Court is unpersuaded by the government’s arguments. Because the NIT Warrant purported to authorize a search of property located outside the Eastern District of Virginia, and because none of the exceptions to the general territorial limitation of Rule 41(b)(1) applies, the Court holds that the magistrate judge lacked authority under Rule 41(b) to issue the NIT Warrant.
The government deployed some spectacular theories in its effort to salvage these warrants, but the court is having none of it.
The government advances two distinct lines of argument as to why Rule 41(b)(1) authorizes the NIT Warrant. One is that all of the property that was searched pursuant to the NIT Warrant was actually located within the Eastern District of Virginia, where the magistrate judge sat: since Levin -- as a user of Website A -- “retrieved the NIT from a server in the Eastern District of Virginia, and the NIT sent [Levin’s] network information back to a server in that district,” the government argues the search it conducted pursuant to the NIT Warrant properly can be understood as occurring within the Eastern District of Virginia. Gov’t’s Resp. 20. This is nothing but a strained, after-the-fact rationalization.
As the government attempts to portray it, the search was wholly contained in Virginia because the NIT was distributed by the seized server in the FBI's control. But, as the judge notes, the search itself -- via the NIT -- did not occur in Virginia. The NIT may have originated there, but without grabbing info and data from Levin's computer in Massachusetts, the FBI would have nothing to use against the defendant.
That the Website A server is located in the Eastern District of Virginia is, for purposes of Rule 41(b)(1), immaterial, since it is not the server itself from which the relevant information was sought.
And, according to Judge Young, that's exactly what the FBI has now: nothing.
The Court concludes that the violation at issue here is distinct from the technical Rule 41 violations that have been deemed insufficient to warrant suppression in past cases, and, in any event, Levin was prejudiced by the violation. Moreover, the Court holds that the good-faith exception is inapplicable because the warrant at issue here was void ab initio.
The judge has more to say about the FBI's last ditch attempt to have the "good faith exception" salvage its invalid searches.
Even were the Court to hold that the good-faith exception could apply to circumstances involving a search pursuant to a warrant issued without jurisdiction, it would decline to rule such exception applicable here. For one, it was not objectively reasonable for law enforcement -- particularly “a veteran FBI agent with 19 years of federal law enforcement experience[,]” Gov’t’s Resp. 7-8 -- to believe that the NIT Warrant was properly issued considering the plain mandate of Rule 41(b).
The court doesn't have a problem with NITs or the FBI's decision to spend two weeks operating a seized child porn server. But it does have a problem with the government getting warrants signed in one jurisdiction and using them everywhere but.

The decision here could call into question other warrants executed outside the districts where they were issued, like the DEA's dozens of wiretap warrants obtained in California but used to eavesdrop on targets located on the other side of the country. And it may help Jay Michaud in his case, seeing as he resides a few thousand miles away from where the search was supposedly performed.


Posted on Techdirt - 21 April 2016 @ 9:21am

DHS Claims Open Source Software Is Like Giving The Mafia A Copy Of FBI Code; Hastily Walks Back Statement

from the psst...-your-ignorance-is-showing dept

Late last week, the DHS's Chief Information Officer Luke McCormack (or someone from his office) posted comments to GitHub arguing against the proposed policy of making 20% of its code (whatever that means) open source in the interest of better sharing between agencies. The policy's rationale is that shared code could save tax dollars by avoiding paying developers to perform redundant work. The DHS felt strongly otherwise and said as much using an Excel-based parade of horrors.

Many private companies (especially security companies) do not publish their source code is because it allows attackers to (a) construct highly targeted attacks against the software, or (b) build-in malware directly into the source code, compile, then replace key software components as 'doppelgangers' of the original. How will this be prevented? Government-specific examples: citizenship anti-fraud rules that are coded into software, identification of special codes used to flag law enforcement actions, APT threat indicator scripts, Mafia having a copy of all FBI system code, terrorist with access to air traffic control software, etc. How will this be prevented?
Contrary to the CIO's statements, open source software can actually be safer than closed source options. More eyes on the source means more people finding flaws and holes and working towards fixes, rather than simply compiling internal discoveries and forwarding them to the vendor and allowing the company to determine which holes/flaws should be repaired and in which order.

The DHS has now walked back this unfortunate comment, claiming it was just one of those mysterious things that somehow materialized out of the ether.
Those comments were "incorrectly posted" and do not represent DHS' position, agency spokesman Justin Greenberg told Nextgov in an email. McCormack's new comments "serve as the department’s official stance on the policy," the spokesman said. In his new comment, McCormack said the earlier comments reflected "a variety of individual positions across DHS components."
This explains next to nothing and leaves readers with the impression that the DHS has simply been publicly embarrassed by the "source code sky is falling" alarmism posted under its CIO's name.

The DHS has a history of walking things back after they receive public criticism. This is good, but the walkbacks seem to be accompanied by obfuscatory statements that give everyone involved a pass for their misguided actions. Back in 2014, DHS component ICE started soliciting bids for a national license plate database (built from the hundreds of automatic license plate readers in use around the nation). Backlash ensued and DHS Secretary Jeh Johnson quickly issued a statement claiming the posting was done without the approval of "ICE leadership." In other words, the issuance was just a governmental glitch and the hasty retreat being beaten was entirely unrelated to the public outcry.

Here, the same thing seems to be happening. The DHS CIO posts comments full of alarmism, is called out for it, and a spokesperson appears on the scene to say that comments released by a DHS official are not the official comments of the agency he represents. To borrow the blame-shifting parlance of law enforcement, a misguided comment "discharged" and no one should have to own up to actually pulling the trigger. Yes, mistakes were made. But apparently no government official should need to acknowledge they were just flat-out wrong.


Posted on Techdirt - 21 April 2016 @ 6:26am

Indian Government Agencies Demand Access To WhatsApp Messaging Groups

from the we-can't-have-people-bad-mouthing-the-government-and-getting-away-with-it dept

Here comes the inevitable government backlash against WhatsApp rolling out end-to-end encryption for one billion users worldwide: if governments can no longer demand access to communications, the next best thing is to demand access to WhatsApp users.

According to India resident Prasanto K. Roy, local governments are demanding that administrators of WhatsApp groups (the latest beneficiaries of the encryption rollout) register with the local magistrate, and will apparently hold them accountable for any "irresponsible remarks" or "untoward actions" by members of the group.

The government's unsubtle man-in-the-middle approach to accessing WhatsApp communications also involves placing a literal government man in the middle, according to the Times of India.
The spokesperson also said that a government representative might also have to be added to the WhatsApp group as an admin. "If any government admin is present in a WhatsApp group, it will immediately prevent any sort of rumour-mongering," he said.
Whenever a government agency develops an overweening urge to curb "rumor-mongering," one can be sure that particular government is fucking something up somewhere. And, indeed, that is the case here.
The government had imposed a blackout on mobile internet in the troubled area after clashes between security forces and protestors claimed the lives of five people. The area had seen protests after the alleged molestation of a teenager by security personnel. The mobile internet blackout had been aimed at curbing the spread of potentially inflammatory messages that could spark further tension in the area.
It would seem to me the tension was created by the alleged molestation, the government's lack of interest in investigating/punishing the wrongdoer and the killing of five people. The government appears to be more interested in saving itself from its constituency, so the obvious move is to shut down any communication platform that it can't monitor or control. It can't kill WhatsApp, so it's demanding to be inserted into these conversations -- either directly or by lurking just offscreen whispering legal threats.

Not only that, but the quelling of dissent extends to the government's own ranks. The flier also notes that punishment awaits government employees who comment on government policies and decisions in these WhatsApp groups.
Govt. Employees serving in the district are directed to restrain from making any comments/remarks with regard to the policies and decisions of government on these WhatsApp groups running in the district and if anyone found involved in such activities, strict action will be initiated against them as required under rules.
Looking beyond this local dispute that has managed to drag in the world's most popular messaging service, one can see why it is essential that citizens have communication platforms that keep the government locked out. Encryption doesn't just "protect" criminals from law enforcement and innocent people from criminals. It also protects the innocent from their governments' self-serving overreach.


Posted on Techdirt - 20 April 2016 @ 3:39pm

Law Enforcement Forced To Hand Over $41K It Seized From Businessman At Airport, Plus Another $10K In Legal Fees

from the felled-by-their-own-bullshittery dept

An unidentified Techdirt reader sends in the news that Arizona law enforcement is going to be handing over $10,000 to Madji Khaleq as a result of a failed asset forfeiture attempt. This would be in addition to the $41,870 the DEA already handed back to Khaleq -- every cent of the cash federal agents seized from him at the Tucson airport.

Khaleq had a legitimate reason to be carrying over $40K in cash on him.

Court documents show Khaleq told the Drug Enforcement Administration agent who seized the money that he owned a convenience store and check-cashing business in Denver, as well as a wholesale electronics distributorship in California. He said he came to Tucson to buy a smoke shop on South 12th Avenue, but the deal fell through.
But the DEA firmly believes cash = drugs even when there's no evidence pointing towards illicit sources or uses for the funds, so it relieved Khaleq of his burdensome bankroll. Local law enforcement then swooped in to claim its part in the haul… only to return it when Khaleq lawyered up.
Khaleq challenged the seizure in Pima County Superior Court and the Pima County Attorney’s Office withdrew its request for forfeiture of the money in November.
The $10,000 in legal fees due Khaleq will come from the County Attorney's Office and a Tucson-based counter-narcotics task force. Apparently the DEA has washed its hands of the whole affair after giving Khaleq the money it took from him.

The government is unhappy to be paying a drug trafficker an additional $10,000. Oh, yeah. It still believes Khaleq is involved in drug trafficking despite losing this lawsuit and 10 grand in discretionary spending.
In the March 10 stipulation of dismissal, Deputy County Attorney Edward Russo said the $10,000 is not an admission that Khaleq has shown he is “entitled to an award of attorney’s fees, costs or damages in this action.”
When asked if the County Attorney’s Office still suspected Khaleq of being involved in illicit activity, Johnson said: “Yes, we don’t just take money from people for no reason.”
The hell you don't.

And that's not the full extent of the government's BS. During its fight to keep the uncharged Khaleq from recovering his money, the County Attorney's Office attempted to keep Khaleq as far away from his money and his Fifth Amendment rights as possible.
Russo had asked Aragon on Jan. 11 if the county could present a report on a federal investigation purportedly of Khaleq without Khaleq or his attorney present.

[Judge Gus] Aragon denied the request, saying that to grant the request “would violate basic concepts of fairness and due process.”
Fortunately, a judge stepped in and prevented the government from further abusing Khaleq. But even when clearly in the wrong, the government insists it's right: no admission of wrongdoing, despite losing badly enough that the plaintiff was awarded legal fees on top of his original funds, and law enforcement still claims Khaleq is involved in illegal activity despite his lack of a rap sheet and zero evidence to support that suspicion.


Posted on Techdirt - 20 April 2016 @ 12:47pm

EFF Sues DOJ Over Its Refusal To Release FISA Court Documents Pertaining To Compelled Technical Assistance

from the EFF:DOJ::Coroner:Dr.-Nick dept

Given the heightened interest in the government's efforts to compel companies like Apple to break into their own products for them, the EFF figured it would be a good time to ask the government whether it had used FISA court orders to achieve these ends.

Naturally, the government would rather not discuss its efforts to force Apple, et al. to cough up user data and communications. Hence the secrecy surrounding its use of NSLs, subpoenas and gag orders. Hence, also, its desire to keep cases involving All Writs Act orders under seal if possible. Hence also (also) its refusal to discuss the secret happenings in its most secret court.

The EFF filed FOIA requests with the DOJ in October of last year and followed up with more in March. The October request sought documents about FISA court decisions related to requests for technical assistance from US companies. After a few months of back-and-forth, the DOJ claimed only two responsive documents were found -- neither of which was a court decision… and neither of which would be released.

The EFF's most recent request broadened the search parameters in hopes of landing more responsive documents.

Any “decision, order, or opinion issued by the Foreign Intelligence Surveillance Court or the Foreign Intelligence Surveillance Court of Review (as defined in section 601(e)),” issued from 1978 to June 1, 2015, “that includes a significant construction or interpretation of any provision of law, including any novel or significant construction or interpretation of the term ‘specific selection term.’”
It also requested the same documents for the period of June 2015 to the present. To date, it has yet to receive a response.

The EFF is now suing the DOJ for wrongfully withholding responsive documents and other violations of FOIA regulations, including "performing inadequate searches."

The documents it's seeking are of significant public interest, especially now that the FBI's All Writs-enabled technical assistance demands are at the center of a second war over encryption. As staff attorney Nate Cardozo points out, the devices and services used by millions of Americans shouldn't be subject to the whims of secret courts relying wholly on government ex parte presentations and submissions.
“If the government is obtaining FISC orders to force a company to build backdoors or decrypt their users’ communications, the public has a right to know about those secret demands to compromise people’s phones and computers,” said Cardozo. “The government should not be able to conscript private companies into weakening the security of these devices, particularly via secret court orders.”

