Among other sweeping new requirements to enhance digital privacy, the bill notably imposes a warrant requirement before police can access nearly any type of digital data produced by or contained within a device or service.
In other words, that would include any use of a stingray, also known as a cell-site simulator, which can not only be used to determine a phone's location, but can also intercept calls and text messages. During the act of locating a phone, stingrays also sweep up information about nearby phones—not just the target phone.
Despite similar bills being killed by gubernatorial vetoes in 2012 and 2013, California legislators are still looking to reform the state's privacy laws. For one thing, this new bill would put the state's Electronic Communications Privacy Act in compliance with the Supreme Court's recent Riley v. California decision (warrant requirement for cell phone searches incident to arrest), as Cyrus Farivar points out.
The committee passed it with a 6-1 vote, suggesting there's broader support for privacy and Fourth Amendment protections now than there was in the pre-Snowden days. Of course, the usual opposition was on hand to portray those pushing for a warrant requirement as being in favor of sexually abusing children.
[Marty] Vranicar [California District Attorneys Association] told the committee that the bill would "undermine efforts to find child exploitation," specifically child pornography.
"SB 178 threatens law enforcement’s ability to conduct undercover child porn investigation. the so-called peer-to-peer investigations," he said. "Officers, after creating online profiles—these e-mails provide metadata that is the key to providing information. This would effectively end online undercover investigations in California."
Vranicar failed to explain how an officer conducting an ongoing investigation would be unable to obtain a warrant for P2P user data… unless, of course, the "investigation" was nothing more than unfocused trolling or a sting running dangerously low on probable cause. Nothing in the bill forbids officers from using other methods -- Fourth Amendment-respecting methods -- to pursue those suspected of child exploitation. What it does do is make it more difficult to run stings and honeypots, both of which are already on shaky ground in terms of legality.
Additionally, the bill demands extensive reporting requirements pertaining to government requests for data, and makes an effort to strip away the secrecy surrounding search warrants.
1546.2 (a) Except as otherwise provided in this section, any government entity that executes a warrant or wiretap order or issues an emergency request pursuant to Section 1546.1 shall contemporaneously serve upon, or deliver by registered or first-class mail, electronic mail, or other means reasonably calculated to be effective, the identified targets of the warrant, order, or emergency request, a notice that informs the recipient that information about the recipient has been compelled or requested, and states with reasonable specificity the nature of the government investigation under which the information is sought. The notice shall include a copy of the warrant or order, or a written statement setting forth facts giving rise to the emergency.
(b) If there is no identified target of a warrant, wiretap order, or emergency request at the time of its issuance, the government entity shall take reasonable steps to provide the notice, within three days of the execution of the warrant, to all individuals about whom information was disclosed or obtained.
This isn't blanket coverage or without exceptions. Officers can still offer sworn affidavits in support of sealing to the court, which may then seal warrants on a rolling 90-day basis at its discretion.
Law enforcement will continue to fight this bill, but its opposition seemingly had no effect on the Public Safety Committee. This bill brings the government into a much tighter alignment with the wording and the intent of the Fourth Amendment. The arguments against it demonstrate that the law enforcement community continues to prize efficient policing over the public's (supposedly) guaranteed rights.
NSA director Mike Rogers testified in front of a Senate committee this week, lamenting that the poor ol’ NSA just doesn’t have the “cyber-offensive” capabilities (read: the ability to hack people) it needs to adequately defend the US. How cyber-attacking countries will help cyber-defense is anybody’s guess, but the idea that the NSA is somehow hamstrung is absurd.
Yes, we (or rather, our representatives) are expected to believe the NSA is just barely getting by when it comes to cyber-capabilities. Somehow, backdoors in phone SIM cards, backdoors in networking hardware, backdoors in hard drives, compromised encryption standards, collection points on internet backbones, the cooperation of national security agencies around the world, stealth deployment of malicious spyware, the phone records of pretty much every American, access to major tech company data centers, an arsenal of purchased software and hardware exploits, various odds and ends yet to be disclosed and the full support of the last two administrations just isn't enough. Now, it wants the blessing of lawmakers to do even more than it already does. Which is quite a bit, actually.
The NSA runs sophisticated hacking operations all over the world. A Washington Post report showed that the NSA carried out 231 “offensive” operations in 2011 - and that number has surely grown since then. That report also revealed that the NSA runs a $652m project that has infected tens of thousands of computers with malware.
That was four years ago -- a lifetime for an agency with the capabilities the NSA possesses. Anyone who claims the current numbers are lower is probably lobbying for increased power. And they likely don't believe it themselves; they just act like they do.
Unfortunately, legislators may be in a receptive mood. CISA -- CISPA rebranded -- is back on the table. The recent Sony hack, which caused millions of dollars of embarrassment, has gotten more than a few of them fired up about the oft-deployed term "cybersecurity." Most of those backing this legislation don't seem to have the slightest idea (or just don't care) how much collateral damage it will cause or the extent to which they're looking to expand government power.
The NSA knows, and it wants this bill to sail through unburdened by anything more than its requests for permission to fire.
The bill will do little to stop cyberattacks, but it will do a lot to give the NSA even more power to collect Americans’ communications from tech companies without any legal process whatsoever. The bill’s text was finally released a couple days ago, and, as EFF points out, tucked in the bill were the powers to do the exact type of “offensive” attacks for which Rogers is pining.
In the meantime, Section 215 languishes slightly, as Trevor Timm points out. But that's the least of the NSA's worries. It has tech companies openly opposing its "collect everything" approach. Apple and Google are both being villainized by security and law enforcement agencies for their encryption-by-default plans. More and more broad requests for user data are being challenged, and (eventually) some of the administration's minor surveillance tweaks will be implemented.
Section 215 may die. (Or it may keep on living even in death, thanks to some ambiguous language in the PATRIOT Act.) But I would imagine the bulk phone metadata is no longer a priority for the NSA. It has too many other programs that harvest more and face fewer challenges. The NSA wants to be a major cyberwar player, which is something that will only increase its questionable tactics and domestic surveillance efforts. If it gets its way via CISA, it will be able to make broader and deeper demands for information from tech companies. Under the guise of "information sharing," the NSA will collect more and share less. And what it does share will be buried under redactions, gag orders and chants of "national security." Its partnerships with tech companies will bear a greater resemblance to parasitic relationships than anything approaching equitable, especially when these companies will have this "sharing" foisted upon them by dangerously terrible legislation.
But until it reaches that point, the NSA will keep claiming it's under-equipped to handle the modern world. And it will continue to make the very dubious claim that the best defense is an unrestrained offense.
As I noted earlier this week, at the launch of the Copia Institute a couple of weeks ago, we had a bunch of really fascinating discussions. I've already posted the opening video and explained some of the philosophy behind this effort, and today I wanted to share with you the discussion that we had about free expression and the internet, led by three of the best people to talk about this issue: Michelle Paulson from Wikimedia; Sarah Jeong, a well-known lawyer and writer; and Dave Willner who heads up "Safety, Privacy & Support" at Secret after holding a similar role at Facebook. I strongly recommend watching the full discussion before just jumping into the comments with your assumptions about what was said, because for the most part it's probably not what you think:
Internet platforms and free expression have a strongly symbiotic relationship -- many platforms have helped expand and enable free expression around the globe in many ways. And, at the same time, that expression has fed back into those online platforms making them more valuable and contributing to the innovation that those platforms have enabled. And while it's easy to talk about government attacks on freedom of expression and why that's problematic, things get really tricky and really nuanced when it comes to technology platforms and how they should handle things. At one point in the conversation, Dave Willner made a point that I think is really important to acknowledge:
I think we would be better served as a tech community in acknowledging that we do moderate and control. Everyone moderates and controls user behavior. And even the platforms that are famously held up as examples... Twitter: "the free speech wing of the free speech party." Twitter moderates spam. And it's very easy to say "oh, some spam is malware and that's obviously harmful" but two things: One, you've allowed that "harm" is a legitimate reason to moderate speech and two, there's plenty of spam that's actually just advertising that people find irritating. And once we're in that place, it is the sort of reflexive "no restrictions based on the content of speech" sort of defense that people go to? It fails. And while still believing in free speech ideals, I think we need to acknowledge that that Rubicon has been crossed and that it was crossed in the 90s, if not earlier. And the defense of not overly moderating content for political reasons needs to be articulated in a more sophisticated way that takes into account the fact that these technologies need good moderation to be functional. But that doesn't mean that all moderation is good.
This is an extremely important, but nuanced, point that you don't often hear in these discussions. Just today, over at Index on Censorship, there's an interesting article by Padraig Reidy that makes a somewhat similar point, noting that there are many free speech issues where it is silly to deny that they're free speech issues, but plenty of people do. The argument, then, is that we'd be able to have a much more useful conversation if people would admit:
Don't say "this isn't a free speech issue"; rather, say "this is a free speech issue, and I'm OK with this amount of censorship, for this reason." Then we can talk.
Soon after this, Sarah Jeong makes another, equally important, if equally nuanced, point about the reflexive response by some to behavior they don't like: automatically calling for the blocking of speech, often confusing speech with behavior. She discusses how harassment, for example, is an obvious and very real problem with serious and damaging real-world consequences (for everyone, beyond just those being harassed), but that it's wrong to think we should just immediately look for ways to shut people up:
Harassment actually exists and is actually a problem -- and actually skews heavily along gender lines and race lines. People are targeted for their sexuality. And it's not just words online. It ends up being a seemingly innocuous, or rather "non-real" manifestation, when in fact it's linked to real world stalking or other kinds of abuse, even amounting to physical assault, death threats, so and so forth. And there's a real cost. You get less participation from people of marginalized communities -- and when you get less participation from marginalized communities, you lead to a serious loss in culture and value for society. For instance, Wikipedia just has fewer articles about women -- and also its editors just happen to skew overwhelmingly male. When you have great equality on online platforms, you have better social value for the entire world.
That said, there's a huge problem... and it's entering the same policy stage that was prepped and primed by the DMCA, essentially. We're thinking about harassment as content when harassment is behavior. And we're jumping from "there's a problem, we have to solve it" and the only solution we can think of is the one that we've been doling out for copyright infringement since the aughties, and that's just take it down, take it down, take it down. And that means people on the other end take a look at it and take it down. Some people are proposing ContentID, which is not a good solution. And I hope I don't have to spell out why to this room in particular, but essentially people have looked at the regime of copyright enforcement online and said "why can't we do that for harassment" without looking at all the problems that copyright enforcement has run into.
And I think what's really troubling is that copyright is a specific exception to CDA 230 and in order to expand a regime of copyright enforcement for harassment you're going to have to attack CDA 230 and blow a hole in it.
She then noted that this was a major concern because there's a big push among many people who aren't arguing for better free speech protections:
That's a huge viewpoint out right now: it's not that "free speech is great and we need to protect against repressive governments" but that "we need better content removal mechanisms in order to protect women and minorities."
From there the discussion went in a number of different important directions, looking at alternatives and ways to deal with bad behavior online that go beyond just "take it down, take it down," and also discussing the importance of platforms being able to make decisions about how to handle these issues without facing legal liability. CDA 230, not surprisingly, was a big topic -- one that people admitted was unlikely to spread to other countries, and whose underlying concepts are actually under attack in many places.
That's why I also think this is a good time to point to a new project from the EFF and others, known as the Manila Principles -- highlighting the importance of protecting intermediaries from liability for the speech of their users. As that project explains:
All communication over the Internet is facilitated by intermediaries such as Internet access providers, social networks, and search engines. The policies governing the legal liability of intermediaries for the content of these communications have an impact on users’ rights, including freedom of expression, freedom of association and the right to privacy.
With the aim of protecting freedom of expression and creating an enabling environment for innovation, which balances the needs of governments and other stakeholders, civil society groups from around the world have come together to propose this framework of baseline safeguards and best practices. These are based on international human rights instruments and other international legal frameworks.
In short, it's important to recognize that these are difficult issues -- but that freedom of expression is extremely important. And we should recognize that while pretty much all platforms contain some form of moderation (even in how they are designed), we need to be wary of reflexive responses to just "take it down, take it down, take it down" in dealing with real problems. Instead, we should be looking for more reasonable approaches to many of these issues -- not in denying that there are issues to be dealt with. And not just saying "anything goes and shut up if you don't like it," but that there are real tradeoffs to the decisions that tech companies (and governments) make concerning how these platforms are run.
Well, this is (potentially) good news. New York is going forward with the first "right to repair" bill in the nation, as pointed out on Twitter by Amanda Levendowski. The bill will allow constituents to bypass manufacturer-authorized dealers/repair centers and use smaller (and cheaper) repair outlets. Or, if neither seems within the price range, they're more than welcome to perform these repairs -- using previously-hidden manufacturer specs and instructions -- themselves.
Perhaps the best thing about the bill (if it passes with as few loopholes as possible) is that it will eliminate the sort of ridiculousness that has been the end result of this tight grip on repair "permission." Like Immigration and Customs Enforcement (ICE) raiding repair shops for using aftermarket products. Or teens being sued by multi-billion dollar companies for doing the same. Or local governments requiring unrelated licenses to be obtained before a person can start offering repairs.
Here's what's being authorized before the exceptions kick in. (ALL CAPS in the original.)
MANUFACTURERS OF DIGITAL ELECTRONIC PARTS AND MACHINES SOLD OR USED IN THE STATE OF NEW YORK SHALL:
I. MAKE AVAILABLE FOR PURCHASE BY INDEPENDENT REPAIR FACILITIES OR OTHER OWNERS OF PRODUCTS MANUFACTURED BY SUCH MANUFACTURER DIAGNOSTIC AND REPAIR INFORMATION, INCLUDING REPAIR TECHNICAL UPDATES, UPDATES AND CORRECTIONS TO FIRMWARE, AND RELATED DOCUMENTATION, IN THE SAME MANNER SUCH MANUFACTURER MAKES AVAILABLE TO ITS AUTHORIZED REPAIR CHANNEL. EACH MANUFACTURER SHALL PROVIDE ACCESS TO SUCH MANUFACTURER'S DIAGNOSTIC AND REPAIR INFORMATION SYSTEM FOR PURCHASE BY OWNERS AND INDEPENDENT REPAIR FACILITIES UPON FAIR AND REASONABLE TERMS; AND
II. MAKE AVAILABLE FOR PURCHASE BY THE PRODUCT OWNER, OR THE AUTHORIZED AGENT OF THE OWNER, SUCH SERVICE PARTS, INCLUSIVE OF ANY UPDATES TO THE FIRMWARE OF THE PARTS, FOR PURCHASE UPON FAIR AND REASONABLE TERMS…
EACH MANUFACTURER OF DIGITAL ELECTRONIC PRODUCTS SOLD OR USED IN THE STATE OF NEW YORK SHALL MAKE AVAILABLE FOR PURCHASE BY OWNERS AND INDEPENDENT REPAIR FACILITIES ALL DIAGNOSTIC REPAIR TOOLS INCORPORATING THE SAME DIAGNOSTIC, REPAIR AND REMOTE COMMUNICATIONS CAPABILITIES THAT SUCH MANUFACTURER MAKES AVAILABLE TO ITS OWN REPAIR OR ENGINEERING STAFF OR ANY AUTHORIZED REPAIR CHANNELS. EACH MANUFACTURER SHALL OFFER SUCH TOOLS FOR SALE TO OWNERS AND TO INDEPENDENT REPAIR FACILITIES UPON FAIR AND REASONABLE TERMS.
That's the good part. But there are potential loopholes in the bill already, including a major exception for one of the most tight-lipped industries: auto manufacturers.
NOTHING IN THIS SECTION SHALL APPLY TO MOTOR VEHICLE MANUFACTURERS OR MOTOR VEHICLE DEALERS AS DEFINED IN THIS SECTION.
If any industry needs to be covered under a "right to repair," it's the auto industry, which has continually abused intellectual property laws to keep the general public from diagnosing their own vehicles in order to perform their own repairs.
There's other potential bad news in there as well.
NOTHING IN THIS SECTION SHALL BE CONSTRUED TO REQUIRE A MANUFACTURER TO DIVULGE A TRADE SECRET.
Yeah. Guess what's going to start being declared a "trade secret"? Probably almost everything the bill orders manufacturers to make available to the public. Even if this bill passes, there's going to be a ton of litigation over what does and does not constitute a "trade secret." In the meantime, the public will be no better off than it was before the bill's passage.
And there's this exception, which would seem to pick up whatever slack "trade secrets" can't.
NOTHING IN THIS SECTION SHALL BE CONSTRUED TO REQUIRE MANUFACTURERS OR AUTHORIZED REPAIR PROVIDERS TO PROVIDE AN OWNER OR INDEPENDENT REPAIR PROVIDER ACCESS TO NON-DIAGNOSTIC AND REPAIR INFORMATION PROVIDED BY A MANUFACTURER TO AN AUTHORIZED REPAIR PROVIDER PURSUANT TO THE TERMS OF AN AUTHORIZING AGREEMENT.
"Non-diagnostic" could become the new "diagnostic." And the use of the word "and" seems to make "repair information" off-limits if any agreements are already in place with authorized dealers and repair shops.
There's also a good chance the bill's "fair and reasonable terms" will be construed as permission to price independent repair shops and the general public out of the market. Legislators obviously can't set base prices (or even determine a fair market price -- that information is kept under wraps as well), so the suggestion of a "fair" price is open to advantageous interpretation. There's an attempt to set some limits in the bill's definitions, with the most significant one being "THE ABILITY OF AFTERMARKET TECHNICIANS OR SHOPS TO AFFORD THE INFORMATION," but that, again, is going to generate a lot of friction (possibly of the litigious variety) when manufacturers and the rest of the public repeatedly fail to agree on the definition of "affordable."
Still, it's more than most governments are willing to attempt. Massachusetts passed one in 2013 -- one that targeted auto manufacturers and dealers. It met with the usual resistance from the auto industry (both ends) but gathered 86% of the public's votes, clearly signaling unhappiness with the automakers' closed systems. A federal "right to repair" law has been mooted several times, but has never gained significant traction.
If this bill is going to succeed as a law, legislators need to do some loophole stitching pre-passage, and regulators will need to keep a very close eye on reticent manufacturers after it becomes law.
Since the Snowden leaks began, there have been several efforts made -- legislative and administrative -- in response to the exposure of the NSA's domestic surveillance programs. Some have been real fixes. Some have been fake fixes. Others have targeted the thing the NSA desires even more than seemingly limitless access to data from all over the world: funding.
The bill would completely repeal the Patriot Act, the sweeping national security law passed in the days after Sept. 11, 2001, as well as the 2008 FISA Amendments Act, another spying law that the NSA has used to justify collecting vast swaths of people's communications through the Internet.
If anything's due for a complete revamp, if not a complete repeal, it's the Patriot Act. It wasn't even good legislation back when it was passed. At best, it was "timely," which is a term that gives the rushed, secretive, knee-jerk legislation far more credit than it deserves. Pocan and Massie's (the latter of whom has just introduced a new phone-unlocking bill with Rep. Zoe Lofgren to replace the bad one passed by the House in 2014) "Surveillance State Repeal Act" doesn't waste any time "tinkering around the edges."
Not only would the bill repeal the law, it would reset anything (amendments/additional government powers) brought into force by the Patriot Act and the FISA Amendments Act of 2008. On top of that, it would demand the immediate deletion of tons of data from the NSA's collections.
DESTRUCTION OF CERTAIN INFORMATION.—The Director of National Intelligence and the Attorney General shall destroy any information collected under the USA PATRIOT Act (Public Law 107-56) and the amendments made by such Act, as in effect the day before the date of the enactment of this Act, concerning a United States person that is not related to an investigation that is actively ongoing on such date.
The bill, oddly, also describes a path towards FISA Judge For Life positions.
TERMS; REAPPOINTMENT.—Section 103(d) of the Foreign Intelligence Surveillance Act of 1978 (50 U.S.C. 1803(d)) is amended— (1) by striking ‘‘maximum of seven’’ and inserting ‘‘maximum of ten’’; and (2) by striking ‘‘and shall not be eligible for re-designation’’.
Which is fine (not really) if you like the judges already appointed. But this is the sort of thing that leads to the permanent appointment of judges favored by either side of the surveillance question. And so far, presidential administrations have come down in favor of domestic surveillance. Removing the term limits just encourages the appointment of permanent NSA rubber stamps.
The bill creates a warrant requirement for the acquisition of US persons' data under the FISA Amendments Act and Executive Order 12333. It also expressly forbids a government mandate for encryption backdoors, although the first sentence of this section seems to be a rather large loophole.
Notwithstanding any other provision of law, the Federal Government shall not mandate that the manufacturer of an electronic device or software for an electronic device build into such device or software a mechanism that allows the Federal Government to bypass the encryption or privacy technology of such device or software.
If this bill somehow manages to pass a round or two of scrutiny, language tweaks will certainly be requested -- possibly leading to a complete subversion of the bill's intent. But that's a huge "if." Very few legislators have the stomach to gut the Patriot Act or the FISA Amendments Act. Many will be happy to entertain smaller fixes, but most won't be willing to essentially strip the NSA of its domestic surveillance powers. No one wants to be the "yea" vote that's pointed to in the wake of a terrorist attack, and only a few are actually willing to go head-to-head with the intelligence agency.
from the all-the-'news'-that's-fit-to-cram-into-a-24-hour-sprawl dept
CNN and Fox had the market cornered on ridiculous airplane crash theories, up until recently. When Malaysia Airlines Flight 370 just up and vanished, CNN produced wall-to-wall coverage seemingly cribbed from low-rent conspiracy theory sites. UFO? Black hole? Any and all theories were entertained.
Fox News hasn't exactly been the epitome of restraint, either. While it managed to avoid following CNN down these plane crash rabbit holes, it too has entertained some theories better left to operations that don't claim "news" to be a major part of their offerings. Fox News host Anna Kooiman suggested the metric system was to blame, what with kilometers being different than miles and Celsius and Fahrenheit not seeing eye-to-eye, potentially leading to some sort of in-flight calculation error.
“There’s one possibility that no one has brought up, and I wonder could this be a hacking incident?” former commercial pilot Jay Rollins told MSNBC’s Diaz-Balart. “This is very similar in my mind to what happened when the U.S. lost that drone over Iran. The same thing, suddenly the aircraft was responding to outside forces…"
Rollins said that the plane’s descent was “worrisome” because “it makes me think about hacking, some sort of interference into the computer system.”
Or not. Security researcher Hugo Teso's demonstration involved sending flight instructions to airborne planes (in a simulated environment, of course) via ACARS (Aircraft Communications Addressing and Reporting System) to the FMS (Flight Management System). But there were multiple problems with his plan. First, the flight computer has to accept the new instructions and, second, pilots would have to be unable to override bad instructions. Neither of which is a distinct possibility.
The problem is, the FMS — and certainly not ACARS — does not directly control an airplane the way people think it does, and the way, with respect to this story, media reports are implying. Neither the FMS nor the autopilot flies the plane. The crew flies the plane through these components. We tell it what to do, when to do it, and how to do it. Whatever data finds its way into the FMS, and regardless of where it’s coming from, it still needs to make sense to the crew. If it doesn’t, we’re not going to allow the plane, or ourselves, to follow it.
The sorts of disruptions that might arise aren’t anything a crew couldn’t notice and easily override. The FMS cannot say to the plane, “descend toward the ground now!” or “Slow to stall speed now!” or “Turn left and fly into that building!” It doesn’t work that way. What you might see would be something like an en route waypoint that would, if followed, carry you astray of course, or an altitude that’s out of whack with what ATC or the charts tells you it ought to be. That sort of thing. Anything weird or unsafe — an incorrect course or altitude — would be corrected very quickly by the pilots.
So, the problem isn't that hacking is impossible. It's just very, very unlikely. And in this case, hacking had nothing to do with the plane crash.
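The two preconditions from Teso's demonstration can be boiled down to a simple gate, which is why crew-in-the-loop design defeats the attack. A minimal sketch (the function name and inputs are hypothetical, purely for illustration; this is not avionics code):

```python
# Illustrative only: a spoofed ACARS-to-FMS instruction steers the plane
# only if BOTH conditions hold -- the flight computer accepts it AND the
# crew fails to override it.
def spoofed_instruction_effective(fms_accepts: bool, crew_overrides: bool) -> bool:
    """Return True only if a bogus instruction would actually take effect."""
    return fms_accepts and not crew_overrides

# In Teso's simulated environment, both conditions can be forced:
print(spoofed_instruction_effective(fms_accepts=True, crew_overrides=False))  # True

# On a real flight deck, pilots review FMS data and override anything
# that doesn't make sense, so the attack goes nowhere:
print(spoofed_instruction_effective(fms_accepts=True, crew_overrides=True))   # False
```

The point of the sketch is that the second condition is the one the breathless coverage ignores: the FMS suggests, the crew decides.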
No, the problem is that news agencies looking to wring every bit of ratings possible from a tragedy are willing to make viewers stupider under the guise of "news." When facts just aren't available, 24-hour news teams lean heavily on whatever theory will provide the most entertainment (for lack of a better word). Former pilot Jay Rollins may have three decades of experience, but his speculation draws on none of it. Instead, it just takes a bit of what's selling right now (anything "cyber") and what has always sold (fear) and leaves the viewers with less information than they would have obtained by skipping the coverage completely. The truth, however, is simultaneously more horrific than the "hacked plane" theory (in that there's little that can be done to thwart a pilot determined to crash a plane) and more mundane -- at least in terms of "exciting" news coverage.
For many years, we've noted that while some in the legacy entertainment industry seem to think that there's a "battle" between "Hollywood" and "Silicon Valley" it's a very weird sort of war in which one of those parties -- Silicon Valley -- keeps supplying more and more "weapons" to the other party to help it adapt and succeed in a changing world. There are many examples of this, but the clearest is with the VCR, which the MPAA fought hard to outlaw in the 1970s and 1980s. The MPAA's Jack Valenti famously said in 1982 that "the VCR is to the American film producer and the American public as the Boston strangler is to the woman home alone." It was just four years later that home video revenue surpassed box office revenue for Hollywood. It wasn't the Boston strangler, it was the savior. Similar stories can be told elsewhere. The legacy entertainment industry has sued over MP3 players and YouTube, yet has now (finally) embraced online music and video years later than it should have.
And yet, that same legacy industry keeps trying to do everything to hamstring innovation that will only help it. A few years ago, we wrote about a fantastic post (sadly now gone from the internet) by Tyler Crowley, talking about the entrepreneur's view of innovation options and how many areas are welcoming for innovation -- which he described using the analogy of islands:
For tech folks, from the 35,000' view, there are islands of opportunity. There's Apple Island, Facebook Island, Microsoft Island, among many others and yes there's Music Biz Island. Now, we as tech folks have many friends who have sailed to Apple Island and we know that it's $99/year to dock your boat and if you build anything Apple Island will tax you at 30%. Many of our friends are partying their asses off on Apple Island while making millions (and in some recent cases billions) and that sure sounds like a nice place to build a business.
But what about Music Biz Island? Not so much:
Now, we also know of Music Biz Island which is where the natives start firing cannons as you approach, and if not stuck at sea, one must negotiate with the chiefs for 9 months before given permission to dock. Those who do go ashore are slowly eaten alive by the native cannibals. As a result, all the tugboats and lighthouses (investors, advisors) warn to stay far away from Music Biz Island, as nobody has ever gotten off alive. If that wasn't bad enough, while Apple and Facebook Island are built with sea walls to protect from the rising oceans, Music Biz Island is already 5 ft under and the educated locals are fleeing for Topspin Island.
As we pointed out, this leads to the legacy entertainment companies poisoning the well that contains the innovation water it desperately needs.
There's a parallel to this in terms of copyright laws. As the legacy entertainment industry keeps pushing for more draconian copyright laws, it only serves to scare more investors away. When we get good results, like the ruling in the Cablevision case saying that cloud-based services were legal, the result was a huge growth in investment in cloud services -- in contrast to much lower investment in Europe, where the laws were a lot more ambiguous.
A new study from Fifth Era and Engine takes this finding even further, highlighting how bad or vague copyright laws are seriously scaring off investment in necessary platforms and innovation. A big part of this appears to be worries about absolutely insane statutory damages awards. The study surveyed tons of investors around the globe and found an obvious concern about investing in areas where lawsuits could so easily destroy platforms:
In all eight countries surveyed, early stage investors view the risk of uncertain and potentially large damages as of significant concern as they look to invest in [Digital Content Intermediaries]. 85% agree or strongly agree that this is a major factor in making them uncomfortable about investing in [Digital Content Intermediaries].
And they're very specific about how the direct concern involves music and videos and the threat of a lawsuit that could simply put those companies out of business:
88% of worldwide investors surveyed said they are uncomfortable investing in [Digital Content Intermediaries] that offer user generated music and video given an ambiguous regulatory framework.
This is really unfortunate on a number of different levels:
First, it limits the necessary innovation in services and business models that are likely to create the success stories of tomorrow. We need more experiments and platforms that allow places for artists and creators to create, promote, connect with fans and make money for their efforts. Yet if the legacy industry is scaring away all the investors, that's not going to happen.
Second, it locks in the few dominant players of today. Want to build the next YouTube? Good luck. You'll need lots of money to do so, but you're less likely to get it at this stage. The legacy players keep hating the big successful platforms, but don't realize that their own moves lock those players in the dominant positions.
Third, without competition in these spaces and platforms, content creators are less likely to get the best deals. When the legacy industry basically allows one player to become dominant, that player can set terms that favor itself. This is what so many from the legacy content industry are complaining about today -- without recognizing that their own actions regarding copyright law have helped create that situation.
Of course, many in those legacy industries actually see this sort of thing as a feature, not a bug, of pushing for greater copyright protectionism. They think -- ridiculously -- that by hamstringing innovation and investment they get to hold onto their perch longer. This is just wrong. It's trying to hold back the tide, while driving fans to alternative and often unauthorized platforms instead. Rather than supporting the innovation it needs, pushing for bad copyright laws only serves to alienate the innovators the industry needs the most and the biggest fans whose support the content industry needs to thrive.
from the keeping-you-safe...-or-keeping-you-vulnerable dept
Back in October, we highlighted the contradiction of FBI Director James Comey raging against encryption and demanding backdoors, while at the very same time the FBI's own website was suggesting mobile encryption as a way to stay safe. Sometime after that post went online, all of the information on that page about staying safe magically disappeared, though thankfully I screenshotted it at the time:
If you really want, you can still see that information over at the Internet Archive or in a separate press release the FBI apparently didn't track down and memory hole yet. Still, it's no surprise that the FBI quietly deleted that original page recommending that you encrypt your phones "to protect the user's personal data," because the big boss man is going around spreading a bunch of scare stories about how we're all going to be dead or crying if people actually encrypted their phones:
Calling the use of encrypted phones and computers a “huge problem” and an affront to the “rule of law,” Comey painted an apocalyptic picture of the world if the communications technology isn’t banned.
“We’re drifting to a place where a whole lot of people are going to look at us with tears in their eyes,” he told the House Appropriations Committee, describing a hypothetical in which a kidnapped young girl’s phone is discovered but can’t be unlocked.
So, until recently, the FBI was actively recommending you encrypt your data to protect your safety -- and yet, today it's "an affront to the rule of law." Is this guy serious?
More directly, this should raise serious questions about what Comey thinks his role is at the FBI (or what the FBI's role is for the country). Is it to keep Americans safe -- or is it to undermine their privacy and security just so it can spy on everyone?
Not surprisingly, Comey pulls out the trifecta of FUD in trying to explain why the FBI needs to spy on everyone: pedophiles, kidnappers and drug dealers:
“Tech execs say privacy should be the paramount virtue,” Comey continued. “When I hear that I close my eyes and say try to imagine what the world looks like where pedophiles can’t be seen, kidnappers can’t be seen, drug dealers can’t be seen.”
Except we know exactly what that looks like -- because that's the world we've basically always lived with. And yet, law enforcement folks like the FBI and various police departments were able to use basic detective work to track down criminals.
If you want to understand just how ridiculous Comey's arguments are, simply replace his desire for unencrypted devices with video cameras in every corner of your home that stream directly into the FBI. Same thing. Would that make it easier for the FBI to solve some crimes? Undoubtedly. Would it be a massive violation of privacy and put many more people at risk? Absolutely.
It's as if Comey has absolutely no concept of a cost-benefit analysis. Judging by what he has to say, all "bad people" must be stopped, even if it means destroying all of our freedoms. That's insane -- and it raises serious questions about his competence to lead a government agency charged with protecting the Constitution.