Rep. Mark Takano Introduces Bill That Would Keep Companies From Blocking Defendants' Access To Evidence
from the we're-from-the-private-sector-and-we're-here-to-help-violate-your-rights dept
When the government doesn’t want to talk about its law enforcement tech, it dismisses cases. The FBI has done this on several occasions. First, it told local law enforcement to dismiss cases rather than discuss Stingray use in court. Then it did the same thing with its homegrown malware in child porn cases.
But the government can’t do everything itself. It purchases software and outsources forensic investigation. All well and good except when it comes to prosecutions. Defendants have a right to access the evidence being used against them. But in court cases where third-party tech is in play, private companies are inserting themselves into the proceedings to demand the courts protect their “trade secrets.”
Obviously, this makes a mockery of the adversarial system. If defendants can’t challenge the evidence being used against them, the government will be encouraged to stack the deck in its favor by outsourcing as much of its forensic and investigative work as possible.
Fortunately, someone is actually trying to do something about this. Rep. Mark Takano (California) is introducing a bill that would prevent tech companies from helping the federal government screw criminal defendants out of their Constitutional rights.
Takano’s Justice in Forensic Algorithms Act of 2019 was introduced with this rather clever tweet, featuring a bit of pseudo-coding to drive the point home.
To address this injustice, I introduced the Justice in Forensic Algorithms Act.
This bill ensures that defendants can access source code to challenge evidence used against them. It also sets standards and testing to assess whether forensic algorithms are fair enough to be used. pic.twitter.com/nkkDCi577Y
— Mark Takano (@RepMarkTakano) September 17, 2019
If the government is using third-party tech to prosecute citizens, citizens shouldn’t be denied access to information just because some company thinks any examination at all might undercut its market advantage.
“The trade secrets privileges of software developers should never trump the due process rights of defendants in the criminal justice system,” said Rep. Mark Takano. “Our criminal justice system is an adversarial system. As part of this adversarial system, defendants are entitled to confront and challenge any evidence used against them. As technological innovations enter our criminal justice system, we need to ensure that they don’t undermine these critical rights. Forensic algorithms are black boxes, and we need to be able to look inside to understand how the software works and to give defendants the ability to challenge them. My legislation will open the black box of forensic algorithms and establish standards that will safeguard our Constitutional right to a fair trial.”
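The "black box" problem Takano describes is easy to illustrate with a toy sketch. The function below is not any real forensic product; it's a hypothetical match classifier whose reported verdict hides a hard-coded threshold. Without source access, a defendant only sees "match" or "inconclusive"; with it, a defense expert can probe the decision boundary and show how little separates the two outcomes.

```python
# Toy illustration (hypothetical, not any actual forensic software):
# a "black box" classifier whose flat verdict conceals a tunable cutoff.

def match_verdict(overlap: float, threshold: float = 0.6) -> str:
    """Collapse a raw overlap score (0.0-1.0) into a categorical verdict."""
    if overlap >= threshold:
        return "match"          # everything past the cutoff is reported identically
    return "inconclusive"

# Source access lets a defense expert probe the boundary directly:
print(match_verdict(0.61))      # "match"
print(match_verdict(0.59))      # "inconclusive" -- a 0.02 swing flips the verdict
```

The point of the sketch: the choice of `threshold` is an engineering decision buried in the code, not a scientific certainty, and only source-level review exposes it to challenge.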
Congress can’t force the courts to side with defendants in cases where access to third-party software is at stake. But it can prevent companies from invoking trade secret privileges to keep defendants from accessing evidence. And the bill goes further than just blocking trade secret assertions. It would also create a national standard for forensic algorithms to ensure they are robust and fair. And that they actually do what they say they do.
This process could bring a bit more science to a field that’s been mostly mumbo and/or jumbo. And it won’t allow law enforcement to create their own forensic black boxes to replace the ones they used to purchase from third parties. It will require input from a number of parties not in the law enforcement profession, ensuring this won’t end up being another half-assed effort that shores up the government’s belief that all accused parties are guilty until proven guilty.
Directs NIST to establish Computational Forensic Algorithms Standards and a Computational Forensic Algorithms Testing Program and requires federal law enforcement to comply with these standards and testing requirements in their use of forensic algorithms. In developing standards NIST is directed to:
– collaborate with outside experts in forensic science, bioethics, algorithmic discrimination, data privacy, racial justice, criminal justice reform, exonerations, and other relevant areas of expertise identified through public input;
– address the potential for disparate impact across protected classes in standards and testing; and
– gather public input for the development of the standards and testing program and publicly document the resulting standards and testing of software.
This part could take a while to get up and running. But it’s far better than the system currently in use, which has allowed the government’s expert forensic witnesses to overstate the certainty of their findings for years on end.
The more immediate effect will be the constraints placed on private companies who wish to intercede in criminal cases. The government — working with its vendors — will be obligated to provide defendants with a report on the software used, an executable version of the software itself, and its source code. If companies are worried their trade secrets might be exposed in criminal cases, they might want to rethink their partnerships and decide whether the tradeoffs they have to make in court to continue doing business with the government are worth it.