NSA, NIST, and post-quantum crypto: my second lawsuit against the US government

jcranmer

If anyone is curious, the courtlistener link for the lawsuit is here: https://www.courtlistener.com/docket/64872195/bernstein-v-na...

(And somebody has already kindly uploaded the documents to RECAP, so it costs you nothing to access.)

Aside: I really wish people would link to court documents whenever they talk about an ongoing lawsuit.

Natsu

> Aside: I really wish people would link to court documents whenever they talk about an ongoing lawsuit.

I just want to second that and thank you for the link. Most reporting is just horribly bad at covering legal stuff because all the stuff that makes headlines that people click on is mostly nonsense.

AndyMcConachie

And a big thank you to the wonderful people at the Free Law Project for giving us the ability to find and link to this stuff. They're a non-profit and they accept donations. (hint hint)

tptacek

It's just a vanilla FOIA lawsuit, of the kind hundreds of people file every month when public bodies fuck up FOIA.

If NIST puts up any kind of fight (I don't know why they would), it'll be fun to watch Matt and Wayne, you know, win a FOIA case. There's a lot of nerd utility in knowing more about how FOIA works!

But you're not going to get the secrets of the Kennedy assassination by reading this thing.

chasil

I will draw to your attention two interesting facts.

First, OpenSSH has disregarded the winning (CRYSTALS) variants and implemented a hybrid Streamlined NTRU Prime scheme instead. The Bernstein blog post discusses hybrid designs.

"Use the hybrid Streamlined NTRU Prime + x25519 key exchange method by default ("sntrup761x25519-sha512@openssh.com"). The NTRU algorithm is believed to resist attacks enabled by future quantum computers and is paired with the X25519 ECDH key exchange (the previous default) as a backstop against any weaknesses in NTRU Prime that may be discovered in the future. The combination ensures that the hybrid exchange offers at least as good security as the status quo."

https://www.openssh.com/releasenotes.html
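
To make "hybrid" concrete: below is a minimal Python sketch of the idea, not OpenSSH's actual KDF or wire format. The session secret is derived from both a post-quantum KEM secret and a classical X25519 secret, so an attacker has to break both. The NTRU Prime step is stubbed out with a hypothetical helper (there is no standard-library implementation); the X25519 part assumes the pyca/cryptography package.

    import hashlib
    import os

    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    def hypothetical_sntrup761_shared_secret() -> bytes:
        # Placeholder for the 32-byte secret a real NTRU Prime
        # encapsulation/decapsulation would agree on.
        return os.urandom(32)

    # Classical part: ordinary X25519 ECDH between two parties.
    client_ecdh = X25519PrivateKey.generate()
    server_ecdh = X25519PrivateKey.generate()
    ecdh_secret = client_ecdh.exchange(server_ecdh.public_key())
    assert ecdh_secret == server_ecdh.exchange(client_ecdh.public_key())

    # Post-quantum part (stubbed): both sides would hold the same KEM secret.
    pq_secret = hypothetical_sntrup761_shared_secret()

    # Combine: hash the concatenation, so the result stays secret as long as
    # at least one of the two inputs does.
    session_secret = hashlib.sha512(pq_secret + ecdh_secret).digest()
    print(session_secret.hex()[:32], "...")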

Second, Daniel Bernstein has filed a public complaint against the NIST process, and the FOIA stonewalling adds more concern and doubt that the current results are fair.

https://www.google.com/url?q=https://groups.google.com/a/lis...

What are the aims of the lawsuit? Can the NIST decision on crystals be overturned by the court, and is that the goal?

djmdjm

We (OpenSSH) haven't "disregarded" the winning variants, we added NTRU before the standardisation process was finished and we'll almost certainly add the NIST finalists fairly soon.

tptacek

What are the aims of the lawsuit? NIST fucked up a FOIA response. The thing you do when a public body gives you an unsatisfactory FOIA response is that you sue them. I've been involved in similar suits. I'd be surprised if NIST doesn't just cough up the documents to make this go away.

"Can NIST's decisions on crystals be overturned by the court?" Let me help you out with that: no, you can't use a FOIA suit to "overturn" a NIST contest.

OpenSSH implemented NTRU-Prime? What's your point? That we should just do whatever the OpenSSH team decides to do? I almost agree! But then, if that's the case, none of this matters.

LinuxBender

It's not the first time either, and it won't be the last. NIST chose Rijndael over Serpent for the AES standard even though Serpent won. I vaguely recall they gave some smarmy answer. I don't think anyone submitted a FOIA request, not that it would matter. I've been through that bloated semi-pseudo process and saw how easy it was to stall people rather than answer a simple question.

Thorrez

>What are the aims of the lawsuit? Can the NIST decision on crystals be overturned by the court, and is that the goal?

It sounds to me like the goal is to find out if there's any evidence of the NSA adding weaknesses into any of the algorithms. That information would allow people to avoid using those algorithms.

tptacek

I may believe almost all of this is overblown and silly, as like a matter of cryptographic research, but I'll say that Matt Topic and Merrick Wayne are the real deal, legit the lawyers you want working on something like this, and if they're involved, presumably some good will come out of the whole thing.

Matt Topic is probably best known as the FOIA attorney who got the Laquan McDonald videos released in Chicago; I've been peripherally involved in some work he and Merrick Wayne did for a friend, in a pretty technical case that got fierce resistance from CPD, and those two were on point. Whatever else you'd say about Bernstein here, he knows how to pick a FOIA lawyer.

A maybe more useful way to say the same thing is: if Matt Topic and Merrick Wayne are filing this complaint, you should probably put your money on them having NIST dead-to-rights with the FOIA process stuff.

daneel_w

> "I may believe almost all of this is overblown and silly, as like a matter of cryptographic research ..."

Am I misunderstanding you, or are you saying that you believe almost all of DJB's statements claiming that NIST/NSA is doctoring cryptography is overblown and silly? If that's the case, would you mind elaborating?

tptacek

I believe the implication that NIST or NSA somehow bribed one of the PQC researchers to weaken a submission is risible.

I believe that NIST is obligated to be responsive to FOIA requests, even if the motivation behind those requests is risible.

jeffparsons

> I believe the implication that NIST or NSA somehow bribed one of the PQC researchers to weaken a submission is risible.

Is that even a claim here? I'm on mobile right now so it's a bit hard for me to trawl through the DJB/NIST dialogue, but I thought his main complaint is that NIST didn't appear to have a proper and clear process for choosing the algorithms they did, when arguably better algorithms were available.

So the suggestion wouldn't necessarily be that one of the respected contestants was bribed or otherwise compromised, but rather that NIST may have been tapped on the shoulder by NSA (again) with the suggestion that they should pick a specific algorithm, and that NSA would make that suggestion because their own cryptographers ("true believers" on NSA payroll) have discovered flaws in those suggested algorithms that they believe NSA can exploit but adversaries hopefully cannot.

There's no need for any novel conspiracies or corruption; merely an exact repeat of previous NSA/NIST behaviour consistent with NSA policy positions.

It's simultaneously about as banal as it gets, and deeply troubling because of that.

ckastner

> I believe the implication that NIST or NSA somehow bribed one of the PQC researchers to weaken a submission is risible.

Could you elaborate on this? I didn't get this from the article at all. There's no researcher(s) being implicated as far as I can tell.

What I read is the accusation of NIST's decision-making process possibly being influenced by the NSA, something that we know has happened before.

Say N teams of stellar researchers submit proposals, and they review their peers. For the sake of argument, let's say that no flaw is found in any proposal; every single one is considered perfect.

NIST then picks algorithm X.

It is critical to understand the decision-making process behind picking X, especially when the decision-making body has a history of collusion.

Because even if all N proposals are considered perfect by all possible researchers, if the NSA did influence NIST in the process, history would suggest that X would be the least trustworthy of all the proposals.

And that's the main argument I got from the article.

Yes, stonewalling a FOIA request may be common, but in the case of NIST, there is ample precedent for malfeasance.

jmprspret

I believe you have a very naive and trusting view of these US governmental bodies. I don't intend that as an insult, but by now I think the verdict is in: these agencies cannot be trusted (the NSA even less than NIST).

Semaphor

> risible

just in case someone else never heard this word before:

> arousing or provoking laughter

api

I don't think it's a bad thing to push back and demand transparency. At the very least the pressure helps keep NIST honest. Keep reminding them over and over and over again about dual-EC and they're less likely to try stupid stuff like that again.

xt00

Speaking of dual-EC -- two questions seem to be debated often, though it can't be neglected that some of the vocal debaters may be NSA shills:

1. does the use of standards actually help people, or make it easier for the NSA to determine which encryption method was used?

2. are there encryption methods that actually do not suffer from reductions in randomness or entropy etc when just simply running the algorithm on the encrypted output multiple times?

It seems that these questions often have piles of people ready to jump in saying "oh, don't roll your own encryption, ooh scary... fear uncertainty doubt... and oh whatever you do, don't encrypt something 3X that will probably make it easier to decrypt!!" .. but it would be great if some neutral 3rd party could basically say, ok here is an algorithm that is ridiculously hard to break, and you can crank up the number of bits to a super crazy number.. and then also you can run the encryption N times and just not knowing the number of times it was encrypted would dramatically increase the complexity of decryption... but yea how many minutes before somebody jumps in saying -- yea, don't do that, make sure you encrypt with a well known algorithm exactly once.. "trust me"...

tptacek

1. Formal, centralized crypto standards, be they NIST or IETF, are a force for evil.

2. All else equal, fewer dependencies on randomness are better. But all else is not equal, and you can easily lose security by adding determinism to designs willy-nilly in an effort to minimize randomness dependencies.

Nothing is, any time in the conceivable future, going to change to make a broken RNG not game-over. So the important thing remains ensuring that there's a sound design for your RNG.

None of our problems have anything to do with how "much" you encrypt something, or with "cranking up the number of bits". That should be good news for you; generally, you can run ChaPoly or AES-CTR and trust that a direct attack on the cipher isn't going to be an issue for you. Most of our problems are in the joinery, not the beams themselves.

Thorrez

>2. are there encryption methods that actually do not suffer from reductions in randomness or entropy etc when just simply running the algorithm on the encrypted output multiple times?

I think all block ciphers (e.g. AES) meet that definition. For AES, for a specific key, there's a 1-to-1 mapping of plaintexts to ciphertexts. It's impossible that running a plaintext through AES produces a ciphertext with less entropy, because if the ciphertext had less entropy, it would be impossible to decrypt to get back the plaintext, but AES always allows decryption.
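
A quick way to see that concretely, assuming a recent version of the pyca/cryptography package: AES under a fixed key is a bijection on 16-byte blocks, so encrypting a block several times and then decrypting the same number of times recovers it exactly. Raw single-block ECB is used here only to expose the bare block permutation, not as something to use in practice.

    import os

    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(32)    # AES-256 key
    block = os.urandom(16)  # one 16-byte block
    cipher = Cipher(algorithms.AES(key), modes.ECB())  # raw block cipher, demo only

    ct = block
    for _ in range(3):      # "encrypt 3X"
        enc = cipher.encryptor()
        ct = enc.update(ct) + enc.finalize()

    pt = ct
    for _ in range(3):      # decrypt 3X
        dec = cipher.decryptor()
        pt = dec.update(pt) + dec.finalize()

    assert pt == block      # nothing was lost: per key, AES is a permutation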

logifail

> some neutral 3rd party

Unfortunately, this would appear to be the bit we've not yet solved, nor are we likely to.

MauranKilom

> are there encryption methods that actually do not suffer from reductions in randomness or entropy etc when just simply running the algorithm on the encrypted output multiple times?

Unless you can prove that all of the e.g. 2^256 possible 256-bit inputs map to 2^256 different 256-bit outputs (for every key, in the case of encryption), chances are you lose strength with every application, because multiple inputs map to the same output (and consequently some outputs are not reachable).
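
A toy illustration of that point on a small domain: iterating a random function keeps shrinking the set of reachable outputs, while iterating a permutation (which is what a block cipher gives you for each key) never does. The domain size here is tiny purely so it runs instantly.

    import random

    N = 2 ** 16
    random.seed(0)

    random_function = [random.randrange(N) for _ in range(N)]  # collisions likely
    permutation = list(range(N))
    random.shuffle(permutation)                                # bijection by construction

    def image_sizes(mapping, rounds=5):
        # Apply the mapping to the whole domain repeatedly and record
        # how many distinct outputs remain after each round.
        values = set(range(N))
        sizes = []
        for _ in range(rounds):
            values = {mapping[v] for v in values}
            sizes.append(len(values))
        return sizes

    print("random function:", image_sizes(random_function))  # shrinks every round
    print("permutation:    ", image_sizes(permutation))      # stays at 65536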

tptacek

Transparency is good, and, as Bernstein's attorneys will ably establish, not optional.

ddingus

It's only as optional as the people can be convinced not to worry about it.

encryptluks2

I have no doubt that they are great at their job, but when it comes to lawsuits the judge(s) are equally important. You could get everything right, but a judge has extreme power to interpret the law or even ignore it in select cases.

NolF

I wouldn't say they ignore the law, but legislation like FOIA leaves a lot of discretion to balance competing interests, and that's where a judge would make the most difference, despite all the great articulations of the most brilliant lawyers.

tptacek

There are very few public bodies that do a solid, to-the-letter job of complying with their open records requirements. Almost all FOIA failings are due to the fact that it isn't staffed adequately; FOIA officers, clerks, and records attorneys are all overworked. When you do a bunch of FOIA stuff, you get a feel for what's going on with the other side, and you build a lot of empathy (which is helpful in getting your data over the long run).

And then other times you run into bloody-mindedness, or worse.

I don't think NIST has many excuses here. It looks like they botched this straightforwardly.

It's a straightforward case. My bet is that they'll lose it. The documents will get delivered. That'll be the end of it.

sigil

Near the end of the post – after 50 years of axe grinding – djb does eventually get to the point wrt pqcrypto. I find the below excerpt particularly damning. Why not wrap nascent pqcrypto in classical crypto? Suspect!

--

The general view today is that of course post-quantum cryptography should be an extra layer on top of well-established pre-quantum cryptography. As the French government cybersecurity agency (Agence nationale de la sécurité des systèmes d'information, ANSSI) put it at the end of 2021:

Acknowledging the immaturity of PQC is important: ANSSI will not endorse any direct drop-in replacement of currently used algorithms in the short/medium term. However, this immaturity should not serve as an argument for postponing the first deployments. ANSSI encourages all industries to progress towards an initiation of a gradual overlap transition in order to progressively increase trust on the post-quantum algorithms and their implementations while ensuring no security regression as far as classical (pre-quantum) security is concerned. ...

Given that most post-quantum algorithms involve message sizes much larger than the current pre-quantum schemes, the extra performance cost of an hybrid scheme remains low in comparison with the cost of the underlying post-quantum scheme. ANSSI believes that this is a reasonable price to pay for guaranteeing an additional pre-quantum security at least equivalent to the one provided by current pre-quantum standardized algorithms.

But NSA has a different position: it says that it "does not expect to approve" hybrids. Publicly, NSA justifies this by

- pointing to a fringe case where a careless effort to add an extra security layer damaged security, and

- expressing "confidence in the NIST PQC process".

Does that mean the original NISTPQC process, or the current NISTPQC process in which NIST, evidently surprised by attacks, announced plans to call for new submissions?

Of course, if NSA/IDA have secretly developed an attack that works for a particular type of post-quantum cryptosystem, then it makes sense that they'd want people to start using that type of cryptosystem and turn off the existing pre-quantum cryptosystem.

tptacek

This is the least compelling argument Bernstein makes in the whole post, because it's simply not the job of the NIST PQC program to design or recommend hybrid classical/PQC schemes. Is it fucky and weird if NSA later decides to recommend against people using hybrid key establishment? Yes. Nobody should listen to NSA about that, or anything else. But NIST ran a PQC KEM and signature contest, not a secure transport standardization. Sir, this is a Wendy's.

sigil

It’s compelling in context. If the NSA influenced NIST standards 3x in the past — DES, DSA, Dual EC — then shouldn’t we be on high alert this 4th time around?

That NSA is already recommending against hybrid, instead of waiting for the contest results, might signal they’ve once again managed to game the standardization process itself.

At the very least — given the exhaustive history in this post — you’d like to know what interactions NSA and NIST have had this time around. Thus, djb’s FOIA. And thus the lawsuit when the FOIA went unanswered. It all seems very reasonable to me.

What’s that old saying, “fool me thrice…”?

tptacek

Everybody is on high alert. Being on high alert doesn't make Bernstein right.

I don't even support the premise of NIST crypto standardization, let alone trust them to do it.

xiphias2

An interesting thing happening on the Bitcoin mailing list: although it would be quite easy to add Lamport signatures as an extra safety feature for high-value transactions, they would be quite expensive and easy to misuse (they can be used only once, which is a problem if money is sent to the same address twice), so the current consensus among developers is to "just wait for NSA/NIST to be ready with the algorithm". I haven't seen any discussion of the possibility that they are never ready on purpose, because of sabotage.
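
For readers who haven't seen one, here is a minimal toy sketch of a Lamport one-time signature built from nothing but a hash function: the private key is 256 pairs of random values, the public key is their hashes, and signing reveals one value per bit of the message digest. Reusing the key leaks the unrevealed halves, which is exactly the "can be used only once" caveat above. Not production code.

    import hashlib
    import os

    def H(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def keygen():
        # 256 pairs of random 32-byte preimages; public key is their hashes.
        sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
        pk = [(H(a), H(b)) for a, b in sk]
        return sk, pk

    def bits_of(message: bytes):
        digest = H(message)
        return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

    def sign(sk, message: bytes):
        # Reveal one preimage per digest bit -- this is why the key is one-time.
        return [sk[i][bit] for i, bit in enumerate(bits_of(message))]

    def verify(pk, message: bytes, sig) -> bool:
        return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits_of(message)))

    sk, pk = keygen()
    sig = sign(sk, b"spend output 0 to address X")
    assert verify(pk, b"spend output 0 to address X", sig)
    assert not verify(pk, b"a different message", sig)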

potatototoo99

Why not start that discussion yourself?

jack_pp

Indeed, as potato said, link this article in the ML for them to see that NIST cannot be fully trusted.

lizardactivist

A prominent expert, someone the whole cryptography community listens to, and he calls out the lies, crimes, and blatant hypocrisy of his own government.

I genuinely fear that he will be suicided one of these days.

ok_dad

I think the United States is more about charging people with crimes and ruining their lives that way rather than disappearing people. Russia might kill you with Polonium and make sure everyone knows it, but America will straight up “legally“ torture you in prison via several means and then argue successfully that those methods were legal and convince the world you weren’t tortured. Anyone who’s a target for that treatment, though, knows that’s a lie.

dmix

The FBI will just interview you over whatever and then charge you with lying to a federal agent, or dig up some other unrelated dirt, while the original investigation gets mysteriously dropped a year later.

danuker

McAfee and Epstein pop to mind. Maybe also Aaron Swartz.

discordance

Assange too.

oittaa

It seems silly to me how so many people immediately dismiss anyone even suggesting that something fishy was going on with those cases, when we already know about MKUltra, the Tuskegee experiment, etc.

josh2600

I just want to say, the problem here is that worldwide standards bodies for encryption need to be trustworthy. It is incredibly hard to know what encryption is actually real without a deep mathematics background, and even then, a chorus of peers must be able to present algorithms, and audits of those algorithms, with a straight face.

Presenting broken-by-design encryption undermines public confidence in what should be one of our most sacrosanct institutions: the National Institute of Standards and Technology (NIST). Many enterprises do not possess the capability to audit these standards and will simply use whatever NIST recommends. The danger is that we could be engineering embedded systems which will be in use for decades which are not only viewable by the NSA (which you might be ok with depending on your political allegiance) but also likely viewable by any capable organization on earth (which you are probably not ok with irrespective of your political allegiance).

In short, we must have trustworthy cryptography standards. If we do not, bedlam will follow.

Please recall, the last lawsuit that DJB filed was the one that resulted in essentially "Code is speech" in our world (https://en.wikipedia.org/wiki/Bernstein_v._United_States).

tptacek

There's an easier problem here, which is that our reliance on formal standards bodies for the selection of cryptographic constructions is bad and, hardly just at NIST, has over the last 20 years mostly been a force for evil. One of the most important "standards" in cryptography, the Noise Protocol Framework, will probably never be a formal standard. But on the flip side, no formal standards body is going to crud it up with nonsense.

So, no, I'd say that bedlam will not follow from a lack of trustworthy cryptography standards. We've trusted standards too much as it is.

javajosh

Believing both "Don't roll your own crypto" and "Don't trust the standards" would seem to leave the average developer in something of a quandary, no?

tptacek

No. I don't think we should rely on formal standards, like FIPS, NIST, and the IETF. Like Bernstein himself, I do think we should rely on peer-reviewed expert cryptography. I use Chapoly, not a stream cipher I concocted myself, or some bizarro cipher cascade posted to HN. This is what I'm talking about when I mentioned the Noise Protocol Framework.

If IETF standards happen to end up with good cryptography because they too adopt things like Noise or Ed25519, that's great. I don't distrust the IETF's ability to standardize something like HTTP/3. I do deeply distrust the process they use to arrive at cryptographic architectures. It's gotten markedly better, but there's every reason to believe it'll backslide a generation from now.

(There are very excellent people who contribute to things like CFRG and I wouldn't want to be read as disparaging any of them. It's the process I have an issue with, not anything happening there currently.)

bananapub

how could NIST possibly be "one of our most sacrosanct institutions" after the NSA already fucked them with Dual_EC_DRBG?

whoever wants to recommend standards at any point since 2015 needs to be someone else

https://en.wikipedia.org/wiki/NIST_SP_800-90A for those who have forgotten.

josh2600

Look, my point is that there are lots of companies around the world who can’t afford highly skilled mathematicians and cryptographers on staff. These institutions rely on NIST to help them determine what encryption systems may make sense. If NIST is truly adversarial, the public has a right to know and determine how to engage going forward.

tptacek

They don't have to (and shouldn't) retain highly skilled mathematicians. Nobody is suggesting that everyone design their own ciphers, authenticated key exchanges, signature schemes, and secure transports. Peer review is good; vital; an absolute requirement. Committee-based selection processes are what's problematic.

jacooper

Filippo Valsorda and Matthew Green aren't too happy.

https://twitter.com/matthew_d_green/status/15556838562625208...

jeffparsons

I think this is a sloppy take. If you read the full back-and-forth on the FOI request between D.J. Bernstein and NIST, it becomes readily apparent that there is _something_ rotten in the state of NIST.

Now of course that doesn't necessarily mean that NIST's work is completely compromised by the NSA (even though it has been in the past), but there are other problems that are similarly serious. For example, if NIST is unable to explain how certain key decisions were made along the way to standardisation, and those decisions appear to go against what would be considered by prominent experts in the field as "good practice", then NIST has a serious process problem. This is important work. It affects everyone in the world. And certain key parts of NIST's decision making process seem to be explained with not much more than a shrug. That's a problem.

tptacek

All you're saying here is that NIST failed to comply with FOIA. That's not unusual. No public body does a reliably good job of complying with FOIA, and many public bodies seem to have a bad habit of pre-judging the "merits" of FOIA requests, when no merit threshold exists for their open records requirements.

NIST failing to comply with FOIA makes them an intransigent public body, like all the rest of them, from your local water reclamation board to the Department of Energy.

It emphatically does not lend support to any of this litigant's concerns about the PQC process. I don't know enough (really, anything) about the PQC "contest" to judge claims about its validity, but I do know enough --- like, the small amount of background information needed --- to say that it's risible to suggest that any of the participating teams were compromised by intelligence agencies; that claim having been made in this post saps its credibility.

So, two things I think a reasonable person would want to establish here: first, that NIST's behavior with respect to the FOIA request is hardly any kind of smoking gun, and second that the narrative being presented in this post about the PQC contest seems somewhere between "hand-wavy" and "embarrassing".

jeffparsons

> It emphatically does not lend support to any of this litigant's concerns about the PQC process.

I agree with most of what you're saying except for this. In my view, unlike some of the other organisations you mentioned, the _only value_ of NIST is in the quality and transparency of its processes. My reading of the DJB/NIST FOI dialogue is that there is reason to believe NIST has serious process problems that go far beyond simply mishandling an FOI request. From their own responses, it reads as if they aren't able to articulate why they would choose one contestant's algorithm over another's. That kind of undermines the entire point of having an open contest.

silisili

What's with the infighting here? Nothing about the post comes across as conspiracy theory level or reputation ruining. It makes me question the motives of those implying he's crazy, to be honest.

tptacek

Post-quantum cryptography is essentially a full-employment program for elite academic public key cryptographers, which is largely what the "winning" PQC teams consist of. So, yeah, suggesting that one of those teams was compromised by an intelligence agency is "conspiracy theory level".

Nobody is denying the legitimacy of the suit itself. NIST is obligated to follow public records law, and public records law is important. Filippo's message, which we're all commenting on here, says that directly.

7373737373

Has the general notion of "conspiracy theory" ever carried any positive value? It only seems to exist to discredit "doubters of the majority consensus" without substance. But I guess words like "crank" wouldn't even exist if there weren't many people like that, so it carries some "definitional" value.

Because these words show total disregard for someone's opinion (more formally: "unlike you/them, I completely agree with the (apparent) majority consensus", which the label also implies), they probably don't belong in a serious discussion.

throwaway654329

Dismissing this lawsuit as a conspiracy theory is embarrassing for both of them.

There is ample evidence to document malfeasance by the involved parties, and it’s reasonable to ask NIST to follow public law.

detaro

> Dismissing this lawsuit as a conspiracy theory is embarrassing for both of them.

They are not dismissing the lawsuit.

throwaway654329

One says he’s doing it wrong. The other says he hopes that he wins, of course!

Meanwhile they go on to attack Bernstein, mischaracterize his writing, completely dismiss his historical analysis, mock him with memes as a conspiracy theorist, and, to top it off, question his internal motivations (which they somehow know), painting him as some kind of sore loser, which is demonstrably false.

The plot twist for the last point: he is still in the running for round four and his former PhD students did win major parts of round three.

svnpenn

Filippo Valsorda seems to be happy to ignore the fact that NIST already let an NSA backdoor in, as recently as 2014:

https://wikipedia.org/wiki/Dual_EC_DRBG

Is he really just going to ignore something from 8 years ago?

throwaway654329

Yes, he appears to be unreasonably dismissive of the blindly obvious history and the current situation.

As an aside, this tracks with his choice of employers - at least one of which was a known and documented NSA collaborator (as well as a victim, irony of irony) before he took the job with them.

As Upton Sinclair remarked: “It is difficult to get a man to understand something when his salary depends upon his not understanding it.”

Joining Google after Snowden revealed PRISM and BULLRUN, as well as MUSCULAR, is almost too rich to believe. Meanwhile, he dismisses Bernstein as a conspiracy theorist. It's a classic bad-faith ad-hominem coincidence theory.

tptacek

First, last I checked, Filippo does not in fact work at Google.

Second: the guidelines on this site forbid you to write comments like this; in fact, this pattern of comments is literally the most frequent source of moderator admonitions on HN.

Filippo hardly needs me to defend his reputation, but, as a service to HN and to you in particular, I'd want to raise your awareness of the risk of beclowning yourself by suggesting that he, of all people, is somehow compromised.

ghoward

Thanks for letting me know. I think I'll consider both of them compromised.

jacooper

Man, mobile typos suck.

er4hn

> The same people tend to have trouble grasping that most of the vulnerabilities exploited and encouraged by NSA are also exploitable by the Chinese government. These people start with the assumption that Americans are the best at everything; ergo, we're also the best at espionage. If the Chinese government stole millions of personnel records from the U.S. government, records easily usable as a springboard for further attacks, this can't possibly be because the U.S. government made a policy decision to keep our computer systems "weak enough to still permit an attack of some nature using very sophisticated (and expensive) techniques".

I'm not sure if I understand this part. I was under the impression that the OPM hack was a result of poor authn and authz controls, unrelated to cryptography. Was there a cryptography component sourced somewhere?

danielheath

If, rather than hoarding offensive tools & spying, the NSA had interpreted its mission as being to harden the security of government infrastructure (surely even more firmly within the remit of national security) and spent its considerable budget in that direction, would authn and authz controls have been used at the OPM?

woodruffw

This is my understanding as well. I asked this very same question less than a week ago[1], and now it's the first Google result when you search "OPM Dual_EC_DRBG."

The response to my comment covers some circumstantial evidence. But I'm not personally convinced; human factors are a much more parsimonious explanation.

[1]: https://news.ycombinator.com/item?id=32286528

efitz

Why don’t we invert FOIA?

Why don’t we require that all internal communications and records be public, available within 24 hours on the web, and provide a very painful mechanism involving significant personal effort of high level employees for every single communication or document that is to be redacted in some way? The key is requiring manual, personal (non-delegatable) effort on the part of senior bureaucrats, and to allow a private cause of action for citizens and waiver of immunity for bureaucrats.

We could carve out (or maybe not) specific things like allowing automatic redaction of employee PII and PII of citizens receiving government benefits.

After many decades, it’s clear that the current approach to FOIA and sunshine laws just isn’t working.

[ed] fixed autocorrect error

chaps

The carve-out you mention is a decent idea on paper, but in practice it's a difficult process. There's really no way to do it to any significant degree without basically bringing all of government to a complete halt. Consider that government is not staffed with technical people, nor necessarily critically minded people, to implement these systems.

There are ways to push for FOIA improvements that don't require this sort of drastic approach. Problem is, it takes a lot of effort on the parts of FOIA requesters, through litigation and change in the laws. Things get surprisingly nuanced when you really get down into what a "record" is, specifically for digital information. I definitely wouldn't want to have "data" open by default in this manner, because it would lead to privacy hell.

Another component of this all is to consider contractors and subcontractors. Would they fall under this? If so, to what degree? If not, how do we prevent laundering of information through contractors/subcontractors?

To a large degree, a lot of "positive" transparency movements like the one you suggest can ironically reduce transparency where it matters most. A good example of that is "open data", which gives an appearance of providing complete data but without the legal requirements to enforce it. It makes government look good, but it disincentivizes transparency pushback, and there's little way to identify whether all relevant information is truly exposed. I would imagine similar would happen here.

efitz

A private right of action and waiver of immunity solves most of the “bad actor” problems.

The big issue is how to preserve what actually needs to be secret (in the interest of the USA, not the interests of the bureaucracy) while forcing everything else to be public.

A lot of things are secret that don’t need to be secret; that’s a side effect of mandatory data classification and normal bureaucratic incentives- you won’t get in trouble for over-classifying, and classified information is a source of bureaucratic power. So you have to introduce a really strong personal incentive to offset that or nothing will ever change.

Personally, I don’t think that information should be classified if it came from public sources. Or maybe only allow such information to be classified for a short period of time, eg one year.

The longer and/or higher the classification level, the more effort should be involved, to create disincentives to over-classification.

chaps

I'm sorry, but very little of what you're saying makes sense in practice. I suggest submitting some FOIA requests to your local government to get some context and understanding of the difficulties.

gorgoiler

The old Abe rhetoric was powerful but it always felt like it was only hitting home on two of the three points. Obviously government, by definition really, is of the people. The much better parts were for the people and by the people.

chmod775

Qualifiers such as evil aren't really useful when there hasn't been a country acting honorably on that stage for a long time, if ever.

Here's a phrasing that might be more appropriate:

"Since we're backstabbers and scoundrels, we should exercise caution around each other."

vasco

Do you think it's tough for those regimes to pay someone to do FOIA requests for them? Or to get jobs at government agencies?

efitz

We should rethink the concept of a “secret”. If it’s really a secret, it will still be worth the effort to protect.

acover

They are erring on the side of caution because people have determined secret information from public information - like the energy of a nuclear bomb (censored) from the blast radius (public).

Another example: they want to protect their means and methods. But those means and methods are how they know most information. Oftentimes it's easy to work backwards from "they know x" to "therefore y is compromised".

It's a hard problem, similar to releasing anonymized data. See k-anonymity and its attacks and caveats.

https://en.wikipedia.org/wiki/K-anonymity
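
A tiny sketch of the k-anonymity idea on made-up data: a release is k-anonymous if every combination of quasi-identifiers (here a coarsened zip code and an age bracket) appears at least k times, so no row can be pinned to a single person from those fields alone.

    from collections import Counter

    records = [
        {"zip": "60601", "age": 34, "diagnosis": "flu"},
        {"zip": "60602", "age": 36, "diagnosis": "asthma"},
        {"zip": "60601", "age": 39, "diagnosis": "flu"},
        {"zip": "60614", "age": 62, "diagnosis": "diabetes"},
        {"zip": "60615", "age": 67, "diagnosis": "flu"},
    ]

    def generalize(r):
        # Coarsen quasi-identifiers: 3-digit zip prefix, 10-year age bracket.
        return (r["zip"][:3] + "**", f"{(r['age'] // 10) * 10}s")

    def k_anonymity(rows):
        counts = Counter(generalize(r) for r in rows)
        return min(counts.values())

    print("k =", k_anonymity(records))  # k = 2 with this coarsening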

denton-scratch

Surely "keeping things a little more hidden" depends on reliable cryptography.

nix23

Not sure if the US, with its torture base aka Guantanamo and torture safe-houses around the world, really has the right to call someone else "evil". I don't mean that as "whataboutism", but human lives are not "worth" more in the US than in Mainland China.

bsaul

Holy crap, I wondered why the post didn't mention work by DJ Bernstein outing flaws in curves submitted by the NSA...

Well, I didn't expect the post to actually be written by him.

bsaul

Side question:

I've only recently started to dig a bit deeper into crypto algorithms (looking into various types of curves etc.), and it gave me the uneasy feeling that the whole industry is relying on the expertise of only a handful of guys to actually ensure that the crypto schemes used today are really working.

Am I wrong? Are there actually thousands and thousands of people with the expertise to actually prove that the algorithms used today are really safe?

kibibyte

I don’t know if that’s easily quantifiable, but I had a cryptography professor (fairly well-known nowadays) several years ago tell us that she only trusted 7 people (or some other absurdly low number), one of them being djb, to be able to evaluate the security of cryptographic schemes.

Perhaps thousands of people in the world can show you proofs of security, but very few of them may be able to take into account all practical considerations like side channels and the like.

benlivengood

There may be thousands of people in the entire world who understand cryptanalysis well enough to accurately judge the security of modern ciphers. Most aren't living or working in the U.S.

It's very difficult to do better. The mathematics is complex and computer science hasn't achieved proofs of the hypotheses underlying cryptography. The best we can achieve is heuristic judgements about what the best possible attacks are, and P?=NP is an open question.

Tainnor

> The mathematics is complex and computer science hasn't achieved proofs of the hypotheses underlying cryptography.

No unconditional proofs (except for the OTP ofc), but there are quite a few conditional proofs. For example, it's possible to show that CBC is secure if the underlying block cipher is.

aumerle

Proof! The entire field of cryptography can prove absolutely nothing other than that a single use of a one-time pad is secure. The rest is all hand-waving that boils down to "no one I know knows how to do this, and I can't do it myself, so I believe it's secure".

So the best we have in cryptography is trusting "human instincts/judgements" about various algorithms. Which then further reduces to trusting humans.

chasil

This "monoculture" post raised this point several years ago.

https://lwn.net/Articles/681616/

NavinF

Most programmers don't need to prove crypto algorithms. There are many situations where you can just use TLS 1.3 and let it choose the ciphers. If you really need to build a custom protocol or file format, you can still use libsodium's secretbox, crypto_box, and crypto_kx functions which use the right algorithms.
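
As a concrete example of that advice, here are a few lines using the PyNaCl bindings to libsodium (assuming "pip install pynacl"): SecretBox picks the algorithm (XSalsa20-Poly1305) and generates the nonce for you, so the only thing left to manage is the key.

    import nacl.secret
    import nacl.utils

    key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)
    box = nacl.secret.SecretBox(key)

    ciphertext = box.encrypt(b"attack at dawn")  # random nonce generated and prepended
    plaintext = box.decrypt(ciphertext)
    assert plaintext == b"attack at dawn"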

ziddoap

This is completely unrelated to the question being asked by the parent. They aren't asking about the average programmer. They are asking how many people in the world can truly 'prove' (to some reasonable degree) that the cryptography in use and the algorithms that are implementing that cryptography are 'secure' (to some reasonable degree).

Put another way, they are asking how many people in the world could verify that the algorithms used by libsodium, crypto_box, etc. are secure.

NavinF

My point was that you don't need "thousands and thousands of people with the expertise to actually prove that the algorithms used today are really safe".

If the demand existed, there would be a lot more of those people.

l33t2328

The grandparent post is asking about the people who need to know enough to program TLS to

> let it choose

gred

This guy is the best kind of curmudgeon. I love it.