The US National Security Agency headquarters at Fort Meade, Maryland (Saul Loeb/AFP via Getty Images)
A prominent cryptography expert has told New Scientist that a US spy agency could be weakening a new generation of algorithms designed to protect against hackers equipped with quantum computers.
Daniel Bernstein at the University of Illinois Chicago says that the US National Institute of Standards and Technology (NIST) is deliberately obscuring the level of involvement the US National Security Agency (NSA) has in developing new encryption standards for "post-quantum cryptography" (PQC). He also believes that NIST has made errors, either accidental or deliberate, in calculations describing the security of the new standards. NIST denies the claims.
"NIST isn't following procedures designed to stop NSA from weakening PQC," says Bernstein. "People choosing cryptographic standards should be transparently and verifiably following clear public rules so that we don't need to worry about their motivations. NIST promised transparency and then claimed it had shown all its work, but that claim simply isn't true."
The mathematical problems we use to protect data are practically impossible for even the largest supercomputers to crack today. But when quantum computers become reliable and powerful enough, they will be able to break them in moments.
Although it is unclear when such computers will emerge, NIST has been running a project to standardise a new generation of algorithms that resist their attacks. Bernstein, who coined the term "post-quantum cryptography" in 2003 to refer to these kinds of algorithms, says the NSA is actively engaged in putting secret weaknesses into new encryption standards that will allow them to be more easily cracked with the right knowledge. NIST's standards are used globally, so flaws could have a large impact.
Bernstein alleges that NIST's calculations for one of the upcoming PQC standards, Kyber512, are "glaringly wrong", making it appear more secure than it really is. He says that NIST multiplied two numbers together when it would have been more correct to add them, resulting in an artificially high assessment of Kyber512's robustness to attack.
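To see why the choice between multiplying and adding matters so much, consider a toy calculation in Python. The cost figures below are hypothetical placeholders, not the actual Kyber512 numbers: the point is only that when attack costs are expressed as powers of two, multiplying two cost components adds their exponents, while adding the components is dominated by the larger one.

```python
import math

# Hypothetical attack-cost components, expressed as powers of two.
# These are illustrative numbers, not NIST's or Bernstein's figures.
gate_bits = 140     # attack needs ~2^140 bit operations
memory_bits = 35    # attack needs ~2^35 memory accesses

# Multiplying the two costs adds the exponents: 2^140 * 2^35 = 2^175.
multiplied_bits = gate_bits + memory_bits

# Adding the two costs is dominated by the larger term: 2^140 + 2^35 ~ 2^140.
added_bits = math.log2(2.0**gate_bits + 2.0**memory_bits)

print(multiplied_bits)    # 175
print(round(added_bits))  # 140
```

With these placeholder numbers, the two accounting choices differ by 35 bits of claimed security, which is why the dispute over which operation is correct is not a rounding quibble.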
"We disagree with his analysis," says Dustin Moody at NIST. "It's a question for which there isn't scientific certainty and intelligent people can have different views. We respect Dan's opinion, but don't agree with what he says."
Moody says that Kyber512 meets NIST's "level one" security criteria, which makes it at least as hard to break as a commonly used existing algorithm, AES-128. That said, NIST recommends that, in practice, people should use a stronger version, Kyber768, which Moody says was a suggestion from the algorithm's developers.
NIST is currently in a period of public consultation and hopes to reveal the final standards for PQC algorithms next year so that organisations can begin to adopt them. The Kyber algorithm seems likely to make the cut, as it has already progressed through several rounds of selection.
Given its secretive nature, it is difficult to say for sure whether or not the NSA has influenced the PQC standards, but there have long been suggestions and rumours that the agency deliberately weakens encryption algorithms. In 2013, The New York Times reported that the agency had worked to undermine cryptographic standards, and intelligence agency documents leaked by Edward Snowden in the same year contained references to the NSA deliberately placing a backdoor in a cryptography algorithm, although that algorithm was later withdrawn from NIST's recommendations.
Moody denies that NIST would ever agree to deliberately weaken a standard at the behest of the NSA and says that any secret weakness would have had to be inserted without its knowledge. He also says that, in the wake of the Snowden revelations, NIST has tightened guidelines to ensure transparency and security and to rebuild confidence with cryptographic experts.
"We wouldn't have ever intentionally done anything like that," says Moody, but he acknowledges the Snowden leaks caused a backlash. "Anytime the NSA gets brought up, there's a number of cryptographers that are concerned and we've tried to be open and transparent about our interactions."
Moody says that the NSA has also, as far as a secretive intelligence agency can, tried to be more open. But the agency declined to comment when approached by New Scientist.
"All we can do is tell people that NIST are the ones in the room making the decisions, but if you don't believe us, there's no way you could verify that without being inside NIST," says Moody.
However, Bernstein alleges that NIST hasn't been open about the level of input by the NSA, "stonewalling" him when he has asked for information. As a result, he has made freedom of information requests and taken NIST to court.
Documents released to Bernstein indicate that a group described as the "Post Quantum Cryptography Team, National Institute of Standards and Technology" included many NSA members and that NIST had met with someone from the UK's Government Communications Headquarters (GCHQ), the UK equivalent of the NSA.
Alan Woodward at the University of Surrey, UK, says there are reasons to be wary of encryption algorithms. For example, the GEA-1 code used in mobile phone networks during the 1990s and 2000s was found to have a flaw that made it millions of times less computationally intensive to crack than it should have been, although a culprit who put it there has never been identified.
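The scale of that GEA-1 weakness can be put in rough numbers. The cipher uses a 64-bit key, but published cryptanalysis from 2021 showed that its effective strength was only about 40 bits; those figures come from that analysis rather than from this article, and the arithmetic below simply shows where "millions of times" comes from.

```python
# GEA-1 keys are 64 bits, but a structural weakness reduces the work needed
# to recover a key to roughly 2^40 operations (figures from the 2021
# cryptanalysis by Beierle et al., not from this article).
expected_work = 2**64   # brute force over the full 64-bit key space
actual_work = 2**40     # work needed once the flaw is exploited

speedup = expected_work // actual_work
print(speedup)          # 16777216, i.e. about 16.8 million times easier
```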
But Woodward says that the current PQC candidates have been heavily scrutinised by academics and industry and haven't yet been found lacking, while other algorithms that featured in earlier stages of the competition have been demonstrated to be flawed and were eliminated.
"Intelligence agencies have a history of weakening encryption, but there's been such a lot of security analysis done on these candidates that I would be surprised if Kyber were somehow booby-trapped," he says.