Re: #SharedCrypto 3rd party library standards

Bob Summerwill <bob@...>

Government crypto is something which came up within the EEA.

Essentially you have NIST standards and then there are Chinese and Russian equivalents which are mandatory for regulated (mainly banking) industry in those countries.

Masterchain, built by the Russian FinTech consortium with Sberbank in a lead role, forked Geth and switched to GOST cryptography.  More than that, they made architectural changes so all the cryptography is pluggable, letting you run different "modes".   Kirill also had some ideas about how you could bridge those different modes, in a manner analogous to a network gateway, so international backbones could connect country-specific networks with different cryptography.  Bridge nodes.
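Purely as an illustration of the pluggable-"modes" idea described above (not Masterchain's actual design; all names are hypothetical, and SHA-256 stands in because GOST's Streebog hash is not in Python's standard library), a registry of swappable crypto suites might look like this:

```python
from abc import ABC, abstractmethod
import hashlib

class CryptoSuite(ABC):
    """One pluggable cryptography mode (e.g. NIST- or GOST-based algorithms)."""
    @abstractmethod
    def hash(self, data: bytes) -> bytes: ...

_SUITES = {}  # hypothetical registry: mode name -> suite instance

def register_suite(name, suite):
    _SUITES[name] = suite

def get_suite(name):
    return _SUITES[name]

class NistSuite(CryptoSuite):
    """Stand-in for a NIST mode, using SHA-256 from the standard library."""
    def hash(self, data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

register_suite("nist", NistSuite())
```

A node configured for a different mode would register and select a different suite under the same interface, which is what makes gateway-style bridging between modes conceivable.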

On Fri., Oct. 26, 2018, 8:28 a.m. Middleton, Dan, <dan.middleton@...> wrote:

Yes, that list of criteria was heavily biased towards standard implementations. To accommodate the other two tiers, it should be expanded to quantify the level of review of the algorithm.

When it comes to government required implementations, I’m willing to take the position we declare that as out of scope.

On the one hand, if someone wants to contribute something that’s great. On the other hand, each thing we add costs more overhead in a variety of areas including maintaining the build and CI much less security review. The implications, both good and bad, of government implementations are probably significant but they are beyond what I have time to consider right now. I think the fail-safe is to declare them out of scope for the time being and re-evaluate in the future.





From: "Montgomery, Hart" <hmontgomery@...>
Date: Thursday, October 25, 2018 at 9:01 AM
To: Dan Middleton <dan.middleton@...>, "labs@..." <labs@...>
Subject: RE: #SharedCrypto 3rd party library standards


This is a good point.  However, there are many other criteria that we might use to assess the confidence we have in standards or libraries we are using—the ones Dan lists here are only related to the practical issues around code implementation.  Other questions to consider include things like how much the algorithms being implemented have been studied and/or peer reviewed, whether the library implements a “sketchy” government-designed algorithm that has the potential for back doors, and what cryptographic assumptions the implementations are based upon.


I’m generally in favor of being more permissive in terms of implementations we add to the project (if someone wants to contribute them and they are useful and seemingly secure, then why not).  However, whatever build processes we use should heavily flag nonstandard or nontraditional implementations, and it should be impossible for a user to “build” crypto-lib with such algorithms unintentionally.
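A minimal sketch of the "impossible to build unintentionally" idea, with entirely hypothetical algorithm names and tier assignments; the real gate would live in the project's build system or feature flags rather than a function like this:

```python
# Hypothetical tier assignments; the names are illustrative only.
STANDARD = {"sha2", "aes", "ed25519"}
NONSTANDARD = {"experimental_hash", "gov_alg"}

def resolve_build(requested, allow_nonstandard=False):
    """Refuse a build that pulls in nonstandard algorithms without explicit opt-in."""
    flagged = set(requested) & NONSTANDARD
    if flagged and not allow_nonstandard:
        raise ValueError(
            "nonstandard algorithms require explicit opt-in: %s" % sorted(flagged))
    return sorted(set(requested))
```

The mechanism matters less than the property: the nonstandard path must be named explicitly, so it can never be selected by default.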


Exactly how we want to rank or rate code dependencies (including ones that we potentially write!) is a good thing for open discussion.  We can discuss our approach for something like this at our meeting next week if people like.  I doubt we’ll get universal agreement, but that’s OK—our goal should again be to just inform people using the less standard stuff and make sure they are aware rather than dictate exactly what they should use.





From: labs@... [mailto:labs@...] On Behalf Of Middleton, Dan
Sent: Thursday, October 25, 2018 6:03 AM
To: labs@...
Subject: [Hyperledger Labs] #SharedCrypto 3rd party library standards


I propose we establish some standards for libraries we will incorporate in crypto-lib (or Ursa or whatever we will soon call it :)  )


As a motivating example, there’s a PR to add a blake2 library. I’ve not independently verified the performance claims, but it looks like it is quite fast. In the risk department, though, the source repo shows a single contributor and only 2-3 months of history. The latter raises the risk that the code is not hardened, and the former that it won’t be maintained.


The different tiers we establish complicate having a single list of criteria. Without being too rigid we could probably make a matrix of what degree applies to which tier. Here’s a starter list of criteria:


  • Maturity (how long has the code existed?)
  • Maintainer count (how likely is the code to be maintained and issues responded to?)
  • Community size (are there active mailing lists and users indicating it’s in active use?)
  • Bug reporting (is there a way to submit security bugs?)
  • Maintenance history (regular updates, patches, responsiveness to CVEs?)
  • Known issues (due diligence that the code is sound)
  • Protected releases (can we depend on signed libraries?)


Taking `maturity` as a simple example, we could set the levels for the three tiers as:

Standard:       1 year
Semi-Trusted:   3 months
Research:       N/A
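One way to encode those maturity thresholds, as a sketch only (tier names and day counts come from the proposal above; everything else, including treating N/A as "no minimum", is an assumption):

```python
from datetime import timedelta

# Thresholds from the proposal; Research has no minimum (N/A).
TIER_MIN_AGE = {
    "Standard": timedelta(days=365),
    "Semi-Trusted": timedelta(days=90),
    "Research": timedelta(0),
}

def highest_tier(repo_age):
    """Return the most trusted tier whose maturity bar the repo clears."""
    for tier in ("Standard", "Semi-Trusted", "Research"):
        if repo_age >= TIER_MIN_AGE[tier]:
            return tier
```

The full matrix would add a row like this per criterion, with a library's overall tier being the weakest tier it earns across all criteria.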


Interested in feedback on this approach.





