Re: #SharedCrypto 3rd party library standards

Hart Montgomery

This is a good point.  However, there are many other criteria we might use to assess the confidence we have in the standards or libraries we use—the ones Dan lists here relate only to practical issues around code implementation.  Other questions to consider include how thoroughly the algorithms being implemented have been studied and peer reviewed, whether the library implements a “sketchy” government-designed algorithm with the potential for back doors, and what cryptographic assumptions the implementations rest on.


I’m generally in favor of being more permissive about which implementations we add to the project (if someone wants to contribute them and they are useful and seemingly secure, then why not?).  However, whatever build process we use should heavily flag nonstandard or nontraditional implementations, and it should be impossible for a user to “build” crypto-lib with such algorithms unintentionally.
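One way to make that gating concrete—as a sketch only, since crypto-lib’s actual build setup may differ and the feature names here are purely hypothetical—is to put nonstandard implementations behind off-by-default Cargo features, so they can never be compiled in by accident:

```toml
# Hypothetical Cargo.toml fragment for crypto-lib; feature names are illustrative.
[features]
# Only well-vetted, standard algorithms build by default.
default = ["standard-algos"]
standard-algos = []

# Nonstandard or nontraditional implementations must be requested explicitly,
# e.g. `cargo build --features nonstandard-algos`; a build script could also
# print a loud warning whenever this feature is enabled.
nonstandard-algos = []
```

With a scheme like this, `cargo build` alone never produces a binary containing the research-grade code, which matches the “impossible to build unintentionally” goal.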


Exactly how we want to rank or rate code dependencies (including ones that we potentially write!) is a good topic for open discussion.  We can discuss our approach to something like this at our meeting next week if people would like.  I doubt we’ll get universal agreement, but that’s OK—again, our goal should be to inform people using the less standard stuff and make sure they are aware, rather than to dictate exactly what they should use.





From: labs@... [mailto:labs@...] On Behalf Of Middleton, Dan
Sent: Thursday, October 25, 2018 6:03 AM
To: labs@...
Subject: [Hyperledger Labs] #SharedCrypto 3rd party library standards


I propose we establish some standards for libraries we will incorporate in crypto-lib (or Ursa or whatever we will soon call it :)  )


As a motivating example, there’s a PR to add a blake2 library. I’ve not independently verified the performance claims, but it looks quite fast. In the risk department, though, the source repo shows a single contributor and only 2–3 months of history. The short history raises the risk that the code is not hardened, and the single contributor raises the risk that it won’t be maintained.


Having different tiers complicates a single list of criteria. Without being too rigid, we could probably make a matrix of which criteria apply, and to what degree, for each tier. Here’s a starter list of criteria:


  • Maturity (how long has this code existed?)
  • Maintainer count (how likely is the code to be maintained and issues responded to?)
  • Community size (are there active mailing lists and users that indicate it’s in active use?)
  • Bug reporting (is there a way to submit security bugs?)
  • Maintenance history (are there regular updates, patches, and responsiveness to CVEs?)
  • Known issues (has due diligence shown the code is sound?)
  • Protected releases (can we depend on signed libraries?)


Taking `maturity` as a simple example, we could set the levels for the 3 tiers as:

Standard:      1 year

Semi-Trusted:  3 months

Research:      N/A
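Reading these levels as minimum ages (an assumption; the proposal doesn’t say so explicitly), the maturity criterion could be encoded as a simple classification. The tier names come from the proposal; the Rust types and function are illustrative:

```rust
// Illustrative encoding of the proposed maturity thresholds per tier.
// Tier names come from the proposal; everything else is hypothetical.
#[derive(Debug, PartialEq)]
enum Tier {
    Standard,    // at least 1 year of history
    SemiTrusted, // at least 3 months of history
    Research,    // no minimum
}

/// Highest tier a dependency qualifies for on the maturity criterion
/// alone, given its age in days. The other criteria in the matrix
/// would further constrain the final tier.
fn maturity_tier(age_days: u32) -> Tier {
    if age_days >= 365 {
        Tier::Standard
    } else if age_days >= 90 {
        Tier::SemiTrusted
    } else {
        Tier::Research
    }
}

fn main() {
    // A 2-3 month old repo (like the blake2 PR example) lands in
    // Research or Semi-Trusted depending on its exact age.
    assert_eq!(maturity_tier(75), Tier::Research);
    assert_eq!(maturity_tier(100), Tier::SemiTrusted);
    assert_eq!(maturity_tier(400), Tier::Standard);
    println!("maturity checks pass");
}
```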


Interested in feedback on this approach.





