* algorithm support table should replace the list of algorithms; minor refactor w.r.t. shebang, imports, and unnecessary global state
Signed-off-by: Ganyu (Bruce) Xu <g66xu@uwaterloo.ca>
* still need to fill in the content
* first draft of alg support table
* some refactoring
* wrap standardization status with url to spec
* Use split with no argument to split against any whitespace, not just space
* documentation; make primary implementation monospaced
* docs/algorithms/sig/slh_dsa.yml is generated from a Jinja template elsewhere
* fixed invalid markdown anchors
* algorithm family names will not link to docs/algorithms markdowns because Doxygen cannot handle them
* add git diff to basic check for debugging purposes
* resolved failure to consistently produce the same README.md
* rephrasing standardization status for PQC third round candidates
* improved explanation for NTRU's standardization status
* another try at improving phrasing of standardization status
* fixed typo
* removed spec-url from lms.yml
* revised specification URL to be consistent with spec-version
* Revised FrodoKEM standardization status to reflect ISO consideration
---------
Signed-off-by: Ganyu (Bruce) Xu <g66xu@uwaterloo.ca>
- [Supported Algorithms](#supported-algorithms)
- [Key encapsulation mechanisms](#key-encapsulation-mechanisms)
- [Signature schemes](#signature-schemes)
+ - [Stateful signature schemes](#stateful-signature-schemes)
- [Limitations and Security](#limitations-and-security)
- [Platform limitations](#platform-limitations)
- [Support limitations](#support-limitations)
All names other than `ML-KEM` and `ML-DSA` are subject to change. `liboqs` provides a [selection mechanism for algorithms that are on the NIST standards track, still under NIST consideration, or of a purely experimental nature, via the configuration variable OQS_ALGS_ENABLED](CONFIGURE.md#oqs_algs_enabled). By default, `liboqs` is built with support for all of the post-quantum algorithms listed below, including experimental ones.
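For example, a build limited to standards-track algorithms could be configured roughly as follows (illustrative; the exact set of accepted values is documented in CONFIGURE.md):

```shell
# Configure and build liboqs with only standards-track algorithms enabled.
# "STD" is one of the values documented in CONFIGURE.md; the default "All"
# enables every algorithm listed below, including experimental ones.
cmake -S . -B build -DOQS_ALGS_ENABLED=STD
cmake --build build --parallel
```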
+<!-- OQS_TEMPLATE_FRAGMENT_ALG_SUPPORT_START -->
#### Key encapsulation mechanisms
-
-<!--- OQS_TEMPLATE_FRAGMENT_LIST_KEXS_START -->
-- **BIKE**: BIKE-L1, BIKE-L3, BIKE-L5
-- **Classic McEliece**: Classic-McEliece-348864†, Classic-McEliece-348864f†, Classic-McEliece-460896†, Classic-McEliece-460896f†, Classic-McEliece-6688128†, Classic-McEliece-6688128f†, Classic-McEliece-6960119†, Classic-McEliece-6960119f†, Classic-McEliece-8192128†, Classic-McEliece-8192128f†
-- **FrodoKEM**: FrodoKEM-640-AES, FrodoKEM-640-SHAKE, FrodoKEM-976-AES, FrodoKEM-976-SHAKE, FrodoKEM-1344-AES, FrodoKEM-1344-SHAKE
-- **HQC**: HQC-128, HQC-192, HQC-256
-- **Kyber**: Kyber512, Kyber768, Kyber1024
-- **ML-KEM**: ML-KEM-512, ML-KEM-768, ML-KEM-1024
-- **NTRU**: NTRU-HPS-2048-509, NTRU-HPS-2048-677, NTRU-HPS-4096-821, NTRU-HPS-4096-1229, NTRU-HRSS-701, NTRU-HRSS-1373
-- **NTRU-Prime**: sntrup761
-<!--- OQS_TEMPLATE_FRAGMENT_LIST_KEXS_END -->
+| Algorithm family | Standardization status | Primary implementation |
+|:-------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------|
+| BIKE | Not selected by [NIST](https://bikesuite.org/files/v5.1/BIKE_Spec.2022.10.10.1.pdf) | [`awslabs/bike-kem`](https://github.com/awslabs/bike-kem) |
+| Classic McEliece | Under [ISO](https://classic.mceliece.org/iso.html) consideration | [`PQClean/PQClean@1eacfda`](https://github.com/PQClean/PQClean/commit/1eacfdafc15ddc5d5759d0b85b4cef26627df181) |
+| FrodoKEM | Under [ISO](https://frodokem.org/) consideration | [`microsoft/PQCrypto-LWEKE@b6609d3`](https://github.com/microsoft/PQCrypto-LWEKE/commit/b6609d30a9982318d7f2937aa3c7b92147b917a2) |
+| HQC | Selected by [NIST](https://pqc-hqc.org/doc/hqc_specifications_2025_08_22.pdf) for upcoming standardization | [`PQClean/PQClean@1eacfda`](https://github.com/PQClean/PQClean/commit/1eacfdafc15ddc5d5759d0b85b4cef26627df181) |
+| Kyber | Selected by [NIST](https://csrc.nist.gov/CSRC/media/Projects/post-quantum-cryptography/documents/round-3/submissions/Kyber-Round3.zip) as basis for ML-KEM (FIPS 203) | [`pq-crystals/kyber@441c051`](https://github.com/pq-crystals/kyber/commit/441c0519a07e8b86c8d079954a6b10bd31d29efc) |
+| ML-KEM | Standardized by [NIST](https://csrc.nist.gov/pubs/fips/203/final) | [`pq-code-package/mlkem-native@048fc2a`](https://github.com/pq-code-package/mlkem-native/commit/048fc2a7a7b4ba0ad4c989c1ac82491aa94d5bfa) |
+| NTRU | Not selected by [NIST](https://csrc.nist.gov/CSRC/media/Projects/post-quantum-cryptography/documents/round-3/submissions/NTRU-Round3.zip), under standardization consideration by [NTT](https://info.isl.ntt.co.jp/crypt/ntru/index.html) | [`PQClean/PQClean@4c9e5a3`](https://github.com/PQClean/PQClean/commit/4c9e5a3aa715cc8d1d0e377e4e6e682ebd7602d6) |
+| NTRU-Prime | Not selected by [NIST](https://csrc.nist.gov/CSRC/media/Projects/post-quantum-cryptography/documents/round-3/submissions/NTRU-Prime-Round3.zip) | [`PQClean/PQClean@4c9e5a3`](https://github.com/PQClean/PQClean/commit/4c9e5a3aa715cc8d1d0e377e4e6e682ebd7602d6) |
#### Signature schemes
-
-<!--- OQS_TEMPLATE_FRAGMENT_LIST_SIGS_START -->
-- **CROSS**: cross-rsdp-128-balanced, cross-rsdp-128-fast, cross-rsdp-128-small†, cross-rsdp-192-balanced, cross-rsdp-192-fast, cross-rsdp-192-small†, cross-rsdp-256-balanced†, cross-rsdp-256-fast, cross-rsdp-256-small†, cross-rsdpg-128-balanced, cross-rsdpg-128-fast, cross-rsdpg-128-small, cross-rsdpg-192-balanced, cross-rsdpg-192-fast, cross-rsdpg-192-small†, cross-rsdpg-256-balanced, cross-rsdpg-256-fast, cross-rsdpg-256-small†
-- **Falcon**: Falcon-512, Falcon-1024, Falcon-padded-512, Falcon-padded-1024
-- **MAYO**: MAYO-1, MAYO-2, MAYO-3, MAYO-5†
-- **ML-DSA**: ML-DSA-44, ML-DSA-65, ML-DSA-87
-- **SLH-DSA**: SLH\_DSA\_PURE\_SHA2\_128S†, SLH\_DSA\_PURE\_SHA2\_128F†, SLH\_DSA\_PURE\_SHA2\_192S†, SLH\_DSA\_PURE\_SHA2\_192F†, SLH\_DSA\_PURE\_SHA2\_256S†, SLH\_DSA\_PURE\_SHA2\_256F†, SLH\_DSA\_PURE\_SHAKE\_128S†, SLH\_DSA\_PURE\_SHAKE\_128F†, SLH\_DSA\_PURE\_SHAKE\_192S†, SLH\_DSA\_PURE\_SHAKE\_192F†, SLH\_DSA\_PURE\_SHAKE\_256S†, SLH\_DSA\_PURE\_SHAKE\_256F†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHA2\_256\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHA3\_224\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHAKE\_128\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHA2\_128S†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHA2\_256\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHA3\_224\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHAKE\_128\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHA2\_128F†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHA2\_256\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHA3\_224\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHAKE\_128\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHA2\_192S†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHA2\_192F†, 
-SLH\_DSA\_SHA2\_256\_PREHASH\_SHA2\_192F†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHA2\_192F†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHA2\_192F†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHA2\_192F†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHA2\_192F†, SLH\_DSA\_SHA3\_224\_PREHASH\_SHA2\_192F†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHA2\_192F†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHA2\_192F†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHA2\_192F†, SLH\_DSA\_SHAKE\_128\_PREHASH\_SHA2\_192F†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHA2\_192F†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHA2\_256\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHA3\_224\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHAKE\_128\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHA2\_256S†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHA2\_256\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHA3\_224\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHAKE\_128\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHA2\_256F†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHAKE\_128S†, SLH\_DSA\_SHA2\_256\_PREHASH\_SHAKE\_128S†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHAKE\_128S†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHAKE\_128S†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHAKE\_128S†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHAKE\_128S†, SLH\_DSA\_SHA3\_224\_PREHASH\_SHAKE\_128S†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHAKE\_128S†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHAKE\_128S†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHAKE\_128S†, 
-SLH\_DSA\_SHAKE\_128\_PREHASH\_SHAKE\_128S†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHAKE\_128S†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHA2\_256\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHA3\_224\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHAKE\_128\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHAKE\_128F†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHA2\_256\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHA3\_224\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHAKE\_128\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHAKE\_192S†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHA2\_256\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHA3\_224\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHAKE\_128\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHAKE\_192F†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHAKE\_256S†, SLH\_DSA\_SHA2\_256\_PREHASH\_SHAKE\_256S†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHAKE\_256S†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHAKE\_256S†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHAKE\_256S†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHAKE\_256S†, 
-SLH\_DSA\_SHA3\_224\_PREHASH\_SHAKE\_256S†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHAKE\_256S†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHAKE\_256S†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHAKE\_256S†, SLH\_DSA\_SHAKE\_128\_PREHASH\_SHAKE\_256S†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHAKE\_256S†, SLH\_DSA\_SHA2\_224\_PREHASH\_SHAKE\_256F†, SLH\_DSA\_SHA2\_256\_PREHASH\_SHAKE\_256F†, SLH\_DSA\_SHA2\_384\_PREHASH\_SHAKE\_256F†, SLH\_DSA\_SHA2\_512\_PREHASH\_SHAKE\_256F†, SLH\_DSA\_SHA2\_512\_224\_PREHASH\_SHAKE\_256F†, SLH\_DSA\_SHA2\_512\_256\_PREHASH\_SHAKE\_256F†, SLH\_DSA\_SHA3\_224\_PREHASH\_SHAKE\_256F†, SLH\_DSA\_SHA3\_256\_PREHASH\_SHAKE\_256F†, SLH\_DSA\_SHA3\_384\_PREHASH\_SHAKE\_256F†, SLH\_DSA\_SHA3\_512\_PREHASH\_SHAKE\_256F†, SLH\_DSA\_SHAKE\_128\_PREHASH\_SHAKE\_256F†, SLH\_DSA\_SHAKE\_256\_PREHASH\_SHAKE\_256F†
-- **SNOVA**: SNOVA\_24\_5\_4, SNOVA\_24\_5\_4\_SHAKE, SNOVA\_24\_5\_4\_esk, SNOVA\_24\_5\_4\_SHAKE\_esk, SNOVA\_37\_17\_2†, SNOVA\_25\_8\_3, SNOVA\_56\_25\_2†, SNOVA\_49\_11\_3†, SNOVA\_37\_8\_4†, SNOVA\_24\_5\_5†, SNOVA\_60\_10\_4†, SNOVA\_29\_6\_5†
-- **SPHINCS+-SHA2**: SPHINCS+-SHA2-128f-simple, SPHINCS+-SHA2-128s-simple, SPHINCS+-SHA2-192f-simple, SPHINCS+-SHA2-192s-simple, SPHINCS+-SHA2-256f-simple, SPHINCS+-SHA2-256s-simple
-- **SPHINCS+-SHAKE**: SPHINCS+-SHAKE-128f-simple, SPHINCS+-SHAKE-128s-simple, SPHINCS+-SHAKE-192f-simple, SPHINCS+-SHAKE-192s-simple, SPHINCS+-SHAKE-256f-simple, SPHINCS+-SHAKE-256s-simple
-- **UOV**: OV-Is, OV-Ip, OV-III, OV-V, OV-Is-pkc, OV-Ip-pkc, OV-III-pkc, OV-V-pkc, OV-Is-pkc-skc, OV-Ip-pkc-skc, OV-III-pkc-skc, OV-V-pkc-skc
-<!--- OQS_TEMPLATE_FRAGMENT_LIST_SIGS_END -->
-- **XMSS**: XMSS-SHA2_10_256, XMSS-SHA2_16_256, XMSS-SHA2_20_256, XMSS-SHAKE_10_256, XMSS-SHAKE_16_256, XMSS-SHAKE_20_256, XMSS-SHA2_10_512, XMSS-SHA2_16_512, XMSS-SHA2_20_512, XMSS-SHAKE_10_512, XMSS-SHAKE_16_512, XMSS-SHAKE_20_512, XMSS-SHA2_10_192, XMSS-SHA2_16_192, XMSS-SHA2_20_192, XMSS-SHAKE256_10_192, XMSS-SHAKE256_16_192, XMSS-SHAKE256_20_192, SHAKE256_10_256, SHAKE256_16_256, SHAKE256_20_256, XMSSMT-SHA2_20/2_256, XMSSMT-SHA2_20/4_256, XMSSMT-SHA2_40/2_256, XMSSMT-SHA2_40/4_256, XMSSMT-SHA2_40/8_256, XMSSMT-SHA2_60/3_256, XMSSMT-SHA2_60/6_256, XMSSMT-SHA2_60/12_256, XMSSMT-SHAKE_20/2_256, XMSSMT-SHAKE_20/4_256, XMSSMT-SHAKE_40/2_256, XMSSMT-SHAKE_40/4_256, XMSSMT-SHAKE_40/8_256, XMSSMT-SHAKE_60/3_256, XMSSMT-SHAKE_60/6_256, XMSSMT-SHAKE_60/12_256
-- **LMS**: LMS_SHA256_H5_W1, LMS_SHA256_H5_W2, LMS_SHA256_H5_W4, LMS_SHA256_H5_W8, LMS_SHA256_H10_W1, LMS_SHA256_H10_W2, LMS_SHA256_H10_W4, LMS_SHA256_H10_W8, LMS_SHA256_H15_W1, LMS_SHA256_H15_W2, LMS_SHA256_H15_W4, LMS_SHA256_H15_W8, LMS_SHA256_H20_W1, LMS_SHA256_H20_W2, LMS_SHA256_H20_W4, LMS_SHA256_H20_W8, LMS_SHA256_H25_W1, LMS_SHA256_H25_W2, LMS_SHA256_H25_W4, LMS_SHA256_H25_W8, LMS_SHA256_H5_W8_H5_W8, LMS_SHA256_H10_W4_H5_W8, LMS_SHA256_H10_W8_H5_W8, LMS_SHA256_H10_W2_H10_W2, LMS_SHA256_H10_W4_H10_W4, LMS_SHA256_H10_W8_H10_W8, LMS_SHA256_H15_W8_H5_W8, LMS_SHA256_H15_W8_H10_W8, LMS_SHA256_H15_W8_H15_W8, LMS_SHA256_H20_W8_H5_W8, LMS_SHA256_H20_W8_H10_W8, LMS_SHA256_H20_W8_H15_W8, LMS_SHA256_H20_W8_H20_W8
+| Algorithm family | Standardization status | Primary implementation |
+|:-------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------|
+| CROSS | Under [NIST](https://www.cross-crypto.com/CROSS_Specification_v2.2.pdf) consideration | [`CROSS-signature/CROSS-lib-oqs@c8f7411`](https://github.com/CROSS-signature/CROSS-lib-oqs/commit/c8f7411fed136f0e37600973fa3dbed53465e54f) |
+| Falcon | Selected by [NIST](https://csrc.nist.gov/CSRC/media/Projects/post-quantum-cryptography/documents/round-3/submissions/Falcon-Round3.zip) for upcoming standardization | [`PQClean/PQClean@1eacfda`](https://github.com/PQClean/PQClean/commit/1eacfdafc15ddc5d5759d0b85b4cef26627df181) |
+| MAYO | Under [NIST](https://csrc.nist.gov/csrc/media/Projects/pqc-dig-sig/documents/round-2/spec-files/mayo-spec-round2-web.pdf) consideration | [`PQCMayo/MAYO-C@4b7cd94`](https://github.com/PQCMayo/MAYO-C/commit/4b7cd94c96b9522864efe40c6ad1fa269584a807) |
+| ML-DSA | Standardized by [NIST](https://csrc.nist.gov/pubs/fips/204/final) | [`pq-crystals/dilithium@444cdcc`](https://github.com/pq-crystals/dilithium/commit/444cdcc84eb36b66fe27b3a2529ee48f6d8150c2) |
+| SLH-DSA | [Standardized by NIST](https://csrc.nist.gov/pubs/fips/205/final) | [`pq-code-package/slhdsa-c@a0fc1ff`](https://github.com/pq-code-package/slhdsa-c/commit/a0fc1ff253930060d0246aebca06c2538eb92b88) |
+| SNOVA | Under [NIST](https://csrc.nist.gov/csrc/media/Projects/pqc-dig-sig/documents/round-2/spec-files/snova-spec-round2-web.pdf) consideration | [`vacuas/SNOVA@1c3ca6f`](https://github.com/vacuas/SNOVA/commit/1c3ca6f4f7286c0bde98d7d6f222cf63b9d52bff) |
+| SPHINCS+ | Selected by [NIST](https://sphincs.org/data/sphincs+-r3.1-specification.pdf) as basis for SLH-DSA (FIPS 205) | [`PQClean/PQClean@1eacfda`](https://github.com/PQClean/PQClean/commit/1eacfdafc15ddc5d5759d0b85b4cef26627df181) |
+| UOV | Under [NIST](https://csrc.nist.gov/csrc/media/Projects/pqc-dig-sig/documents/round-2/spec-files/uov-spec-round2-web.pdf) consideration | [`pqov/pqov@7e0832b`](https://github.com/pqov/pqov/commit/7e0832b6732a476119742c4acabd11b7c767aefb) |
+
+#### Stateful signature schemes
+| Algorithm family | Standardization status | Primary implementation |
+|:-------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------|
+| LMS | Standardized by [IRTF](https://www.rfc-editor.org/info/rfc8554), approved by [NIST](https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-208.pdf) | [`cisco/hash-sigs`](https://github.com/cisco/hash-sigs) |
+| XMSS | Standardized by [IRTF](https://www.rfc-editor.org/info/rfc8391), approved by [NIST](https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-208.pdf) | [`XMSS/xmss-reference`](https://github.com/XMSS/xmss-reference) |
+<!-- OQS_TEMPLATE_FRAGMENT_ALG_SUPPORT_END -->
Note that for algorithms marked with a dagger (†), liboqs contains at least one implementation that uses a large amount of stack space; this may cause failures when run in threads or in constrained environments. For more information, consult the algorithm information sheets in the [docs/algorithms](https://github.com/open-quantum-safe/liboqs/tree/main/docs/algorithms) folder.
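The dagger annotation above is driven by the per-implementation `large-stack-usage` flags in the YAML data sheets. As an illustrative sketch (not the actual liboqs script; the dict below is hand-written sample data standing in for a parsed sheet), the annotation can be derived like this:

```python
# Hand-written stand-in for parameter sets parsed from a YAML data sheet.
param_sets = [
    {"name": "Classic-McEliece-348864",
     "implementations": [{"large-stack-usage": True}]},
    {"name": "BIKE-L1",
     "implementations": [{"large-stack-usage": False}]},
]

def annotate(param_set):
    # Append a dagger if any implementation uses a large amount of stack.
    name = param_set["name"]
    if any(impl["large-stack-usage"] for impl in param_set["implementations"]):
        return name + "\u2020"  # U+2020 is the dagger sign (†)
    return name

print(", ".join(annotate(ps) for ps in param_sets))
# prints: Classic-McEliece-348864†, BIKE-L1
```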
crypto-assumption: QC-MDPC (Quasi-Cyclic Moderate Density Parity-Check)
website: http://bikesuite.org/
nist-round: 4
+standardization-status: Not selected by [NIST](https://bikesuite.org/files/v5.1/BIKE_Spec.2022.10.10.1.pdf)
spec-version: 5.1
primary-upstream:
source: https://github.com/awslabs/bike-kem
website: https://classic.mceliece.org
nist-round: 3
spec-version: SUPERCOP-20221025
+standardization-status: Under [ISO](https://classic.mceliece.org/iso.html) consideration
upstream-ancestors:
- SUPERCOP-20221025 "clean" and "avx2" implementations
advisories:
website: https://frodokem.org/
nist-round: 3
spec-version: NIST Round 3 submission
+standardization-status: Under [ISO](https://frodokem.org/) consideration
primary-upstream:
source: https://github.com/microsoft/PQCrypto-LWEKE/commit/b6609d30a9982318d7f2937aa3c7b92147b917a2
spdx-license-identifier: MIT
crypto-assumption: Syndrome decoding of structure codes (Hamming Quasi-Cyclic)
website: https://pqc-hqc.org/
nist-round: 4
+standardization-status: Selected by [NIST](https://pqc-hqc.org/doc/hqc_specifications_2025_08_22.pdf)
+ for upcoming standardization
spec-version: 2023-04-30
upstream-ancestors:
- https://github.com/SWilson4/package-pqclean/tree/8db1b24b/hqc
crypto-assumption: Module LWE+R with base ring Z[x]/(3329, x^256+1)
website: https://pq-crystals.org/
nist-round: 3
+standardization-status: Selected by [NIST](https://csrc.nist.gov/CSRC/media/Projects/post-quantum-cryptography/documents/round-3/submissions/Kyber-Round3.zip)
+ as basis for ML-KEM (FIPS 203)
spec-version: NIST Round 3 submission
primary-upstream:
source: https://github.com/pq-crystals/kyber/commit/441c0519a07e8b86c8d079954a6b10bd31d29efc
crypto-assumption: Module LWE+R with base ring Z[x]/(3329, x^256+1)
website: https://pq-crystals.org/kyber/ and https://csrc.nist.gov/pubs/fips/203
nist-round: FIPS203
+standardization-status: Standardized by [NIST](https://csrc.nist.gov/pubs/fips/203/final)
spec-version: ML-KEM
primary-upstream:
source: https://github.com/pq-code-package/mlkem-native/commit/048fc2a7a7b4ba0ad4c989c1ac82491aa94d5bfa
- Zhenfei Zhang
crypto-assumption: NTRU in Z[x]/(q, x^n-1) with prime n and power-of-two q
website: https://ntru.org/
+standardization-status: Not selected by [NIST](https://csrc.nist.gov/CSRC/media/Projects/post-quantum-cryptography/documents/round-3/submissions/NTRU-Round3.zip), under standardization consideration by [NTT](https://info.isl.ntt.co.jp/crypt/ntru/index.html)
nist-round: 3
spec-version: NIST Round 3 submission
upstream-ancestors:
website: https://ntruprime.cr.yp.to
nist-round: 3
spec-version: supercop-20200826
+standardization-status: Not selected by [NIST](https://csrc.nist.gov/CSRC/media/Projects/post-quantum-cryptography/documents/round-3/submissions/NTRU-Prime-Round3.zip)
upstream-ancestors:
- https://github.com/jschanck/package-pqclean/tree/4d9f08c3/ntruprime
- supercop-20210604
linear codes on a finite field
website: https://www.cross-crypto.com/
nist-round: 2
+standardization-status: Under [NIST](https://www.cross-crypto.com/CROSS_Specification_v2.2.pdf)
+ consideration
spec-version: 2.2 + PQClean and OQS patches
primary-upstream:
source: https://github.com/CROSS-signature/CROSS-lib-oqs/commit/c8f7411fed136f0e37600973fa3dbed53465e54f
crypto-assumption: hardness of NTRU lattice problems
website: https://falcon-sign.info
nist-round: 3
+standardization-status: Selected by [NIST](https://csrc.nist.gov/CSRC/media/Projects/post-quantum-cryptography/documents/round-3/submissions/Falcon-Round3.zip)
+ for upcoming standardization
spec-version: 20211101
primary-upstream:
source: https://github.com/PQClean/PQClean/commit/1eacfdafc15ddc5d5759d0b85b4cef26627df181
crypto-assumption: multivariable quadratic equations, oil and vinegar
website: https://pqmayo.org
nist-round: 2
+standardization-status: Under [NIST](https://csrc.nist.gov/csrc/media/Projects/pqc-dig-sig/documents/round-2/spec-files/mayo-spec-round2-web.pdf)
+ consideration
spec-version: NIST Round 2 (February 2025)
primary-upstream:
source: https://github.com/PQCMayo/MAYO-C/commit/4b7cd94c96b9522864efe40c6ad1fa269584a807
crypto-assumption: hardness of lattice problems over module lattices
website: https://pq-crystals.org/dilithium/ and https://csrc.nist.gov/pubs/fips/204/final
nist-round: FIPS204
+standardization-status: Standardized by [NIST](https://csrc.nist.gov/pubs/fips/204/final)
spec-version: ML-DSA
primary-upstream:
source: https://github.com/pq-crystals/dilithium/commit/444cdcc84eb36b66fe27b3a2529ee48f6d8150c2
+# Generated from src/sig/slh_dsa/templates/slh_dsa_docs_yml_template.jinja
+# by copy_from_slh_dsa_c.py
name: SLH-DSA
type: signature
principal-submitters:
crypto-assumption: hash-based signatures
website: https://csrc.nist.gov/pubs/fips/205/final
nist-round: FIPS205
+standardization-status: Standardized by NIST
+spec-url: https://csrc.nist.gov/pubs/fips/205/final
spec-version: SLH-DSA
spdx-license-identifier: MIT or ISC or Apache 2.0
primary-upstream:
crypto-assumption: multivariable quadratic equations, oil and vinegar
website: https://snova.pqclab.org/
nist-round: 2
+standardization-status: Under [NIST](https://csrc.nist.gov/csrc/media/Projects/pqc-dig-sig/documents/round-2/spec-files/snova-spec-round2-web.pdf)
+ consideration
spec-version: Round 2
primary-upstream:
source: https://github.com/vacuas/SNOVA/commit/1c3ca6f4f7286c0bde98d7d6f222cf63b9d52bff
crypto-assumption: hash-based signatures
website: https://sphincs.org/
nist-round: 3
+standardization-status: Selected by [NIST](https://sphincs.org/data/sphincs+-r3.1-specification.pdf)
+ as basis for SLH-DSA (FIPS 205)
spec-version: NIST Round 3 submission, v3.1 (June 10, 2022)
spdx-license-identifier: CC0-1.0
primary-upstream:
- Bo-Yin Yang
crypto-assumption: multivariable quadratic equations, oil and vinegar
website: https://www.uovsig.org/
+standardization-status: Under [NIST](https://csrc.nist.gov/csrc/media/Projects/pqc-dig-sig/documents/round-2/spec-files/uov-spec-round2-web.pdf)
+ consideration
nist-round: 2
spec-version: NIST Round 2 (February 2025)
primary-upstream:
crypto-assumption: hash-based signatures
website: https://www.rfc-editor.org/info/rfc8554
nist-round:
+standardization-status: Standardized by [IRTF](https://www.rfc-editor.org/info/rfc8554), approved by [NIST](https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-208.pdf)
spec-version:
spdx-license-identifier:
primary-upstream:
crypto-assumption: hash-based signatures
website: https://www.rfc-editor.org/info/rfc8391
+standardization-status: Standardized by [IRTF](https://www.rfc-editor.org/info/rfc8391), approved by [NIST](https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-208.pdf)
nist-round:
spec-version:
spdx-license-identifier: (Apache-2.0 OR MIT) AND CC0-1.0
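The `standardization-status` and optional `spec-url` fields added in the data sheets above are what the README table renderer consumes. As a simplified sketch of that mapping (the dict below is hand-written sample data, not read from a real sheet), a status cell is produced like this:

```python
# Hand-written sample mimicking a parsed YAML data sheet.
sheet = {
    "name": "SLH-DSA",
    "standardization-status": "Standardized by NIST",
    "spec-url": "https://csrc.nist.gov/pubs/fips/205/final",
}

def status_cell(sheet):
    status = sheet["standardization-status"]
    spec_url = sheet.get("spec-url")
    # When a spec-url is present, the whole status text becomes a markdown link;
    # otherwise the status (which may already contain inline links) is kept as-is.
    return f"[{status}]({spec_url})" if spec_url else status

print(status_cell(sheet))
# prints: [Standardized by NIST](https://csrc.nist.gov/pubs/fips/205/final)
```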
--- /dev/null
+#!/usr/bin/env python3
+# SPDX-License-Identifier: MIT
+
+"""Helper functions for rendering the Algorithm Support table in README.md
+
+This is a separate module to facilitate code formatting and other dev tools,
+but it is not meant to be run by itself. Instead, run the legacy
+scripts/update_docs_from_yaml.py to invoke update_readme in this module.
+"""
+
+import os
+
+import tabulate
+import yaml
+
+YAML_EXTS = [".yaml", ".yml"]
+ALG_SUPPORT_HEADER = [
+ "Algorithm family",
+ "Standardization status",
+ "Primary implementation",
+]
+COMMIT_HASH_LEN = 7
+
+
+def format_upstream_source(source: str) -> str:
+    """For each YAML data sheet, the primary-upstream.source field contains a
+    URL to the implementation. At the moment all URLs are links to GitHub, so
+    we can format them as follows:
+
+ <handle>/<repository>@<commit> if commit is available
+ <handle>/<repository> otherwise
+ with a link to the repository
+ """
+    # TODO: handle GitLab or other non-GitHub links if they ever appear
+ prefix = "https://github.com/"
+    if prefix not in source:
+ raise ValueError(f"Non-GitHub source {source}")
+ url_start = source.find(prefix)
+    # NOTE: split() with no argument splits on any whitespace
+ url = source[url_start:].split()[0]
+ # example: ["PQClean", "PQClean", "commit", "1eacfdaf..."]
+ tokens = url[len(prefix) :].split("/")
+ handle, repo = tokens[0], tokens[1]
+ output = f"{handle}/{repo}"
+ if "commit/" in url:
+ commit = tokens[3][:COMMIT_HASH_LEN]
+ output += f"@{commit}"
+ return f"[`{output}`]({url})"
+
+
+def render_alg_support_tbl(doc_dir: str, anchor_alg_name: bool = False) -> str:
+ """Render a markdown table summarizing the algorithms described by YAML data
+ sheets stored in the specified doc directory
+
+ :param anchor_alg_name: if set to True, then "algorithm family" will link to
+ the corresponding markdown document under docs/algorithms/<kem|sig|sig_stfl>
+ otherwise "algorithm family" will be plain text with no link.
+ """
+ # TODO: anchor_alg_name is turned off because Doxygen cannot handle links
+ # to markdown files under docs/algorithms/xxx
+ yaml_paths = [
+ os.path.abspath(os.path.join(doc_dir, filepath))
+ for filepath in os.listdir(doc_dir)
+ if os.path.splitext(filepath)[1].lower() in YAML_EXTS
+ ]
+ yaml_paths.sort()
+ rows = [ALG_SUPPORT_HEADER]
+ for yaml_path in yaml_paths:
+ with open(yaml_path) as f:
+ algdata = yaml.safe_load(f)
+ alg_name = algdata["name"]
+ dirname = "kem"
+ if "sig/" in yaml_path:
+ dirname = "sig"
+ elif "sig_stfl/" in yaml_path:
+ dirname = "sig_stfl"
+ md_basename = os.path.splitext(os.path.split(yaml_path)[1])[0]
+ md_url = f"docs/algorithms/{dirname}/{md_basename}.md"
+ std_status = algdata["standardization-status"]
+ spec_url = algdata.get("spec-url", None)
+ primary_impl = format_upstream_source(algdata["primary-upstream"]["source"])
+ rows.append(
+ [
+ f"[{alg_name}]({md_url})" if anchor_alg_name else f"{alg_name}",
+ f"[{std_status}]({spec_url})" if spec_url else std_status,
+ primary_impl,
+ ]
+ )
+ tbl = tabulate.tabulate(rows, tablefmt="pipe", headers="firstrow")
+ return tbl
+
+
+def update_readme(liboqs_dir: str):
+ """Per liboqs/issues/2045, update README.md with an algorithm support table
+
+ The algorithm support table is a summary of individual algorithms currently
+    integrated into liboqs. The primary sources of information are the various
+    YAML files under the docs/algorithms/<kem|sig|sig_stfl> directories. The table
+ summarizes the following attributes:
+ - Algorithm family (e.g. Kyber, ML-KEM)
+ - Standardization status, with link to specification
+ - Primary source of implementation
+ - (WIP) Maintenance status
+ """
+ kem_doc_dir = os.path.join(liboqs_dir, "docs", "algorithms", "kem")
+ kem_tbl = render_alg_support_tbl(kem_doc_dir)
+ sig_doc_dir = os.path.join(liboqs_dir, "docs", "algorithms", "sig")
+ sig_tbl = render_alg_support_tbl(sig_doc_dir)
+ sig_stfl_doc_dir = os.path.join(liboqs_dir, "docs", "algorithms", "sig_stfl")
+ sig_stfl_tbl = render_alg_support_tbl(sig_stfl_doc_dir)
+ md_str = f"""#### Key encapsulation mechanisms
+{kem_tbl}
+
+#### Signature schemes
+{sig_tbl}
+
+#### Stateful signature schemes
+{sig_stfl_tbl}
+"""
+ readme_path = os.path.join(liboqs_dir, "README.md")
+ fragment_start = "<!-- OQS_TEMPLATE_FRAGMENT_ALG_SUPPORT_START -->\n"
+ fragment_end = "<!-- OQS_TEMPLATE_FRAGMENT_ALG_SUPPORT_END -->"
+ with open(readme_path, "r") as f:
+ readme = f.read()
+ fragment_start_loc = readme.find(fragment_start) + len(fragment_start)
+ fragment_end_loc = readme.find(fragment_end)
+ with open(readme_path, "w") as f:
+ f.write(readme[:fragment_start_loc])
+ f.write(md_str)
+ f.write(readme[fragment_end_loc:])
+#!/usr/bin/env python3
# SPDX-License-Identifier: MIT
import argparse
-import sys
import glob
+import os
+
import tabulate
import yaml
-import os
+
+from update_alg_support_table import update_readme
def load_yaml(filename, encoding='utf-8'):
    with open(filename, mode='r', encoding=encoding) as fh:
        return yaml.safe_load(fh.read())

def file_get_contents(filename, encoding=None):
    with open(filename, mode='r', encoding=encoding) as fh:
        return fh.read()
-kem_yamls = []
-sig_yamls = []
-sig_stfl_yamls = []
-
########################################
# Update the KEM markdown documentation.
########################################
def do_it(liboqs_root):
+ kem_yamls = []
+ sig_yamls = []
+ sig_stfl_yamls = []
+
for kem_yaml_path in sorted(glob.glob(os.path.join(liboqs_root, 'docs', 'algorithms', 'kem', '*.yml'))):
kem_yaml = load_yaml(kem_yaml_path)
kem_yamls.append(kem_yaml)
out_md.write(tabulate.tabulate(table, tablefmt="pipe", headers="firstrow", colalign=("center",)))
out_md.write('\n')
+ update_readme(liboqs_root)
- ####################
- # Update the README.
- ####################
- print("Updating README.md")
-
- readme_path = os.path.join(liboqs_root, 'README.md')
- start_identifier_tmpl = '<!--- OQS_TEMPLATE_FRAGMENT_LIST_{}_START -->'
- end_identifier_tmpl = '<!--- OQS_TEMPLATE_FRAGMENT_LIST_{}_END -->'
-
- # KEMS
- readme_contents = file_get_contents(readme_path)
-
- identifier_start = start_identifier_tmpl.format('KEXS')
- identifier_end = end_identifier_tmpl.format('KEXS')
-
- preamble = readme_contents[:readme_contents.find(identifier_start)]
- postamble = readme_contents[readme_contents.find(identifier_end):]
-
- with open(readme_path, mode='w', encoding='utf-8') as readme:
- readme.write(preamble + identifier_start + '\n')
-
- for kem_yaml in kem_yamls:
- parameter_sets = kem_yaml['parameter-sets']
- if any(impl['large-stack-usage'] for impl in parameter_sets[0]['implementations']):
- readme.write('- **{}**: {}†'.format(kem_yaml['name'], parameter_sets[0]['name']))
- if 'alias' in parameter_sets[0]:
- readme.write(' (alias: {})'.format(parameter_sets[0]['alias']))
- else:
- readme.write('- **{}**: {}'.format(kem_yaml['name'], parameter_sets[0]['name']))
- if 'alias' in parameter_sets[0]:
- readme.write(' (alias: {})'.format(parameter_sets[0]['alias']))
- for parameter_set in parameter_sets[1:]:
- if any(impl['large-stack-usage'] for impl in parameter_set['implementations']):
- readme.write(', {}†'.format(parameter_set['name']))
- if 'alias' in parameter_set:
- readme.write(' (alias: {})'.format(parameter_set['alias']))
- else:
- readme.write(', {}'.format(parameter_set['name']))
- if 'alias' in parameter_set:
- readme.write(' (alias: {})'.format(parameter_set['alias']))
- readme.write('\n')
-
- readme.write(postamble)
-
- # Signatures
- readme_contents = file_get_contents(readme_path)
-
- identifier_start = start_identifier_tmpl.format('SIGS')
- identifier_end = end_identifier_tmpl.format('SIGS')
-
- preamble = readme_contents[:readme_contents.find(identifier_start)]
- postamble = readme_contents[readme_contents.find(identifier_end):]
-
- with open(readme_path, mode='w', encoding='utf-8') as readme:
- readme.write(preamble + identifier_start + '\n')
-
- for sig_yaml in sig_yamls:
- # SPHINCS requires special handling.
- if "SPHINCS" in sig_yaml["name"]:
- for hash_func in ['SHA2', 'SHAKE']:
- parameter_sets = [pset for pset in sig_yaml['parameter-sets'] if hash_func in pset['name']]
- if any(impl['large-stack-usage'] for impl in parameter_sets[0]['implementations']):
- readme.write('- **SPHINCS+-{}**: {}†'.format(hash_func, parameter_sets[0]['name'].replace('_','\\_')))
- else:
- readme.write('- **SPHINCS+-{}**: {}'.format(hash_func, parameter_sets[0]['name'].replace('_','\\_')))
- for parameter_set in parameter_sets[1:]:
- if any(impl['large-stack-usage'] for impl in parameter_set['implementations']):
- readme.write(', {}†'.format(parameter_set['name'].replace('_', '\\_')))
- else:
- readme.write(', {}'.format(parameter_set['name'].replace('_', '\\_')))
- readme.write('\n')
- continue
-
- parameter_sets = sig_yaml['parameter-sets']
- if any(impl['large-stack-usage'] for impl in parameter_sets[0]['implementations']):
- readme.write('- **{}**: {}†'.format(sig_yaml['name'], parameter_sets[0]['name'].replace('_','\\_')))
- if 'alias' in parameter_sets[0]:
- readme.write(' (alias: {})'.format(parameter_sets[0]['alias']).replace('_','\\_'))
- else:
- readme.write('- **{}**: {}'.format(sig_yaml['name'], parameter_sets[0]['name'].replace('_','\\_')))
- if 'alias' in parameter_sets[0]:
- readme.write(' (alias: {})'.format(parameter_sets[0]['alias']).replace('_','\\_'))
- for parameter_set in parameter_sets[1:]:
- if any(impl['large-stack-usage'] for impl in parameter_set['implementations']):
- readme.write(', {}†'.format(parameter_set['name'].replace('_', '\\_')))
- if 'alias' in parameter_set:
- readme.write(' (alias: {})'.format(parameter_set['alias']).replace('_','\\_'))
- else:
- readme.write(', {}'.format(parameter_set['name'].replace('_', '\\_')))
- if 'alias' in parameter_set:
- readme.write(' (alias: {})'.format(parameter_set['alias']).replace('_','\\_'))
- readme.write('\n')
-
-
- readme.write(postamble)
-
if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument("--liboqs-root", default=".")
+# Generated from src/sig/slh_dsa/templates/slh_dsa_docs_yml_template.jinja
+# by copy_from_slh_dsa_c.py
name: SLH-DSA
type: signature
principal-submitters:
crypto-assumption: hash-based signatures
website: https://csrc.nist.gov/pubs/fips/205/final
nist-round: FIPS205
+standardization-status: Standardized by NIST
+spec-url: https://csrc.nist.gov/pubs/fips/205/final
spec-version: SLH-DSA
spdx-license-identifier: MIT or ISC or Apache 2.0
primary-upstream:
large-stack-usage: true
{% endfor %}
-
\ No newline at end of file
+
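Per the commit messages, `render_alg_support_tbl` wraps each algorithm's standardization status in a link to its spec and renders the primary implementation monospaced. A hedged sketch of one such table row, using a plain dict in place of the parsed YAML (`render_row` is a hypothetical helper, the `SLH-DSA-C` upstream name is illustrative, and the real column set may differ):

```python
def render_row(alg):
    """Render one algorithm dict as a markdown pipe-table row.

    Columns follow the docstring in the patch: family name,
    standardization status (linked to the spec), primary implementation.
    """
    status = alg.get("standardization-status", "")
    spec_url = alg.get("spec-url")
    if spec_url:
        # Wrap the standardization status with a URL to the spec.
        status = f"[{status}]({spec_url})"
    upstream = alg.get("primary-upstream", {}).get("source", "")
    # Primary implementation is rendered monospaced.
    return f"| {alg['name']} | {status} | `{upstream}` |"
```

For the SLH-DSA fragment above this would yield a row whose status cell links to the FIPS 205 page, e.g. `render_row({"name": "SLH-DSA", "standardization-status": "Standardized by NIST", "spec-url": "https://csrc.nist.gov/pubs/fips/205/final", "primary-upstream": {"source": "SLH-DSA-C"}})`.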