fix: use node.js crypto for x25519 keys #389
Conversation
| "clean": "aegir clean", | ||
| "dep-check": "aegir dep-check", | ||
| "build": "aegir build", | ||
| "prebuild": "mkdirp dist/src && cp -R src/proto dist/src", |
No need to do this with protons
| const asImpl = new ChaCha20Poly1305(ctx) | ||
| const CHACHA_POLY1305 = 'chacha20-poly1305' | ||
| const PKCS8_PREFIX = Buffer.from([0x30, 0x2e, 0x02, 0x01, 0x00, 0x30, 0x05, 0x06, 0x03, 0x2b, 0x65, 0x6e, 0x04, 0x22, 0x04, 0x20]) | ||
| const X25519_PREFIX = Buffer.from([0x30, 0x2a, 0x30, 0x05, 0x06, 0x03, 0x2b, 0x65, 0x6e, 0x03, 0x21, 0x00]) |
Node.js requires these prefixes; `@noble/curves` doesn't add them.
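The technique can be sketched roughly like this (the helper names are mine, not the PR's): Node's `crypto.createPrivateKey`/`createPublicKey` accept DER or PEM rather than raw key bytes, so the raw 32-byte X25519 keys get wrapped in these fixed ASN.1 prefixes before import.

```typescript
import crypto from 'node:crypto'

// Fixed ASN.1 headers for X25519 keys (from the diff above):
// PKCS#8 wrapper for a 32-byte private key, SPKI wrapper for a public key
const PKCS8_PREFIX = Buffer.from([0x30, 0x2e, 0x02, 0x01, 0x00, 0x30, 0x05, 0x06, 0x03, 0x2b, 0x65, 0x6e, 0x04, 0x22, 0x04, 0x20])
const X25519_PREFIX = Buffer.from([0x30, 0x2a, 0x30, 0x05, 0x06, 0x03, 0x2b, 0x65, 0x6e, 0x03, 0x21, 0x00])

// Hypothetical helper: prepend the PKCS#8 header so Node will parse the
// raw private key as DER
function importX25519PrivateKey (raw: Uint8Array): crypto.KeyObject {
  return crypto.createPrivateKey({
    key: Buffer.concat([PKCS8_PREFIX, raw]),
    format: 'der',
    type: 'pkcs8'
  })
}

// Hypothetical helper: same idea for the public key, using the SPKI header
function importX25519PublicKey (raw: Uint8Array): crypto.KeyObject {
  return crypto.createPublicKey({
    key: Buffer.concat([X25519_PREFIX, raw]),
    format: 'der',
    type: 'spki'
  })
}

// The DER export is just prefix + raw bytes, so stripping the 16-byte
// (private) or 12-byte (public) header recovers the raw key material
const { publicKey, privateKey } = crypto.generateKeyPairSync('x25519')
const rawPriv = privateKey.export({ format: 'der', type: 'pkcs8' }).subarray(16)
const rawPub = publicKey.export({ format: 'der', type: 'spki' }).subarray(12)
```

Re-importing the stripped raw bytes through the helpers yields the same key, which is the round trip the crypto implementation in this PR relies on.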
Using node crypto to do X25519 key operations instead of `@noble/curves` yields a nice little performance bump which translates to slightly lower latencies when opening connections.

Running the `benchmark.js` file:

Before:

```console
% node ./benchmark.js
Initializing handshake benchmark
Init complete, running benchmark
handshake x 124 ops/sec ±0.47% (84 runs sampled)
```

After:

```console
% node ./benchmark.js
Initializing handshake benchmark
Init complete, running benchmark
handshake x 314 ops/sec ±0.99% (87 runs sampled)
```
a020ec9 to e63bf0f
| export const uint16BEEncode = (value: number): Uint8Array => { | ||
| const target = allocUnsafe(2) | ||
| const target = uint8ArrayAllocUnsafe(2) |
`alloc` is in `uint8arrays` now; it does the same thing.
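For illustration, a self-contained version of the encoder in the diff. Since `uint8arrays`' `allocUnsafe` delegates to `Buffer.allocUnsafe` when running under Node, `Buffer` is used directly here to keep the sketch dependency-free:

```typescript
// Sketch of the length-prefix encoder from the diff: writes a uint16 in
// big-endian byte order into 2 freshly allocated (uninitialised) bytes
const uint16BEEncode = (value: number): Uint8Array => {
  const target = Buffer.allocUnsafe(2)
  target.writeUInt16BE(value, 0)
  return target
}
```

`allocUnsafe` skips zero-filling, which is fine here because both bytes are overwritten immediately.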
| ne: input.subarray(0, 32), | ||
| ciphertext: input.subarray(32, input.length), | ||
| ns: new Uint8Array(0) | ||
| ns: uint8ArrayAlloc(0) |
Use Buffers in node wherever possible.
| @@ -4,8 +4,8 @@ | |||
| /* eslint-disable @typescript-eslint/no-unnecessary-boolean-literal-compare */ | |||
Regenerates code using the latest protons, which now uses `alloc` from `uint8arrays` instead of `new Uint8Array` to get Buffers where possible.
| await handshakeInitator.finish() | ||
| await handshakeResponder.finish() | ||
| await Promise.all([ |
For better memory performance, it-byte-stream now needs the reader to read a buf before the sender can send another one, so these operations need to be done in parallel.
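The deadlock this comment is guarding against can be illustrated with a toy rendezvous channel (this is a sketch of the backpressure behaviour, not the actual it-byte-stream API): a write only settles once the peer has read the value, so awaiting one side's handshake to completion before starting the other hangs on the first message, while `Promise.all` lets both sides make progress.

```typescript
// Toy channel where write() only resolves after the value is read()
class Rendezvous {
  private readers: Array<(v: string) => void> = []
  private writers: Array<{ value: string, sent: () => void }> = []

  async write (value: string): Promise<void> {
    const reader = this.readers.shift()
    if (reader != null) { reader(value); return }
    // no reader yet: block until someone reads this value
    await new Promise<void>(sent => this.writers.push({ value, sent }))
  }

  async read (): Promise<string> {
    const writer = this.writers.shift()
    if (writer != null) { writer.sent(); return writer.value }
    return new Promise<string>(resolve => this.readers.push(resolve))
  }
}

// Three-message exchange in the shape of the XX handshake
async function main (): Promise<string[]> {
  const aToB = new Rendezvous()
  const bToA = new Rendezvous()
  const log: string[] = []

  const initiator = async (): Promise<void> => {
    await aToB.write('msg1')
    log.push(await bToA.read())
    await aToB.write('msg3')
  }

  const responder = async (): Promise<void> => {
    log.push(await aToB.read())
    await bToA.write('msg2')
    log.push(await aToB.read())
  }

  // `await initiator(); await responder()` would hang on the first write;
  // both sides have to run concurrently:
  await Promise.all([initiator(), responder()])
  return log
}
```

This mirrors why the test was changed from two sequential `finish()` awaits to a `Promise.all` over both.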
| const initPayload = await getPayload(peerA, staticKeysInitiator.publicKey) | ||
| const handshakeInitator = new XXHandshake(true, initPayload, prologue, pureJsCrypto, staticKeysInitiator, connectionFrom, peerB) | ||
| const handshakeInitiator = new XXHandshake(true, initPayload, prologue, defaultCrypto, staticKeysInitiator, connectionFrom, peerB) |
Typo Initator/Initiator
| import { duplexPair } from 'it-pair/duplex' | ||
| import { equals as uint8ArrayEquals } from 'uint8arrays/equals' | ||
| import { pureJsCrypto } from '../src/crypto/js.js' | ||
| import { defaultCrypto } from '../src/crypto/index.js' |
Use the default crypto for the platform to ensure all are tested.
| }, | ||
| generateX25519KeyPairFromSeed (seed: Uint8Array): KeyPair { | ||
| const privateKey = crypto.createPrivateKey({ | ||
| key: Buffer.concat([ |
Did you mean to use both Buffer.concat and uint8ArrayConcat in this file?
No, I used Buffer.concat because this file will only be consumed from node/electron-main, so uint8ArrayConcat would chain through to Buffer.concat anyway; calling it directly cuts out the middleman.
Not that I would expect it to make a difference to the overall benchmark, mind.
Using node crypto to do X25519 key operations instead of `@noble/curves` yields a nice little performance bump which translates to ever so slightly lower latencies when opening connections.

Running the `./benchmarks/benchmark.js` file shows a 2x improvement:

Before:

After:

Flamegraphs (same scale)

Before:

After: