
fix: use node.js crypto for x25519 keys #389

Merged
wemeetagain merged 2 commits into ChainSafe:master from achingbrain:fix/use-node-x25519-crypto on Nov 24, 2023

Conversation

@achingbrain (Collaborator) commented Nov 23, 2023

Using node crypto to do X25519 key operations instead of @noble/curves yields a nice little performance bump which translates to ever so slightly lower latencies when opening connections.


Running the `./benchmarks/benchmark.js` file shows a ~2.5x improvement (124 → 314 ops/sec):

Before:

```console
% node ./benchmark.js
Initializing handshake benchmark
Init complete, running benchmark
handshake x 124 ops/sec ±0.47% (84 runs sampled)
```

After:

```console
% node ./benchmark.js
Initializing handshake benchmark
Init complete, running benchmark
handshake x 314 ops/sec ±0.99% (87 runs sampled)
```

Flamegraphs (same scale):

Before: [flamegraph image]

After: [flamegraph image]

@achingbrain achingbrain requested a review from a team as a code owner November 23, 2023 18:10
Comment thread: package.json

```
    "clean": "aegir clean",
    "dep-check": "aegir dep-check",
    "build": "aegir build",
    "prebuild": "mkdirp dist/src && cp -R src/proto dist/src",
```

achingbrain (Collaborator, Author): No need to do this with protons.

Comment thread: src/crypto/index.ts

```ts
const asImpl = new ChaCha20Poly1305(ctx)
const CHACHA_POLY1305 = 'chacha20-poly1305'
const PKCS8_PREFIX = Buffer.from([0x30, 0x2e, 0x02, 0x01, 0x00, 0x30, 0x05, 0x06, 0x03, 0x2b, 0x65, 0x6e, 0x04, 0x22, 0x04, 0x20])
const X25519_PREFIX = Buffer.from([0x30, 0x2a, 0x30, 0x05, 0x06, 0x03, 0x2b, 0x65, 0x6e, 0x03, 0x21, 0x00])
```

achingbrain (Collaborator, Author): Node.js requires these prefixes, @noble/curves doesn't add them.

@achingbrain achingbrain force-pushed the fix/use-node-x25519-crypto branch from a020ec9 to e63bf0f on November 23, 2023 18:13
Comment thread: src/encoder.ts

```diff
 export const uint16BEEncode = (value: number): Uint8Array => {
-  const target = allocUnsafe(2)
+  const target = uint8ArrayAllocUnsafe(2)
```

achingbrain (Collaborator, Author): alloc is in uint8arrays now, it does the same thing.
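For illustration, a minimal Node-only equivalent of this encoder, assuming (per the comment) that `uint8ArrayAllocUnsafe` returns a `Buffer` on Node. This is a sketch, not the library's actual implementation; `allocUnsafe` is fine here because both bytes are overwritten immediately.

```javascript
// Write a 16-bit unsigned integer in big-endian order into a fresh buffer.
const uint16BEEncode = (value) => {
  const target = Buffer.allocUnsafe(2) // uninitialised memory, overwritten below
  target.writeUInt16BE(value, 0)
  return target
}

console.log(uint16BEEncode(0xabcd)) // <Buffer ab cd>
```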

Comment thread: src/encoder.ts

```diff
   ne: input.subarray(0, 32),
   ciphertext: input.subarray(32, input.length),
-  ns: new Uint8Array(0)
+  ns: uint8ArrayAlloc(0)
```

achingbrain (Collaborator, Author): Use Buffers in node wherever possible.

Comment thread: src/proto/payload.ts

```diff
@@ -4,8 +4,8 @@
 /* eslint-disable @typescript-eslint/no-unnecessary-boolean-literal-compare */
```

achingbrain (Collaborator, Author) commented Nov 23, 2023: Regenerates code using the latest protons, which now uses alloc from uint8arrays instead of new Uint8Array to get Buffers where possible.

Comment thread: test/xx-handshake.spec.ts

```diff
-    await handshakeInitator.finish()
-    await handshakeResponder.finish()
+    await Promise.all([
```

achingbrain (Collaborator, Author): For better memory performance, it-byte-stream now needs the reader to read a buf before the sender can send another one, so these operations need to be done in parallel.
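A toy model of that backpressure, with a hypothetical rendezvous `channel()` (not the it-byte-stream API): `send()` resolves only once the peer has read the value, so finishing one side before starting the other would deadlock, while running both under `Promise.all` completes.

```javascript
// Rendezvous channel: send() blocks until read(), read() blocks until send().
function channel () {
  let pendingSend = null
  let pendingRead = null
  return {
    send (value) {
      return new Promise(resolve => {
        if (pendingRead != null) {
          const read = pendingRead
          pendingRead = null
          read(value)
          resolve()
        } else {
          pendingSend = { value, resolve }
        }
      })
    },
    read () {
      return new Promise(resolve => {
        if (pendingSend != null) {
          const send = pendingSend
          pendingSend = null
          send.resolve()
          resolve(send.value)
        } else {
          pendingRead = resolve
        }
      })
    }
  }
}

// Both sides must run concurrently, mirroring the change to the handshake
// test: `await initiator(); await responder()` would never get past the
// initiator's first send.
async function handshake () {
  const aToB = channel()
  const bToA = channel()

  const initiator = async () => {
    await aToB.send('e') // blocks until the responder reads it
    return bToA.read()
  }

  const responder = async () => {
    const msg = await aToB.read()
    await bToA.send(`${msg} ack`)
  }

  const [reply] = await Promise.all([initiator(), responder()])
  return reply
}

handshake().then(reply => { console.log(reply) }) // 'e ack'
```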

Comment thread: test/xx-handshake.spec.ts

```diff
 const initPayload = await getPayload(peerA, staticKeysInitiator.publicKey)
-const handshakeInitator = new XXHandshake(true, initPayload, prologue, pureJsCrypto, staticKeysInitiator, connectionFrom, peerB)
+const handshakeInitiator = new XXHandshake(true, initPayload, prologue, defaultCrypto, staticKeysInitiator, connectionFrom, peerB)
```

achingbrain (Collaborator, Author): Typo: Initator → Initiator.

Comment thread: test/xx-handshake.spec.ts

```diff
 import { duplexPair } from 'it-pair/duplex'
 import { equals as uint8ArrayEquals } from 'uint8arrays/equals'
-import { pureJsCrypto } from '../src/crypto/js.js'
+import { defaultCrypto } from '../src/crypto/index.js'
```

achingbrain (Collaborator, Author): Use the default crypto for the platform to ensure all are tested.

Comment thread: src/crypto/index.ts

```ts
  },
  generateX25519KeyPairFromSeed (seed: Uint8Array): KeyPair {
    const privateKey = crypto.createPrivateKey({
      key: Buffer.concat([
```

(Member): Did you mean to use both Buffer.concat and uint8ArrayConcat in this file?

achingbrain (Collaborator, Author): No - I used Buffer.concat because this file will only be consumed from node/electron-main, so uint8ArrayConcat would chain through to Buffer.concat anyway; it cuts out the middleman.

Not that I'd expect it to make a difference to the overall benchmark, mind.
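A small sketch of why this shortcut is type-safe on Node: `Buffer` is a `Uint8Array` subclass, so the output of `Buffer.concat` satisfies any `Uint8Array`-typed API. (That uint8arrays' `concat` delegates to `Buffer.concat` on Node is taken from the comment above, not verified here.)

```javascript
// Buffer.concat returns a Buffer, which is itself a Uint8Array, so callers
// expecting a plain Uint8Array are unaffected by skipping uint8ArrayConcat.
const a = Buffer.from([0x01, 0x02])
const b = Buffer.from([0x03])
const joined = Buffer.concat([a, b])

console.log(joined instanceof Uint8Array) // true
console.log(Array.from(joined)) // [ 1, 2, 3 ]
```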

@wemeetagain wemeetagain merged commit 8ba08ea into ChainSafe:master on Nov 24, 2023