If you need long support, there is also a [TypeScript definition](https://github.com/DefinitelyTyped/DefinitelyTyped/blob/types-2.0/long/index.d.ts) for that (on npm: [@types/long](https://www.npmjs.com/package/@types/long)).
See also: [Generating your own TypeScript definitions](https://github.com/dcodeIO/protobuf.js#generating-typescript-definitions-from-static-modules)
Additional configuration might be necessary when not using node, e.g. referencing [protobuf.js.d.ts](https://github.com/dcodeIO/protobuf.js/blob/master/index.d.ts) and [long.js.d.ts](https://github.com/DefinitelyTyped/DefinitelyTyped/blob/types-2.0/long/index.d.ts).
Module Structure
----------------
The library exports a flat `protobuf` namespace including but not restricted to the following members, ordered by category:
The package includes a benchmark that tries to compare performance to native JSON:

```
benchmarking encoding performance ...

Type.encode to buffer x 553,499 ops/sec ±0.33% (91 runs sampled)
JSON.stringify to string x 313,354 ops/sec ±0.84% (89 runs sampled)
JSON.stringify to buffer x 177,932 ops/sec ±0.78% (88 runs sampled)

Type.encode to buffer was fastest
JSON.stringify to string was 43.7% slower
JSON.stringify to buffer was 68.0% slower

benchmarking decoding performance ...

Type.decode from buffer x 1,352,868 ops/sec ±0.66% (89 runs sampled)
JSON.parse from string x 293,883 ops/sec ±0.55% (92 runs sampled)
JSON.parse from buffer x 267,287 ops/sec ±0.83% (91 runs sampled)

Type.decode from buffer was fastest
JSON.parse from string was 78.3% slower
JSON.parse from buffer was 80.3% slower

benchmarking combined performance ...

Type to/from buffer x 267,534 ops/sec ±0.88% (91 runs sampled)
JSON to/from string x 129,143 ops/sec ±0.66% (92 runs sampled)
JSON to/from buffer x 91,789 ops/sec ±0.73% (87 runs sampled)

Type to/from buffer was fastest
JSON to/from string was 51.6% slower
JSON to/from buffer was 65.6% slower

benchmarking verifying performance ...

Type.verify x 6,431,917 ops/sec ±0.49% (91 runs sampled)

benchmarking converting performance ...

Message.from x 629,785 ops/sec ±0.62% (94 runs sampled)
Message#asJSON x 609,017 ops/sec ±0.74% (93 runs sampled)

Message.from was fastest
Message#asJSON was 3.4% slower
```
Note that JSON is a native binding nowadays and as such is about as fast as it possibly can get. So, how can protobuf.js be faster?