Commit 170a407

Copilot and justinchuby committed

fix: add linear_before_reset=1 to GRU op calls in aten_gru

Co-authored-by: justinchuby <11205048+justinchuby@users.noreply.github.com>

1 parent 906fdc5

File tree

  • onnxscript/function_libs/torch_lib/ops

1 file changed, +2 −0 lines changed

onnxscript/function_libs/torch_lib/ops/core.py

Lines changed: 2 additions & 0 deletions
@@ -4357,6 +4357,7 @@ def aten_gru(
             initial_h=layer_h,
             direction=direction,
             hidden_size=hidden_size_attr,
+            linear_before_reset=1,
         )
     else:
         Y, Y_h = op.GRU(
@@ -4366,6 +4367,7 @@ def aten_gru(
             initial_h=layer_h,
             direction=direction,
             hidden_size=hidden_size_attr,
+            linear_before_reset=1,
         )

         # Y shape: [seq_length, num_directions, batch_size, hidden_size]
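For context on why this attribute matters: ONNX's GRU operator defaults to linear_before_reset=0 (reset gate applied to the hidden state *before* the recurrence matmul), while PyTorch's nn.GRU applies the reset gate *after* the recurrence matmul and bias, which corresponds to linear_before_reset=1. A minimal NumPy sketch of the two candidate-state conventions; all weights and gate values below are made-up random numbers for illustration, not taken from the patch:

```python
import numpy as np

rng = np.random.default_rng(0)
hidden = 4
h_prev = rng.standard_normal(hidden)       # previous hidden state H_{t-1}
x_contrib = rng.standard_normal(hidden)    # input-side term (Xt @ Wh.T + Wbh), precomputed for brevity
R = rng.standard_normal((hidden, hidden))  # recurrence weight Rh
Rb = rng.standard_normal(hidden)           # recurrence bias Rbh
r = 1 / (1 + np.exp(-rng.standard_normal(hidden)))  # reset gate output, in (0, 1)

# linear_before_reset=0 (ONNX default): reset the state, then apply the recurrence
h_default = np.tanh(x_contrib + R @ (r * h_prev) + Rb)

# linear_before_reset=1 (PyTorch convention): apply the recurrence, then reset
h_pytorch = np.tanh(x_contrib + r * (R @ h_prev + Rb))

# The two conventions generally produce different candidate states,
# which is why the exporter must set the attribute explicitly.
assert not np.allclose(h_default, h_pytorch)
```

With random weights the two formulas disagree, so omitting the attribute silently changes the numerics of an exported PyTorch GRU.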
