Remove attentions from TransformerOutput and stop computing attention weights in FLAVA #2550
Annotations
3 errors and 2 warnings
Errors:

unit_tests (3.8)
Process completed with exit code 1.

unit_tests (3.9)
The strategy configuration was canceled because "unit_tests._3_8" failed

unit_tests (3.9)
The operation was canceled.

Warnings:

unit_tests (3.8)
WARNING conda.cli.main_config:_set_key(456): Key auto_activate_base is an alias of auto_activate; setting value with latter

unit_tests (3.9)
WARNING conda.cli.main_config:_set_key(456): Key auto_activate_base is an alias of auto_activate; setting value with latter
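The conda warning above suggests the CI setup still sets the legacy `auto_activate_base` key. A minimal sketch of the updated invocation, assuming the workflow uses `conda config --set` (the actual CI script is not shown here, so the original line is a guess):

```shell
# Hypothetical CI setup step. The warning says auto_activate_base is now an
# alias of auto_activate, so newer conda versions prefer the latter key.
# Legacy form (triggers the warning):
#   conda config --set auto_activate_base false
# Updated form:
conda config --set auto_activate false
```

This only silences the warning; it does not affect the unit-test failures, which stem from the 3.8 job exiting with code 1 and the strategy cancellation propagating to the 3.9 job.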