
Remove attentions from TransformerOutput and stop computing attention weights in FLAVA #2550

Triggered via pull request: February 11, 2026, 19:31
Status: Failure
Total duration: 1m 5s

unit_test.yaml

on: pull_request
Matrix: unit_tests
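
The matrix expands into the two jobs reported in the annotations below, unit_tests (3.8) and unit_tests (3.9). As a minimal sketch of how such a matrix is typically declared (the actual contents of unit_test.yaml are not shown on this page, so the step names, runner, and test command are assumptions inferred from the job names):

```yaml
# Hypothetical reconstruction of unit_test.yaml, inferred from the
# job names "unit_tests (3.8)" and "unit_tests (3.9)"; the real
# workflow may differ.
name: unit_test

on: pull_request

jobs:
  unit_tests:
    runs-on: ubuntu-latest
    strategy:
      # fail-fast defaults to true, which matches the behavior below:
      # once the 3.8 job fails, the in-flight 3.9 job is canceled.
      fail-fast: true
      matrix:
        python-version: ["3.8", "3.9"]
    steps:
      - uses: actions/checkout@v3
      - name: Run unit tests
        run: python -m pytest tests/
```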

Annotations

3 errors and 2 warnings
Errors

unit_tests (3.8): Process completed with exit code 1.
unit_tests (3.9): The strategy configuration was canceled because "unit_tests._3_8" failed.
unit_tests (3.9): The operation was canceled.

Warnings

unit_tests (3.8): WARNING conda.cli.main_config:_set_key(456): Key auto_activate_base is an alias of auto_activate; setting value with latter
unit_tests (3.9): WARNING conda.cli.main_config:_set_key(456): Key auto_activate_base is an alias of auto_activate; setting value with latter
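
Both warnings indicate that a setup step in the workflow configures conda via the deprecated key name; per the message itself, auto_activate_base is now an alias of auto_activate. A hedged sketch of the step change that would silence the warning (the step name and its exact place in unit_test.yaml are assumptions):

```yaml
# Hypothetical setup step: switching to the non-aliased key silences
# the conda warning seen in both matrix jobs. Assumes the workflow
# currently runs `conda config --set auto_activate_base ...`.
- name: Configure conda
  run: conda config --set auto_activate false
```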