What steps does it take to reproduce the issue?
Uploading an SPSS (.sav) file.
-
Which page(s) does it occur on?
List of files
-
What happens?
On ingest, the following error is shown: "Tabular ingest was unsuccessful. Ingest produced tabular data but failed to save it in the database. Transaction aborted. No further information is available."
-
To whom does it occur (all users, curators, superusers)?
all users
-
What did you expect to happen?
The .SAV file should have been converted to a .TAB file.
In [31]: # Print relevant metadata
...: print("File Encoding:", meta.file_encoding)
...: print("Creation Time:", meta.creation_time)
...: print("Modification Time:", meta.modification_time)
...: print("File Format:", meta.file_format)
...: print("Number of Rows:", meta.number_rows)
...: print("Number of Columns:", meta.number_columns)
...: print("Notes:", meta.notes)
File Encoding: WINDOWS-1252
Creation Time: 2024-06-14 11:03:04
Modification Time: 2024-06-14 11:03:04
File Format: sav/zsav
Number of Rows: 108252
Number of Columns: 691
Notes: []
File Size (bytes): 673496320
Bytes per Cell: 9.003704867663398
Total cells: 108,252 rows × 691 columns = 74,802,132.
Bytes per cell: 673,496,320 / 74,802,132 ≈ 9.00.
With 691 columns and 9 bytes/cell, the file might include extensive metadata (e.g., variable labels, value labels) or string-heavy data, potentially overwhelming Dataverse’s ingest process or hitting a database constraint.
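For reference, the arithmetic above can be reproduced with a short Python snippet (values copied from the metadata output, which appears to come from pyreadstat):

```python
# Sanity-check the bytes-per-cell figure from the file metadata above.
file_size_bytes = 673_496_320   # reported file size
n_rows, n_cols = 108_252, 691   # rows and columns from the metadata

total_cells = n_rows * n_cols
bytes_per_cell = file_size_bytes / total_cells

print(f"Total cells: {total_cells:,}")          # 74,802,132
print(f"Bytes per cell: {bytes_per_cell:.2f}")  # 9.00
```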
Which version of Dataverse are you using?
https://github.com/scholarsportal/dataverse (latest)
Any related open or closed issues to this bug report?
None found.
Screenshots:
