NetCDF version: v4.9.1
OS: Debian 10 (but reproducible in Windows and MacOS)
We recently upgraded from netCDF v4.8.1 to v4.9.1, and after the upgrade some of our tests for NC_VLEN attribute reading regressed significantly. Specifically, the regression is in the function 'nc_get_att'. I am attaching repro code for the NC_VLEN attribute reading.
The base type in the example is NC_BYTE, but we see similar regressions with other base types as well (NC_CHAR, NC_DOUBLE, NC_USHORT, NC_STRING, etc.).
Is this regression expected due to a change in the implementation of 'nc_get_att'?
Please find the repro code and the results below. You can find the nc file I used for the repro here.
netCDF v4.9.1 average time per call: 321.6 microseconds
netCDF v4.8.1 average time per call: 19 microseconds
#include <stdio.h>
#include <string.h>
#include <netcdf.h>
#include <chrono>
#include <iostream>
using namespace std::chrono;
#define FILENAME "NetcdfNcVlenTypesPerfFile.nc"
#define GRPNAME "numeric_types"
#define VAR "samples_int8"
#define ATTNAME "attribute_int8"
int main()
{
    int ncid, grpid, varid, status;

    // Open the file and look up the group, variable, and attribute
    status = nc_open(FILENAME, NC_NOWRITE, &ncid);
    printf("status after nc_open = %d\n", status);
    status = nc_inq_ncid(ncid, GRPNAME, &grpid);
    printf("status after nc_inq_ncid = %d\n", status);
    status = nc_inq_varid(grpid, VAR, &varid);
    printf("status after nc_inq_varid = %d\n", status);

    nc_type datatype;
    size_t attr_length;
    status = nc_inq_att(grpid, varid, ATTNAME, &datatype, &attr_length);
    printf("status after nc_inq_att = %d\n", status);

    nc_type base_type;
    status = nc_inq_vlen(grpid, datatype, NULL, NULL, &base_type);
    printf("status after nc_inq_vlen = %d\n", status);

    // Time only the nc_get_att call, averaged over 1000 iterations
    double sum = 0;
    for (int i = 0; i < 1000; i++)
    {
        nc_vlen_t* data = new nc_vlen_t[attr_length];
        auto start = high_resolution_clock::now();
        status = nc_get_att(grpid, varid, ATTNAME, data);
        auto stop = high_resolution_clock::now();
        sum += duration_cast<microseconds>(stop - start).count();
        // Release the VLEN payloads allocated by the library, then the array itself
        status = nc_free_vlens(attr_length, data);
        delete[] data;
    }
    double tavg = sum / 1000;
    std::cout << "Avg Time (microseconds): " << tavg << std::endl;

    // Close the file
    status = nc_close(ncid);
    printf("status code after close = %d\n", status);
    printf("End of test.\n\n");
    return 0;
}