I've read the Interoperability with HDF5 section in the docs.
> Assuming a HDF5 file is written in accordance with the netCDF-4 rules (i.e. no strange types, no looping groups), and assuming that every dataset has a dimension scale attached to each dimension, the netCDF-4 API can be used to read and edit the file, quite easily.
>
> [...]
>
> If dimension scales are not used, then netCDF-4 can still edit the file, and will invent anonymous dimensions for each variable shape.
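For context, the dimension-scale route the docs describe can be exercised directly from h5py. This is an illustrative sketch (the dataset and scale names are made up for the example), assuming h5py >= 2.10 for `make_scale()`:

```python
# Illustrative sketch: write an HDF5 file with a distinct dimension scale
# attached to each axis, so netCDF-4 sees named dimensions instead of
# inventing anonymous ones. Names ("reflectivity", "azimuth", "range")
# are hypothetical.
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "radar.h5")
with h5py.File(path, "w") as f:
    # A 2D dataset whose axes happen to have the same size
    data = f.create_dataset("reflectivity", data=np.zeros((100, 100)))
    # One coordinate dataset per axis, turned into a dimension scale
    f.create_dataset("azimuth", data=np.arange(100.0))
    f.create_dataset("range", data=np.arange(100.0))
    f["azimuth"].make_scale("azimuth")
    f["range"].make_scale("range")
    # Attach a distinct scale to each axis of the 2D dataset
    data.dims[0].attach_scale(f["azimuth"])
    data.dims[1].attach_scale(f["range"])

with h5py.File(path, "r") as f:
    # One scale attached per axis
    print([len(dim) for dim in f["reflectivity"].dims])
```

With the scales in place, netCDF-4 readers should report two distinct named dimensions even though both axes have size 100.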
This works quite well when reading HDF5 weather radar data, but there is one quirk.
If a 2D variable has different sizes along its first and second dimensions (e.g. (100, 200)), two anonymous dimensions are invented. If the dataset has the same size along both dimensions (e.g. (100, 100)), only one anonymous dimension is invented.
In my use case this contradicts the original meaning of the dataset, where the two dimensions are distinct. Maybe I'm missing something, but I also can't think of a use case where both dimensions of a 2D dataset should actually be the same dimension (this sounds odd, but I don't know how to explain it better).
While looking for a working solution I've already asked on the netCDF4-python GitHub issue tracker and on the h5py mailing list, but it seems a solution is only possible by enhancing/changing the behaviour of netcdf-c.
From my perspective, netcdf-c would need to invent two anonymous dimensions for a 2D dataset even if the dimensions have the same size. Another option would be to make the desired behaviour selectable at read time. Are there any other options?
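To make the difference concrete, here is a small Python sketch of the two behaviours. This is my own model for illustration, not the netcdf-c source:

```python
# Sketch of how anonymous ("phony") dimensions are invented for datasets
# without dimension scales. Function names are hypothetical.

def invent_phony_dims_current(shape):
    """Current behaviour: one anonymous dimension per distinct size,
    so a (100, 100) dataset ends up with a single shared dimension."""
    dims = {}
    names = []
    for size in shape:
        if size not in dims:
            dims[size] = f"phony_dim_{len(dims)}"
        names.append(dims[size])
    return names

def invent_phony_dims_proposed(shape):
    """Proposed behaviour: one anonymous dimension per axis,
    keeping the axes of a (100, 100) dataset distinct."""
    return [f"phony_dim_{i}" for i in range(len(shape))]

print(invent_phony_dims_current((100, 200)))   # ['phony_dim_0', 'phony_dim_1']
print(invent_phony_dims_current((100, 100)))   # ['phony_dim_0', 'phony_dim_0']
print(invent_phony_dims_proposed((100, 100)))  # ['phony_dim_0', 'phony_dim_1']
```

The first two calls show the quirk: only when the sizes differ does the current scheme yield two dimensions.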
Software Version Information
output of `nc-config --all` (netCDF 4.6.2):
This netCDF 4.6.2 has been built with the following features:
--cc -> x86_64-conda_cos6-linux-gnu-cc
--cflags -> -I/home/kai/miniconda/envs/wradlib_150/include
--libs -> -L/home/kai/miniconda/envs/wradlib_150/lib -lnetcdf -lmfhdf -ldf -lhdf5_hl -lhdf5 -lrt -lpthread -lz -ldl -lm -lcurl
--has-c++ -> no
--cxx ->
--has-c++4 -> no
--cxx4 ->
--has-fortran -> no
--has-dap -> yes
--has-dap2 -> yes
--has-dap4 -> yes
--has-nc2 -> yes
--has-nc4 -> yes
--has-hdf5 -> yes
--has-hdf4 -> yes
--has-logging -> no
--has-pnetcdf -> no
--has-szlib -> no
--has-cdf5 -> yes
--has-parallel4 -> no
--has-parallel -> no
--prefix -> /home/kai/miniconda/envs/wradlib_150
--includedir -> /home/kai/miniconda/envs/wradlib_150/include
--libdir -> /home/kai/miniconda/envs/wradlib_150/lib
--version -> netCDF 4.6.2
output of `h5cc --showconfig` (HDF5 1.10.5):
SUMMARY OF THE HDF5 CONFIGURATION
=================================
General Information:
HDF5 Version: 1.10.5
Configured on: Wed Aug 21 19:09:16 UTC 2019
Configured by: conda@752c21c7e9bc
Host system: x86_64-conda_cos6-linux-gnu
Uname information: Linux 752c21c7e9bc 4.15.0-1052-azure #57-Ubuntu SMP Tue Jul 23 19:07:16 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
Byte sex: little-endian
Installation point: /home/kai/miniconda/envs/wradlib_150
Compiling Options:
Build Mode: production
Debugging Symbols: no
Asserts: no
Profiling: no
Optimization Level: high
Linking Options:
Libraries: static, shared
Statically Linked Executables:
LDFLAGS: -Wl,-O2 -Wl,--sort-common -Wl,--as-needed -Wl,-z,relro -Wl,-z,now -Wl,--disable-new-dtags -Wl,--gc-sections -Wl,-rpath,/home/kai/miniconda/envs/wradlib_150/lib -Wl,-rpath-link,/home/kai/miniconda/envs/wradlib_150/lib -L/home/kai/miniconda/envs/wradlib_150/lib
H5_LDFLAGS:
AM_LDFLAGS: -L/home/kai/miniconda/envs/wradlib_150/lib
Extra libraries: -lrt -lpthread -lz -ldl -lm
Archiver: /home/conda/feedstock_root/build_artifacts/hdf5_split_1566414109997/_build_env/bin/x86_64-conda_cos6-linux-gnu-ar
AR_FLAGS: cr
Ranlib: /home/conda/feedstock_root/build_artifacts/hdf5_split_1566414109997/_build_env/bin/x86_64-conda_cos6-linux-gnu-ranlib
Languages:
C: yes
C Compiler: /home/conda/feedstock_root/build_artifacts/hdf5_split_1566414109997/_build_env/bin/x86_64-conda_cos6-linux-gnu-cc
CPPFLAGS: -DNDEBUG -D_FORTIFY_SOURCE=2 -O2 -I/home/kai/miniconda/envs/wradlib_150/include
H5_CPPFLAGS: -D_GNU_SOURCE -D_POSIX_C_SOURCE=200809L -DNDEBUG -UH5_DEBUG_API
AM_CPPFLAGS: -I/home/kai/miniconda/envs/wradlib_150/include
C Flags: -march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -ffunction-sections -pipe -I/home/kai/miniconda/envs/wradlib_150/include -fdebug-prefix-map=/home/conda/feedstock_root/build_artifacts/hdf5_split_1566414109997/work=/usr/local/src/conda/hdf5_split-1.10.5 -fdebug-prefix-map=/home/kai/miniconda/envs/wradlib_150=/usr/local/src/conda-prefix
H5 C Flags: -std=c99 -pedantic -Wall -Wextra -Wbad-function-cast -Wc++-compat -Wcast-align -Wcast-qual -Wconversion -Wdeclaration-after-statement -Wdisabled-optimization -Wfloat-equal -Wformat=2 -Winit-self -Winvalid-pch -Wmissing-declarations -Wmissing-include-dirs -Wmissing-prototypes -Wnested-externs -Wold-style-definition -Wpacked -Wpointer-arith -Wredundant-decls -Wshadow -Wstrict-prototypes -Wswitch-default -Wswitch-enum -Wundef -Wunused-macros -Wunsafe-loop-optimizations -Wwrite-strings -finline-functions -s -Wno-inline -Wno-aggregate-return -Wno-missing-format-attribute -Wno-missing-noreturn -O
AM C Flags:
Shared C Library: yes
Static C Library: yes
Fortran: yes
Fortran Compiler: /home/conda/feedstock_root/build_artifacts/hdf5_split_1566414109997/_build_env/bin/x86_64-conda_cos6-linux-gnu-gfortran
Fortran Flags:
H5 Fortran Flags: -pedantic -Wall -Wextra -Wunderflow -Wimplicit-interface -Wsurprising -Wno-c-binding-type -s -O2
AM Fortran Flags:
Shared Fortran Library: yes
Static Fortran Library: yes
C++: yes
C++ Compiler: /home/conda/feedstock_root/build_artifacts/hdf5_split_1566414109997/_build_env/bin/x86_64-conda_cos6-linux-gnu-c++
C++ Flags: -fvisibility-inlines-hidden -std=c++17 -fmessage-length=0 -march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -ffunction-sections -pipe -I/home/kai/miniconda/envs/wradlib_150/include -fdebug-prefix-map=/home/conda/feedstock_root/build_artifacts/hdf5_split_1566414109997/work=/usr/local/src/conda/hdf5_split-1.10.5 -fdebug-prefix-map=/home/kai/miniconda/envs/wradlib_150=/usr/local/src/conda-prefix
H5 C++ Flags: -pedantic -Wall -W -Wundef -Wshadow -Wpointer-arith -Wcast-qual -Wcast-align -Wwrite-strings -Wconversion -Wredundant-decls -Winline -Wsign-promo -Woverloaded-virtual -Wold-style-cast -Weffc++ -Wreorder -Wnon-virtual-dtor -Wctor-dtor-privacy -Wabi -finline-functions -s -O
AM C++ Flags:
Shared C++ Library: yes
Static C++ Library: yes
Java: no
Features:
Parallel Filtered Dataset Writes: no
Large Parallel I/O: no
High-level library: yes
Threadsafety: yes
Default API mapping: v110
With deprecated public symbols: yes
I/O filters (external): deflate(zlib)
MPE: no
Direct VFD: no
dmalloc: no
Packages w/ extra debug output: none
API tracing: no
Using memory checker: yes
Memory allocation sanity checks: no
Function stack tracing: no
Strict file format checks: no
Optimization instrumentation: no
Python Version
Python 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 21:52:21)
[GCC 7.3.0] :: Anaconda, Inc. on linux
netCDF4 version
h5py version