 * Package:    sci-biology/neuroconv-0.4.8:0
 * Repository: science
 * Maintainer: gentoo@chymera.eu sci@gentoo.org
 * USE:        abi_x86_64 amd64 ecephys elibc_glibc icephys kernel_linux ophys python_targets_python3_11 test
 * FEATURES:   network-sandbox preserve-libs sandbox test userpriv usersandbox
>>> Unpacking source...
>>> Unpacking neuroconv-0.4.8.gh.tar.gz to /var/tmp/portage/sci-biology/neuroconv-0.4.8/work
>>> Source unpacked in /var/tmp/portage/sci-biology/neuroconv-0.4.8/work
>>> Preparing source in /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8 ...
 * Build system packages:
 *   dev-python/gpep517         : 15
 *   dev-python/installer       : 0.7.0
 *   dev-python/setuptools      : 69.0.3
 *   dev-python/setuptools-rust : 1.8.1
 *   dev-python/setuptools-scm  : 8.0.4
 *   dev-python/wheel           : 0.42.0
>>> Source prepared.
>>> Configuring source in /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8 ...
>>> Source configured.
>>> Compiling source in /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8 ...
 * python3_11: running distutils-r1_run_phase distutils-r1_python_compile
 * Building the wheel for neuroconv-0.4.8 via setuptools.build_meta:__legacy__
python3.11 -m gpep517 build-wheel --prefix=/usr --backend setuptools.build_meta:__legacy__ --output-fd 3 --wheel-dir /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/wheel
2024-03-21 12:56:26,770 gpep517 INFO Building wheel via backend setuptools.build_meta:__legacy__
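The gpep517 call above is a thin driver around the standard PEP 517 hook API: it loads the backend named after --backend and reports the wheel filename over --output-fd. As a minimal sketch (not part of the ebuild), the same legacy setuptools backend can be invoked directly; running it from the unpacked source directory is assumed, and the "dist" output directory is an arbitrary choice for this example:

    # Sketch: call the PEP 517 backend named in the log directly.
    # Assumes cwd is the unpacked source tree (the directory holding setup.py).
    import os
    from setuptools.build_meta import __legacy__ as backend

    os.makedirs("dist", exist_ok=True)  # "dist" is a hypothetical output dir
    # build_wheel() is the standard PEP 517 hook; it returns the wheel's basename.
    wheel_name = backend.build_wheel(wheel_directory="dist")
    print(wheel_name)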
2024-03-21 12:56:26,989 root INFO running bdist_wheel
2024-03-21 12:56:27,087 root INFO running build
2024-03-21 12:56:27,087 root INFO running build_py
2024-03-21 12:56:27,106 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build
2024-03-21 12:56:27,106 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib
2024-03-21 12:56:27,106 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv
2024-03-21 12:56:27,106 root INFO copying src/neuroconv/nwbconverter.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv
2024-03-21 12:56:27,107 root INFO copying src/neuroconv/basetemporalalignmentinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv
2024-03-21 12:56:27,107 root INFO copying src/neuroconv/basedatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv
2024-03-21 12:56:27,107 root INFO copying src/neuroconv/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv
2024-03-21 12:56:27,108 root INFO copying src/neuroconv/baseextractorinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv
[... a long run of further "creating ..." / "copying ... -> ..." build_py INFO entries (12:56:27,108 through 12:56:27,161) mirrors the rest of src/neuroconv into build/lib/neuroconv: datainterfaces, utils, tools, converters; datainterfaces/ophys (extract, cnmfe, hdf5, suite2p, sima, caiman, sbx, scanimage, miniscope, micromanagertiff, brukertiff, tiff); datainterfaces/behavior (audio, sleap, neuralynx, lightningpose, miniscope, deeplabcut, video, fictrac); datainterfaces/icephys (abf); datainterfaces/text (excel, csv); datainterfaces/ecephys (plexon, biocam, axona, neuroscope, openephys, spikegadgets, spikeglx, neuralynx, mearec, cellexplorer, kilosort, tdt, spike2, intan, maxwell, alphaomega, edf, phy, blackrock, mcsraw); tools/audio, tools/yaml_conversion_specification, tools/data_transfers, tools/neo, tools/testing (incl. _mock), tools/spikeinterface, tools/roiextractors, and tools/nwb_helpers (incl. _configuration_models) ...]
2024-03-21 12:56:27,162 root INFO running egg_info
2024-03-21 12:56:27,162 root INFO creating src/neuroconv.egg-info
2024-03-21 12:56:27,182 root INFO writing src/neuroconv.egg-info/PKG-INFO
2024-03-21 12:56:27,202 root INFO writing dependency_links to src/neuroconv.egg-info/dependency_links.txt
2024-03-21 12:56:27,202 root INFO writing entry points to src/neuroconv.egg-info/entry_points.txt
2024-03-21 12:56:27,213 root INFO writing requirements to src/neuroconv.egg-info/requires.txt
2024-03-21 12:56:27,214 root INFO writing top-level names to src/neuroconv.egg-info/top_level.txt
2024-03-21 12:56:27,292 root INFO writing manifest file 'src/neuroconv.egg-info/SOURCES.txt'
[03/21/24 12:56:27] ERROR    listing git files failed - pretending there aren't any    git.py:24
2024-03-21 12:56:27,386 root INFO reading manifest file 'src/neuroconv.egg-info/SOURCES.txt'
2024-03-21 12:56:27,386 root INFO reading manifest template 'MANIFEST.in'
2024-03-21 12:56:27,387 root INFO adding license file 'license.txt'
2024-03-21 12:56:27,389 root INFO writing manifest file 'src/neuroconv.egg-info/SOURCES.txt'
/usr/lib/python3.11/site-packages/setuptools/command/build_py.py:207: _Warning: Package 'neuroconv.schemas' is absent from the `packages` configuration.
!!

        ********************************************************************************
        ############################
        # Package would be ignored #
        ############################
        Python recognizes 'neuroconv.schemas' as an importable package[^1],
        but it is absent from setuptools' `packages` configuration.

        This leads to an ambiguous overall configuration. If you want to distribute this
        package, please make sure that 'neuroconv.schemas' is explicitly added to the
        `packages` configuration field.

        Alternatively, you can also rely on setuptools' discovery methods
        (for example by using `find_namespace_packages(...)`/`find_namespace:`
        instead of `find_packages(...)`/`find:`).

        You can read more about "package discovery" on setuptools documentation page:

        - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html

        If you don't want 'neuroconv.schemas' to be distributed and are
        already explicitly excluding 'neuroconv.schemas' via
        `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`,
        you can try to use `exclude_package_data`, or `include-package-data=False` in
        combination with a more fine grained `package-data` configuration.

        You can read more about "package data files" on setuptools documentation page:

        - https://setuptools.pypa.io/en/latest/userguide/datafiles.html

        [^1]: For Python, any directory (with suitable naming) can be imported,
              even if it does not contain any `.py` files. On the other hand, currently
              there is no concept of package data directory, all directories are treated
              like packages.

        ********************************************************************************

!!
  check.warn(importable)
2024-03-21 12:56:27,396 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/schemas
2024-03-21 12:56:27,396 root INFO copying src/neuroconv/schemas/base_metadata_schema.json -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/schemas
2024-03-21 12:56:27,396 root INFO copying src/neuroconv/schemas/metadata_schema.json -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/schemas
2024-03-21 12:56:27,397 root INFO copying src/neuroconv/schemas/source_schema.json -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/schemas
2024-03-21 12:56:27,397 root INFO copying src/neuroconv/schemas/time_series_schema.json -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/schemas
2024-03-21 12:56:27,397 root INFO copying src/neuroconv/schemas/timeintervals_schema.json -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/schemas
2024-03-21 12:56:27,398 root INFO copying src/neuroconv/schemas/yaml_conversion_specification_schema.json -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/schemas
2024-03-21 12:56:27,398 root INFO copying src/neuroconv/tools/testing/_path_expander_demo_ibl_filepaths.txt -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/testing
2024-03-21 12:56:27,404 root WARNING warning: build_py: byte-compiling is disabled, skipping.
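Despite the warning, build_py still copied the schemas directory (above), so the build proceeds. A minimal sketch of the remedy the warning itself proposes, assuming a src-layout setup.py that currently uses find_packages(); this is illustrative only, not neuroconv's actual packaging code:

    # Sketch: make 'neuroconv.schemas' part of the `packages` configuration by
    # switching to namespace-package discovery, as the setuptools warning suggests.
    from setuptools import find_namespace_packages, setup

    setup(
        name="neuroconv",
        packages=find_namespace_packages(where="src"),  # also picks up neuroconv.schemas
        package_dir={"": "src"},
        include_package_data=True,  # ship the JSON schema files listed in MANIFEST.in
    )

Listing "neuroconv.schemas" explicitly in `packages` would silence the warning equally well; namespace discovery is simply the less manual of the two options the warning names.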
2024-03-21 12:56:27,448 wheel INFO installing to /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel
2024-03-21 12:56:27,448 root INFO running install
2024-03-21 12:56:27,456 root INFO running install_lib
2024-03-21 12:56:27,477 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64
2024-03-21 12:56:27,477 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel
2024-03-21 12:56:27,477 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv
2024-03-21 12:56:27,477 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/schemas
[... a long run of further "creating ..." / "copying ... -> ..." install_lib INFO entries (from 12:56:27,477) relocates the built tree from build/lib/neuroconv into build/bdist.linux-x86_64/wheel/neuroconv, beginning with the six schema JSON files, nwbconverter.py, and the datainterfaces/ophys subpackages (extract, cnmfe, hdf5, suite2p, sima, caiman, sbx, scanimage); the capture ends mid-entry at the line below ...]
2024-03-21 12:56:27,485 root INFO copying
/var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/scanimage/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/scanimage 2024-03-21 12:56:27,485 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/baseimagingextractorinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys 2024-03-21 12:56:27,486 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/miniscope 2024-03-21 12:56:27,486 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/miniscope/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/miniscope 2024-03-21 12:56:27,486 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/miniscope/miniscopeconverter.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/miniscope 2024-03-21 12:56:27,486 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/miniscope/miniscopeimagingdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/miniscope 2024-03-21 12:56:27,486 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/micromanagertiff 2024-03-21 12:56:27,487 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/micromanagertiff/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/micromanagertiff 2024-03-21 12:56:27,487 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/micromanagertiff/micromanagertiffdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/micromanagertiff 2024-03-21 12:56:27,487 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/brukertiff 2024-03-21 12:56:27,487 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/brukertiff/brukertiffdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/brukertiff 2024-03-21 12:56:27,488 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/brukertiff/brukertiffconverter.py -> 
/var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/brukertiff 2024-03-21 12:56:27,488 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/brukertiff/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/brukertiff 2024-03-21 12:56:27,488 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/tiff 2024-03-21 12:56:27,488 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/tiff/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/tiff 2024-03-21 12:56:27,488 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ophys/tiff/tiffdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ophys/tiff 2024-03-21 12:56:27,489 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior 2024-03-21 12:56:27,489 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/audio 2024-03-21 12:56:27,489 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/audio/audiointerface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/audio 2024-03-21 12:56:27,489 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/audio/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/audio 2024-03-21 12:56:27,490 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/sleap 2024-03-21 12:56:27,490 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/sleap/sleapdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/sleap 2024-03-21 12:56:27,490 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/sleap/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/sleap 2024-03-21 12:56:27,490 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/sleap/sleap_utils.py -> 
2024-03-21 12:56:27,490 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior
2024-03-21 12:56:27,491 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/neuralynx
2024-03-21 12:56:27,491 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/neuralynx/neuralynx_nvt_interface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/neuralynx
2024-03-21 12:56:27,491 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/neuralynx/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/neuralynx
2024-03-21 12:56:27,491 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/neuralynx/nvt_utils.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/neuralynx
2024-03-21 12:56:27,492 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/lightningpose
2024-03-21 12:56:27,492 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/lightningpose/lightningposedatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/lightningpose
2024-03-21 12:56:27,492 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/lightningpose/lightningposeconverter.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/lightningpose
2024-03-21 12:56:27,492 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/lightningpose/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/lightningpose
2024-03-21 12:56:27,492 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/miniscope
2024-03-21 12:56:27,493 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/miniscope/miniscopedatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/miniscope
2024-03-21 12:56:27,493 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/miniscope/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/miniscope
2024-03-21 12:56:27,493 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/deeplabcut
2024-03-21 12:56:27,493 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/deeplabcut/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/deeplabcut
2024-03-21 12:56:27,494 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/deeplabcut/deeplabcutdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/deeplabcut
2024-03-21 12:56:27,494 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/video
2024-03-21 12:56:27,494 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/video/video_utils.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/video
2024-03-21 12:56:27,495 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/video/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/video
2024-03-21 12:56:27,495 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/video/videodatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/video
2024-03-21 12:56:27,496 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/fictrac
2024-03-21 12:56:27,496 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/fictrac/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/fictrac
2024-03-21 12:56:27,496 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/behavior/fictrac/fictracdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/behavior/fictrac
2024-03-21 12:56:27,497 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces
2024-03-21 12:56:27,497 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/icephys
2024-03-21 12:56:27,497 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/icephys/abf
2024-03-21 12:56:27,498 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/icephys/abf/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/icephys/abf
2024-03-21 12:56:27,498 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/icephys/abf/abfdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/icephys/abf
2024-03-21 12:56:27,498 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/icephys/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/icephys
2024-03-21 12:56:27,499 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/icephys/baseicephysinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/icephys
2024-03-21 12:56:27,499 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/text
2024-03-21 12:56:27,499 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/text/excel
2024-03-21 12:56:27,499 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/text/excel/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/text/excel
2024-03-21 12:56:27,500 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/text/excel/exceltimeintervalsinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/text/excel
2024-03-21 12:56:27,500 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/text/timeintervalsinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/text
2024-03-21 12:56:27,500 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/text/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/text
2024-03-21 12:56:27,501 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/text/csv
2024-03-21 12:56:27,501 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/text/csv/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/text/csv
2024-03-21 12:56:27,501 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/text/csv/csvtimeintervalsinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/text/csv
2024-03-21 12:56:27,502 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys
2024-03-21 12:56:27,502 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/plexon
2024-03-21 12:56:27,502 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/plexon/plexondatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/plexon
2024-03-21 12:56:27,503 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/plexon/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/plexon
2024-03-21 12:56:27,503 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/basesortingextractorinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys
2024-03-21 12:56:27,504 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/biocam
2024-03-21 12:56:27,504 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/biocam/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/biocam
2024-03-21 12:56:27,504 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/biocam/biocamdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/biocam
2024-03-21 12:56:27,505 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/axona
2024-03-21 12:56:27,505 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/axona/axona_utils.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/axona
2024-03-21 12:56:27,505 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/axona/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/axona
2024-03-21 12:56:27,505 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/axona/axonadatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/axona
2024-03-21 12:56:27,506 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/neuroscope
2024-03-21 12:56:27,506 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/neuroscope/neuroscopedatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/neuroscope
2024-03-21 12:56:27,506 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/neuroscope/neuroscope_utils.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/neuroscope
2024-03-21 12:56:27,506 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/neuroscope/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/neuroscope
2024-03-21 12:56:27,507 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/baselfpextractorinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys
2024-03-21 12:56:27,507 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/openephys
2024-03-21 12:56:27,507 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/openephys/openephysdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/openephys
2024-03-21 12:56:27,507 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/openephys/openephyslegacydatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/openephys
2024-03-21 12:56:27,508 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/openephys/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/openephys
2024-03-21 12:56:27,508 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/openephys/openephyssortingdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/openephys
2024-03-21 12:56:27,508 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/openephys/openephysbinarydatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/openephys
2024-03-21 12:56:27,508 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spikegadgets
2024-03-21 12:56:27,509 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/spikegadgets/spikegadgetsdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spikegadgets
2024-03-21 12:56:27,509 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/spikegadgets/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spikegadgets
2024-03-21 12:56:27,509 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys
2024-03-21 12:56:27,509 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spikeglx
2024-03-21 12:56:27,510 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/spikeglx/spikeglx_utils.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spikeglx
2024-03-21 12:56:27,510 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/spikeglx/spikeglxconverter.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spikeglx
2024-03-21 12:56:27,510 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/spikeglx/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spikeglx
2024-03-21 12:56:27,510 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/spikeglx/spikeglxdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spikeglx
2024-03-21 12:56:27,511 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/spikeglx/spikeglxnidqinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spikeglx
2024-03-21 12:56:27,511 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/neuralynx
2024-03-21 12:56:27,511 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/neuralynx/neuralynxdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/neuralynx
2024-03-21 12:56:27,511 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/neuralynx/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/neuralynx
2024-03-21 12:56:27,512 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/mearec
2024-03-21 12:56:27,512 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/mearec/mearecdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/mearec
2024-03-21 12:56:27,512 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/mearec/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/mearec
2024-03-21 12:56:27,513 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/cellexplorer
2024-03-21 12:56:27,513 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/cellexplorer/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/cellexplorer
2024-03-21 12:56:27,513 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/cellexplorer/cellexplorerdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/cellexplorer
2024-03-21 12:56:27,514 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/kilosort
2024-03-21 12:56:27,514 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/kilosort/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/kilosort
2024-03-21 12:56:27,514 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/kilosort/kilosortdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/kilosort
2024-03-21 12:56:27,515 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/tdt
2024-03-21 12:56:27,515 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/tdt/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/tdt
2024-03-21 12:56:27,515 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/tdt/tdtdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/tdt
2024-03-21 12:56:27,516 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spike2
2024-03-21 12:56:27,516 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/spike2/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spike2
2024-03-21 12:56:27,516 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/spike2/spike2datainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/spike2
2024-03-21 12:56:27,517 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/intan
2024-03-21 12:56:27,517 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/intan/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/intan
2024-03-21 12:56:27,517 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/intan/intandatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/intan
2024-03-21 12:56:27,518 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/maxwell
2024-03-21 12:56:27,518 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/maxwell/maxonedatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/maxwell
2024-03-21 12:56:27,518 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/maxwell/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/maxwell
2024-03-21 12:56:27,519 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/baserecordingextractorinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys
2024-03-21 12:56:27,519 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/alphaomega
2024-03-21 12:56:27,519 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/alphaomega/alphaomegadatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/alphaomega
2024-03-21 12:56:27,520 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/alphaomega/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/alphaomega
2024-03-21 12:56:27,520 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/edf
2024-03-21 12:56:27,520 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/edf/edfdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/edf
2024-03-21 12:56:27,520 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/edf/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/edf
2024-03-21 12:56:27,521 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/phy
2024-03-21 12:56:27,521 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/phy/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/phy
2024-03-21 12:56:27,521 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/phy/phydatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/phy
2024-03-21 12:56:27,522 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/blackrock
2024-03-21 12:56:27,522 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/blackrock/blackrockdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/blackrock
2024-03-21 12:56:27,522 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/blackrock/header_tools.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/blackrock
2024-03-21 12:56:27,523 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/blackrock/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/blackrock
2024-03-21 12:56:27,523 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/mcsraw
2024-03-21 12:56:27,523 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/mcsraw/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/mcsraw
2024-03-21 12:56:27,523 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/datainterfaces/ecephys/mcsraw/mcsrawdatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/datainterfaces/ecephys/mcsraw
2024-03-21 12:56:27,524 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/utils
2024-03-21 12:56:27,524 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/utils/checks.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/utils
2024-03-21 12:56:27,524 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/utils/dict.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/utils
2024-03-21 12:56:27,524 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/utils/path.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/utils
2024-03-21 12:56:27,525 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/utils/json_schema.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/utils
2024-03-21 12:56:27,525 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/utils/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/utils
2024-03-21 12:56:27,525 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/utils/types.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/utils
2024-03-21 12:56:27,525 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/basetemporalalignmentinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv
2024-03-21 12:56:27,526 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/basedatainterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv
2024-03-21 12:56:27,526 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv
2024-03-21 12:56:27,526 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools
2024-03-21 12:56:27,526 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/audio
2024-03-21 12:56:27,526 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/audio/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/audio
2024-03-21 12:56:27,527 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/audio/audio.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/audio
2024-03-21 12:56:27,527 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/importing.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools
2024-03-21 12:56:27,527 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/path_expansion.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools
2024-03-21 12:56:27,527 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools
2024-03-21 12:56:27,528 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/yaml_conversion_specification
2024-03-21 12:56:27,528 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/yaml_conversion_specification/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/yaml_conversion_specification
2024-03-21 12:56:27,528 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/yaml_conversion_specification/_yaml_conversion_specification.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/yaml_conversion_specification
2024-03-21 12:56:27,528 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/hdmf.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools
2024-03-21 12:56:27,529 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/processes.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools
2024-03-21 12:56:27,529 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/data_transfers
2024-03-21 12:56:27,529 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/data_transfers/_aws.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/data_transfers
2024-03-21 12:56:27,529 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/data_transfers/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/data_transfers
2024-03-21 12:56:27,529 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/data_transfers/_helpers.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/data_transfers
2024-03-21 12:56:27,530 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/data_transfers/_globus.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/data_transfers
2024-03-21 12:56:27,530 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/data_transfers/_dandi.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/data_transfers
2024-03-21 12:56:27,530 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/text.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools
2024-03-21 12:56:27,530 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/neo
2024-03-21 12:56:27,531 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/neo/neo.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/neo
2024-03-21 12:56:27,531 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/neo/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/neo
2024-03-21 12:56:27,531 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/testing
2024-03-21 12:56:27,531 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/testing/mock_probes.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/testing
2024-03-21 12:56:27,532 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/testing/_mock
2024-03-21 12:56:27,532 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/testing/_mock/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/testing/_mock
2024-03-21 12:56:27,532 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/testing/_mock/_mock_dataset_models.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/testing/_mock
2024-03-21 12:56:27,532 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/testing/mock_files.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/testing
2024-03-21 12:56:27,532 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/testing/mock_ttl_signals.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/testing
2024-03-21 12:56:27,533 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/testing/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/testing
2024-03-21 12:56:27,533 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/testing/mock_interfaces.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/testing
2024-03-21 12:56:27,533 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/testing/data_interface_mixins.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/testing
2024-03-21 12:56:27,533 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/testing/_path_expander_demo_ibl_filepaths.txt -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/testing
2024-03-21 12:56:27,534 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/spikeinterface
2024-03-21 12:56:27,534 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/spikeinterface/spikeinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/spikeinterface
2024-03-21 12:56:27,534 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/spikeinterface/spikeinterfacerecordingdatachunkiterator.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/spikeinterface
2024-03-21 12:56:27,535 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/spikeinterface/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/spikeinterface
2024-03-21 12:56:27,535 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/figshare.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools
2024-03-21 12:56:27,535 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/roiextractors
2024-03-21 12:56:27,535 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/roiextractors/imagingextractordatachunkiterator.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/roiextractors
2024-03-21 12:56:27,535 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/roiextractors/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/roiextractors
2024-03-21 12:56:27,536 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/roiextractors/roiextractors.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/roiextractors
2024-03-21 12:56:27,536 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers
2024-03-21 12:56:27,536 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/_metadata_and_file_helpers.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers
2024-03-21 12:56:27,536 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/_backend_configuration.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers
2024-03-21 12:56:27,537 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers
2024-03-21 12:56:27,537 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/_dataset_configuration.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers
2024-03-21 12:56:27,537 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers/_configuration_models
2024-03-21 12:56:27,537 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/_configuration_models/_base_backend.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers/_configuration_models
2024-03-21 12:56:27,538 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/_configuration_models/_zarr_backend.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers/_configuration_models
2024-03-21 12:56:27,538 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/_configuration_models/_zarr_dataset_io.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers/_configuration_models
2024-03-21 12:56:27,538 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/_configuration_models/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers/_configuration_models
2024-03-21 12:56:27,538 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/_configuration_models/_hdf5_dataset_io.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers/_configuration_models
2024-03-21 12:56:27,539 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/_configuration_models/_base_dataset_io.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers/_configuration_models
2024-03-21 12:56:27,539 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/_configuration_models/_hdf5_backend.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers/_configuration_models
2024-03-21 12:56:27,539 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/nwb_helpers/_configure_backend.py ->
/var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools/nwb_helpers 2024-03-21 12:56:27,539 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/tools/signal_processing.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/tools 2024-03-21 12:56:27,540 root INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/converters 2024-03-21 12:56:27,540 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/converters/__init__.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv/converters 2024-03-21 12:56:27,540 root INFO copying /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/lib/neuroconv/baseextractorinterface.py -> /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv 2024-03-21 12:56:27,540 root WARNING warning: install_lib: byte-compiling is disabled, skipping. 2024-03-21 12:56:27,540 root INFO running install_egg_info 2024-03-21 12:56:27,561 root INFO Copying src/neuroconv.egg-info to /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv-0.4.8-py3.11.egg-info 2024-03-21 12:56:27,563 root INFO running install_scripts 2024-03-21 12:56:27,584 wheel INFO creating /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel/neuroconv-0.4.8.dist-info/WHEEL 2024-03-21 12:56:27,585 wheel INFO creating '/var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/wheel/.tmp-698xbq5v/neuroconv-0.4.8-py3-none-any.whl' and adding '/var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel' to it 2024-03-21 12:56:27,585 wheel INFO adding 'neuroconv/__init__.py' 2024-03-21 12:56:27,585 wheel INFO adding 'neuroconv/basedatainterface.py' 2024-03-21 12:56:27,585 wheel INFO adding 'neuroconv/baseextractorinterface.py' 2024-03-21 12:56:27,586 wheel INFO adding 'neuroconv/basetemporalalignmentinterface.py' 2024-03-21 12:56:27,586 wheel INFO adding 'neuroconv/nwbconverter.py' 2024-03-21 12:56:27,586 wheel INFO adding 'neuroconv/converters/__init__.py' 2024-03-21 12:56:27,586 wheel INFO adding 'neuroconv/datainterfaces/__init__.py' 2024-03-21 12:56:27,587 wheel INFO adding 'neuroconv/datainterfaces/behavior/__init__.py' 2024-03-21 12:56:27,587 wheel INFO adding 'neuroconv/datainterfaces/behavior/audio/__init__.py' 2024-03-21 12:56:27,587 wheel INFO adding 'neuroconv/datainterfaces/behavior/audio/audiointerface.py' 2024-03-21 12:56:27,587 wheel INFO adding 'neuroconv/datainterfaces/behavior/deeplabcut/__init__.py' 2024-03-21 12:56:27,587 wheel INFO adding 'neuroconv/datainterfaces/behavior/deeplabcut/deeplabcutdatainterface.py' 2024-03-21 12:56:27,588 wheel INFO adding 'neuroconv/datainterfaces/behavior/fictrac/__init__.py' 2024-03-21 12:56:27,588 wheel INFO adding 'neuroconv/datainterfaces/behavior/fictrac/fictracdatainterface.py' 2024-03-21 12:56:27,588 wheel INFO adding 'neuroconv/datainterfaces/behavior/lightningpose/__init__.py' 2024-03-21 12:56:27,588 wheel INFO adding 
'neuroconv/datainterfaces/behavior/lightningpose/lightningposeconverter.py' 2024-03-21 12:56:27,588 wheel INFO adding 'neuroconv/datainterfaces/behavior/lightningpose/lightningposedatainterface.py' 2024-03-21 12:56:27,589 wheel INFO adding 'neuroconv/datainterfaces/behavior/miniscope/__init__.py' 2024-03-21 12:56:27,589 wheel INFO adding 'neuroconv/datainterfaces/behavior/miniscope/miniscopedatainterface.py' 2024-03-21 12:56:27,589 wheel INFO adding 'neuroconv/datainterfaces/behavior/neuralynx/__init__.py' 2024-03-21 12:56:27,589 wheel INFO adding 'neuroconv/datainterfaces/behavior/neuralynx/neuralynx_nvt_interface.py' 2024-03-21 12:56:27,589 wheel INFO adding 'neuroconv/datainterfaces/behavior/neuralynx/nvt_utils.py' 2024-03-21 12:56:27,590 wheel INFO adding 'neuroconv/datainterfaces/behavior/sleap/__init__.py' 2024-03-21 12:56:27,590 wheel INFO adding 'neuroconv/datainterfaces/behavior/sleap/sleap_utils.py' 2024-03-21 12:56:27,590 wheel INFO adding 'neuroconv/datainterfaces/behavior/sleap/sleapdatainterface.py' 2024-03-21 12:56:27,590 wheel INFO adding 'neuroconv/datainterfaces/behavior/video/__init__.py' 2024-03-21 12:56:27,590 wheel INFO adding 'neuroconv/datainterfaces/behavior/video/video_utils.py' 2024-03-21 12:56:27,591 wheel INFO adding 'neuroconv/datainterfaces/behavior/video/videodatainterface.py' 2024-03-21 12:56:27,591 wheel INFO adding 'neuroconv/datainterfaces/ecephys/__init__.py' 2024-03-21 12:56:27,591 wheel INFO adding 'neuroconv/datainterfaces/ecephys/baselfpextractorinterface.py' 2024-03-21 12:56:27,591 wheel INFO adding 'neuroconv/datainterfaces/ecephys/baserecordingextractorinterface.py' 2024-03-21 12:56:27,591 wheel INFO adding 'neuroconv/datainterfaces/ecephys/basesortingextractorinterface.py' 2024-03-21 12:56:27,592 wheel INFO adding 'neuroconv/datainterfaces/ecephys/alphaomega/__init__.py' 2024-03-21 12:56:27,592 wheel INFO adding 'neuroconv/datainterfaces/ecephys/alphaomega/alphaomegadatainterface.py' 2024-03-21 12:56:27,592 wheel INFO adding 'neuroconv/datainterfaces/ecephys/axona/__init__.py' 2024-03-21 12:56:27,592 wheel INFO adding 'neuroconv/datainterfaces/ecephys/axona/axona_utils.py' 2024-03-21 12:56:27,592 wheel INFO adding 'neuroconv/datainterfaces/ecephys/axona/axonadatainterface.py' 2024-03-21 12:56:27,593 wheel INFO adding 'neuroconv/datainterfaces/ecephys/biocam/__init__.py' 2024-03-21 12:56:27,593 wheel INFO adding 'neuroconv/datainterfaces/ecephys/biocam/biocamdatainterface.py' 2024-03-21 12:56:27,593 wheel INFO adding 'neuroconv/datainterfaces/ecephys/blackrock/__init__.py' 2024-03-21 12:56:27,593 wheel INFO adding 'neuroconv/datainterfaces/ecephys/blackrock/blackrockdatainterface.py' 2024-03-21 12:56:27,593 wheel INFO adding 'neuroconv/datainterfaces/ecephys/blackrock/header_tools.py' 2024-03-21 12:56:27,594 wheel INFO adding 'neuroconv/datainterfaces/ecephys/cellexplorer/__init__.py' 2024-03-21 12:56:27,594 wheel INFO adding 'neuroconv/datainterfaces/ecephys/cellexplorer/cellexplorerdatainterface.py' 2024-03-21 12:56:27,594 wheel INFO adding 'neuroconv/datainterfaces/ecephys/edf/__init__.py' 2024-03-21 12:56:27,594 wheel INFO adding 'neuroconv/datainterfaces/ecephys/edf/edfdatainterface.py' 2024-03-21 12:56:27,594 wheel INFO adding 'neuroconv/datainterfaces/ecephys/intan/__init__.py' 2024-03-21 12:56:27,595 wheel INFO adding 'neuroconv/datainterfaces/ecephys/intan/intandatainterface.py' 2024-03-21 12:56:27,595 wheel INFO adding 'neuroconv/datainterfaces/ecephys/kilosort/__init__.py' 2024-03-21 12:56:27,595 wheel INFO adding 
'neuroconv/datainterfaces/ecephys/kilosort/kilosortdatainterface.py' 2024-03-21 12:56:27,595 wheel INFO adding 'neuroconv/datainterfaces/ecephys/maxwell/__init__.py' 2024-03-21 12:56:27,595 wheel INFO adding 'neuroconv/datainterfaces/ecephys/maxwell/maxonedatainterface.py' 2024-03-21 12:56:27,596 wheel INFO adding 'neuroconv/datainterfaces/ecephys/mcsraw/__init__.py' 2024-03-21 12:56:27,596 wheel INFO adding 'neuroconv/datainterfaces/ecephys/mcsraw/mcsrawdatainterface.py' 2024-03-21 12:56:27,596 wheel INFO adding 'neuroconv/datainterfaces/ecephys/mearec/__init__.py' 2024-03-21 12:56:27,596 wheel INFO adding 'neuroconv/datainterfaces/ecephys/mearec/mearecdatainterface.py' 2024-03-21 12:56:27,596 wheel INFO adding 'neuroconv/datainterfaces/ecephys/neuralynx/__init__.py' 2024-03-21 12:56:27,596 wheel INFO adding 'neuroconv/datainterfaces/ecephys/neuralynx/neuralynxdatainterface.py' 2024-03-21 12:56:27,597 wheel INFO adding 'neuroconv/datainterfaces/ecephys/neuroscope/__init__.py' 2024-03-21 12:56:27,597 wheel INFO adding 'neuroconv/datainterfaces/ecephys/neuroscope/neuroscope_utils.py' 2024-03-21 12:56:27,597 wheel INFO adding 'neuroconv/datainterfaces/ecephys/neuroscope/neuroscopedatainterface.py' 2024-03-21 12:56:27,597 wheel INFO adding 'neuroconv/datainterfaces/ecephys/openephys/__init__.py' 2024-03-21 12:56:27,597 wheel INFO adding 'neuroconv/datainterfaces/ecephys/openephys/openephysbinarydatainterface.py' 2024-03-21 12:56:27,598 wheel INFO adding 'neuroconv/datainterfaces/ecephys/openephys/openephysdatainterface.py' 2024-03-21 12:56:27,598 wheel INFO adding 'neuroconv/datainterfaces/ecephys/openephys/openephyslegacydatainterface.py' 2024-03-21 12:56:27,598 wheel INFO adding 'neuroconv/datainterfaces/ecephys/openephys/openephyssortingdatainterface.py' 2024-03-21 12:56:27,598 wheel INFO adding 'neuroconv/datainterfaces/ecephys/phy/__init__.py' 2024-03-21 12:56:27,598 wheel INFO adding 'neuroconv/datainterfaces/ecephys/phy/phydatainterface.py' 2024-03-21 12:56:27,599 wheel INFO adding 'neuroconv/datainterfaces/ecephys/plexon/__init__.py' 2024-03-21 12:56:27,599 wheel INFO adding 'neuroconv/datainterfaces/ecephys/plexon/plexondatainterface.py' 2024-03-21 12:56:27,599 wheel INFO adding 'neuroconv/datainterfaces/ecephys/spike2/__init__.py' 2024-03-21 12:56:27,599 wheel INFO adding 'neuroconv/datainterfaces/ecephys/spike2/spike2datainterface.py' 2024-03-21 12:56:27,599 wheel INFO adding 'neuroconv/datainterfaces/ecephys/spikegadgets/__init__.py' 2024-03-21 12:56:27,600 wheel INFO adding 'neuroconv/datainterfaces/ecephys/spikegadgets/spikegadgetsdatainterface.py' 2024-03-21 12:56:27,600 wheel INFO adding 'neuroconv/datainterfaces/ecephys/spikeglx/__init__.py' 2024-03-21 12:56:27,600 wheel INFO adding 'neuroconv/datainterfaces/ecephys/spikeglx/spikeglx_utils.py' 2024-03-21 12:56:27,600 wheel INFO adding 'neuroconv/datainterfaces/ecephys/spikeglx/spikeglxconverter.py' 2024-03-21 12:56:27,600 wheel INFO adding 'neuroconv/datainterfaces/ecephys/spikeglx/spikeglxdatainterface.py' 2024-03-21 12:56:27,600 wheel INFO adding 'neuroconv/datainterfaces/ecephys/spikeglx/spikeglxnidqinterface.py' 2024-03-21 12:56:27,601 wheel INFO adding 'neuroconv/datainterfaces/ecephys/tdt/__init__.py' 2024-03-21 12:56:27,601 wheel INFO adding 'neuroconv/datainterfaces/ecephys/tdt/tdtdatainterface.py' 2024-03-21 12:56:27,601 wheel INFO adding 'neuroconv/datainterfaces/icephys/__init__.py' 2024-03-21 12:56:27,601 wheel INFO adding 'neuroconv/datainterfaces/icephys/baseicephysinterface.py' 2024-03-21 12:56:27,601 wheel 
INFO adding 'neuroconv/datainterfaces/icephys/abf/__init__.py' 2024-03-21 12:56:27,602 wheel INFO adding 'neuroconv/datainterfaces/icephys/abf/abfdatainterface.py' 2024-03-21 12:56:27,602 wheel INFO adding 'neuroconv/datainterfaces/ophys/__init__.py' 2024-03-21 12:56:27,602 wheel INFO adding 'neuroconv/datainterfaces/ophys/baseimagingextractorinterface.py' 2024-03-21 12:56:27,602 wheel INFO adding 'neuroconv/datainterfaces/ophys/basesegmentationextractorinterface.py' 2024-03-21 12:56:27,602 wheel INFO adding 'neuroconv/datainterfaces/ophys/brukertiff/__init__.py' 2024-03-21 12:56:27,602 wheel INFO adding 'neuroconv/datainterfaces/ophys/brukertiff/brukertiffconverter.py' 2024-03-21 12:56:27,603 wheel INFO adding 'neuroconv/datainterfaces/ophys/brukertiff/brukertiffdatainterface.py' 2024-03-21 12:56:27,603 wheel INFO adding 'neuroconv/datainterfaces/ophys/caiman/__init__.py' 2024-03-21 12:56:27,603 wheel INFO adding 'neuroconv/datainterfaces/ophys/caiman/caimandatainterface.py' 2024-03-21 12:56:27,603 wheel INFO adding 'neuroconv/datainterfaces/ophys/cnmfe/__init__.py' 2024-03-21 12:56:27,603 wheel INFO adding 'neuroconv/datainterfaces/ophys/cnmfe/cnmfedatainterface.py' 2024-03-21 12:56:27,604 wheel INFO adding 'neuroconv/datainterfaces/ophys/extract/__init__.py' 2024-03-21 12:56:27,604 wheel INFO adding 'neuroconv/datainterfaces/ophys/extract/extractdatainterface.py' 2024-03-21 12:56:27,604 wheel INFO adding 'neuroconv/datainterfaces/ophys/hdf5/__init__.py' 2024-03-21 12:56:27,604 wheel INFO adding 'neuroconv/datainterfaces/ophys/hdf5/hdf5datainterface.py' 2024-03-21 12:56:27,604 wheel INFO adding 'neuroconv/datainterfaces/ophys/micromanagertiff/__init__.py' 2024-03-21 12:56:27,605 wheel INFO adding 'neuroconv/datainterfaces/ophys/micromanagertiff/micromanagertiffdatainterface.py' 2024-03-21 12:56:27,605 wheel INFO adding 'neuroconv/datainterfaces/ophys/miniscope/__init__.py' 2024-03-21 12:56:27,605 wheel INFO adding 'neuroconv/datainterfaces/ophys/miniscope/miniscopeconverter.py' 2024-03-21 12:56:27,605 wheel INFO adding 'neuroconv/datainterfaces/ophys/miniscope/miniscopeimagingdatainterface.py' 2024-03-21 12:56:27,605 wheel INFO adding 'neuroconv/datainterfaces/ophys/sbx/__init__.py' 2024-03-21 12:56:27,606 wheel INFO adding 'neuroconv/datainterfaces/ophys/sbx/sbxdatainterface.py' 2024-03-21 12:56:27,606 wheel INFO adding 'neuroconv/datainterfaces/ophys/scanimage/__init__.py' 2024-03-21 12:56:27,606 wheel INFO adding 'neuroconv/datainterfaces/ophys/scanimage/scanimageimaginginterface.py' 2024-03-21 12:56:27,606 wheel INFO adding 'neuroconv/datainterfaces/ophys/sima/__init__.py' 2024-03-21 12:56:27,606 wheel INFO adding 'neuroconv/datainterfaces/ophys/sima/simadatainterface.py' 2024-03-21 12:56:27,606 wheel INFO adding 'neuroconv/datainterfaces/ophys/suite2p/__init__.py' 2024-03-21 12:56:27,607 wheel INFO adding 'neuroconv/datainterfaces/ophys/suite2p/suite2pdatainterface.py' 2024-03-21 12:56:27,607 wheel INFO adding 'neuroconv/datainterfaces/ophys/tiff/__init__.py' 2024-03-21 12:56:27,607 wheel INFO adding 'neuroconv/datainterfaces/ophys/tiff/tiffdatainterface.py' 2024-03-21 12:56:27,607 wheel INFO adding 'neuroconv/datainterfaces/text/__init__.py' 2024-03-21 12:56:27,607 wheel INFO adding 'neuroconv/datainterfaces/text/timeintervalsinterface.py' 2024-03-21 12:56:27,608 wheel INFO adding 'neuroconv/datainterfaces/text/csv/__init__.py' 2024-03-21 12:56:27,608 wheel INFO adding 'neuroconv/datainterfaces/text/csv/csvtimeintervalsinterface.py' 2024-03-21 12:56:27,608 wheel INFO adding 
'neuroconv/datainterfaces/text/excel/__init__.py' 2024-03-21 12:56:27,608 wheel INFO adding 'neuroconv/datainterfaces/text/excel/exceltimeintervalsinterface.py' 2024-03-21 12:56:27,608 wheel INFO adding 'neuroconv/schemas/base_metadata_schema.json' 2024-03-21 12:56:27,609 wheel INFO adding 'neuroconv/schemas/metadata_schema.json' 2024-03-21 12:56:27,609 wheel INFO adding 'neuroconv/schemas/source_schema.json' 2024-03-21 12:56:27,609 wheel INFO adding 'neuroconv/schemas/time_series_schema.json' 2024-03-21 12:56:27,609 wheel INFO adding 'neuroconv/schemas/timeintervals_schema.json' 2024-03-21 12:56:27,609 wheel INFO adding 'neuroconv/schemas/yaml_conversion_specification_schema.json' 2024-03-21 12:56:27,609 wheel INFO adding 'neuroconv/tools/__init__.py' 2024-03-21 12:56:27,610 wheel INFO adding 'neuroconv/tools/figshare.py' 2024-03-21 12:56:27,610 wheel INFO adding 'neuroconv/tools/hdmf.py' 2024-03-21 12:56:27,610 wheel INFO adding 'neuroconv/tools/importing.py' 2024-03-21 12:56:27,610 wheel INFO adding 'neuroconv/tools/path_expansion.py' 2024-03-21 12:56:27,610 wheel INFO adding 'neuroconv/tools/processes.py' 2024-03-21 12:56:27,610 wheel INFO adding 'neuroconv/tools/signal_processing.py' 2024-03-21 12:56:27,611 wheel INFO adding 'neuroconv/tools/text.py' 2024-03-21 12:56:27,611 wheel INFO adding 'neuroconv/tools/audio/__init__.py' 2024-03-21 12:56:27,611 wheel INFO adding 'neuroconv/tools/audio/audio.py' 2024-03-21 12:56:27,611 wheel INFO adding 'neuroconv/tools/data_transfers/__init__.py' 2024-03-21 12:56:27,611 wheel INFO adding 'neuroconv/tools/data_transfers/_aws.py' 2024-03-21 12:56:27,611 wheel INFO adding 'neuroconv/tools/data_transfers/_dandi.py' 2024-03-21 12:56:27,612 wheel INFO adding 'neuroconv/tools/data_transfers/_globus.py' 2024-03-21 12:56:27,612 wheel INFO adding 'neuroconv/tools/data_transfers/_helpers.py' 2024-03-21 12:56:27,612 wheel INFO adding 'neuroconv/tools/neo/__init__.py' 2024-03-21 12:56:27,612 wheel INFO adding 'neuroconv/tools/neo/neo.py' 2024-03-21 12:56:27,612 wheel INFO adding 'neuroconv/tools/nwb_helpers/__init__.py' 2024-03-21 12:56:27,613 wheel INFO adding 'neuroconv/tools/nwb_helpers/_backend_configuration.py' 2024-03-21 12:56:27,613 wheel INFO adding 'neuroconv/tools/nwb_helpers/_configure_backend.py' 2024-03-21 12:56:27,613 wheel INFO adding 'neuroconv/tools/nwb_helpers/_dataset_configuration.py' 2024-03-21 12:56:27,613 wheel INFO adding 'neuroconv/tools/nwb_helpers/_metadata_and_file_helpers.py' 2024-03-21 12:56:27,613 wheel INFO adding 'neuroconv/tools/nwb_helpers/_configuration_models/__init__.py' 2024-03-21 12:56:27,613 wheel INFO adding 'neuroconv/tools/nwb_helpers/_configuration_models/_base_backend.py' 2024-03-21 12:56:27,614 wheel INFO adding 'neuroconv/tools/nwb_helpers/_configuration_models/_base_dataset_io.py' 2024-03-21 12:56:27,614 wheel INFO adding 'neuroconv/tools/nwb_helpers/_configuration_models/_hdf5_backend.py' 2024-03-21 12:56:27,614 wheel INFO adding 'neuroconv/tools/nwb_helpers/_configuration_models/_hdf5_dataset_io.py' 2024-03-21 12:56:27,614 wheel INFO adding 'neuroconv/tools/nwb_helpers/_configuration_models/_zarr_backend.py' 2024-03-21 12:56:27,614 wheel INFO adding 'neuroconv/tools/nwb_helpers/_configuration_models/_zarr_dataset_io.py' 2024-03-21 12:56:27,615 wheel INFO adding 'neuroconv/tools/roiextractors/__init__.py' 2024-03-21 12:56:27,615 wheel INFO adding 'neuroconv/tools/roiextractors/imagingextractordatachunkiterator.py' 2024-03-21 12:56:27,615 wheel INFO adding 'neuroconv/tools/roiextractors/roiextractors.py' 
2024-03-21 12:56:27,615 wheel INFO adding 'neuroconv/tools/spikeinterface/__init__.py' 2024-03-21 12:56:27,615 wheel INFO adding 'neuroconv/tools/spikeinterface/spikeinterface.py' 2024-03-21 12:56:27,616 wheel INFO adding 'neuroconv/tools/spikeinterface/spikeinterfacerecordingdatachunkiterator.py' 2024-03-21 12:56:27,616 wheel INFO adding 'neuroconv/tools/testing/__init__.py' 2024-03-21 12:56:27,616 wheel INFO adding 'neuroconv/tools/testing/_path_expander_demo_ibl_filepaths.txt' 2024-03-21 12:56:27,616 wheel INFO adding 'neuroconv/tools/testing/data_interface_mixins.py' 2024-03-21 12:56:27,617 wheel INFO adding 'neuroconv/tools/testing/mock_files.py' 2024-03-21 12:56:27,617 wheel INFO adding 'neuroconv/tools/testing/mock_interfaces.py' 2024-03-21 12:56:27,617 wheel INFO adding 'neuroconv/tools/testing/mock_probes.py' 2024-03-21 12:56:27,617 wheel INFO adding 'neuroconv/tools/testing/mock_ttl_signals.py' 2024-03-21 12:56:27,617 wheel INFO adding 'neuroconv/tools/testing/_mock/__init__.py' 2024-03-21 12:56:27,617 wheel INFO adding 'neuroconv/tools/testing/_mock/_mock_dataset_models.py' 2024-03-21 12:56:27,618 wheel INFO adding 'neuroconv/tools/yaml_conversion_specification/__init__.py' 2024-03-21 12:56:27,618 wheel INFO adding 'neuroconv/tools/yaml_conversion_specification/_yaml_conversion_specification.py' 2024-03-21 12:56:27,618 wheel INFO adding 'neuroconv/utils/__init__.py' 2024-03-21 12:56:27,618 wheel INFO adding 'neuroconv/utils/checks.py' 2024-03-21 12:56:27,618 wheel INFO adding 'neuroconv/utils/dict.py' 2024-03-21 12:56:27,619 wheel INFO adding 'neuroconv/utils/json_schema.py' 2024-03-21 12:56:27,619 wheel INFO adding 'neuroconv/utils/path.py' 2024-03-21 12:56:27,619 wheel INFO adding 'neuroconv/utils/types.py' 2024-03-21 12:56:27,619 wheel INFO adding 'neuroconv-0.4.8.dist-info/METADATA' 2024-03-21 12:56:27,619 wheel INFO adding 'neuroconv-0.4.8.dist-info/WHEEL' 2024-03-21 12:56:27,620 wheel INFO adding 'neuroconv-0.4.8.dist-info/entry_points.txt' 2024-03-21 12:56:27,620 wheel INFO adding 'neuroconv-0.4.8.dist-info/license.txt' 2024-03-21 12:56:27,620 wheel INFO adding 'neuroconv-0.4.8.dist-info/top_level.txt' 2024-03-21 12:56:27,621 wheel INFO adding 'neuroconv-0.4.8.dist-info/RECORD' 2024-03-21 12:56:27,621 wheel INFO removing /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/build/bdist.linux-x86_64/wheel 2024-03-21 12:56:27,632 gpep517 INFO The backend produced /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/wheel/neuroconv-0.4.8-py3-none-any.whl * Installing neuroconv-0.4.8-py3-none-any.whl to /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/install python3.11 -m gpep517 install-wheel --destdir=/var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/install --interpreter=/usr/bin/python3.11 --prefix=/usr --optimize=all /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/wheel/neuroconv-0.4.8-py3-none-any.whl 2024-03-21 12:56:27,754 gpep517 INFO Installing /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/wheel/neuroconv-0.4.8-py3-none-any.whl into /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/install 2024-03-21 12:56:28,182 gpep517 INFO Installation complete >>> Source compiled. 
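For reference, the two gpep517 calls that Portage issued above can be approximated outside the package manager. This is a minimal sketch, assuming the unpacked source tree still exists; $WORK is a hypothetical shorthand for /var/tmp/portage/sci-biology/neuroconv-0.4.8/work, and --output-fd 1 stands in for Portage's private fd 3 so the wheel name lands on stdout:

WORK=/var/tmp/portage/sci-biology/neuroconv-0.4.8/work   # hypothetical shorthand, not part of the log
cd "$WORK/neuroconv-0.4.8"
# Build the wheel through the legacy setuptools backend, as in the log above.
python3.11 -m gpep517 build-wheel --prefix=/usr \
        --backend setuptools.build_meta:__legacy__ --output-fd 1 \
        --wheel-dir "$WORK/neuroconv-0.4.8-python3_11/wheel"
# Install the produced wheel into the staging directory that the test phase runs against.
python3.11 -m gpep517 install-wheel \
        --destdir="$WORK/neuroconv-0.4.8-python3_11/install" \
        --interpreter=/usr/bin/python3.11 --prefix=/usr --optimize=all \
        "$WORK/neuroconv-0.4.8-python3_11/wheel/neuroconv-0.4.8-py3-none-any.whl"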
>>> Test phase: sci-biology/neuroconv-0.4.8 * python3_11: running distutils-r1_run_phase python_test python3.11 -m pytest -vv -ra -l -Wdefault --color=yes -o console_output_style=count -o tmp_path_retention_count=0 -o tmp_path_retention_policy=failed -p no:cov -p no:flake8 -p no:flakes -p no:pylint -p no:markdown -p no:sugar -p no:xvfb -p no:pytest-describe -p no:plus -p no:tavern -p no:salt-factories tests/test_minimal tests/test_ecephys =========================================================== test session starts ============================================================ platform linux -- Python 3.11.8, pytest-7.4.4, pluggy-1.4.0 -- /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/install/usr/bin/python3.11 cachedir: .pytest_cache rootdir: /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8 configfile: pyproject.toml plugins: mock-3.12.0, pyfakefs-5.3.5, rerunfailures-14.0, anyio-4.2.0, pkgcore-0.12.24 collecting ... collected 270 items tests/test_minimal/test_converter.py::test_converter PASSED [ 1/270] tests/test_minimal/test_converter.py::TestNWBConverterAndPipeInitialization::test_child_class_source_data_init PASSED [ 2/270] tests/test_minimal/test_converter.py::TestNWBConverterAndPipeInitialization::test_consistent_init_pipe_vs_nwb PASSED [ 3/270] tests/test_minimal/test_converter.py::TestNWBConverterAndPipeInitialization::test_pipe_list_dict PASSED [ 4/270] tests/test_minimal/test_converter.py::TestNWBConverterAndPipeInitialization::test_pipe_list_init PASSED [ 5/270] tests/test_minimal/test_converter.py::TestNWBConverterAndPipeInitialization::test_unique_names_with_list_argument PASSED [ 6/270] tests/test_minimal/test_metadata_schema.py::test_metadata_schema PASSED [ 7/270] tests/test_minimal/test_metadata_schema.py::test_invalid_ophys_metadata PASSED [ 8/270] tests/test_minimal/test_metadata_schema.py::test_invalid_ophys_plane_metadata PASSED [ 9/270] tests/test_minimal/test_metadata_schema.py::test_ophys_plane_fix PASSED [ 10/270] tests/test_minimal/test_tools_hdmf.py::TestIteratorAssertions::test_buffer_bigger_than_chunk_assertion PASSED [ 11/270] tests/test_minimal/test_tools_hdmf.py::test_early_exit PASSED [ 12/270] tests/test_minimal/test_tools_hdmf.py::test_buffer_padding_long_shape PASSED [ 13/270] tests/test_minimal/test_tools_hdmf.py::test_buffer_padding_mixed_shape PASSED [ 14/270] tests/test_minimal/test_tools_hdmf.py::test_min_axis_too_large PASSED [ 15/270] tests/test_minimal/test_tools_hdmf.py::test_sliceable_data_chunk_iterator PASSED [ 16/270] tests/test_minimal/test_tools_hdmf.py::test_sliceable_data_chunk_iterator_edge_case_1 PASSED [ 17/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_baseline_mean_int_dtype_float_assertion PASSED [ 18/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_baseline_mean_int_dtype_int_assertion PASSED [ 19/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_channel_noise_int_dtype_float_assertion PASSED [ 20/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_channel_noise_int_dtype_int_assertion PASSED [ 21/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_default PASSED [ 22/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_irregular_short_pulses PASSED [ 23/270] 
tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_irregular_short_pulses_adjusted_noise PASSED [ 24/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_irregular_short_pulses_different_seed PASSED [ 25/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_non_default_regular PASSED [ 26/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_non_default_regular_adjusted_means PASSED [ 27/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_non_default_regular_as_uint16 PASSED [ 28/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_non_default_regular_floats PASSED [ 29/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_non_default_regular_floats_adjusted_means_and_noise PASSED [ 30/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_overlapping_ttl_assertion PASSED [ 31/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_signal_mean_int_dtype_float_assertion PASSED [ 32/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_signal_mean_int_dtype_int_assertion PASSED [ 33/270] tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_single_frame_overlapping_ttl_assertion PASSED [ 34/270] tests/test_minimal/test_tools/test_context_tools.py::TestMakeOrLoadNWBFile::test_make_or_load_nwbfile_append PASSED [ 35/270] tests/test_minimal/test_tools/test_context_tools.py::TestMakeOrLoadNWBFile::test_make_or_load_nwbfile_assertion_conflicting_bases PASSED [ 36/270] tests/test_minimal/test_tools/test_context_tools.py::TestMakeOrLoadNWBFile::test_make_or_load_nwbfile_assertion_nwbfile_path_and_nwbfile_object PASSED [ 37/270] tests/test_minimal/test_tools/test_context_tools.py::TestMakeOrLoadNWBFile::test_make_or_load_nwbfile_closure PASSED [ 38/270] tests/test_minimal/test_tools/test_context_tools.py::TestMakeOrLoadNWBFile::test_make_or_load_nwbfile_no_file_save_on_error_in_context PASSED [ 39/270] tests/test_minimal/test_tools/test_context_tools.py::TestMakeOrLoadNWBFile::test_make_or_load_nwbfile_no_print_on_error_in_context PASSED [ 40/270] tests/test_minimal/test_tools/test_context_tools.py::TestMakeOrLoadNWBFile::test_make_or_load_nwbfile_overwrite PASSED [ 41/270] tests/test_minimal/test_tools/test_context_tools.py::TestMakeOrLoadNWBFile::test_make_or_load_nwbfile_pass_nwbfile PASSED [ 42/270] tests/test_minimal/test_tools/test_context_tools.py::TestMakeOrLoadNWBFile::test_make_or_load_nwbfile_write PASSED [ 43/270] tests/test_minimal/test_tools/test_expand_paths.py::test_only_folder_match PASSED [ 44/270] tests/test_minimal/test_tools/test_expand_paths.py::test_only_file_match PASSED [ 45/270] tests/test_minimal/test_tools/test_expand_paths.py::test_expand_paths PASSED [ 46/270] tests/test_minimal/test_tools/test_expand_paths.py::test_expand_paths_with_extras PASSED [ 47/270] tests/test_minimal/test_tools/test_expand_paths.py::test_expand_paths_ibl PASSED [ 48/270] tests/test_minimal/test_tools/test_expand_paths.py::test_construct_path_template_with_varied_cases[case0] PASSED [ 49/270] tests/test_minimal/test_tools/test_expand_paths.py::test_construct_path_template_with_varied_cases[case1] PASSED [ 50/270] 
tests/test_minimal/test_tools/test_expand_paths.py::test_construct_path_template_with_varied_cases[case2] PASSED [ 51/270] tests/test_minimal/test_tools/test_expand_paths.py::test_construct_path_template_with_varied_cases[case3] PASSED [ 52/270] tests/test_minimal/test_tools/test_expand_paths.py::test_construct_path_template_with_varied_cases[case4] PASSED [ 53/270] tests/test_minimal/test_tools/test_expand_paths.py::test_construct_path_template_with_varied_cases[case5] PASSED [ 54/270] tests/test_minimal/test_tools/test_expand_paths.py::test_empty_subject_id PASSED [ 55/270] tests/test_minimal/test_tools/test_expand_paths.py::test_empty_session_id PASSED [ 56/270] tests/test_minimal/test_tools/test_expand_paths.py::test_missing_subject_id_in_path PASSED [ 57/270] tests/test_minimal/test_tools/test_expand_paths.py::test_missing_session_id_in_path PASSED [ 58/270] tests/test_minimal/test_tools/test_expand_paths.py::test_empty_metadata_value PASSED [ 59/270] tests/test_minimal/test_tools/test_expand_paths.py::test_missing_metadata_value_in_path PASSED [ 60/270] tests/test_minimal/test_tools/test_importing.py::test_guide_attributes PASSED [ 61/270] tests/test_minimal/test_tools/test_nwb_helpers.py::TestNWBHelpers::test_make_nwbfile_from_metadata_empty PASSED [ 62/270] tests/test_minimal/test_tools/test_nwb_helpers.py::TestNWBHelpers::test_make_nwbfile_from_metadata_no_in_place_modification PASSED [ 63/270] tests/test_minimal/test_tools/test_nwb_helpers.py::TestNWBHelpers::test_make_nwbfile_from_metadata_session_start_time PASSED [ 64/270] tests/test_minimal/test_tools/test_nwb_helpers.py::TestNWBHelpers::test_make_nwbfile_successful PASSED [ 65/270] tests/test_minimal/test_tools/test_nwb_helpers.py::TestNWBHelpers::test_metadata_integrity PASSED [ 66/270] tests/test_minimal/test_tools/test_signal_processing.py::TestGetRisingAndFallingTimesFromTTL::test_current_defaults PASSED [ 67/270] tests/test_minimal/test_tools/test_signal_processing.py::TestGetRisingAndFallingTimesFromTTL::test_custom_threshold_floats PASSED [ 68/270] tests/test_minimal/test_tools/test_signal_processing.py::TestGetRisingAndFallingTimesFromTTL::test_end_during_off_pulse_floats PASSED [ 69/270] tests/test_minimal/test_tools/test_signal_processing.py::TestGetRisingAndFallingTimesFromTTL::test_end_during_off_pulse_int16 PASSED [ 70/270] tests/test_minimal/test_tools/test_signal_processing.py::TestGetRisingAndFallingTimesFromTTL::test_end_during_on_pulse_floats PASSED [ 71/270] tests/test_minimal/test_tools/test_signal_processing.py::TestGetRisingAndFallingTimesFromTTL::test_end_during_on_pulse_int16 PASSED [ 72/270] tests/test_minimal/test_tools/test_signal_processing.py::TestGetRisingAndFallingTimesFromTTL::test_explicit_original_defaults PASSED [ 73/270] tests/test_minimal/test_tools/test_signal_processing.py::TestGetRisingAndFallingTimesFromTTL::test_input_dimensions_assertion PASSED [ 74/270] tests/test_minimal/test_tools/test_signal_processing.py::TestGetRisingAndFallingTimesFromTTL::test_start_during_on_pulse_floats PASSED [ 75/270] tests/test_minimal/test_tools/test_signal_processing.py::TestGetRisingAndFallingTimesFromTTL::test_start_during_on_pulse_int16 PASSED [ 76/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_length_consistency[mock_HDF5DatasetIOConfiguration] PASSED [ 77/270] 
tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_length_consistency[mock_ZarrDatasetIOConfiguration] PASSED [ 78/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_and_buffer_length_consistency[mock_HDF5DatasetIOConfiguration] PASSED [ 79/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_and_buffer_length_consistency[mock_ZarrDatasetIOConfiguration] PASSED [ 80/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_shape_nonpositive_elements[mock_HDF5DatasetIOConfiguration] PASSED [ 81/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_shape_nonpositive_elements[mock_ZarrDatasetIOConfiguration] PASSED [ 82/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_buffer_shape_nonpositive_elements[mock_HDF5DatasetIOConfiguration] PASSED [ 83/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_buffer_shape_nonpositive_elements[mock_ZarrDatasetIOConfiguration] PASSED [ 84/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_shape_exceeds_buffer_shape[mock_HDF5DatasetIOConfiguration] PASSED [ 85/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_shape_exceeds_buffer_shape[mock_ZarrDatasetIOConfiguration] PASSED [ 86/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_buffer_shape_exceeds_full_shape[mock_HDF5DatasetIOConfiguration] PASSED [ 87/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_buffer_shape_exceeds_full_shape[mock_ZarrDatasetIOConfiguration] PASSED [ 88/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_dimensions_do_not_evenly_divide_buffer[mock_HDF5DatasetIOConfiguration] PASSED [ 89/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_dimensions_do_not_evenly_divide_buffer[mock_ZarrDatasetIOConfiguration] PASSED [ 90/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_dimensions_do_not_evenly_divide_buffer_skip_full_shape[mock_HDF5DatasetIOConfiguration] PASSED [ 91/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_common_dataset_io_configuration_model.py::test_validator_chunk_dimensions_do_not_evenly_divide_buffer_skip_full_shape[mock_ZarrDatasetIOConfiguration] PASSED [ 92/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[hdf5-unwrapped-<lambda>-iterator_options0] FAILED [ 93/270] 
tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[hdf5-generic-SliceableDataChunkIterator-iterator_options1] FAILED [ 94/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[hdf5-classic-DataChunkIterator-iterator_options2] FAILED [ 95/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[zarr-unwrapped-<lambda>-iterator_options0] FAILED [ 96/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[zarr-generic-SliceableDataChunkIterator-iterator_options1] FAILED [ 97/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[zarr-classic-DataChunkIterator-iterator_options2] FAILED [ 98/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_dynamic_table[hdf5] FAILED [ 99/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_dynamic_table[zarr] FAILED [100/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[hdf5-unwrapped-<lambda>-data_iterator_options0-timestamps_iterator_options0] FAILED [101/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[hdf5-generic-SliceableDataChunkIterator-data_iterator_options1-timestamps_iterator_options1] FAILED [102/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[hdf5-classic-DataChunkIterator-data_iterator_options2-timestamps_iterator_options2] FAILED [103/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[zarr-unwrapped-<lambda>-data_iterator_options0-timestamps_iterator_options0] FAILED [104/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[zarr-generic-SliceableDataChunkIterator-data_iterator_options1-timestamps_iterator_options1] FAILED [105/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[zarr-classic-DataChunkIterator-data_iterator_options2-timestamps_iterator_options2] FAILED [106/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[hdf5-unwrapped-<lambda>-iterator_options0] FAILED [107/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[hdf5-generic-SliceableDataChunkIterator-iterator_options1] FAILED [108/270] 
tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[hdf5-classic-DataChunkIterator-iterator_options2] FAILED [109/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[zarr-unwrapped-<lambda>-iterator_options0] FAILED [110/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[zarr-generic-SliceableDataChunkIterator-iterator_options1] FAILED [111/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[zarr-classic-DataChunkIterator-iterator_options2] FAILED [112/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_dynamic_table_override[hdf5] FAILED [113/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_dynamic_table_override[zarr] FAILED [114/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_backend_configuration.py::test_complex_hdf5 PASSED [115/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_backend_configuration.py::test_complex_zarr PASSED [116/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_time_series[hdf5-<lambda>] PASSED [117/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_time_series[hdf5-SliceableDataChunkIterator] PASSED [118/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_time_series[hdf5-DataChunkIterator] PASSED [119/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_time_series[zarr-<lambda>] PASSED [120/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_time_series[zarr-SliceableDataChunkIterator] PASSED [121/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_time_series[zarr-DataChunkIterator] PASSED [122/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_external_image_series[hdf5] PASSED [123/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_external_image_series[zarr] PASSED [124/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_dynamic_table[hdf5-<lambda>] PASSED [125/270] 
tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_dynamic_table[hdf5-SliceableDataChunkIterator] PASSED [126/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_dynamic_table[hdf5-DataChunkIterator] PASSED [127/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_dynamic_table[zarr-<lambda>] PASSED [128/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_dynamic_table[zarr-SliceableDataChunkIterator] PASSED [129/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_dynamic_table[zarr-DataChunkIterator] PASSED [130/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_ragged_units_table[hdf5] PASSED [131/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_ragged_units_table[zarr] PASSED [132/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_compass_direction[hdf5-<lambda>] PASSED [133/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_compass_direction[hdf5-SliceableDataChunkIterator] PASSED [134/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_compass_direction[hdf5-DataChunkIterator] PASSED [135/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_compass_direction[zarr-<lambda>] PASSED [136/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_compass_direction[zarr-SliceableDataChunkIterator] PASSED [137/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_compass_direction[zarr-DataChunkIterator] PASSED [138/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_ndx_events[hdf5] SKIPPED [139/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_ndx_events[zarr] SKIPPED [140/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations_appended_files.py::test_unwrapped_time_series_hdf5 PASSED [141/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations_appended_files.py::test_unwrapped_time_series_zarr PASSED [142/270] 
tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations_appended_files.py::test_unwrapped_dynamic_table_hdf5 PASSED [143/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations_appended_files.py::test_unwrapped_dynamic_table_zarr PASSED [144/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_dataset_io_configuration_model.py::test_get_data_io_kwargs_abstract_error PASSED [145/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_dataset_io_configuration_model.py::test_get_data_io_kwargs_not_implemented PASSED [146/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_hdf5_backend_configuration_model.py::test_hdf5_backend_configuration_print PASSED [147/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_hdf5_dataset_io_configuration_model.py::test_hdf5_dataset_configuration_print PASSED [148/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_hdf5_dataset_io_configuration_model.py::test_hdf5_dataset_configuration_print_with_compression_options PASSED [149/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_hdf5_dataset_io_configuration_model.py::test_hdf5_dataset_configuration_print_with_compression_disabled PASSED [150/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_hdf5_dataset_io_configuration_model.py::test_hdf5_dataset_configuration_repr PASSED [151/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_hdf5_dataset_io_configuration_model.py::test_available_hdf5_compression_methods_not_empty PASSED [152/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_hdf5_dataset_io_configuration_model.py::test_default_compression_is_always_available PASSED [153/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_hdf5_dataset_io_configuration_model.py::test_get_data_io_kwargs PASSED [154/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_backend_configuration_model.py::test_zarr_backend_configuration_print PASSED [155/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_dataset_io_configuration_model.py::test_zarr_dataset_io_configuration_print PASSED [156/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_dataset_io_configuration_model.py::test_zarr_dataset_configuration_print_with_compression_options PASSED [157/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_dataset_io_configuration_model.py::test_zarr_dataset_configuration_print_with_compression_disabled PASSED [158/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_dataset_io_configuration_model.py::test_zarr_dataset_configuration_print_with_filter_methods PASSED [159/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_dataset_io_configuration_model.py::test_zarr_dataset_configuration_print_with_filter_options PASSED [160/270] 
tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_dataset_io_configuration_model.py::test_zarr_dataset_configuration_repr PASSED [161/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_dataset_io_configuration_model.py::test_validator_filter_options_has_methods PASSED [162/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_dataset_io_configuration_model.py::test_validator_filter_methods_length_match_options PASSED [163/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_dataset_io_configuration_model.py::test_available_zarr_compression_methods_not_empty PASSED [164/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_dataset_io_configuration_model.py::test_default_compression_is_always_available PASSED [165/270] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_models/test_zarr_dataset_io_configuration_model.py::test_get_data_io_kwargs PASSED [166/270] tests/test_minimal/test_utils/test_dict.py::TestDeepDict::test_deep_update PASSED [167/270] tests/test_minimal/test_utils/test_dict.py::TestDeepDict::test_deep_update_kwargs_input PASSED [168/270] tests/test_minimal/test_utils/test_dict.py::TestDeepDict::test_deepcopy PASSED [169/270] tests/test_minimal/test_utils/test_dict.py::TestDeepDict::test_dict_magic PASSED [170/270] tests/test_minimal/test_utils/test_dict.py::TestDeepDict::test_getitem PASSED [171/270] tests/test_minimal/test_utils/test_dict.py::TestDeepDict::test_getitem_hashable PASSED [172/270] tests/test_minimal/test_utils/test_dict.py::TestDeepDict::test_missing_key PASSED [173/270] tests/test_minimal/test_utils/test_dict.py::TestDeepDict::test_recursive_conversion PASSED [174/270] tests/test_minimal/test_utils/test_dict.py::TestDeepDict::test_repr PASSED [175/270] tests/test_minimal/test_utils/test_dict.py::TestDeepDict::test_to_dict PASSED [176/270] tests/test_minimal/test_utils/test_json_schema_utils.py::test_get_schema_from_method_signature PASSED [177/270] tests/test_minimal/test_utils/test_json_schema_utils.py::test_dict_deep_update_1 PASSED [178/270] tests/test_minimal/test_utils/test_json_schema_utils.py::test_dict_deep_update_2 PASSED [179/270] tests/test_minimal/test_utils/test_json_schema_utils.py::test_dict_deep_update_3 PASSED [180/270] tests/test_minimal/test_utils/test_json_schema_utils.py::test_dict_deep_update_4 PASSED [181/270] tests/test_minimal/test_utils/test_json_schema_utils.py::test_fill_defaults PASSED [182/270] tests/test_minimal/test_utils/test_json_schema_utils.py::test_load_metadata_from_file PASSED [183/270] tests/test_minimal/test_utils/test_json_schema_utils.py::test_get_schema_from_ImagingPlane_array_type PASSED [184/270] tests/test_minimal/test_utils/test_json_schema_utils.py::test_get_schema_from_TwoPhotonSeries_array_type PASSED [185/270] tests/test_minimal/test_utils/test_json_schema_utils.py::test_np_array_encoding PASSED [186/270] tests/test_minimal/test_utils/test_json_schema_utils.py::test_get_schema_from_NWBDataInterface PASSED [187/270] tests/test_minimal/test_utils/test_path.py::test_windows_path PASSED [188/270] tests/test_minimal/test_utils/test_path.py::test_unix_path PASSED [189/270] tests/test_minimal/test_utils/test_path.py::test_mixed_path PASSED [190/270] tests/test_minimal/test_utils/test_utils.py::test_check_regular_series PASSED [191/270] 
tests/test_ecephys/test_ecephys_interfaces.py::TestRecordingInterface::test_no_slash_in_name PASSED [192/270]
tests/test_ecephys/test_ecephys_interfaces.py::TestRecordingInterface::test_stub_multi_segment PASSED [193/270]
tests/test_ecephys/test_ecephys_interfaces.py::TestRecordingInterface::test_stub_single_segment PASSED [194/270]
tests/test_ecephys/test_ecephys_interfaces.py::TestAssertions::test_spike2_import_assertions_3_10 SKIPPED (Only testing with Python 3.10!) [195/270]
tests/test_ecephys/test_ecephys_interfaces.py::TestAssertions::test_spike2_import_assertions_3_11 PASSED [196/270]
tests/test_ecephys/test_ecephys_interfaces.py::TestSortingInterface::test_sorting_full PASSED [197/270]
tests/test_ecephys/test_ecephys_interfaces.py::TestSortingInterface::test_sorting_propagate_conversion_options PASSED [198/270]
tests/test_ecephys/test_ecephys_interfaces.py::TestSortingInterface::test_sorting_stub PASSED [199/270]
tests/test_ecephys/test_ecephys_interfaces.py::TestSortingInterface::test_sorting_stub_with_recording PASSED [200/270]
tests/test_ecephys/test_mock_nidq_interface.py::TestMockSpikeGLXNIDQInterface::test_current_default_inferred_ttl_times PASSED [201/270]
tests/test_ecephys/test_mock_nidq_interface.py::TestMockSpikeGLXNIDQInterface::test_custom_inferred_ttl_times PASSED [202/270]
tests/test_ecephys/test_mock_nidq_interface.py::TestMockSpikeGLXNIDQInterface::test_explicit_original_default_inferred_ttl_times PASSED [203/270]
tests/test_ecephys/test_mock_nidq_interface.py::TestMockSpikeGLXNIDQInterface::test_mock_metadata PASSED [204/270]
tests/test_ecephys/test_mock_nidq_interface.py::TestMockSpikeGLXNIDQInterface::test_mock_run_conversion PASSED [205/270]
tests/test_ecephys/test_mock_recording_interface.py::TestMockRecordingInterface::test_conversion_as_lone_interface <- ../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/testing/data_interface_mixins.py FAILED [206/270]
tests/test_ecephys/test_mock_recording_interface.py::TestMockRecordingInterface::test_interface_alignment <- ../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/testing/data_interface_mixins.py PASSED [207/270]
tests/test_ecephys/test_mock_recording_interface.py::TestMockRecordingInterface::test_source_schema_valid <- ../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/testing/data_interface_mixins.py PASSED [208/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesWriting::test_default_values PASSED [209/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesWriting::test_invalid_write_as_argument_assertion PASSED [210/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesWriting::test_write_as_lfp PASSED [211/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesWriting::test_write_as_processing PASSED [212/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesWriting::test_write_multiple_electrical_series_from_same_electrode_group PASSED [213/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesWriting::test_write_multiple_electrical_series_with_different_electrode_groups PASSED [214/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesWriting::test_write_with_higher_gzip_level PASSED [215/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesWriting::test_write_with_lzf_compression PASSED [216/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesSavingTimestampsVsRates::test_non_uniform_timestamps PASSED [217/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesSavingTimestampsVsRates::test_uniform_timestamps PASSED [218/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesVoltsScaling::test_null_offsets_in_recording_extractor PASSED [219/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesVoltsScaling::test_uniform_non_default PASSED [220/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesVoltsScaling::test_uniform_values PASSED [221/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesVoltsScaling::test_variable_gains PASSED [222/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesVoltsScaling::test_variable_offsets_assertion PASSED [223/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesChunking::test_default_chunking PASSED [224/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesChunking::test_hdfm_iterator PASSED [225/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesChunking::test_invalid_iterator_type_assertion PASSED [226/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesChunking::test_iterator_opts_propagation PASSED [227/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesChunking::test_non_iterative_write PASSED [228/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesChunking::test_non_iterative_write_assertion PASSED [229/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteRecording::test_default_values_single_segment PASSED [230/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteRecording::test_write_bool_properties PASSED [231/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteRecording::test_write_multiple_segments PASSED [232/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_add_electrodes_addition PASSED [233/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_channel_group_names_table PASSED [234/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_common_property_extension PASSED [235/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_default_electrode_column_names PASSED [236/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_integer_channel_names PASSED [237/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_manual_row_adition_after_add_electrodes_function PASSED [238/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_manual_row_adition_before_add_electrodes_function PASSED [239/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_manual_row_adition_before_add_electrodes_function_optional_columns PASSED [240/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_new_property_addition PASSED [241/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_non_overwriting_channel_names_property PASSED [242/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_property_metadata_mismatch PASSED [243/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_row_matching_by_channel_name_with_existing_property PASSED [244/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_row_matching_by_channel_name_with_new_property PASSED [245/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectrodes::test_string_channel_names PASSED [246/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_add_existing_units PASSED [247/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_common_property_extension PASSED [248/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_integer_unit_names PASSED [249/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_integer_unit_names_overwrite PASSED [250/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_manual_extension_after_add_units_table PASSED [251/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_non_overwriting_unit_names_sorting_property PASSED [252/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_property_addition PASSED [253/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_property_matching_by_unit_name_with_existing_property PASSED [254/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_property_matching_by_unit_name_with_new_property PASSED [255/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_string_unit_names PASSED [256/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_string_unit_names_overwrite PASSED [257/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_units_table_extension_after_manual_unit_addition PASSED [258/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_write_bool_properties PASSED [259/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_write_subset_units PASSED [260/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestAddUnitsTable::test_write_units_table_in_processing_module PASSED [261/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property PASSED [262/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_missing_electrode_group PASSED [263/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_units_table_name PASSED [264/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_write_multiple_probes_with_electrical_series PASSED [265/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_write_multiple_probes_without_electrical_series PASSED [266/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_write_multiple_segments PASSED [267/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_write_recordingless PASSED [268/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_write_single_segment PASSED [269/270]
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_write_subset_units PASSED [270/270]

================================================================= FAILURES =================================================================
____________________________________ test_simple_time_series[hdf5-unwrapped-<lambda>-iterator_options0] ____________________________________
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_hdf5_u0')
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1...5954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
case_name = 'unwrapped', iterator = <function <lambda> at 0x7f29da632660>, iterator_options = {}, backend = 'hdf5'

    @pytest.mark.parametrize(
        "case_name,iterator,iterator_options",
        [
            ("unwrapped", lambda x: x, dict()),
            ("generic", SliceableDataChunkIterator, dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000 * 5)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_time_series(
        tmpdir: Path,
        integer_array: np.ndarray,
        case_name: str,
        iterator: callable,
        iterator_options: dict,
        backend: Literal["hdf5", "zarr"],
    ):
        data = iterator(integer_array, **iterator_options)
        nwbfile = mock_NWBFile()
        time_series = mock_TimeSeries(name="TestTimeSeries", data=data)
        nwbfile.add_acquisition(time_series)
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestTimeSeries/data"]
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'hdf5'
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': HDF5DatasetIOConfiguration(object_id='6f9579ae-af14-4eb7-a470-e4ec3dea2776', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)})
case_name = 'unwrapped'
data = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
dataset_configuration = HDF5DatasetIOConfiguration(object_id='6f9579ae-af14-4eb7-a470-e4ec3dea2776', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
iterator = <function <lambda> at 0x7f29da632660>
iterator_options = {}
nwbfile = root pynwb.file.NWBFile at 0x139817685231632 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 40, 695676, tzinfo=tzlocal())] identifier: 288fb792-2cef-4987-adcb-44e60547da2d session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
time_series = TestTimeSeries pynwb.base.TimeSeries at 0x139817685079504 Fields: comments: no comments conversion: 1.0 data: [[ 606 22977 27598 ... 21453 14831 29962] [-26530 -9155 -6666 ... 18490 -6943 1704] [ 10727 16504 20858 ... -14473 23537 17539] ... [ 20291 10140 26729 ... -5514 8882 19710] [ 22656 25954 -21319 ... -8983 -30074 -24446] [-30841 -12815 28599 ... 24069 -15762 -3284]] description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_hdf5_u0')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': HDF5DatasetIOConfiguration(object_id='6f9579ae-af14-4eb7-a470-e4ec3dea2776', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)})
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
data_io_kwargs = {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}
dataset_configuration = HDF5DatasetIOConfiguration(object_id='6f9579ae-af14-4eb7-a470-e4ec3dea2776', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)
dataset_name = 'data'
is_dataset_linked = False
nwbfile = root pynwb.file.NWBFile at 0x139817685231632 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 40, 695676, tzinfo=tzlocal())] identifier: 288fb792-2cef-4987-adcb-44e60547da2d session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
nwbfile_object = TestTimeSeries pynwb.base.TimeSeries at 0x139817685079504 Fields: comments: no comments conversion: 1.0 data: [[ 606 22977 27598 ... 21453 14831 29962] [-26530 -9155 -6666 ... 18490 -6943 1704] [ 10727 16504 20858 ... -14473 23537 17539] ... [ 20291 10140 26729 ... -5514 8882 19710] [ 22656 25954 -21319 ... -8983 -30074 -24446] [-30841 -12815 28599 ... 24069 -15762 -3284]] description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
nwbfile_objects = {'160d0b64-27f7-475e-9b16-9bfe2e85b1e5': root pynwb.file.NWBFile at 0x139817685231632 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 40, 695676, tzinfo=tzlocal())] identifier: 288fb792-2cef-4987-adcb-44e60547da2d session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 , '6f9579ae-af14-4eb7-a470-e4ec3dea2776': TestTimeSeries pynwb.base.TimeSeries at 0x139817685079504 Fields: comments: no comments conversion: 1.0 data: [[ 606 22977 27598 ... 21453 14831 29962] [-26530 -9155 -6666 ... 18490 -6943 1704] [ 10727 16504 20858 ... -14473 23537 17539] ... [ 20291 10140 26729 ... -5514 8882 19710] [ 22656 25954 -21319 ... -8983 -30074 -24446] [-30841 -12815 28599 ... 24069 -15762 -3284]] description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts }
object_id = '6f9579ae-af14-4eb7-a470-e4ec3dea2776'

/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
dataset_name = 'data'
kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
self = TestTimeSeries pynwb.base.TimeSeries at 0x139817685079504 Fields: comments: no comments conversion: 1.0 data: [[ 606 22977 27598 ... 21453 14831 29962] [-26530 -9155 -6666 ... 18490 -6943 1704] [ 10727 16504 20858 ... -14473 23537 17539] ... [ 20291 10140 26729 ... -5514 8882 19710] [ 22656 25954 -21319 ... -8983 -30074 -24446] [-30841 -12815 28599 ... 24069 -15762 -3284]] description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts

/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
_check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29dc9f9bc0>
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d7a46710>,)
func = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
kwargs = {'data': array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16), 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d7a46710>,)
kwargs = {'data': array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -...2, -3284]], dtype=int16), 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}

    def _check_args(args, kwargs):
        """Parse and check arguments to decorated function. Raise warnings and errors as appropriate."""
        # this function was separated from func_call() in order to make stepping through lines of code using pdb
        # easier
        parsed = __parse_args(
            loc_val,
            args[1:] if is_method else args,
            kwargs,
            enforce_type=enforce_type,
            enforce_shape=enforce_shape,
            allow_extra=allow_extra,
            allow_positional=allow_positional
        )
        parse_warnings = parsed.get('future_warnings')
        if parse_warnings:
            msg = '%s: %s' % (func.__qualname__, ', '.join(parse_warnings))
            warnings.warn(msg, FutureWarning)
        for error_type, ExceptionType in (('type_errors', TypeError), ('value_errors', ValueError), ('syntax_errors', SyntaxError)):
            parse_err = parsed.get(error_type)
            if parse_err:
                msg = '%s: %s' % (func.__qualname__, ', '.join(parse_err))
>               raise ExceptionType(msg)
E               TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'

ExceptionType = <class 'TypeError'>
allow_extra = False
allow_positional = True
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d7a46710>,)
enforce_shape = True
enforce_type = True
error_type = 'type_errors'
func = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
is_method = True
kwargs = {'data': array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16), 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
loc_val = [{'default': None, 'doc': 'the data to be written. NOTE: If an h5py.Dataset is used, all other settings but link_data will be ignored as the dataset will either be linked to or copied as is in H5DataIO.', 'name': 'data', 'type': (<class 'numpy.ndarray'>, <class 'list'>, <class 'tuple'>, <class 'h5py._hl.dataset.Dataset'>, <class 'collections.abc.Iterable'>)},
           {'default': None, 'doc': 'Dataset will be resizable up to this shape (Tuple). Automatically enables chunking.Use None for the axes you want to be unlimited.', 'name': 'maxshape', 'type': <class 'tuple'>},
           {'default': None, 'doc': 'Chunk shape or True to enable auto-chunking', 'name': 'chunks', 'type': (<class 'bool'>, <class 'tuple'>)},
           {'default': None, 'doc': 'Compression strategy. If a bool is given, then gzip compression will be used by default.http://docs.h5py.org/en/latest/high/dataset.html#dataset-compression', 'name': 'compression', 'type': (<class 'str'>, <class 'bool'>, <class 'int'>)},
           {'default': None, 'doc': 'Parameter for compression filter', 'name': 'compression_opts', 'type': (<class 'int'>, <class 'tuple'>)},
           {'default': None, 'doc': 'Value to be returned when reading uninitialized parts of the dataset', 'name': 'fillvalue', 'type': None},
           {'default': None, 'doc': 'Enable shuffle I/O filter. http://docs.h5py.org/en/latest/high/dataset.html#dataset-shuffle', 'name': 'shuffle', 'type': <class 'bool'>},
           {'default': None, 'doc': 'Enable fletcher32 checksum. http://docs.h5py.org/en/latest/high/dataset.html#dataset-fletcher32', 'name': 'fletcher32', 'type': <class 'bool'>},
           {'default': False, 'doc': 'If data is an h5py.Dataset should it be linked to or copied. NOTE: This parameter is only allowed if data is an h5py.Dataset', 'name': 'link_data', 'type': <class 'bool'>},
           {'default': False, 'doc': 'Enable passing dynamically loaded filters as compression parameter', 'name': 'allow_plugin_filters', 'type': <class 'bool'>},
           {'default': None, 'doc': 'the shape of the new dataset, used only if data is None', 'name': 'shape', 'type': <class 'tuple'>},
           {'default': None, 'doc': 'the data type of the new dataset, used only if data is None', 'name': 'dtype', 'type': (<class 'str'>, <class 'type'>, <class 'numpy.dtype'>)}]
msg = "H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed = {'args': {'allow_plugin_filters': False, 'chunks': None, 'compression': None, 'compression_opts': None, 'data': array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16), 'dtype': None, 'fillvalue': None, 'fletcher32': None, 'link_data': False, 'maxshape': None, 'shape': None, 'shuffle': None}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}

/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
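Every parametrization of this test fails at the same point: neuroconv 0.4.8's configure_backend passes a nested data_io_kwargs dictionary through Container.set_data_io, while the hdmf installed here (see container.py:746 above) forwards its keyword arguments unchanged into the DataIO constructor, whose docval accepts only flattened arguments. A minimal sketch of the mismatch, using nothing beyond the H5DataIO signature printed in the traceback; that a newer hdmf's set_data_io understands the nested form is an assumption about the intended version pairing, not something this log states:

    # Sketch: reproduce the TypeError above outside the test suite.
    import numpy as np
    from hdmf.backends.hdf5.h5_utils import H5DataIO

    data = np.zeros((10, 10), dtype="int16")
    # The nested form that neuroconv hands to Container.set_data_io:
    nested = {"data_io_kwargs": {"chunks": (5, 5), "compression": "gzip"}}
    try:
        # This hdmf forwards kwargs as-is, so H5DataIO sees 'data_io_kwargs':
        H5DataIO(data=data, **nested)
    except TypeError as error:
        print(error)  # H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'
    # The flattened form matches the docval shown above and constructs fine:
    wrapped = H5DataIO(data=data, **nested["data_io_kwargs"])
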
____________________________ test_simple_time_series[hdf5-generic-SliceableDataChunkIterator-iterator_options1] ____________________________
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_hdf5_g0')
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1...5954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
case_name = 'generic', iterator = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>, iterator_options = {}, backend = 'hdf5'

    @pytest.mark.parametrize(
        "case_name,iterator,iterator_options",
        [
            ("unwrapped", lambda x: x, dict()),
            ("generic", SliceableDataChunkIterator, dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000 * 5)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_time_series(
        tmpdir: Path,
        integer_array: np.ndarray,
        case_name: str,
        iterator: callable,
        iterator_options: dict,
        backend: Literal["hdf5", "zarr"],
    ):
        data = iterator(integer_array, **iterator_options)
        nwbfile = mock_NWBFile()
        time_series = mock_TimeSeries(name="TestTimeSeries", data=data)
        nwbfile.add_acquisition(time_series)
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestTimeSeries/data"]
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'hdf5'
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': HDF5DatasetIOConfiguration(object_id='eb6f40b0-6453-46e3-8e57-6caf39527d4e', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)})
case_name = 'generic'
data = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29da129890>
dataset_configuration = HDF5DatasetIOConfiguration(object_id='eb6f40b0-6453-46e3-8e57-6caf39527d4e', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
iterator = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>
iterator_options = {}
nwbfile = root pynwb.file.NWBFile at 0x139817698881872 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 40, 885482, tzinfo=tzlocal())] identifier: 37f5a8e8-917c-49fe-8f53-8a59fcaceb45 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
time_series = TestTimeSeries pynwb.base.TimeSeries at 0x139817689965200 Fields: comments: no comments conversion: 1.0 data: <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29da129890> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_hdf5_g0')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': HDF5DatasetIOConfiguration(object_id='eb6f40b0-6453-46e3-8e57-6caf39527d4e', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)})
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
data_io_kwargs = {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}
dataset_configuration = HDF5DatasetIOConfiguration(object_id='eb6f40b0-6453-46e3-8e57-6caf39527d4e', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)
dataset_name = 'data'
is_dataset_linked = False
nwbfile = root pynwb.file.NWBFile at 0x139817698881872 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 40, 885482, tzinfo=tzlocal())] identifier: 37f5a8e8-917c-49fe-8f53-8a59fcaceb45 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
nwbfile_object = TestTimeSeries pynwb.base.TimeSeries at 0x139817689965200 Fields: comments: no comments conversion: 1.0 data: <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29da129890> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
nwbfile_objects = {'6a2c1674-4ff4-4567-b90b-c847e0a0ac4a': root pynwb.file.NWBFile at 0x139817698881872 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 40, 885482, tzinfo=tzlocal())] identifier: 37f5a8e8-917c-49fe-8f53-8a59fcaceb45 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 , 'eb6f40b0-6453-46e3-8e57-6caf39527d4e': TestTimeSeries pynwb.base.TimeSeries at 0x139817689965200 Fields: comments: no comments conversion: 1.0 data: <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29da129890> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts }
object_id = 'eb6f40b0-6453-46e3-8e57-6caf39527d4e'

/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29da129890>
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
dataset_name = 'data'
kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
self = TestTimeSeries pynwb.base.TimeSeries at 0x139817689965200 Fields: comments: no comments conversion: 1.0 data: <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29da129890> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts

/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
_check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29dc9f9bc0>
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d7be4f50>,)
func = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
kwargs = {'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29da129890>, 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d7be4f50>,)
kwargs = {'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29da129890>, 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}

    def _check_args(args, kwargs):
        """Parse and check arguments to decorated function. Raise warnings and errors as appropriate."""
        # this function was separated from func_call() in order to make stepping through lines of code using pdb
        # easier
        parsed = __parse_args(
            loc_val,
            args[1:] if is_method else args,
            kwargs,
            enforce_type=enforce_type,
            enforce_shape=enforce_shape,
            allow_extra=allow_extra,
            allow_positional=allow_positional
        )
        parse_warnings = parsed.get('future_warnings')
        if parse_warnings:
            msg = '%s: %s' % (func.__qualname__, ', '.join(parse_warnings))
            warnings.warn(msg, FutureWarning)
        for error_type, ExceptionType in (('type_errors', TypeError), ('value_errors', ValueError), ('syntax_errors', SyntaxError)):
            parse_err = parsed.get(error_type)
            if parse_err:
                msg = '%s: %s' % (func.__qualname__, ', '.join(parse_err))
>               raise ExceptionType(msg)
E               TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'

ExceptionType = <class 'TypeError'>
allow_extra = False
allow_positional = True
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d7be4f50>,)
enforce_shape = True
enforce_type = True
error_type = 'type_errors'
func = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
is_method = True
kwargs = {'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29da129890>, 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
loc_val = [{'default': None, 'doc': 'the data to be written. NOTE: If an h5py.Dataset is used, all other settings but link_data will be ignored as the dataset will either be linked to or copied as is in H5DataIO.', 'name': 'data', 'type': (<class 'numpy.ndarray'>, <class 'list'>, <class 'tuple'>, <class 'h5py._hl.dataset.Dataset'>, <class 'collections.abc.Iterable'>)},
           {'default': None, 'doc': 'Dataset will be resizable up to this shape (Tuple). Automatically enables chunking.Use None for the axes you want to be unlimited.', 'name': 'maxshape', 'type': <class 'tuple'>},
           {'default': None, 'doc': 'Chunk shape or True to enable auto-chunking', 'name': 'chunks', 'type': (<class 'bool'>, <class 'tuple'>)},
           {'default': None, 'doc': 'Compression strategy. If a bool is given, then gzip compression will be used by default.http://docs.h5py.org/en/latest/high/dataset.html#dataset-compression', 'name': 'compression', 'type': (<class 'str'>, <class 'bool'>, <class 'int'>)},
           {'default': None, 'doc': 'Parameter for compression filter', 'name': 'compression_opts', 'type': (<class 'int'>, <class 'tuple'>)},
           {'default': None, 'doc': 'Value to be returned when reading uninitialized parts of the dataset', 'name': 'fillvalue', 'type': None},
           {'default': None, 'doc': 'Enable shuffle I/O filter. http://docs.h5py.org/en/latest/high/dataset.html#dataset-shuffle', 'name': 'shuffle', 'type': <class 'bool'>},
           {'default': None, 'doc': 'Enable fletcher32 checksum. http://docs.h5py.org/en/latest/high/dataset.html#dataset-fletcher32', 'name': 'fletcher32', 'type': <class 'bool'>},
           {'default': False, 'doc': 'If data is an h5py.Dataset should it be linked to or copied. NOTE: This parameter is only allowed if data is an h5py.Dataset', 'name': 'link_data', 'type': <class 'bool'>},
           {'default': False, 'doc': 'Enable passing dynamically loaded filters as compression parameter', 'name': 'allow_plugin_filters', 'type': <class 'bool'>},
           {'default': None, 'doc': 'the shape of the new dataset, used only if data is None', 'name': 'shape', 'type': <class 'tuple'>},
           {'default': None, 'doc': 'the data type of the new dataset, used only if data is None', 'name': 'dtype', 'type': (<class 'str'>, <class 'type'>, <class 'numpy.dtype'>)}]
msg = "H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed = {'args': {'allow_plugin_filters': False, 'chunks': None, 'compression': None, 'compression_opts': None, 'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29da129890>, 'dtype': None, 'fillvalue': None, 'fletcher32': None, 'link_data': False, 'maxshape': None, 'shape': None, 'shuffle': None}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}

/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
________________________________ test_simple_time_series[hdf5-classic-DataChunkIterator-iterator_options2] _________________________________
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_hdf5_c0')
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1...5954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
case_name = 'classic', iterator = <class 'hdmf.data_utils.DataChunkIterator'>, iterator_options = {'buffer_size': 150000, 'iter_axis': 1}
backend = 'hdf5'

    @pytest.mark.parametrize(
        "case_name,iterator,iterator_options",
        [
            ("unwrapped", lambda x: x, dict()),
            ("generic", SliceableDataChunkIterator, dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000 * 5)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_time_series(
        tmpdir: Path,
        integer_array: np.ndarray,
        case_name: str,
        iterator: callable,
        iterator_options: dict,
        backend: Literal["hdf5", "zarr"],
    ):
        data = iterator(integer_array, **iterator_options)
        nwbfile = mock_NWBFile()
        time_series = mock_TimeSeries(name="TestTimeSeries", data=data)
        nwbfile.add_acquisition(time_series)
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestTimeSeries/data"]
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'hdf5'
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': HDF5DatasetIOConfiguration(object_id='4e3b5aec-b329-4aea-9c0b-88a1bb3949c5', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)})
case_name = 'classic'
data = <hdmf.data_utils.DataChunkIterator object at 0x7f29d79b7ed0>
dataset_configuration = HDF5DatasetIOConfiguration(object_id='4e3b5aec-b329-4aea-9c0b-88a1bb3949c5', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
iterator = <class 'hdmf.data_utils.DataChunkIterator'>
iterator_options = {'buffer_size': 150000, 'iter_axis': 1}
nwbfile = root pynwb.file.NWBFile at 0x139817687664400 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 42, 408010, tzinfo=tzlocal())] identifier: e2f5d2b1-707f-42b1-b322-6f475da2467b session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
time_series = TestTimeSeries pynwb.base.TimeSeries at 0x139817687665360 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d79b7ed0> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_hdf5_c0')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': HDF5DatasetIOConfiguration(object_id='4e3b5aec-b329-4aea-9c0b-88a1bb3949c5', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)})
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
data_io_kwargs = {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}
dataset_configuration = HDF5DatasetIOConfiguration(object_id='4e3b5aec-b329-4aea-9c0b-88a1bb3949c5', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)
dataset_name = 'data'
is_dataset_linked = False
nwbfile = root pynwb.file.NWBFile at 0x139817687664400 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 42, 408010, tzinfo=tzlocal())] identifier: e2f5d2b1-707f-42b1-b322-6f475da2467b session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
nwbfile_object = TestTimeSeries pynwb.base.TimeSeries at 0x139817687665360 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d79b7ed0> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
nwbfile_objects = {'4e3b5aec-b329-4aea-9c0b-88a1bb3949c5': TestTimeSeries pynwb.base.TimeSeries at 0x139817687665360 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d79b7ed0> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts , 'bc32bbdc-1a4f-4726-90bd-4445a24950ad': root pynwb.file.NWBFile at 0x139817687664400 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 42, 408010, tzinfo=tzlocal())] identifier: e2f5d2b1-707f-42b1-b322-6f475da2467b session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 }
object_id = '4e3b5aec-b329-4aea-9c0b-88a1bb3949c5'

/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data = <hdmf.data_utils.DataChunkIterator object at 0x7f29d79b7ed0>
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
dataset_name = 'data'
kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
self = TestTimeSeries pynwb.base.TimeSeries at 0x139817687665360 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d79b7ed0> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts

/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
_check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29dc9f9bc0>
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d79b6ad0>,)
func = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
kwargs = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d79b7ed0>, 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d79b6ad0>,)
kwargs = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d79b7ed0>, 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}

    def _check_args(args, kwargs):
        """Parse and check arguments to decorated function. Raise warnings and errors as appropriate."""
        # this function was separated from func_call() in order to make stepping through lines of code using pdb
        # easier
        parsed = __parse_args(
            loc_val,
            args[1:] if is_method else args,
            kwargs,
            enforce_type=enforce_type,
            enforce_shape=enforce_shape,
            allow_extra=allow_extra,
            allow_positional=allow_positional
        )
        parse_warnings = parsed.get('future_warnings')
        if parse_warnings:
            msg = '%s: %s' % (func.__qualname__, ', '.join(parse_warnings))
            warnings.warn(msg, FutureWarning)
        for error_type, ExceptionType in (('type_errors', TypeError), ('value_errors', ValueError), ('syntax_errors', SyntaxError)):
            parse_err = parsed.get(error_type)
            if parse_err:
                msg = '%s: %s' % (func.__qualname__, ', '.join(parse_err))
>               raise ExceptionType(msg)
E               TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'

ExceptionType = <class 'TypeError'>
allow_extra = False
allow_positional = True
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d79b6ad0>,)
enforce_shape = True
enforce_type = True
error_type = 'type_errors'
func = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
is_method = True
kwargs = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d79b7ed0>, 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
loc_val = [{'default': None, 'doc': 'the data to be written. NOTE: If an h5py.Dataset is used, all other settings but link_data will be ignored as the dataset will either be linked to or copied as is in H5DataIO.', 'name': 'data', 'type': (<class 'numpy.ndarray'>, <class 'list'>, <class 'tuple'>, <class 'h5py._hl.dataset.Dataset'>, <class 'collections.abc.Iterable'>)},
           {'default': None, 'doc': 'Dataset will be resizable up to this shape (Tuple). Automatically enables chunking.Use None for the axes you want to be unlimited.', 'name': 'maxshape', 'type': <class 'tuple'>},
           {'default': None, 'doc': 'Chunk shape or True to enable auto-chunking', 'name': 'chunks', 'type': (<class 'bool'>, <class 'tuple'>)},
           {'default': None, 'doc': 'Compression strategy. If a bool is given, then gzip compression will be used by default.http://docs.h5py.org/en/latest/high/dataset.html#dataset-compression', 'name': 'compression', 'type': (<class 'str'>, <class 'bool'>, <class 'int'>)},
           {'default': None, 'doc': 'Parameter for compression filter', 'name': 'compression_opts', 'type': (<class 'int'>, <class 'tuple'>)},
           {'default': None, 'doc': 'Value to be returned when reading uninitialized parts of the dataset', 'name': 'fillvalue', 'type': None},
           {'default': None, 'doc': 'Enable shuffle I/O filter. http://docs.h5py.org/en/latest/high/dataset.html#dataset-shuffle', 'name': 'shuffle', 'type': <class 'bool'>},
           {'default': None, 'doc': 'Enable fletcher32 checksum. http://docs.h5py.org/en/latest/high/dataset.html#dataset-fletcher32', 'name': 'fletcher32', 'type': <class 'bool'>},
           {'default': False, 'doc': 'If data is an h5py.Dataset should it be linked to or copied. NOTE: This parameter is only allowed if data is an h5py.Dataset', 'name': 'link_data', 'type': <class 'bool'>},
           {'default': False, 'doc': 'Enable passing dynamically loaded filters as compression parameter', 'name': 'allow_plugin_filters', 'type': <class 'bool'>},
           {'default': None, 'doc': 'the shape of the new dataset, used only if data is None', 'name': 'shape', 'type': <class 'tuple'>},
           {'default': None, 'doc': 'the data type of the new dataset, used only if data is None', 'name': 'dtype', 'type': (<class 'str'>, <class 'type'>, <class 'numpy.dtype'>)}]
msg = "H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed = {'args': {'allow_plugin_filters': False, 'chunks': None, 'compression': None, 'compression_opts': None, 'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d79b7ed0>, 'dtype': None, 'fillvalue': None, 'fletcher32': None, 'link_data': False, 'maxshape': None, 'shape': None, 'shuffle': None}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}

/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
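The zarr parametrization that follows fails the same way; only the DataIO class and the keyword names change (compressor and filters in place of compression and compression_opts). A sketch of the flattened zarr-side call, assuming only the hdmf_zarr ZarrDataIO docval printed in the traceback below:

    # Sketch: the flattened form ZarrDataIO.__init__ actually accepts.
    import numpy as np
    from hdmf_zarr.utils import ZarrDataIO
    from numcodecs import GZip

    data = np.zeros((10, 10), dtype="int16")
    # chunks and compressor match the docval; passing data_io_kwargs={...}
    # instead raises the same 'unrecognized argument' TypeError as above.
    wrapped = ZarrDataIO(data=data, chunks=(5, 5), compressor=GZip(level=1))
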
____________________________________ test_simple_time_series[zarr-unwrapped-<lambda>-iterator_options0] ____________________________________
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_zarr_u0')
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1...5954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
case_name = 'unwrapped', iterator = <function <lambda> at 0x7f29da632660>, iterator_options = {}, backend = 'zarr'

    @pytest.mark.parametrize(
        "case_name,iterator,iterator_options",
        [
            ("unwrapped", lambda x: x, dict()),
            ("generic", SliceableDataChunkIterator, dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000 * 5)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_time_series(
        tmpdir: Path,
        integer_array: np.ndarray,
        case_name: str,
        iterator: callable,
        iterator_options: dict,
        backend: Literal["hdf5", "zarr"],
    ):
        data = iterator(integer_array, **iterator_options)
        nwbfile = mock_NWBFile()
        time_series = mock_TimeSeries(name="TestTimeSeries", data=data)
        nwbfile.add_acquisition(time_series)
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestTimeSeries/data"]
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'zarr'
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': ZarrDatasetIOConfiguration(object_id='8ccc9c56-60ce-4ccb-a24a-95f0e6401b59', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)}, number_of_jobs=11)
case_name = 'unwrapped'
data = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
dataset_configuration = ZarrDatasetIOConfiguration(object_id='8ccc9c56-60ce-4ccb-a24a-95f0e6401b59', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
iterator = <function <lambda> at 0x7f29da632660>
iterator_options = {}
nwbfile = root pynwb.file.NWBFile at 0x139817683015440 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 42, 468867, tzinfo=tzlocal())] identifier: f3a79c3e-9746-4f37-8ff6-8416703333d5 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
time_series = TestTimeSeries pynwb.base.TimeSeries at 0x139817683014544 Fields: comments: no comments conversion: 1.0 data: [[ 606 22977 27598 ... 21453 14831 29962] [-26530 -9155 -6666 ... 18490 -6943 1704] [ 10727 16504 20858 ... -14473 23537 17539] ... [ 20291 10140 26729 ... -5514 8882 19710] [ 22656 25954 -21319 ... -8983 -30074 -24446] [-30841 -12815 28599 ... 24069 -15762 -3284]] description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_zarr_u0')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': ZarrDatasetIOConfiguration(object_id='8ccc9c56-60ce-4ccb-a24a-95f0e6401b59', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)}, number_of_jobs=11)
data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
data_io_kwargs = {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}
dataset_configuration = ZarrDatasetIOConfiguration(object_id='8ccc9c56-60ce-4ccb-a24a-95f0e6401b59', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)
dataset_name = 'data'
is_dataset_linked = False
nwbfile = root pynwb.file.NWBFile at 0x139817683015440 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 42, 468867, tzinfo=tzlocal())] identifier: f3a79c3e-9746-4f37-8ff6-8416703333d5 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
nwbfile_object = TestTimeSeries pynwb.base.TimeSeries at 0x139817683014544 Fields: comments: no comments conversion: 1.0 data: [[ 606 22977 27598 ... 21453 14831 29962] [-26530 -9155 -6666 ... 18490 -6943 1704] [ 10727 16504 20858 ... -14473 23537 17539] ... [ 20291 10140 26729 ... -5514 8882 19710] [ 22656 25954 -21319 ... -8983 -30074 -24446] [-30841 -12815 28599 ... 24069 -15762 -3284]] description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
nwbfile_objects = {'8ccc9c56-60ce-4ccb-a24a-95f0e6401b59': TestTimeSeries pynwb.base.TimeSeries at 0x139817683014544 Fields: comments: no comments conversion: 1.0 data: [[ 606 22977 27598 ... 21453 14831 29962] [-26530 -9155 -6666 ... 18490 -6943 1704] [ 10727 16504 20858 ... -14473 23537 17539] ... [ 20291 10140 26729 ... -5514 8882 19710] [ 22656 25954 -21319 ... -8983 -30074 -24446] [-30841 -12815 28599 ... 24069 -15762 -3284]] description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts , 'eda2c266-0642-4705-9937-192668387a21': root pynwb.file.NWBFile at 0x139817683015440 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 42, 468867, tzinfo=tzlocal())] identifier: f3a79c3e-9746-4f37-8ff6-8416703333d5 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 }
object_id = '8ccc9c56-60ce-4ccb-a24a-95f0e6401b59'

/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
dataset_name = 'data'
kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}}
self = TestTimeSeries pynwb.base.TimeSeries at 0x139817683014544 Fields: comments: no comments conversion: 1.0 data: [[ 606 22977 27598 ... 21453 14831 29962] [-26530 -9155 -6666 ... 18490 -6943 1704] [ 10727 16504 20858 ... -14473 23537 17539] ... [ 20291 10140 26729 ... -5514 8882 19710] [ 22656 25954 -21319 ... -8983 -30074 -24446] [-30841 -12815 28599 ... 24069 -15762 -3284]] description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts

/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
_check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29db1577e0>
args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d7544e10>,)
func = <function ZarrDataIO.__init__ at 0x7f29db157740>
kwargs = {'data': array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16), 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d7544e10>,)
kwargs = {'data': array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -...5762, -3284]], dtype=int16), 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}}

    def _check_args(args, kwargs):
        """Parse and check arguments to decorated function. Raise warnings and errors as appropriate."""
        # this function was separated from func_call() in order to make stepping through lines of code using pdb
        # easier
        parsed = __parse_args(
            loc_val,
            args[1:] if is_method else args,
            kwargs,
            enforce_type=enforce_type,
            enforce_shape=enforce_shape,
            allow_extra=allow_extra,
            allow_positional=allow_positional
        )
        parse_warnings = parsed.get('future_warnings')
        if parse_warnings:
            msg = '%s: %s' % (func.__qualname__, ', '.join(parse_warnings))
            warnings.warn(msg, FutureWarning)
        for error_type, ExceptionType in (('type_errors', TypeError), ('value_errors', ValueError), ('syntax_errors', SyntaxError)):
            parse_err = parsed.get(error_type)
            if parse_err:
                msg = '%s: %s' % (func.__qualname__, ', '.join(parse_err))
>               raise ExceptionType(msg)
E               TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'

ExceptionType = <class 'TypeError'>
allow_extra = False
allow_positional = True
args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d7544e10>,)
enforce_shape = True
enforce_type = True
error_type = 'type_errors'
func = <function ZarrDataIO.__init__ at 0x7f29db157740>
is_method = True
kwargs = {'data': array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16), 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}}
loc_val = [{'doc': 'the data to be written. NOTE: If an zarr.Array is used, all other settings but link_data will be ignored as the dataset will either be linked to or copied as is in ZarrIO.', 'name': 'data', 'type': (<class 'numpy.ndarray'>, <class 'list'>, <class 'tuple'>, <class 'zarr.core.Array'>, <class 'collections.abc.Iterable'>)},
           {'default': None, 'doc': 'Chunk shape', 'name': 'chunks', 'type': (<class 'list'>, <class 'tuple'>)},
           {'default': None, 'doc': 'Value to be returned when reading uninitialized parts of the dataset', 'name': 'fillvalue', 'type': None},
           {'default': None, 'doc': 'Zarr compressor filter to be used. Set to True to use Zarr default.Set to False to disable compression)', 'name': 'compressor', 'type': (<class 'numcodecs.abc.Codec'>, <class 'bool'>)},
           {'default': None, 'doc': 'One or more Zarr-supported codecs used to transform data prior to compression.', 'name': 'filters', 'type': (<class 'list'>, <class 'tuple'>)},
           {'default': False, 'doc': 'If data is an zarr.Array should it be linked to or copied.
NOTE: ' 'This parameter is only allowed if data is an zarr.Array', 'name': 'link_data', 'type': <class 'bool'>}]
msg = "ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed = {'args': {'chunks': None, 'compressor': None, 'data': array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16), 'fillvalue': None, 'filters': None, 'link_data': False}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}
/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
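The TypeError above points at a version skew between neuroconv 0.4.8 and the hdmf installed at /usr/lib/python3.11/site-packages: _configure_backend.py:33 calls set_data_io(data_io_class=..., data_io_kwargs=...), while the installed hdmf (container.py:746 in the traceback) simply forwards every extra keyword into the DataIO constructor, whose docval then rejects the unknown name. A minimal sketch of that failure mode, using stand-in classes rather than the real hdmf types:

class OldStyleDataIO:
    """Stand-in for a DataIO whose __init__ predates 'data_io_kwargs'."""
    def __init__(self, data, chunks=None, compressor=None):
        self.data, self.chunks, self.compressor = data, chunks, compressor

class OldStyleContainer:
    """Stand-in mimicking the installed hdmf's Container.set_data_io."""
    def __init__(self, fields):
        self.fields = fields

    def set_data_io(self, dataset_name, data_io_class, **kwargs):
        # Older behaviour seen at container.py:746: extra keywords are
        # passed straight into the DataIO constructor.
        data = self.fields[dataset_name]
        self.fields[dataset_name] = data_io_class(data=data, **kwargs)

container = OldStyleContainer({"data": [1, 2, 3]})
try:
    # Newer callers, neuroconv's configure_backend among them, pass a
    # nested dict under the 'data_io_kwargs' keyword:
    container.set_data_io("data", OldStyleDataIO, data_io_kwargs={"chunks": (2,)})
except TypeError as error:
    print(error)  # ... got an unexpected keyword argument 'data_io_kwargs'

If that reading is correct, an hdmf release that declares data_io_kwargs as an explicit parameter of set_data_io would accept the same call unchanged; the test code itself is not at fault.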
____________________________ test_simple_time_series[zarr-generic-SliceableDataChunkIterator-iterator_options1] ____________________________
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_zarr_g0')
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1...5954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
case_name = 'generic', iterator = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>, iterator_options = {}, backend = 'zarr'

    @pytest.mark.parametrize(
        "case_name,iterator,iterator_options",
        [
            ("unwrapped", lambda x: x, dict()),
            ("generic", SliceableDataChunkIterator, dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000 * 5)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_time_series(
        tmpdir: Path,
        integer_array: np.ndarray,
        case_name: str,
        iterator: callable,
        iterator_options: dict,
        backend: Literal["hdf5", "zarr"],
    ):
        data = iterator(integer_array, **iterator_options)
        nwbfile = mock_NWBFile()
        time_series = mock_TimeSeries(name="TestTimeSeries", data=data)
        nwbfile.add_acquisition(time_series)
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestTimeSeries/data"]
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'zarr'
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': ZarrDatasetIOConfiguration(object_id='d873007f-b05c-46ae-8bda-65e3a16d22ca', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)}, number_of_jobs=11)
case_name = 'generic'
data = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7ab9710>
dataset_configuration = ZarrDatasetIOConfiguration(object_id='d873007f-b05c-46ae-8bda-65e3a16d22ca', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
iterator = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>
iterator_options = {}
nwbfile = root pynwb.file.NWBFile at 0x139817688730576 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 42, 519745, tzinfo=tzlocal())] identifier: fdd9553b-b19b-4d04-94e6-0506fe6c91a5 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
time_series = TestTimeSeries pynwb.base.TimeSeries at 0x139817680468240 Fields: comments: no comments conversion: 1.0 data: <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7ab9710> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_zarr_g0')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': 
ZarrDatasetIOConfiguration(object_id='d873007f-b05c-46ae-8bda-65e3a16d22ca', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)}, number_of_jobs=11) data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'> data_io_kwargs = {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None} dataset_configuration = ZarrDatasetIOConfiguration(object_id='d873007f-b05c-46ae-8bda-65e3a16d22ca', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None) dataset_name = 'data' is_dataset_linked = False nwbfile = root pynwb.file.NWBFile at 0x139817688730576 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 42, 519745, tzinfo=tzlocal())] identifier: fdd9553b-b19b-4d04-94e6-0506fe6c91a5 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 nwbfile_object = TestTimeSeries pynwb.base.TimeSeries at 0x139817680468240 Fields: comments: no comments conversion: 1.0 data: <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7ab9710> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts nwbfile_objects = {'c9e54d7b-ef81-4e03-8ce6-0181b398123e': root pynwb.file.NWBFile at 0x139817688730576 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 42, 519745, tzinfo=tzlocal())] identifier: fdd9553b-b19b-4d04-94e6-0506fe6c91a5 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 , 'd873007f-b05c-46ae-8bda-65e3a16d22ca': TestTimeSeries pynwb.base.TimeSeries at 0x139817680468240 Fields: comments: no comments conversion: 1.0 data: <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7ab9710> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts } object_id = 'd873007f-b05c-46ae-8bda-65e3a16d22ca' /usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io self.fields[dataset_name] = data_io_class(data=data, **kwargs) data = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7ab9710> data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'> dataset_name = 'data' kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}} self = TestTimeSeries pynwb.base.TimeSeries at 0x139817680468240 Fields: comments: no comments conversion: 1.0 data: <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7ab9710> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts /usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call pargs = _check_args(args, kwargs) _check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29db1577e0> args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d72d5ed0>,) func = <function ZarrDataIO.__init__ at 0x7f29db157740> kwargs = 
{'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7ab9710>, 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}} _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d72d5ed0>,) kwargs = {'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7ab9710>, 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}} def _check_args(args, kwargs): """Parse and check arguments to decorated function. Raise warnings and errors as appropriate.""" # this function was separated from func_call() in order to make stepping through lines of code using pdb # easier parsed = __parse_args( loc_val, args[1:] if is_method else args, kwargs, enforce_type=enforce_type, enforce_shape=enforce_shape, allow_extra=allow_extra, allow_positional=allow_positional ) parse_warnings = parsed.get('future_warnings') if parse_warnings: msg = '%s: %s' % (func.__qualname__, ', '.join(parse_warnings)) warnings.warn(msg, FutureWarning) for error_type, ExceptionType in (('type_errors', TypeError), ('value_errors', ValueError), ('syntax_errors', SyntaxError)): parse_err = parsed.get(error_type) if parse_err: msg = '%s: %s' % (func.__qualname__, ', '.join(parse_err)) > raise ExceptionType(msg) E TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs' ExceptionType = <class 'TypeError'> allow_extra = False allow_positional = True args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d72d5ed0>,) enforce_shape = True enforce_type = True error_type = 'type_errors' func = <function ZarrDataIO.__init__ at 0x7f29db157740> is_method = True kwargs = {'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7ab9710>, 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}} loc_val = [{'doc': 'the data to be written. NOTE: If an zarr.Array is used, all other ' 'settings but link_data will be ignored as the dataset will either be ' 'linked to or copied as is in ZarrIO.', 'name': 'data', 'type': (<class 'numpy.ndarray'>, <class 'list'>, <class 'tuple'>, <class 'zarr.core.Array'>, <class 'collections.abc.Iterable'>)}, {'default': None, 'doc': 'Chunk shape', 'name': 'chunks', 'type': (<class 'list'>, <class 'tuple'>)}, {'default': None, 'doc': 'Value to be returned when reading uninitialized parts of the dataset', 'name': 'fillvalue', 'type': None}, {'default': None, 'doc': 'Zarr compressor filter to be used. Set to True to use Zarr ' 'default.Set to False to disable compression)', 'name': 'compressor', 'type': (<class 'numcodecs.abc.Codec'>, <class 'bool'>)}, {'default': None, 'doc': 'One or more Zarr-supported codecs used to transform data prior to ' 'compression.', 'name': 'filters', 'type': (<class 'list'>, <class 'tuple'>)}, {'default': False, 'doc': 'If data is an zarr.Array should it be linked to or copied. 
NOTE: ' 'This parameter is only allowed if data is an zarr.Array', 'name': 'link_data', 'type': <class 'bool'>}]
msg = "ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed = {'args': {'chunks': None, 'compressor': None, 'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7ab9710>, 'fillvalue': None, 'filters': None, 'link_data': False}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}
/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
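The "classic" parametrization that fails next wraps integer_array in hdmf's DataChunkIterator, and the source comment in the parametrize table explains the hardcoded buffer: DataChunkIterator buffers only a single slice per step by default (buffer_size=1), so the test pins buffer_size=30_000 * 5 (matching the 150 000 rows of the (150000, 384) array) to make it read large blocks instead. A sketch of that wrapping, with a randomized array standing in for the suite's integer_array fixture:

import numpy as np
from hdmf.data_utils import DataChunkIterator

# Stand-in for the integer_array fixture (~110 MB of int16 values).
integer_array = np.random.randint(low=-32768, high=32767, size=(150_000, 384), dtype=np.int16)

# Mirrors dict(iter_axis=1, buffer_size=30_000 * 5) from the parametrize
# table above: iterate along axis 1, buffering far more than one slice
# per step so iteration stays tractable.
data = DataChunkIterator(data=integer_array, iter_axis=1, buffer_size=30_000 * 5)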
________________________________ test_simple_time_series[zarr-classic-DataChunkIterator-iterator_options2] _________________________________
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_zarr_c0')
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1...5954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
case_name = 'classic', iterator = <class 'hdmf.data_utils.DataChunkIterator'>, iterator_options = {'buffer_size': 150000, 'iter_axis': 1}
backend = 'zarr'

    @pytest.mark.parametrize(
        "case_name,iterator,iterator_options",
        [
            ("unwrapped", lambda x: x, dict()),
            ("generic", SliceableDataChunkIterator, dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000 * 5)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_time_series(
        tmpdir: Path,
        integer_array: np.ndarray,
        case_name: str,
        iterator: callable,
        iterator_options: dict,
        backend: Literal["hdf5", "zarr"],
    ):
        data = iterator(integer_array, **iterator_options)
        nwbfile = mock_NWBFile()
        time_series = mock_TimeSeries(name="TestTimeSeries", data=data)
        nwbfile.add_acquisition(time_series)
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestTimeSeries/data"]
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'zarr'
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': ZarrDatasetIOConfiguration(object_id='57a4ea63-d9eb-4ff4-88c5-47349be9e0e6', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)}, number_of_jobs=11)
case_name = 'classic'
data = <hdmf.data_utils.DataChunkIterator object at 0x7f29d7659d50>
dataset_configuration = ZarrDatasetIOConfiguration(object_id='57a4ea63-d9eb-4ff4-88c5-47349be9e0e6', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
iterator = <class 'hdmf.data_utils.DataChunkIterator'>
iterator_options = {'buffer_size': 150000, 'iter_axis': 1}
nwbfile = root pynwb.file.NWBFile at 0x139817684150992 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 44, 33351, tzinfo=tzlocal())] identifier: 49d22803-003b-4741-a0ca-17ea48ff74eb session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
time_series = TestTimeSeries pynwb.base.TimeSeries at 0x139817684146832 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d7659d50> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_zarr_c0')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': ZarrDatasetIOConfiguration(object_id='57a4ea63-d9eb-4ff4-88c5-47349be9e0e6', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)}, number_of_jobs=11)
data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
data_io_kwargs = {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}
dataset_configuration = ZarrDatasetIOConfiguration(object_id='57a4ea63-d9eb-4ff4-88c5-47349be9e0e6', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)
dataset_name = 'data'
is_dataset_linked = False
nwbfile = root pynwb.file.NWBFile at 0x139817684150992 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 44, 33351, tzinfo=tzlocal())] identifier: 49d22803-003b-4741-a0ca-17ea48ff74eb session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
nwbfile_object = TestTimeSeries pynwb.base.TimeSeries at 0x139817684146832 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d7659d50> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
nwbfile_objects = {'57a4ea63-d9eb-4ff4-88c5-47349be9e0e6': TestTimeSeries pynwb.base.TimeSeries at 0x139817684146832 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d7659d50> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts , 
'deb3bd14-ae3f-49b1-8d7c-937196b3c673': root pynwb.file.NWBFile at 0x139817684150992 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 44, 33351, tzinfo=tzlocal())] identifier: 49d22803-003b-4741-a0ca-17ea48ff74eb session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 } object_id = '57a4ea63-d9eb-4ff4-88c5-47349be9e0e6' /usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io self.fields[dataset_name] = data_io_class(data=data, **kwargs) data = <hdmf.data_utils.DataChunkIterator object at 0x7f29d7659d50> data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'> dataset_name = 'data' kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}} self = TestTimeSeries pynwb.base.TimeSeries at 0x139817684146832 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d7659d50> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts /usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call pargs = _check_args(args, kwargs) _check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29db1577e0> args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d7659c50>,) func = <function ZarrDataIO.__init__ at 0x7f29db157740> kwargs = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d7659d50>, 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}} _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d7659c50>,) kwargs = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d7659d50>, 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}} def _check_args(args, kwargs): """Parse and check arguments to decorated function. Raise warnings and errors as appropriate.""" # this function was separated from func_call() in order to make stepping through lines of code using pdb # easier parsed = __parse_args( loc_val, args[1:] if is_method else args, kwargs, enforce_type=enforce_type, enforce_shape=enforce_shape, allow_extra=allow_extra, allow_positional=allow_positional ) parse_warnings = parsed.get('future_warnings') if parse_warnings: msg = '%s: %s' % (func.__qualname__, ', '.join(parse_warnings)) warnings.warn(msg, FutureWarning) for error_type, ExceptionType in (('type_errors', TypeError), ('value_errors', ValueError), ('syntax_errors', SyntaxError)): parse_err = parsed.get(error_type) if parse_err: msg = '%s: %s' % (func.__qualname__, ', '.join(parse_err)) > raise ExceptionType(msg) E TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs' ExceptionType = <class 'TypeError'> allow_extra = False allow_positional = True args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d7659c50>,) enforce_shape = True enforce_type = True error_type = 'type_errors' func = <function ZarrDataIO.__init__ at 0x7f29db157740> is_method = True kwargs = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d7659d50>, 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}} loc_val = [{'doc': 'the data to be written. 
NOTE: If an zarr.Array is used, all other ' 'settings but link_data will be ignored as the dataset will either be ' 'linked to or copied as is in ZarrIO.', 'name': 'data', 'type': (<class 'numpy.ndarray'>, <class 'list'>, <class 'tuple'>, <class 'zarr.core.Array'>, <class 'collections.abc.Iterable'>)}, {'default': None, 'doc': 'Chunk shape', 'name': 'chunks', 'type': (<class 'list'>, <class 'tuple'>)}, {'default': None, 'doc': 'Value to be returned when reading uninitialized parts of the dataset', 'name': 'fillvalue', 'type': None}, {'default': None, 'doc': 'Zarr compressor filter to be used. Set to True to use Zarr ' 'default.Set to False to disable compression)', 'name': 'compressor', 'type': (<class 'numcodecs.abc.Codec'>, <class 'bool'>)}, {'default': None, 'doc': 'One or more Zarr-supported codecs used to transform data prior to ' 'compression.', 'name': 'filters', 'type': (<class 'list'>, <class 'tuple'>)}, {'default': False, 'doc': 'If data is an zarr.Array should it be linked to or copied. NOTE: ' 'This parameter is only allowed if data is an zarr.Array', 'name': 'link_data', 'type': <class 'bool'>}]
msg = "ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed = {'args': {'chunks': None, 'compressor': None, 'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d7659d50>, 'fillvalue': None, 'filters': None, 'link_data': False}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}
/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
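All three zarr failures above reject the identical payload, data_io_kwargs={'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}, and the hdf5 failures further down carry the analogous {'chunks': ..., 'compression': 'gzip', 'compression_opts': None}. Judging by the docval listings dumped in the tracebacks, the constructors themselves accept exactly these options when they arrive as real keywords; a sketch of that direct calling convention, under the assumption that hdmf, hdmf-zarr, and numcodecs are importable as in this build:

import numpy as np
from numcodecs import GZip
from hdmf.backends.hdf5.h5_utils import H5DataIO
from hdmf_zarr.utils import ZarrDataIO

data = np.zeros((150_000, 384), dtype=np.int16)

# The same settings the failing tests computed, passed as the keywords the
# installed constructors declare (rather than nested under data_io_kwargs):
zarr_io = ZarrDataIO(data=data, chunks=(44070, 113), compressor=GZip(level=1))
hdf5_io = H5DataIO(data=data, chunks=(44070, 113), compression="gzip")

So the chunking and compression choices look sound in themselves; only the keyword plumbing between the two library generations disagrees.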
_____________________________________________________ test_simple_dynamic_table[hdf5] ______________________________________________________
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_dynamic_table_hdf50')
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1...5954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
backend = 'hdf5'

    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_dynamic_table(tmpdir: Path, integer_array: np.ndarray, backend: Literal["hdf5", "zarr"]):
        nwbfile = mock_NWBFile()
        dynamic_table = DynamicTable(
            name="TestDynamicTable",
            description="",
            columns=[VectorData(name="TestColumn", description="", data=integer_array)],
        )
        nwbfile.add_acquisition(dynamic_table)
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestDynamicTable/TestColumn/data"]
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'hdf5'
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': HDF5DatasetIOConfiguration(object_id='15637e3d-7454-4d7a-9c1d-0e400a5476b6', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)})
dataset_configuration = HDF5DatasetIOConfiguration(object_id='15637e3d-7454-4d7a-9c1d-0e400a5476b6', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)
dynamic_table = TestDynamicTable hdmf.common.table.DynamicTable at 0x139817681019792 Fields: colnames: ['TestColumn'] columns: ( TestColumn <class 'hdmf.common.table.VectorData'> ) id: id <class 'hdmf.common.table.ElementIdentifiers'>
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
nwbfile = root pynwb.file.NWBFile at 0x139817687489040 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 44, 98611, tzinfo=tzlocal())] identifier: 364560ac-f43e-4f56-aa8c-f93233f481a9 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_dynamic_table_hdf50')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:99:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
nwbfile = root pynwb.file.NWBFile at 0x139817687489040 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.D...ion_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': HDF5DatasetIOConfigur...000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)})

    def configure_backend(
        nwbfile: NWBFile, backend_configuration: Union[HDF5BackendConfiguration, ZarrBackendConfiguration]
    ) -> None:
        """Configure all datasets specified in the `backend_configuration` with their appropriate DataIO and options."""
        nwbfile_objects = nwbfile.objects
        data_io_class = backend_configuration.data_io_class
        for dataset_configuration in backend_configuration.dataset_configurations.values():
            object_id = dataset_configuration.object_id
            dataset_name = dataset_configuration.dataset_name
            data_io_kwargs = dataset_configuration.get_data_io_kwargs()
            # TODO: update buffer shape in iterator, if present
            nwbfile_object = nwbfile_objects[object_id]
            is_dataset_linked = isinstance(nwbfile_object.fields.get(dataset_name), TimeSeries)
            # Table columns
            if isinstance(nwbfile_object, Data):
>               nwbfile_object.set_data_io(data_io_class=data_io_class, data_io_kwargs=data_io_kwargs)
E               AttributeError: 'VectorData' object has no attribute 'set_data_io'

backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': HDF5DatasetIOConfiguration(object_id='15637e3d-7454-4d7a-9c1d-0e400a5476b6', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)})
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
data_io_kwargs = {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}
dataset_configuration = 
HDF5DatasetIOConfiguration(object_id='15637e3d-7454-4d7a-9c1d-0e400a5476b6', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None) dataset_name = 'data' is_dataset_linked = False nwbfile = root pynwb.file.NWBFile at 0x139817687489040 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 44, 98611, tzinfo=tzlocal())] identifier: 364560ac-f43e-4f56-aa8c-f93233f481a9 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 nwbfile_object = <hdmf.common.table.VectorData object at 0x7f29d735d610> nwbfile_objects = {'15637e3d-7454-4d7a-9c1d-0e400a5476b6': <hdmf.common.table.VectorData object at 0x7f29d735d610>, 'bd48ed04-6d2b-4035-b4dc-e297c780d97d': root pynwb.file.NWBFile at 0x139817687489040 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 44, 98611, tzinfo=tzlocal())] identifier: 364560ac-f43e-4f56-aa8c-f93233f481a9 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 , 'd769f389-c5b5-4536-8ce5-2f94ef78c7af': <hdmf.common.table.ElementIdentifiers object at 0x7f29d735d3d0>, 'f9b72107-72d0-4d4d-bb03-51b8df817406': TestDynamicTable hdmf.common.table.DynamicTable at 0x139817681019792 Fields: colnames: ['TestColumn'] columns: ( TestColumn <class 'hdmf.common.table.VectorData'> ) id: id <class 'hdmf.common.table.ElementIdentifiers'> } object_id = '15637e3d-7454-4d7a-9c1d-0e400a5476b6' ../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:30: AttributeError _____________________________________________________ test_simple_dynamic_table[zarr] ______________________________________________________ tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_dynamic_table_zarr0') integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1...5954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16) backend = 'zarr' @pytest.mark.parametrize("backend", ["hdf5", "zarr"]) def test_simple_dynamic_table(tmpdir: Path, integer_array: np.ndarray, backend: Literal["hdf5", "zarr"]): nwbfile = mock_NWBFile() dynamic_table = DynamicTable( name="TestDynamicTable", description="", columns=[VectorData(name="TestColumn", description="", data=integer_array)], ) nwbfile.add_acquisition(dynamic_table) backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend) dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestDynamicTable/TestColumn/data"] > configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration) backend = 'zarr' backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': ZarrDatasetIOConfiguration(object_id='08d0a7f5-9336-4d79-a55f-cf7fdd51c013', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), 
compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)}, number_of_jobs=11) dataset_configuration = ZarrDatasetIOConfiguration(object_id='08d0a7f5-9336-4d79-a55f-cf7fdd51c013', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None) dynamic_table = TestDynamicTable hdmf.common.table.DynamicTable at 0x139817681022224 Fields: colnames: ['TestColumn'] columns: ( TestColumn <class 'hdmf.common.table.VectorData'> ) id: id <class 'hdmf.common.table.ElementIdentifiers'> integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16) nwbfile = root pynwb.file.NWBFile at 0x139817681014096 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 44, 112537, tzinfo=tzlocal())] identifier: 5db077a3-2af4-4fd0-a1fd-6f3ac6a29956 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_dynamic_table_zarr0') tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ nwbfile = root pynwb.file.NWBFile at 0x139817681014096 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.D...ion_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': ZarrDatasetIOConfigur...4), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)}, number_of_jobs=11) def configure_backend( nwbfile: NWBFile, backend_configuration: Union[HDF5BackendConfiguration, ZarrBackendConfiguration] ) -> None: """Configure all datasets specified in the `backend_configuration` with their appropriate DataIO and options.""" nwbfile_objects = nwbfile.objects data_io_class = backend_configuration.data_io_class for dataset_configuration in backend_configuration.dataset_configurations.values(): object_id = dataset_configuration.object_id dataset_name = dataset_configuration.dataset_name data_io_kwargs = dataset_configuration.get_data_io_kwargs() # TODO: update buffer shape in iterator, if present nwbfile_object = nwbfile_objects[object_id] is_dataset_linked = isinstance(nwbfile_object.fields.get(dataset_name), TimeSeries) # Table columns if isinstance(nwbfile_object, Data): > nwbfile_object.set_data_io(data_io_class=data_io_class, data_io_kwargs=data_io_kwargs) E AttributeError: 'VectorData' object has no attribute 'set_data_io' backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': ZarrDatasetIOConfiguration(object_id='08d0a7f5-9336-4d79-a55f-cf7fdd51c013', 
location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)}, number_of_jobs=11)
data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
data_io_kwargs = {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}
dataset_configuration = ZarrDatasetIOConfiguration(object_id='08d0a7f5-9336-4d79-a55f-cf7fdd51c013', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)
dataset_name = 'data'
is_dataset_linked = False
nwbfile = root pynwb.file.NWBFile at 0x139817681014096 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 44, 112537, tzinfo=tzlocal())] identifier: 5db077a3-2af4-4fd0-a1fd-6f3ac6a29956 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
nwbfile_object = <hdmf.common.table.VectorData object at 0x7f29d735f850>
nwbfile_objects = {'08d0a7f5-9336-4d79-a55f-cf7fdd51c013': <hdmf.common.table.VectorData object at 0x7f29d735f850>, '385aa403-a827-40ac-a8e3-35aa20b3d79e': <hdmf.common.table.ElementIdentifiers object at 0x7f29d7317e90>, '512e749f-22a0-4c95-8388-f7593c3de969': TestDynamicTable hdmf.common.table.DynamicTable at 0x139817681022224 Fields: colnames: ['TestColumn'] columns: ( TestColumn <class 'hdmf.common.table.VectorData'> ) id: id <class 'hdmf.common.table.ElementIdentifiers'> , 'f0ff992e-ac40-4172-8c12-b8237bd3b173': root pynwb.file.NWBFile at 0x139817681014096 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 44, 112537, tzinfo=tzlocal())] identifier: 5db077a3-2af4-4fd0-a1fd-6f3ac6a29956 session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 }
object_id = '08d0a7f5-9336-4d79-a55f-cf7fdd51c013'
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:30: AttributeError
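Both test_simple_dynamic_table failures stop one step earlier than the TimeSeries ones: the isinstance(nwbfile_object, Data) branch in configure_backend (quoted in full above) assumes hdmf's Data subclasses, VectorData included, already provide a set_data_io method, and the hdmf installed here evidently predates it. A quick probe makes the skew visible:

from hdmf.common import VectorData

column = VectorData(name="TestColumn", description="", data=[1, 2, 3])
# Prints False on the hdmf installed in this build; presumably True on
# the newer hdmf that neuroconv 0.4.8 targets.
print(hasattr(column, "set_data_io"))

Taken together with the TypeError cases, both symptoms point at the same packaging-level fix: a tighter lower bound on the hdmf (and hdmf-zarr) dependencies, rather than a patch to neuroconv or its test suite.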
_____________ test_time_series_timestamps_linkage[hdf5-unwrapped-<lambda>-data_iterator_options0-timestamps_iterator_options0] _____________
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_time_series_timestamps_li0')
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1...5954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
case_name = 'unwrapped', iterator = <function <lambda> at 0x7f29da6337e0>, data_iterator_options = {}, timestamps_iterator_options = {}
backend = 'hdf5'

    @pytest.mark.parametrize(
        "case_name,iterator,data_iterator_options,timestamps_iterator_options",
        [
            ("unwrapped", lambda x: x, dict(), dict()),
            ("generic", SliceableDataChunkIterator, dict(), dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000), dict(buffer_size=30_000)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_time_series_timestamps_linkage(
        tmpdir: Path,
        integer_array: np.ndarray,
        case_name: str,
        iterator: callable,
        data_iterator_options: dict,
        timestamps_iterator_options: dict,
        backend: Literal["hdf5", "zarr"],
    ):
        data_1 = iterator(integer_array, **data_iterator_options)
        data_2 = iterator(integer_array, **data_iterator_options)
        timestamps_array = np.linspace(start=0.0, stop=1.0, num=integer_array.shape[0])
        timestamps = iterator(timestamps_array, **timestamps_iterator_options)
        nwbfile = mock_NWBFile()
        time_series_1 = mock_TimeSeries(name="TestTimeSeries1", data=data_1, timestamps=timestamps, rate=None)
        nwbfile.add_acquisition(time_series_1)
        time_series_2 = mock_TimeSeries(name="TestTimeSeries2", data=data_2, timestamps=time_series_1, rate=None)
        nwbfile.add_acquisition(time_series_2)
        # Note that the field will still show up in the configuration display
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        # print(backend_configuration)
        dataset_configuration_1 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries1/data"]
        dataset_configuration_2 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries2/data"]
        timestamps_configuration_1 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries1/timestamps"]
        timestamps_configuration_1 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries2/timestamps"]
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'hdf5'
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries2/data': HDF5DatasetIOConfiguration(object_id='4b6ee92c-f236-43d0-a69c-a827023173e4', location_in_file='acquisition/TestTimeSeries2/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None), 'acquisition/TestTimeSeries2/timestamps': HDF5DatasetIOConfiguration(object_id='4b6ee92c-f236-43d0-a69c-a827023173e4', location_in_file='acquisition/TestTimeSeries2/timestamps', dataset_name='timestamps', dtype=dtype('float64'), full_shape=(150000,), chunk_shape=(150000,), buffer_shape=(150000,), compression_method='gzip', compression_options=None), 'acquisition/TestTimeSeries1/data': HDF5DatasetIOConfiguration(object_id='b7c66efe-9685-44fc-bff1-e177c16aa9d3', location_in_file='acquisition/TestTimeSeries1/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None), 'acquisition/TestTimeSeries1/timestamps': HDF5DatasetIOConfiguration(object_id='b7c66efe-9685-44fc-bff1-e177c16aa9d3', location_in_file='acquisition/TestTimeSeries1/timestamps', dataset_name='timestamps', dtype=dtype('float64'), full_shape=(150000,), chunk_shape=(150000,), buffer_shape=(150000,), compression_method='gzip', compression_options=None)})
case_name = 'unwrapped'
data_1 = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
data_2 = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 
20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16) data_iterator_options = {} dataset_configuration_1 = HDF5DatasetIOConfiguration(object_id='b7c66efe-9685-44fc-bff1-e177c16aa9d3', location_in_file='acquisition/TestTimeSeries1/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None) dataset_configuration_2 = HDF5DatasetIOConfiguration(object_id='4b6ee92c-f236-43d0-a69c-a827023173e4', location_in_file='acquisition/TestTimeSeries2/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None) integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16) iterator = <function <lambda> at 0x7f29da6337e0> nwbfile = root pynwb.file.NWBFile at 0x139817682475600 Fields: acquisition: { TestTimeSeries1 <class 'pynwb.base.TimeSeries'>, TestTimeSeries2 <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 44, 126464, tzinfo=tzlocal())] identifier: 1344c48d-b7b3-44a3-8154-3a8ecde3919c session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 time_series_1 = TestTimeSeries1 pynwb.base.TimeSeries at 0x139817682469072 Fields: comments: no comments conversion: 1.0 data: [[ 606 22977 27598 ... 21453 14831 29962] [-26530 -9155 -6666 ... 18490 -6943 1704] [ 10727 16504 20858 ... -14473 23537 17539] ... [ 20291 10140 26729 ... -5514 8882 19710] [ 22656 25954 -21319 ... -8983 -30074 -24446] [-30841 -12815 28599 ... 24069 -15762 -3284]] description: no description interval: 1 offset: 0.0 resolution: -1.0 timestamp_link: ( TestTimeSeries2 <class 'pynwb.base.TimeSeries'> ) timestamps: [0.00000000e+00 6.66671111e-06 1.33334222e-05 ... 9.99986667e-01 9.99993333e-01 1.00000000e+00] timestamps_unit: seconds unit: volts time_series_2 = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817682478928 Fields: comments: no comments conversion: 1.0 data: [[ 606 22977 27598 ... 21453 14831 29962] [-26530 -9155 -6666 ... 18490 -6943 1704] [ 10727 16504 20858 ... -14473 23537 17539] ... [ 20291 10140 26729 ... -5514 8882 19710] [ 22656 25954 -21319 ... -8983 -30074 -24446] [-30841 -12815 28599 ... 24069 -15762 -3284]] description: no description interval: 1 offset: 0.0 resolution: -1.0 timestamps: TestTimeSeries1 pynwb.base.TimeSeries at 0x139817682469072 Fields: comments: no comments conversion: 1.0 data: [[ 606 22977 27598 ... 21453 14831 29962] [-26530 -9155 -6666 ... 18490 -6943 1704] [ 10727 16504 20858 ... -14473 23537 17539] ... [ 20291 10140 26729 ... -5514 8882 19710] [ 22656 25954 -21319 ... -8983 -30074 -24446] [-30841 -12815 28599 ... 24069 -15762 -3284]] description: no description interval: 1 offset: 0.0 resolution: -1.0 timestamp_link: ( TestTimeSeries2 <class 'pynwb.base.TimeSeries'> ) timestamps: [0.00000000e+00 6.66671111e-06 1.33334222e-05 ... 
9.99986667e-01 9.99993333e-01 1.00000000e+00]
      timestamps_unit: seconds
      unit: volts
    timestamps_unit: seconds
    unit: volts
timestamps = array([0.00000000e+00, 6.66671111e-06, 1.33334222e-05, ..., 9.99986667e-01, 9.99993333e-01, 1.00000000e+00])
timestamps_array = array([0.00000000e+00, 6.66671111e-06, 1.33334222e-05, ..., 9.99986667e-01, 9.99993333e-01, 1.00000000e+00])
timestamps_configuration_1 = HDF5DatasetIOConfiguration(object_id='4b6ee92c-f236-43d0-a69c-a827023173e4', location_in_file='acquisition/TestTimeSeries2/timestamps', dataset_name='timestamps', dtype=dtype('float64'), full_shape=(150000,), chunk_shape=(150000,), buffer_shape=(150000,), compression_method='gzip', compression_options=None)
timestamps_iterator_options = {}
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_time_series_timestamps_li0')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:159:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries2/data': HDF5DatasetIOConfiguration(object_id='4b6ee92c-f236-43d0-a69c-a827023173e4', ..., full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None), 'acquisition/TestTimeSeries2/timestamps': HDF5DatasetIOConfiguration(object_id='4b6ee92c-...', ..., full_shape=(150000,), chunk_shape=(150000,), ...), 'acquisition/TestTimeSeries1/data': HDF5DatasetIOConfiguration(object_id='b7c66efe-9685-44fc-bff1-e177c16aa9d3', ...), 'acquisition/TestTimeSeries1/timestamps': HDF5DatasetIOConfiguration(object_id='b7c66efe-...', ...)})
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
data_io_kwargs = {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}
dataset_configuration = HDF5DatasetIOConfiguration(object_id='4b6ee92c-f236-43d0-a69c-a827023173e4', location_in_file='acquisition/TestTimeSeries2/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)
dataset_name = 'data'
is_dataset_linked = False
nwbfile = root pynwb.file.NWBFile at 0x139817682475600
  Fields:
    acquisition: { TestTimeSeries1 <class 'pynwb.base.TimeSeries'>, TestTimeSeries2 <class 'pynwb.base.TimeSeries'> }
    file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 44, 126464, tzinfo=tzlocal())]
    identifier: 1344c48d-b7b3-44a3-8154-3a8ecde3919c
    session_description: session_description
    session_start_time: 1970-01-01 00:00:00-05:00
    timestamps_reference_time: 1970-01-01 00:00:00-05:00
nwbfile_object = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817682478928
nwbfile_objects = {'4b6ee92c-f236-43d0-a69c-a827023173e4': TestTimeSeries2 pynwb.base.TimeSeries at 0x139817682478928, '5232a662-d40c-4f1d-9a23-2105e2a0b90c': root pynwb.file.NWBFile at 0x139817682475600, 'b7c66efe-9685-44fc-bff1-e177c16aa9d3': TestTimeSeries1 pynwb.base.TimeSeries at 0x139817682469072}
object_id = '4b6ee92c-f236-43d0-a69c-a827023173e4'
/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], ..., [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
dataset_name = 'data'
kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
self = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817682478928
/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
_check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29dc9f9bc0>
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d74c0390>,)
func = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
kwargs = {'data': array([...], dtype=int16), 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def _check_args(args, kwargs):
        """Parse and check arguments to decorated function.
        Raise warnings and errors as appropriate."""
        # this function was separated from func_call() in order to make stepping through lines of code using pdb
        # easier
        parsed = __parse_args(
            loc_val,
            args[1:] if is_method else args,
            kwargs,
            enforce_type=enforce_type,
            enforce_shape=enforce_shape,
            allow_extra=allow_extra,
            allow_positional=allow_positional
        )
        parse_warnings = parsed.get('future_warnings')
        if parse_warnings:
            msg = '%s: %s' % (func.__qualname__, ', '.join(parse_warnings))
            warnings.warn(msg, FutureWarning)
        for error_type, ExceptionType in (('type_errors', TypeError), ('value_errors', ValueError),
                                          ('syntax_errors', SyntaxError)):
            parse_err = parsed.get(error_type)
            if parse_err:
                msg = '%s: %s' % (func.__qualname__, ', '.join(parse_err))
>               raise ExceptionType(msg)
E       TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'

ExceptionType = <class 'TypeError'>
allow_extra = False
allow_positional = True
enforce_shape = True
enforce_type = True
error_type = 'type_errors'
func = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
is_method = True
loc_val = [{'name': 'data', 'type': (<class 'numpy.ndarray'>, <class 'list'>, <class 'tuple'>, <class 'h5py._hl.dataset.Dataset'>, <class 'collections.abc.Iterable'>), 'default': None, ...}, {'name': 'maxshape', ...}, {'name': 'chunks', ...}, {'name': 'compression', ...}, {'name': 'compression_opts', ...}, {'name': 'fillvalue', ...}, {'name': 'shuffle', ...}, {'name': 'fletcher32', ...}, {'name': 'link_data', ...}, {'name': 'allow_plugin_filters', ...}, {'name': 'shape', ...}, {'name': 'dtype', ...}]
msg = "H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed = {'args': {'allow_plugin_filters': False, 'chunks': None, 'compression': None, 'compression_opts': None, 'data': array([...], dtype=int16), 'dtype': None, 'fillvalue': None, 'fletcher32': None, 'link_data': False, 'maxshape': None, 'shape': None, 'shuffle': None}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}
/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
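Every failure in this group shares one mechanism: hdmf validates constructor arguments with its docval decorator, and the installed H5DataIO.__init__ does not declare a 'data_io_kwargs' argument, so docval rejects it before the constructor body runs. A minimal sketch of just that mechanism, assuming only the public hdmf H5DataIO constructor (the array and option values here are placeholders, not the test's):

    import numpy as np
    from hdmf.backends.hdf5.h5_utils import H5DataIO

    data = np.zeros((10, 3), dtype=np.int16)

    # Keyword arguments declared in the docval spec pass validation:
    wrapped = H5DataIO(data=data, chunks=(5, 3), compression="gzip")

    # An undeclared keyword is rejected up front, yielding the same error
    # as the traceback above:
    try:
        H5DataIO(data=data, data_io_kwargs={"chunks": (5, 3), "compression": "gzip"})
    except TypeError as error:
        print(error)  # H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'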
_____ test_time_series_timestamps_linkage[hdf5-generic-SliceableDataChunkIterator-data_iterator_options1-timestamps_iterator_options1] _____

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_time_series_timestamps_li1')
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1...5954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
case_name = 'generic', iterator = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>, data_iterator_options = {}
timestamps_iterator_options = {}, backend = 'hdf5'

    @pytest.mark.parametrize(
        "case_name,iterator,data_iterator_options,timestamps_iterator_options",
        [
            ("unwrapped", lambda x: x, dict(), dict()),
            ("generic", SliceableDataChunkIterator, dict(), dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000), dict(buffer_size=30_000)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_time_series_timestamps_linkage(
        tmpdir: Path,
        integer_array: np.ndarray,
        case_name: str,
        iterator: callable,
        data_iterator_options: dict,
        timestamps_iterator_options: dict,
        backend: Literal["hdf5", "zarr"],
    ):
        data_1 = iterator(integer_array, **data_iterator_options)
        data_2 = iterator(integer_array, **data_iterator_options)
        timestamps_array = np.linspace(start=0.0, stop=1.0, num=integer_array.shape[0])
        timestamps = iterator(timestamps_array, **timestamps_iterator_options)
        nwbfile = mock_NWBFile()
        time_series_1 = mock_TimeSeries(name="TestTimeSeries1", data=data_1, timestamps=timestamps, rate=None)
        nwbfile.add_acquisition(time_series_1)
        time_series_2 = mock_TimeSeries(name="TestTimeSeries2", data=data_2, timestamps=time_series_1, rate=None)
        nwbfile.add_acquisition(time_series_2)
        # Note that the field will still show up in the configuration display
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        # print(backend_configuration)
        dataset_configuration_1 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries1/data"]
        dataset_configuration_2 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries2/data"]
        timestamps_configuration_1 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries1/timestamps"]
        timestamps_configuration_1 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries2/timestamps"]
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'hdf5'
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries2/data': HDF5DatasetIOConfiguration(object_id='e84c2dc8-ffbd-4a86-9e7f-e9a8d2c1b481', location_in_file='acquisition/TestTimeSeries2/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None), 'acquisition/TestTimeSeries2/timestamps': HDF5DatasetIOConfiguration(object_id='e84c2dc8-...', ..., full_shape=(150000,), chunk_shape=(150000,), ...), 'acquisition/TestTimeSeries1/data': HDF5DatasetIOConfiguration(object_id='7734bf45-1a5b-43ab-9583-51b4b19b5dee', ...), 'acquisition/TestTimeSeries1/timestamps': HDF5DatasetIOConfiguration(object_id='7734bf45-...', ...)})
case_name = 'generic'
data_1 = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d77f6e10>
data_2 = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d77f5a10>
data_iterator_options = {}
dataset_configuration_1 = HDF5DatasetIOConfiguration(object_id='7734bf45-1a5b-43ab-9583-51b4b19b5dee', location_in_file='acquisition/TestTimeSeries1/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)
dataset_configuration_2 = HDF5DatasetIOConfiguration(object_id='e84c2dc8-ffbd-4a86-9e7f-e9a8d2c1b481', location_in_file='acquisition/TestTimeSeries2/data', ...)
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], ..., [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
iterator = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>
nwbfile = root pynwb.file.NWBFile at 0x139817685840784 (identifier: b529a4af-4158-47e2-93b0-efc95d56afdd; acquisition: TestTimeSeries1, TestTimeSeries2)
time_series_1 = TestTimeSeries1 pynwb.base.TimeSeries at 0x139817685829008 (data: <SliceableDataChunkIterator at 0x7f29d77f6e10>, timestamps: <SliceableDataChunkIterator at 0x7f29d77f50d0>, timestamp_link: TestTimeSeries2)
time_series_2 = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817685839248 (data: <SliceableDataChunkIterator at 0x7f29d77f5a10>, timestamps: TestTimeSeries1)
timestamps = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d77f50d0>
timestamps_array = array([0.00000000e+00, 6.66671111e-06, 1.33334222e-05, ..., 9.99986667e-01, 9.99993333e-01, 1.00000000e+00])
timestamps_configuration_1 = HDF5DatasetIOConfiguration(object_id='e84c2dc8-ffbd-4a86-9e7f-e9a8d2c1b481', location_in_file='acquisition/TestTimeSeries2/timestamps', dataset_name='timestamps', dtype=dtype('float64'), full_shape=(150000,), chunk_shape=(150000,), buffer_shape=(150000,), compression_method='gzip', compression_options=None)
timestamps_iterator_options = {}
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_time_series_timestamps_li1')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:159:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
data_io_kwargs = {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}
dataset_configuration = HDF5DatasetIOConfiguration(object_id='e84c2dc8-ffbd-4a86-9e7f-e9a8d2c1b481', location_in_file='acquisition/TestTimeSeries2/data', ...)
dataset_name = 'data'
is_dataset_linked = False
nwbfile_object = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817685839248
nwbfile_objects = {'7734bf45-1a5b-43ab-9583-51b4b19b5dee': TestTimeSeries1 pynwb.base.TimeSeries at 0x139817685829008, 'c48ae3a5-3596-48eb-8e54-845cfe65eb92': root pynwb.file.NWBFile at 0x139817685840784, 'e84c2dc8-ffbd-4a86-9e7f-e9a8d2c1b481': TestTimeSeries2 pynwb.base.TimeSeries at 0x139817685839248}
object_id = 'e84c2dc8-ffbd-4a86-9e7f-e9a8d2c1b481'
/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d77f5a10>
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
dataset_name = 'data'
kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
self = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817685839248
/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d77f6b90>,)
func = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
kwargs = {'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d77f5a10>, 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def _check_args(args, kwargs):
        """Parse and check arguments to decorated function.
        Raise warnings and errors as appropriate."""
        ...
>               raise ExceptionType(msg)
E       TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'

msg = "H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err = ["unrecognized argument: 'data_io_kwargs'"]
/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
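The two inner frames of the traceback pinpoint a calling-convention skew between neuroconv 0.4.8 and the installed dev-python/hdmf: _configure_backend.py:33 passes the whole options dictionary as a single data_io_kwargs keyword, while hdmf/container.py:746 forwards every extra keyword of set_data_io verbatim into the DataIO constructor. A runnable stand-in for the installed behaviour (set_data_io_old_style is our illustrative name, and the plain fields dict replaces the real container lookup):

    from hdmf.backends.hdf5.h5_utils import H5DataIO

    def set_data_io_old_style(fields, dataset_name, data_io_class, **kwargs):
        # Mirrors hdmf/container.py:746: kwargs still holds the unexpanded
        # {'data_io_kwargs': {...}}, which travels into the constructor.
        data = fields[dataset_name]
        fields[dataset_name] = data_io_class(data=data, **kwargs)

    fields = {"data": [[1, 2], [3, 4]]}
    try:
        # The call shape neuroconv 0.4.8 uses, per the frame locals above:
        set_data_io_old_style(
            fields, "data", H5DataIO,
            data_io_kwargs={"chunks": (1, 2), "compression": "gzip"},
        )
    except TypeError as error:
        print(error)  # H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'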
_________ test_time_series_timestamps_linkage[hdf5-classic-DataChunkIterator-data_iterator_options2-timestamps_iterator_options2] __________

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_time_series_timestamps_li2')
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], ..., [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
case_name = 'classic', iterator = <class 'hdmf.data_utils.DataChunkIterator'>
data_iterator_options = {'buffer_size': 30000, 'iter_axis': 1}, timestamps_iterator_options = {'buffer_size': 30000}, backend = 'hdf5'

    @pytest.mark.parametrize(
        "case_name,iterator,data_iterator_options,timestamps_iterator_options",
        ...
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_time_series_timestamps_linkage(...):
        ...
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'hdf5'
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries2/data': HDF5DatasetIOConfiguration(object_id='33ae2664-69ca-461a-8aaa-bf593fd821a7', ..., full_shape=(150000, 384), chunk_shape=(44070, 113), compression_method='gzip', compression_options=None), 'acquisition/TestTimeSeries2/timestamps': HDF5DatasetIOConfiguration(object_id='33ae2664-...', ...), 'acquisition/TestTimeSeries1/data': HDF5DatasetIOConfiguration(object_id='cdfc095f-23d5-4f24-ace5-ecc5e96f4310', ...), 'acquisition/TestTimeSeries1/timestamps': HDF5DatasetIOConfiguration(object_id='cdfc095f-...', ...)})
case_name = 'classic'
data_1 = <hdmf.data_utils.DataChunkIterator object at 0x7f29d787e710>
data_2 = <hdmf.data_utils.DataChunkIterator object at 0x7f29d787ef10>
data_iterator_options = {'buffer_size': 30000, 'iter_axis': 1}
dataset_configuration_1 = HDF5DatasetIOConfiguration(object_id='cdfc095f-23d5-4f24-ace5-ecc5e96f4310', location_in_file='acquisition/TestTimeSeries1/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None)
dataset_configuration_2 = HDF5DatasetIOConfiguration(object_id='33ae2664-69ca-461a-8aaa-bf593fd821a7', location_in_file='acquisition/TestTimeSeries2/data', ...)
iterator = <class 'hdmf.data_utils.DataChunkIterator'>
nwbfile = root pynwb.file.NWBFile at 0x139817686394320 (identifier: 71e29bef-a18c-48da-85ea-3ab3a1e5dc40; acquisition: TestTimeSeries1, TestTimeSeries2)
time_series_1 = TestTimeSeries1 pynwb.base.TimeSeries at 0x139817686394768 (data: <DataChunkIterator at 0x7f29d787e710>, timestamps: <DataChunkIterator at 0x7f29d787e510>, timestamp_link: TestTimeSeries2)
time_series_2 = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817686397200 (data: <DataChunkIterator at 0x7f29d787ef10>, timestamps: TestTimeSeries1)
timestamps = <hdmf.data_utils.DataChunkIterator object at 0x7f29d787e510>
timestamps_array = array([0.00000000e+00, 6.66671111e-06, 1.33334222e-05, ..., 9.99986667e-01, 9.99993333e-01, 1.00000000e+00])
timestamps_configuration_1 = HDF5DatasetIOConfiguration(object_id='33ae2664-69ca-461a-8aaa-bf593fd821a7', location_in_file='acquisition/TestTimeSeries2/timestamps', ...)
timestamps_iterator_options = {'buffer_size': 30000}
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_time_series_timestamps_li2')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:159:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
data_io_kwargs = {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}
dataset_name = 'data'
is_dataset_linked = False
nwbfile_object = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817686397200
nwbfile_objects = {'33ae2664-69ca-461a-8aaa-bf593fd821a7': TestTimeSeries2 pynwb.base.TimeSeries at 0x139817686397200, '893ca8b3-504d-4004-bf0f-d3489beda75c': root pynwb.file.NWBFile at 0x139817686394320, 'cdfc095f-23d5-4f24-ace5-ecc5e96f4310': TestTimeSeries1 pynwb.base.TimeSeries at 0x139817686394768}
object_id = '33ae2664-69ca-461a-8aaa-bf593fd821a7'
/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data = <hdmf.data_utils.DataChunkIterator object at 0x7f29d787ef10>
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
dataset_name = 'data'
kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
self = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817686397200
/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
args = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d787d290>,)
func = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
kwargs = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d787ef10>, 'data_io_kwargs': {'chunks': (44070, 113), 'compression': 'gzip', 'compression_opts': None}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def _check_args(args, kwargs):
        """Parse and check arguments to decorated function.
        Raise warnings and errors as appropriate."""
        ...
>               raise ExceptionType(msg)
E       TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'

msg = "H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err = ["unrecognized argument: 'data_io_kwargs'"]
/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
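The zarr parametrization that follows fails identically through hdmf_zarr.utils.ZarrDataIO, so the breakage is backend-independent: the installed dev-python/hdmf appears to predate a Container.set_data_io that accepts a data_io_kwargs argument, which neuroconv 0.4.8 assumes. A hypothetical shim (set_data_io_compat is our name, not an hdmf or neuroconv API) shows the two calling conventions side by side; in the ebuild, the practical fix would be depending on an hdmf release whose set_data_io understands data_io_kwargs:

    def set_data_io_compat(container, dataset_name, data_io_class, data_io_kwargs):
        """Bridge Container.set_data_io across hdmf releases (hypothetical helper)."""
        try:
            # Newer hdmf unpacks the dictionary itself:
            container.set_data_io(dataset_name, data_io_class, data_io_kwargs=data_io_kwargs)
        except TypeError:
            # The installed hdmf forwards extra keywords verbatim, so the options
            # must be spread out for data_io_class(data=..., **kwargs):
            container.set_data_io(dataset_name, data_io_class, **data_io_kwargs)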
_____________ test_time_series_timestamps_linkage[zarr-unwrapped-<lambda>-data_iterator_options0-timestamps_iterator_options0] _____________

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_time_series_timestamps_li3')
integer_array = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], ..., [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
case_name = 'unwrapped', iterator = <function <lambda> at 0x7f29da6337e0>, data_iterator_options = {}, timestamps_iterator_options = {}
backend = 'zarr'

    @pytest.mark.parametrize(
        "case_name,iterator,data_iterator_options,timestamps_iterator_options",
        ...
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_time_series_timestamps_linkage(...):
        ...
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'zarr'
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries2/data': ZarrDatasetIOConfiguration(object_id='5f073065-afb3-4832-a342-a1ae7a14de62', location_in_file='acquisition/TestTimeSeries2/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None), 'acquisition/TestTimeSeries2/timestamps': ZarrDatasetIOConfiguration(object_id='5f073065-...', ..., full_shape=(150000,), chunk_shape=(150000,), ...), 'acquisition/TestTimeSeries1/data': ZarrDatasetIOConfiguration(object_id='c7e4e8e9-7011-4ae5-b87d-69d6934e4b38', ...), 'acquisition/TestTimeSeries1/timestamps': ZarrDatasetIOConfiguration(object_id='c7e4e8e9-...', ...)}, number_of_jobs=11)
case_name = 'unwrapped'
data_1 = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], ..., [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
data_2 = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], ..., [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
data_iterator_options = {}
dataset_configuration_1 = ZarrDatasetIOConfiguration(object_id='c7e4e8e9-7011-4ae5-b87d-69d6934e4b38', location_in_file='acquisition/TestTimeSeries1/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)
dataset_configuration_2 = ZarrDatasetIOConfiguration(object_id='5f073065-afb3-4832-a342-a1ae7a14de62', location_in_file='acquisition/TestTimeSeries2/data', ...)
iterator = <function <lambda> at 0x7f29da6337e0>
nwbfile = root pynwb.file.NWBFile at 0x139817689956432 (identifier: e5088e07-00c0-4fca-b279-d170be429381; acquisition: TestTimeSeries1, TestTimeSeries2)
time_series_1 = TestTimeSeries1 pynwb.base.TimeSeries at 0x139817689968336 (data: int16 array, timestamps array, timestamp_link: TestTimeSeries2)
time_series_2 = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817689961424 (data: int16 array, timestamps: TestTimeSeries1)
timestamps = array([0.00000000e+00, 6.66671111e-06, 1.33334222e-05, ..., 9.99986667e-01, 9.99993333e-01, 1.00000000e+00])
timestamps_array = array([0.00000000e+00, 6.66671111e-06, 1.33334222e-05, ..., 9.99986667e-01, 9.99993333e-01, 1.00000000e+00])
timestamps_configuration_1 = ZarrDatasetIOConfiguration(object_id='5f073065-afb3-4832-a342-a1ae7a14de62', location_in_file='acquisition/TestTimeSeries2/timestamps', dataset_name='timestamps', dtype=dtype('float64'), full_shape=(150000,), chunk_shape=(150000,), buffer_shape=(150000,), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)
timestamps_iterator_options = {}
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_time_series_timestamps_li3')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:159:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
data_io_kwargs = {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}
dataset_configuration = ZarrDatasetIOConfiguration(object_id='5f073065-afb3-4832-a342-a1ae7a14de62', location_in_file='acquisition/TestTimeSeries2/data', ...)
dataset_name = 'data'
is_dataset_linked = False
nwbfile_object = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817689961424
nwbfile_objects = {'1243363d-0a61-4f40-a535-6f9fcfa5190f': root pynwb.file.NWBFile at 0x139817689956432, '5f073065-afb3-4832-a342-a1ae7a14de62': TestTimeSeries2 pynwb.base.TimeSeries at 0x139817689961424, 'c7e4e8e9-7011-4ae5-b87d-69d6934e4b38': TestTimeSeries1 pynwb.base.TimeSeries at 0x139817689968336}
object_id = '5f073065-afb3-4832-a342-a1ae7a14de62'
/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data = array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], ..., [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16)
data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
dataset_name = 'data'
kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}}
self = TestTimeSeries2 pynwb.base.TimeSeries at 0x139817689961424
  Fields:
    comments: no comments
    conversion: 1.0
    data: [[ 606 22977 27598 ... 21453 14831 29962] ... [-30841 -12815 28599 ... 24069 -15762 -3284]]
    description: no description
    interval: 1
    offset: 0.0
    resolution: -1.0
    timestamps: TestTimeSeries1 pynwb.base.TimeSeries at 0x139817689968336
      Fields:
        comments: no comments
        conversion: 1.0
        data: [[ 606 22977 27598 ... 21453 14831 29962] ... [-30841 -12815 28599 ... 24069 -15762 -3284]]
        description: no description
        interval: 1
        offset: 0.0
        resolution: -1.0
        timestamp_link: ( TestTimeSeries2 <class 'pynwb.base.TimeSeries'> )
        timestamps: [0.00000000e+00 6.66671111e-06 1.33334222e-05 ...
9.99986667e-01 9.99993333e-01 1.00000000e+00] timestamps_unit: seconds unit: volts timestamps_unit: seconds unit: volts /usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call pargs = _check_args(args, kwargs) _check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29db1577e0> args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d7be69d0>,) func = <function ZarrDataIO.__init__ at 0x7f29db157740> kwargs = {'data': array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16), 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}} _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d7be69d0>,) kwargs = {'data': array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -...5762, -3284]], dtype=int16), 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}} def _check_args(args, kwargs): """Parse and check arguments to decorated function. Raise warnings and errors as appropriate.""" # this function was separated from func_call() in order to make stepping through lines of code using pdb # easier parsed = __parse_args( loc_val, args[1:] if is_method else args, kwargs, enforce_type=enforce_type, enforce_shape=enforce_shape, allow_extra=allow_extra, allow_positional=allow_positional ) parse_warnings = parsed.get('future_warnings') if parse_warnings: msg = '%s: %s' % (func.__qualname__, ', '.join(parse_warnings)) warnings.warn(msg, FutureWarning) for error_type, ExceptionType in (('type_errors', TypeError), ('value_errors', ValueError), ('syntax_errors', SyntaxError)): parse_err = parsed.get(error_type) if parse_err: msg = '%s: %s' % (func.__qualname__, ', '.join(parse_err)) > raise ExceptionType(msg) E TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs' ExceptionType = <class 'TypeError'> allow_extra = False allow_positional = True args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d7be69d0>,) enforce_shape = True enforce_type = True error_type = 'type_errors' func = <function ZarrDataIO.__init__ at 0x7f29db157740> is_method = True kwargs = {'data': array([[ 606, 22977, 27598, ..., 21453, 14831, 29962], [-26530, -9155, -6666, ..., 18490, -6943, 1704], [ 10727, 16504, 20858, ..., -14473, 23537, 17539], ..., [ 20291, 10140, 26729, ..., -5514, 8882, 19710], [ 22656, 25954, -21319, ..., -8983, -30074, -24446], [-30841, -12815, 28599, ..., 24069, -15762, -3284]], dtype=int16), 'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}} loc_val = [{'doc': 'the data to be written. 
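All of these failures share one mechanism, so it is worth spelling out once. configure_backend() hands Container.set_data_io() a data_io_kwargs dict, but the hdmf installed in this environment still has the older set_data_io() visible at container.py:746 above, which forwards every extra keyword verbatim into the DataIO constructor. ZarrDataIO's docval spec (the loc_val dump) only recognizes data, chunks, fillvalue, compressor, filters, and link_data, so the whole dict arrives as one unknown keyword and validation raises the TypeError. A minimal sketch of the mismatch, using only what the traceback shows (array shape and options copied from the locals; this is an illustration, not neuroconv's code):

    import numpy as np
    from numcodecs import GZip
    from hdmf_zarr.utils import ZarrDataIO

    data = np.zeros((150_000, 384), dtype="int16")

    # What the old-style set_data_io() effectively executes here: the options
    # dict lands in ZarrDataIO.__init__ as a single unrecognized keyword and
    # hdmf's docval validation raises TypeError.
    try:
        ZarrDataIO(data=data, data_io_kwargs={"chunks": (44070, 113), "compressor": GZip(level=1)})
    except TypeError as error:
        print(error)  # ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'

    # What ZarrDataIO actually accepts, per its docval spec: the same options
    # unpacked into named parameters.
    wrapped = ZarrDataIO(data=data, chunks=(44070, 113), compressor=GZip(level=1))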
_____ test_time_series_timestamps_linkage[zarr-generic-SliceableDataChunkIterator-data_iterator_options1-timestamps_iterator_options1] _____

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_time_series_timestamps_li4')
integer_array = array([[   606,  22977,  27598, ...,  21453,  14831,  29962], ..., [-30841, -12815,  28599, ...,  24069, -15762,  -3284]], dtype=int16)
case_name = 'generic', iterator = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>, data_iterator_options = {}
timestamps_iterator_options = {}, backend = 'zarr'

    @pytest.mark.parametrize(
        "case_name,iterator,data_iterator_options,timestamps_iterator_options",
        [
            ("unwrapped", lambda x: x, dict(), dict()),
            ("generic", SliceableDataChunkIterator, dict(), dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000), dict(buffer_size=30_000)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_time_series_timestamps_linkage(
        tmpdir: Path,
        integer_array: np.ndarray,
        case_name: str,
        iterator: callable,
        data_iterator_options: dict,
        timestamps_iterator_options: dict,
        backend: Literal["hdf5", "zarr"],
    ):
        data_1 = iterator(integer_array, **data_iterator_options)
        data_2 = iterator(integer_array, **data_iterator_options)

        timestamps_array = np.linspace(start=0.0, stop=1.0, num=integer_array.shape[0])
        timestamps = iterator(timestamps_array, **timestamps_iterator_options)

        nwbfile = mock_NWBFile()
        time_series_1 = mock_TimeSeries(name="TestTimeSeries1", data=data_1, timestamps=timestamps, rate=None)
        nwbfile.add_acquisition(time_series_1)
        time_series_2 = mock_TimeSeries(name="TestTimeSeries2", data=data_2, timestamps=time_series_1, rate=None)
        nwbfile.add_acquisition(time_series_2)

        # Note that the field will still show up in the configuration display
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        # print(backend_configuration)

        dataset_configuration_1 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries1/data"]
        dataset_configuration_2 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries2/data"]
        timestamps_configuration_1 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries1/timestamps"]
        timestamps_configuration_1 = backend_configuration.dataset_configurations["acquisition/TestTimeSeries2/timestamps"]
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:159:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
        data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
        data_io_kwargs = {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}
        dataset_configuration = ZarrDatasetIOConfiguration(object_id='4598ed0e-be1c-47a9-a365-45e92acfe66e', location_in_file='acquisition/TestTimeSeries2/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)
        dataset_name = 'data'
        object_id = '4598ed0e-be1c-47a9-a365-45e92acfe66e'
/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
        data = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d73aa090>
        kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}}
/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
>               raise ExceptionType(msg)
E               TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'

/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
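The frame at hdmf/container.py:746 pins the root cause as version skew rather than a bug in the tests: neuroconv 0.4.8 calls set_data_io() with a dedicated data_io_kwargs dict, an interface that newer hdmf releases unpack themselves, while the hdmf installed here treats it as an opaque constructor keyword. Roughly, neuroconv assumes behaviour like the following sketch (an illustrative paraphrase of the newer interface, not hdmf's verbatim source); upgrading the installed dev-python/hdmf to a release whose set_data_io() accepts data_io_kwargs should make these tests pass:

    # Sketch of the newer-style Container.set_data_io() that neuroconv 0.4.8
    # assumes (illustrative paraphrase, not hdmf's actual source).
    def set_data_io(self, dataset_name, data_io_class, data_io_kwargs=None):
        data = self.fields.get(dataset_name)
        # Unpack the per-backend options so the DataIO constructor receives
        # chunks=..., compressor=..., filters=... as named parameters instead
        # of one unrecognized 'data_io_kwargs' keyword.
        self.fields[dataset_name] = data_io_class(data=data, **(data_io_kwargs or {}))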
_________ test_time_series_timestamps_linkage[zarr-classic-DataChunkIterator-data_iterator_options2-timestamps_iterator_options2] __________

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_time_series_timestamps_li5')
integer_array = array([[   606,  22977,  27598, ...,  21453,  14831,  29962], ..., [-30841, -12815,  28599, ...,  24069, -15762,  -3284]], dtype=int16)
case_name = 'classic', iterator = <class 'hdmf.data_utils.DataChunkIterator'>
data_iterator_options = {'buffer_size': 30000, 'iter_axis': 1}, timestamps_iterator_options = {'buffer_size': 30000}, backend = 'zarr'

    [test body identical to the previous failure]
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py:159:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
        data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
        data_io_kwargs = {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}
        dataset_configuration = ZarrDatasetIOConfiguration(object_id='db544803-9797-47ca-b6bf-ff9d2027da9a', location_in_file='acquisition/TestTimeSeries2/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(44070, 113), buffer_shape=(150000, 384), compression_method='gzip', compression_options=None, filter_methods=None, filter_options=None)
        dataset_name = 'data'
        object_id = 'db544803-9797-47ca-b6bf-ff9d2027da9a'
/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
        data = <hdmf.data_utils.DataChunkIterator object at 0x7f29d79b6e10>
        kwargs = {'data_io_kwargs': {'chunks': (44070, 113), 'compressor': GZip(level=1), 'filters': None}}
/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
>               raise ExceptionType(msg)
E               TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'

/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
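For orientation on the parametrization: the three cases feed configure_backend() the same int16 array either unwrapped, wrapped in neuroconv's SliceableDataChunkIterator, or wrapped in hdmf's classic DataChunkIterator (which needs an explicit buffer_size, per the comment in the test). All three fail identically because set_data_io() raises before any data is read. (Incidentally, the test assigns timestamps_configuration_1 twice in a row; the second lookup presumably meant timestamps_configuration_2, though that is unrelated to the failure.) A usage sketch of the three wrappings, with the array shape taken from the locals above (placeholder zeros, not the test's actual fixture data):

    import numpy as np
    from hdmf.data_utils import DataChunkIterator
    from neuroconv.tools.hdmf import SliceableDataChunkIterator

    integer_array = np.zeros((150_000, 384), dtype="int16")

    unwrapped = integer_array                             # "unwrapped" case
    generic = SliceableDataChunkIterator(integer_array)   # "generic" case
    classic = DataChunkIterator(                          # "classic" case
        integer_array,
        iter_axis=1,         # iterate along axis 1 (channels) rather than samples
        buffer_size=30_000,  # hardcoded: the default buffering is far too slow here
    )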
_______________________________ test_simple_time_series_override[hdf5-unwrapped-<lambda>-iterator_options0] ________________________________

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri0')
case_name = 'unwrapped', iterator = <function <lambda> at 0x7f29da633ec0>, iterator_options = {}, backend = 'hdf5'

    @pytest.mark.parametrize(
        "case_name,iterator,iterator_options",
        [
            ("unwrapped", lambda x: x, dict()),
            ("generic", SliceableDataChunkIterator, dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000 * 5)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_time_series_override(
        tmpdir: Path, case_name: str, iterator: callable, iterator_options: dict, backend: Literal["hdf5", "zarr"]
    ):
        array = np.zeros(shape=(30_000 * 5, 384), dtype="int16")
        data = iterator(array, **iterator_options)

        nwbfile = mock_NWBFile()
        time_series = mock_TimeSeries(name="TestTimeSeries", data=data)
        nwbfile.add_acquisition(time_series)

        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestTimeSeries/data"]

        smaller_chunk_shape = (30_000, 64)
        smaller_buffer_shape = (60_000, 192)
        dataset_configuration.chunk_shape = smaller_chunk_shape
        dataset_configuration.buffer_shape = smaller_buffer_shape

        higher_gzip_level = 5
        if backend == "hdf5":
            dataset_configuration.compression_options = dict(level=higher_gzip_level)
        elif backend == "zarr":
            dataset_configuration.compression_options = dict(level=higher_gzip_level)

>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py:56:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
        data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
        data_io_kwargs = {'chunks': (30000, 64), 'compression': 'gzip', 'compression_opts': 5}
        dataset_configuration = HDF5DatasetIOConfiguration(object_id='cedbc68f-fdc2-44c7-98f6-5eea02db404e', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5})
        dataset_name = 'data'
        object_id = 'cedbc68f-fdc2-44c7-98f6-5eea02db404e'
/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
        data = array([[0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0]], dtype=int16)
        kwargs = {'data_io_kwargs': {'chunks': (30000, 64), 'compression': 'gzip', 'compression_opts': 5}}
/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
>               raise ExceptionType(msg)
E               TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'

/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
NOTE: ' 'This parameter is only allowed if data is an h5py.Dataset', 'name': 'link_data', 'type': <class 'bool'>}, {'default': False, 'doc': 'Enable passing dynamically loaded filters as compression parameter', 'name': 'allow_plugin_filters', 'type': <class 'bool'>}, {'default': None, 'doc': 'the shape of the new dataset, used only if data is None', 'name': 'shape', 'type': <class 'tuple'>}, {'default': None, 'doc': 'the data type of the new dataset, used only if data is None', 'name': 'dtype', 'type': (<class 'str'>, <class 'type'>, <class 'numpy.dtype'>)}] msg = "H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'" parse_err = ["unrecognized argument: 'data_io_kwargs'"] parse_warnings = [] parsed = {'args': {'allow_plugin_filters': False, 'chunks': None, 'compression': None, 'compression_opts': None, 'data': array([[0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0]], dtype=int16), 'dtype': None, 'fillvalue': None, 'fletcher32': None, 'link_data': False, 'maxshape': None, 'shape': None, 'shuffle': None}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []} /usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError _______________________ test_simple_time_series_override[hdf5-generic-SliceableDataChunkIterator-iterator_options1] ________________________ tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri1') case_name = 'generic', iterator = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>, iterator_options = {}, backend = 'hdf5' @pytest.mark.parametrize( "case_name,iterator,iterator_options", [ ("unwrapped", lambda x: x, dict()), ("generic", SliceableDataChunkIterator, dict()), ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000 * 5)), # Need to hardcode buffer size in classic case or else it takes forever... 
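Every parametrization of test_simple_time_series_override dies on the same line: neuroconv's configure_backend (_configure_backend.py:33) calls Container.set_data_io(..., data_io_kwargs={...}), but the installed hdmf's set_data_io (container.py:746) forwards every extra keyword verbatim into the DataIO constructor, whose docval rejects unknown names. Presumably the hdmf release neuroconv 0.4.8 was written against unpacks the dict itself. A minimal pure-Python sketch of the mismatch; old_set_data_io and new_set_data_io are hypothetical stand-ins, not real hdmf names, and h5_data_io models H5DataIO.__init__ as a plain function that accepts only its documented keywords:

    def h5_data_io(data, chunks=None, compression=None, compression_opts=None):
        # Stand-in for the docval-checked H5DataIO.__init__ seen in loc_val above.
        return dict(data=data, chunks=chunks, compression=compression,
                    compression_opts=compression_opts)

    def old_set_data_io(data_io_class, data, **kwargs):
        # Like hdmf/container.py:746 in this log: extras go straight through.
        return data_io_class(data=data, **kwargs)

    def new_set_data_io(data_io_class, data, data_io_kwargs=None):
        # The signature neuroconv 0.4.8 appears to target: the dict is unpacked here.
        return data_io_class(data=data, **(data_io_kwargs or {}))

    opts = {"chunks": (30_000, 64), "compression": "gzip", "compression_opts": 5}
    new_set_data_io(h5_data_io, data=[0], data_io_kwargs=opts)  # fine
    old_set_data_io(h5_data_io, data=[0], data_io_kwargs=opts)  # TypeError, as in the log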
_______________________ test_simple_time_series_override[hdf5-generic-SliceableDataChunkIterator-iterator_options1] ________________________

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri1')
case_name = 'generic', iterator = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>, iterator_options = {}, backend = 'hdf5'

    @pytest.mark.parametrize(
        "case_name,iterator,iterator_options",
        [
            ("unwrapped", lambda x: x, dict()),
            ("generic", SliceableDataChunkIterator, dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000 * 5)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_time_series_override(
        tmpdir: Path, case_name: str, iterator: callable, iterator_options: dict, backend: Literal["hdf5", "zarr"]
    ):
        array = np.zeros(shape=(30_000 * 5, 384), dtype="int16")
        data = iterator(array, **iterator_options)
        nwbfile = mock_NWBFile()
        time_series = mock_TimeSeries(name="TestTimeSeries", data=data)
        nwbfile.add_acquisition(time_series)
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestTimeSeries/data"]
        smaller_chunk_shape = (30_000, 64)
        smaller_buffer_shape = (60_000, 192)
        dataset_configuration.chunk_shape = smaller_chunk_shape
        dataset_configuration.buffer_shape = smaller_buffer_shape
        higher_gzip_level = 5
        if backend == "hdf5":
            dataset_configuration.compression_options = dict(level=higher_gzip_level)
        elif backend == "zarr":
            dataset_configuration.compression_options = dict(level=higher_gzip_level)
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)
array      = array([[0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0]], dtype=int16)
backend    = 'hdf5'
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': HDF5DatasetIOConfiguration(object_id='e9064187-779d-4d4c-a268-adc99c385a3e', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5})})
case_name  = 'generic'
data       = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7447490>
dataset_configuration = HDF5DatasetIOConfiguration(object_id='e9064187-779d-4d4c-a268-adc99c385a3e', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5})
higher_gzip_level = 5
iterator   = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>
iterator_options = {}
nwbfile    = root pynwb.file.NWBFile at 0x139817681975824 Fields: {acquisition: {TestTimeSeries <class 'pynwb.base.TimeSeries'>}, file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 50, 710861, tzinfo=tzlocal())], identifier: c952216b-4f24-4705-964d-355f9201874b, session_description: session_description, session_start_time: 1970-01-01 00:00:00-05:00, timestamps_reference_time: 1970-01-01 00:00:00-05:00}
smaller_buffer_shape = (60000, 192)
smaller_chunk_shape = (30000, 64)
time_series = TestTimeSeries pynwb.base.TimeSeries at 0x139817681962064 Fields: {comments: no comments, conversion: 1.0, data: <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7447490>, description: no description, offset: 0.0, rate: 10.0, resolution: -1.0, starting_time: 0.0, starting_time_unit: seconds, unit: volts}
tmpdir     = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri1')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py:56:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
data_io_kwargs = {'chunks': (30000, 64), 'compression': 'gzip', 'compression_opts': 5}
dataset_name = 'data'
is_dataset_linked = False
nwbfile_object = TestTimeSeries pynwb.base.TimeSeries at 0x139817681962064 (as in time_series above)
nwbfile_objects = {'0f387481-0e6c-4d70-b84d-3088dc22165a': root pynwb.file.NWBFile at 0x139817681975824 (as in nwbfile above), 'e9064187-779d-4d4c-a268-adc99c385a3e': TestTimeSeries pynwb.base.TimeSeries at 0x139817681962064 (as in time_series above)}
object_id  = 'e9064187-779d-4d4c-a268-adc99c385a3e'

/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data       = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7447490>
kwargs     = {'data_io_kwargs': {'chunks': (30000, 64), 'compression': 'gzip', 'compression_opts': 5}}

/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
_check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29dc9f9bc0>
args       = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d7445a10>,)
func       = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
kwargs     = {'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7447490>, 'data_io_kwargs': {'chunks': (30000, 64), 'compression': 'gzip', 'compression_opts': 5}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def _check_args(args, kwargs):
        """Parse and check arguments to decorated function.

        Raise warnings and errors as appropriate."""
        # this function was separated from func_call() in order to make stepping through lines of code using pdb
        # easier
        parsed = __parse_args(
            loc_val,
            args[1:] if is_method else args,
            kwargs,
            enforce_type=enforce_type,
            enforce_shape=enforce_shape,
            allow_extra=allow_extra,
            allow_positional=allow_positional
        )

        parse_warnings = parsed.get('future_warnings')
        if parse_warnings:
            msg = '%s: %s' % (func.__qualname__, ', '.join(parse_warnings))
            warnings.warn(msg, FutureWarning)

        for error_type, ExceptionType in (('type_errors', TypeError), ('value_errors', ValueError), ('syntax_errors', SyntaxError)):
            parse_err = parsed.get(error_type)
            if parse_err:
                msg = '%s: %s' % (func.__qualname__, ', '.join(parse_err))
>               raise ExceptionType(msg)
E               TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'

ExceptionType = <class 'TypeError'>
allow_extra = False
allow_positional = True
args       = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d7445a10>,)
enforce_shape = True
enforce_type = True
error_type = 'type_errors'
func       = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
is_method  = True
kwargs     = {'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7447490>, 'data_io_kwargs': {'chunks': (30000, 64), 'compression': 'gzip', 'compression_opts': 5}}
loc_val    = [H5DataIO docval spec: data, maxshape, chunks, compression, compression_opts, fillvalue, shuffle, fletcher32, link_data, allow_plugin_filters, shape, dtype (identical to the dump in the first failure above)]
msg        = "H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err  = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed     = {'args': {'allow_plugin_filters': False, 'chunks': None, 'compression': None, 'compression_opts': None, 'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7447490>, 'dtype': None, 'fillvalue': None, 'fletcher32': None, 'link_data': False, 'maxshape': None, 'shape': None, 'shuffle': None}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}
/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
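The overrides themselves are internally consistent: as far as I know, hdmf's GenericDataChunkIterator (the base of SliceableDataChunkIterator) requires every buffer axis to be an even multiple of the corresponding chunk axis, which the test's smaller shapes satisfy. A quick arithmetic check, using the values from the locals above:

    # Values copied from the dataset_configuration locals in these failures.
    full_shape   = (150_000, 384)   # 30_000 * 5 samples x 384 channels
    chunk_shape  = (30_000, 64)
    buffer_shape = (60_000, 192)

    # Each buffer axis is an even multiple of the chunk axis ...
    assert all(b % c == 0 for b, c in zip(buffer_shape, chunk_shape))
    # ... and no buffer axis exceeds the full dataset shape.
    assert all(b <= f for b, f in zip(buffer_shape, full_shape))

So the failures are not caused by the override values; the TypeError fires before any shape validation could matter.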
____________________________ test_simple_time_series_override[hdf5-classic-DataChunkIterator-iterator_options2] ____________________________

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri2')
case_name = 'classic', iterator = <class 'hdmf.data_utils.DataChunkIterator'>, iterator_options = {'buffer_size': 150000, 'iter_axis': 1}, backend = 'hdf5'

    (test source identical to the previous failure)
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)
array      = array([[0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0]], dtype=int16)
backend    = 'hdf5'
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': HDF5DatasetIOConfiguration(object_id='5aa05223-5f7e-4fda-98cc-b0c2744ec881', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5})})
case_name  = 'classic'
data       = <hdmf.data_utils.DataChunkIterator object at 0x7f29d7339150>
dataset_configuration = HDF5DatasetIOConfiguration(object_id='5aa05223-5f7e-4fda-98cc-b0c2744ec881', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5})
higher_gzip_level = 5
iterator   = <class 'hdmf.data_utils.DataChunkIterator'>
iterator_options = {'buffer_size': 150000, 'iter_axis': 1}
nwbfile    = root pynwb.file.NWBFile at 0x139817680870992 Fields: {acquisition: {TestTimeSeries <class 'pynwb.base.TimeSeries'>}, file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 219425, tzinfo=tzlocal())], identifier: 3d04b1a0-dee4-4739-8296-68841f6afdc4, session_description: session_description, session_start_time: 1970-01-01 00:00:00-05:00, timestamps_reference_time: 1970-01-01 00:00:00-05:00}
smaller_buffer_shape = (60000, 192)
smaller_chunk_shape = (30000, 64)
time_series = TestTimeSeries pynwb.base.TimeSeries at 0x139817680869520 Fields: {comments: no comments, conversion: 1.0, data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d7339150>, description: no description, offset: 0.0, rate: 10.0, resolution: -1.0, starting_time: 0.0, starting_time_unit: seconds, unit: volts}
tmpdir     = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri2')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py:56:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
data_io_kwargs = {'chunks': (30000, 64), 'compression': 'gzip', 'compression_opts': 5}
dataset_name = 'data'
is_dataset_linked = False
nwbfile_object = TestTimeSeries pynwb.base.TimeSeries at 0x139817680869520 (as in time_series above)
nwbfile_objects = {'3f195628-af98-4c98-8982-34a0697eb51d': root pynwb.file.NWBFile at 0x139817680870992 (as in nwbfile above), '5aa05223-5f7e-4fda-98cc-b0c2744ec881': TestTimeSeries pynwb.base.TimeSeries at 0x139817680869520 (as in time_series above)}
object_id  = '5aa05223-5f7e-4fda-98cc-b0c2744ec881'

/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data       = <hdmf.data_utils.DataChunkIterator object at 0x7f29d7339150>
kwargs     = {'data_io_kwargs': {'chunks': (30000, 64), 'compression': 'gzip', 'compression_opts': 5}}

/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
_check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29dc9f9bc0>
args       = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d733b950>,)
func       = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
kwargs     = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d7339150>, 'data_io_kwargs': {'chunks': (30000, 64), 'compression': 'gzip', 'compression_opts': 5}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    (docval _check_args source identical to the previous failures)
>               raise ExceptionType(msg)
E               TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'

ExceptionType = <class 'TypeError'>
allow_extra = False
allow_positional = True
args       = (<hdmf.backends.hdf5.h5_utils.H5DataIO object at 0x7f29d733b950>,)
enforce_shape = True
enforce_type = True
error_type = 'type_errors'
func       = <function H5DataIO.__init__ at 0x7f29dc9f9b20>
is_method  = True
kwargs     = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d7339150>, 'data_io_kwargs': {'chunks': (30000, 64), 'compression': 'gzip', 'compression_opts': 5}}
loc_val    = [H5DataIO docval spec: data, maxshape, chunks, compression, compression_opts, fillvalue, shuffle, fletcher32, link_data, allow_plugin_filters, shape, dtype (identical to the dump in the first failure above)]
msg        = "H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err  = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed     = {'args': {'allow_plugin_filters': False, 'chunks': None, 'compression': None, 'compression_opts': None, 'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d7339150>, 'dtype': None, 'fillvalue': None, 'fletcher32': None, 'link_data': False, 'maxshape': None, 'shape': None, 'shuffle': None}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}
/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
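For reference, the wrapping that configure_backend is trying to apply in the three hdf5 cases can be expressed directly against the H5DataIO signature dumped in loc_val above. A sketch, assuming only that numpy and the hdmf shown in these tracebacks are importable:

    import numpy as np
    from hdmf.backends.hdf5.h5_utils import H5DataIO

    # The same array and settings the failing parametrizations use.
    data = np.zeros(shape=(30_000 * 5, 384), dtype="int16")
    wrapped = H5DataIO(
        data=data,
        chunks=(30_000, 64),   # data_io_kwargs['chunks']
        compression="gzip",    # data_io_kwargs['compression']
        compression_opts=5,    # data_io_kwargs['compression_opts']
    )

Constructed this way, the keywords match the docval spec exactly; the failure is purely in how set_data_io relays them.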
_______________________________ test_simple_time_series_override[zarr-unwrapped-<lambda>-iterator_options0] ________________________________

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri3')
case_name = 'unwrapped', iterator = <function <lambda> at 0x7f29da633ec0>, iterator_options = {}, backend = 'zarr'

    (test source identical to the previous failures)
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)
array      = array([[0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0]], dtype=int16)
backend    = 'zarr'
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': ZarrDatasetIOConfiguration(object_id='e56f6aa9-5852-40a1-9110-1dd94b3eaa25', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)}, number_of_jobs=11)
case_name  = 'unwrapped'
data       = array([[0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0]], dtype=int16)
dataset_configuration = ZarrDatasetIOConfiguration(object_id='e56f6aa9-5852-40a1-9110-1dd94b3eaa25', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)
higher_gzip_level = 5
iterator   = <function <lambda> at 0x7f29da633ec0>
iterator_options = {}
nwbfile    = root pynwb.file.NWBFile at 0x139817681021328 Fields: {acquisition: {TestTimeSeries <class 'pynwb.base.TimeSeries'>}, file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 264337, tzinfo=tzlocal())], identifier: ca042321-6f84-4af5-98c0-281e606cc8a7, session_description: session_description, session_start_time: 1970-01-01 00:00:00-05:00, timestamps_reference_time: 1970-01-01 00:00:00-05:00}
smaller_buffer_shape = (60000, 192)
smaller_chunk_shape = (30000, 64)
time_series = TestTimeSeries pynwb.base.TimeSeries at 0x139817681016656 Fields: {comments: no comments, conversion: 1.0, data: [[0 0 0 ... 0 0 0] ... [0 0 0 ... 0 0 0]], description: no description, offset: 0.0, rate: 10.0, resolution: -1.0, starting_time: 0.0, starting_time_unit: seconds, unit: volts}
tmpdir     = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri3')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py:56:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
data_io_kwargs = {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}
dataset_name = 'data'
is_dataset_linked = False
nwbfile_object = TestTimeSeries pynwb.base.TimeSeries at 0x139817681016656 (as in time_series above)
nwbfile_objects = {'56610b1d-35b5-437b-8bcc-bdb2b74b91a7': root pynwb.file.NWBFile at 0x139817681021328 (as in nwbfile above), 'e56f6aa9-5852-40a1-9110-1dd94b3eaa25': TestTimeSeries pynwb.base.TimeSeries at 0x139817681016656 (as in time_series above)}
object_id  = 'e56f6aa9-5852-40a1-9110-1dd94b3eaa25'

/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data       = array([[0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0]], dtype=int16)
kwargs     = {'data_io_kwargs': {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}}

/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
_check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29db1577e0>
args       = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d735d110>,)
func       = <function ZarrDataIO.__init__ at 0x7f29db157740>
kwargs     = {'data': array([[0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0]], dtype=int16), 'data_io_kwargs': {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    (docval _check_args source identical to the previous failures)
>               raise ExceptionType(msg)
E               TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'

ExceptionType = <class 'TypeError'>
allow_extra = False
allow_positional = True
args       = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d735d110>,)
enforce_shape = True
enforce_type = True
error_type = 'type_errors'
func       = <function ZarrDataIO.__init__ at 0x7f29db157740>
is_method  = True
kwargs     = {'data': array([[0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0]], dtype=int16), 'data_io_kwargs': {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}}
loc_val    = [{'doc': 'the data to be written. NOTE: If an zarr.Array is used, all other settings but link_data will be ignored as the dataset will either be linked to or copied as is in ZarrIO.', 'name': 'data', 'type': (<class 'numpy.ndarray'>, <class 'list'>, <class 'tuple'>, <class 'zarr.core.Array'>, <class 'collections.abc.Iterable'>)},
              {'default': None, 'doc': 'Chunk shape', 'name': 'chunks', 'type': (<class 'list'>, <class 'tuple'>)},
              {'default': None, 'doc': 'Value to be returned when reading uninitialized parts of the dataset', 'name': 'fillvalue', 'type': None},
              {'default': None, 'doc': 'Zarr compressor filter to be used. Set to True to use Zarr default. Set to False to disable compression)', 'name': 'compressor', 'type': (<class 'numcodecs.abc.Codec'>, <class 'bool'>)},
              {'default': None, 'doc': 'One or more Zarr-supported codecs used to transform data prior to compression.', 'name': 'filters', 'type': (<class 'list'>, <class 'tuple'>)},
              {'default': False, 'doc': 'If data is an zarr.Array should it be linked to or copied. NOTE: This parameter is only allowed if data is an zarr.Array', 'name': 'link_data', 'type': <class 'bool'>}]
msg        = "ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err  = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed     = {'args': {'chunks': None, 'compressor': None, 'data': array([[0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0]], dtype=int16), 'fillvalue': None, 'filters': None, 'link_data': False}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}
/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
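The zarr half of the matrix fails identically. Note from the locals that the compression options have already been translated into a numcodecs GZip(level=5) codec by the time set_data_io is reached, and ZarrDataIO would accept them directly. A sketch of the intended wrapping, assuming the hdmf-zarr and numcodecs shown in these tracebacks are importable:

    import numpy as np
    from numcodecs import GZip
    from hdmf_zarr.utils import ZarrDataIO

    data = np.zeros(shape=(30_000 * 5, 384), dtype="int16")
    wrapped = ZarrDataIO(
        data=data,
        chunks=(30_000, 64),       # data_io_kwargs['chunks']
        compressor=GZip(level=5),  # data_io_kwargs['compressor']
        filters=None,              # data_io_kwargs['filters']
    )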
_______________________ test_simple_time_series_override[zarr-generic-SliceableDataChunkIterator-iterator_options1] ________________________

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri4')
case_name = 'generic', iterator = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>, iterator_options = {}, backend = 'zarr'

    (test source identical to the previous failures)
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)
array      = array([[0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0]], dtype=int16)
backend    = 'zarr'
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': ZarrDatasetIOConfiguration(object_id='b355023b-7f04-41d0-9ddd-42e407f6b7aa', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)}, number_of_jobs=11)
case_name  = 'generic'
data       = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7314950>
dataset_configuration = ZarrDatasetIOConfiguration(object_id='b355023b-7f04-41d0-9ddd-42e407f6b7aa', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)
higher_gzip_level = 5
iterator   = <class 'neuroconv.tools.hdmf.SliceableDataChunkIterator'>
iterator_options = {}
nwbfile    = root pynwb.file.NWBFile at 0x139817680719312 Fields: {acquisition: {TestTimeSeries <class 'pynwb.base.TimeSeries'>}, file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 312123, tzinfo=tzlocal())], identifier: 3003f17c-0ff7-4fb6-938b-b99740aabe00, session_description: session_description, session_start_time: 1970-01-01 00:00:00-05:00, timestamps_reference_time: 1970-01-01 00:00:00-05:00}
smaller_buffer_shape = (60000, 192)
smaller_chunk_shape = (30000, 64)
time_series = TestTimeSeries pynwb.base.TimeSeries at 0x139817680729296 Fields: {comments: no comments, conversion: 1.0, data: <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7314950>, description: no description, offset: 0.0, rate: 10.0, resolution: -1.0, starting_time: 0.0, starting_time_unit: seconds, unit: volts}
tmpdir     = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri4')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py:56:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
data_io_kwargs = {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}
dataset_name = 'data'
is_dataset_linked = False
nwbfile_object = TestTimeSeries pynwb.base.TimeSeries at 0x139817680729296 (as in time_series above)
nwbfile_objects = {'9e197a7d-c11e-4430-8e7e-57b69457b58f': root pynwb.file.NWBFile at 0x139817680719312 (as in nwbfile above), 'b355023b-7f04-41d0-9ddd-42e407f6b7aa': TestTimeSeries pynwb.base.TimeSeries at 0x139817680729296 (as in time_series above)}
object_id  = 'b355023b-7f04-41d0-9ddd-42e407f6b7aa'

/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data       = <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7314950>
kwargs     = {'data_io_kwargs': {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}}

/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
_check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29db1577e0>
args       = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d7316350>,)
func       = <function ZarrDataIO.__init__ at 0x7f29db157740>
kwargs     = {'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7314950>, 'data_io_kwargs': {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    (docval _check_args source identical to the previous failures)
>               raise ExceptionType(msg)
E               TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'

ExceptionType = <class 'TypeError'>
allow_extra = False
allow_positional = True
args       = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d7316350>,)
enforce_shape = True
enforce_type = True
error_type = 'type_errors'
func       = <function ZarrDataIO.__init__ at 0x7f29db157740>
is_method  = True
kwargs     = {'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7314950>, 'data_io_kwargs': {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}}
loc_val    = [ZarrDataIO docval spec: data, chunks, fillvalue, compressor, filters, link_data (identical to the dump in the zarr-unwrapped failure above)]
msg        = "ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err  = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed     = {'args': {'chunks': None, 'compressor': None, 'data': <neuroconv.tools.hdmf.SliceableDataChunkIterator object at 0x7f29d7314950>, 'fillvalue': None, 'filters': None, 'link_data': False}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}
/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
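Every failure in this section reduces to the same hdmf/neuroconv signature skew, so a single dispatch helper would presumably clear all six parametrizations. The following is a hypothetical sketch, not neuroconv's actual fix; the proper resolution is a dependency pin that matches the set_data_io signature neuroconv 0.4.8 expects:

    def set_data_io_compat(container, dataset_name, data_io_class, data_io_kwargs):
        """Hypothetical shim: call Container.set_data_io across both signatures."""
        try:
            # Newer hdmf: set_data_io accepts the dict and unpacks it itself.
            container.set_data_io(dataset_name, data_io_class, data_io_kwargs=data_io_kwargs)
        except TypeError:
            # Installed hdmf (container.py:746): extra keywords go straight to the
            # DataIO constructor, so unpack them here instead. Catching TypeError
            # this broadly could mask genuine constructor errors; fine for a sketch.
            container.set_data_io(dataset_name, data_io_class, **data_io_kwargs)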
____________________________ test_simple_time_series_override[zarr-classic-DataChunkIterator-iterator_options2] ____________________________

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri5')
case_name = 'classic', iterator = <class 'hdmf.data_utils.DataChunkIterator'>, iterator_options = {'buffer_size': 150000, 'iter_axis': 1}
backend = 'zarr'

    @pytest.mark.parametrize(
        "case_name,iterator,iterator_options",
        [
            ("unwrapped", lambda x: x, dict()),
            ("generic", SliceableDataChunkIterator, dict()),
            ("classic", DataChunkIterator, dict(iter_axis=1, buffer_size=30_000 * 5)),
            # Need to hardcode buffer size in classic case or else it takes forever...
        ],
    )
    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_time_series_override(
        tmpdir: Path, case_name: str, iterator: callable, iterator_options: dict, backend: Literal["hdf5", "zarr"]
    ):
        array = np.zeros(shape=(30_000 * 5, 384), dtype="int16")
        data = iterator(array, **iterator_options)
        nwbfile = mock_NWBFile()
        time_series = mock_TimeSeries(name="TestTimeSeries", data=data)
        nwbfile.add_acquisition(time_series)
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestTimeSeries/data"]
        smaller_chunk_shape = (30_000, 64)
        smaller_buffer_shape = (60_000, 192)
        dataset_configuration.chunk_shape = smaller_chunk_shape
        dataset_configuration.buffer_shape = smaller_buffer_shape
        higher_gzip_level = 5
        if backend == "hdf5":
            dataset_configuration.compression_options = dict(level=higher_gzip_level)
        elif backend == "zarr":
            dataset_configuration.compression_options = dict(level=higher_gzip_level)
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

array = array([[0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0]], dtype=int16)
backend = 'zarr'
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': ZarrDatasetIOConfiguration(object_id='eb35d47e-2271-4205-8c3e-81d3748ccf94', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)}, number_of_jobs=11)
case_name = 'classic'
data = <hdmf.data_utils.DataChunkIterator object at 0x7f29d773cd90>
dataset_configuration = ZarrDatasetIOConfiguration(object_id='eb35d47e-2271-4205-8c3e-81d3748ccf94', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)
higher_gzip_level = 5
iterator = <class 'hdmf.data_utils.DataChunkIterator'>
iterator_options = {'buffer_size': 150000, 'iter_axis': 1}
nwbfile = root pynwb.file.NWBFile at 0x139817685080784 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 907194, tzinfo=tzlocal())] identifier: bb9a9565-bfd0-4ef3-aa8a-9a4740fd9d9e session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
smaller_buffer_shape = (60000, 192)
smaller_chunk_shape = (30000, 64)
time_series = TestTimeSeries pynwb.base.TimeSeries at 0x139817685077456 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d773cd90> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_time_series_overri5')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py:56:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:33: in configure_backend
    nwbfile_object.set_data_io(
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestTimeSeries/data': ZarrDatasetIOConfiguration(object_id='eb35d47e-2271-4205-8c3e-81d3748ccf94', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)}, number_of_jobs=11)
data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
data_io_kwargs = {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}
dataset_configuration = ZarrDatasetIOConfiguration(object_id='eb35d47e-2271-4205-8c3e-81d3748ccf94', location_in_file='acquisition/TestTimeSeries/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(60000, 192), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)
dataset_name = 'data'
is_dataset_linked = False
nwbfile = root pynwb.file.NWBFile at 0x139817685080784 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 907194, tzinfo=tzlocal())] identifier: bb9a9565-bfd0-4ef3-aa8a-9a4740fd9d9e session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
nwbfile_object = TestTimeSeries pynwb.base.TimeSeries at 0x139817685077456 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d773cd90> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts
nwbfile_objects = {'74e98dbb-dd27-40ce-8430-1b294a6b0938': root pynwb.file.NWBFile at 0x139817685080784 Fields: acquisition: { TestTimeSeries <class 'pynwb.base.TimeSeries'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 907194, tzinfo=tzlocal())] identifier: bb9a9565-bfd0-4ef3-aa8a-9a4740fd9d9e session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 , 'eb35d47e-2271-4205-8c3e-81d3748ccf94': TestTimeSeries pynwb.base.TimeSeries at 0x139817685077456 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d773cd90> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts }
object_id = 'eb35d47e-2271-4205-8c3e-81d3748ccf94'

/usr/lib/python3.11/site-packages/hdmf/container.py:746: in set_data_io
    self.fields[dataset_name] = data_io_class(data=data, **kwargs)
data = <hdmf.data_utils.DataChunkIterator object at 0x7f29d773cd90>
data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
dataset_name = 'data'
kwargs = {'data_io_kwargs': {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}}
self = TestTimeSeries pynwb.base.TimeSeries at 0x139817685077456 Fields: comments: no comments conversion: 1.0 data: <hdmf.data_utils.DataChunkIterator object at 0x7f29d773cd90> description: no description offset: 0.0 rate: 10.0 resolution: -1.0 starting_time: 0.0 starting_time_unit: seconds unit: volts

/usr/lib/python3.11/site-packages/hdmf/utils.py:663: in func_call
    pargs = _check_args(args, kwargs)
_check_args = <function docval.<locals>.dec.<locals>._check_args at 0x7f29db1577e0>
args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d773e0d0>,)
func = <function ZarrDataIO.__init__ at 0x7f29db157740>
kwargs = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d773cd90>, 'data_io_kwargs': {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}}
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d773e0d0>,)
kwargs = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d773cd90>, 'data_io_kwargs': {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}}

    def _check_args(args, kwargs):
        """Parse and check arguments to decorated function.
                Raise warnings and errors as appropriate."""
            # this function was separated from func_call() in order to make stepping through lines of code using pdb
            # easier
            parsed = __parse_args(
                loc_val,
                args[1:] if is_method else args,
                kwargs,
                enforce_type=enforce_type,
                enforce_shape=enforce_shape,
                allow_extra=allow_extra,
                allow_positional=allow_positional
            )
            parse_warnings = parsed.get('future_warnings')
            if parse_warnings:
                msg = '%s: %s' % (func.__qualname__, ', '.join(parse_warnings))
                warnings.warn(msg, FutureWarning)
            for error_type, ExceptionType in (('type_errors', TypeError), ('value_errors', ValueError),
                                              ('syntax_errors', SyntaxError)):
                parse_err = parsed.get(error_type)
                if parse_err:
                    msg = '%s: %s' % (func.__qualname__, ', '.join(parse_err))
>                   raise ExceptionType(msg)
E                   TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'

ExceptionType = <class 'TypeError'>
allow_extra = False
allow_positional = True
args = (<hdmf_zarr.utils.ZarrDataIO object at 0x7f29d773e0d0>,)
enforce_shape = True
enforce_type = True
error_type = 'type_errors'
func = <function ZarrDataIO.__init__ at 0x7f29db157740>
is_method = True
kwargs = {'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d773cd90>, 'data_io_kwargs': {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}}
loc_val = [{'doc': 'the data to be written. NOTE: If an zarr.Array is used, all other settings but link_data will be ignored as the dataset will either be linked to or copied as is in ZarrIO.', 'name': 'data', 'type': (<class 'numpy.ndarray'>, <class 'list'>, <class 'tuple'>, <class 'zarr.core.Array'>, <class 'collections.abc.Iterable'>)}, {'default': None, 'doc': 'Chunk shape', 'name': 'chunks', 'type': (<class 'list'>, <class 'tuple'>)}, {'default': None, 'doc': 'Value to be returned when reading uninitialized parts of the dataset', 'name': 'fillvalue', 'type': None}, {'default': None, 'doc': 'Zarr compressor filter to be used. Set to True to use Zarr default.Set to False to disable compression)', 'name': 'compressor', 'type': (<class 'numcodecs.abc.Codec'>, <class 'bool'>)}, {'default': None, 'doc': 'One or more Zarr-supported codecs used to transform data prior to compression.', 'name': 'filters', 'type': (<class 'list'>, <class 'tuple'>)}, {'default': False, 'doc': 'If data is an zarr.Array should it be linked to or copied. NOTE: This parameter is only allowed if data is an zarr.Array', 'name': 'link_data', 'type': <class 'bool'>}]
msg = "ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'"
parse_err = ["unrecognized argument: 'data_io_kwargs'"]
parse_warnings = []
parsed = {'args': {'chunks': None, 'compressor': None, 'data': <hdmf.data_utils.DataChunkIterator object at 0x7f29d773cd90>, 'fillvalue': None, 'filters': None, 'link_data': False}, 'future_warnings': [], 'syntax_errors': [], 'type_errors': ["unrecognized argument: 'data_io_kwargs'"], 'value_errors': []}

/usr/lib/python3.11/site-packages/hdmf/utils.py:656: TypeError
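[editor's note] For reference, the override workflow these tests exercise is the neuroconv pattern of mutating a default backend configuration before applying it. A minimal sketch follows, assuming a compatible hdmf and that configure_backend and get_default_backend_configuration are importable from neuroconv.tools.nwb_helpers (consistent with the traceback paths above); mock_NWBFile and mock_TimeSeries come from pynwb's testing helpers.

    import numpy as np
    from neuroconv.tools.nwb_helpers import (
        configure_backend,
        get_default_backend_configuration,
    )
    from pynwb.testing.mock.base import mock_TimeSeries
    from pynwb.testing.mock.file import mock_NWBFile

    # Build an in-memory NWBFile with one TimeSeries, as the test does.
    nwbfile = mock_NWBFile()
    nwbfile.add_acquisition(
        mock_TimeSeries(name="TestTimeSeries", data=np.zeros((150_000, 384), dtype="int16"))
    )

    # Fetch defaults, then override chunking and compression for one dataset.
    backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend="zarr")
    dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestTimeSeries/data"]
    dataset_configuration.chunk_shape = (30_000, 64)
    dataset_configuration.compression_options = dict(level=5)

    # This is the call that raises TypeError with the hdmf installed here.
    configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)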
_________________________________________________ test_simple_dynamic_table_override[hdf5] _________________________________________________

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_dynamic_table_over0')
backend = 'hdf5'

    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_dynamic_table_override(tmpdir: Path, backend: Literal["hdf5", "zarr"]):
        data = np.zeros(shape=(30_000 * 5, 384), dtype="int16")
        nwbfile = mock_NWBFile()
        dynamic_table = DynamicTable(
            name="TestDynamicTable", description="", columns=[VectorData(name="TestColumn", description="", data=data)]
        )
        nwbfile.add_acquisition(dynamic_table)
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestDynamicTable/TestColumn/data"]
        smaller_chunk_shape = (30_000, 64)
        dataset_configuration.chunk_shape = smaller_chunk_shape
        higher_gzip_level = 5
        if backend == "hdf5":
            dataset_configuration.compression_options = dict(level=higher_gzip_level)
        elif backend == "zarr":
            dataset_configuration.compression_options = dict(level=higher_gzip_level)
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'hdf5'
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': HDF5DatasetIOConfiguration(object_id='b623fca4-8991-48df-9964-245c4cd73380', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(150000, 384), compression_method='gzip', compression_options={'level': 5})})
data = array([[0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0]], dtype=int16)
dataset_configuration = HDF5DatasetIOConfiguration(object_id='b623fca4-8991-48df-9964-245c4cd73380', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(150000, 384), compression_method='gzip', compression_options={'level': 5})
dynamic_table = TestDynamicTable hdmf.common.table.DynamicTable at 0x139817687725776 Fields: colnames: ['TestColumn'] columns: ( TestColumn <class 'hdmf.common.table.VectorData'> ) id: id <class 'hdmf.common.table.ElementIdentifiers'>
higher_gzip_level = 5
nwbfile = root pynwb.file.NWBFile at 0x139817687722576 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 950250, tzinfo=tzlocal())] identifier: 8d3b70dc-153a-4856-8656-88724abf83ea session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
smaller_chunk_shape = (30000, 64)
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_dynamic_table_over0')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py:100:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nwbfile = root pynwb.file.NWBFile at 0x139817687722576 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.D...ion_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': HDF5DatasetIOConfigur...4), chunk_shape=(30000, 64), buffer_shape=(150000, 384), compression_method='gzip', compression_options={'level': 5})})

    def configure_backend(
        nwbfile: NWBFile, backend_configuration: Union[HDF5BackendConfiguration, ZarrBackendConfiguration]
    ) -> None:
        """Configure all datasets specified in the `backend_configuration` with their appropriate DataIO and options."""
        nwbfile_objects = nwbfile.objects
        data_io_class = backend_configuration.data_io_class
        for dataset_configuration in backend_configuration.dataset_configurations.values():
            object_id = dataset_configuration.object_id
            dataset_name = dataset_configuration.dataset_name
            data_io_kwargs = dataset_configuration.get_data_io_kwargs()

            # TODO: update buffer shape in iterator, if present

            nwbfile_object = nwbfile_objects[object_id]
            is_dataset_linked = isinstance(nwbfile_object.fields.get(dataset_name), TimeSeries)

            # Table columns
            if isinstance(nwbfile_object, Data):
>               nwbfile_object.set_data_io(data_io_class=data_io_class, data_io_kwargs=data_io_kwargs)
E               AttributeError: 'VectorData' object has no attribute 'set_data_io'

backend_configuration = HDF5BackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': HDF5DatasetIOConfiguration(object_id='b623fca4-8991-48df-9964-245c4cd73380', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(150000, 384), compression_method='gzip', compression_options={'level': 5})})
data_io_class = <class 'hdmf.backends.hdf5.h5_utils.H5DataIO'>
data_io_kwargs = {'chunks': (30000, 64), 'compression': 'gzip', 'compression_opts': 5}
dataset_configuration = HDF5DatasetIOConfiguration(object_id='b623fca4-8991-48df-9964-245c4cd73380', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(150000, 384), compression_method='gzip', compression_options={'level': 5})
dataset_name = 'data'
is_dataset_linked = False
nwbfile = root pynwb.file.NWBFile at 0x139817687722576 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 950250, tzinfo=tzlocal())] identifier: 8d3b70dc-153a-4856-8656-88724abf83ea session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
nwbfile_object = <hdmf.common.table.VectorData object at 0x7f29d79c3e50>
nwbfile_objects = {'246021d9-18f3-46ba-aea1-927fca00c809': root pynwb.file.NWBFile at 0x139817687722576 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 950250, tzinfo=tzlocal())] identifier: 8d3b70dc-153a-4856-8656-88724abf83ea session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 , '4225e8ec-ef75-4ad0-a437-68e8f1a0112b': <hdmf.common.table.ElementIdentifiers object at 0x7f29d79c3f50>, 'b623fca4-8991-48df-9964-245c4cd73380': <hdmf.common.table.VectorData object at 0x7f29d79c3e50>, 'd724484d-96c4-4cb1-b84a-418a268efa3e': TestDynamicTable hdmf.common.table.DynamicTable at 0x139817687725776 Fields: colnames: ['TestColumn'] columns: ( TestColumn <class 'hdmf.common.table.VectorData'> ) id: id <class 'hdmf.common.table.ElementIdentifiers'> }
object_id = 'b623fca4-8991-48df-9964-245c4cd73380'

../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:30: AttributeError
_________________________________________________ test_simple_dynamic_table_override[zarr] _________________________________________________

tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_dynamic_table_over1')
backend = 'zarr'

    @pytest.mark.parametrize("backend", ["hdf5", "zarr"])
    def test_simple_dynamic_table_override(tmpdir: Path, backend: Literal["hdf5", "zarr"]):
        data = np.zeros(shape=(30_000 * 5, 384), dtype="int16")
        nwbfile = mock_NWBFile()
        dynamic_table = DynamicTable(
            name="TestDynamicTable", description="", columns=[VectorData(name="TestColumn", description="", data=data)]
        )
        nwbfile.add_acquisition(dynamic_table)
        backend_configuration = get_default_backend_configuration(nwbfile=nwbfile, backend=backend)
        dataset_configuration = backend_configuration.dataset_configurations["acquisition/TestDynamicTable/TestColumn/data"]
        smaller_chunk_shape = (30_000, 64)
        dataset_configuration.chunk_shape = smaller_chunk_shape
        higher_gzip_level = 5
        if backend == "hdf5":
            dataset_configuration.compression_options = dict(level=higher_gzip_level)
        elif backend == "zarr":
            dataset_configuration.compression_options = dict(level=higher_gzip_level)
>       configure_backend(nwbfile=nwbfile, backend_configuration=backend_configuration)

backend = 'zarr'
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': ZarrDatasetIOConfiguration(object_id='14a226c7-4013-4caf-9d2b-1f5774e8cb5b', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(150000, 384), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)}, number_of_jobs=11)
data = array([[0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0]], dtype=int16)
dataset_configuration = ZarrDatasetIOConfiguration(object_id='14a226c7-4013-4caf-9d2b-1f5774e8cb5b', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(150000, 384), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)
dynamic_table = TestDynamicTable hdmf.common.table.DynamicTable at 0x139817687719760 Fields: colnames: ['TestColumn'] columns: ( TestColumn <class 'hdmf.common.table.VectorData'> ) id: id <class 'hdmf.common.table.ElementIdentifiers'>
higher_gzip_level = 5
nwbfile = root pynwb.file.NWBFile at 0x139817687713296 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 961078, tzinfo=tzlocal())] identifier: efbc60b2-465c-46f1-a36f-0922b35aeb7c session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
smaller_chunk_shape = (30000, 64)
tmpdir = local('/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/pytest-of-portage/pytest-0/test_simple_dynamic_table_over1')

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py:100:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nwbfile = root pynwb.file.NWBFile at 0x139817687713296 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.D...ion_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': ZarrDatasetIOConfigur...ression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)}, number_of_jobs=11)

    def configure_backend(
        nwbfile: NWBFile, backend_configuration: Union[HDF5BackendConfiguration, ZarrBackendConfiguration]
    ) -> None:
        """Configure all datasets specified in the `backend_configuration` with their appropriate DataIO and options."""
        nwbfile_objects = nwbfile.objects
        data_io_class = backend_configuration.data_io_class
        for dataset_configuration in backend_configuration.dataset_configurations.values():
            object_id = dataset_configuration.object_id
            dataset_name = dataset_configuration.dataset_name
            data_io_kwargs = dataset_configuration.get_data_io_kwargs()

            # TODO: update buffer shape in iterator, if present

            nwbfile_object = nwbfile_objects[object_id]
            is_dataset_linked = isinstance(nwbfile_object.fields.get(dataset_name), TimeSeries)

            # Table columns
            if isinstance(nwbfile_object, Data):
>               nwbfile_object.set_data_io(data_io_class=data_io_class, data_io_kwargs=data_io_kwargs)
E               AttributeError: 'VectorData' object has no attribute 'set_data_io'

backend_configuration = ZarrBackendConfiguration(dataset_configurations={'acquisition/TestDynamicTable/TestColumn/data': ZarrDatasetIOConfiguration(object_id='14a226c7-4013-4caf-9d2b-1f5774e8cb5b', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(150000, 384), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)}, number_of_jobs=11)
data_io_class = <class 'hdmf_zarr.utils.ZarrDataIO'>
data_io_kwargs = {'chunks': (30000, 64), 'compressor': GZip(level=5), 'filters': None}
dataset_configuration = ZarrDatasetIOConfiguration(object_id='14a226c7-4013-4caf-9d2b-1f5774e8cb5b', location_in_file='acquisition/TestDynamicTable/TestColumn/data', dataset_name='data', dtype=dtype('int16'), full_shape=(150000, 384), chunk_shape=(30000, 64), buffer_shape=(150000, 384), compression_method='gzip', compression_options={'level': 5}, filter_methods=None, filter_options=None)
dataset_name = 'data'
is_dataset_linked = False
nwbfile = root pynwb.file.NWBFile at 0x139817687713296 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 961078, tzinfo=tzlocal())] identifier: efbc60b2-465c-46f1-a36f-0922b35aeb7c session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00
nwbfile_object = <hdmf.common.table.VectorData object at 0x7f29d79c0c10>
nwbfile_objects = {'14a226c7-4013-4caf-9d2b-1f5774e8cb5b': <hdmf.common.table.VectorData object at 0x7f29d79c0c10>, '19a12062-bddf-43ed-894d-b3535dee0adf': root pynwb.file.NWBFile at 0x139817687713296 Fields: acquisition: { TestDynamicTable <class 'hdmf.common.table.DynamicTable'> } file_create_date: [datetime.datetime(2024, 3, 21, 12, 56, 51, 961078, tzinfo=tzlocal())] identifier: efbc60b2-465c-46f1-a36f-0922b35aeb7c session_description: session_description session_start_time: 1970-01-01 00:00:00-05:00 timestamps_reference_time: 1970-01-01 00:00:00-05:00 , '45fa3b1d-a286-4456-80ae-e2726dedf8a8': <hdmf.common.table.ElementIdentifiers object at 0x7f29d776f050>, 'ff883a7c-ffe7-4ac5-a473-f66b1a2d4062': TestDynamicTable hdmf.common.table.DynamicTable at 0x139817687719760 Fields: colnames: ['TestColumn'] columns: ( TestColumn <class 'hdmf.common.table.VectorData'> ) id: id <class 'hdmf.common.table.ElementIdentifiers'> }
object_id = '14a226c7-4013-4caf-9d2b-1f5774e8cb5b'

../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/nwb_helpers/_configure_backend.py:30: AttributeError
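[editor's note] Both test_simple_dynamic_table_override failures stem from the same dependency skew: neuroconv 0.4.8 expects hdmf's Data containers (hence VectorData) to expose set_data_io, which the hdmf installed here does not provide. Until the dependency is updated, compression for a table column can only be requested at construction time. Below is a workaround illustration for the HDF5 case, not neuroconv's actual code path; the chunk shape and gzip level are copied from the test above.

    import numpy as np
    from hdmf.backends.hdf5.h5_utils import H5DataIO
    from hdmf.common import DynamicTable, VectorData

    data = np.zeros(shape=(30_000 * 5, 384), dtype="int16")

    # Wrap the column data in H5DataIO up front instead of retrofitting it
    # with Data.set_data_io (which this hdmf version lacks).
    column = VectorData(
        name="TestColumn",
        description="",
        data=H5DataIO(data=data, chunks=(30_000, 64), compression="gzip", compression_opts=5),
    )
    table = DynamicTable(name="TestDynamicTable", description="", columns=[column])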
_______________________________________ TestMockRecordingInterface.test_conversion_as_lone_interface _______________________________________

self = <test_mock_recording_interface.TestMockRecordingInterface testMethod=test_conversion_as_lone_interface>

    def test_conversion_as_lone_interface(self):
        interface_kwargs = self.interface_kwargs
        if isinstance(interface_kwargs, dict):
            interface_kwargs = [interface_kwargs]
        for num, kwargs in enumerate(interface_kwargs):
            with self.subTest(str(num)):
                self.case = num
                self.test_kwargs = kwargs
                self.interface = self.data_interface_cls(**self.test_kwargs)
                assert isinstance(self.interface, BaseRecordingExtractorInterface)
                if not self.interface.has_probe():
                    self.interface.set_probe(
                        generate_mock_probe(num_channels=self.interface.recording_extractor.get_num_channels()),
                        group_mode="by_shank",
                    )
                self.check_metadata_schema_valid()
                self.check_conversion_options_schema_valid()
                self.check_metadata()
                self.nwbfile_path = str(self.save_directory / f"{self.__class__.__name__}_{num}.nwb")
                self.run_conversion(nwbfile_path=self.nwbfile_path)
>               self.check_read_nwb(nwbfile_path=self.nwbfile_path)

interface_kwargs = [{'durations': [0.1]}]
kwargs = {'durations': [0.1]}
num = 0
self = <test_mock_recording_interface.TestMockRecordingInterface testMethod=test_conversion_as_lone_interface>

../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/testing/data_interface_mixins.py:518:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <test_mock_recording_interface.TestMockRecordingInterface testMethod=test_conversion_as_lone_interface>
nwbfile_path = '/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/tmpl6m4noal/TestMockRecordingInterface_0.nwb'

    def check_read_nwb(self, nwbfile_path: str):
        from spikeinterface.extractors import NwbRecordingExtractor

        recording = self.interface.recording_extractor
        electrical_series_name = self.interface.get_metadata()["Ecephys"][self.interface.es_key]["name"]
        if recording.get_num_segments() == 1:
            # Spikeinterface behavior is to load the electrode table channel_name property as a channel_id
            self.nwb_recording = NwbRecordingExtractor(
                file_path=nwbfile_path, electrical_series_name=electrical_series_name
            )

            # Set channel_ids right for comparison
            # Neuroconv ALWAYS writes a string property `channel_name`` to the electrode table.
            # And the NwbRecordingExtractor always uses `channel_name` property as the channel_ids
            # `check_recordings_equal` compares ids so we need to rename the channels or the original recordings
            # So they match
            properties_in_the_recording = recording.get_property_keys()
            if "channel_name" in properties_in_the_recording:
                channel_name = recording.get_property("channel_name").astype("str", copy=False)
            else:
                channel_name = recording.get_channel_ids().astype("str", copy=False)

>           recording = recording.rename_channels(new_channel_ids=channel_name)
E           AttributeError: 'NoiseGeneratorRecording' object has no attribute 'rename_channels'

NwbRecordingExtractor = <class 'spikeinterface.extractors.nwbextractors.NwbRecordingExtractor'>
channel_name = array(['0', '1', '2', '3'], dtype='<U21')
electrical_series_name = 'ElectricalSeries'
nwbfile_path = '/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/tmpl6m4noal/TestMockRecordingInterface_0.nwb'
properties_in_the_recording = ['contact_vector', 'location', 'group']
recording = NoiseGeneratorRecording: 4 channels - 30.0kHz - 1 segments - 3,000 samples - 0.10s (100.00 ms) float32 dtype - 46.88 KiB
self = <test_mock_recording_interface.TestMockRecordingInterface testMethod=test_conversion_as_lone_interface>

../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/testing/data_interface_mixins.py:307: AttributeError
----------------------------------------------------------- Captured stdout call -----------------------------------------------------------
NWB file saved at /var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/tmpl6m4noal/TestMockRecordingInterface_0.nwb!
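[editor's note] BaseRecording.rename_channels is a newer spikeinterface API than the version installed here, so the mixin's comparison step fails before any assertion runs. On older spikeinterface releases the same renaming can be expressed with channel_slice; this is a hedged sketch, and the renamed_channel_ids parameter should be verified against the installed version. The generate_recording stand-in replaces the NoiseGeneratorRecording from the test.

    from spikeinterface.core import generate_recording

    # Stand-in recording; the failing test uses a NoiseGeneratorRecording instead.
    recording = generate_recording(num_channels=4, durations=[0.1])
    channel_name = recording.get_channel_ids().astype("str", copy=False)

    # channel_slice with renamed_channel_ids predates rename_channels and
    # yields an equivalently renamed view of the same recording.
    renamed_recording = recording.channel_slice(
        channel_ids=recording.get_channel_ids(),
        renamed_channel_ids=list(channel_name),  # e.g. ['0', '1', '2', '3'] above
    )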
============================================================= warnings summary =============================================================
../../../../../../../usr/lib/python3.11/site-packages/hdmf/container.py:10
  /usr/lib/python3.11/site-packages/hdmf/container.py:10: DeprecationWarning: Pyarrow will become a required dependency of pandas in the next major release of pandas (pandas 3.0), (to allow more performant data types, such as the Arrow string type, and better interoperability with other libraries) but was not found to be installed on your system. If this would cause problems for you, please provide us feedback at https://github.com/pandas-dev/pandas/issues/54466
    import pandas as pd

../neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/yaml_conversion_specification/_yaml_conversion_specification.py:7
  /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/yaml_conversion_specification/_yaml_conversion_specification.py:7: DeprecationWarning: jsonschema.RefResolver is deprecated as of v4.18.0, in favor of the https://github.com/python-jsonschema/referencing library, which provides more compliant referencing behavior as well as more flexible APIs for customization. A future release will remove RefResolver. Please file a feature request (on referencing) if you are missing an API for the kind of customization you need.
    from jsonschema import RefResolver, validate

tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_baseline_mean_int_dtype_float_assertion
  /usr/lib/python3.11/site-packages/hdmf/spec/namespace.py:531: UserWarning: Ignoring cached namespace 'hdmf-common' version 1.5.1 because version 1.8.0 is already loaded.
    warn("Ignoring cached namespace '%s' version %s because version %s is already loaded."

tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_baseline_mean_int_dtype_float_assertion
  /usr/lib/python3.11/site-packages/hdmf/spec/namespace.py:531: UserWarning: Ignoring cached namespace 'core' version 2.5.0 because version 2.6.0-alpha is already loaded.
    warn("Ignoring cached namespace '%s' version %s because version %s is already loaded."

tests/test_minimal/test_testing/test_mocks/test_mock_ttl.py::TestMockTTLSignals::test_baseline_mean_int_dtype_float_assertion
  /usr/lib/python3.11/site-packages/hdmf/spec/namespace.py:531: UserWarning: Ignoring cached namespace 'hdmf-experimental' version 0.2.0 because version 0.5.0 is already loaded.
    warn("Ignoring cached namespace '%s' version %s because version %s is already loaded."

tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_external_image_series[hdf5]
tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py::test_configuration_on_external_image_series[zarr]
  /usr/lib/python3.11/site-packages/pynwb/image.py:97: DeprecationWarning: ImageSeries 'TestImageSeries': The value for 'format' has been changed to 'external'. Setting a default value for 'format' is deprecated and will be changed to raising a ValueError in the next major release.
    warnings.warn(

tests/test_ecephys/test_ecephys_interfaces.py::TestRecordingInterface::test_stub_multi_segment
tests/test_ecephys/test_ecephys_interfaces.py::TestRecordingInterface::test_stub_single_segment
tests/test_ecephys/test_mock_nidq_interface.py::TestMockSpikeGLXNIDQInterface::test_mock_run_conversion
  /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/basedatainterface.py:97: UserWarning: Using DataInterface.run_conversion without specifying nwbfile_path is deprecated. To create an NWBFile object in memory, use DataInterface.create_nwbfile. To append to an existing NWBFile object, use DataInterface.add_to_nwbfile.
    warnings.warn(  # TODO: remove on or after 6/21/2024

tests/test_ecephys/test_mock_recording_interface.py::TestMockRecordingInterface::test_interface_alignment
  /usr/lib/python3.11/site-packages/spikeinterface/core/baserecording.py:418: UserWarning: Setting times with Recording.set_times() is not recommended because times are not always propagated across preprocessingUse this carefully!
    warn(
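[editor's note] The run_conversion deprecation warning above spells out the replacement API itself. A minimal sketch of the suggested pattern, using the mock interface exercised by these tests; the keyword names are assumed from the warning text and typical neuroconv signatures, so verify them against the installed release.

    from neuroconv.tools.testing.mock_interfaces import MockRecordingInterface

    interface = MockRecordingInterface(durations=[0.1])
    metadata = interface.get_metadata()

    # Instead of the deprecated interface.run_conversion() without nwbfile_path:
    nwbfile = interface.create_nwbfile(metadata=metadata)      # build an NWBFile in memory
    interface.add_to_nwbfile(nwbfile=nwbfile, metadata=metadata)  # or append to an existing one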
tests/test_ecephys/test_tools_spikeinterface.py: 23 warnings
  /var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8-python3_11/install/usr/lib/python3.11/site-packages/neuroconv/tools/spikeinterface/spikeinterface.py:121: UserWarning: When adding ElectrodeGroup, no Devices were found on nwbfile. Creating a Device now...
    warnings.warn("When adding ElectrodeGroup, no Devices were found on nwbfile. Creating a Device now...")

tests/test_ecephys/test_tools_spikeinterface.py::TestAddElectricalSeriesWriting::test_write_with_lzf_compression
  /usr/lib/python3.11/site-packages/hdmf/backends/hdf5/h5_utils.py:595: UserWarning: lzf compression may not be available on all installations of HDF5. Use of gzip is recommended to ensure portability of the generated HDF5 files.
    warnings.warn(str(self.__iosettings['compression']) + " compression may not be available "

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/core/core_tools.py:312: ResourceWarning: unclosed file <_io.TextIOWrapper name='/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/spikeinterface_cache/tmpabgrn57b/KKBT236Z/traces_cached_seg0.raw' mode='r+' encoding='UTF-8'>
    executor.run()
  Enable tracemalloc to get traceback where the object was allocated.
  See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info.

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/core/baserecording.py:550: ResourceWarning: unclosed file <_io.TextIOWrapper name='/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/spikeinterface_cache/tmpabgrn57b/KKBT236Z/traces_cached_seg0.raw' mode='r' encoding='UTF-8'>
    cached.set_probegroup(probegroup)
  Enable tracemalloc to get traceback where the object was allocated.
  See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info.

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/core/base.py:864: ResourceWarning: unclosed file <_io.TextIOWrapper name='/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/spikeinterface_cache/tmpabgrn57b/KKBT236Z/traces_cached_seg0.raw' mode='r' encoding='UTF-8'>
    cached = self._save(folder=folder, verbose=verbose, **save_kwargs)
  Enable tracemalloc to get traceback where the object was allocated.
  See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info.

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/core/core_tools.py:312: ResourceWarning: unclosed file <_io.TextIOWrapper name='/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/spikeinterface_cache/tmpflnso311/9X3UFBPA/traces_cached_seg0.raw' mode='r+' encoding='UTF-8'>
    executor.run()
  Enable tracemalloc to get traceback where the object was allocated.
  See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info.

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/core/core_tools.py:312: ResourceWarning: unclosed file <_io.TextIOWrapper name='/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/spikeinterface_cache/tmpflnso311/9X3UFBPA/traces_cached_seg1.raw' mode='r+' encoding='UTF-8'>
    executor.run()
  Enable tracemalloc to get traceback where the object was allocated.
  See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info.

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/core/baserecording.py:550: ResourceWarning: unclosed file <_io.TextIOWrapper name='/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/spikeinterface_cache/tmpflnso311/9X3UFBPA/traces_cached_seg1.raw' mode='r' encoding='UTF-8'>
    cached.set_probegroup(probegroup)
  Enable tracemalloc to get traceback where the object was allocated.
  See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info.

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/core/baserecording.py:550: ResourceWarning: unclosed file <_io.TextIOWrapper name='/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/spikeinterface_cache/tmpflnso311/9X3UFBPA/traces_cached_seg0.raw' mode='r' encoding='UTF-8'>
    cached.set_probegroup(probegroup)
  Enable tracemalloc to get traceback where the object was allocated.
  See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info.

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/core/base.py:864: ResourceWarning: unclosed file <_io.TextIOWrapper name='/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/spikeinterface_cache/tmpflnso311/9X3UFBPA/traces_cached_seg1.raw' mode='r' encoding='UTF-8'>
    cached = self._save(folder=folder, verbose=verbose, **save_kwargs)
  Enable tracemalloc to get traceback where the object was allocated.
  See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info.

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/core/base.py:864: ResourceWarning: unclosed file <_io.TextIOWrapper name='/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/spikeinterface_cache/tmpflnso311/9X3UFBPA/traces_cached_seg0.raw' mode='r' encoding='UTF-8'>
    cached = self._save(folder=folder, verbose=verbose, **save_kwargs)
  Enable tracemalloc to get traceback where the object was allocated.
  See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info.

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/qualitymetrics/misc_metrics.py:142: UserWarning: Bin duration of 60s is larger than recording duration. Presence ratios are set to NaN.
    warnings.warn(

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/qualitymetrics/misc_metrics.py:842: UserWarning: Units [0, 1, 2, 3] have too few spikes and amplitude_cutoff is set to NaN
    warnings.warn(f"Units {nan_units} have too few spikes and " "amplitude_cutoff is set to NaN")

tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
tests/test_ecephys/test_tools_spikeinterface.py::TestWriteWaveforms::test_group_name_property
  /usr/lib/python3.11/site-packages/spikeinterface/qualitymetrics/misc_metrics.py:696: UserWarning:
    warnings.warn("")

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
========================================================= short test summary info ==========================================================
SKIPPED [2] tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_get_default_dataset_io_configurations.py:235: The extra testing package 'ndx-events' is not installed!
SKIPPED [1] tests/test_ecephys/test_ecephys_interfaces.py:53: Only testing with Python 3.10!
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[hdf5-unwrapped-<lambda>-iterator_options0] - TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[hdf5-generic-SliceableDataChunkIterator-iterator_options1] - TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[hdf5-classic-DataChunkIterator-iterator_options2] - TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[zarr-unwrapped-<lambda>-iterator_options0] - TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[zarr-generic-SliceableDataChunkIterator-iterator_options1] - TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_time_series[zarr-classic-DataChunkIterator-iterator_options2] - TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_dynamic_table[hdf5] - AttributeError: 'VectorData' object has no attribute 'set_data_io'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_simple_dynamic_table[zarr] - AttributeError: 'VectorData' object has no attribute 'set_data_io'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[hdf5-unwrapped-<lambda>-data_iterator_options0-timestamps_iterator_options0] - TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[hdf5-generic-SliceableDataChunkIterator-data_iterator_options1-timestamps_iterator_options1] - TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[hdf5-classic-DataChunkIterator-data_iterator_options2-timestamps_iterator_options2] - TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[zarr-unwrapped-<lambda>-data_iterator_options0-timestamps_iterator_options0] - TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[zarr-generic-SliceableDataChunkIterator-data_iterator_options1-timestamps_iterator_options1] - TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_defaults.py::test_time_series_timestamps_linkage[zarr-classic-DataChunkIterator-data_iterator_options2-timestamps_iterator_options2] - TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[hdf5-unwrapped-<lambda>-iterator_options0] - TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[hdf5-generic-SliceableDataChunkIterator-iterator_options1] - TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[hdf5-classic-DataChunkIterator-iterator_options2] - TypeError: H5DataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[zarr-unwrapped-<lambda>-iterator_options0] - TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[zarr-generic-SliceableDataChunkIterator-iterator_options1] - TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_time_series_override[zarr-classic-DataChunkIterator-iterator_options2] - TypeError: ZarrDataIO.__init__: unrecognized argument: 'data_io_kwargs'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_dynamic_table_override[hdf5] - AttributeError: 'VectorData' object has no attribute 'set_data_io'
FAILED tests/test_minimal/test_tools/test_backend_and_dataset_configuration/test_helpers/test_configure_backend_overrides.py::test_simple_dynamic_table_override[zarr] - AttributeError: 'VectorData' object has no attribute 'set_data_io'
FAILED tests/test_ecephys/test_mock_recording_interface.py::TestMockRecordingInterface::test_conversion_as_lone_interface - AttributeError: 'NoiseGeneratorRecording' object has no attribute 'rename_channels'
========================================= 23 failed, 244 passed, 3 skipped, 50 warnings in 32.42s ==========================================
 * ERROR: sci-biology/neuroconv-0.4.8::science failed (test phase):
 *   pytest failed with python3.11
 *
 * Call stack:
 *     ebuild.sh, line 136:  Called src_test
 *   environment, line 3902:  Called distutils-r1_src_test
 *   environment, line 1928:  Called _distutils-r1_run_foreach_impl 'python_test'
 *   environment, line  722:  Called python_foreach_impl 'distutils-r1_run_phase' 'python_test'
 *   environment, line 3505:  Called multibuild_foreach_variant '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
 *   environment, line 3062:  Called _multibuild_run '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
 *   environment, line 3060:  Called _python_multibuild_wrapper 'distutils-r1_run_phase' 'python_test'
 *   environment, line 1154:  Called distutils-r1_run_phase 'python_test'
 *   environment, line 1851:  Called python_test
 *   environment, line 3789:  Called epytest 'tests/test_minimal' 'tests/test_ecephys'
 *   environment, line 2484:  Called die
 * The specific snippet of code:
 *       "${@}" || die -n "pytest failed with ${EPYTHON}";
 *
 * If you need support, post the output of `emerge --info '=sci-biology/neuroconv-0.4.8::science'`,
 * the complete build log and the output of `emerge -pqv '=sci-biology/neuroconv-0.4.8::science'`.
 * The complete build log is located at '/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/build.log'.
 * The ebuild environment file is located at '/var/tmp/portage/sci-biology/neuroconv-0.4.8/temp/environment'.
 * Working directory: '/var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8'
 * S: '/var/tmp/portage/sci-biology/neuroconv-0.4.8/work/neuroconv-0.4.8'