pandas.read_hdf
Read from the store, closing it if we opened it.
Retrieve a pandas object stored in the file, optionally based on where criteria.
Warning
Pandas uses PyTables for reading and writing HDF5 files, which allows serializing object-dtype data with pickle when using the “fixed” format. Loading pickled data received from untrusted sources can be unsafe.
See: https://docs.python.org/3/library/pickle.html for more.
Parameters
path_or_buf : str, path object, pandas.HDFStore or file-like object
Any valid string path is acceptable. The string could be a URL. Valid URL schemes include http, ftp, s3, and file. For file URLs, a host is expected. A local file could be: file://localhost/path/to/table.h5.
If you want to pass in a path object, pandas accepts any os.PathLike.
Alternatively, pandas accepts an open pandas.HDFStore object (see the second sketch in Examples below).
By file-like object, we refer to objects with a read() method, such as a file handle (e.g. via builtin open function) or StringIO.
key : object, optional
The group identifier in the store. Can be omitted if the HDF file contains a single pandas object.
mode : {'r', 'r+', 'a'}, default 'r'
Mode to use when opening the file. Ignored if path_or_buf is a pandas.HDFStore. Default is 'r'.
errors : str, default 'strict'
Specifies how encoding and decoding errors are to be handled. See the errors argument for open() for a full list of options.
where : list, optional
A list of Term (or convertible) objects; see the sketch after this parameter list.
start : int, optional
Row number to start selection.
stop : int, optional
Row number to stop selection.
columns : list, optional
A list of column names to return.
iterator : bool, optional
Return an iterator object.
chunksize : int, optional
Number of rows to include in an iteration when using an iterator.
**kwargs
Additional keyword arguments passed to HDFStore.
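For illustration, a minimal sketch of how the selection arguments fit together. The file name store.h5, the key data, and the small frame are placeholders; where and columns only apply to objects written in the PyTables 'table' format, and passing data_columns=True at write time makes individual columns usable in where expressions.
>>> import pandas as pd
>>> df = pd.DataFrame({'x': range(5), 'y': list('abcde')})
>>> df.to_hdf('store.h5', key='data', format='table', data_columns=True)
>>> subset = pd.read_hdf('store.h5', key='data', where='x > 2', columns=['y'])
>>> first_rows = pd.read_hdf('store.h5', key='data', start=0, stop=3)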
Returns
object
The selected object. Return type depends on the object stored.
See also
DataFrame.to_hdf
Write an HDF file from a DataFrame.
HDFStore
Low-level access to HDF files.
Examples
>>> df = pd.DataFrame([[1, 1.0, 'a']], columns=['x', 'y', 'z'])
>>> df.to_hdf('./store.h5', 'data')
>>> reread = pd.read_hdf('./store.h5')
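Beyond this basic round trip, a hedged sketch of two less common call patterns. The file name chunked.h5 and the frame below are placeholders; chunked reads via chunksize require the object to be stored in 'table' format, and an already-open pandas.HDFStore can be passed in place of a path, in which case read_hdf leaves closing it to the caller.
>>> import pandas as pd
>>> df = pd.DataFrame({'x': range(10)})
>>> df.to_hdf('chunked.h5', key='data', format='table')
>>> chunks = list(pd.read_hdf('chunked.h5', key='data', chunksize=4))
>>> with pd.HDFStore('chunked.h5', mode='r') as store:
...     same = pd.read_hdf(store, 'data')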