pandas.HDFStore.append
- HDFStore.append(key, value, format=None, axes=None, index=True, append=True, complib=None, complevel=None, columns=None, min_itemsize=None, nan_rep=None, chunksize=None, expectedrows=None, dropna=None, data_columns=None, encoding=None, errors='strict')
Append to Table in file.
Node must already exist and be Table format.
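A brief illustrative sketch of this requirement, assuming PyTables is installed and using hypothetical key and file names: appending succeeds against a node written in table format, while the default fixed format cannot be appended to.

>>> import pandas as pd
>>> df = pd.DataFrame({"A": [1, 2]})
>>> with pd.HDFStore("format_demo.h5", mode="w") as store:
...     store.put("fixed_df", df)                  # default 'fixed' format
...     store.put("table_df", df, format="table")  # table format
...     store.append("table_df", df)               # works: node is a Table
...     # store.append("fixed_df", df) would raise an error, because the
...     # existing node is not in table format.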
- Parameters
- key : str
- value : {Series, DataFrame}
- format : 'table' is the default
Format to use when storing object in HDFStore. Value can be one of:
'table'
Table format. Write as a PyTables Table structure, which may perform worse but allows more flexible operations such as searching / selecting subsets of the data.
- index : bool, default True
Write DataFrame index as a column.
- append : bool, default True
Append the input data to the existing data in the store.
- data_columns : list of columns, or True, default None
List of columns to create as indexed data columns for on-disk queries, or True to use all columns. By default only the axes of the object are indexed. See the user guide section on querying via data columns; a usage sketch follows this parameter list.
- min_itemsize : dict of columns that specify minimum str sizes
- nan_rep : str to use as str nan representation
- chunksize : size to chunk the writing
- expectedrows : expected TOTAL row size of this table
- encoding : default None, provide an encoding for str
- dropna : bool, default False, optional
Do not write an ALL nan row to the store; settable by the option ‘io.hdf.dropna_table’.
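A minimal usage sketch (the file and key names are illustrative, and PyTables is assumed to be installed) showing a first append that declares a data column and reserves string space, a second append to the same table, and an on-disk selection using that data column:

>>> import pandas as pd
>>> df1 = pd.DataFrame({"A": [1, 2], "B": ["x", "y"]})
>>> df2 = pd.DataFrame({"A": [3, 4], "B": ["z", "w"]})
>>> with pd.HDFStore("demo_store.h5", mode="w") as store:
...     # First append creates the table; data_columns makes "B" queryable
...     # on disk, and min_itemsize reserves room for longer strings later.
...     store.append("df", df1, data_columns=["B"], min_itemsize={"B": 10})
...     # Subsequent appends must match the existing table layout.
...     store.append("df", df2)
...     # Data columns can be used in the where clause of select().
...     subset = store.select("df", where='B == "z"')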
Notes
Does not check if data being appended overlaps with existing data in the table, so be careful.
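As a hedged illustration of this caveat (the file name is hypothetical), appending the same frame twice simply stores the rows twice; no error or deduplication occurs:

>>> import pandas as pd
>>> df = pd.DataFrame({"A": [1, 2]}, index=[0, 1])
>>> with pd.HDFStore("overlap_demo.h5", mode="w") as store:
...     store.append("df", df)
...     store.append("df", df)    # identical rows; no overlap check is performed
...     dup = store.select("df")  # four rows, index values 0 and 1 each appear twice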