HDFStore.append(self, key: str, value: FrameOrSeries, format=None, axes=None, index=True, append=True, complib=None, complevel: Union[int, NoneType] = None, columns=None, min_itemsize: Union[int, Dict[str, int], NoneType] = None, nan_rep=None, chunksize=None, expectedrows=None, dropna: Union[bool, NoneType] = None, data_columns: Union[List[str], NoneType] = None, encoding=None, errors: str = 'strict')

Append to Table in file. Node must already exist and be Table format.

value : {Series, DataFrame}

format : 'table' is the default

    'table' (or 't') : table format. Write as a PyTables Table structure, which may perform worse but allows more flexible operations like searching / selecting subsets of the data.
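A minimal sketch of the basic workflow: appending two frames to the same table-format node and reading them back. It assumes pandas and PyTables (the `tables` package) are installed; the store path is illustrative.

```python
# Append DataFrames to a table-format node, then read them back.
# Assumes pandas + PyTables are installed; the path is illustrative.
import os
import tempfile

import pandas as pd

path = os.path.join(tempfile.mkdtemp(), "store.h5")
df1 = pd.DataFrame({"a": [1, 2], "b": [3.0, 4.0]})
df2 = pd.DataFrame({"a": [5, 6], "b": [7.0, 8.0]})

with pd.HDFStore(path) as store:
    store.append("df", df1)      # first append creates the node in table format
    store.append("df", df2)      # subsequent appends add rows to the same table
    result = store.select("df")  # table format supports select / querying

print(len(result))  # 4 rows: df1 followed by df2
```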

append : bool, default True

    Append the input data to the existing table.

data_columns : list of columns, or True, default None

    List of columns to create as indexed data columns for on-disk queries, or True to use all columns. By default only the axes of the object are indexed. See the pandas user guide section on query via data columns.
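A sketch of what data columns enable: a column passed via `data_columns` can be used in a `where` clause when selecting from disk. Assumes pandas and PyTables are installed; the data and path are illustrative.

```python
# Declare `city` as an indexed data column so it can be queried on disk.
# Assumes pandas + PyTables are installed; data and path are illustrative.
import os
import tempfile

import pandas as pd

path = os.path.join(tempfile.mkdtemp(), "store.h5")
df = pd.DataFrame({"city": ["NY", "LA", "NY"], "val": [1, 2, 3]})

with pd.HDFStore(path) as store:
    store.append("df", df, data_columns=["city"])
    # `city` is an indexed data column, so it may appear in a where clause
    ny = store.select("df", where="city == 'NY'")

print(ny["val"].tolist())  # [1, 3]
```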

min_itemsize : dict of columns that specify minimum string sizes

nan_rep : string to use as the string representation of NaN

chunksize : size to chunk the writing

expectedrows : expected TOTAL row size of this table

encoding : default None, provide an encoding for strings
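A sketch of why `min_itemsize` matters: the string column width is fixed by the first append, so reserving extra width lets later, longer strings still fit. Assumes pandas and PyTables are installed; the values are illustrative.

```python
# Reserve string width up front with min_itemsize so a later append with
# longer strings still fits. Assumes pandas + PyTables; values illustrative.
import os
import tempfile

import pandas as pd

path = os.path.join(tempfile.mkdtemp(), "store.h5")
short = pd.DataFrame({"name": ["ab"]})
longer = pd.DataFrame({"name": ["abcdefghij"]})

with pd.HDFStore(path) as store:
    # Without min_itemsize the column width would be fixed at 2 characters
    # by the first append, and the 10-character string below would raise.
    store.append("df", short, min_itemsize={"name": 20})
    store.append("df", longer)
    out = store.select("df")

print(out["name"].tolist())  # ['ab', 'abcdefghij']
```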
dropna : bool, default False

    Do not write an ALL nan row to the store, settable by the option 'io.hdf.dropna_table'.


Does not check if the data being appended overlaps with existing data in the table, so be careful.
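The caution above can be demonstrated directly: appending the same frame twice simply duplicates the rows, including their index values. Assumes pandas and PyTables are installed; the path is illustrative.

```python
# append() performs no overlap check: appending the same frame twice
# duplicates the rows. Assumes pandas + PyTables; path is illustrative.
import os
import tempfile

import pandas as pd

path = os.path.join(tempfile.mkdtemp(), "store.h5")
df = pd.DataFrame({"a": [1, 2]})

with pd.HDFStore(path) as store:
    store.append("df", df)
    store.append("df", df)  # no de-duplication happens
    out = store.select("df")

print(list(out.index))  # [0, 1, 0, 1] -- duplicated index values
```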