pandas.read_pickle(filepath_or_buffer, compression='infer', storage_options=None)

Load pickled pandas object (or any object) from file.


Warning: Loading pickled data received from untrusted sources can be unsafe, because unpickling can execute arbitrary code. Only read pickle files you trust.

Parameters

filepath_or_buffer : str, path object or file-like object

File path, URL, or buffer where the pickled object will be loaded from.

Changed in version 1.0.0: Accepts URLs. The URL is not limited to S3 and GCS.

compression : {‘infer’, ‘gzip’, ‘bz2’, ‘zip’, ‘xz’, None}, default ‘infer’

If ‘infer’ and ‘filepath_or_buffer’ is path-like, detect compression from the following extensions: ‘.gz’, ‘.bz2’, ‘.zip’, or ‘.xz’ (otherwise no compression). If ‘infer’ and ‘filepath_or_buffer’ is not path-like, use None (no decompression).
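Compression inference can be sketched with a quick round trip; the file name below is an arbitrary choice whose ‘.gz’ suffix makes both to_pickle and read_pickle select gzip:

```python
import os
import pandas as pd

df = pd.DataFrame({"foo": range(3)})
df.to_pickle("demo_compressed.pkl.gz")   # gzip inferred from the ".gz" suffix
roundtrip = pd.read_pickle("demo_compressed.pkl.gz")  # compression="infer" is the default
assert roundtrip.equals(df)
os.remove("demo_compressed.pkl.gz")
```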

storage_options : dict, optional

Extra options that make sense for a particular storage connection, e.g. host, port, username, password, etc., if using a URL that will be parsed by fsspec, e.g., starting “s3://”, “gcs://”. An error will be raised if providing this argument with a non-fsspec URL. See the fsspec and backend storage implementation docs for the set of allowed keys and values.

New in version 1.2.0.
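The error path for non-fsspec paths can be sketched locally, without any remote storage; the file name and the "anon" key below are arbitrary choices for illustration:

```python
import os
import pandas as pd

pd.DataFrame({"foo": [1]}).to_pickle("demo_local.pkl")
try:
    # storage_options only applies to URLs parsed by fsspec (e.g. "s3://..."),
    # so pandas rejects it for a plain local path.
    pd.read_pickle("demo_local.pkl", storage_options={"anon": True})
    caught = ""
except ValueError as err:
    caught = str(err)
finally:
    os.remove("demo_local.pkl")
```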

Returns

unpickled : same type as object stored in file
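The return value takes whatever type was pickled, not always a DataFrame; a small sketch with a Series (file name chosen arbitrarily):

```python
import os
import pandas as pd

ser = pd.Series([10, 20, 30], name="vals")
ser.to_pickle("demo_series.pkl")
obj = pd.read_pickle("demo_series.pkl")
assert type(obj) is pd.Series  # same type as the object stored in the file
os.remove("demo_series.pkl")
```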

See also


DataFrame.to_pickle
Pickle (serialize) DataFrame object to file.


Series.to_pickle
Pickle (serialize) Series object to file.


read_hdf
Read HDF5 file into a DataFrame.


read_sql
Read SQL query or database table into a DataFrame.


read_parquet
Load a parquet object, returning a DataFrame.


Notes

read_pickle is only guaranteed to be backwards compatible to pandas 0.20.3.

Examples
>>> original_df = pd.DataFrame({"foo": range(5), "bar": range(5, 10)})
>>> original_df
   foo  bar
0    0    5
1    1    6
2    2    7
3    3    8
4    4    9
>>> pd.to_pickle(original_df, "./dummy.pkl")
>>> unpickled_df = pd.read_pickle("./dummy.pkl")
>>> unpickled_df
   foo  bar
0    0    5
1    1    6
2    2    7
3    3    8
4    4    9
>>> import os
>>> os.remove("./dummy.pkl")