CheckpointIO¶
- class lightning.fabric.plugins.io.checkpoint_io.CheckpointIO[source]¶
Bases: ABC

Interface to save/load checkpoints as they are saved through the Strategy.

Warning
This is an experimental feature.
Typically, most plugins use the Torch-based IO plugin TorchCheckpointIO, but some may require particular handling depending on the plugin.

In addition, you can pass a custom CheckpointIO by extending this class and passing it to the Trainer, i.e. Trainer(plugins=[MyCustomCheckpointIO()]).

Note
For some plugins, it is not possible to use a custom checkpoint plugin as checkpointing logic is not modifiable.
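As a rough sketch of what extending this class looks like, the hypothetical plugin below implements the three core methods (save_checkpoint, load_checkpoint, remove_checkpoint) using stdlib pickle so the example stays self-contained; a real plugin would subclass lightning.fabric.plugins.io.checkpoint_io.CheckpointIO and typically delegate to torch.save/torch.load, as TorchCheckpointIO does. The minimal base class shown here is a stand-in, not the actual Lightning class:

```python
import pickle
from abc import ABC, abstractmethod
from pathlib import Path
from typing import Any, Dict, Optional


class CheckpointIO(ABC):
    """Stand-in mirroring the shape of the Lightning interface (sketch only)."""

    @abstractmethod
    def save_checkpoint(self, checkpoint: Dict[str, Any], path: str) -> None:
        ...

    @abstractmethod
    def load_checkpoint(
        self,
        path: str,
        map_location: Optional[Any] = None,
        weights_only: Optional[bool] = None,
    ) -> Dict[str, Any]:
        ...

    @abstractmethod
    def remove_checkpoint(self, path: str) -> None:
        ...


class PickleCheckpointIO(CheckpointIO):
    """Hypothetical custom plugin that persists checkpoints with pickle."""

    def save_checkpoint(self, checkpoint: Dict[str, Any], path: str) -> None:
        # Serialize the whole checkpoint dict to disk.
        Path(path).write_bytes(pickle.dumps(checkpoint))

    def load_checkpoint(
        self,
        path: str,
        map_location: Optional[Any] = None,
        weights_only: Optional[bool] = None,
    ) -> Dict[str, Any]:
        # map_location/weights_only are ignored in this sketch; torch.load
        # would honor them in a real implementation.
        return pickle.loads(Path(path).read_bytes())

    def remove_checkpoint(self, path: str) -> None:
        # Delete the checkpoint file, tolerating a missing file.
        Path(path).unlink(missing_ok=True)
```

With the real library installed, such a plugin would be activated via Trainer(plugins=[PickleCheckpointIO()]).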
- abstract load_checkpoint(path, map_location=None, weights_only=None)[source]¶
Load a checkpoint from a path when resuming, or when loading a checkpoint for the test/validate/predict stages.
- Parameters:
- path¶ – Path to the checkpoint.
- map_location¶ (Optional[Any]) – a function, torch.device, string, or a dict specifying how to remap storage locations.
- weights_only¶ (Optional[bool]) – Defaults to None. If True, restricts loading to state_dicts of plain torch.Tensor and other primitive types. If loading a checkpoint from a trusted source that contains an nn.Module, use weights_only=False. If loading a checkpoint from an untrusted source, we recommend using weights_only=True. For more information, please refer to the PyTorch Developer Notes on Serialization Semantics.
- Returns:
The loaded checkpoint.
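To illustrate how the map_location variants described above are interpreted, the hypothetical helper below mimics the dispatch that torch.load performs: a string (or torch.device) names the target device directly, a dict remaps source locations to targets, and a callable gets full control. The function and checkpoint layout are illustrative assumptions, not part of the Lightning or PyTorch API:

```python
from typing import Any, Dict, Optional


def remap_location(checkpoint: Dict[str, Any],
                   map_location: Optional[Any] = None) -> Dict[str, Any]:
    """Sketch of map_location dispatch on a toy checkpoint dict.

    The real remapping of tensor storages is done inside torch.load; this
    only shows how each accepted type of map_location is interpreted.
    """
    location = checkpoint.get("location", "cuda:0")
    if map_location is None:
        # No remapping requested: leave storages where they were saved.
        return checkpoint
    if callable(map_location):
        # A function receives the saved location and returns the new one.
        checkpoint["location"] = map_location(location)
    elif isinstance(map_location, dict):
        # A dict maps saved locations to targets; unmapped locations stay put.
        checkpoint["location"] = map_location.get(location, location)
    else:
        # A string or torch.device names the target device directly.
        checkpoint["location"] = str(map_location)
    return checkpoint
```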