5 Best Ways to Manually Save Keras Model Weights Using Python


💡 Problem Formulation: In machine learning, after training a neural network, it is crucial to be able to save and later retrieve the model's learned weights. For Keras models in Python, this means manually writing the weights to a file to avoid costly retraining, make the model portable, and enable further analysis. The input is a trained Keras model; the desired output is a saved file containing the model's weights.

Method 1: Save Weights to HDF5 File

One of the standard and most widely used ways to save Keras model weights is storing them in a Hierarchical Data Format version 5 (HDF5) file. The model.save_weights() function takes a filename, and using the '.h5' extension stores the weights in HDF5 format. This method is efficient, and the resulting file is compact and easy to transfer.

Here’s an example:

from keras.models import Sequential
from keras.layers import Dense

# Create a model example
model = Sequential([
    Dense(units=10, input_shape=(10,))
])

# Assuming the model is trained here

# Save the weights
model.save_weights('my_model_weights.h5')

Output: A file named 'my_model_weights.h5' is created containing the weights of the model.

This code snippet creates a simple Sequential model with one dense layer, assumes it has been trained, and then uses the model.save_weights() method to write the weights to an HDF5 file. The HDF5 format stores large numerical arrays compactly and efficiently.
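To confirm a weights file is usable, you can restore it into a fresh model with the same architecture. Here is a minimal round-trip sketch (using the tf.keras namespace and a hypothetical demo.weights.h5 filename; the .weights.h5 suffix is required by Keras 3 and also accepted by older tf.keras):

```python
import numpy as np
import tensorflow as tf

# Build a small model and save its weights (as in the snippet above)
model = tf.keras.Sequential([tf.keras.layers.Input(shape=(10,)),
                             tf.keras.layers.Dense(10)])
model.save_weights('demo.weights.h5')

# load_weights() only works on a model with a matching architecture
restored = tf.keras.Sequential([tf.keras.layers.Input(shape=(10,)),
                                tf.keras.layers.Dense(10)])
restored.load_weights('demo.weights.h5')

# The restored weights match the originals exactly
for a, b in zip(model.get_weights(), restored.get_weights()):
    assert np.array_equal(a, b)
```

Because only the weights are stored, the architecture must be rebuilt in code before loading.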

Method 2: Save Weights to a TensorFlow SavedModel

Another approach is the TensorFlow SavedModel format, which saves not only the weights but also the entire model architecture, optimized for TensorFlow Serving. Call the model.save() method with the desired directory path and set save_format to 'tf'. (Note that Keras 3 removed the save_format argument; SavedModel export is done with model.export() instead.)

Here’s an example:

# Create a model and train it, as illustrated in Method 1.

# Save the model
model.save('my_saved_model', save_format='tf')

Output: A directory named 'my_saved_model' is created, which includes the model architecture and its weights.

By calling model.save(), the entire model structure along with the weights is saved in the TensorFlow SavedModel format. The 'tf' save format creates a directory of sub-files that together describe the complete model, not just the weights, making it a comprehensive way to save a model.
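The payoff of saving the whole model is that it can be reloaded without any of the original code. A minimal sketch (the try/except is there because Keras 3 removed the save_format argument in favor of the native .keras format; file names are hypothetical):

```python
import numpy as np
import tensorflow as tf

# Build a tiny model to save and reload
model = tf.keras.Sequential([tf.keras.layers.Input(shape=(3,)),
                             tf.keras.layers.Dense(4)])

try:
    # Keras 2 / older tf.keras: writes a SavedModel directory
    model.save('my_saved_model', save_format='tf')
    restored = tf.keras.models.load_model('my_saved_model')
except (TypeError, ValueError):
    # Keras 3: save_format is gone; use the native .keras file instead
    model.save('my_saved_model.keras')
    restored = tf.keras.models.load_model('my_saved_model.keras')

# The reloaded model reproduces the original's outputs
x = np.ones((1, 3), dtype='float32')
assert np.allclose(np.asarray(model(x)), np.asarray(restored(x)))
```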

Method 3: Save Weights as a Checkpoint During Training

During training, you can also save the model weights automatically at set intervals or whenever a monitored metric improves, using the ModelCheckpoint callback. It periodically saves the model during the training process and can be configured to save only the weights via the save_weights_only=True argument.

Here’s an example:

from keras.callbacks import ModelCheckpoint

# Create and train model as before

# Save the model weights after every epoch
checkpoint = ModelCheckpoint('weights.{epoch:02d}-{val_loss:.2f}.hdf5', save_weights_only=True)
model.fit(X_train, Y_train, validation_data=(X_val, Y_val), callbacks=[checkpoint])

Output: Multiple files named 'weights.xx-yy.hdf5', where 'xx' is the epoch number and 'yy' is the validation loss, are saved during training.

This code uses a Keras callback to save the model's weights at the end of every epoch, so the latest weights are always on disk without manual saving after each training iteration. This is especially useful for long training sessions or when searching for the best-performing epoch.
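If you only care about the best model rather than one file per epoch, ModelCheckpoint can overwrite a single file whenever the monitored metric improves. A runnable sketch with synthetic data (the shapes and file name are arbitrary; the .weights.h5 suffix is what Keras 3 expects with save_weights_only=True):

```python
import numpy as np
import tensorflow as tf

# Tiny model plus synthetic data, just to make fit() runnable
model = tf.keras.Sequential([tf.keras.layers.Input(shape=(4,)),
                             tf.keras.layers.Dense(1)])
model.compile(optimizer='sgd', loss='mse')
X_train, Y_train = np.random.rand(32, 4), np.random.rand(32, 1)
X_val, Y_val = np.random.rand(8, 4), np.random.rand(8, 1)

checkpoint = tf.keras.callbacks.ModelCheckpoint(
    'best.weights.h5',
    monitor='val_loss',       # metric to track
    save_best_only=True,      # overwrite only on improvement
    save_weights_only=True)   # weights, not the whole model

model.fit(X_train, Y_train, validation_data=(X_val, Y_val),
          epochs=3, verbose=0, callbacks=[checkpoint])
```

With save_best_only=True, the file is written on the first epoch and then only replaced when val_loss improves.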

Method 4: Serialize Architecture to JSON or YAML, Weights Separately

The model's architecture (not its weights) can be saved in a human-readable format using model.to_json() or model.to_yaml(), while the weights are written to a separate file, typically in HDF5. Note that to_yaml() was removed in newer releases of TensorFlow/Keras, so JSON is the safer choice.

Here’s an example:

# Create and train model as before

# Serialize model architecture to JSON
model_json = model.to_json()

# Write the model architecture file
# (to_json() already returns a JSON string, so write it directly)
with open("model_architecture.json", "w") as json_file:
    json_file.write(model_json)

# Save weights to a separate file
model.save_weights('model_weights.h5')

Output: Two files are created: 'model_architecture.json' containing the model's configuration, and 'model_weights.h5' containing the model's weights.

This example demonstrates saving the model architecture and weights separately. The architecture is written as human-readable JSON, while the weights are stored in the binary HDF5 format; together, the two files allow full model reconstruction without the original code.
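Reconstruction is the mirror image of saving: rebuild the architecture from the JSON string with model_from_json(), then load the weights into it. A self-contained sketch (file names are hypothetical; the .weights.h5 suffix keeps it compatible with Keras 3):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import model_from_json

# Build a model and save architecture and weights separately
model = tf.keras.Sequential([tf.keras.layers.Input(shape=(5,)),
                             tf.keras.layers.Dense(2)])
with open('model_architecture.json', 'w') as f:
    f.write(model.to_json())      # to_json() already returns a string
model.save_weights('model.weights.h5')

# Reconstruct: architecture from JSON, then weights from HDF5
with open('model_architecture.json') as f:
    restored = model_from_json(f.read())
restored.load_weights('model.weights.h5')

# The reconstructed model carries the exact same weights
for a, b in zip(model.get_weights(), restored.get_weights()):
    assert np.array_equal(a, b)
```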

Bonus One-Liner Method 5: Use get_weights() and np.save()

You can retrieve all model weights as a list of numpy arrays with the model's get_weights() method and then save that list with numpy's np.save() function.

Here’s an example:

import numpy as np

# Grab the weights as a list of numpy arrays
weights = model.get_weights()

# Save the weights using numpy; the per-layer arrays have different
# shapes, so wrap them in an object array (load with allow_pickle=True)
np.save('my_model_weights.npy', np.array(weights, dtype=object))

Output: A file named 'my_model_weights.npy' containing the weights as numpy arrays.

This one-liner offers a quick, flexible way to pull the model's weights out as a list of numpy arrays and dump them to disk. It is particularly useful for quick experiments or when you need to process the weights with other numpy-based tools.
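Loading is symmetric: np.load() needs allow_pickle=True because the file holds an object array, and set_weights() pushes the arrays back into a model with the same architecture. A sketch (model and file name are illustrative):

```python
import numpy as np
import tensorflow as tf

# A small model whose weights we round-trip through a .npy file
model = tf.keras.Sequential([tf.keras.layers.Input(shape=(6,)),
                             tf.keras.layers.Dense(3)])
weights = model.get_weights()

# The per-layer arrays have different shapes, so wrap them in an
# object array before saving
np.save('my_model_weights.npy', np.array(weights, dtype=object))

# Load back (allow_pickle=True is required for object arrays) and
# restore into the model
loaded = list(np.load('my_model_weights.npy', allow_pickle=True))
model.set_weights(loaded)
```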


Summary

  • Method 1: HDF5 Format. Best for interoperability with other HPC tools and long-term storage. Requires additional libraries to interpret the file format.
  • Method 2: TensorFlow SavedModel. Best for end-to-end TensorFlow workflows. Saves entire model, not just weights. Larger file sizes compared to separate weights.
  • Method 3: Checkpoints during Training. Ideal for long training processes and enables recovery from interruptions. May result in a large number of weight files, requiring management.
  • Method 4: JSON/YAML Serialization. Human-readable format for model architecture. Requires separate weights saving. Useful for understanding and documenting the model structure.
  • Method 5: Numpy One-Liner. Quick and easy, perfect for fast saving and loading in a Python-centric workflow. May not capture all the model metadata necessary for reconstruction.