You can save the epoch number in a separate file (pickle or JSON).
import json

train_parameters = {'iter': iteration, 'batch_size': batch_size}

# saving
with open(output_path + "trainParameters.txt", 'w') as f:
    json.dump(train_parameters, f)

# loading
with open(path_to_saved_model + "trainParameters.txt") as f:
    train_parameters = json.load(f)
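Here is a self-contained round trip of the same idea, with hypothetical stand-in values for `iteration`, `batch_size`, and the paths (only the standard library and a temporary directory are assumed):

```python
import json
import os
import tempfile

# Hypothetical values standing in for the real training state
iteration = 1200      # assumed: number of batches processed so far
batch_size = 32
output_path = tempfile.mkdtemp() + os.sep  # assumed output directory

train_parameters = {'iter': iteration, 'batch_size': batch_size}

# saving
with open(output_path + "trainParameters.txt", 'w') as f:
    json.dump(train_parameters, f)

# loading (in real use, path_to_saved_model points at your checkpoint directory)
path_to_saved_model = output_path
with open(path_to_saved_model + "trainParameters.txt") as f:
    restored = json.load(f)

print(restored)  # the saved iteration and batch size come back unchanged
```

JSON keeps the integers intact across the round trip, so the restored dictionary can be used directly to resume the epoch/iteration counters.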
But if you also need to restore the learning-rate schedule, save the optimizer state as well. The state contains the iteration number (the number of batches processed). When restoring with load_weights, run one dummy training step first so that the model and optimizer variables are created before the saved values are loaded:

# dummy batch; renamed so it does not shadow the built-in `input`
dummy_input = tf.random.uniform([8, 24], 0, 100, dtype=tf.int32)
model.compile(optimizer=optimizer, loss=training_loss, metrics=evaluation_accuracy)
# one dummy step builds the model and optimizer variables
hist = model.fit((dummy_input, dummy_input), dummy_input, epochs=1,
                 steps_per_epoch=1, verbose=0)
model.load_weights(path_to_saved_model + 'saved.h5')