@Begoodpy, I suggest you combine the two models into a single one and train it as you usually would:
```python
supermodel = keras.Sequential([model1, model2])
```

Note that `Sequential` takes the model instances themselves, not the result of calling them.
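Here is a minimal end-to-end sketch of that approach. The two `Dense` sub-models and the dummy data are hypothetical stand-ins; the point is that any two models with compatible output/input shapes can be chained and trained as one:

```python
import numpy as np
from tensorflow import keras

# Hypothetical stand-ins for model1 and model2: model1's output shape
# must match model2's expected input shape.
model1 = keras.Sequential([keras.layers.Dense(16, activation="relu", input_shape=(8,))])
model2 = keras.Sequential([keras.layers.Dense(1)])

# Chain the already-built models into a single trainable model.
supermodel = keras.Sequential([model1, model2])
supermodel.compile(optimizer="adam", loss="mse")

# Train end-to-end on dummy data; gradients flow through both sub-models.
x = np.random.rand(32, 8).astype("float32")
y = np.random.rand(32, 1).astype("float32")
supermodel.fit(x, y, epochs=1, verbose=0)
```

Because `supermodel` owns both sub-models, a single `fit` call updates the weights of `model1` and `model2` together.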
If you need more control over the models, try this:
```python
all_vars = model1.trainable_variables + model2.trainable_variables
grads = tape.gradient(loss_value2, all_vars)
optimizer.apply_gradients(zip(grads, all_vars))
```
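For context, here is one way that snippet fits into a full custom training step with `tf.GradientTape`. The models, optimizer, loss, and data below are illustrative assumptions; only the last three lines come from the snippet above:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Hypothetical two-stage setup: model1 feeds model2, and both are
# updated from a single loss computed on model2's output.
model1 = keras.Sequential([keras.layers.Dense(16, activation="relu")])
model2 = keras.Sequential([keras.layers.Dense(1)])
optimizer = keras.optimizers.Adam()
loss_fn = keras.losses.MeanSquaredError()

x = np.random.rand(32, 8).astype("float32")
y = np.random.rand(32, 1).astype("float32")

# Record the forward pass through both models on the tape.
with tf.GradientTape() as tape:
    hidden = model1(x, training=True)
    preds = model2(hidden, training=True)
    loss_value2 = loss_fn(y, preds)

# One gradient step over the concatenated variable lists of both models.
all_vars = model1.trainable_variables + model2.trainable_variables
grads = tape.gradient(loss_value2, all_vars)
optimizer.apply_gradients(zip(grads, all_vars))
```

Concatenating the two `trainable_variables` lists lets a single `apply_gradients` call update both models, while still leaving you free to, say, weight or clip the gradients per model.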