Sean Owen takes us through a few techniques for speeding up neural network model training:
Step #2: Use Early Stopping
Keras (and other frameworks) has built-in support for stopping when further training appears to be making the model worse. In Keras, it’s the EarlyStopping callback. Using it means passing the validation data to the training process for evaluation on every epoch. Training will stop after several epochs (controlled by the callback’s patience) have passed with no improvement. restore_best_weights=True ensures that the final model’s weights are from its best epoch, not just the last one. This should be your default.
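A minimal sketch of what that looks like in practice, assuming you already have a compiled Keras model named `model` plus training and validation arrays (`x_train`, `y_train`, `x_val`, `y_val`), which are placeholders here rather than anything from Sean's post:

```python
# Sketch: early stopping with Keras, rolling back to the best epoch's weights.
from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(
    monitor="val_loss",          # evaluate on the validation data every epoch
    patience=3,                  # stop after 3 epochs with no improvement
    restore_best_weights=True,   # keep the best epoch's weights, not the last
)

history = model.fit(
    x_train, y_train,
    validation_data=(x_val, y_val),  # required so val_loss exists to monitor
    epochs=100,                      # upper bound; early stopping cuts it short
    callbacks=[early_stop],
)
```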
Sean focuses here on Keras + TensorFlow on Spark, but several of the tips apply regardless of framework or platform.