Common Training Issues
Below are common things to look for while training different types of models. These can be viewed in real-time via PerceptiLabs' Training View.
Tip: Be sure to check out our Machine Learning Minute videos on YouTube where we provide brief overviews of modeling.
Classification Training
Gradients: gradients of 0 indicate that the model is not training, while gradients that grow without bound (exploding gradients) mean that the weights are changing too much per update. Note that it's usually sufficient to monitor the gradient in the last layer.
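The gradient check above can be sketched as a small helper that classifies a layer's gradient by its L2 norm. This is an illustrative function, not part of PerceptiLabs, and the vanishing/exploding thresholds are assumptions you would tune for your own model:

```python
import numpy as np

def diagnose_gradients(grad, vanish_tol=1e-7, explode_tol=1e3):
    """Classify a layer's gradient tensor by its L2 norm.

    The thresholds are illustrative assumptions, not PerceptiLabs defaults.
    """
    norm = float(np.linalg.norm(grad))
    if norm < vanish_tol:
        return "vanishing"   # model is effectively not training
    if norm > explode_tol:
        return "exploding"   # weight updates are too large
    return "healthy"

# Focus on the last layer's gradient, as suggested above.
print(diagnose_gradients(np.zeros((4, 4))))      # vanishing
print(diagnose_gradients(np.full((4, 4), 1e4)))  # exploding
print(diagnose_gradients(np.ones((4, 4))))       # healthy
```

In practice you would feed this the last layer's gradient tensor once per epoch and react only when it stays "vanishing" or "exploding" across several steps.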
Loss: if the validation loss increases while the training loss keeps decreasing, the model is overfitting.
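That divergence between training and validation loss can be detected programmatically. The helper below is a hypothetical sketch (not a PerceptiLabs API): it flags overfitting when validation loss has risen for a few consecutive epochs while training loss kept falling.

```python
def is_overfitting(train_losses, val_losses, patience=3):
    """Flag overfitting: validation loss rose for `patience` consecutive
    epochs while training loss did not increase. Illustrative helper,
    not part of PerceptiLabs; `patience` is an assumed default."""
    if len(val_losses) <= patience or len(train_losses) <= patience:
        return False
    recent_val = val_losses[-(patience + 1):]
    recent_train = train_losses[-(patience + 1):]
    val_rising = all(b > a for a, b in zip(recent_val, recent_val[1:]))
    train_falling = all(b <= a for a, b in zip(recent_train, recent_train[1:]))
    return val_rising and train_falling

# Training loss keeps dropping while validation loss climbs: overfitting.
print(is_overfitting([1.0, 0.8, 0.6, 0.5, 0.4],
                     [0.9, 0.8, 0.85, 0.9, 0.95]))  # True
```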
Predictions per class: there should be one color per class; if the colors overlap, the model is mixing classes together and may need more capacity (e.g., more or larger layers) to rectify this.
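One way to quantify class mixing outside the Training View is a confusion matrix: diagonal entries are correct predictions, and off-diagonal mass shows which classes are being confused with which. A minimal numpy sketch:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Count predictions per (true class, predicted class) pair.
    Off-diagonal entries reveal which classes the model mixes up."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        m[t, p] += 1
    return m

# One class-0 sample was predicted as class 1 -> classes 0 and 1 are mixed.
print(confusion_matrix([0, 0, 1, 1], [0, 1, 1, 1], n_classes=2))
# [[1 1]
#  [0 2]]
```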
Segmentation Training
Gradients: if gradients die (i.e., decrease to nothing) quickly, try normalizing the input using batch normalization and/or try a different output activation function.
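The normalization step suggested above can be illustrated with a minimal numpy sketch of what batch normalization does at its core: each feature is rescaled over the batch dimension to zero mean and unit variance (omitting the learned scale/shift parameters a real batch-norm layer adds).

```python
import numpy as np

def batch_normalize(x, eps=1e-5):
    """Normalize each feature over the batch dimension (axis 0) to zero
    mean and unit variance. Simplified sketch of batch normalization,
    without the trainable gamma/beta parameters."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)  # eps avoids division by zero

batch = np.array([[1.0, 200.0],
                  [3.0, 400.0],
                  [5.0, 600.0]])
normalized = batch_normalize(batch)
print(normalized.mean(axis=0))  # approximately [0. 0.]
```

Keeping inputs in a well-scaled range like this makes gradients far less likely to shrink to nothing in early layers.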
Prediction: if the predictions contain a lot of noise, the model either has not been trained enough yet or needs to be made more complex. If the model only predicts ones (i.e., an all-white image), follow the tips in the previous point.
Tip: Check out the following video for a brief introduction to the importance of gradients when debugging:
General Training
CPU/GPU usage: if resources aren't being utilized as much as they should be, try a larger batch size (this can be adjusted in the Model Training Settings).
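Why a larger batch size helps: each batch incurs roughly fixed per-step overhead (data transfer, kernel launches), so fewer, larger batches give the device more work per step. A simple sketch of batching a dataset, where the batch count drops as the batch size grows:

```python
def iter_batches(data, batch_size):
    """Yield consecutive slices of `data` of length `batch_size`
    (the last batch may be smaller). Illustrative helper, not a
    PerceptiLabs function."""
    for i in range(0, len(data), batch_size):
        yield data[i:i + batch_size]

samples = list(range(10))
# Doubling the batch size halves the number of per-step overheads.
print(len(list(iter_batches(samples, 2))))  # 5 steps
print(len(list(iter_batches(samples, 4))))  # 3 steps
```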