# Neural Network training and evaluation in the GUI
Before training your model, the first step is to assemble your training dataset.
**Create Training Dataset**: Go to the **Create Training Dataset** tab and click **Create Training Dataset**. For starters, the default settings will do just fine; more powerful models and data augmentations are available, but for most projects the defaults are a solid place to start.
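If you prefer scripting, this step maps onto the DeepLabCut Python API. A minimal sketch, assuming DeepLabCut is installed and `config_path` points at your project's `config.yaml` (the path here is a placeholder):

```python
def build_training_dataset(config_path: str) -> None:
    """Build the training dataset with default settings, as in the GUI."""
    import deeplabcut  # deferred import: requires a DeepLabCut installation

    # Defaults mirror the GUI's Create Training Dataset button;
    # num_shuffles=1 creates a single train/test split.
    deeplabcut.create_training_dataset(config_path, num_shuffles=1)
```

You would call it as `build_training_dataset("/path/to/myproject/config.yaml")`.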
💡 Note: This guide assumes you have a GPU on your local machine. If you’re CPU-bound and finding training challenging, consider using Google Colab. Our Colab Guide can help you get started!
## Kickstarting the Training Process
With your training dataset ready, it’s time to train your model.
1. **Navigate to Train Network**: Head over to the **Train Network** tab.
2. **Set Training Parameters**: Here, you'll specify:
   - **Display iterations/epochs**: how often the training progress will be visually updated. Note that TensorFlow models report "iterations", while PyTorch models report epochs.
   - **Maximum iterations/epochs**: how long to train. For TensorFlow models, 10K iterations is great for a quick demo; for PyTorch models, 200 epochs is fine!
   - **Number of snapshots to keep**: how many snapshots of the model you want to keep.
   - **Save iterations**: at what iteration intervals snapshots should be saved.
3. **Launch Training**: Click **Train Network** to begin.
You can keep an eye on the training progress via your terminal window, which gives you a real-time update on how your model is learning (an added bonus of the PyTorch models: they also show evaluation metrics after each epoch!).
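The training parameters above correspond directly to keyword arguments of the Python API. A minimal sketch for a TensorFlow-engine project, with parameter values echoing the demo settings above (the config path is a placeholder):

```python
def run_training(config_path: str) -> None:
    """Launch training with the quick-demo settings described above."""
    import deeplabcut  # deferred import: requires a DeepLabCut installation

    deeplabcut.train_network(
        config_path,
        shuffle=1,                 # which train/test split to use
        displayiters=100,          # Display iterations
        maxiters=10_000,           # Maximum iterations (10K for a quick demo)
        max_snapshots_to_keep=5,   # Number of snapshots to keep
        saveiters=1_000,           # Save iterations
    )
```

For PyTorch-engine projects, recent DeepLabCut versions take epoch-based arguments instead of `maxiters`; check the API reference for your installed version.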
## Evaluate the Network
After training, it’s time to see how well your model performs.
### Steps to Evaluate the Network
1. Find and click on the **Evaluate Network** tab.
2. **Choose Evaluation Options**:
   - **Plot Predictions**: select this to visualize the model's predictions, similar to standard DeepLabCut (DLC) evaluations.
   - **Compare Bodyparts**: opt to compare all the bodyparts for a comprehensive evaluation.
3. Click the **Evaluate Network** button, located on the right side of the main window.
💡 Tip: If you wish to evaluate all saved snapshots, go to the configuration file and change the `snapshotindex` parameter to `all`.
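These evaluation options also map onto the Python API's `deeplabcut.evaluate_network` arguments. A minimal sketch, where `plotting=True` corresponds to "Plot Predictions" and `comparisonbodyparts="all"` to "Compare Bodyparts" (the config path is a placeholder):

```python
def run_evaluation(config_path: str) -> None:
    """Evaluate trained snapshots, as with the GUI's Evaluate Network button."""
    import deeplabcut  # deferred import: requires a DeepLabCut installation

    deeplabcut.evaluate_network(
        config_path,
        Shuffles=[1],               # evaluate the first train/test split
        plotting=True,              # "Plot Predictions"
        comparisonbodyparts="all",  # "Compare Bodyparts"
    )
```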
### Understanding the Evaluation Results
**Performance Metrics**: DLC will assess the latest snapshot of your model, generating a `.CSV` file with performance metrics. This file is stored in the evaluate network folder within your project.
**Visual Feedback**: Additionally, DLC creates subfolders containing your frames overlaid with both the labeled bodyparts and the model's predictions, allowing you to visually gauge the network's performance.
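Since the metrics land in CSV files, you can pull them into a table for further analysis. A sketch using pandas, assuming you pass in the evaluation folder's path yourself (the exact folder name and layout depend on your DLC version, so this searches recursively rather than hard-coding paths):

```python
from pathlib import Path


def load_evaluation_metrics(results_dir: str):
    """Collect every metrics CSV under the evaluation folder into one table.

    Returns a single concatenated DataFrame, or None if no CSVs are found.
    """
    import pandas as pd  # deferred import: requires pandas

    csv_files = sorted(Path(results_dir).rglob("*.csv"))
    frames = [pd.read_csv(f) for f in csv_files]
    return pd.concat(frames, ignore_index=True) if frames else None
```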