Metrics in GDL

Pixel-based Metrics

Intersection over Union (IoU)

The IoU metric is a pixel-based metric that measures overlap as the number of pixels common to the groundtruth and the prediction, divided by the total number of pixels across both.

IoU = (groundtruth ∩ prediction) /
      (groundtruth ∪ prediction)
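As a rough illustration, IoU for binary masks can be computed with NumPy as follows. This is a minimal sketch, not GDL's actual implementation: the function name `iou` and the assumption of binary (0/1) masks are ours.

```python
import numpy as np

def iou(groundtruth: np.ndarray, prediction: np.ndarray) -> float:
    """Pixel-wise IoU for binary masks (1 = foreground, 0 = background)."""
    gt = groundtruth.astype(bool)
    pred = prediction.astype(bool)
    intersection = np.logical_and(gt, pred).sum()  # pixels in both masks
    union = np.logical_or(gt, pred).sum()          # pixels in either mask
    # Convention: two empty masks overlap perfectly
    return intersection / union if union else 1.0
```

For example, a groundtruth of `[[1, 1], [0, 0]]` and a prediction of `[[1, 0], [1, 0]]` share 1 pixel out of 3 in their union, giving an IoU of 1/3.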

Dice Similarity Coefficient (dice)

The Dice metric scores model performance as twice the overlap between groundtruth and prediction, divided by the sum of the pixel counts of the groundtruth and the prediction.

dice= 2 * (groundtruth ∩ prediction) /
          (groundtruth + prediction) 
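The Dice coefficient can be sketched the same way; again this is an illustrative NumPy snippet for binary masks, not GDL's API.

```python
import numpy as np

def dice(groundtruth: np.ndarray, prediction: np.ndarray) -> float:
    """Dice coefficient for binary masks: 2*|A ∩ B| / (|A| + |B|)."""
    gt = groundtruth.astype(bool)
    pred = prediction.astype(bool)
    intersection = np.logical_and(gt, pred).sum()
    total = gt.sum() + pred.sum()  # sum of pixel counts of both masks
    # Convention: two empty masks overlap perfectly
    return 2 * intersection / total if total else 1.0
```

On the same example masks (`[[1, 1], [0, 0]]` vs `[[1, 0], [1, 0]]`), the Dice score is 2·1 / (2 + 2) = 0.5, higher than the IoU of 1/3, since Dice double-counts the intersection.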

Note: IoU and Dice weigh errors differently, but the two metrics are positively correlated. This means that if model A is better than model B under one metric, the same ordering holds under the other.

Precision and Recall

By plotting a confusion matrix, which tabulates ground-truth and predicted classes with the number of pixels assigned to each, precision and recall are easily computed.

precision = true positives /
            (true positives + false positives)
recall = true positives /
         (true positives + false negatives)
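The confusion-matrix counts and the two formulas above can be sketched for the binary case as follows; the function name `precision_recall` is illustrative, not part of GDL.

```python
import numpy as np

def precision_recall(groundtruth: np.ndarray, prediction: np.ndarray):
    """Pixel-wise precision and recall for binary masks."""
    gt = groundtruth.astype(bool)
    pred = prediction.astype(bool)
    tp = np.logical_and(gt, pred).sum()    # predicted positive, truly positive
    fp = np.logical_and(~gt, pred).sum()   # predicted positive, truly negative
    fn = np.logical_and(gt, ~pred).sum()   # predicted negative, truly positive
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall
```

With groundtruth `[[1, 1], [0, 0]]` and prediction `[[1, 0], [1, 0]]`, we get tp = 1, fp = 1, fn = 1, so precision and recall are both 0.5.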

Comparing common pixel-based metrics

| Accuracy | IoU | Dice Coefficient |
| --- | --- | --- |
| Counts the number of correctly classified pixels | Counts pixels in both label and prediction (intersection) | Similar to IoU, but has its own strengths |
| Not suitable for imbalanced datasets | Counts pixels in either label or prediction (union) | Measures average performance, versus IoU's measure of worst-case performance |
| High accuracy may not translate to quality predictions | Statistically correlates the counts as a ratio; accounts for imbalanced data by penalizing FP and FN | |

Shape-based Metrics

**New shape-based metrics will be added soon.**