Average absolute odds difference metric
Last updated: Feb 05, 2025

The average absolute odds difference metric measures the average of the absolute differences in false positive rates and true positive rates between the monitored and reference groups.

Metric details

Average absolute odds difference is a fairness evaluation metric that can help determine whether your asset produces biased outcomes.

Scope

The average absolute odds difference metric evaluates generative AI assets and machine learning models.

  • Types of AI assets:
    • Prompt templates
    • Machine learning models
  • Generative AI tasks: Text classification
  • Machine learning problem type: Binary classification

Scores and values

The average absolute odds difference metric score indicates the average absolute difference in false positive rates and true positive rates between the monitored and reference groups. Lower scores indicate greater fairness between groups.

  • Range of values: 0.0-1.0
  • Best possible score: 0.0
  • Interpretation:
    • At 0: No disparity between the groups
    • Above 0: Increasing disparity between the groups

Do the math

The following formula is used for calculating false positive rate (FPR):

\text{FPR} = \frac{\text{FP}}{\text{FP} + \text{TN}}

where FP is the number of false positives and TN is the number of true negatives.

The following formula is used for calculating true positive rate (TPR):

\text{TPR} = \frac{\text{TP}}{\text{TP} + \text{FN}}

where TP is the number of true positives and FN is the number of false negatives.
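Both rates follow directly from the entries of a group's confusion matrix. The following is a minimal Python sketch, assuming ground-truth labels and predictions are encoded as 0/1 with 1 as the favorable outcome, and that each group contains both positive and negative ground-truth records; the function names are illustrative, not part of any product API.

```python
def false_positive_rate(y_true, y_pred):
    """FPR = FP / (FP + TN), computed over one group's records."""
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return fp / (fp + tn)

def true_positive_rate(y_true, y_pred):
    """TPR = TP / (TP + FN), computed over one group's records."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn)
```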

The following formula is used for calculating average absolute odds difference:

\text{AAOD} = \frac{1}{2}\left(\left|\text{FPR}_{\text{monitored}} - \text{FPR}_{\text{reference}}\right| + \left|\text{TPR}_{\text{monitored}} - \text{TPR}_{\text{reference}}\right|\right)
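As a sketch of the full calculation, the following reuses the rate helpers above. The per-group lists and the resulting score are hypothetical toy data, chosen only to make the arithmetic easy to check.

```python
def average_absolute_odds_difference(y_true_mon, y_pred_mon,
                                     y_true_ref, y_pred_ref):
    """Average of the absolute FPR and TPR gaps between the
    monitored group and the reference group."""
    fpr_gap = abs(false_positive_rate(y_true_mon, y_pred_mon) -
                  false_positive_rate(y_true_ref, y_pred_ref))
    tpr_gap = abs(true_positive_rate(y_true_mon, y_pred_mon) -
                  true_positive_rate(y_true_ref, y_pred_ref))
    return 0.5 * (fpr_gap + tpr_gap)

# Hypothetical toy data: the reference group is classified perfectly,
# while the monitored group gets half of each class wrong.
y_true_mon, y_pred_mon = [1, 1, 0, 0], [1, 0, 1, 0]   # TPR 0.5, FPR 0.5
y_true_ref, y_pred_ref = [1, 1, 0, 0], [1, 1, 0, 0]   # TPR 1.0, FPR 0.0

score = average_absolute_odds_difference(y_true_mon, y_pred_mon,
                                         y_true_ref, y_pred_ref)
print(score)  # 0.5 * (|0.5 - 0.0| + |0.5 - 1.0|) = 0.5
```

The score of 0.5 reflects the large gap in both error rates between the two groups; in practice the rates are computed over each group's records in the evaluation data.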

Parent topic: Evaluation metrics