plexus.analysis.metrics.gwet_ac1 module

Implementation of Gwet’s AC1 agreement coefficient.

This module provides the implementation of Gwet’s AC1 statistic for measuring inter-rater agreement, which is more robust than Cohen’s Kappa when dealing with highly imbalanced class distributions.

class plexus.analysis.metrics.gwet_ac1.GwetAC1

Bases: Metric

Implementation of Gwet’s AC1 statistic for measuring inter-rater agreement.

Gwet’s AC1 is an alternative to Cohen’s Kappa and Fleiss’ Kappa that is more robust to the “Kappa paradox” where high observed agreement can result in low or negative Kappa values when there is high class imbalance.

References:

    Gwet, K. L. (2008). Computing inter-rater reliability and its variance in the presence of high agreement. British Journal of Mathematical and Statistical Psychology, 61(1), 29-48.
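
For reference, the statistic is AC1 = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and the chance-agreement term is p_e = (1 / (q - 1)) * sum_k pi_k * (1 - pi_k), with q categories and pi_k the average proportion of all ratings falling in category k. The following is a minimal two-rater sketch of that formula from Gwet (2008); it is an illustration only, not this module's implementation:

    from collections import Counter

    def gwet_ac1_sketch(ratings_a, ratings_b):
        # Illustrative two-rater AC1; not the GwetAC1 class's actual code.
        assert len(ratings_a) == len(ratings_b) and ratings_a
        n = len(ratings_a)
        categories = set(ratings_a) | set(ratings_b)
        q = len(categories)
        if q == 1:
            return 1.0  # only one category used: agreement is trivially perfect

        # Observed agreement: fraction of items both raters label identically.
        p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

        # pi_k: average proportion of all ratings (both raters pooled) in category k.
        counts = Counter(ratings_a) + Counter(ratings_b)
        pi = [counts[k] / (2 * n) for k in categories]

        # Gwet's chance-agreement term: p_e = (1 / (q - 1)) * sum_k pi_k * (1 - pi_k).
        p_e = sum(p * (1 - p) for p in pi) / (q - 1)

        return (p_o - p_e) / (1 - p_e)

Unlike Cohen's Kappa, p_e here shrinks as the class distribution becomes more imbalanced, which is why AC1 avoids the paradox of high observed agreement producing a low coefficient.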

calculate(input_data: Input) → Result

Calculate Gwet’s AC1 agreement coefficient.

Args:

    input_data: Metric.Input containing the reference and prediction label lists

Returns:

    Metric.Result with Gwet's AC1 value and metadata
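
A hypothetical usage sketch follows; the constructor of Metric.Input and the attribute names on Metric.Result (reference, prediction, value, metadata) are assumptions inferred from this docstring, not verified against the plexus source:

    from plexus.analysis.metrics.gwet_ac1 import GwetAC1

    # Field names (reference, prediction, value, metadata) are assumed from
    # the docstring above; check Metric.Input / Metric.Result for the real API.
    metric = GwetAC1()
    result = metric.calculate(
        GwetAC1.Input(
            reference=["yes", "yes", "no", "yes", "no"],
            prediction=["yes", "no", "no", "yes", "no"],
        )
    )
    print(result.value)     # the AC1 coefficient
    print(result.metadata)  # any accompanying metadata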