
Class ConfusionMatrix<T>

Constructs a confusion matrix

example

const CM = new ConfusionMatrix([[13, 2], [10, 5]], ['cat', 'dog'])

param matrix

The confusion matrix, a 2D Array. Rows represent the actual label and columns the predicted label.

param labels

Labels of the confusion matrix, a 1D Array

Type parameters

  • T: Label

Hierarchy

  • ConfusionMatrix

Index

Constructors

  • new ConfusionMatrix<T>(matrix: number[][], labels: T[]): ConfusionMatrix<T>

Accessors

  • get accuracy(): number
  • Compute the general prediction accuracy

    deprecated

    Use getAccuracy

    Returns number

    The prediction accuracy ([0-1])

  • get total(): number
  • Compute the number of predicted observations

    deprecated

    Use getTotalCount

    Returns number
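
    For illustration, a sketch of the replacement calls named in the deprecation notes, using the CM instance from the class example above:

    CM.getAccuracy()   // preferred over the deprecated accuracy accessor
    CM.getTotalCount() // preferred over the deprecated total accessor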

Methods

  • getAccuracy(): number
  • Get the total accuracy: the ratio between the number of true predictions and the total number of classifications ([0-1])

    Returns number
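
    Worked example on the CM instance from the class example above (true predictions are the diagonal entries):

    CM.getAccuracy() // (13 + 5) / (13 + 2 + 10 + 5) = 0.6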

  • getConfusionTable(label: T): number[][]
  • Get the confusion table.

    Parameters

    • label: T

      The label that should be considered "positive"

    Returns number[][]

    The 2x2 confusion table. [[TP, FN], [FP, TN]]
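
    For illustration, the table for 'dog' on the CM instance from the class example above, following the [[TP, FN], [FP, TN]] layout:

    CM.getConfusionTable('dog') // [[5, 10], [2, 13]]  (TP = 5, FN = 10, FP = 2, TN = 13)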

  • getCount(actual: T, predicted: T): number
  • Returns the element in the confusion matrix that corresponds to the given actual and predicted labels.

    Parameters

    • actual: T

      The true label

    • predicted: T

      The predicted label

    Returns number

    The element in the confusion matrix
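
    Worked example on the CM instance from the class example above; rows of the matrix are actual labels and columns are predicted labels:

    CM.getCount('cat', 'dog') // 2, cats that were predicted as dogs
    CM.getCount('dog', 'cat') // 10, dogs that were predicted as cats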

  • getF1Score(label: T): number
  • getFalseCount(): number
  • Get the total number of false predictions.

    Returns number
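
    Illustrative sketch on the CM instance from the class example above; the F1 value assumes the standard definition 2·TP / (2·TP + FP + FN), which this page does not spell out:

    CM.getFalseCount()   // 2 + 10 = 12, the off-diagonal entries
    CM.getF1Score('cat') // 2 * 13 / (2 * 13 + 10 + 2) ≈ 0.684, assuming the standard F1 formula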

  • getFalseDiscoveryRate(label: T): number
  • getFalseNegativeCount(label: T): number
  • Get the number of false negative predictions.

    Parameters

    • label: T

      The label that should be considered "positive"

    Returns number

  • getFalseNegativeRate(label: T): number
  • getFalseOmissionRate(label: T): number
  • False omission rate (FOR)

    Parameters

    • label: T

      The label that should be considered "positive"

    Returns number
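
    Illustrative sketch on the CM instance from the class example above, assuming the standard definitions FNR = FN / (FN + TP) and FOR = FN / (FN + TN):

    CM.getFalseNegativeRate('cat') // 2 / (2 + 13) ≈ 0.133, assuming FNR = FN / (FN + TP)
    CM.getFalseOmissionRate('cat') // 2 / (2 + 5) ≈ 0.286, assuming FOR = FN / (FN + TN)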

  • getFalsePositiveCount(label: T): number
  • Get the number of false positive predictions.

    Parameters

    • label: T

      The label that should be considered "positive"

    Returns number

  • getFalsePositiveRate(label: T): number
  • getIndex(label: T): number
  • Get the index in the confusion matrix that corresponds to the given label

    throws

    if the label is not found

    Parameters

    • label: T

      The label to search for

    Returns number
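
    Worked example on the CM instance from the class example above; the indices presumably follow the order of the labels array passed to the constructor:

    CM.getIndex('cat')  // 0
    CM.getIndex('dog')  // 1
    CM.getIndex('bird') // throws, the label is not in the matrix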

  • getInformedness(label: T): number
  • getLabels(): T[]
  • getMarkedness(label: T): number
  • Markedness

    Parameters

    • label: T

      The label that should be considered "positive"

    Returns number
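
    Illustrative sketch on the CM instance from the class example above, assuming the usual definitions informedness = TPR + TNR - 1 and markedness = PPV + NPV - 1:

    CM.getInformedness('cat') // 13/15 + 5/15 - 1 = 0.2, assuming informedness = TPR + TNR - 1
    CM.getMarkedness('cat')   // 13/23 + 5/7 - 1 ≈ 0.28, assuming markedness = PPV + NPV - 1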

  • getMatrix(): number[][]
  • Get the confusion matrix

    Returns number[][]

  • getMatthewsCorrelationCoefficient(label: T): number
  • getNegativeCount(label: T): number
  • Get the number of real negative samples.

    Parameters

    • label: T

      The label that should be considered "positive"

    Returns number

  • getNegativePredictiveValue(label: T): number
  • getPositiveCount(label: T): number
  • Get the number of real positive samples.

    Parameters

    • label: T

      The label that should be considered "positive"

    Returns number
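
    Worked example on the CM instance from the class example above; the real sample counts are row sums of the matrix:

    CM.getPositiveCount('cat') // 13 + 2 = 15 actual cats
    CM.getNegativeCount('cat') // 10 + 5 = 15 actual non-cats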

  • getPositivePredictiveValue(label: T): number
  • getTotalCount(): number
  • Get the total number of samples

    Returns number

  • getTrueCount(): number
  • Get the total number of true predictions

    Returns number
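
    Worked example on the CM instance from the class example above:

    CM.getTotalCount() // 13 + 2 + 10 + 5 = 30 samples
    CM.getTrueCount()  // 13 + 5 = 18 true (diagonal) predictions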

  • getTrueNegativeCount(label: T): number
  • Get the number of true negative predictions.

    Parameters

    • label: T

      The label that should be considered "positive"

    Returns number

  • getTrueNegativeRate(label: T): number
  • Get the true negative rate a.k.a. specificity. Computes the ratio between the number of true negative predictions and the total number of negative samples. https://en.wikipedia.org/wiki/Sensitivity_and_specificity

    Parameters

    • label: T

      The label that should be considered "positive"

    Returns number

    The true negative rate a.k.a. specificity.

  • getTruePositiveCount(label: T): number
  • Get the number of true positive predictions.

    Parameters

    • label: T

      The label that should be considered "positive"

    Returns number
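
    For illustration, the four per-label count methods documented above, applied to the CM instance from the class example with 'cat' as the positive label:

    CM.getTruePositiveCount('cat')  // 13, actual cat predicted cat
    CM.getFalseNegativeCount('cat') // 2, actual cat predicted dog
    CM.getFalsePositiveCount('cat') // 10, actual dog predicted cat
    CM.getTrueNegativeCount('cat')  // 5, actual dog predicted dog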

  • getTruePositiveRate(label: T): number
  • Get the true positive rate a.k.a. sensitivity. Computes the ratio between the number of true positive predictions and the total number of positive samples. https://en.wikipedia.org/wiki/Sensitivity_and_specificity

    Parameters

    • label: T

      The label that should be considered "positive"

    Returns number

    The true positive rate [0-1]
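
    Worked example on the CM instance from the class example above, using the ratios defined here (true predictions over the corresponding real sample counts):

    CM.getTruePositiveRate('cat') // 13 / 15 ≈ 0.867, sensitivity
    CM.getTrueNegativeRate('cat') // 5 / 15 ≈ 0.333, specificity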

  • fromLabels<T>(actual: T[], predicted: T[], options?: FromLabelsOptions<T>): ConfusionMatrix<T>
  • Construct a confusion matrix from the actual and predicted labels (classes). Be sure to provide the arguments in the correct order!

    Type parameters

    • T: Label

    Parameters

    • actual: T[]

      The actual labels of the classification

    • predicted: T[]

      The predicted labels of the classification. Has to be of the same length as actual.

    • options: FromLabelsOptions<T> = {}

    Returns ConfusionMatrix<T>

    Confusion matrix
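
    Illustrative sketch of the static constructor, reusing the ConfusionMatrix class from the example at the top of this page; the value shown for getLabels() is an assumption about how the labels are derived from the inputs:

    const actual = ['cat', 'cat', 'dog', 'dog', 'dog']
    const predicted = ['cat', 'dog', 'dog', 'dog', 'cat']
    const cm = ConfusionMatrix.fromLabels(actual, predicted)
    cm.getAccuracy() // 3 correct out of 5 = 0.6
    cm.getLabels()   // e.g. ['cat', 'dog'], derived from the provided label arrays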