Compute the general prediction accuracy
The prediction accuracy ([0-1])
Compute the number of predicted observations
Get the total accuracy: the ratio between the number of true predictions and the total number of classifications ([0-1])
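As a rough illustration (using the example matrix [[13, 2], [10, 5]] with labels ['cat', 'dog'] from the constructor example at the end of this section), the accuracy is the sum of the diagonal divided by the total sample count:
const matrix = [[13, 2], [10, 5]]; // rows: actual label, columns: predicted label
const total = 13 + 2 + 10 + 5;      // 30 samples in total
const trueCount = 13 + 5;           // diagonal: correctly classified samples
const accuracy = trueCount / total; // 18 / 30 = 0.6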
Get the confusion table.
The label that should be considered "positive"
The 2x2 confusion table. [[TP, FN], [FP, TN]]
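For instance, taking 'cat' as the positive label in the example matrix [[13, 2], [10, 5]] (rows actual, columns predicted), the table is assembled as follows (a sketch; variable names are illustrative):
const TP = 13; // actual 'cat', predicted 'cat'
const FN = 2;  // actual 'cat', predicted 'dog'
const FP = 10; // actual 'dog', predicted 'cat'
const TN = 5;  // actual 'dog', predicted 'dog'
const confusionTable = [[TP, FN], [FP, TN]]; // [[13, 2], [10, 5]]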
Returns the element in the confusion matrix that corresponds to the given actual and predicted labels.
The true label
The predicted label
The element in the confusion matrix
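A minimal sketch of the lookup, assuming rows are indexed by the actual label and columns by the predicted label (names below are illustrative, not from the source):
const labels = ['cat', 'dog'];
const matrix = [[13, 2], [10, 5]];
// Element for actual = 'dog', predicted = 'cat':
const count = matrix[labels.indexOf('dog')][labels.indexOf('cat')]; // 10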
The label that should be considered "positive"
Get the total number of false predictions.
False discovery rate (FDR) https://en.wikipedia.org/wiki/False_discovery_rate
The label that should be considered "positive"
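A quick sketch of the usual definition, FDR = FP / (FP + TP), with the counts from the confusion table sketch above ('cat' as positive):
const TP = 13, FP = 10;
const fdr = FP / (FP + TP); // 10 / 23 ≈ 0.435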
Get the number of false negative predictions.
The label that should be considered "positive"
False negative rate a.k.a. miss rate. https://en.wikipedia.org/wiki/Type_I_and_type_II_errors#False_positive_and_false_negative_rates
The label that should be considered "positive"
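A quick sketch of the usual definition, FNR = FN / (FN + TP), i.e. the fraction of real positives that were missed:
const TP = 13, FN = 2;
const fnr = FN / (FN + TP); // 2 / 15 ≈ 0.133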
False omission rate (FOR)
The label that should be considered "positive"
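A quick sketch of the usual definition, FOR = FN / (FN + TN), i.e. the fraction of negative predictions that are wrong ('for' is a reserved word in JavaScript, so the variable name is spelled out):
const FN = 2, TN = 5;
const falseOmissionRate = FN / (FN + TN); // 2 / 7 ≈ 0.286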
Get the number of false positive predictions.
The label that should be considered "positive"
False positive rate a.k.a. fall-out rate. https://en.wikipedia.org/wiki/Type_I_and_type_II_errors#False_positive_and_false_negative_rates
The label that should be considered "positive"
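A quick sketch of the usual definition, FPR = FP / (FP + TN), i.e. the fraction of real negatives that were wrongly classified as positive:
const FP = 10, TN = 5;
const fpr = FP / (FP + TN); // 10 / 15 ≈ 0.667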
Get the index in the confusion matrix that corresponds to the given label
The label to search for
Informedness https://en.wikipedia.org/wiki/Youden%27s_J_statistic
The label that should be considered "positive"
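A quick sketch of Youden's J as usually defined, informedness = TPR + TNR - 1, with the example counts:
const TP = 13, FN = 2, FP = 10, TN = 5;
const tpr = TP / (TP + FN); // sensitivity, 13 / 15
const tnr = TN / (TN + FP); // specificity, 5 / 15
const informedness = tpr + tnr - 1; // ≈ 0.2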
Markedness
The label that should be considered "positive"
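A quick sketch of the usual definition, markedness = PPV + NPV - 1, with the example counts:
const TP = 13, FN = 2, FP = 10, TN = 5;
const ppv = TP / (TP + FP); // precision, 13 / 23
const npv = TN / (TN + FN); // 5 / 7
const markedness = ppv + npv - 1; // ≈ 0.28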
Get the confusion matrix
Matthews correlation coefficient (MCC) https://en.wikipedia.org/wiki/Matthews_correlation_coefficient
The label that should be considered "positive"
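A quick sketch of the standard 2x2 MCC formula, with the example counts:
const TP = 13, FN = 2, FP = 10, TN = 5;
const mcc = (TP * TN - FP * FN) /
  Math.sqrt((TP + FP) * (TP + FN) * (TN + FP) * (TN + FN)); // 45 / sqrt(36225) ≈ 0.24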
Get the number of real negative samples.
The label that should be considered "positive"
Negative predictive value https://en.wikipedia.org/wiki/Positive_and_negative_predictive_values
The label that should be considered "positive"
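A quick sketch of the usual definition, NPV = TN / (TN + FN), with the example counts:
const TN = 5, FN = 2;
const npv = TN / (TN + FN); // 5 / 7 ≈ 0.714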
Get the number of real positive samples.
The label that should be considered "positive"
Get the positive predictive value a.k.a. precision. Computes TP / (TP + FP) https://en.wikipedia.org/wiki/Positive_and_negative_predictive_values
The label that should be considered "positive"
The positive predictive value a.k.a. precision.
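Using the counts from the confusion table sketch above ('cat' as positive):
const TP = 13, FP = 10;
const precision = TP / (TP + FP); // 13 / 23 ≈ 0.565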
Get the total number of samples
Get the total number of true predictions
Get the number of true negative predictions.
The label that should be considered "positive"
Get the true negative rate a.k.a. specificity. Computes the ratio between the number of true negative predictions and the total number of negative samples. https://en.wikipedia.org/wiki/Sensitivity_and_specificity
The label that should be considered "positive"
The true negative rate a.k.a. specificity.
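A quick sketch, TNR = TN / (TN + FP), with the example counts:
const TN = 5, FP = 10;
const specificity = TN / (TN + FP); // 5 / 15 ≈ 0.333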
Get the number of true positive predictions.
The label that should be considered "positive"
Get the true positive rate a.k.a. sensitivity. Computes the ratio between the number of true positive predictions and the total number of positive samples. https://en.wikipedia.org/wiki/Sensitivity_and_specificity
The label that should be considered "positive"
The true positive rate [0-1]
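A quick sketch, TPR = TP / (TP + FN), with the example counts:
const TP = 13, FN = 2;
const sensitivity = TP / (TP + FN); // 13 / 15 ≈ 0.867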
Construct a confusion matrix from the predicted and actual labels (classes). Be sure to provide the arguments in the correct order!
The predicted labels of the classification
The actual labels of the classification. Must be the same length as the predicted labels.
Confusion matrix
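An illustrative sketch of the counting involved (not the library's actual implementation): each (actual, predicted) pair increments the cell at row = actual label, column = predicted label:
const predicted = ['cat', 'cat', 'dog', 'cat'];
const actual = ['cat', 'dog', 'dog', 'cat'];
const labels = Array.from(new Set(actual.concat(predicted)));
const matrix = labels.map(() => labels.map(() => 0));
for (let i = 0; i < actual.length; i++) {
  matrix[labels.indexOf(actual[i])][labels.indexOf(predicted[i])]++;
}
// matrix is [[2, 0], [1, 1]] for labels ['cat', 'dog']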
Constructs a confusion matrix
const CM = new ConfusionMatrix([[13, 2], [10, 5]], ['cat', 'dog'])
The confusion matrix, a 2D Array. Rows represent the actual label and columns the predicted label.
Labels of the confusion matrix, a 1D Array
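Reading the example above with labels ['cat', 'dog'] (rows actual, columns predicted); note that which counts are "positive" depends on the chosen label:
const matrix = [[13, 2], [10, 5]];
// With 'cat' as positive: TP = 13, FN = 2, FP = 10, TN = 5.
// With 'dog' as positive: TP = 5, FN = 10, FP = 2, TN = 13.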