LabVIEW Analytics and Machine Learning Toolkit API Reference

Train Classification Model VI

Owning Palette: Classification VIs

Requires: Analytics and Machine Learning Toolkit

Trains a classification model.

model in specifies the information about the entire workflow of the model.
classification model info in specifies the initialized classification model for training.

You can acquire an initialized classification model from the following VIs:

  • Initialize Classification Model (LR) VI
  • Initialize Classification Model (NN) VI
  • Initialize Classification Model (SVM) VI

error in describes error conditions that occur before this node runs. This input provides standard error in functionality.
model out returns the information about the entire workflow of the model. Wire model out to the reference input of a standard Property Node to get an AML Analytics Property Node.
classification model info out returns the trained classification model.

Wire classification model info out to the reference input of a standard Property Node to get the properties of the trained classification model. The following table displays the VI you wire to classification model info in and the corresponding Property Node you get from classification model info out.

VI Name                                       Property Node
Initialize Classification Model (LR) VI       AML Logistic Regression
Initialize Classification Model (NN) VI       AML Neural Network
Initialize Classification Model (SVM) VI      AML SVM
confusion matrix returns the confusion matrix from the evaluation result. A confusion matrix describes the performance of a classification model by reporting the number of true positive cases, true negative cases, false positive cases, and false negative cases. Each row of a confusion matrix represents the actual class and each column represents the predicted class.

For example, suppose there are 100 samples and two possible classes: positive and negative. The following table is a confusion matrix for the two classes.

                              Predicted Class
                              Positive      Negative
Actual Class    Positive      65            5
                Negative      19            11


The confusion matrix contains 65 true positive cases, 5 false negative cases, 19 false positive cases, and 11 true negative cases.
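
The following sketch is plain Python, not the toolkit API; it simply mirrors the matrix layout described above (rows are actual classes, columns are predicted classes) to show how the four counts are read from the confusion matrix output.

# Example confusion matrix from the table above:
# rows = actual class, columns = predicted class.
confusion = [
    [65, 5],    # actual positive: 65 predicted positive, 5 predicted negative
    [19, 11],   # actual negative: 19 predicted positive, 11 predicted negative
]

tp, fn = confusion[0]   # true positives, false negatives
fp, tn = confusion[1]   # false positives, true negatives

assert tp + fn + fp + tn == 100   # all 100 samples accounted for
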
metrics returns the metrics from the evaluation result.
accuracy returns the accuracy metric value.

The following equation defines the accuracy metric:

accuracy = (TP + TN) / (P + N)

where
TP is the number of true positive cases in the data
TN is the number of true negative cases in the data
P is the number of real positive cases in the data
N is the number of real negative cases in the data
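
For example, with the confusion matrix above, accuracy = (65 + 11) / (70 + 30) = 0.76.
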
precision returns the precision metric value.

The following equation defines the precision metric:

precision = TP / (TP + FP)

where
TP is the number of true positive cases in the data
FP is the number of false positive cases in the data
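
For example, with the confusion matrix above, precision = 65 / (65 + 19) ≈ 0.774.
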
recall returns the recall metric value.

The following equation defines the recall metric:

recall = TP / (TP + FN)

where
TP is the number of true positive cases in the data
FN is the number of false negative cases in the data
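
For example, with the confusion matrix above, recall = 65 / (65 + 5) ≈ 0.929.
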
f1 score returns the f1 score metric value.

The following equation defines the f1 score metric:

f1 score = (2 × TP) / (2 × TP + FP + FN)

where
TP is the number of true positive cases in the data
FP is the number of false positive cases in the data
FN is the number of false negative cases in the data
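
For example, with the confusion matrix above, f1 score = (2 × 65) / (2 × 65 + 19 + 5) ≈ 0.844. The following sketch is plain Python, not the toolkit API; it recomputes the four metrics above from the example counts so you can check the values the VI returns in metrics.

# Example counts from the confusion matrix above.
tp, fn, fp, tn = 65, 5, 19, 11
p = tp + fn   # real positive cases
n = fp + tn   # real negative cases

accuracy = (tp + tn) / (p + n)              # 0.76
precision = tp / (tp + fp)                  # ~0.774
recall = tp / (tp + fn)                     # ~0.929
f1_score = 2 * tp / (2 * tp + fp + fn)      # ~0.844

print(accuracy, precision, recall, f1_score)
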
error out contains error information. This output provides standard error out functionality.

Examples

Refer to the following VIs for examples of using the Train Classification Model VI:

  • Classification (Set Parameters, Training) VI: labview\examples\AML\Classification
  • Classification (Search Parameters, Training) VI: labview\examples\AML\Classification
