Evaluation Criteria
Below are the common evaluation metrics for classification and regression models, along with guidance on how to assess them in SPSS:
Classification Model Evaluation Metrics:
- Accuracy:
- Definition: Proportion of correctly classified instances out of the total instances.
- Assessment in SPSS: Reported in the "Classification Table" of the output; the overall percentage of correct predictions is the accuracy.
- Precision:
- Definition: Proportion of true positives among instances predicted as positive.
- Assessment in SPSS: Not reported directly, but easy to calculate from the cell counts of the classification (confusion) table (see the sketch after this list).
- Recall (Sensitivity):
- Definition: Proportion of true positives among actual positive instances.
- Assessment in SPSS: Similar to precision, not directly available, but can be calculated from the confusion matrix.
- F1 Score:
- Definition: Harmonic mean of precision and recall, providing a balance between the two.
- Assessment in SPSS: Needs to be calculated manually using precision and recall values.
- ROC-AUC (Receiver Operating Characteristic – Area Under the Curve):
- Definition: Measures the area under the ROC curve, indicating the model’s ability to distinguish between classes.
- Assessment in SPSS: Typically used for binary classification. SPSS includes a ROC Curve procedure that plots the curve and reports the AUC; supply it with the actual class labels and the predicted probabilities saved from your model (e.g., from logistic regression).
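If you export the cell counts from the classification table along with the saved labels and predicted probabilities, the metrics SPSS does not print can be computed in a few lines. Below is a minimal Python sketch using hypothetical counts and probabilities (it assumes scikit-learn is installed for the AUC); it illustrates the formulas rather than reproducing SPSS output.

```python
import numpy as np
from sklearn.metrics import roc_auc_score  # assumes scikit-learn is installed

# Hypothetical counts read off an SPSS classification (confusion) table.
tp, fp, fn, tn = 45, 10, 5, 40

accuracy = (tp + tn) / (tp + fp + fn + tn)           # correct / total
precision = tp / (tp + fp)                           # TP among predicted positives
recall = tp / (tp + fn)                              # TP among actual positives
f1 = 2 * precision * recall / (precision + recall)   # harmonic mean of the two

print(f"accuracy={accuracy:.3f} precision={precision:.3f} "
      f"recall={recall:.3f} f1={f1:.3f}")

# ROC-AUC needs predicted probabilities, not just the table counts.
# These arrays are hypothetical stand-ins for exported labels and probabilities.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_prob = np.array([0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.8, 0.3])
print(f"roc_auc={roc_auc_score(y_true, y_prob):.3f}")
```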
Regression Model Evaluation Metrics:
- Mean Squared Error (MSE):
- Definition: Average squared difference between predicted and actual values.
- Assessment in SPSS: Reported in the regression ANOVA table as the Mean Square for the Residual row.
- Root Mean Squared Error (RMSE):
- Definition: Square root of the MSE, expressed in the same units as the dependent variable and therefore easier to interpret.
- Assessment in SPSS: Not labeled as RMSE, but you can take the square root of the residual mean square from the ANOVA table; the "Std. Error of the Estimate" in the Model Summary table is this same quantity (see the sketch after this list).
- R-Squared (Coefficient of Determination):
- Definition: Proportion of the variance in the dependent variable explained by the model.
- Assessment in SPSS: Displayed in the "Model Summary" table as "R Square".
- Adjusted R-Squared:
- Definition: Adjusts R-squared for the number of predictors in the model.
- Assessment in SPSS: Also in the "Model Summary" table, labeled "Adjusted R Square".
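For completeness, here is a minimal Python sketch of these regression metrics computed directly from actual and predicted values. The arrays and the predictor count are hypothetical; in practice you would export the dependent variable and the saved predicted values from SPSS.

```python
import numpy as np

# Hypothetical actual and predicted values exported from a fitted model.
y_true = np.array([3.0, 5.0, 7.5, 9.0, 11.0])
y_pred = np.array([2.8, 5.4, 7.0, 9.5, 10.6])
p = 2  # number of predictors in the model (assumed for illustration)

residuals = y_true - y_pred
mse = np.mean(residuals ** 2)                     # average squared error
rmse = np.sqrt(mse)                               # same units as the outcome

ss_res = np.sum(residuals ** 2)
ss_tot = np.sum((y_true - y_true.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                          # proportion of variance explained

n = len(y_true)
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)     # penalizes extra predictors

print(f"MSE={mse:.3f} RMSE={rmse:.3f} R2={r2:.3f} adj_R2={adj_r2:.3f}")
```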
When evaluating models, it’s crucial to consider the specific goals of your analysis. Some metrics may be more important depending on whether you prioritize precision, recall, accuracy, or other factors. Additionally, cross-validation can be used to assess the model’s generalizability.
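As a rough illustration of the cross-validation idea, the sketch below runs 5-fold cross-validation on synthetic data in Python (assuming scikit-learn is available). The data, model, and fold count are illustrative; the point is that averaging out-of-sample scores gives a more honest view of generalizability than a single in-sample fit.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Synthetic data: 100 observations, 3 predictors, known coefficients plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=1.0, size=100)

# 5-fold cross-validation: fit on 4 folds, score R-squared on the held-out fold.
model = LinearRegression()
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"fold R2 scores: {np.round(scores, 3)}")
print(f"mean cross-validated R2: {scores.mean():.3f}")
```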