Percent Residual Accuracy for Quantifying Goodness-of-fit of Linear Calibration Curves
Linear models for calibration curves are overwhelmingly created by minimizing least-squares error, with their goodness-of-fit (GOF) quantified using the square of the correlation coefficient (R²). Yet R² has well-known disadvantages when used to quantify the GOF of calibration curves, stemming from its calculation based on the absolute error of the signal (i.e., calculated vs. experimental). These disadvantages are exacerbated when a geometric series of concentrations is used for the calibration standards (e.g., 1, 2, 5, 10, etc.) and when calibration curves span 2–3 orders of magnitude, as is typical for modern analytical techniques. While multiple alternative GOF measures exist, R² overwhelmingly persists in the field of Analytical Chemistry as the most reported measure of GOF. We evaluated R², alternative GOF measures, and multiple quantitative bias parameters, along with residual analysis, for over 60 experimental calibration curves. R² did a poor job of consistently and accurately quantifying GOF over the entire calibration curve, especially when the low-concentration calibrators were not accurately described by the calibration equation. While other GOF parameters, including the sum of the absolute percent error, the mean absolute percent error, and the quality coefficient, described the GOF of calibration curves better, each had significant theoretical and/or practical disadvantages. Therefore, we introduce a descriptive GOF parameter called Percent Residual Accuracy (%RA or PRA), which equally weights the accuracy of all calibrators in a single value, generally falling between 0% and 100%, with 100% representing a perfect fit and a "good" fit for calibration data producing a %RA of 90–100%. The %RA described the GOF over the entire calibration range much more effectively than R², and it quantified GOF similarly to the other GOF parameters tested.
Given its performance and practical advantages, we conclude that %RA is the most useful GOF parameter and that it should be reported as a standard GOF measure for linear calibration curves.
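The abstract does not give the %RA formula, so the following is only a minimal sketch of the general idea it describes, assuming %RA is computed as 100 minus the mean absolute percent error of back-calculated concentrations (so every calibrator contributes equally, regardless of its concentration). The function name and the assumed definition are illustrative, not the paper's exact method.

```python
import numpy as np

def percent_residual_accuracy(conc, signal):
    """Illustrative %RA sketch (assumed definition, not the paper's exact
    formula): fit an unweighted least-squares line, back-calculate each
    calibrator's concentration from its signal, and average the
    per-calibrator percent accuracies so all calibrators weigh equally."""
    conc = np.asarray(conc, dtype=float)
    signal = np.asarray(signal, dtype=float)
    slope, intercept = np.polyfit(conc, signal, 1)     # ordinary least-squares line
    back_calc = (signal - intercept) / slope           # back-calculated concentrations
    pct_error = np.abs(back_calc - conc) / conc * 100  # percent residual per calibrator
    return 100.0 - pct_error.mean()                    # 100% would be a perfect fit

# Hypothetical example: a geometric series of calibrators spanning
# three orders of magnitude, with small proportional noise on the signal.
rng = np.random.default_rng(0)
conc = np.array([1, 2, 5, 10, 20, 50, 100, 200, 500, 1000], dtype=float)
signal = 2.0 * conc * (1 + rng.normal(0.0, 0.02, conc.size)) + 0.5
ra = percent_residual_accuracy(conc, signal)
```

Because each calibrator's percent error enters the average with equal weight, a poor fit at the lowest concentrations lowers %RA noticeably, whereas R² (driven by absolute signal error) is dominated by the high-concentration calibrators.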
Logue, Brian A., "Percent Residual Accuracy for Quantifying Goodness-of-fit of Linear Calibration Curves" (2018). Chemistry and Biochemistry Faculty Publications. 77.