Here is a message from subscriber Dkaguni:
“Historical ranges are often used to determine the range for coagulation assays. Are there general guidelines as to how large these ranges can be?”
Hello, and thank you for your question.
I’ll begin by assuming your question addresses internal assay precision, which is typically expressed as the coefficient of variation (CV%). For assistance, I checked with John Olson, MD, University of Texas Medical Center, San Antonio. He reports there is no historical range or rule of thumb for an acceptable CV%. Manufacturers publish the CV% for each of their assays in the package insert; some are as small as 4%, while others remain clinically effective at greater than 10%. The key point, Dr. Olson recommends, is that the laboratory perform and record its own local (in-house) precision studies to confirm the manufacturer’s claims. Methods for in-house precision studies are presented in audio modules 3 and 4.
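To illustrate the arithmetic behind an in-house precision study, here is a minimal sketch in Python. The replicate values are hypothetical, not real assay data; the computation itself is just the standard definition CV% = (SD / mean) × 100 applied to a set of replicate results:

```python
import statistics

def cv_percent(values):
    """Coefficient of variation as a percentage: (sample SD / mean) * 100."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)  # sample standard deviation (n - 1 denominator)
    return (sd / mean) * 100

# Hypothetical: 10 replicate PT results (seconds) from one precision run
pt_replicates = [12.1, 12.3, 11.9, 12.0, 12.2, 12.4, 11.8, 12.1, 12.0, 12.2]
print(f"CV% = {cv_percent(pt_replicates):.1f}")  # prints "CV% = 1.5"
```

The resulting CV% can then be compared against the manufacturer’s package-insert claim for that assay and lot.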
If your question is directed to preparation of reference intervals (normal ranges), the standard approach is to assay aliquots from a well-defined cohort of normal subjects and compute the mean and standard deviation. Assuming a Gaussian distribution, the typical reference interval is the mean ± 2 standard deviations, which encloses the central 95% of the distribution (a central 95% interval of the population values, not a confidence interval on the mean). Methods for computing reference intervals are provided in audio modules 5 and 6.
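The mean ± 2 SD computation can be sketched as follows. The APTT values below are hypothetical placeholders, and a real reference-interval study would use a much larger cohort (guidelines such as CLSI EP28 typically call for at least 120 reference subjects); this only shows the arithmetic:

```python
import statistics

def reference_interval(values, z=2.0):
    """Gaussian reference interval: mean +/- z * sample SD (z = 2 covers ~95%)."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return (mean - z * sd, mean + z * sd)

# Hypothetical APTT results (seconds) from healthy subjects
aptt = [28.1, 30.4, 29.2, 31.0, 27.8, 29.9, 30.5, 28.7, 29.3, 30.1]
low, high = reference_interval(aptt)
print(f"Reference interval: {low:.1f}-{high:.1f} s")
```

If the cohort’s results are visibly non-Gaussian, a nonparametric approach (taking the 2.5th and 97.5th percentiles directly) is the usual alternative.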