<div><p>Introduction</p><p>The level of minimal residual disease (MRD) in marrow predicts outcome and guides treatment in childhood acute lymphoblastic leukemia (ALL), but accurate prediction depends on accurate measurement.</p><p>Methods</p><p>Forty-one children with ALL were studied at the end of induction. Two samples were obtained from each iliac spine, and each sample was assayed twice. Assay, sample, and side-to-side variation were quantified by analysis of variance. Presumptively incorrect decisions related to high-risk disease were determined using the result from each MRD assay, the mean MRD in the patient as the measure of the true value, and each of three different MRD cut-off levels that have been used for making treatment decisions.</p><p>Results</p><p>Variation between assays, samples, and sides each differed significantly from zero, and the overall standard deviation for a single MRD estimation was 0.60 logs. Multifocal residual disease appeared to be at least partly responsible for the variation between samples. Decision errors occurred at a frequency of 13–14% when the mean patient MRD was between 10<sup>−2</sup> and 10<sup>−5</sup>. Decision errors were observed only for MRD results within 1 log of the cut-off value used for assessing high risk. Depending on the cut-off used, 31–40% of MRD results fell within 1 log of the cut-off value, and 16–21% of such results would have resulted in a decision error.</p><p>Conclusion</p><p>When the level of MRD obtained is within 1 log of the cut-off value used for making decisions, variation in the assay and/or sampling may result in a misleading assessment of the true level of marrow MRD. This may lead to an incorrect treatment decision.</p></div>
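The relationship reported in the Results, that decision errors cluster among results near the cut-off, can be illustrated with a minimal simulation. This is a sketch, not the authors' analysis: it assumes normally distributed measurement error on the log<sub>10</sub> scale with the reported overall SD of 0.60 logs, a hypothetical cut-off of 10<sup>−3</sup> (one of several levels used in practice), and a uniform spread of true MRD levels between 10<sup>−5</sup> and 10<sup>−1</sup>.

```python
import random

random.seed(0)
CUTOFF_LOG = -3.0   # hypothetical high-risk cut-off of 10^-3 (log10 scale)
ASSAY_SD = 0.60     # overall SD of a single MRD estimate, in logs (from the abstract)
N = 100_000

errors = 0          # measured result puts patient on the wrong side of the cut-off
near_cutoff = 0     # measured result within 1 log of the cut-off
errors_near = 0     # decision errors among near-cut-off results

for _ in range(N):
    true_log = random.uniform(-5.0, -1.0)            # assumed true MRD level
    measured_log = random.gauss(true_log, ASSAY_SD)  # single noisy MRD estimate
    true_high = true_log >= CUTOFF_LOG
    measured_high = measured_log >= CUTOFF_LOG
    if measured_high != true_high:
        errors += 1
    if abs(measured_log - CUTOFF_LOG) <= 1.0:
        near_cutoff += 1
        if measured_high != true_high:
            errors_near += 1

print(f"overall decision-error rate: {errors / N:.1%}")
print(f"error rate among results within 1 log of cut-off: {errors_near / near_cutoff:.1%}")
```

Under these assumptions the conditional error rate among results within 1 log of the cut-off is substantially higher than the overall rate, consistent with the abstract's observation that errors were confined to that neighbourhood of the cut-off.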