We performed sensitive polymerase chain reaction-based minimal residual
disease (MRD) analyses on bone marrow samples at 9 follow-up time points
in 71 children with T-lineage acute lymphoblastic leukemia (T-ALL) and
compared the results with the precursor B-lineage ALL (precursor B-ALL) results (n =
210) of our previous study. At the first 5 follow-up time points, the
frequency of MRD-positive patients and the MRD levels were higher in T-ALL
than in precursor B-ALL, reflecting the more frequent occurrence of
resistant disease in T-ALL. Subsequently, patients were classified
according to their MRD level at time point 1 (TP1), taken at the end of
induction treatment (5 weeks), and at time point 2 (TP2), taken just before the start of
consolidation treatment (3 months). Patients were considered at low risk
if the TP1 and TP2 samples were MRD negative, and at high risk if MRD levels at both
TP1 and TP2 were 10⁻³ or higher; the remaining patients were considered at
intermediate risk. The relative distribution of patients with T-ALL (n =
43) over the MRD-based risk groups differed significantly from that of
precursor B-ALL (n = 109). Twenty-three percent of patients with T-ALL and
46% of patients with precursor B-ALL were classified in the low-risk group
(P = .01) and had a 5-year relapse-free survival (RFS) rate of 98% or
greater. In contrast, 28% of patients with T-ALL were classified in the
MRD-based high-risk group compared to only 11% of patients with precursor
B-ALL (P = .02), and the RFS rates were 0% and 25%, respectively (P = .03).
Not only was the distribution of patients with T-ALL over the MRD-based risk
groups different, but the prognostic value of MRD levels at TP1 and TP2 was
also greater in T-ALL (larger RFS gradient), and MRD-negative patients with
T-ALL had consistently higher RFS rates at the first 5 follow-up time points.
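The MRD-based risk stratification described above is a simple two-time-point decision rule. The following Python sketch is purely illustrative of that rule; the function name, the use of None to encode an MRD-negative sample, and the numeric representation of MRD levels are assumptions made here for clarity and are not part of the study protocol.

    from typing import Optional

    HIGH_RISK_THRESHOLD = 1e-3  # MRD level of 10^-3, as used for the high-risk definition

    def classify_mrd_risk(mrd_tp1: Optional[float], mrd_tp2: Optional[float]) -> str:
        """Classify a patient into an MRD-based risk group.

        mrd_tp1 / mrd_tp2: MRD level at time point 1 (end of induction, 5 weeks)
        and time point 2 (before consolidation, 3 months); None means MRD negative.
        """
        negative_tp1 = mrd_tp1 is None
        negative_tp2 = mrd_tp2 is None

        if negative_tp1 and negative_tp2:
            return "low risk"            # MRD negative at both TP1 and TP2
        if (not negative_tp1 and mrd_tp1 >= HIGH_RISK_THRESHOLD
                and not negative_tp2 and mrd_tp2 >= HIGH_RISK_THRESHOLD):
            return "high risk"           # MRD >= 10^-3 at both TP1 and TP2
        return "intermediate risk"       # all remaining patients

    # Example: detectable but low MRD at TP1, MRD negative at TP2 -> intermediate risk
    print(classify_mrd_risk(1e-4, None))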