17 research outputs found

    Interference of fetal hemoglobin in the determination of carboxyhemoglobin by spectrophotometry

    Determination of carboxyhemoglobin (HbCO) is routinely performed in suspected cases of carbon monoxide intoxication and in unexplained deaths. However, some authors have suggested that measured HbCO may be falsely elevated in infants (0–12 months) due to the presence of fetal hemoglobin (HbF). The purpose of this study was to evaluate the impact of fetal hemoglobin on the spectrophotometric determination of carboxyhemoglobin. The interference of HbF in the determination of HbCO in infants aged 0 to 12 months was evaluated using 16 ante-mortem and 19 post-mortem blood samples. The %HbCO was quantified spectrophotometrically by calculating the 560 nm/530 nm absorbance ratio on a dual-beam spectrophotometer. The average measured HbCO in infants aged 3 months or under was 17%, which is abnormally elevated. No significant difference in HbCO measurement was found between ante-mortem and post-mortem samples. These results highlight that care must be taken in the interpretation of carboxyhemoglobin measurements in infants when using a spectrophotometric method.
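The ratio-to-%HbCO conversion described above can be sketched as a linear interpolation between the absorbance ratios of 0% and 100% HbCO reference samples. The reference ratios below are hypothetical placeholders for illustration, not values taken from the study:

```python
def percent_hbco(a560, a530, ratio_0=0.85, ratio_100=1.50):
    """Estimate %HbCO from the 560 nm / 530 nm absorbance ratio by
    linear interpolation between reference ratios measured on 0% and
    100% HbCO samples. ratio_0 and ratio_100 are hypothetical
    calibration values, not taken from the paper."""
    ratio = a560 / a530
    return 100.0 * (ratio - ratio_0) / (ratio_100 - ratio_0)
```

In practice the two reference ratios would be established on the same spectrophotometer with fully reduced and fully CO-saturated blood.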

    Determination of Confidence Intervals in Non-normal Data: Application of the Bootstrap to Cocaine Concentration in Femoral Blood

    Calculating a confidence interval is a common procedure in data analysis and is readily done for normally distributed populations with the familiar formula. However, when working with non-normally distributed data, determining the confidence interval is not as obvious. For this type of data, there are fewer references in the literature, and they are much less accessible. We describe, in simple language, the percentile and the bias-corrected and accelerated (BCa) variants of the bootstrap method for calculating confidence intervals. The method can be applied to a wide variety of parameters (mean, median, slope of a calibration curve, etc.) and is appropriate for both normal and non-normal data sets. As a worked example, the confidence interval around the median concentration of cocaine in femoral blood is calculated using bootstrap techniques. The median of the non-toxic concentrations was 46.7 ng/mL, with a 95% confidence interval of 23.9–85.8 ng/mL, in a non-normally distributed set of 45 post-mortem cases. This approach yields more statistically sound and accurate confidence intervals for non-normally distributed populations, such as reference values of therapeutic and toxic drug concentrations, and in situations where concentration values are truncated near the limit of quantification or the cut-off of a method.
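The percentile bootstrap described above is straightforward to implement; a minimal sketch (percentile variant only, not the bias-corrected and accelerated one):

```python
import random
import statistics

def bootstrap_ci_median(data, n_boot=10000, alpha=0.05, seed=42):
    """Percentile-bootstrap confidence interval for the median:
    resample the data with replacement n_boot times, take the median
    of each resample, and report the empirical alpha/2 and
    1 - alpha/2 quantiles of those medians."""
    rng = random.Random(seed)
    n = len(data)
    medians = sorted(
        statistics.median(rng.choices(data, k=n)) for _ in range(n_boot)
    )
    lo = medians[int((alpha / 2) * n_boot)]
    hi = medians[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

The same resampling loop works for any statistic (mean, calibration slope, etc.) by swapping `statistics.median` for the parameter of interest.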

    Cyanide quantification in post-mortem biological matrices by headspace GC–MS

    Cyanide is a powerful chemical asphyxiant found in some forensic cases following voluntary (suicide) or involuntary ingestion (fire, accidental exposure). A quantification method for cyanide that is specifically suited to post-mortem forensic purposes was developed. Determination was performed by headspace gas chromatography coupled to mass spectrometry using a GS-GASPRO column on an HP-6890 gas chromatograph with an HP-5973N mass detector. The biological sample was treated with an internal standard and frozen; glacial acetic acid was then added and the sample was incubated at 60 °C for 15 min. The headspace was sampled with a disposable syringe and analyzed to quantify hydrogen cyanide. Isotopically labeled cyanide (13C15N) was used as the internal standard to minimize matrix effects and sampling error. The method produced an extended linear dynamic range (0.07–50 μg/mL) and a method detection limit of 0.02 μg/mL. Identical calibration curves were obtained whether blood, gastric contents or aqueous solutions were used as the calibration standard matrix. The method was also successful in quantifying cyanide in gastric contents, one of the most variable biological fluids. The method has been validated and is in use for current forensic cases such as fire victims and suicides.
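Internal-standard quantification of this kind reduces to fitting the analyte/IS response ratio against concentration and inverting the line. A generic sketch (unweighted fit; the calibration numbers in the test are illustrative, not from the paper):

```python
def is_calibration(concs, ratios):
    """Unweighted least-squares line through (concentration,
    analyte/IS response ratio) calibration points.
    Returns (slope, intercept)."""
    n = len(concs)
    mean_x = sum(concs) / n
    mean_y = sum(ratios) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(concs, ratios))
             / sum((x - mean_x) ** 2 for x in concs))
    return slope, mean_y - slope * mean_x

def quantify(ratio, slope, intercept):
    """Convert a measured analyte/IS response ratio into a concentration."""
    return (ratio - intercept) / slope
```

Because the isotopically labeled internal standard tracks the analyte through sampling and ionization, the ratio-based curve is what makes the calibration matrix-independent.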

    A case of fatal idiosyncratic reaction to the designer drug 3,4-methylenedioxypyrovalerone (MDPV) and review of the literature

    The stimulant designer drug 3,4-methylenedioxypyrovalerone (MDPV) was first synthesized by Boehringer Ingelheim in 1969 and introduced on the black market in 2006. Only a small number of fatal intoxication cases have been reported in the literature, all with significant blood MDPV concentrations. In this report, we describe one fatality attributed to an idiosyncratic reaction to MDPV. The victim displayed agitation, violent behavior and delirium, followed by cardiac arrest. Hyperthermia was observed at the hospital. The MDPV concentration was 6 ng/mL in both cardiac and femoral blood. The presence of excited delirium syndrome and of MDPV, a drug with a pharmacology similar to that of cocaine, leads to the conclusion that the victim suffered a fatal adverse reaction to MDPV. This is the first published case of an idiosyncratic reaction to MDPV.

    A Tool for Automatic Correction of Endogenous Concentrations: Application to BHB Analysis by LC–MS-MS and GC-MS

    Several substances relevant for forensic toxicology purposes have an endogenous presence in biological matrices: beta-hydroxybutyric acid (BHB), gamma-hydroxybutyric acid (GHB), steroids and human insulin, to name only a few. The presence of significant amounts of these endogenous substances in the biological matrix used to prepare calibration standards and quality control samples (QCs) can compromise validation steps and quantitative analyses. Several approaches to overcome this problem have been suggested, including using an analog matrix or analyte, relying entirely on standard addition analyses for these analytes, or simply ignoring the endogenous contribution provided that it is small enough. Although these approaches side-step the issue of endogenous analyte presence in spiked matrix-matched samples, they create serious problems with regard to the accuracy of the analyses or to production capacity. We present here a solution that addresses head-on the problem of endogenous concentrations in matrices used for calibration standards and quality control purposes. The endogenous analyte concentration is estimated via a standard-addition type process. The sum of this estimated concentration and the spiked concentration is then used as the de facto analyte concentration of the sample, and these de facto concentrations are entered in the data analysis software (MultiQuant, MassHunter, etc.) as the sample concentrations. This yields an accurate quantification of the analyte, free from interference from the endogenous contribution. This de facto correction has been applied in a production setting to two BHB quantification methods (GC-MS and LC–MS-MS), allowing the rectification of BHB biases of up to 30 μg/mL. The additional error introduced by the correction procedure is minimal, although the exact amount is highly method-dependent. The endogenous concentration correction process has been automated with an R script. The final procedure is therefore highly efficient, adding only four mouse clicks to the data analysis operations.
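The standard-addition style estimation of the endogenous level, and the resulting de facto concentrations, can be sketched as follows. This is a generic illustration of the idea, not the authors' R script:

```python
def endogenous_concentration(spiked, responses):
    """Standard-addition estimate of the endogenous analyte level:
    fit instrument response vs spiked concentration with an
    unweighted line and extrapolate to zero response, giving
    endogenous = intercept / slope."""
    n = len(spiked)
    mean_x = sum(spiked) / n
    mean_y = sum(responses) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(spiked, responses))
             / sum((x - mean_x) ** 2 for x in spiked))
    intercept = mean_y - slope * mean_x
    return intercept / slope

def de_facto_concentrations(endogenous, spiked_levels):
    """De facto concentration of each calibrator or QC:
    endogenous contribution plus the spiked amount."""
    return [endogenous + s for s in spiked_levels]
```

The de facto values returned by the second function are what would be entered as the nominal concentrations in the data analysis software.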

    Interference of fetal hemoglobin in the quantification of carboxyhemoglobin by spectrophotometry

    Quantification of carboxyhemoglobin (HbCO) is routinely performed when carbon monoxide poisoning is suspected, as well as in cases of unexplained death. However, some authors have raised the issue of falsely elevated measured HbCO in infants (0 to 12 months) due to the presence of fetal hemoglobin (HbF). The purpose of this study was to evaluate the impact of fetal hemoglobin on the spectrophotometric quantification of carboxyhemoglobin. The interference of HbF in the quantification of HbCO in infants aged 0 to 12 months was evaluated using 16 ante-mortem and 19 post-mortem blood samples. The percentage of HbCO (%HbCO) was quantified spectrophotometrically by calculating the 560 nm/530 nm absorbance ratio, using a dual-beam spectrophotometer. The average measured HbCO in infants aged 3 months or under was 17%, which is abnormally elevated. No significant difference in HbCO measurement was found between ante-mortem and post-mortem samples.

    Qualitative method validation and uncertainty evaluation via the binary output: I – Validation guidelines and theoretical foundations

    Qualitative methods have an important place in forensic toxicology, filling central needs in, amongst others, screening and analyses linked to per se legislation. Nevertheless, bioanalytical method validation guidelines either do not discuss this type of method or describe validation procedures ill adapted to qualitative methods. The outputs of qualitative methods are typically categorical, binary results such as “presence”/“absence” or “above cut-off”/“below cut-off”. Since the goal of any method validation is to demonstrate fitness for use under production conditions, guidelines should evaluate performance by relying on these discrete results instead of the continuous measurements (e.g. peak height, area ratio) from which they are derived. We have developed a tentative validation guideline for decision point qualitative methods by modeling the behaviour of measurements and of the derived binary results, based on the literature and on experimental results. This preliminary guideline was applied to an LC-MS/MS method for 40 analytes, each with a defined cut-off concentration. The standard deviation of measurements at the cut-off concentration was estimated based on 10 spiked samples. Analytes were binned according to their %RSD (8.00%, 16.5%, 25.0%). Validation parameters calculated from the analysis of 30 samples spiked below and above the cut-off (false negative rate, false positive rate, selectivity rate, sensitivity rate and reliability rate) showed a surprisingly high failure rate. Overall, 13 of the 40 analytes were not considered validated. Subsequent examination found that this was attributable to an appreciable shift in the standard deviation of the area ratio between different batches of samples. Keeping this behaviour in mind when setting the validation concentrations, the developed guideline can be used to validate qualitative decision point methods, relying on binary results for performance evaluation and taking measurement uncertainty into account. An application of this method validation scheme is presented in the accompanying paper (II – Application to a multi-analyte LC-MS/MS method for oral fluid).
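The binary validation parameters named above (false negative, false positive, sensitivity, selectivity and reliability rates) can be computed directly from the categorical results of the spiked samples; a generic sketch:

```python
def validation_rates(results_above, results_below):
    """Binary performance rates for a decision point method.
    results_above: booleans for samples spiked above the cut-off
        (True = reported "above cut-off", i.e. a true positive).
    results_below: booleans for samples spiked below the cut-off
        (True = reported "above cut-off", i.e. a false positive)."""
    tp = sum(results_above)
    fn = len(results_above) - tp
    fp = sum(results_below)
    tn = len(results_below) - fp
    return {
        "false_negative_rate": fn / (tp + fn),
        "false_positive_rate": fp / (fp + tn),
        "sensitivity_rate": tp / (tp + fn),
        "selectivity_rate": tn / (fp + tn),
        "reliability_rate": (tp + tn) / (tp + fn + fp + tn),
    }
```

Each analyte's rates would then be compared against the acceptance thresholds set in the validation guideline.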

    Procedure for the selection and validation of a calibration model: I — Description and application

    Calibration model selection is required for all quantitative methods in toxicology and, more broadly, in bioanalysis. This typically involves selecting the equation order (linear or quadratic) and the weighting factor that correctly model the data. Mis-selecting the calibration model degrades quality control (QC) accuracy, with errors of up to 154%. Unfortunately, simple tools to perform this selection, and tests to validate the resulting model, are lacking. We present a stepwise, analyst-independent scheme for the selection and validation of calibration models. The success rate of this scheme is on average 40% higher than that of a traditional “fit and check the QC accuracy” approach to selecting the calibration model. Moreover, the process was completely automated through a script (available in Supplemental Data 3) running in RStudio (free, open-source software). The need for weighting was assessed through an F-test using the variances of the upper limit of quantification and lower limit of quantification replicate measurements. When weighting was required, the choice between 1/x and 1/x² was made by determining which option generated the smallest spread of weighted normalized variances. Finally, the model order was selected through a partial F-test. The chosen calibration model was validated through Cramér–von Mises or Kolmogorov–Smirnov normality testing of the standardized residuals. Performance of the different tests was assessed using 50 simulated data sets per possible calibration model (e.g., linear-no weight, quadratic-no weight, linear-1/x, etc.). This first of two papers describes the tests, procedures and outcomes of the developed scheme using real LC-MS/MS results for the quantification of cocaine and naltrexone.
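A partial F-test of the kind used for model-order selection compares the residual sums of squares of the linear and quadratic fits. A self-contained sketch (pure Python, unweighted normal-equations fit; the published procedure is an R script, so this is an illustration of the statistic, not the authors' implementation):

```python
def polyfit(x, y, degree):
    """Unweighted least-squares polynomial fit via the normal equations
    (adequate for degree 1-2 and well-scaled calibration data).
    Returns coefficients in increasing order: [c0, c1, ...]."""
    m = degree + 1
    a = [[sum(xi ** (i + j) for xi in x) for j in range(m)] for i in range(m)]
    b = [sum((xi ** i) * yi for xi, yi in zip(x, y)) for i in range(m)]
    for col in range(m):  # Gaussian elimination with partial pivoting
        piv = max(range(col, m), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = a[r][col] / a[col][col]
            for c in range(col, m):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    coefs = [0.0] * m
    for r in range(m - 1, -1, -1):  # back substitution
        s = b[r] - sum(a[r][c] * coefs[c] for c in range(r + 1, m))
        coefs[r] = s / a[r][r]
    return coefs

def rss(x, y, coefs):
    """Residual sum of squares of a polynomial fit."""
    return sum((yi - sum(c * xi ** k for k, c in enumerate(coefs))) ** 2
               for xi, yi in zip(x, y))

def partial_f(x, y):
    """Partial F statistic testing whether adding a quadratic term
    significantly improves on the straight-line fit."""
    rss_lin = rss(x, y, polyfit(x, y, 1))
    rss_quad = rss(x, y, polyfit(x, y, 2))
    return (rss_lin - rss_quad) / (rss_quad / (len(x) - 3))
```

The resulting statistic is compared against the critical value of an F(1, n−3) distribution at the chosen significance level (from tables or, e.g., `scipy.stats.f`); only if it exceeds the critical value is the quadratic model retained.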

    Procedure for the selection and validation of a calibration model: II — Theoretical basis

    In the first part of this paper (I — Description and application), an automated, stepwise and analyst-independent process for the selection and validation of calibration models was put forward and applied to two model analytes. This second part presents the mathematical reasoning and experimental work underlying the selection of the different components of this procedure. Different replicate analysis designs (intra-/inter-day and intra-/inter-extraction) were tested and their impact on test results was evaluated. For most methods, the use of intra-day/intra-extraction measurement replicates is recommended due to their decreased variability. This process should be repeated three times during validation in order to assess the time stability of the underlying model. Strategies for the identification of heteroscedasticity, and their potential weaknesses, were examined, and a unilateral F-test using the lower limit of quantification and upper limit of quantification replicates was chosen. Three different options for model selection were examined and tested: ANOVA lack-of-fit (LOF), the partial F-test and the significance of the second-order term. Examination of the mathematical assumptions of each test, together with LC-MS/MS experimental results, led to the selection of the partial F-test as the most suitable. The advantages and drawbacks of ANOVA-LOF, examination of the standardized residuals graph and normality testing of the residuals (Kolmogorov–Smirnov or Cramér–von Mises) for validation of the calibration model were examined, with the last option proving the best in light of its robustness and accuracy. Choosing the correct calibration model improves QC accuracy, and simulations have shown that this automated scheme performs much better than the more traditional approach of fitting increasingly complex models until QC accuracies pass below a threshold.
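The unilateral F-test for heteroscedasticity and the weighting-factor choice described above can be sketched as follows. The critical value `f_crit` must be supplied externally (e.g. from F tables or `scipy.stats.f.ppf`), and the spread criterion is a simplified reading of the text, not the authors' exact R implementation:

```python
import statistics

def needs_weighting(lloq_reps, uloq_reps, f_crit):
    """Unilateral F-test for heteroscedasticity: if the variance of
    the upper-limit-of-quantification replicates is significantly
    larger than that of the lower-limit replicates, a weighted
    calibration model is indicated. f_crit is the upper critical
    value of F(n_uloq - 1, n_lloq - 1) at the chosen alpha."""
    f_stat = statistics.variance(uloq_reps) / statistics.variance(lloq_reps)
    return f_stat, f_stat > f_crit

def choose_weight(concs, variances):
    """Choose 1/x or 1/x^2 as the weighting factor: pick whichever
    gives the smaller spread of weighted normalized variances."""
    spreads = {}
    for name, w in (("1/x", lambda c: 1 / c), ("1/x^2", lambda c: 1 / c ** 2)):
        weighted = [w(c) * v for c, v in zip(concs, variances)]
        mean_w = sum(weighted) / len(weighted)
        normalized = [v / mean_w for v in weighted]
        spreads[name] = max(normalized) - min(normalized)
    return min(spreads, key=spreads.get)
```

When the replicate variance grows roughly in proportion to concentration, 1/x wins; when it grows with the square of concentration, 1/x² does.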

    Qualitative method validation and uncertainty evaluation via the binary output: II - Application to a multi-analyte LC-MS/MS method for oral fluid

    A study of impaired driving rates in the province of Québec is currently planned following the legalization of recreational cannabis in Canada. Oral fluid (OF) samples are to be collected with a Quantisal device and sent to the laboratory for analysis. In preparation for this project, a qualitative decision point method monitoring for the presence of 97 drugs and metabolites in OF was validated according to the guidelines presented in the first part of this paper (I – Validation guidelines and theoretical foundations). This high-throughput method uses incubation with a precipitation solvent (acetone:acetonitrile 30:70 v:v) to boost drug recovery from the collection device and improve the stability of benzodiazepines (e.g. α-hydroxyalprazolam, clonazepam, 7-aminoclonazepam, flunitrazepam, 7-aminoflunitrazepam, N-desmethylflunitrazepam, nitrazepam). The Quantisal device has polyglycol in its stabilizing buffer, but timed use of the mass spectrometer waste valve proved sufficient to avoid glycol interferences for nearly all analytes. Interferences from OF matrices and from 140 potentially interfering compounds, carryover, ion ratios, stability, recovery, reproducibility, robustness, and the false positive, false negative, selectivity, sensitivity and reliability rates were tested during validation. Five of the targeted analytes (olanzapine, oxazepam, 7-aminoclonazepam, flunitrazepam and nitrazepam) did not meet the set validation criteria but will be monitored for identification purposes (no comparison to a cut-off level). Blind internal proficiency testing was performed, in which six OF samples were tested and analytes were successfully classified as “negative”, “likely positive” or “positive”. The final validated OF qualitative decision point method covers 92 analytes, and the presence of 5 additional analytes is screened in this high-throughput analysis.