3 research outputs found

    Robust estimation of bacterial cell count from optical density

    Optical density (OD) is widely used to estimate the density of cells in liquid culture, but OD readings cannot be compared between instruments without a standardized calibration protocol, and they are challenging to relate to actual cell count. We address this with an interlaboratory study comparing three simple, low-cost, and highly accessible OD calibration protocols across 244 laboratories, applied to eight strains of constitutively GFP-expressing E. coli. Based on our results, we recommend calibrating OD to estimated cell count using serial dilution of silica microspheres, which produces a highly precise calibration (95.5% of residuals <1.2-fold), is easily assessed for quality control, determines an instrument's effective linear range, and can be combined with fluorescence calibration to obtain units of Molecules of Equivalent Fluorescein (MEFL) per cell, allowing direct comparison and data fusion with flow cytometry measurements. In our study, fluorescence-per-cell measurements showed only a 1.07-fold mean difference between plate reader and flow cytometry data.
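    As a rough illustration of the recommended protocol, the sketch below (Python) fits a calibration from a twofold serial dilution of microspheres and uses it to convert OD readings to estimated counts. The stock count, the simulated OD readings, and the linear-range thresholds are all assumptions made for the sake of a runnable example, not values from the study.

        # Hypothetical calibration sketch: twofold serial dilution of silica
        # microspheres with an assumed stock count; OD readings are simulated
        # with mild saturation so the example runs stand-alone.
        import numpy as np

        rng = np.random.default_rng(0)
        stock_count = 1.0e9                          # assumed microspheres in well 1
        counts = stock_count / 2.0 ** np.arange(12)  # twofold serial dilution

        # Simulated blank-corrected OD600: linear at low density, saturating
        # at high density, with 2% multiplicative measurement noise.
        true_slope = 2.5e-9
        od = 1.5 * (1.0 - np.exp(-true_slope * counts / 1.5))
        od *= rng.normal(1.0, 0.02, size=od.shape)

        # Keep only wells inside an assumed effective linear range; the two
        # excluded ends are exactly what the protocol's QC step flags.
        mask = (od > 0.01) & (od < 0.8)

        # Least-squares line through the origin: count ~= slope * OD.
        slope = np.sum(counts[mask] * od[mask]) / np.sum(od[mask] ** 2)

        def od_to_count(sample_od):
            """Estimate particle (or cell) count from a blank-corrected OD."""
            return slope * sample_od

        print(f"estimated count at OD 0.25: {od_to_count(0.25):.3g}")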

    The Linearity of Low Frequency Traffic Flow: An Intrinsic I/O Property in Queueing Systems

    Consider a class of queueing systems that can be modeled by a finite Quasi-Birth-Death (QBD) process. In this paper we develop a powerful computational technique for spectral analyses (i.e., second-order statistics) of output, queue, and loss. Emphasis is placed on the output power spectrum and the input-output coherence function in response to various input power spectral properties and system parameters. The coherence function is defined to measure the linear relationship between the input and output processes. A key technical contribution of this paper is the exploration of the linearity of low-frequency traffic flow. Through evaluation of the coherence function, one can identify a so-called nonlinear break frequency, ω_b, below which low-frequency traffic passes through a queueing system intact. Such low-frequency I/O linearity plays an important role in characterizing the output process, which may form a partial input to other "downstream" queues of the network. After all, it is the "upstream" ou…
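    To make the coherence analysis concrete, the hypothetical sketch below simulates a simple finite-buffer queue (a stand-in for the paper's QBD model, not its actual technique), estimates the input-output coherence with scipy.signal.coherence, and reads off a break frequency at an assumed threshold. The arrival process, service discipline, and cut-off value are all illustrative.

        # Hypothetical sketch: input-output coherence of a discrete-time
        # finite-buffer queue, with a break frequency read off at an assumed
        # coherence threshold.
        import numpy as np
        from scipy.signal import coherence

        rng = np.random.default_rng(1)
        T = 200_000

        # Markov-modulated Bernoulli arrivals: a two-state on/off source.
        p_stay = 0.99                 # probability of keeping the current state
        rates = np.array([0.2, 0.9])  # per-slot arrival probability in each state
        state = np.zeros(T, dtype=int)
        for t in range(1, T):
            state[t] = state[t - 1] if rng.random() < p_stay else 1 - state[t - 1]
        arrivals = (rng.random(T) < rates[state]).astype(float)

        # Finite-buffer FIFO queue with Bernoulli service (rate 0.6 per slot).
        buffer_size = 50
        q = 0.0
        departures = np.empty(T)
        for t in range(T):
            q = min(q + arrivals[t], buffer_size)  # admit, dropping overflow
            served = 1.0 if (q >= 1.0 and rng.random() < 0.6) else 0.0
            departures[t] = served
            q -= served

        # Magnitude-squared coherence between input and output processes.
        f, coh = coherence(arrivals, departures, fs=1.0, nperseg=4096)

        # Break frequency: first frequency (above DC) where coherence falls
        # below the assumed threshold.
        threshold = 0.9
        below = np.nonzero(coh[1:] < threshold)[0]
        omega_b = f[1:][below[0]] if below.size else f[-1]
        print(f"estimated break frequency: {omega_b:.4f} cycles/slot")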

    Systematic evaluation of common natural language processing techniques to codify clinical notes.

    Proper codification of medical diagnoses and procedures is essential for optimized health care management, quality improvement, research, and reimbursement tasks within large healthcare systems. Assignment of diagnostic or procedure codes is a tedious manual process, often prone to human error. Natural Language Processing (NLP) has been suggested to facilitate this manual codification process, yet little is known about best practices for utilizing NLP in such applications. With Large Language Models (LLMs) becoming more ubiquitous in daily life, it is critical to remember that not every task requires that level of resource and effort. Here we comprehensively assessed the performance of common NLP techniques for predicting Current Procedural Terminology (CPT) codes from operative notes. CPT codes are commonly used to track surgical procedures and interventions and are the primary means of reimbursement. Our analysis of the 100 most common musculoskeletal CPT codes suggests that traditional approaches can significantly outperform more resource-intensive approaches like BERT (P-value = 4.4e-17), with an average AUROC of 0.96 and accuracy of 0.97, in addition to providing interpretability, which can be very helpful and even crucial in the clinical domain. We also proposed a measure to quantify the complexity of a classification task and examined how this measure could influence the effect of dataset size on a model's performance. Finally, we provide preliminary evidence that NLP can help minimize codification errors, including mislabeling due to human error.
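    As a hedged illustration of the kind of traditional baseline evaluated here, the sketch below trains a TF-IDF plus logistic-regression pipeline on a few invented operative-note snippets. The notes, labels, and model settings are made up for the example; they do not reproduce the study's data or exact models.

        # Hypothetical sketch: a traditional NLP baseline (TF-IDF features plus
        # a linear classifier) for predicting CPT codes from operative notes.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Toy operative-note snippets with illustrative CPT labels.
        notes = [
            "arthroscopic repair of rotator cuff tear, right shoulder",
            "total knee arthroplasty with cemented components",
            "open reduction internal fixation of distal radius fracture",
            "arthroscopic partial medial meniscectomy, left knee",
            "total knee arthroplasty, posterior stabilized implant",
            "rotator cuff repair with subacromial decompression",
        ]
        cpt = ["29827", "27447", "25607", "29881", "27447", "29827"]

        model = make_pipeline(
            TfidfVectorizer(ngram_range=(1, 2), sublinear_tf=True),
            LogisticRegression(max_iter=1000),
        )
        model.fit(notes, cpt)

        # Predict a code for an unseen note; interpretability comes from the
        # sparse TF-IDF features and the linear model's coefficients.
        print(model.predict(["arthroscopic rotator cuff repair of the left shoulder"]))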