Precision cancer monitoring using a novel, fully integrated, microfluidic array partitioning digital PCR platform.
A novel digital PCR (dPCR) platform combining off-the-shelf reagents and a micro-molded plastic microfluidic consumable with a fully integrated dPCR instrument was developed to address the needs of routine clinical diagnostics. This new platform offers a simplified workflow that enables rapid time-to-answer, low potential for cross-contamination, and minimal sample waste, all within a single integrated instrument. Here we showcase the capability of this fully integrated platform to detect and quantify a rare non-small cell lung carcinoma (NSCLC) mutation (EGFR T790M) using precision cell-free DNA (cfDNA) standards. Next, we validated the platform with an established chronic myeloid leukemia (CML) fusion-gene (BCR-ABL1) assay down to 0.01% mutant allele frequency to highlight the platform's utility for precision cancer monitoring. Third, using a juvenile myelomonocytic leukemia (JMML) patient-specific assay, we demonstrate the ability to precisely track an individual cancer patient's response to therapy and show the patient's achievement of complete molecular remission. These three applications highlight the flexibility and utility of this novel, fully integrated dPCR platform, which has the potential to transform personalized medicine for cancer recurrence monitoring.
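Quantification in digital PCR rests on Poisson statistics over the array partitions: the mean template copies per partition is λ = -ln(fraction of negative partitions), and a mutant allele frequency follows from the Poisson-corrected mutant and wild-type copy estimates. A minimal sketch of that arithmetic (function names are hypothetical, not the platform's actual software):

```python
import math

def dpcr_copies(positive, total):
    """Poisson-corrected total template copies from a dPCR partition count.

    positive: number of partitions that fluoresced; total: partitions analyzed.
    """
    neg_fraction = 1 - positive / total
    lam = -math.log(neg_fraction)   # mean copies per partition
    return lam * total              # total copies loaded across all partitions

def mutant_allele_frequency(mut_positive, wt_positive, total):
    """Fraction of mutant template among all template copies."""
    mut = dpcr_copies(mut_positive, total)
    wt = dpcr_copies(wt_positive, total)
    return mut / (mut + wt)
```

For example, 2 mutant-positive and 5,000 wild-type-positive partitions out of 20,000 would give a mutant allele frequency of roughly 0.03%, the order of magnitude at which the BCR-ABL1 validation above operates.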
Are geometric morphometric analyses replicable? Evaluating landmark measurement error and its impact on extant and fossil Microtus classification.
Geometric morphometric analyses are frequently employed to quantify biological shape and shape variation. Despite the popularity of this technique, quantification of measurement error in geometric morphometric datasets and its impact on statistical results is seldom assessed in the literature. Here, we evaluate error on 2D landmark coordinate configurations of the lower first molar of five North American Microtus (vole) species. We acquired data from the same specimens several times to quantify error from four data acquisition sources: specimen presentation, imaging devices, interobserver variation, and intraobserver variation. We then evaluated the impact of those errors on linear discriminant analysis-based classifications of the five species using recent specimens of known species affinity and fossil specimens of unknown species affinity. Results indicate that data acquisition error can be substantial, sometimes explaining >30% of the total variation among datasets. Comparisons of datasets digitized by different individuals exhibit the greatest discrepancies in landmark precision, and comparison of datasets photographed from different presentation angles yields the greatest discrepancies in species classification results. All error sources impact statistical classification to some extent. For example, no two landmark dataset replicates exhibit the same predicted group memberships of recent or fossil specimens. Our findings emphasize the need to mitigate error as much as possible during geometric morphometric data collection. Though the impact of measurement error on statistical fidelity is likely analysis-specific, we recommend that all geometric morphometric studies standardize specimen imaging equipment, specimen presentations (if analyses are 2D), and landmark digitizers to reduce error and subsequent analytical misinterpretations.
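The ">30% of total variation" figure above can be understood through a sums-of-squares decomposition: variation among replicate digitizations of the same specimen (error) versus total variation across all specimens. A simplified sketch, assuming the landmark configurations have already been Procrustes-superimposed (a full Procrustes ANOVA is more involved than this):

```python
import numpy as np

def percent_error_variance(coords):
    """Percent of total variation attributable to measurement error.

    coords: array (specimens, replicates, landmarks, dims) of aligned
    landmark coordinates; replicates are repeated digitizations.
    """
    s, r, k, d = coords.shape
    flat = coords.reshape(s, r, k * d)
    grand_mean = flat.mean(axis=(0, 1))
    spec_means = flat.mean(axis=1)                            # per-specimen means
    within = ((flat - spec_means[:, None, :]) ** 2).sum()     # replicate scatter
    total = ((flat - grand_mean) ** 2).sum()                  # all scatter
    return 100.0 * within / total
```

A value above 30, as reported in the study, would mean replicate-to-replicate scatter rivals the biological signal separating specimens.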
Knowledge Collaboration: Working with Data and Web Specialists
When resources are finite, people strive to manage them jointly (if they do not rudely take possession of them). Organizing helps achieve, and even amplify, common purpose but often succumbs in time to organizational silos, teaming for the sake of teaming, and the obstacle course of organizational learning. The result is that organizations, be they in the form of hierarchies, markets, or networks (or, increasingly, hybrids of these), fail to create the right value for the right people at the right time. In the 21st century, most organizations are in any event lopsided and should be redesigned to serve a harmonious mix of economic, human, and social functions. In libraries as elsewhere, the three Ss of Strategy-Structure-Systems must give way to the three Ps of Purpose-Processes-People. From there, with entrepreneurship and knowledge behaviors, data and web specialists can synergize in mutually supportive relationships of shared destiny.
Tools for quantitative form description : an evaluation of different software packages for semi-landmark analysis
The challenging complexity of biological structures has led to the development of several methods for quantitative analyses of form. Bones are shaped by the interaction of historical (phylogenetic), structural, and functional constraints. Consequently, bone shape has been investigated intensively in an evolutionary context. Geometric morphometric approaches allow the description of the shape of an object in all of its biological complexity. However, when biological objects present only a few anatomical landmarks, sliding semi-landmarks may provide good descriptors of shape. The sliding procedure, mandatory for sliding semi-landmarks, requires several steps that may be time-consuming. We here compare the time required by two different software packages ('Edgewarp' and 'Morpho') for the same sliding task, and investigate potential differences in the results and biological interpretation. 'Morpho' is much faster than 'Edgewarp,' notably as a result of the greater computational power of the 'Morpho' software routines and the complexity of the 'Edgewarp' workflow. Morphospaces obtained using both software packages are similar and provide a consistent description of the biological variability. The principal differences between the two software packages are observed in areas characterized by abrupt changes in the bone topography. In summary, both software packages perform equally well in terms of the description of biological structures, yet differ in the simplicity of the workflow and the time needed to perform the analyses.
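The morphospaces being compared above are typically principal component ordinations of the slid, Procrustes-aligned coordinates. A minimal sketch of that ordination step, assuming alignment and sliding have already been done upstream (the function name is illustrative, not an API of either package):

```python
import numpy as np

def morphospace(shapes, n_axes=2):
    """Project aligned shape coordinates onto principal component axes.

    shapes: array (n_specimens, n_landmarks * dims) of Procrustes-aligned
    coordinates. Returns PC scores (n_specimens, n_axes), i.e. the
    specimen positions in the morphospace.
    """
    centered = shapes - shapes.mean(axis=0)
    # SVD of the centered data matrix yields the principal axes in vt
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_axes].T
```

Two packages producing "similar morphospaces" then means the resulting score configurations agree up to rotation and reflection of the axes.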
Artificial Intelligence for the Financial Services Industry: What Challenges Organizations to Succeed?
As a research field, artificial intelligence (AI) has existed for several years. More recently, technological breakthroughs, coupled with the fast availability of data, have brought AI closer to commercial use. Internet giants such as Google, Amazon, Apple, or Facebook invest significantly in AI, thereby underlining its relevance for business models worldwide. For the highly data-driven finance industry, AI is of intense interest within pilot projects; still, few AI applications have been implemented so far. This study analyzes drivers and inhibitors of successful AI application in the finance industry based on panel data comprising 22 semi-structured interviews with experts in AI in finance. As a theoretical lens, we structured our results using the TOE framework. Guidelines for applying AI successfully reveal AI-specific role models and process competencies as crucial until trained algorithms reach a quality level at which AI applications can operate without human intervention or moral concerns.
An inkjet-printing-compatible platform for sensitive detection of dengue virus via gel-based LAMP
In the current COVID-19 pandemic, the need for a widely available diagnostic platform for the diagnosis of viral infections cannot be overstated. Such a platform, easily manufactured on a large scale, would be beneficial both for making clinical decisions and for giving epidemiologists new tools to broadly screen the population during epidemic threats. Fraunhofer USA CMI has focused on manufacturability of point-of-care (POC) diagnostics as a strategic point of development that can reduce costs and close the gap between research efforts and getting devices to the market. Here we report an inkjet-printed platform that has the potential to increase the sensitivity of the molecular assay, be compatible with mass-manufacturing methods, and allow for reagent storage on-chip. Isothermal methods such as loop-mediated isothermal amplification (LAMP) have been used for rapid disease diagnosis in low-resource settings due to their increased sensitivity and lack of thermal cycling. Fluorescence-based LAMP readout techniques such as QUASR also lend themselves to easy deployment in field settings via smartphones. However, rather than taking the standard approach, we developed a hydrogel-based LAMP (gel-RTLAMP) assay by incorporating an engineered hydrogel, highly methacrylated gelatin (GM10). In this study we show that this hydrogel is compatible with both the LAMP reaction and piezoelectric inkjet printing. Furthermore, via limit-of-detection studies, we also show that this gel-RTLAMP assay is 100-fold more sensitive than a standard RT-LAMP assay and yields clinically relevant detection of DENV RNA in analytical samples (10 copies per reaction).
Unfortunately, we could not characterize the final inkjet-printed platform due to time limitations and other issues; instead, we report a proof-of-principle study showing that this gel-RTLAMP assay can be adapted to a microarray format via inkjet printing and has the potential for rapid multiplexing and digitization of the assay.
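The "100-fold more sensitive" claim comes from a limit-of-detection (LoD) comparison across a dilution series: each input level is run in replicates, and a common convention takes the LoD as the lowest level detected in at least 95% of replicates. A hedged sketch of that bookkeeping (the replicate counts in the example are hypothetical, not the study's data):

```python
def limit_of_detection(hit_table, threshold=0.95):
    """Lowest input level whose replicate hit rate meets the threshold.

    hit_table: {copies_per_reaction: (n_positive, n_replicates)}.
    Returns None if no level is reliably detected.
    """
    detected = [copies for copies, (pos, n) in hit_table.items()
                if pos / n >= threshold]
    return min(detected) if detected else None
```

Comparing the LoD of the standard assay to that of the gel-based assay then gives the fold improvement in sensitivity.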
Image processing mini manual
The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC
Reports
Updates from conferences, the Mississippi Legislature, and institutions from around the state