
    Im-Promptu: In-Context Composition from Image Prompts

    Large language models are few-shot learners that can solve diverse tasks from a handful of demonstrations. This implicit understanding of tasks suggests that the attention mechanisms over word tokens may play a role in analogical reasoning. In this work, we investigate whether analogical reasoning can enable in-context composition over composable elements of visual stimuli. First, we introduce a suite of three benchmarks to test the generalization properties of a visual in-context learner. We formalize the notion of an analogy-based in-context learner and use it to design a meta-learning framework called Im-Promptu. Whereas the requisite token granularity for language is well established, the appropriate compositional granularity for enabling in-context generalization in visual stimuli is usually unspecified. To this end, we use Im-Promptu to train multiple agents with different levels of compositionality, including vector representations, patch representations, and object slots. Our experiments reveal tradeoffs between extrapolation abilities and the degree of compositionality, with non-compositional representations extending learned composition rules to unseen domains but performing poorly on combinatorial tasks. Patch-based representations require patches to contain entire objects for robust extrapolation. At the same time, object-centric tokenizers coupled with a cross-attention module generate consistent and high-fidelity solutions, with these inductive biases being particularly crucial for compositional generalization. Lastly, we demonstrate a use case of Im-Promptu as an intuitive programming interface for image generation.
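
    The analogy-based in-context task the abstract formalizes can be illustrated with a toy sketch (not the paper's model, and on symbolic attributes rather than images): given a demonstration pair A → B, the learner must infer the underlying transform and apply it to a query C.

```python
# Toy sketch of an analogy task A : B :: C : ?  over attribute dictionaries.
# All names and the attribute encoding are illustrative assumptions, not the
# paper's actual (neural, image-based) formulation.
def infer_transform(a, b):
    """Record which attributes changed between A and B, and their new values."""
    return {k: b[k] for k in b if a.get(k) != b[k]}

def apply_transform(c, transform):
    """Apply the inferred attribute changes to the query C."""
    out = dict(c)
    out.update(transform)
    return out

A = {"shape": "square", "color": "red"}
B = {"shape": "square", "color": "blue"}   # demonstration: recolor to blue
C = {"shape": "circle", "color": "red"}
D = apply_transform(C, infer_transform(A, B))
print(D)  # {'shape': 'circle', 'color': 'blue'}
```

    In the paper this inference is performed in a learned representation space (vectors, patches, or object slots) rather than over explicit attributes.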

    Impact of a Multimodal Antimicrobial Stewardship Program on Pseudomonas aeruginosa Susceptibility and Antimicrobial Use in the Intensive Care Unit Setting

    Objective. To study the impact of our multimodal antibiotic stewardship program on Pseudomonas aeruginosa susceptibility and antibiotic use in the intensive care unit (ICU) setting. Methods. Our stewardship program employed the key tenets of published antimicrobial stewardship guidelines. These included prospective audits with intervention and feedback, formulary restriction with preauthorization, educational conferences, guidelines for use, antimicrobial cycling, and de-escalation of therapy. ICU antibiotic use was measured and expressed as defined daily doses (DDD) per 1,000 patient-days. Results. Certain temporal relationships between antibiotic use and ICU resistance patterns appeared to be affected by our antibiotic stewardship program. In particular, ICU use of intravenous ciprofloxacin and ceftazidime declined from 148 and 62.5 DDD/1,000 patient-days to 40.0 and 24.5, respectively, during 2004 to 2007. An increase in both the use of and resistance to these agents was observed during 2008–2010. Despite variability in antibiotic usage resulting from the stewardship efforts, we were unable to show a statistically significant relationship with the P. aeruginosa resistance rate. Conclusion. Antibiotic resistance in the ICU setting is complex. Multimodal stewardship efforts attempt to prevent resistance, but such programs clearly have their limits.
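
    The DDD/1,000 patient-days metric used above normalizes raw drug consumption into a census-adjusted usage density: total grams dispensed is divided by the WHO-assigned defined daily dose for that drug, then scaled by patient-days. A minimal sketch, with illustrative numbers only (the DDD value and census figures below are assumptions, not data from the study):

```python
# Defined daily doses (DDD) per 1,000 patient-days.
# total_grams:    total amount of the drug dispensed over the period
# who_ddd_grams:  the WHO-assigned standard daily dose for that drug
# patient_days:   total ICU census over the same period
def ddd_per_1000_patient_days(total_grams, who_ddd_grams, patient_days):
    return (total_grams / who_ddd_grams) / patient_days * 1000

# e.g., 400 g of a drug whose WHO DDD is 4 g, over 2,500 ICU patient-days
print(ddd_per_1000_patient_days(400.0, 4.0, 2500))  # 40.0
```

    Because the numerator is standardized by the WHO DDD, the metric is comparable across drugs and across facilities of different sizes.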

    Comparative analysis of four methods to extract DNA from paraffin-embedded tissues: effect on downstream molecular applications

    Background. A large portion of tissues stored worldwide for diagnostic purposes is formalin-fixed and paraffin-embedded (FFPE). These FFPE-archived tissues are an extremely valuable source for retrospective (genetic) studies, including mutation screening in cancer-critical genes as well as pathogen detection. In this study we evaluated the impact of several widely used DNA extraction methods on the quality of molecular diagnostics on FFPE tissues. Findings. We compared four DNA extraction methods on four identically processed FFPE mammary, prostate, colon, and lung tissues with regard to PCR inhibition, real-time SNP detection, and amplifiable fragment size. The extraction methods tested, each with and without proteinase K pre-treatment, were: 1) heat treatment, 2) QIAamp DNA-blood-mini-kit, 3) EasyMAG NucliSens, and 4) Gentra Capture-Column-kit. Amplifiable DNA fragment size was assessed by a multiplexed 200-400-600 bp PCR and appeared highly influenced by the extraction method used. Proteinase K pre-treatment was a prerequisite for proper purification of DNA from FFPE. Extractions with QIAamp, EasyMAG, and heat treatment were found suitable for amplification of fragments up to 400 bp from all tissues; amplification of 600 bp fragments was only marginally successful (QIAamp performed best). QIAamp and EasyMAG extracts were found suitable for downstream real-time SNP detection, whereas Gentra extraction was unsuitable. Hands-on time was lowest for heat treatment, followed by EasyMAG. Conclusions. We conclude that the extraction method plays an important role with regard to performance in downstream molecular applications.

    Detection of Somatic Mutations by High-Resolution DNA Melting (HRM) Analysis in Multiple Cancers

    Identification of somatic mutations in cancer is a major goal for understanding and monitoring the events related to cancer initiation and progression. High resolution melting (HRM) curve analysis represents a fast, post-PCR high-throughput method for scanning somatic sequence alterations in target genes. The aim of this study was to assess the sensitivity and specificity of HRM analysis for tumor mutation screening in a range of tumor samples, which included 216 frozen pediatric small round blue-cell tumors as well as 180 paraffin-embedded tumors from breast, endometrial and ovarian cancers (60 of each). HRM analysis was performed in exons of the following candidate genes known to harbor established commonly observed mutations: PIK3CA, ERBB2, KRAS, TP53, EGFR, BRAF, GATA3, and FGFR3. Bi-directional sequencing analysis was used to determine the accuracy of the HRM analysis. For the 39 mutations observed in frozen samples, the sensitivity and specificity of HRM analysis were 97% and 87%, respectively. There were 67 mutations/variants in the paraffin-embedded samples, and the sensitivity and specificity for the HRM analysis were 88% and 80%, respectively. Paraffin-embedded samples require a higher quantity of purified DNA for high performance. In summary, HRM analysis is a promising moderate-throughput screening test for mutations among known candidate genomic regions. Although the overall accuracy appears to be better in frozen specimens, somatic alterations were detected in DNA extracted from paraffin-embedded samples.
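
    The sensitivity and specificity figures above are computed against bidirectional sequencing as the reference standard. A minimal sketch of the calculation; the counts below are hypothetical, except that flagging 38 of the 39 sequencing-confirmed mutations is one combination consistent with the reported 97% sensitivity for frozen samples:

```python
# Screening-test accuracy against a sequencing reference standard.
def sensitivity(true_pos, false_neg):
    """Fraction of reference-confirmed mutations flagged by the screen."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of reference-confirmed wild-type samples correctly cleared."""
    return true_neg / (true_neg + false_pos)

print(round(sensitivity(38, 1), 2))   # 0.97  (38 of 39 mutations flagged)
print(round(specificity(87, 13), 2))  # 0.87  (hypothetical wild-type counts)
```

    For a screening test like HRM, high sensitivity is the priority, since positives are confirmed downstream by sequencing while missed mutations are lost.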

    Design and implementation of a generalized laboratory data model

    Background. Investigators in the biological sciences continue to exploit laboratory automation methods and have dramatically increased the rates at which they can generate data. In many environments, the methods themselves also evolve in a rapid and fluid manner. These observations point to the importance of robust information management systems in the modern laboratory. Designing and implementing such systems is non-trivial, and in many cases a database project ultimately proves unserviceable. Results. We describe a general modeling framework for laboratory data and its implementation as an information management system. The model utilizes several abstraction techniques, focusing especially on the concepts of inheritance and meta-data. Traditional approaches commingle event-oriented data with regular entity data in ad hoc ways; instead, we define distinct regular-entity and event schemas but fully integrate them via a standardized interface. The design allows straightforward definition of a "processing pipeline" as a sequence of events, obviating the need for separate workflow management systems. A layer above the event-oriented schema integrates events into a workflow by defining "processing directives", which act as automated project managers of items in the system. Directives can be added or modified in an almost trivial fashion, i.e., without the need for schema modification or re-certification of applications. Association between regular entities and events is managed via simple many-to-many relationships. We describe the programming interface, as well as techniques for handling input/output, process control, and state transitions. Conclusion. The implementation described here has served as the Washington University Genome Sequencing Center's primary information system for several years. It handles all transactions underlying a throughput rate of about 9 million sequencing reactions of various kinds per month and has handily weathered a number of major pipeline reconfigurations. The basic data model can be readily adapted to other high-volume processing environments.
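
    The core structural idea, separate regular-entity and event schemas joined by a plain many-to-many association, can be sketched in a few lines of SQL. The table and column names below are illustrative assumptions, not the center's actual schema:

```python
import sqlite3

# Hypothetical sketch of the separation the abstract describes: regular
# entities (e.g., samples) in one table, processing events in another,
# associated through a simple many-to-many link table.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE entity (id INTEGER PRIMARY KEY, kind TEXT, name TEXT);
CREATE TABLE event  (id INTEGER PRIMARY KEY, type TEXT, status TEXT);
CREATE TABLE entity_event (entity_id INTEGER REFERENCES entity(id),
                           event_id  INTEGER REFERENCES event(id));
""")
db.execute("INSERT INTO entity VALUES (1, 'sample', 'S-001')")
db.execute("INSERT INTO event VALUES (1, 'sequencing_reaction', 'completed')")
db.execute("INSERT INTO entity_event VALUES (1, 1)")

# A pipeline is then just the ordered sequence of events linked to an entity.
rows = db.execute("""SELECT e.name, ev.type
                     FROM entity e
                     JOIN entity_event ee ON ee.entity_id = e.id
                     JOIN event ev ON ev.id = ee.event_id""").fetchall()
print(rows)  # [('S-001', 'sequencing_reaction')]
```

    Because new event types and directives are just rows, not new tables, workflows can change without schema migration, which is the property the abstract highlights.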

    Initial sequencing and analysis of the human genome

    The human genome holds an extraordinary trove of information about human development, physiology, medicine and evolution. Here we report the results of an international collaboration to produce and make freely available a draft sequence of the human genome. We also present an initial analysis of the data, describing some of the insights that can be gleaned from the sequence.

    Genome engineering for improved recombinant protein expression in Escherichia coli


    Guided self-organization and cortical plate formation in human brain organoids.

    Three-dimensional cell culture models have either relied on the self-organizing properties of mammalian cells or used bioengineered constructs to arrange cells in an organ-like configuration. While self-organizing organoids excel at recapitulating early developmental events, bioengineered constructs reproducibly generate desired tissue architectures. Here, we combine these two approaches to reproducibly generate human forebrain tissue while maintaining its self-organizing capacity. We use poly(lactide-co-glycolide) copolymer (PLGA) fiber microfilaments as a floating scaffold to generate elongated embryoid bodies. Microfilament-engineered cerebral organoids (enCORs) display enhanced neuroectoderm formation and improved cortical development. Furthermore, reconstitution of the basement membrane leads to characteristic cortical tissue architecture, including formation of a polarized cortical plate and radial units. Thus, enCORs model the distinctive radial organization of the cerebral cortex and allow for the study of neuronal migration. Our data demonstrate that combining 3D cell culture with bioengineering can increase reproducibility and improve tissue architecture.