
    Medical students as EMTs: skill building, confidence and professional formation

    Objective: The first course of the medical curriculum at the Hofstra North Shore-LIJ School of Medicine, From the Person to the Professional: Challenges, Privileges and Responsibilities, provides an innovative early clinical immersion. The Emergency Medical Technician (EMT) content of the course was developed from the New York State EMT curriculum. Students gain early, legitimate clinical experience and practice clinical skills as team members in the pre-hospital environment. We hypothesized that this novel curriculum would increase students’ confidence in their ability to perform patient care skills and enhance students’ comfort with team-building skills early in their training. Methods: Quantitative and qualitative data were collected from first-year medical students (n=97) through a survey developed to assess students’ confidence in patient care and team-building skills. The survey was completed prior to medical school, during the final week of the course, and at the end of the first year. A paired-samples t-test was conducted to compare self-ratings on 12 patient care and 12 team-building skills before and after the course, and a theme analysis was conducted to examine open-ended responses. Results: Following the course, student confidence in patient care skills showed a significant increase from baseline (p<0.05) for all identified skills. Student confidence in team-building skills showed a significant increase (p<0.05) in 4 of the 12 identified skills. By the end of the first year, 84% of students reported that the EMT curriculum had ‘some impact’ to ‘great impact’ on their patient care skills, while 72% reported the same for their team-building skills. Conclusions: Incorporating EMT training early in a medical school curriculum provides students with meaningful clinical experiences that increase their self-reported confidence in the performance of patient care skills early in their medical education.
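
    The pre/post comparison described in the Methods can be run as a standard paired-samples t-test. A minimal sketch in Python, using randomly generated placeholder ratings rather than the study's survey data:

        # Paired-samples t-test on pre- vs post-course confidence ratings.
        # The Likert ratings below are random placeholders, not study data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        pre = rng.integers(1, 6, size=97).astype(float)          # 1-5 ratings, n=97 students
        post = np.clip(pre + rng.integers(0, 3, size=97), 1, 5).astype(float)

        t_stat, p_value = stats.ttest_rel(post, pre)             # paired: same students twice
        print(f"t = {t_stat:.2f}, p = {p_value:.4g}")            # significant if p < 0.05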

    Probabilistic grammar induction from sentences and structured meanings

    The meanings of natural language sentences may be represented as compositional logical forms. Each word or lexicalised multiword element has an associated logical form representing its meaning. Full sentential logical forms are then composed from these word logical forms via a syntactic parse of the sentence. This thesis develops two computational systems that learn both the word meanings and the parsing model required to map sentences onto logical forms from an example corpus of (sentence, logical-form) pairs. One of these systems is designed to provide a general-purpose method of inducing semantic parsers for multiple languages and logical meaning representations. Semantic parsers map sentences onto logical representations of their meanings and may form an important part of any computational task that needs to interpret the meanings of sentences. The other system is designed to model the way in which a child learns the semantics and syntax of their first language. Here, logical forms are used to represent the potentially ambiguous context in which child-directed utterances are spoken, and a psycholinguistically plausible training algorithm learns a probabilistic grammar that describes the target language. This computational modelling task is important as it can provide evidence for or against competing theories of how children learn their first language. Both of the systems presented here are based upon two working hypotheses. The first is that the correct parse of any sentence in any language is contained in a set of possible parses defined in terms of the sentence itself, the sentence’s logical form and a small set of combinatory rule schemata. The second is that, given a corpus of (sentence, logical-form) pairs that each support a large number of possible parses according to the schemata mentioned above, it is possible to learn a probabilistic parsing model that accurately describes the target language. The algorithm for semantic parser induction learns Combinatory Categorial Grammar (CCG) lexicons and discriminative probabilistic parsing models from corpora of (sentence, logical-form) pairs. This system is shown to achieve performance at or near the state of the art across multiple languages, logical meaning representations and domains. As the approach is not tied to any single natural or logical language, this system represents an important step towards widely applicable black-box methods for semantic parser induction. This thesis also develops an efficient representation of the CCG lexicon that separately stores language-specific syntactic regularities and domain-specific semantic knowledge. This factorised lexical representation improves the performance of CCG-based semantic parsers in sparse domains and also provides a potential basis for lexical expansion and domain adaptation for semantic parsers. The algorithm for modelling child language acquisition learns a generative probabilistic model of CCG parses from sentences paired with a context set of potential logical forms containing one correct entry and a number of distractors. The online learning algorithm used is intended to be psycholinguistically plausible and to assume as little information specific to the task of language learning as possible. It is shown that this algorithm learns an accurate parsing model despite making very few initial assumptions. It is also shown that the manner in which both word meanings and syntactic rules are learnt is in accordance with observations of both of these learning tasks in children, supporting a theory of language acquisition that builds upon the two working hypotheses stated above.
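
    Both systems reduce to the same estimation problem: every (sentence, logical-form) pair licenses many candidate analyses, and the learner must shift probability mass onto the analyses that are consistent across the corpus. A minimal sketch of that idea in Python, stripped down to cross-situational word-meaning learning with EM over a toy corpus (the thesis systems induce full CCG lexicons and parsing models, which this does not attempt):

        # EM over word-meaning alignments: each utterance is paired with an
        # ambiguous set of candidate meaning symbols; iteration concentrates
        # probability on pairings that recur across utterances.
        from collections import defaultdict

        corpus = [(["the", "dog", "runs"],   {"dog", "run"}),
                  (["the", "cat", "runs"],   {"cat", "run"}),
                  (["the", "dog", "sleeps"], {"dog", "sleep"})]

        # Initialise p(meaning | word) uniformly over co-occurring pairs.
        prob = defaultdict(float)
        for words, meanings in corpus:
            for w in words:
                for m in meanings:
                    prob[(w, m)] = 1.0

        for _ in range(20):                       # EM iterations
            counts = defaultdict(float)
            for words, meanings in corpus:
                for w in words:
                    z = sum(prob[(w, m)] for m in meanings)
                    for m in meanings:            # E-step: expected alignments
                        counts[(w, m)] += prob[(w, m)] / z
            totals = defaultdict(float)
            for (w, m), c in counts.items():      # M-step: renormalise per word
                totals[w] += c
            for (w, m), c in counts.items():
                prob[(w, m)] = c / totals[w]

        # The meaning "dog" ends up dominating for the word "dog".
        print(sorted(((p, m) for (w, m), p in prob.items() if w == "dog"), reverse=True))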

    Film History as Media Archaeology – Fryderyk Kwiatkowski in Conversation with Thomas Elsaesser

    † Thomas Elsaesser, Prof.; was a professor in the Department of Media and Culture at the University of Amsterdam; between 2006 and 2012 he was a visiting professor at Yale University, and from 2013 to 2019 he taught part-time at Columbia University; his books include Film Theory: An Introduction Through the Senses (2009, Polish edition 2015) and Film History as Media Archaeology (2016). Among his last books were European Cinema and Continental Philosophy: Film as Thought Experiment (2018) and Kino – maszyna myślenia. Refleksje nad kinem epoki cyfrowej (2018). His research interests included the history of German film, American and European cinema, media archaeology, and the relationship between film and philosophy. Fryderyk Kwiatkowski, MA; doctoral candidate, University of Groningen; has published in, among others, "Gnosis: Journal of Gnostic Studies", "CLCWeb: Comparative Literature and Culture", "Canadian-American Slavic Studies", and "Journal of Religion and Film"; his research focuses on the reception of Gnosticism in Western culture, above all in Western esotericism, twentieth-century political philosophy and critical thought, new religious movements, and fictional narratives.

    Recovery from disturbance requires resynchronization of ecosystem nutrient cycles

    Nitrogen (N) and phosphorus (P) are tightly cycled in most terrestrial ecosystems, with plant uptake more than 10 times higher than the rate of supply from deposition and weathering. This near-total dependence on recycled nutrients and the stoichiometric constraints on resource use by plants and microbes mean that the two cycles have to be synchronized such that the N:P ratios of plant uptake, litterfall, and net mineralization are nearly the same. Disturbance can disrupt this synchronization if there is a disproportionate loss of one nutrient relative to the other. We model the resynchronization of N and P cycles following harvest of a northern hardwood forest. In our simulations, nutrient loss in the harvest is small relative to postharvest losses. The low N:P ratio of harvest residue results in a preferential release of P and retention of N. The P release is in excess of plant requirements and P is lost from the active ecosystem cycle through secondary mineral formation and leaching early in succession. Because external P inputs are small, the resynchronization of the N and P cycles later in succession is achieved by a commensurate loss of N. Through succession, the ecosystem undergoes alternating periods of N limitation, then P limitation, and eventually co-limitation as the two cycles resynchronize. However, our simulations indicate that the overall rate and extent of recovery is limited by P unless a mechanism exists either to prevent the P loss early in succession (e.g., P sequestration not stoichiometrically constrained by N) or to increase the P supply to the ecosystem later in succession (e.g., biologically enhanced weathering). Our model provides a heuristic perspective from which to assess the resynchronization among tightly cycled nutrients and the effect of that resynchronization on recovery of ecosystems from disturbance.
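
    A deliberately stripped-down caricature of the resynchronization argument can be written in a few lines: plant uptake demands N and P at a fixed ratio, and whichever nutrient is in stoichiometric excess is gradually lost. The pool sizes, uptake ratio, and loss rate below are illustrative placeholders, not values from the authors' model:

        # Caricature of N-P resynchronization: uptake demands a fixed N:P ratio,
        # and the nutrient in stoichiometric excess slowly leaches away.
        # All numbers are illustrative placeholders, not model parameters.
        N, P = 100.0, 20.0     # available pools after disturbance (arbitrary units)
        RATIO = 15.0           # fixed N:P ratio of plant uptake
        LOSS = 0.05            # fraction of the excess nutrient lost per year

        for year in range(51):
            if year % 10 == 0:
                status = ("P-limited" if N > RATIO * P
                          else "N-limited" if P > N / RATIO else "co-limited")
                print(f"year {year:2d}: N={N:6.1f} P={P:5.2f} ({status})")
            N -= LOSS * max(N - RATIO * P, 0.0)   # leach N in excess of demand
            P -= LOSS * max(P - N / RATIO, 0.0)   # leach P in excess of demand

    With these starting pools, P is initially in excess and is lost until its pool approaches N/RATIO, after which the two cycles are realigned; the full model adds the uptake, mineralization, and weathering fluxes this sketch omits.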

    Optimizing Illumina next-generation sequencing library preparation for extremely AT-biased genomes.

    Background: Massively parallel sequencing technology is revolutionizing approaches to genomic and genetic research. Since its advent, the scale and efficiency of Next-Generation Sequencing (NGS) have rapidly improved. In spite of this success, sequencing genomes or genomic regions with extremely biased base composition is still a great challenge to the currently available NGS platforms. The genomes of some important pathogenic organisms, like Plasmodium falciparum (high AT content) and Mycobacterium tuberculosis (high GC content), display extremes of base composition. The standard library preparation procedures that employ PCR amplification have been shown to cause uneven read coverage, particularly across AT- and GC-rich regions, leading to problems in genome assembly and variation analyses. Alternative library-preparation approaches that omit PCR amplification require large quantities of starting material and hence are not suitable for small amounts of DNA/RNA such as those from clinical isolates. We have developed and optimized library-preparation procedures suitable for low-quantity starting material and tolerant of extremely high AT content sequences. Results: We have used our optimized conditions in parallel with standard methods to prepare Illumina sequencing libraries from a non-clinical and a clinical isolate (containing ~53% host contamination). By analyzing and comparing the quality of the sequence data generated, we show that our optimized conditions, which involve a PCR additive (TMAC), produce amplified libraries with improved coverage of extremely AT-rich regions and reduced bias toward GC-neutral templates. Conclusion: We have developed a robust and optimized Next-Generation Sequencing library amplification method suitable for extremely AT-rich genomes. The new amplification conditions significantly reduce bias and retain sequence complexity at either extreme of base composition. This development will greatly benefit the sequencing of clinical samples, which often require amplification due to the low mass of DNA starting material.
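
    Coverage bias of this kind is commonly assessed by splitting the genome into fixed-size windows, computing each window's GC fraction, and comparing mean read depth across windows. A minimal single-contig sketch in Python; the file names are placeholders, and the depth table is assumed to be in the three-column chrom/pos/depth format produced by samtools depth:

        # Mean read depth by window GC content (single-contig reference assumed).
        # "ref.fa" and "depth.txt" are placeholder file names.
        from collections import defaultdict

        WINDOW = 1000
        seq = "".join(line.strip() for line in open("ref.fa")
                      if not line.startswith(">")).upper()

        depth_sum = defaultdict(int)                  # window index -> summed depth
        for line in open("depth.txt"):                # columns: chrom, pos (1-based), depth
            _chrom, pos, d = line.split()
            depth_sum[(int(pos) - 1) // WINDOW] += int(d)

        for w in sorted(depth_sum):
            win = seq[w * WINDOW:(w + 1) * WINDOW]
            gc = (win.count("G") + win.count("C")) / max(len(win), 1)
            print(f"window {w:4d}: GC={gc:.2f} mean_depth={depth_sum[w] / max(len(win), 1):.1f}")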

    Efficient depletion of host DNA contamination in malaria clinical sequencing.

    The cost of whole-genome sequencing (WGS) is decreasing rapidly as next-generation sequencing technology continues to advance, and the prospect of making WGS available for public health applications is becoming a reality. So far, a number of studies have demonstrated the use of WGS as an epidemiological tool for typing and controlling outbreaks of microbial pathogens. Success of these applications is hugely dependent on efficient generation of clean genetic material that is free from host DNA contamination for rapid preparation of sequencing libraries. The presence of large amounts of host DNA severely affects the efficiency of characterizing pathogens using WGS and is therefore a serious impediment to clinical and epidemiological sequencing for health care and public health applications. We have developed a simple enzymatic treatment method that takes advantage of the methylation of human DNA to selectively deplete host contamination from clinical samples prior to sequencing. Using malaria clinical samples with over 80% human host DNA contamination, we show that the enzymatic treatment enriches Plasmodium falciparum DNA up to ~9-fold and generates high-quality, nonbiased sequence reads covering >98% of 86,158 catalogued typeable single-nucleotide polymorphism loci.
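
    The enrichment factor quoted above is the kind of number obtained by mapping reads against combined host and parasite references and comparing the parasite-mapped fraction before and after treatment. A minimal sketch with illustrative read counts (not the study's figures):

        # Fold enrichment = parasite read fraction after / before treatment.
        # The read counts below are illustrative, not the study's data.
        def parasite_fraction(parasite_reads: int, host_reads: int) -> float:
            return parasite_reads / (parasite_reads + host_reads)

        before = parasite_fraction(1_000_000, 9_000_000)   # 10% parasite, 90% host
        after = parasite_fraction(9_000_000, 1_000_000)    # 90% parasite after depletion
        print(f"before: {before:.0%}, after: {after:.0%}, "
              f"enrichment: {after / before:.1f}x")        # 9.0x with these counts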

    Extreme mutation bias and high AT content in Plasmodium falciparum.

    For reasons that remain unknown, the Plasmodium falciparum genome has an exceptionally high AT content compared to other Plasmodium species and eukaryotes in general - nearly 80% in coding regions and approaching 90% in non-coding regions. Here, we examine how this phenomenon relates to genome-wide patterns of de novo mutation. Mutation accumulation experiments were performed by sequential cloning of six P. falciparum isolates growing in human erythrocytes in vitro for 4 years, with 279 clones sampled for whole genome sequencing at different time points. Genome sequence analysis of these samples revealed a significant excess of G:C to A:T transitions compared to other types of nucleotide substitution, which would naturally cause AT content to equilibrate close to the level seen across the P. falciparum reference genome (80.6% AT). These data also uncover an extremely high rate of small indel mutation relative to other species, primarily associated with repetitive AT-rich sequences, in addition to larger-scale structural rearrangements focused in antigen-coding var genes. In conclusion, high AT content in P. falciparum is driven by a systematic mutational bias and ultimately leads to an unusual level of microstructural plasticity, raising the question of whether this contributes to adaptive evolution.
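
    The equilibration claim follows from a simple flux balance: AT content stabilizes where the flux of G:C to A:T changes equals the reverse flux, giving an equilibrium AT fraction of r(GC→AT) / (r(GC→AT) + r(AT→GC)). A minimal sketch with illustrative relative rates, chosen here to land near the observed genome composition:

        # Equilibrium AT content from mutation-rate balance:
        #   f_AT * r_at_to_gc == (1 - f_AT) * r_gc_to_at
        #   =>  f_AT = r_gc_to_at / (r_gc_to_at + r_at_to_gc)
        # The relative rates are illustrative, not measured values.
        r_gc_to_at = 4.0   # per-base rate of G:C -> A:T changes
        r_at_to_gc = 1.0   # per-base rate of A:T -> G:C changes

        f_at = r_gc_to_at / (r_gc_to_at + r_at_to_gc)
        print(f"equilibrium AT content: {f_at:.1%}")   # 80.0% with these rates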

    Evolutionary analysis of the most polymorphic gene family in falciparum malaria

    The highly polymorphic var gene family of the human malaria parasite Plasmodium falciparum encodes proteins that are crucial determinants of both pathogenesis and immune evasion. Here we have assembled nearly complete var gene repertoires from 2398 field isolates and analysed a normalised set of 714 from across 12 countries. This represents the first large-scale attempt to catalogue the worldwide distribution of var gene sequences. We confirm the extreme polymorphism of this gene family but also demonstrate an unexpected level of sequence sharing both within and between continents. We show that this is likely due both to the remnants of selective sweeps and to a worrying degree of recent gene flow across continents, with implications for the spread of drug resistance. We also address the evolution of the var repertoire with respect to the ancestral genes within the Laverania and show that diversity generated by recombination is concentrated in a number of hotspots. An analysis of the subdomain structure indicates that some existing definitions may need to be revised. From the analysis of these data, we can now understand the way in which the family has evolved and how diversity is continuously being generated. Finally, we demonstrate that, because the genes are distributed across the genome, sequence sharing between genotypes acts as a useful population genetic marker.
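
    Sequence sharing between genotypes, used above as a population genetic marker, can be summarised as set overlap between isolates' var repertoires. A minimal sketch computing pairwise Jaccard similarity over hypothetical repertoires of sequence-type identifiers:

        # Pairwise Jaccard similarity between var repertoires, treated as sets
        # of sequence-type identifiers. The repertoires are hypothetical.
        from itertools import combinations

        repertoires = {
            "isolate_A": {"type1", "type2", "type3", "type4"},
            "isolate_B": {"type3", "type4", "type5"},
            "isolate_C": {"type6", "type7"},
        }

        for a, b in combinations(repertoires, 2):
            shared = repertoires[a] & repertoires[b]
            union = repertoires[a] | repertoires[b]
            print(f"{a} vs {b}: Jaccard = {len(shared) / len(union):.2f}")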

    An Efficient and Generic Construction for Signal's Handshake (X3DH): Post-Quantum, State Leakage Secure, and Deniable

    The Signal protocol is a secure instant messaging protocol that underlies the security of numerous applications such as WhatsApp, Skype, and Facebook Messenger, among many others. The Signal protocol consists of two sub-protocols known as the X3DH protocol and the double ratchet protocol, where the latter has recently gained much attention. For instance, Alwen, Coretti, and Dodis (Eurocrypt'19) provided a concrete security model along with a generic construction based on simple building blocks that are instantiable from versatile assumptions, including post-quantum ones. In contrast, as far as we are aware, works focusing on the X3DH protocol seem limited. In this work, we cast the X3DH protocol as a specific type of authenticated key exchange (AKE) protocol, which we call a Signal-conforming AKE protocol, and formally define its security model based on the vast prior work on AKE protocols. We then provide the first efficient generic construction of a Signal-conforming AKE protocol based on standard cryptographic primitives such as key encapsulation mechanisms (KEMs) and signature schemes. Specifically, this results in the first post-quantum secure replacement of the X3DH protocol based on well-established assumptions. Similar to the X3DH protocol, our Signal-conforming AKE protocol offers a strong (or stronger) flavor of security, where the exchanged key remains secure even when all the non-trivial combinations of the long-term secrets and session-specific secrets are compromised. Moreover, our protocol has a weak flavor of deniability, and we further show how to progressively strengthen it using ring signatures and/or non-interactive zero-knowledge proof systems. Finally, we provide a full-fledged, generic C implementation of our (weakly deniable) protocol. We instantiate it with several Round 3 candidates (finalists and alternates) of the NIST post-quantum standardization process and compare the resulting bandwidth and computation performances. Our implementation is publicly available.
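
    The core idea of building an AKE from KEMs can be illustrated schematically: the responder publishes a long-term KEM key (playing the role of an X3DH prekey), the initiator encapsulates to it and sends a fresh ephemeral KEM key, the responder encapsulates back, and both sides hash all shared secrets with the transcript. The Python sketch below uses a toy hash-based stand-in for a KEM (pk == sk, deliberately insecure) and omits the signatures and long-term initiator secrets that the actual construction uses for authentication and stronger leakage resilience:

        # Schematic KEM-based handshake; illustrates message flow only.
        # The toy "KEM" (pk == sk, hash-based) is NOT secure.
        import hashlib, os

        def kem_keygen():
            k = os.urandom(32)
            return k, k                                   # toy placeholder: pk == sk

        def kem_encaps(pk):
            ct = os.urandom(32)
            return ct, hashlib.sha256(pk + ct).digest()   # (ciphertext, shared secret)

        def kem_decaps(sk, ct):
            return hashlib.sha256(sk + ct).digest()

        # Bob (responder) publishes a long-term KEM key, like an X3DH prekey.
        lpk_B, lsk_B = kem_keygen()

        # Alice: encapsulate to Bob's long-term key, send a fresh ephemeral key.
        epk_A, esk_A = kem_keygen()
        ct1, ss1_A = kem_encaps(lpk_B)

        # Bob: recover ss1, encapsulate to Alice's ephemeral key.
        ss1_B = kem_decaps(lsk_B, ct1)
        ct2, ss2_B = kem_encaps(epk_A)

        # Both sides derive the session key from every shared secret plus the
        # transcript, so it survives compromise of either secret alone.
        ss2_A = kem_decaps(esk_A, ct2)
        transcript = lpk_B + epk_A + ct1 + ct2
        key_A = hashlib.sha256(ss1_A + ss2_A + transcript).digest()
        key_B = hashlib.sha256(ss1_B + ss2_B + transcript).digest()
        assert key_A == key_B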