
    Rapid Pathogen Detection for Infected Ascites in Cirrhosis using Metagenome Next-Generation Sequencing: A Case Series

    Empirical antibiotic therapy in patients with spontaneous bacterial peritonitis (SBP) is common because pathogens are identified in only 5%-20% of patients using conventional culture-based techniques. Metagenome next-generation sequencing (mNGS) is a promising approach for diagnosing infectious disease, but its clinical application to infected ascites in cirrhotic patients has rarely been reported. Here, we describe three cases that preliminarily explore the potential role of mNGS in the microbiological diagnosis of ascites infection. The clinical performance of ascites mNGS in cirrhotic patients remains to be evaluated further.

    Crossing the Resolution Limit in Near-Infrared Imaging of Silicon Chips: Targeting 10-nm Node Technology

    The best reported resolution in optical failure analysis of silicon chips is 120-nm half pitch, demonstrated by Semicaps Private Limited, whereas the current and future industry requirement for 10-nm node technology is 100-nm half pitch. We show the first experimental evidence of resolving features with 100-nm half pitch buried in silicon (λ/10.6), thus fulfilling the industry requirement. These results are obtained with near-infrared reflection-mode imaging through a solid immersion lens. The key novel feature of our approach is the choice of an appropriately sized collection pinhole. Although it is usually understood that resolution is improved by using the smallest pinhole consistent with an adequate signal level, we find that in practice for silicon chips there is an optimum pinhole size, determined by the generation of induced currents in the sample. In failure analysis of silicon chips, nondestructive imaging is important to avoid disturbing the functionality of integrated circuits. High-resolution techniques such as SEM or TEM require the transistors to be exposed destructively. Optical microscopy may be used, but silicon is opaque in the visible spectrum, mandating the use of near-infrared light and thus limiting the resolution of conventional optical microscopy. We expect our result to change the way semiconductor failure analysis is performed.

    Data Model Development for Fire Related Extreme Events - An Activity Theory and Semiotics Approach

    Post-event analyses of major extreme events reveal that information sharing is critical for an effective emergency response. The lack of consistent data standards in current emergency management practice, however, hinders the efficient flow of critical information among incident responders. In this paper, we adopt a theory-driven approach to develop an XML-based data model that prescribes a comprehensive set of data standards for fire-related extreme events, to better address the challenges of information interoperability. The data model development is guided by third-generation Activity Theory and semiotic theories for requirements analysis. The model is validated using an RFC-like process typical of standards development. This paper applies the standards to the real case of a fire incident scenario. Further, the model complies with leading national initiatives in emergency standards (the National Information Exchange Model).

    An investigation of using various diesel-type fuels in homogeneous charge compression ignition engines and their effects on operational and controlling issues

    Homogeneous charge compression ignition (HCCI) engines appear to be a future alternative to diesel and spark-ignited engines. The HCCI engine has the potential to deliver high efficiency and very low NOx and particulate matter emissions. There are, however, problems with controlling ignition and heat release over the entire load and speed range, which limit the practical application of this technology. The aim of this paper is to analyse the use of different types of diesel fuel in an HCCI engine and hence to identify the most suitable with respect to operational and control issues. A single-zone combustion model with convective heat transfer loss is used to simulate the HCCI engine environment. n-Heptane, dimethyl ether, and bio-diesel (methyl butanoate and methyl formate) fuels are investigated; methyl butanoate and methyl formate represent surrogates of heavy and light bio-diesel fuel respectively. The effects of engine parameters such as equivalence ratio and engine speed on ignition timing are investigated, and internal exhaust gas recirculation is examined as a potential strategy for controlling ignition timing. The results indicate that the use of bio-diesel fuels gives lower sensitivity of ignition timing to changes in operational parameters and better control of the ignition process compared with n-heptane and dimethyl ether.
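    As a toy illustration of the kind of single-zone energy balance such models build on, the sketch below integrates the in-cylinder temperature over the compression stroke from an adiabatic heating term plus a lumped convective wall-loss term. All parameter values are illustrative assumptions, not the paper's settings, and the chemistry (ignition and heat release) is omitted.

```python
import numpy as np

# Illustrative single-zone compression sketch (assumed parameters, no chemistry).
cr = 16.0        # compression ratio (assumed)
gamma = 1.35     # ratio of specific heats of the charge (assumed)
h_loss = 0.02    # lumped convective wall-loss coefficient, 1/rad (assumed)
T_wall = 400.0   # cylinder wall temperature, K (assumed)

theta = np.linspace(0.0, np.pi, 2000)                # crank angle, BDC to TDC
V = 1.0 + 0.5 * (cr - 1.0) * (1.0 + np.cos(theta))   # normalized cylinder volume

T = 350.0                                            # charge temperature at BDC, K
for i in range(1, len(theta)):
    dlnV = np.log(V[i] / V[i - 1])
    T -= (gamma - 1.0) * T * dlnV                    # adiabatic compression heating
    T -= h_loss * (T - T_wall) * (theta[i] - theta[i - 1])  # convective wall loss

print(f"temperature near TDC: {T:.0f} K")
```

    The wall-loss term keeps the end-of-compression temperature below the adiabatic limit T_BDC * cr^(gamma-1), which is the lever that shifts ignition timing in a real single-zone model.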

    Interpretation of the optical transfer function: Significance for image scanning microscopy

    The optical transfer function (OTF) is widely used to compare the performance of different optical systems. Conventionally, the OTF is normalized to unity at zero spatial frequency, but in some cases it is better to consider the unnormalized OTF, which gives the absolute value of the image signal. Examples are confocal microscopy and image scanning microscopy, where the signal level increases with pinhole or array size. Comparing the respective unnormalized OTFs gives useful insight into their relative performance. The significance of other properties of the general OTF is also discussed.
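    As a toy numerical illustration of this distinction (the 1-D pupil model below is my simplification, not from the paper), an incoherent OTF can be formed as the autocorrelation of a pupil function; the unnormalized zero-frequency value tracks the absolute collected signal, which conventional normalization discards.

```python
import numpy as np

def otf_1d(pupil):
    """Unnormalized 1-D incoherent OTF: autocorrelation of the pupil function."""
    return np.correlate(pupil, pupil, mode="full")

small = np.ones(8)    # narrow collection aperture (toy model)
large = np.ones(16)   # wider aperture collects more light

otf_small, otf_large = otf_1d(small), otf_1d(large)

# Zero spatial frequency sits at the centre (zero lag) of the autocorrelation.
dc_small = otf_small[len(small) - 1]   # absolute signal level, smaller aperture
dc_large = otf_large[len(large) - 1]   # absolute signal level, larger aperture

# Conventional normalization hides that difference: both curves now peak at 1.
norm_small = otf_small / dc_small
norm_large = otf_large / dc_large
```

    The two normalized curves are directly comparable in shape, but only the unnormalized values reveal that the larger aperture delivers twice the signal.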

    Low-bit Shift Network for End-to-End Spoken Language Understanding

    Deep neural networks (DNNs) have achieved impressive success in multiple domains. Over the years, the accuracy of these models has increased with the proliferation of deeper and more complex architectures, so state-of-the-art solutions are often computationally expensive, which makes them unfit for deployment on edge computing platforms. To mitigate the high computation, memory, and power requirements of inferring convolutional neural networks (CNNs), we propose the use of power-of-two quantization, which quantizes continuous parameters into low-bit power-of-two values. This reduces computational complexity by replacing expensive multiplication operations with shifts and by using low-bit weights. ResNet is adopted as the building block of our solution, and the proposed model is evaluated on a spoken language understanding (SLU) task. Experimental results show improved performance for shift neural network architectures, with our low-bit quantization achieving 98.76% on the test set, comparable to its full-precision counterpart and state-of-the-art solutions.

    Comment: Accepted at INTERSPEECH 202
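    A minimal sketch of a power-of-two quantizer of the kind described (function and parameter names are mine, not the paper's implementation): each weight is rounded to the nearest signed power of two, so multiplication by a quantized weight can be realized as a bit shift.

```python
import numpy as np

def quantize_pow2(w, bits=4):
    """Round each weight to the nearest signed power of two (a sketch).

    The exponent is clipped to the range representable with the given bit
    width, keeping quantized magnitudes in (0, 1]; exact zeros stay zero.
    """
    w = np.asarray(w, dtype=float)
    sign = np.sign(w)
    mag = np.where(w == 0, 1.0, np.abs(w))          # placeholder for zeros
    exp = np.clip(np.round(np.log2(mag)), -(2 ** (bits - 1)) + 1, 0)
    return np.where(w == 0, 0.0, sign * np.exp2(exp))
```

    With integer activations, multiplying by a quantized weight 2**-k reduces to an arithmetic right shift by k, which is where the computational savings of a shift network come from.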

    ODACH: A One-shot Distributed Algorithm for Cox model with heterogeneous multi-center data

    We developed a One-shot Distributed Algorithm for the Cox proportional-hazards model to analyze Heterogeneous multi-center time-to-event data (ODACH), circumventing the need to share patient-level information across sites. The algorithm uses a surrogate likelihood function to approximate the Cox log-partial likelihood stratified by site, combining patient-level data from a lead site with aggregated information from the other sites, and allowing the baseline hazard functions and the distribution of covariates to vary across sites. Simulation studies and an application to a real-world opioid use disorder study showed that ODACH provides estimates close to the pooled estimator, which analyzes patient-level data directly from all sites via a stratified Cox model. Compared to the meta-analysis estimator, the inverse variance-weighted average of the site-specific estimates, the ODACH estimator is less susceptible to bias, especially when the event is rare. ODACH is thus a valuable privacy-preserving and communication-efficient method for analyzing multi-center time-to-event data.
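    For reference, the meta-analysis comparator mentioned above is just the inverse variance-weighted average of the site-specific estimates; a minimal sketch follows (names are mine, not from any ODACH software).

```python
import numpy as np

def inverse_variance_meta(betas, variances):
    """Fixed-effect meta-analysis of per-site coefficient estimates.

    Weights each site's estimate by the inverse of its variance and
    returns the pooled estimate together with its variance.
    """
    betas = np.asarray(betas, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # weight_k = 1 / Var(beta_k)
    beta_meta = np.sum(w * betas) / np.sum(w)
    return beta_meta, 1.0 / np.sum(w)
```

    When events are rare, the site-level estimates feeding this average are themselves unstable or biased, which is precisely the regime where the abstract reports ODACH performing better.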

    Profiling the educational value of computer games

    There are currently a number of calls for educators to include computer games in formal teaching and learning contexts. Their educational value rests on claims that games promote the development of complex learning. Very little research, however, has explored which features must be present in a computer game to make it valuable or conducive to learning. We present a list of features required for an educational game to be of value, informed by two studies that integrated theories of learning environments and learning styles. A user survey showed that some requirements were typical of games in a particular genre, while other features were present across all genres. The paper concludes with a proposed framework of games and features, within and across genres, to assist in the design and selection of games for a given educational scenario.

    On the use of an explicit chemical mechanism to dissect peroxy acetyl nitrate formation.

    Peroxy acetyl nitrate (PAN) is a key component of photochemical smog and plays an important role in atmospheric chemistry. Although PAN is known to be produced via reactions of nitrogen oxides (NOx) with certain volatile organic compounds (VOCs), it is difficult to quantify the contributions of individual precursor species. Here we use an explicit photochemical model, the Master Chemical Mechanism (MCM), to dissect PAN formation and identify its principal precursors, by analyzing measurements made in Beijing in summer 2008. PAN production was sensitive to both NOx and VOCs. Isoprene was the predominant VOC precursor at the suburban site with biogenic influence, whilst anthropogenic hydrocarbons dominated downtown. PAN production was attributable to a relatively small class of compounds including NOx, xylenes, trimethylbenzenes, trans/cis-2-butenes, toluene, and propene. The MCM can advance understanding of PAN photochemistry to the species level and provide more relevant recommendations for mitigating photochemical pollution in large cities.