
    A Review and Evaluation of Techniques for Improved Feature Detection in Mass Spectrometry Data

    Mass spectrometry (MS) is used in the analysis of chemical samples to identify the molecules present and their quantities. This analytical technique has applications in many fields, from pharmacology to space exploration. Its impact on medicine is particularly significant, since MS aids in the identification of molecules associated with disease; for instance, in proteomics, MS allows researchers to identify proteins associated with autoimmune disorders, cancers, and other conditions. Since the applications are so wide-ranging and the tool is ubiquitous across so many fields, it is critical that the analytical methods used to collect data are sound. Data analysis in MS is challenging: experiments produce massive amounts of raw data that must be processed algorithmically to generate interpretable results, in a process known as feature detection, which distinguishes signals associated with the chemical sample being analyzed from signals associated with background noise. These experimentally meaningful signals, also known as features or extracted ion chromatograms (XIC), are the fundamental signal unit in mass spectrometry. Many algorithms exist for analyzing raw mass spectrometry data and distinguishing real isotopic signals from noise. While one or more of the available algorithms are typically chained together for end-to-end mass spectrometry analysis, evaluating each algorithm in isolation provides a specific measurement of its strengths and weaknesses without the confounding effects that can occur when multiple algorithmic tasks are chained together. Though qualitative opinions on extraction algorithm performance abound, quantitative performance has never been publicly ascertained, partly due to the lack of an available quantitative ground-truth MS1 data set. Because XIC must be distinguished from noise, quality algorithms for this purpose are essential. Background noise is introduced through the mobile phase, the chemical matrix in which the sample of interest is introduced to the MS instrument; as a result, MS data are full of signals representing low-abundance molecules (i.e., low-intensity signals). Noise generally presents in one of two ways: very low-intensity signals that comprise the majority of the data from an MS experiment, and noise features that are moderately low-intensity and can resemble signals from low-abundance molecules deriving from the actual sample of interest. Like XIC algorithms, noise reduction algorithms have, to our knowledge, yet to be quantitatively evaluated; their performance is generally assessed through consensus with other noise reduction algorithms. Using a recently published, manually extracted XIC dataset as ground truth, we evaluate the quality of popular XIC algorithms, including MaxQuant, MZMine2, and several methods from XCMS. Each XIC algorithm was applied to the manually extracted data using a grid search over possible parameters. Performance varied greatly between parameter settings, though nearly all algorithms, when their parameters were optimized with respect to the number of true positives, recovered over 10,000 XIC.
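    As a concrete picture of the grid-search evaluation described above, the following Python sketch scores extractor configurations by the number of ground-truth features they recover. The extract_xics callable, the matching tolerances, and the parameter grid are hypothetical stand-ins; the study itself evaluates real tools such as MaxQuant, MZMine2, and XCMS.

```python
# Hypothetical sketch: rank XIC-extraction parameter settings by the
# number of manually extracted ground-truth features they recover.
from itertools import product

def matches(xic, truth, mz_tol=0.01, rt_tol=30.0):
    """True if an extracted XIC lands within m/z and retention-time
    tolerance of a ground-truth feature (tolerances are illustrative)."""
    return (abs(xic["mz"] - truth["mz"]) <= mz_tol
            and abs(xic["rt"] - truth["rt"]) <= rt_tol)

def true_positives(extracted, ground_truth):
    """Count ground-truth features recovered by at least one XIC."""
    return sum(any(matches(x, t) for x in extracted) for t in ground_truth)

def grid_search(raw_data, ground_truth, extract_xics):
    """Apply the extractor across a parameter grid and sort settings
    by true positives; extract_xics is a stand-in for a real tool."""
    grid = {
        "ppm_tol": [5, 10, 20],            # m/z tolerance in ppm
        "min_intensity": [1e3, 1e4, 1e5],  # intensity threshold
        "min_scans": [3, 5, 7],            # minimum chromatographic length
    }
    results = []
    for combo in product(*grid.values()):
        params = dict(zip(grid.keys(), combo))
        extracted = extract_xics(raw_data, **params)
        results.append((true_positives(extracted, ground_truth), params))
    return sorted(results, key=lambda r: r[0], reverse=True)
```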
We also examine two popular algorithms for reducing background noise, the COmponent Detection Algorithm (CODA) and adaptive iteratively reweighted Penalized Least Squares (airPLS), and compare their performance to that of feature detection alone, using the algorithms that achieved the best performance in a previous evaluation. Due to weaknesses inherent in their implementations, both noise reduction algorithms eliminate data that feature detection identifies as significant.
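    For reference, airPLS itself is straightforward to express. The sketch below is written from the published description of the algorithm (penalized least squares with exponential reweighting of points below the current baseline estimate), not from the implementation evaluated here; the smoothness parameter and iteration cap are illustrative.

```python
# Sketch of airPLS baseline estimation; lam and max_iter are illustrative.
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def airpls(y, lam=1e5, max_iter=15):
    """Estimate a smooth baseline z beneath signal y; y - z is the
    baseline-corrected signal."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Second-difference penalty: lam * ||D z||^2 enforces smoothness.
    D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
    H = lam * (D.T @ D)
    w = np.ones(n)
    z = y.copy()
    for t in range(1, max_iter + 1):
        W = sparse.diags(w)
        z = spsolve(sparse.csc_matrix(W + H), w * y)
        d = y - z
        neg = d[d < 0]
        # Stop once the residual below the baseline is negligible.
        if neg.size == 0 or abs(neg.sum()) < 1e-3 * np.abs(y).sum():
            break
        # Points above the baseline are treated as peaks (weight 0);
        # points below are reweighted increasingly strongly each pass.
        w = np.zeros(n)
        w[d < 0] = np.exp(t * np.abs(neg) / abs(neg.sum()))
    return z
```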

    Minimal Specialization: Coevolution of Network Structure and Dynamics

    The changing topology of a network is driven by the need to maintain or optimize network function. As this function is often related to moving quantities, such as traffic or information, efficiently through the network, the structure of the network and the dynamics on the network directly depend on each other. To model this interplay, we use the dynamics on the network, i.e., the dynamical processes the network models, to influence the dynamics of the network structure, determining where and when to modify that structure. We model the dynamics on the network using Jackson network dynamics, and the dynamics of the network structure using minimal specialization, a variant of the more general network growth model known as specialization. The resulting model, which we refer to as the integrated specialization model, coevolves both the structure and the dynamics of the network. We show that this model produces networks with real-world properties, such as right-skewed degree distributions, sparsity, the small-world property, and non-trivial equitable partitions. Additionally, when compared to other growth models, the integrated specialization model creates networks with small diameter, minimizing distances across the network. Along with producing these structural features, the model also sequentially removes the network's largest bottlenecks. The result is networks with both dynamic and structural features that allow quantities to move more efficiently through the network.
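    To make the queueing side concrete, the sketch below solves the standard Jackson-network traffic equations and identifies the utilization bottleneck, the natural target for the next rewiring step; the arrival rates, routing matrix, and service rates are invented examples, and the specialization operation itself is not reproduced here.

```python
# Minimal Jackson-network bookkeeping: effective arrival rates and the
# bottleneck node. All numbers below are illustrative.
import numpy as np

def jackson_rates(alpha, P):
    """Solve the traffic equations lambda = alpha + P^T lambda, where
    alpha holds external arrival rates and P[i, j] is the probability
    that a job leaving node i is routed to node j."""
    n = len(alpha)
    return np.linalg.solve(np.eye(n) - P.T, alpha)

def bottleneck(alpha, P, mu):
    """Return the node with the highest utilization rho_i = lambda_i / mu_i."""
    rho = jackson_rates(alpha, P) / mu
    return int(np.argmax(rho)), rho

alpha = np.array([1.0, 0.0, 0.5])   # external arrival rates
P = np.array([[0.0, 0.7, 0.1],      # routing probabilities; row sums < 1,
              [0.2, 0.0, 0.5],      # the remainder leaves the network
              [0.3, 0.3, 0.0]])
mu = np.array([3.0, 2.0, 2.5])      # service rates
print(bottleneck(alpha, P, mu))     # node index with the largest rho
```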

    Fate of Ascaris at various pH, temperature and moisture levels

    Soil-transmitted helminths (STH) are intestinal worms that infect 24% of the world’s population. Stopping the spread of STH is difficult, as the eggs are resilient (they can withstand high pH) and persistent (they can remain viable in soils for several years). To ensure that new sanitation systems can inactivate STH, a better understanding of their resilience is required. This study assessed the inactivation of Ascaris eggs under various conditions of moisture content (MC) (90%), temperature (20–50 °C), and pH (7–12.5). The results highlight that exposure of Ascaris eggs to elevated pH (10.5–12.5) at temperatures ≤27.5 °C for >70 days had no effect on egg viability. Compounding effects of alkaline pH (10.5) or decreasing MC (<20%) were observed at 35 °C, with pH having more of an effect than decreasing MC. To accelerate the inactivation of STH, an increase in the treatment temperature is more effective than an increase in pH. Alkaline pH alone did not inactivate the eggs but can enhance the effect of ammonia, which is likely to be present in organic wastes.

    Digi-Do: a digital information tool to support patients with breast cancer before, during, and after start of radiotherapy treatment: an RCT study protocol

    Background: Radiation Therapy (RT) is a common treatment after breast cancer surgery and a complex process using high-energy X-rays to eradicate cancer cells, important in reducing the risk of local recurrence. The high-tech environment and unfamiliar nature of RT can affect the patient's experience of the treatment. Misconceptions or lack of knowledge about RT processes can increase levels of anxiety and enhance feelings of being unprepared at the beginning of treatment. Moreover, the waiting time is often quite long. The primary aim of this study will be to evaluate whether a digital information tool with VR technology and preparatory information can decrease distress as well as enhance the self-efficacy and health literacy of patients affected by breast cancer before, during, and after RT. A secondary aim will be to explore whether the digital information tool increases patient flow while maintaining or increasing the quality of care. Method: The study is a prospective, longitudinal RCT with an Action Research participatory design approach including mixed-methods data collection, i.e., standardised instruments, qualitative interviews (face-to-face and telephone) with a phenomenological hermeneutical approach, diaries, observations, and time measurements, and is scheduled to take place from autumn 2020 to spring 2022. The intervention group (n=80) will receive standard care and information (oral and written) plus the digital information tool; the control group (n=80) will receive standard care and information (oral and written). Study recruitment and randomisation will be completed at two centres in the west of Sweden. Discussion: Research in this area is scarce and, to our knowledge, only a few previous studies have examined VR as a tool for increasing preparedness in patients with breast cancer about to undergo RT while also including follow-up six months after completed treatment. The participatory approach and design will safeguard the possibility of capturing the patient perspective throughout the development process, and the RCT design supports high research quality. Digitalisation brings new possibilities to provide safe, person-centred information that also displays a realistic picture of RT treatment and its contexts. The planned study will generate generalisable knowledge of relevance to similar health care contexts. Trial registration: ClinicalTrials.gov Identifier: NCT04394325. Registered May 19, 2020. Prospectively registered.
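    The protocol fixes two arms of n=80 recruited at two centres but, in this abstract, does not state the allocation mechanism; purely as an illustration, the sketch below implements one standard option, centre-stratified block randomisation.

```python
# Illustrative centre-stratified block randomisation; the block size
# and seed are arbitrary, and this is not the trial's actual procedure.
import random

def block_randomise(n_per_arm, block_size=4, seed=2020):
    """Allocate 2 * n_per_arm participants to two arms in shuffled
    blocks, keeping the arms balanced throughout recruitment."""
    rng = random.Random(seed)
    schedule = []
    while len(schedule) < 2 * n_per_arm:
        block = (["intervention"] * (block_size // 2)
                 + ["control"] * (block_size // 2))
        rng.shuffle(block)
        schedule.extend(block)
    return schedule[:2 * n_per_arm]

# One allocation list per centre keeps the randomisation stratified.
allocations = {c: block_randomise(40) for c in ("centre_A", "centre_B")}
```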

    Integrating perspectives of patients, healthcare professionals, system developers and academics in the co-design of a digital information tool

    Background: Patients diagnosed with cancer who are due to commence radiotherapy often, despite the provision of a considerable amount of information, report a range of unmet information needs about the treatment process. Factors such as inadequate provision of information, or the stressful situation of having to deal with information about unfamiliar things, may influence the patient’s ability to comprehend the information. There is a need to further advance the format in which such information is presented; its composition should be tailored to the patient’s individual needs and style of learning. Method and findings: Participatory design (PD) methodology is frequently used when a technology-designed artefact is the desired result of the process. This research is descriptive in character and provides a transparent account of the co-design process used to develop an innovative digital information tool, employing PD methodology with several stakeholders participating as co-designers. Involving different stakeholders in line with recommended PD activities enabled us to develop a digital information tool that has the potential to be relevant and user-friendly for the ultimate consumer. Conclusions: By facilitating collaboration, structured PD activities can help researchers, healthcare professionals and patients to co-design patient information that meets the end users’ needs. Furthermore, they can enhance the rigor of the process, ensure the relevance of the information, and ultimately have a positive effect on the reach of the related digital information tool.

    Observed Loss of Polar Mesospheric Ozone Following Substorm-Driven Electron Precipitation

    Several drivers cause precipitation of energetic electrons into the atmosphere. While some of these drivers are accounted for in proxies of energetic electron precipitation (EEP) used in atmosphere and climate models, it is unclear to what extent the proxies capture substorm-driven EEP. The energies of these electrons allow them to reach altitudes between 55 and 95 km. EEP-driven enhanced ionization is known to result in production of HOx and NOx, which catalytically destroy ozone. Substorm-driven ozone loss has previously been simulated, but has not been observed until now. We use mesospheric ozone observations from the Microwave Limb Sounder and the Global Ozone Monitoring by Occultation of Stars instruments to investigate the loss of ozone during substorms. Following substorm onset, we find reductions of polar mesospheric (∼76 km) ozone by up to 21% on average. This is the first observational evidence demonstrating the importance of substorms for the ozone balance of the polar atmosphere.
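    The post-onset averaging implied by "following substorm onset ... on average" can be illustrated with a simple superposed-epoch calculation; the hourly ozone series, the onset list, and the window length below are placeholders, not the MLS/GOMOS processing used in the study.

```python
# Illustrative superposed-epoch average of fractional ozone change
# after substorm onsets; all inputs are placeholders.
import numpy as np

def epoch_average(ozone, onsets, window=48):
    """Average the fractional ozone change over `window` samples after
    each onset, relative to the sample just before onset (assumed > 0)."""
    ozone = np.asarray(ozone, dtype=float)
    segments = []
    for t0 in onsets:
        if t0 == 0 or t0 + window > len(ozone):
            continue  # skip onsets without a full window of data
        baseline = ozone[t0 - 1]
        segments.append((ozone[t0:t0 + window] - baseline) / baseline)
    return np.mean(segments, axis=0)  # e.g. -0.21 means a 21% reduction
```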

    Infrared Lightcurves of Near Earth Objects

    We present lightcurves and derive periods and amplitudes for a subset of 38 near-Earth objects (NEOs) observed at 4.5 microns with the IRAC camera on the Spitzer Space Telescope, many of them having no previously reported rotation periods. This subset was chosen from about 1800 IRAC NEO observations as having obvious periodicity and significant amplitude. For objects whose observations did not sample the full rotational period, we derived lower limits to these parameters based on sinusoidal fits. Lightcurve durations ranged from 42 to 544 minutes, with derived periods from 16 to 400 minutes. We discuss the effects of lightcurve variations on the thermal modeling used to derive diameters and albedos from Spitzer photometry. We find that both diameters and albedos derived from the lightcurve maxima and minima agree with our previously published results, even for extreme objects, showing the conservative nature of the thermal model uncertainties. We also evaluate the NEO rotation rates, sizes, and cohesive strengths.
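    The sinusoidal fits mentioned above reduce to a small least-squares problem; the sketch below uses a single sinusoid (real asteroid lightcurves are often double-peaked, so the true rotation period may be twice the fitted one) and is a simplification, not the authors' pipeline.

```python
# Illustrative sinusoidal lightcurve fit; data arrays and initial
# guesses are placeholders.
import numpy as np
from scipy.optimize import curve_fit

def sine_model(t, period, amp, phase, offset):
    """Single-sinusoid lightcurve model in relative flux."""
    return amp * np.sin(2 * np.pi * t / period + phase) + offset

def fit_lightcurve(t_min, flux, period_guess):
    """Least-squares fit of the model; if the observations span less
    than one cycle, the fitted period is only a lower limit."""
    p0 = [period_guess, 0.5 * (flux.max() - flux.min()), 0.0, flux.mean()]
    popt, _ = curve_fit(sine_model, t_min, flux, p0=p0)
    period, amp = popt[0], abs(popt[1])
    return period, 2.0 * amp  # period (min) and peak-to-peak amplitude
```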