
    Measuring Transit Signal Recovery in the Kepler Pipeline II: Detection Efficiency as Calculated in One Year of Data

    The Kepler planet sample can only be used to reconstruct the underlying planet occurrence rate if the detection efficiency of the Kepler pipeline is known. Here we present the results of a second experiment aimed at characterising this detection efficiency. We inject simulated transiting planet signals into the pixel data of ~10,000 targets, spanning one year of observations, and process the pixels as normal. We compare the set of detections made by the pipeline with the expectation from the set of simulated planets, and construct a sensitivity curve of signal recovery as a function of the signal-to-noise of the simulated transit signal train. The sensitivity curve does not reach the hypothetical maximum detection efficiency; however, it is not as pessimistic as some published estimates of the detection efficiency. For the FGK stars in our sample, the sensitivity curve is well fit by a gamma function with coefficients a = 4.35 and b = 1.05. We also find that the pipeline algorithms recover the depths and periods of the injected signals with very high fidelity, especially for periods longer than 10 days. We perform a simplified occurrence rate calculation using the measured detection efficiency, and compare it against calculations that assume the detection efficiencies previously adopted in the literature, to demonstrate the systematic error those assumptions introduce into the resulting occurrence rates. The discrepancies in the calculated occurrence rates may go some way towards reconciling some of the inconsistencies found in the literature.
    Comment: 13 pages, 7 figures, 1 electronic table, accepted by Ap
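
    The quoted gamma-function fit lends itself to a one-line completeness model. Below is a minimal sketch, assuming the fit refers to the gamma cumulative distribution function evaluated at the signal-to-noise of the transit signal train (a common parameterization for Kepler completeness curves); the use of scipy and the printed S/N grid are illustrative choices, not the paper's.

    ```python
    # Sketch: detection efficiency modeled as a gamma CDF of signal-to-noise,
    # assuming the abstract's fit is the gamma CDF parameterization; a and b
    # are the coefficients quoted above for FGK stars.
    from scipy.stats import gamma

    A, B = 4.35, 1.05  # shape and scale coefficients from the abstract

    def detection_efficiency(snr):
        """Fraction of injected signals recovered at a given signal-to-noise."""
        return gamma.cdf(snr, a=A, scale=B)

    for snr in (5.0, 7.1, 10.0, 15.0):
        print(f"S/N {snr:5.1f}: recovery fraction {detection_efficiency(snr):.3f}")
    ```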

    Contamination in the Kepler Field. Identification of 685 KOIs as False Positives Via Ephemeris Matching Based On Q1-Q12 Data

    The Kepler mission has to date found almost 6000 planetary transit-like signals, utilizing three years of data for over 170,000 stars at extremely high photometric precision. Due to its design, contamination from eclipsing binaries, variable stars, and other transiting planets results in a significant number of these signals being false positives (FPs). This directly affects the determination of the occurrence rate of Earth-like planets in our Galaxy, as well as other planet population statistics. In order to detect as many of these FPs as possible, we perform ephemeris matching among all transiting planet, eclipsing binary, and variable star sources. We find that 685 Kepler Objects of Interest (KOIs), 12% of all those analyzed, are FPs as a result of contamination, due to 409 unique parent sources. Of these, 118 have not previously been identified by other methods. We estimate that ~35% of KOIs are FPs due to contamination, after performing a first-order correction for observational bias. Comparing single-planet candidate KOIs to multi-planet candidate KOIs, we find observed FP fractions due to contamination of 16% and 2.4%, respectively, bolstering the existing evidence that multi-planet KOIs are significantly less likely to be FPs. We also analyze the parameter distributions of the ephemeris matches and derive a simple model for the most common type of contamination in the Kepler field. We find that the ephemeris matching technique is able to identify low signal-to-noise FPs that are difficult to identify with other vetting techniques. We expect FP KOIs to become more frequent as more quarters of Kepler data are analyzed, and note that many of them will not be identifiable from Kepler data alone.
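
    To make the core test concrete, here is a hedged sketch of ephemeris matching: two sources are candidate contamination pairs when their periods agree up to a small integer ratio and their epochs coincide in phase. The tolerances and the integer-ratio test below are illustrative assumptions, not the paper's values.

    ```python
    # Sketch of ephemeris matching (tolerances are illustrative, not the
    # paper's): flag two periodic signals as a possible parent/contaminant
    # pair when periods match up to a small integer ratio and epochs align.
    from fractions import Fraction

    def ephemerides_match(p1, t1, p2, t2, rel_tol=1e-4, phase_tol=0.01, max_int=4):
        """p1, p2: periods (days); t1, t2: epochs of transit/eclipse (days)."""
        ratio = Fraction(p1 / p2).limit_denominator(max_int)
        if ratio.numerator == 0 or ratio.numerator > max_int:
            return False
        # Periods must agree up to a small integer ratio...
        if abs(p1 / p2 - float(ratio)) > rel_tol * (p1 / p2):
            return False
        # ...and the epochs must line up when folded on the longer period.
        p = max(p1, p2)
        dphase = ((t1 - t2) / p) % 1.0
        return min(dphase, 1.0 - dphase) < phase_tol
    ```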

    Systems Biology Approaches For The Analysis Of High-Throughput Biological Data

    The identification of biological processes involved with a certain phenotype, such as a disease or drug treatment, is the goal of the majority of life sciences experiments. Pathway analysis methods are used to interpret high-throughput biological data and identify such processes, incorporating information on biological systems to translate data into biological knowledge. Although widely used, current methods share a number of limitations. First, they do not take into account the individual contribution of each gene to the phenotype under analysis. Second, most methods include parameters that are difficult to interpret and often arbitrarily set. Third, the results of all methods are affected by the fact that pathways are not independent entities, but communicate with each other through a phenomenon referred to as crosstalk. Crosstalk effects heavily influence the results of pathway analysis methods, adding a number of false positives and false negatives and making the results difficult to interpret. We developed methods that address these limitations by i) allowing for the incorporation of individual gene contributions, ii) providing objective procedures for estimating the parameters of pathway analysis methods, and iii) detecting, quantifying, and correcting for crosstalk effects. We show on a number of real and simulated datasets that our approaches increase the specificity and sensitivity of pathway analysis, allowing for a more effective identification of the processes and mechanisms underlying biological phenomena.
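
    As context for the first limitation above, here is a minimal sketch of the classical over-representation test that such methods build on; the function names are illustrative, and the thesis's gene-weighting and crosstalk-correction methods are not reproduced here.

    ```python
    # Classical over-representation analysis: score a pathway by the
    # hypergeometric tail probability of its overlap with a gene list.
    # Every gene counts equally here, which is exactly the limitation of
    # ignoring individual gene contributions noted in the abstract.
    from scipy.stats import hypergeom

    def ora_pvalue(universe_size, pathway_genes, study_genes):
        """P(overlap >= observed) when study_genes are drawn at random."""
        pathway, study = set(pathway_genes), set(study_genes)
        overlap = len(pathway & study)
        return hypergeom.sf(overlap - 1, universe_size, len(pathway), len(study))
    ```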

    Automated test system for the COGNISION headset for cognitive diagnosis.

    There are more than 15 million Americans suffering from a chronic cognitive disability. Researchers have been exploring many different quantitative measures, such as event-related potentials (ERP), electroencephalography (EEG), magnetoencephalography (MEG), and brain volumetry, to accurately and repeatably diagnose patients suffering from debilitating cognitive disorders. More than a million cases are diagnosed every year, with many of those patients being misdiagnosed as a result of inadequate diagnostic and quality control tools. As a result, the medical device industry has been actively developing alternative diagnostic techniques that implement one or more quantitative measures to improve diagnosis. For example, Neuronetrix (Louisville, KY) developed the COGNISION™ system, which utilizes both ERP and EEG data to assess the cognitive ability of patients. The system has been shown to be a powerful tool; however, its commercial success would be limited without a fast and effective method of testing and validating the product. Thus, the goal of this study is to develop, test, and validate a new “Testset” system for accurately and repeatably validating the COGNISION™ Headset. The Testset comprises a software control component, designed in the LabVIEW G programming language, which runs on a computer terminal; a data acquisition (DAQ) card; and a switching board. The Testset is connected to a series of testing fixtures for interfacing with the various components of the Headset. The Testset evaluates the Headset at multiple stages of the manufacturing process, as a whole system or by its individual components. At the first stage of production, the Electrode Strings, amplifier board (Uberyoke), and Headset Control Unit (HCU) are tested and operated as individual printed circuit boards (PCBs). These components are tested again as mid-level assemblies and/or at the finished-product stage as a complete autonomous system, with the Testset monitoring the process. All tests are automated: only a few parameters must be defined before a test is initiated by a single button press, after which the selected test sequences run for that particular component or system and complete in a few minutes. A total of two Testsets were constructed and used to validate ten Headsets. An automated software system was designed to control the Testset. The Testset demonstrated the ability to validate and test 100% of the individual components and fully assembled Headsets. The Testsets were found to be within 5% of the manufacturing specifications. The Automated Testset developed in this study subsequently enabled the manufacturer to provide a comprehensive report on the calibration parameters of the Headset, which is retained on file for each unit sold. Statistical analysis shows that the two Testsets yielded reliable results consistent with each other.
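
    The 5% acceptance criterion can be captured in a few lines. The sketch below is illustrative Python rather than the study's LabVIEW implementation, and the parameter names and nominal values are hypothetical.

    ```python
    # Illustrative pass/fail check (not the LabVIEW code): each measured
    # calibration parameter must fall within 5% of its manufacturing spec.
    # Parameter names and nominal values are hypothetical.
    SPEC = {"channel_gain": 1.00, "filter_cutoff_hz": 100.0, "stim_level_db": 70.0}

    def out_of_tolerance(measured, spec=SPEC, tol=0.05):
        """Return {parameter: measured value} for every reading off-spec."""
        return {
            name: measured[name]
            for name, nominal in spec.items()
            if abs(measured[name] - nominal) > tol * abs(nominal)
        }

    print(out_of_tolerance({"channel_gain": 1.02, "filter_cutoff_hz": 94.0,
                            "stim_level_db": 70.5}))  # flags filter_cutoff_hz
    ```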

    Root Cause Analysis for Autonomous Optical Network Security Management

    The ongoing evolution of optical networks towards autonomous systems supporting high-performance services beyond 5G requires advanced functionalities for automated security management. To cope with the evolving threat landscape, security diagnostic approaches should be able to detect and identify the nature not only of existing attack techniques, but also of those hitherto unknown or insufficiently represented. Machine Learning (ML)-based algorithms perform well when identifying known attack types, but cannot guarantee precise identification of unknown attacks. This makes Root Cause Analysis (RCA) crucial for enabling a timely attack response when human intervention is unavoidable. We address these challenges by establishing an ML-based framework for security assessment and analyzing RCA alternatives for physical-layer attacks. We first scrutinize different Network Management System (NMS) architectures and their corresponding security assessment capabilities. We then investigate the applicability of supervised and unsupervised learning (SL and UL) approaches for RCA and propose a novel UL-based RCA algorithm called Distance-Based Root Cause Analysis (DB-RCA). The framework's applicability and performance for autonomous optical network security management are validated on an experimental physical-layer security dataset, assessing the benefits and drawbacks of the SL- and UL-based RCA. Besides confirming that SL-based approaches can provide precise RCA output for known attack types upon training, we show that the proposed UL-based RCA approach offers meaningful insight into the anomalies caused by novel attack types, thus supporting human security officers in advancing physical-layer security diagnostics.
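
    To make the distance-based idea concrete, here is a minimal sketch in the spirit of DB-RCA; the paper's actual algorithm is not reproduced, and centroid attribution with a novelty threshold is an illustrative assumption.

    ```python
    # Sketch: attribute an anomalous sample to the nearest known attack-type
    # centroid in feature space, or flag it as novel when no centroid is
    # close enough. Illustrative of distance-based RCA, not the paper's DB-RCA.
    import numpy as np

    def attribute_root_cause(sample, centroids, labels, novelty_threshold):
        """centroids: (k, d) array; labels: the k known attack-type names."""
        distances = np.linalg.norm(centroids - sample, axis=1)
        nearest = int(np.argmin(distances))
        if distances[nearest] > novelty_threshold:
            return "unknown attack type", distances[nearest]
        return labels[nearest], distances[nearest]
    ```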

    Identification of mineral components in tropical soils using reflectance spectroscopy and Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data.

    Soil characteristics provide important support for understanding transformations that occur in environmental systems. The physical characteristics and chemical compositions of soils, controlled by pedogenetic processes, climatic changes, and land use, imply different types of environmental transformations. Reflectance spectroscopy is an alternative soil mapping technique that uses spectral absorption features between visible (VIS) and short-wave infrared (SWIR) wavelengths (0.3–2.5 μm) to determine soil mineralogy. Soil analysis by means of reflectance spectroscopy and orbital optical sensors has provided favorable results in mapping transformation processes in environmental systems, particularly in arid and semiarid climates in extra-tropical terrains. In inter-tropical environments, these methods cannot be readily applied due to local factors such as the lack of exposed regolith, high soil moisture, and the presence of dense vegetation. This study uses Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and reflectance spectroscopy data to map mineral components of soils covering part of the state of São Paulo, Brazil, which could be linked to key aspects of environmental transformations in this tropical area (e.g., climate change, shifts in agricultural fronts, pH, and soil characteristics). We collected forty-two (42) soil samples at a depth of 0–20 cm, considering that this superficial layer shows the highest correlation with soil properties detected by the ASTER sensor. These samples were measured using a FieldSpec FR spectrometer, and the derived spectra were interpreted for mineral composition. The interpretation was supported by X-ray diffraction analysis of the same samples. The spectral signatures were resampled to the ASTER VNIR (AST1–4: 0.52–0.86 μm) and SWIR (AST5–9: 1.60–2.43 μm) spectral bandwidths and validated by comparing reflectance spectra of field samples with those extracted from atmospherically corrected and calibrated ASTER pixels. The agreement between spectral signatures measured from soil samples and those derived from ASTER imagery pixels proved reasonable, with R² values ranging from 0.6493 to 0.7886. This signifies that diagnostic spectral features of key minerals in tropical soils can be mapped at the spectral resolution of the 9-band ASTER VNIR through SWIR reflectance. We used these spectral signatures as end-members in hyperspectral classification routines adapted for use with ASTER data. The results demonstrated the feasibility of identifying and remotely mapping minerals such as kaolinite, montmorillonite, and gibbsite, as well as distinguishing between iron-rich and iron-poor soils.
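
    The resampling step admits a compact sketch. The boxcar averaging below is a simplification (proper resampling convolves the instrument's spectral response functions), and the band limits are the approximate published ASTER band passes rather than values from this study.

    ```python
    # Sketch: boxcar-resample a laboratory reflectance spectrum into ASTER's
    # nine VNIR+SWIR bands. A simplification of true spectral-response
    # convolution; band limits are approximate published ASTER passes (um).
    import numpy as np

    ASTER_BANDS_UM = [
        (0.52, 0.60), (0.63, 0.69), (0.78, 0.86),        # VNIR bands 1-3
        (1.600, 1.700), (2.145, 2.185), (2.185, 2.225),  # SWIR bands 4-6
        (2.235, 2.285), (2.295, 2.365), (2.360, 2.430),  # SWIR bands 7-9
    ]

    def resample_to_aster(wavelength_um, reflectance):
        """Average the spectrum over each band's wavelength interval."""
        wavelength_um = np.asarray(wavelength_um)
        reflectance = np.asarray(reflectance)
        return np.array([
            reflectance[(wavelength_um >= lo) & (wavelength_um <= hi)].mean()
            for lo, hi in ASTER_BANDS_UM
        ])
    ```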

    Miniaturizing High Throughput Droplet Assays For Ultrasensitive Molecular Detection On A Portable Platform

    Digital droplet assays, in which biological samples are compartmentalized into millions of femtoliter-volume droplets and interrogated individually, have generated enormous enthusiasm for their ability to detect biomarkers with single-molecule sensitivity. These assays have untapped potential for point-of-care diagnostics but are mainly confined to laboratory settings due to the instrumentation necessary to serially generate, control, and measure millions of compartments. To address this challenge, we developed an optofluidic platform that miniaturizes digital assays into a mobile format by parallelizing their operation. This technology has three key innovations: (1) the integration and parallel operation of a hundred droplet generators on a single chip, which operates >100x faster than a single droplet generator; (2) fluorescence detection of droplets >100x faster than conventional in-flow detection, using time-domain encoded mobile-phone imaging; and (3) the integration of on-chip delay lines and sample processing to allow serum-to-answer device operation. By combining this time-domain modulation with cloud computing, we overcome the low framerate of digital imaging and achieve throughputs of one million droplets per second. To demonstrate the power of this approach, we performed a duplex digital enzyme-linked immunosorbent assay (ELISA) in serum, showing a 1000x improvement in sensitivity over standard ELISA and matching that of the existing laboratory-based gold-standard digital ELISA system. This work has broad potential for ultrasensitive, highly multiplexed detection in a mobile format. Building on our initial demonstration, we explored the following: (i) we demonstrated that the platform can be extended to >100x multiplexing by using time-domain encoded light sources to detect color-coded beads that each correspond to a unique assay; (ii) we demonstrated that the platform can be extended to the detection of nucleic acids by implementing the polymerase chain reaction; and (iii) we demonstrated that sensitivity can be improved with a nanoparticle-enhanced ELISA. Clinical applications can be expanded to measure numerous biomarkers simultaneously, such as surface markers, proteins, and nucleic acids. Ultimately, as a robust device suitable for low-cost implementation with ultrasensitive capabilities, this platform can be used as a tool to quantify numerous medical conditions and help physicians choose optimal treatment strategies, enabling personalized medicine in a cost-effective manner.
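
    The single-molecule sensitivity rests on simple counting statistics. Here is a minimal sketch, assuming Poisson loading of target molecules into droplets (a standard digital-assay model); the droplet volume and counts below are illustrative, not the paper's data.

    ```python
    # Sketch of digital-assay counting statistics: with targets Poisson-loaded
    # into droplets, the mean occupancy follows from the positive fraction p
    # as lambda = -ln(1 - p). Droplet volume and counts are illustrative.
    import math

    def occupancy(n_positive, n_total):
        """Poisson-corrected mean molecules per droplet."""
        return -math.log(1.0 - n_positive / n_total)

    lam = occupancy(12_000, 1_000_000)   # 1.2% positive droplets
    droplet_volume_l = 50e-15            # assume ~50 fL droplets
    molar = lam / (droplet_volume_l * 6.022e23)
    print(f"estimated concentration ~ {molar:.1e} M")
    ```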