
    ODISEES: Ontology-Driven Interactive Search Environment for Earth Sciences

    This paper discusses the Ontology-driven Interactive Search Environment for Earth Sciences (ODISEES) project, currently being developed to aid researchers attempting to find usable data among an overabundance of closely related data. ODISEES' ontological structure relies on a modular, adaptable concept modeling approach, which allows the domain to be modeled more or less as it is, without regard to terminology or external requirements. In the model, each variable is individually assigned semantic content based on the characteristics of the measurement it represents, allowing intuitive discovery and comparison of data without requiring the user to sift through large numbers of data sets and variables to find the desired information.

    High-Sensitivity Measurement of ^3He-^4He Isotopic Ratios for Ultracold Neutron Experiments

    Research efforts ranging from studies of solid helium to searches for a neutron electric dipole moment require isotopically purified helium with a ratio of ^3He to ^4He at levels below that which can be measured using traditional mass spectroscopy techniques. We demonstrate an approach to such a measurement using accelerator mass spectroscopy, reaching the 10^(-14) level of sensitivity, several orders of magnitude more sensitive than other techniques. Measurements of ^3He/^4He in samples relevant to the measurement of the neutron lifetime indicate the need for substantial corrections. We also argue that there is a clear path forward to sensitivity increases of at least another order of magnitude. (11 pages, 10 figures)

    V2368 Oph: An eclipsing and double-lined spectroscopic binary used as a photometric comparison star for U Oph

    The A-type star HR 6412 = V2368 Oph was used by several investigators as a photometric comparison star for the known eclipsing binary U Oph, but was found to be variable by three independent groups, including us. By analysing series of new spectral and photometric observations and a critical compilation of available radial velocities, we were able to find the correct period of the light and radial-velocity variations and demonstrate that the object is an eclipsing and double-lined spectroscopic binary moving in a highly eccentric orbit. We derived a linear ephemeris T_min.I = HJD (2454294.67 +/- 0.01) + (38.32712 +/- 0.00004) d × E and estimated preliminary basic physical properties of the binary. The dereddened UBV magnitudes and effective temperatures of the primary and secondary, based on our light- and velocity-curve solutions, led to distance estimates that agree with the Hipparcos distance within the errors. We find that the mass ratio must be close to one, but the limited number and wavelength range of our current spectra do not allow a truly precise determination of the binary masses. Nevertheless, our results show convincingly that both binary components have evolved away from the main sequence, which makes this system astrophysically very important. There are only a few similarly evolved A-type stars among known eclipsing binaries. Future systematic observations and careful analyses can provide very stringent tests for stellar evolutionary theory. (10 pages, 7 figures, in press 2011 A&A)
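The linear ephemeris quoted above can be evaluated directly. As a quick illustration (a sketch written for this listing, not code from the paper; uncertainties are ignored and only the central values are used):

```python
# Evaluating the linear ephemeris from the abstract:
# T_min.I = HJD (2454294.67 +/- 0.01) + (38.32712 +/- 0.00004) d × E
def primary_minimum_hjd(epoch: int,
                        t0: float = 2454294.67,
                        period_days: float = 38.32712) -> float:
    """Predicted Heliocentric Julian Date of the primary eclipse minimum
    for integer cycle count `epoch` since the reference minimum."""
    return t0 + period_days * epoch

# Ten cycles (a bit over a year, given the ~38.3 d period) after the
# reference epoch:
t10 = primary_minimum_hjd(10)
```

With a 38.3-day period, scheduling follow-up photometry of an eclipse is just a matter of stepping the cycle count E.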

    AI ATAC 1: An Evaluation of Prominent Commercial Malware Detectors

    This work presents an evaluation of six prominent commercial endpoint malware detectors, a network malware detector, and a file-conviction algorithm from a cyber technology vendor. The evaluation was administered as the first of the Artificial Intelligence Applications to Autonomous Cybersecurity (AI ATAC) prize challenges, funded by and completed in service of the US Navy. The experiment employed 100K files (50/50% benign/malicious) with a stratified distribution of file types, including ~1K zero-day program executables (increasing experiment size by two orders of magnitude over previous work). We present an evaluation process of delivering a file to a fresh virtual machine running the detection technology, waiting 90s to allow static detection, then executing the file and waiting another period for dynamic detection; this allows greater fidelity in the observational data than previous experiments, in particular for resource and time-to-detection statistics. To execute all 800K trials (100K files × 8 tools), a software framework was designed to choreograph the experiment into a completely automated, time-synced, and reproducible workflow with substantial parallelization. A cost-benefit model was configured to integrate the tools' recall, precision, time to detection, and resource requirements into a single comparable quantity by simulating costs of use. This provides a ranking methodology for cyber competitions and a lens through which to reason about the varied statistical viewpoints of the results. These statistical and cost-model results provide insights on the state of commercial malware detection.

    Beyond the Hype: A Real-World Evaluation of the Impact and Cost of Machine Learning-Based Malware Detection

    There is a lack of scientific testing of commercially available malware detectors, especially those that boast accurate classification of never-before-seen (i.e., zero-day) files using machine learning (ML). The result is that the efficacy of and gaps among the available approaches are opaque, inhibiting end users from making informed network security decisions and researchers from targeting gaps in current detectors. In this paper, we present a scientific evaluation of four market-leading malware detection tools to assist an organization with two primary questions: (Q1) To what extent do ML-based tools accurately classify never-before-seen files without sacrificing detection ability on known files? (Q2) Is it worth purchasing a network-level malware detector to complement host-based detection? We tested each tool against 3,536 total files (2,554 or 72% malicious, 982 or 28% benign), including over 400 zero-day malware samples, and tested with a variety of file types and delivery protocols. We present statistical results on detection time and accuracy, consider complementary analysis (using multiple tools together), and provide two novel applications of a recent cost-benefit evaluation procedure by Iannaconne & Bridges that incorporates all the above metrics into a single quantifiable cost. While the ML-based tools are more effective at detecting zero-day files and executables, the signature-based tool may still be an overall better option. Both network-based tools provide substantial (simulated) savings when paired with either host tool, yet both show poor detection rates on protocols other than HTTP or SMTP. Our results show that all four tools have near-perfect precision but alarmingly low recall, especially on file types other than executables and office files: 37% of malware tested, including all polyglot files, went undetected. (Includes actionable takeaways for SOCs.)
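The "near-perfect precision but alarmingly low recall" pattern is easy to see from confusion counts. In this sketch, only the 2,554-malicious corpus size comes from the abstract; the true-positive and false-positive counts are made-up numbers chosen to illustrate the pattern, not measured results.

```python
# Precision/recall from confusion counts. The corpus split (2,554 malicious
# files) is from the abstract; tp and fp below are illustrative, not measured.
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Precision = tp/(tp+fp); recall = tp/(tp+fn)."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn)
    return precision, recall

# A hypothetical tool that convicts 1,600 of the 2,554 malicious files with
# only 2 false alarms: almost every alert is correct, yet well over a third
# of the malware slips through.
p, r = precision_recall(tp=1600, fp=2, fn=2554 - 1600)
```

This is why the paper's cost-based view matters: precision alone makes such a tool look excellent, while the missed-malware cost tells a different story.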

    Supernova / Acceleration Probe: A Satellite Experiment to Study the Nature of the Dark Energy

    The Supernova / Acceleration Probe (SNAP) is a proposed space-based experiment designed to study the dark energy and alternative explanations of the acceleration of the Universe's expansion by performing a series of complementary systematics-controlled measurements. We describe a self-consistent reference mission design for building a Type Ia supernova Hubble diagram and for performing a wide-area weak gravitational lensing study. A 2-m wide-field telescope feeds a focal plane consisting of a 0.7 square-degree imager tiled with equal areas of optical CCDs and near-infrared sensors, and a high-efficiency low-resolution integral field spectrograph. The SNAP mission will obtain high-signal-to-noise calibrated light curves and spectra for several thousand supernovae at redshifts between z=0.1 and 1.7. A wide-field survey covering one thousand square degrees resolves ~100 galaxies per square arcminute. If we assume we live in a cosmological-constant-dominated Universe, the matter density, dark energy density, and flatness of space can all be measured with SNAP supernova and weak-lensing measurements to a systematics-limited accuracy of 1%. For a flat universe, the density-to-pressure ratio of dark energy can be similarly measured to 5% for the present value w0 and ~0.1 for the time variation w'. The large survey area, depth, spatial resolution, time-sampling, and nine-band optical-to-NIR photometry will support additional independent and/or complementary dark-energy measurement approaches as well as a broad range of auxiliary science programs. (Abridged; 40 pages, 18 figures, submitted to PASP, http://snap.lbl.go)