
    Pupil Dilation Dynamics Track Attention to High-Level Information

    Get PDF
    It has long been thought that the eyes index the inner workings of the mind. Consistent with this intuition, empirical research has demonstrated that pupils dilate as a consequence of attentional effort. Recently, Smallwood et al. (2011) demonstrated that pupil dilations not only provide an index of overall attentional effort, but are time-locked to stimulus changes during attention (but not during mind-wandering). This finding suggests that pupil dilations afford a dynamic readout of conscious information processing. However, because stimulus onsets in their study involved shifts in luminance as well as information, they could not determine whether this coupling of stimulus and pupillary dynamics reflected attention to low-level (luminance) or high-level (information) changes. Here, we replicated the methodology and findings of Smallwood et al. (2011) while controlling for luminance changes. When presented with isoluminant digit sequences, participants' pupillary dilations were synchronized with stimulus onsets when attending, but not when mind-wandering. This replicates Smallwood et al. (2011) and clarifies their finding by demonstrating that stimulus-pupil coupling reflects online cognitive processing beyond sensory gain.
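
    As a minimal sketch (not from the paper) of how such stimulus-locked pupillary coupling can be quantified: epochs of the pupil trace are cut around each stimulus onset, baseline-corrected, and averaged, so that onset-locked dilations survive averaging while unrelated fluctuations cancel. All names, sampling rates, and effect sizes below are illustrative assumptions.

        import numpy as np

        def event_locked_average(pupil, onsets, fs, pre=0.5, post=2.0):
            """Average baseline-corrected pupil epochs around stimulus onsets.

            pupil  : 1-D array of pupil-diameter samples
            onsets : sample indices of stimulus onsets
            fs     : sampling rate in Hz
            """
            n_pre, n_post = int(pre * fs), int(post * fs)
            epochs = []
            for t in onsets:
                if t - n_pre >= 0 and t + n_post < len(pupil):
                    seg = pupil[t - n_pre : t + n_post]
                    # Baseline-correct against the pre-onset window
                    epochs.append(seg - seg[:n_pre].mean())
            return np.mean(epochs, axis=0)

        # Synthetic demo: a noisy trace with small dilations locked to onsets
        fs = 60                                              # hypothetical 60 Hz eye tracker
        t = np.arange(0, 120, 1 / fs)
        onsets = np.arange(5 * fs, len(t) - 5 * fs, 3 * fs)  # one onset every 3 s
        pupil = np.random.default_rng(0).normal(0, 0.05, len(t))
        for o in onsets:
            pupil[o : o + fs] += 0.2 * np.hanning(fs)        # injected dilation

        print(event_locked_average(pupil, onsets, fs).max())  # peaks near 0.2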

    Estimation of conditional laws given an extreme component

    Full text link
    Let $(X,Y)$ be a bivariate random vector. The estimation of a probability of the form $P(Y \leq y \mid X > t)$ is challenging when $t$ is large, and a fruitful approach consists in studying, if it exists, the limiting conditional distribution of the random vector $(X,Y)$, suitably normalized, given that $X$ is large. There already exists a wide literature on bivariate models for which this limiting distribution exists. In this paper, a statistical analysis of this problem is carried out. Estimators of the limiting distribution (which is assumed to exist) and of the normalizing functions are provided, as well as an estimator of the conditional quantile function when the conditioning event is extreme. Consistency of the estimators is proved, and a functional central limit theorem for the estimator of the limiting distribution is obtained. The small-sample behavior of the estimator of the conditional quantile function is illustrated through simulations.
    Comment: 32 pages, 5 figures
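
    As a hedged aside (notation assumed, not taken from the abstract): in conditional extreme value models this limiting assumption is typically written with normalizing functions $a(t) > 0$, $b(t)$, $\alpha(t) > 0$, $\beta(t)$ and a non-degenerate limit law $G$, e.g.

        \[
          P\!\left(
            \frac{X - b(t)}{a(t)} \le x,\;
            \frac{Y - \beta(t)}{\alpha(t)} \le y
            \;\middle|\; X > t
          \right)
          \;\longrightarrow\; G(x, y)
          \qquad \text{as } t \to x^{*},
        \]

    where $x^{*}$ is the upper endpoint of $X$; estimated normalizing functions and an estimated $G$ can then be inverted to approximate conditional quantiles of $Y$ given the extreme event $X > t$.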

    Bone mineral content after renal transplantation

    Get PDF
    Forearm bone mineral content (BMC), as evaluated by photon absorption densitometry, was measured in 28 recipients of cadaver donor kidneys who entered the study 8 weeks postoperatively and were followed up for 18 months. BMC decreased significantly (p<0.05) but marginally in placebo-treated patients (n=14) (initial BMC 1.09±0.25 g/cm; final BMC 1.05±0.24 g/cm). Fourteen patients were prophylactically given 1,25(OH)2 vitamin D3 in a dose which avoided hypercalcemia and hypercalciuria (~0.25 µg/day); under 1,25(OH)2 vitamin D3 prophylaxis no significant decrease of forearm BMC was observed (initial BMC 0.94±0.21 g/cm; final BMC 0.95±0.21 g/cm), but the difference between placebo and 1,25(OH)2 vitamin D3 narrowly missed statistical significance (p=0.066). It is concluded that the decrease of forearm BMC is negligible in transplant recipients on low-steroid regimens. The data suggest a trend for prophylaxis with 1,25(OH)2 vitamin D3 to slightly ameliorate forearm (cortical) BMC loss.

    High-Sensitivity Measurement of 3He-4He Isotopic Ratios for Ultracold Neutron Experiments

    Get PDF
    Research efforts ranging from studies of solid helium to searches for a neutron electric dipole moment require isotopically purified helium with a ratio of 3He to 4He at levels below that which can be measured using traditional mass spectroscopy techniques. We demonstrate an approach to such a measurement using accelerator mass spectroscopy, reaching the $10^{-14}$ level of sensitivity, several orders of magnitude more sensitive than other techniques. Measurements of 3He/4He in samples relevant to the measurement of the neutron lifetime indicate the need for substantial corrections. We also argue that there is a clear path forward to sensitivity increases of at least another order of magnitude.
    Comment: 11 pages, 10 figures

    AI ATAC 1: An Evaluation of Prominent Commercial Malware Detectors

    Full text link
    This work presents an evaluation of six prominent commercial endpoint malware detectors, a network malware detector, and a file-conviction algorithm from a cyber technology vendor. The evaluation was administered as the first of the Artificial Intelligence Applications to Autonomous Cybersecurity (AI ATAC) prize challenges, funded by / completed in service of the US Navy. The experiment employed 100K files (50/50% benign/malicious) with a stratified distribution of file types, including ~1K zero-day program executables (increasing experiment size by two orders of magnitude over previous work). We present an evaluation process of delivering a file to a fresh virtual machine running the detection technology, waiting 90 seconds to allow static detection, then executing the file and waiting another period for dynamic detection; this allows greater fidelity in the observational data than previous experiments, in particular for resource and time-to-detection statistics. To execute all 800K trials (100K files × 8 tools), a software framework was designed to choreograph the experiment into a completely automated, time-synced, and reproducible workflow with substantial parallelization. A cost-benefit model was configured to integrate the tools' recall, precision, time to detection, and resource requirements into a single comparable quantity by simulating costs of use. This provides a ranking methodology for cyber competitions and a lens through which to reason about the varied statistical viewpoints of the results. These statistical and cost-model results provide insights into the state of commercial malware detection.
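
    The abstract does not give the cost model's exact form; the sketch below illustrates, with invented weights and names, how recall, precision, time to detection, and resource use might be folded into a single dollar-denominated cost for ranking tools. It is a hedged sketch, not the challenge's actual model.

        from dataclasses import dataclass

        @dataclass
        class ToolStats:
            recall: float          # fraction of malicious files detected
            precision: float       # fraction of alerts that are true positives
            mean_ttd_s: float      # mean time to detection, seconds
            cpu_overhead: float    # average fraction of a CPU core consumed

        def simulated_cost(stats: ToolStats,
                           n_files: int = 100_000,
                           malicious_rate: float = 0.5,
                           cost_fn: float = 500.0,     # assumed $ per missed malware
                           cost_fp: float = 10.0,      # assumed $ per false alarm triaged
                           cost_delay: float = 0.01,   # assumed $ per second of delay
                           cost_cpu: float = 1_000.0): # assumed $ per core over the study
            """Fold a tool's metrics into one comparable cost (illustrative weights)."""
            n_mal = n_files * malicious_rate
            detections = n_mal * stats.recall
            misses = n_mal - detections
            # Infer false alarms from precision: FP = TP * (1 - p) / p
            false_alarms = detections * (1 - stats.precision) / max(stats.precision, 1e-9)
            return (misses * cost_fn
                    + false_alarms * cost_fp
                    + detections * stats.mean_ttd_s * cost_delay
                    + stats.cpu_overhead * cost_cpu)

        # Rank two hypothetical detectors: lower simulated cost wins.
        tools = {"tool_a": ToolStats(0.95, 0.99, 45.0, 0.20),
                 "tool_b": ToolStats(0.90, 0.999, 5.0, 0.05)}
        for name, s in sorted(tools.items(), key=lambda kv: simulated_cost(kv[1])):
            print(name, round(simulated_cost(s), 2))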

    Beyond the Hype: A Real-World Evaluation of the Impact and Cost of Machine Learning-Based Malware Detection

    Full text link
    There is a lack of scientific testing of commercially available malware detectors, especially those that boast accurate classification of never-before-seen (i.e., zero-day) files using machine learning (ML). The result is that the efficacy of, and gaps among, the available approaches are opaque, inhibiting end users from making informed network security decisions and researchers from targeting gaps in current detectors. In this paper, we present a scientific evaluation of four market-leading malware detection tools to assist an organization with two primary questions: (Q1) To what extent do ML-based tools accurately classify never-before-seen files without sacrificing detection ability on known files? (Q2) Is it worth purchasing a network-level malware detector to complement host-based detection? We tested each tool against 3,536 total files (2,554, or 72%, malicious; 982, or 28%, benign), including over 400 zero-day malware samples, and tested with a variety of file types and delivery protocols. We present statistical results on detection time and accuracy, consider complementary analysis (using multiple tools together), and provide two novel applications of a recent cost-benefit evaluation procedure by Iannacone & Bridges that incorporates all the above metrics into a single quantifiable cost. While the ML-based tools are more effective at detecting zero-day files and executables, the signature-based tool may still be an overall better option. Both network-based tools provide substantial (simulated) savings when paired with either host tool, yet both show poor detection rates on protocols other than HTTP and SMTP. Our results show that all four tools have near-perfect precision but alarmingly low recall, especially on file types other than executables and office files -- 37% of malware tested, including all polyglot files, were undetected.
    Comment: Includes Actionable Takeaways for SOC
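
    As a hedged illustration of the complementary analysis mentioned above (pairing a network-level with a host-based detector), the sketch below OR-combines per-file verdicts from two hypothetical tools: the union typically raises recall at a modest cost in precision. All recall and false-positive numbers are invented.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 3_536                          # file count matching the study's scale
        is_malicious = rng.random(n) < 0.72

        def simulate_tool(recall, fpr):
            """Hypothetical per-file verdicts for a detector with given recall/FPR."""
            hits = rng.random(n) < recall  # detections among malicious files
            fps = rng.random(n) < fpr      # false alarms among benign files
            return np.where(is_malicious, hits, fps)

        host = simulate_tool(recall=0.55, fpr=0.001)     # assumed host tool
        network = simulate_tool(recall=0.45, fpr=0.002)  # assumed network tool
        combined = host | network                        # alert if either tool alerts

        for name, verdicts in [("host", host), ("network", network),
                               ("host|network", combined)]:
            tp = (verdicts & is_malicious).sum()
            fp = (verdicts & ~is_malicious).sum()
            recall = tp / is_malicious.sum()
            precision = tp / max(tp + fp, 1)
            print(f"{name:12s} recall={recall:.3f} precision={precision:.3f}")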

    Supernova / Acceleration Probe: A Satellite Experiment to Study the Nature of the Dark Energy

    Full text link
    The Supernova / Acceleration Probe (SNAP) is a proposed space-based experiment designed to study the dark energy and alternative explanations of the acceleration of the Universe's expansion by performing a series of complementary, systematics-controlled measurements. We describe a self-consistent reference mission design for building a Type Ia supernova Hubble diagram and for performing a wide-area weak gravitational lensing study. A 2-m wide-field telescope feeds a focal plane consisting of a 0.7 square-degree imager tiled with equal areas of optical CCDs and near-infrared sensors, and a high-efficiency low-resolution integral field spectrograph. The SNAP mission will obtain high-signal-to-noise calibrated light curves and spectra for several thousand supernovae at redshifts between z=0.1 and 1.7. A wide-field survey covering one thousand square degrees resolves ~100 galaxies per square arcminute. If we assume we live in a cosmological-constant-dominated Universe, the matter density, dark energy density, and flatness of space can all be measured with SNAP supernova and weak-lensing measurements to a systematics-limited accuracy of 1%. For a flat universe, the pressure-to-density ratio of dark energy can be similarly measured to 5% for the present value w0 and ~0.1 for the time variation w'. The large survey area, depth, spatial resolution, time sampling, and nine-band optical-to-NIR photometry will support additional independent and/or complementary dark-energy measurement approaches as well as a broad range of auxiliary science programs. (Abridged)
    Comment: 40 pages, 18 figures, submitted to PASP, http://snap.lbl.go
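
    For context on the quoted w0 and w' targets: w is the dark energy equation-of-state ratio, and a simple linear-in-redshift expansion (an assumed form; the paper may use a different parameterization) reads

        % w is the dark energy equation-of-state ratio, w = p / (rho c^2).
        % Assumed linear-in-z form; the mission may parameterize differently:
        \[
          w(z) \simeq w_0 + w' z,
          \qquad
          w_0 = w(0), \quad w' = \left. \frac{dw}{dz} \right|_{z=0}.
        \]
        % The quoted goals then correspond to roughly 5% precision on w_0
        % and ~0.1 on w'.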