103,374 research outputs found

    Measuring the Value of the Academic Library: Return on Investment and Other Value Measures

    Return on investment (ROI) is one method of measuring the value of a library's e-journal collection. In an international study designed to test an ROI formula developed as a case study at the University of Illinois, the ROI of e-journals with respect to grant income was found to vary with the mission and subject emphasis of the institution. Faculty members report that e-journals have transformed the way they do research, making them more productive and competitive. Future studies will examine ROI beyond grant income and beyond the value of e-journal collections.
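
    The Illinois case study expressed value as a ratio of grant income attributable to e-journal use to the library's e-journal expenditure. A minimal arithmetic sketch of such a ratio is below; the figures and the attribution fraction are invented for illustration and are not taken from the study.

```python
# Hypothetical sketch of a grant-income ROI calculation for an e-journal collection.
# All figures are invented placeholders; the actual Illinois formula weights grant
# income by faculty-reported reliance on library e-journals.

ejournal_spend = 2_500_000       # annual library expenditure on e-journals (USD, assumed)
grant_income = 120_000_000       # total grant income awarded to the institution (USD, assumed)
share_using_ejournals = 0.30     # fraction of successful proposals that relied on e-journals (assumed)

income_attributable = grant_income * share_using_ejournals
roi = income_attributable / ejournal_spend
print(f"Estimated ROI: {roi:.1f} dollars of grant income per dollar spent on e-journals")
```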

    Nose Heat: Exploring Stress-induced Nasal Thermal Variability through Mobile Thermal Imaging

    Automatically monitoring and quantifying stress-induced thermal dynamic information in real-world settings is an extremely important but challenging problem. In this paper, we explore whether mobile thermal imaging can measure the rich physiological cues of mental stress that can be deduced from a person's nose temperature. To answer this question we build i) a framework for continuously monitoring nasal thermal variability patterns and ii) a novel set of thermal variability metrics to capture the richness of the dynamic information. We evaluated our approach in a series of studies including laboratory-based psychosocial stress-induction tasks and real-world factory settings. We demonstrate that our approach has the potential for assessing stress responses beyond controlled laboratory settings.
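
    As a rough illustration of what a thermal variability metric computed over a nose-tip temperature time series might look like, the sketch below derives per-window standard deviations and slopes from synthetic data; the window length, sampling rate, and metric choices are assumptions, not the authors' metric set.

```python
# Sketch: sliding-window variability metrics over a nose-tip temperature series
# extracted from thermal video. Sampling rate, window length, and metrics are
# assumptions for illustration only.
import numpy as np

def thermal_variability(temps, fs=8.0, window_s=30.0):
    """Return per-window standard deviation and slope (deg C / s) of nasal temperature."""
    win = int(fs * window_s)
    stds, slopes = [], []
    for start in range(0, len(temps) - win + 1, win):
        seg = np.asarray(temps[start:start + win], dtype=float)
        t = np.arange(win) / fs
        stds.append(seg.std())
        slopes.append(np.polyfit(t, seg, 1)[0])
    return np.array(stds), np.array(slopes)

# Synthetic example: a slow cooling trend plus measurement noise over 5 minutes at 8 Hz.
rng = np.random.default_rng(0)
n = 8 * 300
temps = 34.0 - 0.002 * (np.arange(n) / 8) + 0.05 * rng.standard_normal(n)
stds, slopes = thermal_variability(temps)
print(stds.mean(), slopes.mean())
```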

    How much baseline correction do we need in ERP research? Extended GLM model can replace baseline correction while lifting its limits

    Baseline correction plays an important role in past and current methodological debates in ERP research (e.g. the Tanner v. Maess debate in the Journal of Neuroscience Methods), serving as a potential alternative to strong highpass filtering. However, the very assumptions that underlie traditional baseline correction also undermine it, making it statistically unnecessary and even undesirable because it reduces the signal-to-noise ratio. Including the baseline interval as a predictor in a GLM-based statistical approach lets the data determine how much baseline correction is needed, encompassing both full traditional correction and no correction as special cases, while reducing the variance in the residual error term and thus potentially increasing statistical power.
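
    A minimal sketch of the core idea, using simulated trial-level data and assumed effect sizes: rather than subtracting the baseline-interval mean from each trial, the baseline mean enters the GLM as a covariate, so its estimated weight indicates how much correction the data actually support.

```python
# Sketch: baseline interval as a GLM predictor instead of traditional subtraction.
# Simulated single-channel, single-time-point amplitudes; names and effect sizes
# are assumptions for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_trials = 200
condition = rng.integers(0, 2, n_trials)             # 0 = standard, 1 = deviant
drift = rng.normal(0, 5, n_trials)                   # slow noise shared by baseline and signal
baseline = drift + rng.normal(0, 2, n_trials)        # mean amplitude in the pre-stimulus window
amplitude = 3.0 * condition + drift + rng.normal(0, 2, n_trials)  # post-stimulus amplitude

# Traditional baseline correction: fixed weight of 1 on the baseline.
corrected = amplitude - baseline
print(corrected[condition == 1].mean() - corrected[condition == 0].mean())

# GLM alternative: let the data estimate the baseline weight.
X = sm.add_constant(np.column_stack([condition, baseline]))
fit = sm.OLS(amplitude, X).fit()
print(fit.params)   # [intercept, condition effect, estimated baseline weight]
```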

    Fermi-LAT Observations of High- and Intermediate-Velocity Clouds: Tracing Cosmic Rays in the Halo of the Milky Way

    It is widely accepted that cosmic rays (CRs) up to at least PeV energies are Galactic in origin. Accelerated particles are injected into the interstellar medium, where they propagate to the farthest reaches of the Milky Way, including a surrounding halo. The composition of CRs arriving at the solar system can be measured directly and has been used to infer the details of CR propagation, which are then extrapolated to the whole Galaxy. In contrast, indirect methods, such as observations of gamma-ray emission from CR interactions with interstellar gas, have been employed to probe CR densities directly in distant locations throughout the Galactic plane. In this article we use 73 months of data from the Fermi Large Area Telescope in the energy range between 300 MeV and 10 GeV to search for gamma-ray emission produced by CR interactions in several high- and intermediate-velocity clouds located up to ~7 kpc above the Galactic plane. We achieve the first detection of intermediate-velocity clouds in gamma rays and set upper limits on the emission from the remaining targets, thereby tracing the distribution of CR nuclei in the halo for the first time. We find that the gamma-ray emissivity per H atom decreases with increasing distance from the plane at the 97.5% confidence level. This corroborates the notion that CRs at the relevant energies originate in the Galactic disk. The emissivity of the upper intermediate-velocity Arch hints at a 50% decline of CR densities within 2 kpc of the plane. We compare our results to predictions of CR propagation models.
    Comment: Accepted for publication in the Astrophysical Journal.
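
    As a back-of-the-envelope sketch of the quantity being traced, the emissivity per hydrogen atom can be estimated as a cloud's observed gamma-ray intensity divided by its H I column density; the numbers below are placeholders chosen only for a plausible order of magnitude, not measurements from the paper.

```python
# Sketch: gamma-ray emissivity per H atom from a cloud's observed intensity and
# its H I column density. Both input values are assumed placeholders.
intensity = 2.0e-6        # photons cm^-2 s^-1 sr^-1, integrated 300 MeV - 10 GeV (assumed)
column_density = 1.0e20   # H I column density, atoms cm^-2 (assumed)

emissivity_per_atom = intensity / column_density   # photons s^-1 sr^-1 per H atom
print(f"emissivity per H atom ~ {emissivity_per_atom:.1e} ph s^-1 sr^-1")
```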

    Prospects for Annihilating Dark Matter in the inner Galactic halo by the Cherenkov Telescope Array

    We compute the sensitivity to dark matter annihilations for the forthcoming large Cherenkov Telescope Array (CTA) in several primary channels and over a range of dark matter masses from 30 GeV up to 80 TeV. For all channels, we include inverse Compton scattering of the e± produced by dark matter annihilations on the ambient photon background, which yields substantial contributions to the overall gamma-ray flux. We improve the analysis over previous work by: i) implementing a spectral and morphological analysis of the gamma-ray emission; ii) taking into account the most up-to-date cosmic-ray background obtained from a full CTA Monte Carlo simulation and a description of the diffuse astrophysical emission; and iii) including the systematic uncertainties in the rich observational CTA datasets. We find that our spectral and morphological analysis improves the CTA sensitivity by roughly a factor of 2. For the hadronic channels, CTA will be able to probe thermal dark matter candidates over a broad range of masses if the systematic uncertainties in the datasets can be controlled to better than the percent level. For the leptonic modes, the CTA sensitivity will be well below the thermal value of the annihilation cross-section. In this case, even with larger systematics, thermal dark matter candidates up to masses of a few TeV will be easily studied.
    Comment: 15 pages, 4 figures, v2: J-factors for two different DM profiles added in Tab. 1; two new plots added; some clarifications and some references added; results unchanged; matches version published in Phys. Rev.
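
    The sensitivity forecast rests on the standard prompt annihilation flux, dPhi/dE = <sigma v> / (8 pi m_DM^2) x dN/dE x J for a self-conjugate candidate. The numerical sketch below uses an assumed J-factor and a toy photon spectrum, not the inputs of the CTA analysis.

```python
# Sketch of the prompt annihilation gamma-ray flux,
#   dPhi/dE = <sigma v> / (8 * pi * m_DM^2) * dN/dE * J,
# for a self-conjugate WIMP. The J-factor and photon spectrum are crude placeholders.
import math

sigma_v = 3.0e-26      # annihilation cross-section, cm^3 s^-1 (canonical thermal relic value)
m_dm_gev = 1000.0      # dark matter mass, GeV (assumed)
j_factor = 1.0e21      # J-factor of the region of interest, GeV^2 cm^-5 (assumed)

def dn_de(e_gev):
    """Toy photon yield per annihilation per GeV (placeholder, not a real channel spectrum)."""
    x = e_gev / m_dm_gev
    return (1.0 / e_gev) * math.exp(-10.0 * x) if 0 < x < 1 else 0.0

e = 100.0  # GeV
flux = sigma_v / (8 * math.pi * m_dm_gev**2) * dn_de(e) * j_factor
print(f"dPhi/dE at {e:.0f} GeV ~ {flux:.2e} photons cm^-2 s^-1 GeV^-1")
```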

    Investigating the Uniformity of the Excess Gamma rays towards the Galactic Center Region

    We perform a composite likelihood analysis of subdivided regions within the central 26° × 20° of the Milky Way, with the aim of characterizing the spectrum of the gamma-ray galactic center excess in regions of varying galactocentric distance. Outside of the innermost few degrees, we find that the radial profile of the excess is background-model dependent and poorly constrained. The spectrum of the excess emission is observed to extend upwards of 10 GeV outside ~5° in radius, but cuts off steeply between 10 and 20 GeV only in the innermost few degrees. If interpreted as a real feature of the excess, this radial variation in the spectrum has important implications for both astrophysical and dark matter interpretations of the galactic center excess. Single-component dark matter annihilation models face challenges in reproducing this variation; on the other hand, a population of unresolved millisecond pulsars contributing both prompt and secondary inverse Compton emission may be able to explain the spectrum as well as its spatial dependence. We show that the expected differences in the photon-count distributions of a smooth dark matter annihilation signal and an unresolved point source population are an order of magnitude smaller than the fluctuations in the residuals after fitting the data, which implies that mismodeling is an important systematic effect in point source analyses aimed at resolving the gamma-ray excess.
    Comment: 27 pages, 9 figures. Matches accepted version: references added, typo corrected in Sec. 4.2, some additional discussion added (results unchanged).
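
    A composite likelihood of this kind multiplies independent region-by-region Poisson likelihoods and maximizes them jointly over shared parameters. The sketch below does this for toy counts and a single shared excess normalization; the regions, counts, and templates are invented for illustration.

```python
# Sketch: composite (joint) Poisson log-likelihood over subdivided sky regions,
# maximized over one shared excess normalization. All inputs are toy values.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

observed = [np.array([52, 48, 61]), np.array([40, 39, 44])]                 # counts per energy bin, per region
background = [np.array([50.0, 45.0, 55.0]), np.array([38.0, 37.0, 41.0])]   # background model prediction
excess_template = [np.array([3.0, 2.0, 1.0]), np.array([1.5, 1.0, 0.5])]    # predicted excess at norm = 1

def neg_composite_loglike(norm):
    total = 0.0
    for n, b, t in zip(observed, background, excess_template):
        mu = b + norm * t
        total += np.sum(n * np.log(mu) - mu - gammaln(n + 1))   # Poisson log-likelihood per region
    return -total

best = minimize_scalar(neg_composite_loglike, bounds=(0.0, 10.0), method="bounded")
print("best-fit excess normalization:", best.x)
```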

    Disambiguating the role of blood flow and global signal with partial information decomposition

    The global signal (GS) is a ubiquitous construct in resting-state functional magnetic resonance imaging (rs-fMRI), associated with nuisance but, by definition, containing most of the neuronal signal. Global signal regression (GSR) effectively removes the impact of physiological noise and other artifacts, but at the same time it alters correlational patterns in unpredicted ways. Performing GSR while taking into account the underlying physiology (mainly the blood arrival time) has been proven to be beneficial. From these observations we aimed to: 1) characterize the effect of GSR on network-level functional connectivity in a large dataset; 2) assess the complementary role of the global signal and vessels; and 3) use the framework of partial information decomposition to further look into the joint dynamics of the global signal and vessels, and their respective influence on the dynamics of cortical areas. We observe that GSR affects intrinsic connectivity networks in the connectome in a non-uniform way. Furthermore, by estimating the predictive information of blood flow and the global signal using partial information decomposition, we observe that both signals are present in different amounts across intrinsic connectivity networks. Simulations showed that differences in blood arrival time can largely explain this phenomenon, while using hemodynamic and calcium mouse recordings we were able to confirm the presence of vascular effects, as calcium recordings lack hemodynamic information. With these results we confirm network-specific effects of GSR and the importance of taking blood flow into account for improving de-noising methods. Additionally, and beyond the mere issue of data denoising, we quantify the diverse and complementary effects of global and vessel BOLD signals on the dynamics of cortical areas.
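
    Partial information decomposition splits the information that two sources (here, the global signal and a vessel or blood-flow signal) carry about a target region into redundant, unique, and synergistic parts. The sketch below uses the minimum-mutual-information (MMI) redundancy on Gaussian toy data; it is not the estimator or the data used in the paper.

```python
# Sketch: MMI partial information decomposition for jointly Gaussian toy signals.
# gs = global signal, vessel = blood-flow signal, target = a cortical BOLD signal.
# All data are simulated; names and coefficients are assumptions for illustration.
import numpy as np

def gaussian_mi(x, y):
    """Mutual information (nats) between Gaussian variables x and y."""
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    xy = np.vstack([x, y])
    det = lambda c: np.linalg.det(np.atleast_2d(c))
    return 0.5 * np.log(det(np.cov(x)) * det(np.cov(y)) / det(np.cov(xy)))

rng = np.random.default_rng(2)
n = 5000
gs = rng.standard_normal(n)
vessel = 0.6 * gs + 0.8 * rng.standard_normal(n)            # vessel signal partly shares GS dynamics
target = 0.5 * gs + 0.4 * vessel + rng.standard_normal(n)   # cortical signal driven by both

i_gs, i_vessel = gaussian_mi(gs, target), gaussian_mi(vessel, target)
i_joint = gaussian_mi(np.vstack([gs, vessel]), target)
redundancy = min(i_gs, i_vessel)                            # MMI redundancy
unique_gs, unique_vessel = i_gs - redundancy, i_vessel - redundancy
synergy = i_joint - unique_gs - unique_vessel - redundancy
print(dict(redundancy=redundancy, unique_gs=unique_gs,
           unique_vessel=unique_vessel, synergy=synergy))
```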