149 research outputs found

    The academic, economic and societal impacts of Open Access: an evidence-based review

    Ongoing debates surrounding Open Access to the scholarly literature are multifaceted and complicated by disparate and often polarised viewpoints from engaged stakeholders. At the current stage, Open Access has become such a global issue that it is critical for all involved in scholarly publishing, including policymakers, publishers, research funders, governments, learned societies, librarians, and academic communities, to be well-informed on the history, benefits, and pitfalls of Open Access. In spite of this, there is a general lack of consensus regarding the potential pros and cons of Open Access at multiple levels. This review aims to be a resource for current knowledge on the impacts of Open Access by synthesizing important research in three major areas: academic, economic and societal. While there is clearly much scope for additional research, several key trends are identified, including a broad citation advantage for researchers who publish openly, as well as additional benefits to the non-academic dissemination of their work. The economic impact of Open Access is less well understood, although it is clear that access to the research literature is key for innovative enterprises and a range of governmental and non-governmental services. Furthermore, Open Access has the potential to save both publishers and research funders considerable amounts of financial resources, and can provide some economic benefits to traditionally subscription-based journals. The societal impact of Open Access is strong, in particular for advancing citizen science initiatives and leveling the playing field for researchers in developing countries. Open Access supersedes all potential alternative modes of access to the scholarly literature by enabling unrestricted re-use and long-term stability independent of the financial constraints of traditional publishers that impede knowledge sharing. However, Open Access has the potential to become unsustainable for research communities if high-cost options are allowed to continue to prevail in a widely unregulated scholarly publishing market. Open Access remains only one of the multiple challenges that the scholarly publishing system is currently facing. Yet, it provides one foundation for increasing engagement with researchers regarding ethical standards of publishing and the broader implications of 'Open Research'.

    Flowing with Time: a New Approach to Nonlinear Cosmological Perturbations

    Nonlinear effects are crucial in order to compute the cosmological matter power spectrum to the accuracy required by future generation surveys. Here, a new approach is presented, in which the power spectrum, the bispectrum and higher order correlations are obtained -- at any redshift and for any momentum scale -- by integrating a system of differential equations. The method is similar to the familiar BBGKY hierarchy. Truncating at the level of the trispectrum, the solution of the equations corresponds to the summation of an infinite class of perturbative corrections. Compared to other resummation frameworks, the scheme discussed here is particularly suited to cosmologies other than LambdaCDM, such as those based on modifications of gravity and those containing massive neutrinos. As a first application, we compute the Baryonic Acoustic Oscillation feature of the power spectrum, and compare the results with perturbation theory, the halo model, and N-body simulations. The density-velocity and velocity-velocity power spectra are also computed, showing that they are much less contaminated by nonlinearities than the density-density one. The approach can be seen as a particular formulation of the renormalization group, in which time is the flow parameter. Comment: 20 pages, 7 figures. Matches version published in JCAP
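    For readers unfamiliar with this type of scheme, the sketch below integrates a deliberately schematic truncated hierarchy in Python: a power spectrum and a bispectrum are evolved jointly in time, with the trispectrum set to zero to close the system. The initial spectrum, the growth rates and the coupling coefficients are toy placeholders, not the mode-coupling vertices of the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        k = np.logspace(-3, 0, 200)              # wavenumbers [h/Mpc] (assumed grid)
        P0 = k / (1.0 + (k / 0.02) ** 3)         # toy initial power spectrum

        def rhs(eta, y):
            """Truncated hierarchy: dP depends on B, dB on P and B; trispectrum dropped."""
            P, B = np.split(y, 2)
            dP = 2.0 * P + 0.1 * B               # linear growth plus feedback from the bispectrum
            dB = 3.0 * B + 0.05 * P ** 2         # sourced by P*P; closure sets the trispectrum to zero
            return np.concatenate([dP, dB])

        y0 = np.concatenate([P0, np.zeros_like(P0)])
        sol = solve_ivp(rhs, (0.0, 1.0), y0, method="RK45", rtol=1e-6)
        P_nl = sol.y[:k.size, -1]                # "nonlinear" P(k) at the final time of the toy run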

    Photometric Redshift Estimation Using Spectral Connectivity Analysis

    The development of fast and accurate methods of photometric redshift estimation is a vital step towards being able to fully utilize the data of next-generation surveys within precision cosmology. In this paper we apply a specific approach to spectral connectivity analysis (SCA; Lee & Wasserman 2009) called diffusion map. SCA is a class of non-linear techniques for transforming observed data (e.g., photometric colours for each galaxy, where the data lie on a complex subset of p-dimensional space) to a simpler, more natural coordinate system wherein we apply regression to make redshift predictions. As SCA relies upon eigen-decomposition, our training set size is limited to ~10,000 galaxies; we use the Nystrom extension to quickly estimate diffusion coordinates for objects not in the training set. We apply our method to 350,738 SDSS main sample galaxies, 29,816 SDSS luminous red galaxies, and 5,223 galaxies from DEEP2 with CFHTLS ugriz photometry. For all three datasets, we achieve prediction accuracies on par with previous analyses, and find that use of the Nystrom extension leads to a negligible loss of prediction accuracy relative to that achieved with the training sets. As in some previous analyses (e.g., Collister & Lahav 2004, Ball et al. 2008), we observe that our predictions are generally too high (low) in the low (high) redshift regimes. We demonstrate that this is a manifestation of attenuation bias, wherein measurement error (i.e., uncertainty in diffusion coordinates due to uncertainty in the measured fluxes/magnitudes) reduces the slope of the best-fit regression line. Mitigation of this bias is necessary if we are to use photometric redshift estimates produced by computationally efficient empirical methods in precision cosmology. Comment: Resubmitted to MNRAS (11 pages, 8 figures)
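    A minimal sketch of this pipeline, using scikit-learn building blocks, is shown below: an approximate kernel eigen-decomposition stands in for the diffusion map, the Nystroem transformer plays the role of the Nystrom extension for objects outside the training set, and a simple regressor maps the resulting coordinates to redshift. The kernel choice, bandwidth, number of components and the mock data are illustrative assumptions, not the settings used in the paper.

        import numpy as np
        from sklearn.kernel_approximation import Nystroem
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        colours_train = rng.normal(size=(10_000, 4))     # placeholder photometric colours
        z_train = rng.uniform(0.0, 1.2, size=10_000)     # placeholder training redshifts
        colours_new = rng.normal(size=(1_000, 4))        # objects not in the training set

        # Nystroem approximates the kernel eigen-decomposition from a subsample,
        # which is the role the Nystrom extension plays for the diffusion map.
        model = make_pipeline(
            Nystroem(kernel="rbf", gamma=0.5, n_components=200, random_state=0),
            LinearRegression(),
        )
        model.fit(colours_train, z_train)
        z_photo = model.predict(colours_new)             # photometric-redshift estimates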

    Impact of Redshift Information on Cosmological Applications with Next-Generation Radio Surveys

    In this paper, we explore how the forthcoming generation of large-scale radio continuum surveys, with the inclusion of some degree of redshift information, can constrain cosmological parameters. By cross-matching these radio surveys with shallow optical to near-infrared surveys, we can essentially separate the source distribution into a low- and a high-redshift sample, thus providing a constraint on the evolution of cosmological parameters such as those related to dark energy. We examine two radio surveys, the Evolutionary Map of the Universe (EMU) and the Westerbork Observations of the Deep APERTIF Northern sky (WODAN). A crucial advantage is their combined potential to provide a deep, full-sky survey. The surveys used for the cross-identifications are SkyMapper and SDSS, for the southern and northern skies, respectively. We concentrate on the galaxy clustering angular power spectrum as our benchmark observable, and find that the possibility of including such low redshift information yields major improvements in the determination of cosmological parameters. With this approach, and provided a good knowledge of the galaxy bias evolution, we are able to put tight constraints on the dark energy parameters, i.e. w_0=-0.9+/-0.041 and w_a=-0.24+/-0.13, with type Ia supernovae and CMB priors (with a one-parameter bias in this case); this corresponds to a Figure of Merit (FoM) > 600, twice that obtained by using only the cross-identified sources and more than four times better than the case without any redshift information at all. Comment: 12 pages, 6 figures, 6 tables; accepted for publication in MNRAS
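    As a rough illustration of the Figure of Merit quoted above, the snippet below evaluates one common FoM convention, 1/sqrt(det Cov(w_0, w_a)), from the marginalised errors given in the abstract. The w_0-w_a correlation coefficient is not quoted there; the value used here is an assumption chosen for illustration, and the resulting FoM depends strongly on it.

        import numpy as np

        # Marginalised errors quoted in the abstract; the correlation is an assumption.
        sigma_w0, sigma_wa = 0.041, 0.13
        rho = -0.95                                # assumed w0-wa correlation (not from the paper)

        cov = np.array([[sigma_w0**2,               rho * sigma_w0 * sigma_wa],
                        [rho * sigma_w0 * sigma_wa, sigma_wa**2]])
        fom = 1.0 / np.sqrt(np.linalg.det(cov))    # one common FoM convention (constant factors vary)
        print(f"FoM ~ {fom:.0f}")                  # ~600 for this assumed correlation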

    Galaxy Zoo: Reproducing Galaxy Morphologies Via Machine Learning

    We present morphological classifications obtained using machine learning for objects in SDSS DR6 that have been classified by Galaxy Zoo into three classes, namely early types, spirals and point sources/artifacts. An artificial neural network is trained on a subset of objects classified by the human eye and we test whether the machine learning algorithm can reproduce the human classifications for the rest of the sample. We find that the success of the neural network in matching the human classifications depends crucially on the set of input parameters chosen for the machine-learning algorithm. The colours and parameters associated with profile-fitting do a reasonable job of separating the objects into the three classes. However, these results are considerably improved when adding adaptive shape parameters as well as concentration and texture. The adaptive moments, concentration and texture parameters alone cannot distinguish between early type galaxies and the point sources/artifacts. Using a set of twelve parameters, the neural network is able to reproduce the human classifications to better than 90% for all three morphological classes. We find that using a training set that is incomplete in magnitude does not degrade our results given our particular choice of the input parameters to the network. We conclude that it is promising to use machine-learning algorithms to perform morphological classification for the next generation of wide-field imaging surveys and that the Galaxy Zoo catalogue provides an invaluable training set for such purposes. Comment: 13 pages, 5 figures, 10 tables. Accepted for publication in MNRAS. Revised to match accepted version
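    The sketch below shows the general shape of such a classifier in Python, using scikit-learn's MLPClassifier: roughly twelve photometric inputs mapped to three morphological classes. The architecture, hyper-parameters and the random placeholder data are assumptions for illustration, not the network or catalogue used in the paper.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(1)
        X = rng.normal(size=(50_000, 12))      # 12 input parameters per object (placeholder values)
        y = rng.integers(0, 3, size=50_000)    # 0: early type, 1: spiral, 2: point source/artifact

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=200, random_state=0)
        clf.fit(X_train, y_train)
        # With random placeholder data the agreement is only chance level (~1/3);
        # real photometric parameters and Galaxy Zoo labels are needed for a meaningful score.
        print("agreement with held-out labels:", clf.score(X_test, y_test))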

    Data Deluge in Astrophysics: Photometric Redshifts as a Template Use Case

    Astronomy has entered the big data era and Machine Learning based methods have found widespread use in a large variety of astronomical applications. This is demonstrated by the recent huge increase in the number of publications making use of this new approach. The usage of machine learning methods, however, is still far from trivial and many problems still need to be solved. Using the evaluation of photometric redshifts as a case study, we outline the main problems and some ongoing efforts to solve them. Comment: 13 pages, 3 figures, Springer's Communications in Computer and Information Science (CCIS), Vol. 82

    Designing Future Dark Energy Space Missions: II. Photometric Redshift of Space Weak Lensing Optimized Survey

    Accurate weak-lensing analysis requires not only accurate measurement of galaxy shapes but also precise and unbiased measurement of galaxy redshifts. The photometric redshift technique appears to be the only viable way to determine the redshifts of the background galaxies used in the weak-lensing analysis. Using the photometric redshift quality, simple shape measurement requirements, and a proper sky model, we explore what an optimal weak-lensing dark energy mission would look like, based on a figure-of-merit (FoM) calculation. We find that photometric redshifts reach their best accuracy for the bulk of the faint galaxy population when filters have a resolution R~3.2. We show that an optimal mission would survey the sky through 8 filters using 2 cameras (visible and near-infrared). Assuming a 5-year mission duration, a mirror size of 1.5m, and a 0.5 deg2 FOV with a visible pixel scale of 0.15", we find that a homogeneous survey reaching IAB=25.6 (10sigma) with a sky coverage of ~11,000 deg2 maximizes the weak-lensing FoM. The effective number density of galaxies then used for WL is ~45 gal/arcmin2, at least a factor of two better than ground-based surveys. This work demonstrates that a full account of the observational strategy is required to properly optimize the instrument parameters and maximize the FoM of a future weak-lensing space dark energy mission. Comment: 25 pages, 39 figures, accepted in A&A
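    To illustrate the kind of trade-off being optimised, the toy calculation below balances per-pointing depth against achievable sky coverage for a fixed mission time, with the area additionally capped by the usable low-extinction extragalactic sky. The exposure-time and galaxy-count scalings, the sky cap and the crude FoM proxy are all assumptions made for illustration; the paper's optimisation rests on a full sky model and instrument description.

        import numpy as np

        total_time = 5 * 365.25 * 24.0                  # 5-year mission, hours (assumed all usable on-sky)
        fov = 0.5                                       # field of view [deg^2]
        max_sky = 11_000.0                              # assumed usable extragalactic sky [deg^2]
        depths = np.linspace(24.8, 26.4, 33)            # AB depth per pointing

        t_point = 1.0 * 10 ** (0.8 * (depths - 25.0))   # background-limited exposure time per pointing [h]
        area = np.minimum(fov * total_time / t_point, max_sky)
        n_gal = 30.0 * 10 ** (0.35 * (depths - 25.0))   # usable galaxies per arcmin^2 (toy slope)
        fom_toy = area * n_gal                          # crude statistical proxy, not the DETF FoM

        best = int(np.argmax(fom_toy))
        print(f"toy optimum: depth ~{depths[best]:.1f} AB over ~{area[best]:.0f} deg^2")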

    You Know What It Is: Learning Words through Listening to Hip-Hop

    Music listeners have difficulty correctly understanding and remembering song lyrics. However, results from the present study support the hypothesis that young adults can learn African-American English (AAE) vocabulary from listening to hip-hop music. Non-African-American participants first gave free-response definitions to AAE vocabulary items, after which they answered demographic questions as well as questions addressing their social networks, their musical preferences, and their knowledge of popular culture. Results from the survey show a positive association between the number of hip-hop artists listened to and AAE comprehension vocabulary scores. Additionally, participants were more likely to know an AAE vocabulary item if the hip-hop artists they listen to use the word in their song lyrics. Together, these results suggest that young adults can acquire vocabulary through exposure to hip-hop music, a finding relevant for research on vocabulary acquisition, the construction of adolescent and adult identities, and the adoption of lexical innovations.

    Herschel-ATLAS: VISTA VIKING near-IR counterparts in the Phase 1 GAMA 9h data

    We identify near-infrared Ks band counterparts to Herschel-ATLAS sub-mm sources, using a preliminary object catalogue from the VISTA VIKING survey. The sub-mm sources are selected from the H-ATLAS Phase 1 catalogue of the GAMA 9h field, which includes all objects detected at 250, 350 or 500 um with the SPIRE instrument. We apply and discuss a likelihood ratio (LR) method for VIKING candidates within a search radius of 10" of the 22,000 SPIRE sources with a 5 sigma detection at 250 um. We find that 11,294 (51%) of the SPIRE sources have a best VIKING counterpart with a reliability R ≥ 0.8, and the false identification rate of these is estimated to be 4.2%. We expect to miss ~5% of true VIKING counterparts. There is evidence from Z-J and J-Ks colours that the reliable counterparts to SPIRE galaxies are marginally redder than the field population. We obtain photometric redshifts for ~68% of all (non-stellar) VIKING candidates, with a median redshift of 0.405. Comparing to the results of the optical identifications supplied with the Phase 1 catalogue, we find that the use of medium-deep near-infrared data improves the identification rate of reliable counterparts from 36% to 51%. Comment: 20 pages, 20 figures, 3 tables, accepted by MNRAS
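    A hedged sketch of the likelihood-ratio identification is given below, in the spirit of the standard formalism (Sutherland & Saunders 1992): LR = q(m) f(r) / n(m), where f(r) is the distribution of positional offsets of true counterparts, q(m) the magnitude distribution of true counterparts and n(m) that of the background. The Gaussian positional error, the Q0 value and the candidate numbers below are placeholders, not the values derived in the paper.

        import numpy as np

        def f_r(r_arcsec, sigma_pos=2.4):
            """Probability density of true-counterpart offsets (assumed circular Gaussian)."""
            return np.exp(-r_arcsec**2 / (2.0 * sigma_pos**2)) / (2.0 * np.pi * sigma_pos**2)

        def likelihood_ratio(r_arcsec, q_m, n_m):
            """LR = q(m) f(r) / n(m) for each candidate within the 10 arcsec search radius."""
            return q_m * f_r(r_arcsec) / n_m

        def reliability(lr, q0=0.7):
            """Reliability of each candidate, given all candidates of one SPIRE source."""
            return lr / (np.sum(lr) + (1.0 - q0))

        # three hypothetical VIKING candidates around a single SPIRE source
        lr = likelihood_ratio(r_arcsec=np.array([1.0, 4.0, 8.0]),
                              q_m=np.array([0.020, 0.050, 0.010]),
                              n_m=np.array([0.001, 0.004, 0.002]))
        print(reliability(lr))       # counterparts with R >= 0.8 would be kept as reliable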

    The Cross-Correlation between Galaxies and Groups: Probing the Galaxy Distribution in and around Dark Matter Haloes

    We determine the cross-correlation function between galaxies and galaxy groups, using both the Two-Degree Field Galaxy Redshift Survey (2dFGRS) and the Sloan Digital Sky Survey (SDSS). We study the cross-correlation as a function of group mass, and as a function of the luminosity, stellar mass, colour, spectral type and specific star formation rate of the galaxies. All these cross-correlation functions show a clear transition from the '1-halo' to the '2-halo' regimes on a scale comparable to the virial radius of the groups in consideration. On scales larger than the virial radius, all cross-correlation functions are roughly parallel, consistent with the linear bias model. In particular, the large-scale correlation amplitudes are higher for more massive groups, and for brighter and redder galaxies. In the '1-halo' regime, the cross-correlation function depends strongly on the definition of the group center. We consider both a luminosity-weighted center (LWC) and a center defined by the location of the brightest group galaxy (BGC). With the first definition, the bright early-type galaxies in massive groups are found to be more centrally concentrated than the fainter, late-type galaxies. Using the BGC, and excluding the brightest galaxy from the cross-correlation analysis, we only find significant segregation in massive groups (M ≳ 10^13 h^-1 Msun) for galaxies of different spectral types (or colours or specific star formation rates). In haloes with masses ≲ 10^13 h^-1 Msun, there is a significant deficit of bright satellite galaxies. Comparing the results from the 2dFGRS with those obtained from realistic mock samples, we find that the distribution of galaxies in groups is much less concentrated than that of the dark matter haloes predicted by the current ΛCDM model. (Abridged) Comment: 18 pages, 11 figures. Accepted for publication in MNRAS; 1 table added, fig. 7 replaced
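    The sketch below shows the bare bones of such a measurement: cross pair counts between group centres and galaxies compared against a random catalogue, combined with the simple Davis-Peebles estimator xi(r) = D1D2(r) / D1R2(r) - 1. The uniform toy box, the binning and the estimator choice are illustrative assumptions; the actual analysis works in redshift space with survey masks and weighting.

        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(2)
        groups = rng.uniform(0.0, 100.0, size=(500, 3))        # group centres in a toy 100 Mpc/h box
        galaxies = rng.uniform(0.0, 100.0, size=(20_000, 3))   # galaxy positions
        randoms = rng.uniform(0.0, 100.0, size=(100_000, 3))   # random catalogue

        bins = np.logspace(-0.5, 1.3, 12)                      # separation bin edges [Mpc/h]
        tree_grp = cKDTree(groups)

        def paircounts(tree, points, edges):
            """Pair counts per separation bin from cumulative KD-tree counts."""
            cum = tree.count_neighbors(cKDTree(points), edges).astype(float)
            return np.diff(cum)

        d1d2 = paircounts(tree_grp, galaxies, bins)
        d1r2 = paircounts(tree_grp, randoms, bins) * (len(galaxies) / len(randoms))
        xi_cross = d1d2 / d1r2 - 1.0                           # cross-correlation in each bin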