
    The Impact of Line Misidentification on Cosmological Constraints from Euclid and other Spectroscopic Galaxy Surveys

    We perform forecasts for how baryon acoustic oscillation (BAO) scale and redshift-space distortion (RSD) measurements from future spectroscopic emission-line galaxy (ELG) surveys such as Euclid are degraded in the presence of spectral line misidentification. Using analytic calculations verified with mock galaxy catalogs from log-normal simulations, we find that constraints are degraded in two ways, even when the interloper power spectrum is modeled correctly in the likelihood. First, there is a loss of signal-to-noise ratio for the power spectrum of the target galaxies, which propagates to all cosmological constraints and increases with the contamination fraction, f_c. Second, degeneracies can open up between f_c and cosmological parameters. In our calculations this typically increases BAO scale uncertainties at the 10-20% level when marginalizing over parameters determining the broadband power spectrum shape. External constraints on f_c, or on parameters determining the shape of the power spectrum, for example from cosmic microwave background (CMB) measurements, can remove this effect. There is a near-perfect degeneracy between f_c and the power spectrum amplitude for low f_c values, where f_c is not well determined from the contaminated sample alone. This has the potential to strongly degrade RSD constraints. The degeneracy can be broken with an external constraint on f_c, for example from cross-correlation with a separate galaxy sample containing the misidentified line, or from deeper sub-surveys. Comment: 18 pages, 7 figures, updated to match version accepted by ApJ (extra paragraph added at the end of Section 4.3, minor text edits).
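    The first degradation mechanism, the loss of target signal-to-noise, can be illustrated with a toy two-component model in which a fraction f_c of the sample is drawn from an interloper population. The power-law spectra below are purely illustrative, and the sketch omits the paper's full treatment (shot noise, and the distortion of interloper clustering when projected to the wrong redshift):

    ```python
    import numpy as np

    def observed_power(k, p_target, p_interloper, f_c):
        """Toy contaminated power spectrum: the target contribution is
        suppressed by (1 - f_c)^2 while the interlopers add f_c^2 of
        their own power (shot noise and scale distortions omitted)."""
        return (1.0 - f_c) ** 2 * p_target(k) + f_c ** 2 * p_interloper(k)

    # Illustrative power-law spectra, not fits to any survey.
    p_t = lambda k: 1e4 * k ** -1.5
    p_i = lambda k: 5e3 * k ** -1.5

    k = np.logspace(-2, 0, 50)
    clean = observed_power(k, p_t, p_i, 0.0)
    contaminated = observed_power(k, p_t, p_i, 0.1)

    # With f_c = 0.1 the target power is suppressed by (0.9)^2 = 0.81,
    # i.e. roughly a 19% loss of target signal before any modeling.
    suppression = (1 - 0.1) ** 2
    ```

    Even with the interloper term included in the likelihood, the (1 - f_c)^2 suppression of the target power is unavoidable, which is why the signal-to-noise loss grows with f_c.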

    The NCAS mobile dual-polarisation Doppler X-band weather radar (NXPol)

    In recent years, dual-polarisation Doppler X-band radars have become a widely used part of the atmospheric scientist's toolkit for examining cloud dynamics and microphysics and making quantitative precipitation estimates. This is especially true for research questions that require mobile radars. Here we describe the National Centre for Atmospheric Science (NCAS) mobile X-band dual-polarisation Doppler weather radar (NXPol), the first radar of its kind in the UK, together with the infrastructure used to deploy it. The NXPol is a Meteor 50DX manufactured by Selex-Gematronik (Selex ES GmbH), modified to operate without a radome and with a larger 2.4 m diameter antenna that produces a 0.98° half-power beam width. We provide an overview of the radar's technical specifications, with emphasis on the infrastructure developed to deploy it as an autonomous observing facility in remote locations. To demonstrate the radar's capabilities, we also present examples of its use in three recent field campaigns and its ongoing observations at the NERC Facility for Atmospheric and Radio Research (NFARR).

    The development of an unsupervised hierarchical clustering analysis of dual-polarization weather surveillance radar observations to assess nocturnal insect abundance and diversity

    This is the final version, available on open access from Wiley via the DOI in this record. Contemporary analyses of insect population trends are based, for the most part, on a large body of heterogeneous and short-term datasets of diurnal species that are representative of limited spatial domains. This makes monitoring changes in insect biomass and biodiversity difficult. What is needed is a method for monitoring that provides a consistent, high-resolution picture of insect populations through time over large areas, during day and night. Here, we explore the use of X-band weather surveillance radar (WSR) for the study of local insect populations, using a high-quality, multi-week time series of nocturnal moth light-trapping data. Specifically, we test the hypotheses that (i) unsupervised data-driven classification algorithms can differentiate meteorological and biological phenomena, (ii) the diversity of the classes of bioscatterers is quantitatively related to the diversity of insects as measured on the ground, and (iii) insect abundance measured at ground level can be predicted quantitatively from dual-polarization Doppler WSR variables. Adapting the quasi-vertical profile analysis method and data clustering techniques developed for the analysis of hydrometeors, we demonstrate that our bioscatterer classification algorithm successfully differentiates bioscatterers from hydrometeors over a large spatial scale and at high temporal resolution. Furthermore, our results show a clear relationship between biological and meteorological scatterers, and a link between the abundance and diversity of radar-based bioscatterer clusters and that of nocturnal aerial insects. Thus, we demonstrate the potential utility of this approach for landscape-scale monitoring of biodiversity. Funded by the Natural Environment Research Council (NERC) and the Bill and Melinda Gates Foundation.
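    The unsupervised clustering step described above can be sketched with standard agglomerative tools. The snippet below is only an illustration of the model class: the dual-polarisation variables (Z_H, ZDR, rho_hv), the synthetic distributions, and the two-cluster cut are assumptions for demonstration, not the paper's actual pipeline or thresholds:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)

    # Synthetic dual-polarisation observations: reflectivity (Z_H, dBZ),
    # differential reflectivity (ZDR, dB), correlation coefficient (rho_hv).
    # As a common rule of thumb (used here only to fake separable data),
    # hydrometeors show high rho_hv; bioscatterers show low rho_hv, high ZDR.
    hydro = np.column_stack([rng.normal(30.0, 5.0, 100),
                             rng.normal(0.5, 0.3, 100),
                             rng.normal(0.98, 0.01, 100)])
    bio = np.column_stack([rng.normal(10.0, 5.0, 100),
                           rng.normal(4.0, 1.0, 100),
                           rng.normal(0.60, 0.10, 100)])
    X = np.vstack([hydro, bio])

    # Standardize each variable, then cluster hierarchically (Ward linkage)
    # and cut the dendrogram into two classes.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    labels = fcluster(linkage(Xs, method="ward"), t=2, criterion="maxclust")
    ```

    On data this well separated, the two recovered clusters correspond almost exactly to the two simulated populations; the paper's contribution is showing that a comparable separation holds for real quasi-vertical-profile radar data.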

    The establishment of the Standard Cosmological Model through observations

    Over the last decades, observations of increasing quality have revolutionized our understanding of the general properties of the Universe. Questions posed for millennia by mankind about the origin, evolution and structure of the cosmos have found an answer. This has been possible mainly thanks to observations of the Cosmic Microwave Background, of the large-scale distribution of matter structure in the local Universe, and of type Ia supernovae that have revealed the accelerated expansion of the Universe. All these observations have successfully converged into the so-called "concordance model". In spite of all these observational successes, there are still some important open problems, the most obvious of which are what generated the initial matter inhomogeneities that led to the structure observable in today's Universe, and what is the nature of dark matter and of the dark energy that drives the accelerated expansion. In this chapter I will expand on these aspects. I will present a general description of the Standard Cosmological Model of the Universe, with special emphasis on the most recent observations that have allowed us to consolidate this model. I will also discuss the shortcomings of this model and its most pressing open questions, and will briefly describe the observational programmes that are being planned to tackle these issues. Comment: Accepted for publication in the book "Reviews in Frontiers of Modern Astrophysics: From Space Debris to Cosmology" (eds Kabath, Jones and Skarka; publisher Springer Nature), funded by the European Union Erasmus+ Strategic Partnership grant "Per Aspera Ad Astra Simul" 2017-1-CZ01-KA203-03556.

    Planck 2015 results. XIII. Cosmological parameters

    We present results based on full-mission Planck observations of temperature and polarization anisotropies of the CMB. These data are consistent with the six-parameter inflationary LCDM cosmology. From the Planck temperature and lensing data, for this cosmology we find a Hubble constant H_0 = (67.8 +/- 0.9) km/s/Mpc, a matter density parameter Omega_m = 0.308 +/- 0.012, and a scalar spectral index n_s = 0.968 +/- 0.006. (We quote 68% errors on measured parameters and 95% limits on other parameters.) Combined with Planck temperature and lensing data, Planck LFI polarization measurements lead to a reionization optical depth of tau = 0.066 +/- 0.016. Combining Planck with other astrophysical data, we find N_eff = 3.15 +/- 0.23 for the effective number of relativistic degrees of freedom, and the sum of neutrino masses is constrained to < 0.23 eV. Spatial curvature is found to be |Omega_K| < 0.005. For LCDM we find a limit on the tensor-to-scalar ratio of r < 0.11, consistent with the B-mode constraints from an analysis of BICEP2, Keck Array, and Planck (BKP) data. Adding the BKP data leads to a tighter constraint of r < 0.09. We find no evidence for isocurvature perturbations or cosmic defects. The equation of state of dark energy is constrained to w = -1.006 +/- 0.045. Standard big bang nucleosynthesis predictions for the Planck LCDM cosmology are in excellent agreement with observations. We investigate annihilating dark matter and deviations from standard recombination, finding no evidence for new physics. The Planck results for base LCDM are in agreement with BAO data and with the JLA SNe sample. However, the amplitude of the fluctuations is found to be higher than inferred from rich cluster counts and weak gravitational lensing. Apart from these tensions, the base LCDM cosmology provides an excellent description of the Planck CMB observations and many other astrophysical data sets.
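    The quoted central values fix the expansion history of a flat LCDM model, H(z) = H_0 * sqrt(Omega_m (1+z)^3 + Omega_Lambda) with Omega_Lambda = 1 - Omega_m. A minimal sketch using the paper's best-fit numbers (radiation density neglected, as is standard at low redshift):

    ```python
    import math

    H0 = 67.8                 # km/s/Mpc (Planck temperature + lensing)
    omega_m = 0.308           # matter density parameter
    omega_l = 1.0 - omega_m   # flat LCDM: Omega_Lambda = 1 - Omega_m

    def hubble(z):
        """Expansion rate H(z) in km/s/Mpc for flat LCDM (radiation ignored)."""
        return H0 * math.sqrt(omega_m * (1.0 + z) ** 3 + omega_l)

    # Hubble time 1/H0 in Gyr, using 1 Mpc = 3.0857e19 km and
    # 1 Gyr = 3.1557e16 s; this comes out near 14.4 Gyr.
    hubble_time_gyr = (3.0857e19 / H0) / 3.1557e16
    ```

    The Hubble time is only an order-of-magnitude proxy for the age of the Universe; the actual age requires integrating 1/[(1+z)H(z)] over redshift.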

    The Effect of Credit Conditions on the Dutch Housing Market

    It is widely perceived that the supply of mortgages, especially following the extensive liberalization of the mortgage market since the 1980s, has had implications for the Dutch housing market. In this paper we introduce a new method to estimate a credit conditions index (CCI). The CCI represents changes in the supply of credit over time, apart from changes in interest rates and income. Examples of such changes include (1) the development of markets for financial futures, options, swaps, securitized loans and synthetic securities, which allowed easy access to credit for financial intermediaries; (2) more sophisticated risk management, for example improved initial credit scoring; (3) changes in risk perception by financial intermediaries due to changes in the macro-economic environment, such as the rate of unemployment; (4) the introduction of new mortgage products; (5) reduced transaction costs and asymmetric information thanks to innovations in IT, telephony and data management; and (6) financial liberalization, that is, the relaxation or tightening of credit controls such as liquidity ratios on banks, down-payment requirements, maximum repayment periods, allowed types of mortgages, and loan-to-value and loan-to-income ratios. The CCI is estimated as an unobserved component in an error-correction model in which the average mortgage amount is explained by borrowing capacity and additional control variables. The model is estimated on data for first-time buyers, for whom housing and non-housing wealth can be assumed to be essentially zero. The CCI is subsequently used in an error-correction model for house prices representing not only first-time buyers but all households. The models are estimated using quarterly data from 1995 to 2012.
The estimated credit conditions index has a high correlation with the Bank Lending Survey, a quarterly survey in which banks are asked whether there has been a tightening or relaxation of (mortgage) lending standards compared to the preceding period. The credit conditions index has explanatory power in the error-correction model for house prices. In real terms, house prices declined by about 25% from 2009 to 2012. The estimation results show that 12 percentage points of this decline can be attributed to a decline in the credit conditions index.
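An error-correction model of this kind relates short-run changes in house prices to the lagged deviation from a long-run equilibrium relation. The two-step Engle-Granger sketch below uses synthetic data and plain least squares purely to illustrate the model class; the paper's actual estimation treats the credit conditions index as an unobserved component, which this toy omits:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Synthetic quarterly data: log income follows a random walk with drift,
# and log house prices are cointegrated with it (long-run elasticity 1.5).
income = np.cumsum(rng.normal(0.01, 0.02, n))
price = 1.5 * income + rng.normal(0.0, 0.05, n)

# Step 1: long-run (cointegrating) regression by OLS; the residual is
# the disequilibrium, i.e. how far prices sit from the long-run relation.
A = np.column_stack([np.ones(n), income])
beta, *_ = np.linalg.lstsq(A, price, rcond=None)
resid = price - A @ beta

# Step 2: error-correction regression for price changes on income changes
# and the lagged disequilibrium term.
dp, di = np.diff(price), np.diff(income)
X = np.column_stack([np.ones(n - 1), di, resid[:-1]])
gamma, *_ = np.linalg.lstsq(X, dp, rcond=None)

# gamma[2] is the error-correction coefficient: a negative value means
# prices adjust back toward the long-run relation after a deviation.
```

In the paper's setting, the long-run relation additionally contains the time-varying CCI, so the equilibrium itself shifts as credit conditions loosen or tighten.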