
    A trapped mercury 199 ion frequency standard

    Mercury 199 ions confined in an RF quadrupole trap and optically pumped by mercury 202 ion resonance light are investigated as the basis for a high-performance frequency standard with commercial possibilities. Results achieved and estimates of the potential performance of such a standard are given.

    Angular Resolution of the LISA Gravitational Wave Detector

    We calculate the angular resolution of the planned LISA detector, a space-based laser interferometer for measuring low-frequency gravitational waves from galactic and extragalactic sources. LISA is not a pointed instrument; it is an all-sky monitor with a quadrupolar beam pattern. LISA will simultaneously measure both polarization components of incoming gravitational waves, so the data will consist of two time series. All physical properties of the source, including its position, must be extracted from these time series. LISA's angular resolution is therefore not a fixed quantity, but rather depends on the type of signal and on how much other information must be extracted. Information about the source position will be encoded in the measured signal in three ways: 1) through the relative amplitudes and phases of the two polarization components, 2) through the periodic Doppler shift imposed on the signal by the detector's motion around the Sun, and 3) through the further modulation of the signal caused by the detector's time-varying orientation. We derive the basic formulae required to calculate LISA's angular resolution \Delta \Omega_S for a given source. We then evaluate \Delta \Omega_S for two sources of particular interest: monochromatic sources and mergers of supermassive black holes. For these two types of sources, we calculate (in the high signal-to-noise approximation) the full variance-covariance matrix, which gives the accuracy to which all source parameters can be measured. Since our results on LISA's angular resolution depend mainly on gross features of the detector geometry, orbit, and noise curve, we expect these results to be fairly insensitive to modest changes in detector design that may occur between now and launch. We also expect that our calculations could be easily modified to apply to a modified design. Comment: 15 pages, 5 figures, RevTex 3.0 file
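    For reference, the high signal-to-noise estimate described above is the standard Fisher-matrix treatment; the relations below are a sketch in conventional notation (assumed here, not quoted from the paper), with the overall normalization of \Delta\Omega_S being a common convention. With the noise-weighted inner product (a|b) = 4 Re \int_0^\infty \tilde a(f) \tilde b^*(f) / S_n(f) \, df, the Fisher matrix, parameter covariance, and sky-position error are

        \Gamma_{ab} = \left( \frac{\partial h}{\partial \theta^a} \,\middle|\, \frac{\partial h}{\partial \theta^b} \right), \qquad
        \Sigma = \Gamma^{-1}, \qquad
        \Delta\Omega_S = 2\pi \, |\sin\theta_S| \, \sqrt{ \Sigma_{\theta_S \theta_S} \Sigma_{\phi_S \phi_S} - \Sigma_{\theta_S \phi_S}^2 },

    where (\theta_S, \phi_S) are the source's sky angles and the square root measures the area of the error ellipse on the sky.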

    CARS Temperature Measurements in a Hypersonic Propulsion Test Facility

    Nonintrusive diagnostic measurements were performed in the supersonic reacting flow of the Hypersonic Propulsion Test Cell 2 at NASA-Langley. A Coherent Anti-Stokes Raman Spectroscopy (CARS) system was assembled specifically for the test cell environment. System design considerations were: (1) test cell noise and vibration; (2) contamination from flow-field or atmospheric-borne dust; (3) unwanted laser or electrically induced combustion (inside or outside the duct); (4) efficient signal collection; (5) signal splitting to span the wide dynamic range present throughout the flow field; (6) movement of the sampling volume in the flow; and (7) modification of the scramjet model duct to permit optical access to the reacting flow with the CARS system. The flow in the duct was a nominal Mach 2 flow with static pressure near one atmosphere. A single perpendicular injector introduced hydrogen into the flow behind a rearward-facing step. CARS data were obtained in three planes downstream of the injection region. At least 20 CARS data points were collected at each of the regularly spaced sampling locations in each data plane. Contour plots of scramjet combustor static temperature in a reacting flow region are presented.

    Choptuik scaling in six dimensions

    We perform numerical simulations of the critical gravitational collapse of a spherically symmetric scalar field in 6 dimensions. The critical solution has discrete self-similarity. We find the critical exponent \gamma and the self-similarity period \Delta. Comment: 8 pages, 3 figures, RevTex
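    As background (standard critical-collapse relations stated in conventional notation, not quoted from the paper): for a one-parameter family of initial data with parameter p and critical value p_*, slightly supercritical evolutions form black holes whose mass obeys a power law governed by the critical exponent, and discrete self-similarity means the critical solution repeats with period \Delta in the adapted similarity coordinates:

        M_{BH} \propto (p - p_*)^{\gamma}, \qquad
        Z_*(x, \tau) = Z_*(x, \tau + \Delta), \qquad \tau = -\ln(t_* - t), \quad x = r/(t_* - t).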

    A Bayesian approach to the follow-up of candidate gravitational wave signals

    Ground-based gravitational wave laser interferometers (LIGO, GEO-600, Virgo and Tama-300) have now reached high sensitivity and duty cycle. We present a Bayesian evidence-based approach to the search for gravitational waves, in particular aimed at the follow-up of candidate events generated by the analysis pipeline. We introduce and demonstrate an efficient method to compute the evidence and odds ratio between different models, and illustrate this approach using the specific case of the gravitational wave signal generated during the inspiral phase of binary systems, modelled at the leading quadrupole Newtonian order, in synthetic noise. We show that the method is effective in detecting signals at the detection threshold and is robust against (some types of) instrumental artefacts. The computational efficiency of this method makes it scalable to the analysis of all the triggers generated by the analysis pipelines to search for coalescing binaries in surveys with ground-based interferometers, and to a whole variety of signal waveforms, characterised by a larger number of parameters. Comment: 9 pages
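    The evidence and odds ratio referred to above are the standard Bayesian quantities; in the usual notation (assumed here, not taken verbatim from the paper), with H_1 the signal-plus-noise model and H_0 the noise-only model,

        Z_i = P(d | H_i) = \int \mathcal{L}(d | \theta, H_i) \, p(\theta | H_i) \, d\theta, \qquad
        O_{10} = \frac{P(H_1 | d)}{P(H_0 | d)} = \frac{P(H_1)}{P(H_0)} \, \frac{Z_1}{Z_0},

    where the evidence integral runs over the signal parameters (for a leading-order Newtonian chirp, essentially the amplitude, chirp mass, and time and phase at coalescence).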

    Gravitational Wave Chirp Search: Economization of PN Matched Filter Bank via Cardinal Interpolation

    The final inspiral phase in the evolution of a compact binary consisting of black holes and/or neutron stars is among the most probable events that a network of ground-based interferometric gravitational wave detectors is likely to observe. Gravitational radiation emitted during this phase will have to be dug out of noise by matched-filtering (correlating) the detector output with a bank of several \times 10^5 templates, making the computational resources required quite demanding, though not formidable. We propose an interpolation method for evaluating the correlation between template waveforms and the detector output and show that the method is effective in substantially reducing the number of templates required. Indeed, the number of templates needed could be a factor \sim 4 smaller than required by the usual approach, when the minimal overlap between the template bank and an arbitrary signal (the so-called {\it minimal match}) is 0.97. The method is amenable to easy implementation, and the various detector projects might benefit by adopting it to reduce the computational costs of inspiraling neutron star and black hole binary searches. Comment: scheduled for publication in Phys. Rev. D 6
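    The economization idea can be illustrated with a toy sketch: because the correlation of the data with the templates varies smoothly with the template parameters, it can be sampled on a coarse grid and reconstructed in between by cardinal (sinc) interpolation instead of evaluating every fine-grid template. The Python below is a minimal illustration using a hypothetical one-parameter "chirp-like" waveform in white noise, not the paper's post-Newtonian implementation; reconstruction accuracy depends on the coarse spacing, which in the paper's setting is controlled by the minimal-match requirement.

    import numpy as np

    def matched_filter_correlation(data, template):
        # White-noise matched filter: correlation of the data with a unit-norm template.
        template = template / np.linalg.norm(template)
        return float(np.dot(data, template))

    def cardinal_interpolate(samples, sample_x, query_x):
        # Shannon/cardinal (sinc) interpolation of values sampled on a uniform grid.
        dx = sample_x[1] - sample_x[0]
        return np.array([np.sum(samples * np.sinc((x - sample_x) / dx)) for x in query_x])

    # Toy "chirp-like" waveform parameterized by a single frequency-sweep rate.
    t = np.linspace(0.0, 1.0, 4096)
    def waveform(rate):
        return np.sin(2.0 * np.pi * (30.0 * t + 0.5 * rate * t ** 2))

    rng = np.random.default_rng(0)
    data = waveform(41.3) + 0.5 * rng.standard_normal(t.size)

    # Coarse template bank: sample the correlation once per unit of sweep rate ...
    coarse_rates = np.arange(20.0, 61.0, 1.0)
    coarse_corr = np.array([matched_filter_correlation(data, waveform(r)) for r in coarse_rates])

    # ... then reconstruct it on a 4x finer grid by interpolation, without
    # generating any of the intermediate templates.
    fine_rates = np.arange(25.0, 55.0, 0.25)
    interp_corr = cardinal_interpolate(coarse_corr, coarse_rates, fine_rates)

    # Check the reconstruction against a direct (4x more expensive) fine-grid evaluation.
    direct_corr = np.array([matched_filter_correlation(data, waveform(r)) for r in fine_rates])
    print("peak of interpolated correlation at rate", fine_rates[np.argmax(interp_corr)])
    print("max reconstruction error:", np.max(np.abs(interp_corr - direct_corr)))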

    Recovering the stationary phase condition for accurately obtaining scattering and tunneling times

    The stationary phase method is often employed for computing tunneling {\em phase} times of analytically continuous {\em Gaussian} or infinite-bandwidth step pulses which collide with a potential barrier. The indiscriminate use of this method without considering the barrier boundary effects leads to some misconceptions in the interpretation of the phase times. After re-examining the above-barrier diffusion problem, where we notice that the wave packet collision necessarily leads to the possibility of multiple reflected and transmitted wave packets, we study the phase times for tunneling/reflecting particles in a framework where the idea of multiple wave packet decomposition is recovered. To partially overcome the analytical incongruities that arise when tunneling phase time expressions are obtained, we present a theoretical exercise involving a symmetrical collision between two identical wave packets and a one-dimensional square potential barrier, where the scattered wave packets can be recomposed by summing the amplitudes of the simultaneously reflected and transmitted waves. Comment: 32 pages, 5 figures, 1 table
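    For reference, the standard stationary-phase relation that the paper re-examines is the following (conventional notation, assumed here). Writing the transmission amplitude as T(E) = |T(E)| e^{i\varphi(E)}, the transmitted packet is \psi_T(x,t) \propto \int g(E) |T(E)| e^{i[k(E)x - Et/\hbar + \varphi(E)]} dE, and requiring the phase to be stationary at the spectral peak E_0 locates the packet's maximum:

        \frac{d}{dE}\Big[ \varphi(E) + k(E)\,x - \frac{E t}{\hbar} \Big]_{E=E_0} = 0
        \quad \Longrightarrow \quad
        t(x) = \hbar \, \frac{d\varphi}{dE}\Big|_{E_0} + \frac{x}{v_g}, \qquad v_g = \frac{1}{\hbar} \frac{dE}{dk}\Big|_{E_0},

    which is the phase-time expression whose indiscriminate use near the barrier boundaries the paper cautions against.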

    Has The Era Of Slow Growth For Prescription Drug Spending Ended?

    In the period 2005–13 the US prescription drug market grew at an average annual pace of only 1.8 percent in real terms on an invoice price basis (that is, in constant dollars and before manufacturers' rebates and discounts). But the growth rate increased dramatically in 2014, when the market expanded by 11.5 percent, raising questions about future trends. We determined the impact of manufacturers' rebates and discounts on prices and identified the underlying factors likely to influence prescription spending over the next decade. These include a strengthening of the innovation pipeline; consolidation among buyers such as wholesalers, pharmacy benefit managers, and health insurers; and a reduced incidence of patent expirations, which means that fewer less costly generic substitutes will enter the market than in the recent past. While various forecasts indicate that pharmaceutical spending growth will moderate from its 2014 level, the business tension between buyers and sellers could play out in many different ways. This suggests that future spending trends remain highly uncertain. United States. National Institutes of Health (NIA NIH/R01AG043560)

    Estimating the detectable rate of capture of stellar mass black holes by massive central black holes in normal galaxies

    The capture and subsequent inspiral of stellar mass black holes on eccentric orbits by central massive black holes is one of the more interesting likely sources of gravitational radiation detectable by LISA. We estimate the rate of observable events and the associated uncertainties. A moderately favourable mass function could provide many detectable bursts each year, and a detection of at least one burst per year is very likely given our current understanding of the populations in the cores of normal spiral galaxies. Comment: 3 pages, 2-column RevTeX LaTeX macros. No figures. Classical and Quantum Gravity, accepted

    Evaluating the spatial transferability and temporal repeatability of remote sensing-based lake water quality retrieval algorithms at the European scale: a meta-analysis approach

    Many studies have shown the considerable potential for the application of remote-sensing-based methods for deriving estimates of lake water quality. However, the reliable application of these methods across time and space is complicated by the diversity of lake types, sensor configurations, and the multitude of different algorithms proposed. This study tested one operational and 46 empirical algorithms sourced from the peer-reviewed literature that have individually shown potential for estimating lake water quality properties in the form of chlorophyll-a (algal biomass) and Secchi disc depth (SDD; water transparency) in independent studies. Nearly half (19) of the algorithms were unsuitable for use with the remote-sensing data available for this study. The remaining 28 were assessed using the Terra/Aqua satellite archive to identify the best-performing algorithms in terms of accuracy and transferability within the period 2001–2004 in four test lakes, namely Vänern, Vättern, Geneva, and Balaton. These lakes represent the broad continuum of large European lake types, varying in terms of eco-region (latitude/longitude and altitude), morphology, mixing regime, and trophic status. All algorithms were tested for each lake separately and combined to assess the degree of their applicability in ecologically different sites. None of the algorithms assessed in this study exhibited promise when all four lakes were combined into a single data set, and most algorithms performed poorly even for specific lake types. A chlorophyll-a retrieval algorithm originally developed for eutrophic lakes showed the most promising results (R^2 = 0.59) in oligotrophic lakes. Two SDD retrieval algorithms, one originally developed for turbid lakes and the other for lakes with various characteristics, exhibited promising results in relatively less turbid lakes (R^2 = 0.62 and 0.76, respectively). The results presented here highlight the complexity associated with remotely sensed lake water quality estimates and the high degree of uncertainty due to various limitations, including the lake water optical properties and the choice of methods.
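    As a schematic of the evaluation procedure, the sketch below scores a hypothetical empirical band-ratio chlorophyll-a algorithm against in situ matchup samples using the coefficient of determination R^2; the algorithm form, coefficients, and data are illustrative only and are not taken from the study.

    import numpy as np

    def band_ratio_chla(red, nir, a=25.0, b=-5.0):
        # Hypothetical empirical retrieval: chlorophyll-a from a NIR/red reflectance ratio.
        return a * (nir / red) + b

    def r_squared(observed, predicted):
        # Coefficient of determination of algorithm estimates against in situ values.
        ss_res = np.sum((observed - predicted) ** 2)
        ss_tot = np.sum((observed - np.mean(observed)) ** 2)
        return 1.0 - ss_res / ss_tot

    # Synthetic matchups standing in for satellite reflectances paired with
    # in situ chlorophyll-a samples from a single lake.
    rng = np.random.default_rng(1)
    red = rng.uniform(0.01, 0.05, 50)
    nir = red * rng.uniform(0.2, 1.5, 50)
    in_situ_chla = 25.0 * (nir / red) - 5.0 + rng.normal(0.0, 2.0, 50)

    estimate = band_ratio_chla(red, nir)
    print(f"R^2 = {r_squared(in_situ_chla, estimate):.2f}")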