
    Magnitude and frequency of wind speed shears and associated downdrafts

    Data are presented indicating the frequency of occurrence of wind shear and downdrafts, together with information on the simultaneous occurrence of these two phenomena. High-resolution wind profile measurements recorded at a 150-meter ground winds tower facility were used for the analysis. From instantaneous measurements during horizontal wind speeds of gale-force and below intensity, vertical motion at the 10, 60, and 150 m levels was approximately 60 percent downward and 40 percent upward. At the 18 m level the percentages were reversed. Updraft maxima were an order of magnitude or two greater than downdraft maxima at all levels. The frequency of vertical motion ≥ 9.7 kts over a year at the four levels was 338 occurrences upward and 274 downward. Approximately 90 percent of these updrafts occurred at the 18 m level, almost equally during summer and winter, and 65 percent of the downdrafts were at the 150 m level during summer.
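The occurrence counts quoted in the abstract can be turned into approximate event numbers with simple arithmetic. This is an illustrative check only; the counts come from the abstract, and the rounding convention is an assumption.

```python
# Toy arithmetic check of the quoted occurrence statistics (illustrative only;
# counts are taken from the abstract, rounding conventions are assumed).
updrafts, downdrafts = 338, 274

total = updrafts + downdrafts
up_share = updrafts / total                  # fraction of all events that were upward
updrafts_18m = round(0.90 * updrafts)        # ~90% of updrafts at the 18 m level
downdrafts_150m = round(0.65 * downdrafts)   # ~65% of downdrafts at the 150 m level

print(f"upward share: {up_share:.1%}")       # ~55.2%
print(f"updrafts at 18 m: ~{updrafts_18m}")
print(f"downdrafts at 150 m: ~{downdrafts_150m}")
```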

    Direct frequency comb laser cooling and trapping

    Continuous wave (CW) lasers are the enabling technology for producing ultracold atoms and molecules through laser cooling and trapping. The resulting pristine samples of slow-moving particles are the de facto starting point for both fundamental and applied science when a highly controlled quantum system is required. Laser-cooled atoms have recently led to major advances in quantum information, the search to understand dark energy, quantum chemistry, and quantum sensors. However, CW laser technology currently limits laser cooling and trapping to special types of elements that do not include highly abundant and chemically relevant atoms such as hydrogen, carbon, oxygen, and nitrogen. Here, we demonstrate that Doppler cooling and trapping by optical frequency combs may provide a route to trapped, ultracold atoms whose spectra are not amenable to CW lasers. We laser cool a gas of atoms by driving a two-photon transition with an optical frequency comb, an efficient process to which every comb tooth coherently contributes. We extend this technique to create a magneto-optical trap (MOT), an electromagnetic beaker for accumulating the laser-cooled atoms for further study. Our results suggest that the efficient frequency conversion offered by optical frequency combs could provide a key ingredient for producing trapped, ultracold samples of nature's most abundant building blocks, as well as antihydrogen. As such, the techniques demonstrated here may enable advances in fields as disparate as molecular biology and the search for physics beyond the standard model. (Comment: 10 pages, 5 figures)
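The claim that "every comb tooth coherently contributes" to a two-photon transition can be illustrated with a toy counting argument: for a comb of teeth f_n = f_0 + n·f_rep, many tooth pairs (n, m) satisfy f_n + f_m = f_transition, so they all drive the same two-photon resonance. The numbers below (repetition rate, offset, tooth count) are arbitrary placeholders, not values from the paper.

```python
# Toy illustration (not from the paper): for a two-photon transition driven by
# a frequency comb, many pairs of comb teeth sum to the same transition
# frequency, so they all contribute to the same two-photon resonance.
f_rep = 1.0e8      # assumed repetition rate in Hz (illustrative)
f_0 = 0.0          # carrier-envelope offset, assumed zero for simplicity
N = 1000           # number of comb teeth considered

teeth = [f_0 + n * f_rep for n in range(N)]
f_transition = teeth[200] + teeth[799]   # pick a reachable two-photon energy

# Count unordered tooth pairs (n, m) with f_n + f_m equal to the transition.
pairs = [(n, m) for n in range(N) for m in range(n, N)
         if abs(teeth[n] + teeth[m] - f_transition) < 1e-6]
print(len(pairs))  # hundreds of tooth pairs drive the same transition
```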

    Discrete local altitude sensing device (Patent)

    Device for use in a descending spacecraft as an altitude sensor for actuating a deceleration retrorocket.

    The latent process decomposition of cDNA microarray data sets

    We present a new computational technique (a software implementation, data sets, and supplementary information are available at http://www.enm.bris.ac.uk/lpd/) which enables the probabilistic analysis of cDNA microarray data, and we demonstrate its effectiveness in identifying features of biomedical importance. A hierarchical Bayesian model, called latent process decomposition (LPD), is introduced in which each sample in the data set is represented as a combinatorial mixture over a finite set of latent processes, which are expected to correspond to biological processes. Parameters in the model are estimated using efficient variational methods. This type of probabilistic model is most appropriate for the interpretation of measurement data generated by cDNA microarray technology. For determining informative substructure in such data sets, the proposed model has several important advantages over the standard use of dendrograms. First, it can objectively assess the optimal number of sample clusters. Second, it can represent samples and gene expression levels using a common set of latent variables (dendrograms cluster samples and gene expression values separately, which amounts to two distinct reduced-space representations). Third, in contrast to standard cluster models, observations are not assigned to a single cluster; thus, for example, gene expression levels are modeled via combinations of the latent processes identified by the algorithm. We show that this new method compares favorably with alternative cluster analysis methods. To illustrate its potential, we apply the proposed technique to several microarray data sets for cancer. For these data sets it successfully decomposes the data into known subtypes and indicates possible further taxonomic subdivision, in addition to highlighting, in a wholly unsupervised manner, the importance of certain genes which are known to be medically significant.
    To illustrate its wider applicability, we also demonstrate its performance on a microarray data set for yeast.
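The model structure described above (each sample a mixture over a finite set of latent processes, with expression values explained per process) can be sketched with a much-simplified EM fit. This is an assumption-laden toy, not the authors' variational implementation: it uses a fixed unit variance, plain EM rather than variational Bayes, and synthetic data.

```python
# Minimal sketch (assumptions: fixed unit variance, plain EM, synthetic data;
# NOT the authors' variational code) of a latent-process-style decomposition:
# each sample holds mixture weights over K latent processes, and each
# expression measurement is explained by a per-process Gaussian mean.
import numpy as np

rng = np.random.default_rng(0)
S, G, K = 20, 30, 2                      # samples, genes, latent processes
mu_true = rng.normal(0, 3, size=(K, G))  # per-process expression profiles
theta_true = rng.dirichlet([1.0] * K, size=S)
z = np.array([[rng.choice(K, p=theta_true[s]) for _ in range(G)]
              for s in range(S)])
x = mu_true[z, np.arange(G)] + rng.normal(0, 0.5, size=(S, G))

theta = np.full((S, K), 1.0 / K)         # EM initialisation
mu = rng.normal(0, 1, size=(K, G))
for _ in range(50):
    # E-step: responsibility of process k for measurement (sample s, gene g)
    log_r = np.log(theta)[:, :, None] - 0.5 * (x[:, None, :] - mu[None]) ** 2
    r = np.exp(log_r - log_r.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)    # shape (S, K, G)
    # M-step: update per-sample mixture weights and per-process means
    theta = r.sum(axis=2) / G
    mu = (r * x[:, None, :]).sum(axis=0) / r.sum(axis=0)

print(theta.shape, mu.shape)             # (20, 2) (2, 30)
```

Unlike a dendrogram, the fitted `theta` gives each sample soft membership in every process, and `mu` describes samples and genes through the same latent variables.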

    Revisiting the 1954 Suspension Experiments of R. A. Bagnold

    In 1954 R. A. Bagnold published his seminal findings on the rheological properties of a liquid-solid suspension. Although this work has been cited extensively over the last fifty years, there has not been a critical review of the experiments. The purpose of this study is to examine the work and to suggest an alternative reason for the experimental findings. The concentric cylinder rheometer was designed to measure simultaneously the shear and normal forces for a wide range of solid concentrations, fluid viscosities, and shear rates. As presented by Bagnold, the analysis and experiments demonstrated that the shear and normal forces depended linearly on the shear rate in the 'macroviscous' regime; as the grain-to-grain interactions increased in the 'grain-inertia' regime, the stresses depended on the square of the shear rate and were independent of the fluid viscosity. These results, however, appear to be dictated by the design of the experimental facility. In Bagnold's experiments, the height (h) of the rheometer was relatively short compared to the spacing (t) between the rotating outer and stationary inner cylinders (h/t = 4.6). Since the top and bottom end plates rotated with the outer cylinder, the flow contained two axisymmetric counter-rotating cells in which flow moved outward along the end plates and inward through the central region of the annulus. At higher Reynolds numbers, these cells contributed significantly to the measured torque, as demonstrated by comparing Bagnold's pure-fluid measurements with studies on laminar-to-turbulent transitions that pre-date the 1954 study. By accounting for the torque along the end walls, Bagnold's shear stress measurements can be estimated by modelling the liquid-solid mixture as a Newtonian fluid with a corrected viscosity that depends on the solids concentration. An analysis of the normal stress measurements was problematic because the gross measurements were not reported and could not be obtained.
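The proposed reinterpretation treats the mixture as a Newtonian fluid whose viscosity is corrected for solids concentration, so shear stress stays linear in shear rate. The sketch below uses Einstein's dilute-limit correction purely as a placeholder; the study's actual concentration dependence is not given in this abstract.

```python
# Illustrative sketch (NOT the study's actual correction): model the mixture
# as Newtonian with a concentration-dependent effective viscosity. Einstein's
# dilute-limit form mu_eff = mu * (1 + 2.5 * phi) is a placeholder assumption.
def effective_viscosity(mu_fluid, phi):
    """Effective viscosity of a dilute suspension (Einstein correction)."""
    return mu_fluid * (1.0 + 2.5 * phi)

def shear_stress(mu_fluid, phi, shear_rate):
    """Newtonian stress: linear in shear rate, as in the 'macroviscous' regime."""
    return effective_viscosity(mu_fluid, phi) * shear_rate

# In this regime, doubling the shear rate doubles the stress.
tau1 = shear_stress(1.0e-3, 0.1, 100.0)   # water-like viscosity, phi = 0.1
tau2 = shear_stress(1.0e-3, 0.1, 200.0)
print(tau2 / tau1)  # 2.0
```

In the 'grain-inertia' regime described by Bagnold, stress instead scales with the square of the shear rate, which this Newtonian placeholder deliberately does not reproduce.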

    Time- and frequency-domain polariton interference

    We present experimental observations of interference between an atomic spin coherence and an optical field in a Λ-type gradient echo memory. The interference is mediated by a strong classical field that couples a weak probe field to the atomic coherence through a resonant Raman transition. Interference can be observed between a prepared spin coherence and another propagating optical field, or between multiple Λ transitions driving a single spin coherence. In principle, the interference in each scheme can yield near-unity visibility. (Comment: 11 pages, 5 figures)
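The near-unity visibility claim follows the standard two-field interference relation V = 2|E1||E2| / (|E1|² + |E2|²), which reaches 1 when the interfering amplitudes are balanced. This is the textbook formula, not a result specific to the experiment.

```python
# Textbook two-field interference visibility (illustrative, not the
# experiment's data): V -> 1 as the two amplitudes become equal.
def visibility(e1, e2):
    return 2 * abs(e1) * abs(e2) / (abs(e1) ** 2 + abs(e2) ** 2)

print(visibility(1.0, 1.0))  # 1.0: balanced amplitudes give unit visibility
print(visibility(1.0, 0.5))  # 0.8: mismatch reduces the fringe contrast
```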

    Measurement based entanglement under conditions of extreme photon loss

    The act of measuring optical emissions from two remote qubits can entangle them. By demanding that a photon from each qubit reaches the detectors, one can ensure that no photon was lost. But the failure rate then rises quadratically with the loss probability. In [1] this resulted in 30 successes per billion attempts. We describe a means to exploit the low-grade entanglement heralded by the detection of a lone photon: a subsequent perfect operation is quickly achieved by consuming this noisy resource. We require only two qubits per node and can tolerate both path-length variation and loss asymmetry. The impact of photon loss upon the failure rate is then linear; realistic high-loss devices can gain orders of magnitude in performance and thus support QIP. (Comment: contains an extension of the protocol that makes it robust against asymmetries in path length and photon loss)
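The quadratic-versus-linear scaling can be made concrete with a toy probability model. The assumption here is simply that two-photon heralding requires both photons to survive (success ~ η²) while single-photon heralding requires at least one (success ~ η); this ignores the protocol's details, detector efficiency, and the entanglement-purification step.

```python
# Toy scaling comparison (assumed model, NOT the paper's full analysis):
# two-photon heralding needs BOTH photons to reach the detectors, so its
# success rate falls quadratically with loss; single-photon heralding falls
# only linearly.
def two_photon_success(eta):
    return eta ** 2                          # both photons must survive

def one_photon_success(eta):
    return 2 * eta * (1 - eta) + eta ** 2    # at least one photon detected

eta = 1e-4                                   # survival probability, extreme loss
gain = one_photon_success(eta) / two_photon_success(eta)
print(f"~{gain:.0f}x more heralding events")
```

At η = 10⁻⁴ the single-photon scheme heralds roughly 2/η ≈ 20,000 times more often, which is the "orders of magnitude" gain the abstract refers to.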