
    Book Reviews


    Laminar-turbulent boundary-layer transition over a rough rotating disk

    Boundary-layer transition over a disk spinning under water is investigated. Transitional Reynolds numbers, Re_c, and the associated boundary-layer velocity profiles are determined from flow visualizations and hot-film measurements, respectively. The value of Re_c and the velocity profiles are studied as a function of the disk's surface roughness. Transition over rough disks is found to occur in the same fashion as over smooth disks, i.e., abruptly and axisymmetrically at well-defined radii. Wall roughness has little effect on Re_c until a threshold relative roughness is reached; above the threshold, Re_c decreases sharply. The decrease is consistent with the drop expected in this flow from the absolute instability discovered by Lingwood [J. Fluid Mech. 299, 17 (1995); 314, 373 (1996); 331, 405 (1997)]. This indicates that the Lingwood absolute instability may continue to play a major role in the transition process even at large relative roughness. (C) 2003 American Institute of Physics

    High-Resolution Continuum Imaging at 1.3 and 0.7 cm of the W3 IRS 5 Region

    High-resolution images of the hypercompact HII (HCHII) regions in W3 IRS 5, taken with the Very Large Array (VLA) at 1.3 and 0.7 cm, are presented. Four HCHII regions were detected with sufficient signal-to-noise ratios to allow the determination of relevant parameters such as source position, size, and flux density. The sources are slightly extended in our ~0.2 arcsecond beams; the deconvolved radii are less than 240 AU. A comparison of our data with VLA images taken at epoch 1989.1 shows proper motions for sources IRS 5a and IRS 5f. Between 1989.1 and 2002.5, we find a proper motion of 210 mas at a position angle of 12 deg for IRS 5f and a proper motion of 190 mas at a position angle of 50 deg for IRS 5a. At the assumed distance to W3 IRS 5, 1.83 +/- 0.14 kpc, these offsets translate to proper motions of ~135 km/s and ~122 km/s, respectively. These sources are either shock-ionized gas in an outflow or ionized gas ejected from high-mass stars. We find no change in the positions of IRS 5d1/d2 and IRS 5b, and we show through a comparison with archival NICMOS 2.2 micron images that these two radio sources coincide with the infrared double constituting W3 IRS 5. These sources contain B or perhaps O stars. The flux densities of the four sources have changed compared to the epoch 1989.1 results. In our epoch 2002.5 data, none of the spectral indices obtained from flux densities at 1.3 and 0.7 cm are consistent with optically thin free-free emission; IRS 5d1/d2 shows the largest increase in flux density from 1.3 to 0.7 cm. This may be an indication of free-free optical depth within an ionized wind, a photoevaporating disk, or an accretion flow. It is less likely that this increase is caused by dust emission at 0.7 cm. Comment: 13 pages, 3 figures. To be published in The Astrophysical Journal

    Streaming Algorithm for Euler Characteristic Curves of Multidimensional Images

    We present an efficient algorithm to compute Euler characteristic curves of grayscale images of arbitrary dimension. In various applications the Euler characteristic curve is used as a descriptor of an image. Our algorithm is the first streaming algorithm for Euler characteristic curves. The use of streaming removes the need to store the entire image in RAM. Experiments show that our implementation handles terabyte-scale images on commodity hardware. Due to lock-free parallelism, it scales well with the number of processor cores. Our software, CHUNKYEuler, is available as open source on Bitbucket. Additionally, we put the concept of the Euler characteristic curve in the wider context of computational topology. In particular, we explain the connection with persistence diagrams.
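    As a rough illustration of the underlying idea (not the streaming, lock-free implementation the abstract describes), the Euler characteristic curve of a small 2-D grayscale image can be computed by thresholding the image and counting the cells of the resulting cubical complex. The function names and the restriction to two dimensions are our own simplifications:

    ```python
    import numpy as np

    def euler_characteristic(binary):
        """Euler characteristic V - E + F of the 2-D cubical complex
        spanned by the True pixels of a boolean image."""
        v = int(binary.sum())                                  # vertices: "on" pixels
        e = int((binary[:, :-1] & binary[:, 1:]).sum()         # horizontal edges
                + (binary[:-1, :] & binary[1:, :]).sum())      # vertical edges
        f = int((binary[:-1, :-1] & binary[:-1, 1:]
                 & binary[1:, :-1] & binary[1:, 1:]).sum())    # filled unit squares
        return v - e + f

    def euler_characteristic_curve(img, thresholds):
        """Euler characteristic of each sublevel set {img <= t}."""
        return [euler_characteristic(img <= t) for t in thresholds]

    # A ring of zeros around a brighter center pixel: at threshold 0 the
    # sublevel set is an annulus (chi = 0), at threshold 1 a filled square (chi = 1).
    img = np.array([[0, 0, 0],
                    [0, 1, 0],
                    [0, 0, 0]])
    print(euler_characteristic_curve(img, [0, 1]))  # [0, 1]
    ```

    A streaming version would process the image in chunks and merge the per-chunk vertex, edge, and face counts, which is what lets the real algorithm avoid holding the whole image in RAM.
    
    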

    Using the Incremental Net Benefit Framework for Quantitative Benefit–Risk Analysis in Regulatory Decision-Making—A Case Study of Alosetron in Irritable Bowel Syndrome

    Objective: There is consensus that a more transparent, explicit, and rigorous approach to benefit–risk evaluation is required. The objective of this study is to evaluate the incremental net benefit (INB) framework for undertaking quantitative benefit–risk assessment by performing a quantitative benefit–risk analysis of alosetron for the treatment of irritable bowel syndrome from the patients’ perspective. Methods: A discrete event simulation model was developed to determine the INB of alosetron relative to placebo, calculated in “relative value-adjusted life-years” (RVALYs). Results: In the base-case analysis, alosetron resulted in a mean INB of 34.1 RVALYs per 1000 patients treated relative to placebo over 52 weeks of treatment. Incorporating parameter uncertainty into the model, probabilistic sensitivity analysis revealed a mean INB of 30.4 (95% confidence interval 15.9–45.4) RVALYs per 1000 patients treated relative to placebo over 52 weeks of treatment. Overall, there was a >99% chance that both the incremental benefit and the incremental risk associated with alosetron are greater than those with placebo. As hypothesized, the INB of alosetron was greatest in patients with the worst quality of life at baseline. The mean INB associated with alosetron in patients with mild, moderate, and severe symptoms at baseline was 17.97 (−0.55 to 36.23), 29.98 (17.05–43.37), and 35.98 (23.49–48.77) RVALYs per 1000 patients treated, respectively. Conclusions: This study demonstrates the potential utility of applying the INB framework to real-life decision-making, and the ability to use simulation modeling incorporating outcomes data from different sources as a benefit–risk decision aid.
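    The core arithmetic of the INB framework is the difference between incremental benefit and incremental risk expressed on a common scale. A minimal Monte Carlo sketch of a probabilistic sensitivity analysis in that spirit might look like the following; the distributions, means, and standard deviations are purely illustrative and are not taken from the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000  # Monte Carlo draws for the probabilistic sensitivity analysis

    # Hypothetical sampling distributions for incremental benefit and
    # incremental risk per 1000 patients, both on a common RVALY scale.
    inc_benefit = rng.normal(45.0, 6.0, n)
    inc_risk = rng.normal(14.5, 4.0, n)

    # Incremental net benefit: benefit gained minus risk incurred.
    inb = inc_benefit - inc_risk

    mean_inb = inb.mean()
    ci_low, ci_high = np.percentile(inb, [2.5, 97.5])
    p_positive = (inb > 0).mean()  # probability the net benefit favors treatment

    print(f"mean INB: {mean_inb:.1f} RVALYs per 1000 patients "
          f"(95% CI {ci_low:.1f} to {ci_high:.1f}); P(INB > 0) = {p_positive:.3f}")
    ```

    The discrete event simulation in the study plays the role of the two normal draws here, propagating parameter uncertainty through a patient-level model instead of through fixed distributions.
    
    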

    the COMIT’ID study protocol for using a Delphi process and face-to-face meetings to establish consensus

    Background: The reporting of outcomes in clinical trials of subjective tinnitus indicates that many different tinnitus-related complaints are of interest to investigators, from perceptual attributes of the sound (e.g. loudness) to psychosocial impacts (e.g. quality of life). Even when considering one type of intervention strategy for subjective tinnitus, there is no agreement about what is critically important for deciding whether a treatment is effective. The main purpose of this observational study is, therefore, to develop Core Outcome Domain Sets for three different intervention strategies (sound, psychological, and pharmacological) for adults with chronic subjective tinnitus that should be measured and reported in every clinical trial of these interventions. Secondary objectives are to identify the strengths and limitations of our study design for recruiting participants and reducing attrition, and to explore uptake of the core outcomes. Methods: The ‘Core Outcome Measures in Tinnitus: International Delphi’ (COMIT’ID) study will use a mixed-methods approach that incorporates input from health care users at the pre-Delphi stage, a modified three-round Delphi survey, and final consensus meetings (one for each intervention). The meetings will generate recommendations by stakeholder representatives on agreed Core Outcome Domain Sets specific to each intervention. A subsequent step will establish a common cross-cutting Core Outcome Domain Set by identifying the outcome domains included in all three intervention-specific Core Outcome Domain Sets. To address the secondary objectives, we will gather feedback from participants about their experience of taking part in the Delphi process. We aspire to conduct an observational cohort study to evaluate uptake of the core outcomes in published studies at 7 years following Core Outcome Set publication.
Discussion: The COMIT’ID study aims to develop a Core Outcome Domain Set that is agreed to be critically important for deciding whether a treatment for subjective tinnitus is effective. Such a recommendation would help to standardise future clinical trials worldwide, and we will determine whether participation increases use of the Core Outcome Set in the long term. Trial registration: This project was registered (November 2014) in the database of the Core Outcome Measures in Effectiveness Trials (COMET) initiative.

    The Topology of Large Scale Structure in the 1.2 Jy IRAS Redshift Survey

    We measure the topology (genus) of isodensity contour surfaces in volume-limited subsets of the 1.2 Jy IRAS redshift survey, for smoothing scales λ = 4, 7, and 12 h^-1 Mpc. At 12 h^-1 Mpc, the observed genus curve has a symmetric form similar to that predicted for a Gaussian random field. At the shorter smoothing lengths, the observed genus curve shows a modest shift in the direction of an isolated-cluster or "meatball" topology. We use mock catalogs drawn from cosmological N-body simulations to investigate the systematic biases that affect topology measurements in samples of this size and to determine the full covariance matrix of the expected random errors. We incorporate the error correlations into our evaluations of theoretical models, obtaining both frequentist assessments of absolute goodness-of-fit and Bayesian assessments of the models' relative likelihoods. We compare the observed topology of the 1.2 Jy survey to the predictions of dynamically evolved, unbiased, gravitational instability models that have Gaussian initial conditions. The model with an n = -1 power-law initial power spectrum achieves the best overall agreement with the data, though models with a low-density cold dark matter power spectrum and an n = 0 power-law spectrum are also consistent. The observed topology is inconsistent with an initially Gaussian model that has n = -2, and it is strongly inconsistent with a Voronoi foam model, which has a non-Gaussian, bubble topology. Comment: ApJ submitted, 39 pages, LaTeX (aasms4), 12 figures, 1 Table
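    For context, the Gaussian random-field benchmark against which the observed genus curve is compared has a standard closed form (Gott, Melott & Dickinson 1986). Writing the density threshold ν in units of the standard deviation of the smoothed field, the genus per unit volume is

    ```latex
    g(\nu) = A\,(1-\nu^2)\,e^{-\nu^2/2},
    \qquad
    A = \frac{1}{(2\pi)^2}\left(\frac{\langle k^2\rangle}{3}\right)^{3/2},
    ```

    where ⟨k²⟩ is the power-spectrum-weighted mean square wavenumber of the smoothed field. This curve is symmetric about ν = 0, which is why a "meatball" (cluster-dominated) topology shows up as a measurable asymmetric shift of the observed curve away from this form.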

    1934: Abilene Christian College Bible Lectures - Full Text

    INTRODUCTION The theme for the Lectures for 1934, “The New Testament Church in History,” is a very timely one and follows naturally the theme of the 1933 Lectures, “The Church We Read About in the New Testament.” There is no subject that is so vital in our work as Christians today as a proper understanding of the great spiritual kingdom of our Savior, the church which was built by Jesus Christ. It is a hard lesson to teach because all people are so dull of hearing concerning things spiritual. Just as Nicodemus marveled when the Christ told him of the spiritual kingdom so do people today wonder and marvel when they are told that there is only one great church, the spiritual kingdom of our Lord and Savior Jesus Christ, and that all the saved of earth belong to that church and that belonging to anything else profits little, and is unnecessary. Not only are numbers of denominational churches and people who have no religious affiliation ignorant of the true meaning of the church, but even those who claim to be members of the one body are lacking in understanding concerning the kingdom of Christ. It is therefore the purpose of the Abilene College Lectures last year, this year and next year to arouse a greater interest in the study and the teaching of this very vital matter. In this particular volume much valuable information is brought together on the trials and struggles of the church from its foundations to the present. The speakers have made careful preparation on their subjects and have given lessons that should prove helpful to all who desire to have a better understanding of the church. Our prayer is that these Lectures may be read by many and that they may do much good in the name of the Christ. Jas. F. Cox, President, Abilene Christian College. Nov. 6, 1934

    Topology of structure in the Sloan Digital Sky Survey: model testing

    We measure the three-dimensional topology of large-scale structure in the Sloan Digital Sky Survey (SDSS). This allows the genus statistic to be measured with unprecedented statistical accuracy. The sample size is now sufficiently large for topology to be an important tool for testing galaxy formation models. For comparison, we make mock SDSS samples using several state-of-the-art N-body simulations: the Millennium run of Springel et al. (2005) (10 billion particles), the Kim & Park (2006) CDM models (1.1 billion particles), and the Cen & Ostriker (2006) hydrodynamic code models (8.6 billion-cell hydro mesh). Each of these simulations uses a different method for modeling galaxy formation. The SDSS data show a genus curve that is broadly characteristic of that produced by Gaussian random-phase initial conditions. Thus the data strongly support the standard model of inflation, in which Gaussian random-phase initial conditions are produced by random quantum fluctuations in the early universe. On top of this general shape, however, there are measurable differences produced by non-linear gravitational effects (cf. Matsubara 1994) and by biasing connected with galaxy formation. The N-body simulations have been tuned to reproduce the power spectrum and multiplicity function but not the topology, so topology is an acid test for these models. The data show a "meatball" shift (only partly due to the Sloan Great Wall of Galaxies; the shift also appears in a sub-sample not containing the Wall) which differs at the 2.5σ level from the results of the Millennium run and the Kim & Park dark halo models, even including the effects of cosmic variance. Comment: 13 ApJ pages, 7 figures. High-resolution stereo graphic available at http://www.astro.princeton.edu/~dclayh/stereo50.ep

    Curvature of the Universe and Observed Gravitational Lens Image Separations Versus Redshift

    In a flat, k=0 cosmology with galaxies that approximate singular isothermal spheres, gravitational lens image separations should be uncorrelated with source redshift. But in an open, k=-1 cosmology, such gravitational lens image separations become smaller with increasing source redshift. The observed separations do become smaller with increasing source redshift, but the effect is even stronger than that expected in an Omega=0 cosmology. The observations are thus not compatible with the "standard" gravitational lensing statistics model in a flat universe. We try various open and flat cosmologies, galaxy mass profiles, galaxy merging and evolution models, and lensing aided by clusters to explain the correlation. We find that the data are not compatible with any of these possibilities within the 95% confidence limit, leaving us with a puzzle. If we regard the observed result as a statistical fluke, it is worth noting that we are about twice as likely to observe it in an open universe (with 0<Omega<0.4) as in a flat one. Finally, the existence of an observed multiple-image lens system with a source at z=4.5 places a lower limit on the deceleration parameter: q_0 > -2.0. Comment: 21 pages, 4 figures, AASTeX