
    A comparison of Bayesian and Fourier methods for frequency determination in asteroseismology

    Bayesian methods are becoming more widely used in asteroseismic analysis. In particular, they are being used to determine oscillation frequencies, which are also commonly found by Fourier analysis. It is important to establish whether the Bayesian methods provide an improvement on Fourier methods. We compare, using simulated data, the standard iterative sine-wave fitting method against a Markov Chain Monte Carlo (MCMC) code that has been introduced to infer purely the frequencies of oscillation modes (Brewer et al. 2007). A uniform prior probability distribution function is used for the MCMC method. We find the methods do equally well at determining the correct oscillation frequencies, although the Bayesian method is able to highlight the possibility of a misidentification due to aliasing, which can be useful. In general, we suggest that the least computationally intensive method is preferable. Comment: 11 pages, 8 figures, accepted for publication in Communications in Asteroseismology.
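
    For context, a minimal sketch of the iterative sine-wave fitting ("prewhitening") approach that serves as the Fourier baseline here: find the strongest peak in the amplitude spectrum, fit and subtract a sinusoid at that frequency, and repeat on the residuals. The function name, frequency grid, and simulated signal below are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def extract_frequencies(t, y, n_modes, freq_grid):
        """Iteratively find the strongest sinusoid, fit it by least
        squares, subtract it, and repeat on the residuals."""
        residual = y.copy()
        found = []
        for _ in range(n_modes):
            # Discrete Fourier amplitude spectrum on an arbitrary grid
            # (handles uneven sampling, unlike a plain FFT).
            ft = np.array([np.abs(np.sum(residual * np.exp(-2j * np.pi * f * t)))
                           for f in freq_grid])
            f_best = freq_grid[np.argmax(ft)]
            # Least-squares fit of A*sin + B*cos at the peak frequency.
            X = np.column_stack([np.sin(2 * np.pi * f_best * t),
                                 np.cos(2 * np.pi * f_best * t)])
            coeffs, *_ = np.linalg.lstsq(X, residual, rcond=None)
            residual = residual - X @ coeffs
            found.append(f_best)
        return found

    # Example: two oscillation modes in noisy, unevenly sampled data.
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 30, 600))            # days
    y = (np.sin(2 * np.pi * 1.7 * t) +
         0.6 * np.sin(2 * np.pi * 2.3 * t) +
         0.3 * rng.standard_normal(t.size))
    print(extract_frequencies(t, y, 2, np.linspace(0.5, 5, 4000)))
    ```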

    Optimizing Glass Design: The Role of Computational Wind Engineering & Advanced Numerical Analysis

    Wind-induced pressure is a major consideration in glazing design. However, the effects of façade geometry and urban terrain on wind loading are often difficult to quantify without costly and time-consuming wind tunnel testing. Accurate 3-dimensional data, covering most major cities, is becoming increasingly accessible, and such models are ideal to support numerical modelling of environmental effects on the built environment, especially if such modelling attempts to capture the geometric effects of the cityscape. A new methodology to assess the effects of wind loads on the structural strength of glass using transient, geometrically non-linear analyses and improved glass failure prediction models is presented. A description is provided for both the calculation of wind-induced façade loads and the development and employment of a finite element (FE) solver to model façade performance.
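
    As background for the loading side, the usual hand-calculation starting point is the quasi-steady relation: peak velocity pressure q_p = ½ρv² scaled by a pressure coefficient c_p. The sketch below uses placeholder values and is not the transient CFD/FE workflow this work develops.

    ```python
    # Illustrative quasi-steady wind pressure on a facade panel; values
    # are placeholders, not from the study.
    rho_air = 1.25           # kg/m^3, air density
    v_peak = 38.0            # m/s, assumed peak gust speed at panel height
    c_p = -1.4               # assumed external pressure coefficient (suction)

    q_p = 0.5 * rho_air * v_peak**2          # peak velocity pressure, Pa
    p_net = q_p * c_p                        # net panel pressure, Pa
    print(f"q_p = {q_p:.0f} Pa, panel load = {p_net:.0f} Pa")
    ```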

    Coastal risk adaptation: the potential role of accessible geospatial Big Data

    Increasing numbers of people are living in and using coastal areas. Combined with the presence of pervasive coastal threats, such as flooding and erosion, this is having widespread impacts on coastal populations, infrastructure and ecosystems. For the right adaptive strategies to be adopted, and planning decisions to be made, rigorous evaluation of the available options is required. This evaluation hinges on the availability and use of suitable datasets. For knowledge to be derived from coastal datasets, such data need to be combined and analysed in an effective manner. This paper reviews a wide range of literature relating to data-driven approaches to coastal risk evaluation, revealing how limitations have been imposed on many of these methods due to restrictions in computing power and access to data. The rapidly emerging field of ‘Big Data’ can help overcome many of these hurdles. ‘Big Data’ involves powerful computer infrastructures, enabling storage, processing and real-time analysis of large volumes and varieties of data in a fast and reliable manner. Through consideration of examples of how ‘Big Data’ technologies are being applied to fields related to coastal risk, it becomes apparent that geospatial Big Data solutions hold clear potential to improve the process of risk-based decision making on the coast. ‘Big Data’ does not provide a stand-alone solution to the issues and gaps outlined in this paper, yet these technological methods hold the potential to optimise data-driven approaches, enabling robust risk profiles to be generated for coastal regions.

    The Physical Parameters of the Retired A Star HD185351

    We report here an analysis of the physical stellar parameters of the giant star HD185351 using Kepler short-cadence photometry, optical and near infrared interferometry from CHARA, and high-resolution spectroscopy. Asteroseismic oscillations detected in the Kepler short-cadence photometry, combined with an effective temperature calculated from the interferometric angular diameter and bolometric flux, yield a mean density rho_star = 0.0130 +- 0.0003 rho_sun and surface gravity log g = 3.280 +- 0.011. Combining the gravity and density we find Rstar = 5.35 +- 0.20 Rsun and Mstar = 1.99 +- 0.23 Msun. The trigonometric parallax and CHARA angular diameter give a radius Rstar = 4.97 +- 0.07 Rsun. This smaller radius, when combined with the mean stellar density, corresponds to a stellar mass Mstar = 1.60 +- 0.08 Msun, which is smaller than the asteroseismic mass by 1.6-sigma. We find that a larger mass is supported by the observation of mixed modes in our high-precision photometry, the spacing of which is consistent only with Mstar =~ 1.8 Msun. Our various and independent mass measurements can be compared to the mass measured from interpolating the spectroscopic parameters onto stellar evolution models, which yields a model-based mass Mstar = 1.87 +- 0.07 Msun. This mass agrees well with the asteroseismic value, but is 2.6-sigma higher than the mass from the combination of asteroseismology and interferometry. The discrepancy motivates future studies with a larger sample of giant stars. However, all of our mass measurements are consistent with HD185351 having a mass in excess of 1.5 Msun. Comment: ApJ accepted.
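
    The radius and mass quoted from density plus gravity follow from elementary relations: rho = M/((4/3) pi R^3) and g = GM/R^2 give R = 3g/(4 pi G rho). A short sketch reproducing the quoted numbers, with assumed solar constants in cgs:

    ```python
    import math

    G = 6.674e-8                    # cm^3 g^-1 s^-2
    rho_sun = 1.408                 # g cm^-3, assumed solar mean density
    R_sun, M_sun = 6.957e10, 1.989e33

    rho = 0.0130 * rho_sun          # asteroseismic mean density from the paper
    g = 10 ** 3.280                 # surface gravity, cm s^-2

    R = 3 * g / (4 * math.pi * G * rho)   # from rho = M/(4/3 pi R^3), g = GM/R^2
    M = g * R**2 / G
    print(f"R = {R / R_sun:.2f} Rsun, M = {M / M_sun:.2f} Msun")
    # -> roughly R = 5.36 Rsun and M = 1.99 Msun, matching the quoted values
    ```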

    Image segmentation for improved consistency in image-interpretation of opium poppy

    The image-interpretation of opium poppy crops from very high resolution satellite imagery forms part of the annual Afghanistan opium surveys conducted by the United Nations Office on Drugs and Crime and the United States Government. We tested the effect of generalization of field delineations on the final estimates of poppy cultivation using survey data from Helmand province in 2009 and an area frame sampling approach. The sample data were reinterpreted from pan-sharpened IKONOS scenes using two increasing levels of generalization consistent with observed practice. Samples were also generated from manual labelling of image segmentation and from a digital object classification. Generalization was found to bias the cultivation estimate by between 6.6% and 13.9%, which is greater than the sample error for the highest level. Object classification of image-segmented samples increased the cultivation estimate by 30.2% because of systematic labelling error. Manual labelling of image-segmented samples gave a similar estimate to the original interpretation. The research demonstrates that small changes in poppy interpretation can result in systematic differences in final estimates that are not captured within confidence intervals. Segmented parcels were similar to manually digitized fields and could provide increased consistency in field delineation at a reduced cost. The results are significant for Afghanistan’s opium monitoring programmes and other surveys where sample data are collected by remote sensing.
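
    For illustration, a minimal sketch of a direct-expansion area-frame estimate of the kind such interpreted samples feed into; the frame count and sample values below are invented, not survey data.

    ```python
    import numpy as np

    n_frames_total = 4000            # equal-area frames covering the region (assumed)
    # Interpreted poppy area (km^2) within each sampled frame (invented values):
    poppy_km2 = np.array([1.2, 0.0, 3.4, 0.8, 2.1, 0.0, 1.9, 2.7])

    mean_per_frame = poppy_km2.mean()
    estimate = n_frames_total * mean_per_frame
    se = n_frames_total * poppy_km2.std(ddof=1) / np.sqrt(poppy_km2.size)
    print(f"cultivation estimate: {estimate:.0f} +/- {se:.0f} km^2")
    # A systematic labelling bias shifts every sample value the same way,
    # moving `estimate` without widening the confidence interval -- the
    # effect the paper warns about.
    ```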

    The application of data innovations to geomorphological impact analyses in coastal areas: An East Anglia, UK, case study

    Rapidly advancing surveying technologies, capable of generating high resolution bathymetric and topographic data, allow precise measurements of geomorphological change and deformation. This permits great accuracy in the characterisation of volumetric change, sediment and debris flows, accumulations and erosion rates. However, such data are often utilised inadequately by coastal practitioners in their assessments of coastal change, due to a lack of awareness of the appropriate analytical techniques and the potential benefits offered by such data-driven approaches. This was found to be the case for the region of East Anglia, UK, which was analysed in this study. This paper evaluates the application of innovative geomorphological change detection (GCD) techniques for analysis of coastal change. The first half of the paper contains an extensive review of GCD methods and data sources used in previous studies. This leads to the selection and recommendation of an appropriate methodology for calculation of volumetric GCD, which has subsequently been applied and evaluated for 14 case study sites in East Anglia. This involved combining open source point cloud datasets across broad spatial scales, covering an extended temporal period. The results comprise quantitative estimates of volumetric change for selected locations, allowing estimation of the sediment budget for each stretch of coastline and revealing fluctuations in rates of change. These quantitative results were combined with qualitative outputs, such as visual representations of change, and we reveal how combining these methods assists the identification of patterns and impacts linked to specific events. The study demonstrates how high-resolution point cloud data, which are now readily available, can be used to better inform coastal management practices, revealing trends, impacts and vulnerability in dynamic coastal regions. The results also indicate heterogeneous impacts of events, such as the 2013 East Coast Storm Surge, across the study area of East Anglia.
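
    A minimal sketch of the volumetric GCD calculation described here, using the common DEM-of-difference approach with a level-of-detection threshold; grids, uncertainties and cell size are illustrative, not from the study.

    ```python
    import numpy as np

    cell_area = 1.0                      # m^2 per grid cell (assumed)
    sigma_old, sigma_new = 0.10, 0.10    # survey uncertainties, m (assumed)

    # Stand-in gridded surveys; in practice these come from point clouds.
    rng = np.random.default_rng(1)
    dem_old = rng.normal(5.0, 0.5, (200, 200))
    dem_new = dem_old + rng.normal(-0.05, 0.2, (200, 200))

    dod = dem_new - dem_old                          # DEM of difference, m
    lod = 1.96 * np.hypot(sigma_old, sigma_new)      # 95% level of detection
    significant = np.abs(dod) > lod                  # mask sub-threshold change

    erosion = dod[significant & (dod < 0)].sum() * cell_area
    deposition = dod[significant & (dod > 0)].sum() * cell_area
    print(f"erosion {erosion:.0f} m^3, deposition {deposition:.0f} m^3, "
          f"net {erosion + deposition:.0f} m^3")
    ```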

    Benchmarking Substellar Evolutionary Models Using New Age Estimates for HD 4747 B and HD 19467 B

    Constraining substellar evolutionary models (SSEMs) is particularly difficult due to a degeneracy between the mass, age, and luminosity of a brown dwarf. In cases where a brown dwarf is found as a directly imaged companion to a star, as in HD 4747 and HD 19467, the mass, age, and luminosity of the brown dwarf are determined independently, making them ideal objects to use to benchmark SSEMs. Using the Center for High Angular Resolution Astronomy Array, we measured the angular diameters and calculated the radii of the host stars HD 4747 A and HD 19467 A. After fitting their parameters to the Dartmouth Stellar Evolution Database, MESA Isochrones and Stellar Tracks, and Yonsei-Yale isochronal models, we adopt age estimates of $10.74^{+6.75}_{-6.87}$ Gyr for HD 4747 A and $10.06^{+1.16}_{-0.82}$ Gyr for HD 19467 A. Assuming the brown dwarf companions HD 4747 B and HD 19467 B have the same ages as their host stars, we show that many of the SSEMs under-predict bolometric luminosities by $\sim 0.75$ dex for HD 4747 B and $\sim 0.5$ dex for HD 19467 B. The discrepancies in luminosity correspond to over-predictions of the masses by $\sim 12\%$ for HD 4747 B and $\sim 30\%$ for HD 19467 B. We also show that SSEMs that take into account the effect of clouds reduce the under-prediction of luminosity to $\sim 0.6$ dex and the over-prediction of mass to $\sim 8\%$ for HD 4747 B, an L/T transition object that is cool enough to begin forming clouds. One possible explanation for the remaining discrepancies is missing physics in the models, such as the inclusion of metallicity effects. Comment: 12 pages, 6 figures, 5 tables, accepted to ApJ.
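
    Since the discrepancies are quoted in dex (base-10 logarithmic units), a quick conversion shows how large they are as linear factors:

    ```python
    # Convert the quoted dex offsets to linear luminosity factors.
    for dex in (0.75, 0.6, 0.5):
        print(f"{dex} dex -> factor of {10**dex:.1f} in luminosity")
    # 0.75 dex is a factor of ~5.6; 0.5 dex is a factor of ~3.2
    ```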

    Revisiting experimental methods for studies of acidity-dependent ocean sound absorption

    The definitive version was published in Journal of the Acoustical Society of America 125 (2009): 1971-1981, doi:10.1121/1.3089591. The practical usefulness of long-range acoustic measurements of ocean acidity-linked sound absorption is analyzed. There are two applications: determining spatially-averaged pH via absorption measurement, and verifying absorption effects in an area of known pH. The method is a differential-attenuation technique, with the difference taken across frequency. Measurement performance versus mean frequency and range is examined. It is found that frequencies below 500 Hz are optimal. These are lower than the frequency where the measurement would be most sensitive in the absence of noise and signal fluctuation (scintillation). However, attenuation serves to reduce signal-to-noise ratio with increasing distance and frequency, improving performance potential at lower frequencies. Use of low frequency allows longer paths to be used, with potentially better spatial averaging. Averaging intervals required for detection of fluctuations or trends with the required precision are computed.
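
    A rough sketch of the differential-attenuation idea: compare losses at two frequencies over the same path, so that frequency-independent spreading losses cancel and the pH-sensitive boric-acid absorption difference remains. The absorption term below follows the Francois-Garrison form, but treat its coefficients and the chosen values as illustrative assumptions rather than the paper's model.

    ```python
    import math

    def alpha_boric(f_khz, pH, T=10.0, S=35.0, c=1500.0):
        """Boric-acid contribution to seawater absorption, dB/km
        (Francois-Garrison-style form; coefficients illustrative)."""
        f1 = 2.8 * math.sqrt(S / 35.0) * 10 ** (4.0 - 1245.0 / (273.0 + T))
        A1 = (8.86 / c) * 10 ** (0.78 * pH - 5.0)
        return A1 * f1 * f_khz**2 / (f1**2 + f_khz**2)

    range_km = 1000.0
    f_lo, f_hi = 0.25, 0.45          # kHz, within the sub-500 Hz band favoured above
    for pH in (7.9, 8.0, 8.1):
        d_alpha = alpha_boric(f_hi, pH) - alpha_boric(f_lo, pH)
        print(f"pH {pH}: differential loss over {range_km:.0f} km = "
              f"{d_alpha * range_km:.1f} dB")
    ```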

    The History of Flow Chemistry at Eli Lilly and Company

    Flow chemistry was initially used in the development laboratories to speed early-phase material delivery, scaling up chemical transformations that we would not or could not scale up in batch for safety reasons. Some early examples included a Newman-Kwart rearrangement, a Claisen rearrangement, hydroformylation, and a thermal imidazole cyclization. Next, flow chemistry was used to enable safe scale-up of hazardous chemistries in manufacturing plants. Examples included high-pressure hydrogenation, aerobic oxidation, and Grignard formation reactions. More recently, flow chemistry has been used in Small Volume Continuous (SVC) processes, in which highly potent oncolytic molecules are produced by fully continuous processes at about 10 kg/day, including reaction, extraction, distillation, and crystallization, using disposable equipment contained in fume hoods.