
    Turbulent Wing-Leading-Edge Correlation Assessment for the Shuttle Orbiter

    This study was conducted in support of the Orbiter damage assessment activity that has taken place for each Shuttle mission since STS-107 (STS: Space Transportation System). As part of this activity, the state of the boundary layer (laminar or turbulent) during reentry must be estimated in order to define the aerothermal environment on the Orbiter. Premature turbulence on the wing leading edge (WLE) is possible if a surface irregularity promotes early transition and the resulting turbulent wedge flow contaminates the WLE flow. The objective of this analysis is to develop a criterion for determining if and when the flow along the WLE experiences turbulent heating, given an incoming turbulent boundary layer that contaminates the attachment line. The data analyzed were all obtained as part of the MH-13 Space Shuttle Orbiter Aerothermodynamic Test, conducted on a 1.8%-scale Orbiter model at the Calspan-University at Buffalo Research Center in the Large Energy National Shock Tunnels facility. A rational framework was used to develop a means of assessing the state of the WLE flow on the Orbiter during reentry given a contaminated attachment-line flow. Evidence of turbulent flow on the WLE has recently been documented for a few STS missions in the Orbiter's flight history, albeit late in the reentry trajectory. The criterion developed herein will be compared to these flight results.

    The Remarkably Featureless High Resolution X-ray Spectrum of Mrk 478

    An observation of Mrk 478 using the Chandra Low Energy Transmission Grating Spectrometer is presented. The source exhibited 30-40% flux variations on timescales of order 10,000 s, together with a slow decline in spectral softness over the full 80 ks observation. The 0.15-3.0 keV spectrum is well fitted by a single power law with photon index Gamma = 2.91 +/- 0.03. Combined with high-energy data from BeppoSAX, the spectrum from 0.15 to 10 keV is well fit by the sum of two power laws: one with Gamma = 3.03 +/- 0.04, which dominates below 2 keV, and one with Gamma = 1.4 +/- 0.2, which dominates above 2 keV (quoting 90% confidence uncertainties). No significant emission or absorption features are detected in the high resolution spectrum, supporting our previous findings using the Extreme Ultraviolet Explorer but contradicting the claims of emission lines by Hwang & Bowyer (1997). There is no evidence of a warm absorber of the kind found in the high resolution spectra of many Sy 1 galaxies, including others that, like Mrk 478, are classified as narrow-line Sy 1 galaxies. We suggest that the X-ray continuum may result from Comptonization of disk thermal emission in a hot corona through a range of optical depths. (Comment: 21 pages, 7 figures; accepted for publication in the Astronomical Journal.)
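
    Written out, the two-component fit quoted above corresponds to a photon spectrum of the form below (a sketch of the standard power-law parameterisation; the normalisations A_1 and A_2 are placeholder symbols, not values quoted in the abstract):

```latex
N(E) = A_1\,E^{-\Gamma_1} + A_2\,E^{-\Gamma_2},
\qquad \Gamma_1 = 3.03 \pm 0.04 \quad (\text{dominates below 2 keV}),
\qquad \Gamma_2 = 1.4 \pm 0.2 \quad (\text{dominates above 2 keV}),
```

    where N(E) is the photon flux density (photons cm^-2 s^-1 keV^-1); the steep soft component crosses over to the flat hard component near 2 keV.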

    Intensive HST, RXTE and ASCA Monitoring of NGC 3516: Evidence Against Thermal Reprocessing

    During 1998 April 13-16, NGC 3516 was monitored almost continuously with HST for 10.3 hr in the UV and 2.8 d in the optical, with simultaneous RXTE and ASCA monitoring covering the same period. The X-rays were strongly variable, with the soft band (0.5-2 keV) showing stronger variations (~65% peak-to-peak) than the hard band (2-10 keV; ~50% peak-to-peak). The optical continuum showed much smaller but highly significant variations: a slow ~2.5% rise followed by a faster ~3.5% decline. The short UV observation did not show significant variability. The soft and hard X-ray light curves were strongly correlated with no significant lag. Likewise, the optical continuum bands (3590 and 5510 A) were strongly correlated with no measurable lag above limits of <0.15 d. However, no significant correlation or simple relationship could be found between the optical and X-ray light curves. These results appear difficult to reconcile with previous reports of correlations between X-ray and optical variations, and of measurable lags within the optical band, for some other Seyfert 1s. They also present serious problems for "reprocessing" models in which the X-ray source heats a stratified accretion disk that then re-emits in the optical/ultraviolet: the synchronous variations within the optical would suggest that the emitting region is <0.3 lt-d across, while the lack of correlation between the X-ray and optical variations would indicate, in the context of this model, that any reprocessing region must be >1 lt-d in size. It may be possible to resolve this conflict by invoking anisotropic emission or special geometry, but the most natural explanation appears to be that the bulk of the optical luminosity is generated by some mechanism other than reprocessing. (Comment: 23 pages including 6 figures, accepted for publication in ApJ.)
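
    The size limits above follow from a simple light-travel-time argument: a region that varies synchronously to within a lag limit tau_lag can span at most about two light-crossing lengths (a sketch, assuming the <0.15 d lag limit applies across the full diameter of the optical emitting region):

```latex
D \lesssim 2\,c\,\tau_{\mathrm{lag}} = 2\,c \times (0.15\ \mathrm{d}) = 0.3\ \mathrm{light\ days}.
```

    Conversely, the >1 lt-d lower limit on any reprocessing region follows from the absence of an X-ray/optical correlation over the multi-day campaign.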

    Towards More Precise Survey Photometry for PanSTARRS and LSST: Measuring Directly the Optical Transmission Spectrum of the Atmosphere

    Motivated by the recognition that variation in the optical transmission of the atmosphere is probably the main limitation on the precision of ground-based CCD measurements of celestial fluxes, we review the physical processes that attenuate the passage of light through the Earth's atmosphere. The next generation of astronomical surveys, such as PanSTARRS and LSST, will benefit greatly from dedicated apparatus that obtains atmospheric transmission data associated with each survey image. We review and compare various approaches to this measurement problem, including photometry, spectroscopy, and LIDAR. In conjunction with careful measurements of instrumental throughput, atmospheric transmission measurements should allow next-generation imaging surveys to produce photometry of unprecedented precision. Our primary concerns are the real-time determination of aerosol scattering and of absorption by water along the line of sight, both of which can vary over the course of a night's observations. (Comment: 41 pages, 14 figures. Accepted to PASP.)
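
    To see why per-image transmission data matter, recall the standard airmass-extinction relation used to calibrate ground-based photometry (the textbook form, not notation taken from this paper):

```latex
m_{\mathrm{obs}}(\lambda) = m_0(\lambda) + k(\lambda)\,X,
```

    where m_0 is the magnitude above the atmosphere, X is the airmass, and k(lambda) bundles Rayleigh scattering, aerosol scattering, and molecular absorption (notably by water vapour). The Rayleigh term is stable, but the aerosol and water terms can drift during a night, so measuring k(lambda) per exposure rather than per night directly tightens the achievable photometric precision.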

    Environmental Sensor Placement with Convolutional Gaussian Neural Processes

    Environmental sensors are crucial for monitoring weather conditions and the impacts of climate change. However, it is challenging to maximise measurement informativeness and place sensors efficiently, particularly in remote regions like Antarctica. Probabilistic machine learning models can evaluate placement informativeness by predicting the uncertainty reduction provided by a new sensor. Gaussian process (GP) models are widely used for this purpose, but they struggle to capture complex non-stationary behaviour and to scale to large datasets. This paper proposes using a convolutional Gaussian neural process (ConvGNP) to address these issues. A ConvGNP uses neural networks to parameterise a joint Gaussian distribution at arbitrary target locations, enabling flexibility and scalability. Using simulated surface air temperature anomaly over Antarctica as ground truth, the ConvGNP learns spatial and seasonal non-stationarities, outperforming a non-stationary GP baseline. In a simulated sensor placement experiment, the ConvGNP better predicts the performance boost obtained from new observations than GP baselines do, leading to more informative sensor placements. We contrast our approach with physics-based sensor placement methods and propose future work towards an operational sensor placement recommendation system. This system could help to realise environmental digital twins that actively direct measurement sampling to improve the digital representation of reality. (Comment: In review for the Climate Informatics 2023 special issue of Environmental Data Science.)
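
    To make the placement criterion concrete, here is a minimal numerical sketch of greedy sensor placement by predicted variance reduction. A plain RBF-covariance Gaussian stands in for the model's joint output distribution, and the function names and toy 1-D grid are hypothetical, not the paper's ConvGNP code:

```python
import numpy as np

def rbf_cov(x, y, lengthscale=0.2, variance=1.0):
    """Squared-exponential covariance between two 1-D location sets."""
    d = x[:, None] - y[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def greedy_placements(targets, candidates, n_sensors=4, noise=1e-4):
    """Greedily pick sites that maximise the total reduction in predictive
    variance at the target locations, conditioning the joint Gaussian on
    each new observation via an exact rank-1 update."""
    K_tc = rbf_cov(targets, candidates)           # target/candidate cross-cov
    K_cc = rbf_cov(candidates, candidates) + noise * np.eye(len(candidates))
    chosen = []
    for _ in range(n_sensors):
        diag = np.diag(K_cc).copy()
        diag[chosen] = np.inf                     # never reuse a site
        # Summed variance reduction each candidate observation would give.
        scores = (K_tc ** 2).sum(axis=0) / diag
        b = int(np.argmax(scores))
        chosen.append(b)
        # Condition both covariance blocks on the new observation.
        K_tc -= np.outer(K_tc[:, b], K_cc[b, :]) / K_cc[b, b]
        K_cc -= np.outer(K_cc[:, b], K_cc[b, :]) / K_cc[b, b]
    return chosen

targets = np.linspace(0.0, 1.0, 200)     # locations where predictions matter
candidates = np.linspace(0.0, 1.0, 25)   # possible sensor sites
print(greedy_placements(targets, candidates))
```

    The loop is model-agnostic: if the covariance blocks come from a ConvGNP rather than a fixed kernel, the same greedy criterion applies unchanged.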

    Multitrait genetic association analysis identifies 50 new risk loci for gastro-oesophageal reflux, seven new loci for Barrett’s oesophagus and provides insights into clinical heterogeneity in reflux diagnosis

    Objective: Gastro-oesophageal reflux disease (GERD) has a heterogeneous aetiology, primarily attributable to its symptom-based definitions. GERD genome-wide association studies (GWASs) have shown strong genetic overlaps with established risk factors such as obesity and depression. We hypothesised that the shared genetic architecture between GERD and these risk factors can be leveraged to (1) identify new GERD and Barrett's oesophagus (BE) risk loci and (2) explore potentially heterogeneous pathways leading to GERD and oesophageal complications. Design: We applied multitrait GWAS models combining GERD (78 707 cases; 288 734 controls) and genetically correlated traits including educational attainment, depression and body mass index. We also used multitrait analysis to identify BE risk loci. Top hits were replicated in 23andMe (462 753 GERD cases, 24 099 BE cases, 1 484 025 controls). We additionally dissected the GERD loci into obesity-driven and depression-driven subgroups. These subgroups were investigated to determine how they relate to tissue-specific gene expression and to the risk of serious oesophageal disease (BE and/or oesophageal adenocarcinoma, EA). Results: We identified 88 loci associated with GERD, 59 of which replicated in 23andMe after multiple-testing correction. Our BE analysis identified seven novel loci. Additionally, we showed that only the obesity-driven GERD loci (but not the depression-driven loci) were associated with genes enriched in oesophageal tissues and successfully predicted BE/EA. Conclusion: Our multitrait model identified many novel risk loci for GERD and BE. We present strong evidence for a genetic underpinning of disease heterogeneity in GERD and show that GERD loci associated with depressive symptoms are not strong predictors of BE/EA relative to obesity-driven GERD loci.
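
    As a toy illustration of why combining genetically correlated traits boosts discovery power (in the general spirit of multitrait methods such as MTAG, not the authors' exact model), one can form a generalised-least-squares combination of per-SNP Z-scores; all numbers below are invented for illustration:

```python
import numpy as np
from scipy import stats

def multitrait_z(z, R):
    """GLS combination of per-SNP Z-scores across correlated traits.
    z: (n_traits,) association Z-scores for one SNP.
    R: (n_traits, n_traits) correlation matrix of the trait statistics.
    Returns a single Z-score that is standard normal under the null."""
    w = np.linalg.solve(R, np.ones_like(z))   # GLS weights: R^{-1} 1
    return w @ z / np.sqrt(w @ R @ w)

# Invented example: a SNP modestly associated with GERD, BMI and depression.
R = np.array([[1.0, 0.3, 0.2],
              [0.3, 1.0, 0.1],
              [0.2, 0.1, 1.0]])
z = np.array([3.1, 2.4, 2.0])
z_mt = multitrait_z(z, R)
print(f"multitrait Z = {z_mt:.2f}, p = {2 * stats.norm.sf(abs(z_mt)):.1e}")
# The combined statistic exceeds every single-trait Z, which is how
# borrowing strength across traits surfaces loci missed by a single GWAS.
```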

    The longer-term effects of access to HIV self-tests on HIV testing frequency in high-risk gay and bisexual men: follow-up data from a randomised controlled trial

    Background: A wait-list randomised controlled trial in Australia (FORTH) in high-risk gay and bisexual men (GBM) showed that access to free HIV self-tests (HIVSTs) doubled the frequency of HIV testing in year 1, reaching the guideline-recommended level of four tests per year, compared with two tests per year in the standard-care arm (facility-based testing). In year 2, men in both arms had access to HIVSTs. We assessed whether the effect was maintained for a further 12 months. Methods: Participants included GBM reporting condomless anal intercourse or >5 male partners in the past 3 months. We included men who had completed at least one survey in both year 1 and year 2 and calculated the mean tests per person, based on validated self-report and clinic records. We used Poisson regression and random-effects Poisson regression models to compare overall testing frequency by study arm, year and testing modality (HIVST/facility-based test). Findings: Overall, 362 men completed at least one survey in year 1 and 343 in year 2. Among men in the intervention arm (access to HIVSTs in both years), the mean number of HIV tests in year 2 (3.7 overall; 2.3 facility-based tests, 1.4 HIVSTs) was lower than in year 1 (4.1 overall; 1.7 facility-based tests, 2.4 HIVSTs) (RR: 0.84, 95% CI: 0.75-0.95, p=0.002), but higher than in the standard-care arm in year 1 (2.0 overall; RR: 1.71, 95% CI: 1.48-1.97, p<0.001). Findings did not differ when stratified by sociodemographic characteristics or recent high-risk sexual history. Interpretation: In year 2, fewer HIVSTs were used on average than in year 1, but access to free HIVSTs enabled more men to maintain a higher HIV testing frequency, compared with facility-based testing only. HIV self-testing should be a key component of HIV testing and prevention strategies. Funding: This work was supported by grant 568971 from the National Health and Medical Research Council of Australia.
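
    For readers unfamiliar with the method, a rate comparison of this kind can be sketched as a Poisson regression of per-person test counts with follow-up time as an exposure offset; the data frame, column names and simulated counts below are hypothetical, not the trial's dataset or code:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical per-person-year records: HIV test counts by arm and year.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "arm": rng.choice(["standard", "hivst"], size=n),
    "year": rng.choice([1, 2], size=n),
    "followup_yrs": rng.uniform(0.8, 1.0, size=n),
})
# Simulate a higher testing rate in the HIVST arm (illustrative only).
lam = np.where(df["arm"] == "hivst", 4.0, 2.0) * df["followup_yrs"]
df["n_tests"] = rng.poisson(lam)

# Poisson regression of testing frequency on arm and year; exponentiated
# coefficients are rate ratios (RRs) like those reported in the abstract.
model = smf.glm(
    "n_tests ~ C(arm) + C(year)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["followup_yrs"]),
).fit()
print(np.exp(model.params))       # RRs
print(np.exp(model.conf_int()))   # 95% CIs for the RRs
```

    The trial's random-effects variant additionally accounts for repeated surveys from the same participant; the fixed-effects sketch above shows only the core rate-ratio comparison.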

    Modelling seawater carbonate chemistry in shellfish aquaculture regions: Insights into CO2 release associated with shell formation and growth

    Mollusc aquaculture is a high-value industry that is increasing production rapidly in Europe and across the globe. In recent years, there has been discussion of the potential wide-ranging environmental benefits of this form of food production. One aspect of mollusc aquaculture that has received scrutiny is the production of calcareous shells (CaCO3). Mollusc shell growth has sometimes been described as a sink for atmospheric CO2, as it locks away carbon in solid mineral form. However, more rigorous carbonate chemistry modelling, including concurrent changes in seawater pCO2, pH, dissolved inorganic carbon, and total alkalinity, shows that calcification is a net CO2 source to the atmosphere. Combined with discussions about whether mollusc respiration should be included in carbon footprint modelling, this suggests that greater in-depth understanding is required before shellfish aquaculture can be included in carbon trading schemes and footprint calculations. Here, we show that regional differences in the marine carbonate system can alter the amount of CO2 released per unit CaCO3 formation. Our carbonate chemistry modelling shows that a coastal mussel farm in southern Portugal releases up to ~0.290 g of CO2 per g of CaCO3 shell formed. In comparison, an identical farm in the coastal Baltic Sea would produce up to 33% more CO2 per g of CaCO3 (~0.385 g-CO2·(g-CaCO3)−1). This spatial variability should therefore also be considered if mollusc aquaculture is to be included in future carbon trading schemes, and in planning future expansion of production across the industry
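
    The chemistry behind these numbers can be written compactly: calcification consumes two bicarbonate ions per carbonate unit precipitated, and the buffer state of the local seawater sets the fraction psi of the precipitated carbon that is degassed as CO2 (the reaction is the standard textbook form; the psi values below are back-calculated from the abstract's figures and molar masses, not quoted from the paper):

```latex
\mathrm{Ca^{2+}} + 2\,\mathrm{HCO_3^-} \longrightarrow \mathrm{CaCO_3} + \mathrm{CO_2} + \mathrm{H_2O},
\qquad
\frac{m_{\mathrm{CO_2}}}{m_{\mathrm{CaCO_3}}}
  = \psi\,\frac{M_{\mathrm{CO_2}}}{M_{\mathrm{CaCO_3}}}
  = \psi \times \frac{44.01}{100.09} \approx 0.440\,\psi.
```

    Inverting this relation, ~0.290 g-CO2·(g-CaCO3)−1 off southern Portugal implies psi ≈ 0.66, while ~0.385 g-CO2·(g-CaCO3)−1 in the Baltic implies psi ≈ 0.88, consistent with the ~33% difference reported above.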
