    Orbiter Return-To-Flight Entry Aeroheating

    The Columbia accident on February 1, 2003 initiated an unprecedented level of effort within the hypersonic aerothermodynamic community to support the Space Shuttle Program. During the approximately six-month time frame of the primary Columbia Accident Investigation Board (CAIB) activity, many technical disciplines were involved in a concerted effort to reconstruct the last moments of Columbia and her crew and to understand the critical events that led to that loss. Significant contributions to the CAIB activity were made by the hypersonic aerothermodynamic community (REF CAIB) in understanding the re-entry environments that led to the propagation of ascent-foam-induced wing leading edge damage to a subsequent breach of Columbia's wing spar and the breakup of the vehicle. A core of the NASA hypersonic aerothermodynamics team involved in the CAIB investigation has been combined with the United Space Alliance and Boeing Orbiter engineering team to position the Space Shuttle Program with a process for performing in-flight Thermal Protection System damage assessments. This damage assessment process is now part of the baselined plan for Shuttle support and is a direct outgrowth of the Columbia accident and NASA's response. Multiple re-entry aeroheating tools are involved in this damage assessment process, many of which were developed during the Return To Flight activity. In addition, because these aeroheating tools are part of an overall damage assessment process that also involves the thermal and stress analysis communities, as well as a much broader mission support team, an integrated process for performing the damage assessment activities has been developed by the Space Shuttle Program and the Orbiter engineering community. Several subsets of the Orbiter aeroheating community's support to the Return To Flight effort have been described in previous publications (CFD?, Cavity Heating? Any BLT? Grid Generation?). This work provides a description of the integrated process used to perform Orbiter tile damage assessment, and in particular of the integrated aeroheating tools used in these assessments. Individual aeroheating tools are described which provide the nominal re-entry heating environment characterization for the Orbiter, the heating environments for tile damage, heating effects due to exposed Thermal Protection System substrates, the application of Computational Fluid Dynamics to the description of tile cavity heating, and boundary layer transition prediction. This paper is meant to provide an overall view of the integrated aeroheating assessment process for tile damage assessment, as one of a sequence of papers on the development of the boundary layer transition prediction capability in support of Space Shuttle Return To Flight efforts.

    blob loss: instance imbalance aware loss functions for semantic segmentation

    Deep convolutional neural networks have proven remarkably effective in semantic segmentation tasks. Most popular loss functions were introduced targeting improved volumetric scores, such as the Sørensen–Dice coefficient (DSC). By design, DSC can tackle class imbalance; however, it does not recognize instance imbalance within a class. As a result, a large foreground instance can dominate minor instances and still produce a satisfactory DSC. Nevertheless, missing out on instances leads to poor detection performance. This represents a critical issue in applications such as disease progression monitoring. For example, it is imperative to locate and surveil small-scale lesions in the follow-up of multiple sclerosis patients. We propose a novel family of loss functions, nicknamed blob loss, primarily aimed at maximizing instance-level detection metrics, such as F1 score and sensitivity. Blob loss is designed for semantic segmentation problems in which the instances are the connected components within a class. We extensively evaluate a DSC-based blob loss in five complex 3D semantic segmentation tasks featuring pronounced instance heterogeneity in terms of texture and morphology. Compared to soft Dice loss, we achieve a 5 percent improvement for multiple sclerosis lesions, a 3 percent improvement for liver tumor, and an average 2 percent improvement for microscopy segmentation tasks in terms of F1 score.
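
    To make the idea concrete, here is a minimal sketch of an instance-aware ("blob") Dice loss in the spirit described above, where instances are the connected components of the ground-truth mask. The function names and masking scheme are illustrative assumptions, not the authors' implementation, which operates on soft network outputs and includes additional weighting terms.

```python
# A minimal sketch of an instance-aware ("blob") Dice loss, assuming binary
# numpy masks and connected components as instances. Names and the masking
# scheme are illustrative, not the paper's exact formulation.
import numpy as np
from scipy import ndimage

def soft_dice(pred, target, eps=1e-6):
    """Soft Dice score between a prediction map and a binary mask."""
    inter = (pred * target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def blob_dice_loss(pred, target):
    """Average (1 - Dice) over connected components of the target.

    For each foreground blob, voxels belonging to *other* blobs are
    masked out, so every instance contributes equally regardless of size.
    """
    labels, n = ndimage.label(target)            # instances = connected components
    if n == 0:
        return 1.0 - soft_dice(pred, target)     # no instances: fall back to global Dice
    losses = []
    for i in range(1, n + 1):
        keep = (labels == i) | (labels == 0)     # this blob plus background
        blob = (labels == i).astype(float)
        losses.append(1.0 - soft_dice(pred * keep, blob))
    return float(np.mean(losses))
```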

    Thermographic Imaging of the Space Shuttle During Re-Entry Using a Near Infrared Sensor

    High-resolution calibrated near-infrared (NIR) imagery of the Space Shuttle Orbiter was obtained during hypervelocity atmospheric re-entry of the STS-119, STS-125, STS-128, STS-131, STS-132, STS-133, and STS-134 missions. These data have provided information on the distribution of surface temperature and the state of the airflow over the windward surface of the Orbiter during descent. The thermal imagery complemented data collected with onboard surface thermocouple instrumentation. The spatially resolved global thermal measurements made during the Orbiter's hypersonic re-entry will provide critical flight data for reducing the uncertainty associated with present-day ground-to-flight extrapolation techniques and current state-of-the-art empirical boundary-layer transition and turbulent heating prediction methods. Laminar and turbulent flight data are critical for the validation of physics-based, semi-empirical boundary-layer transition prediction methods, as well as for the validation of laminar numerical chemistry models and the development of turbulence models supporting NASA's next-generation spacecraft. In this paper we provide details of the NIR imaging system used on both air- and land-based imaging assets. The paper discusses calibrations performed on the NIR imaging systems that permitted conversion of captured radiant intensity (counts) to temperature values. Image processing techniques are presented to analyze the NIR data for vignetting distortion, best resolution, and image sharpness. Keywords: HYTHIRM, Space Shuttle thermography, hypersonic imaging, near infrared imaging, histogram analysis, singular value decomposition, eigenvalue image sharpness.
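
    As a rough illustration of the counts-to-temperature step described above, the sketch below assumes a linear radiometric calibration to band radiance at a single effective NIR wavelength and then inverts Planck's law. The gain, wavelength, and emissivity values are placeholder assumptions, not HYTHIRM calibration values.

```python
# Illustrative counts-to-temperature conversion at one effective NIR
# wavelength: a placeholder linear calibration to spectral radiance,
# followed by inversion of Planck's law for temperature.
import numpy as np

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def counts_to_temperature(counts, gain, wavelength=1.0e-6, emissivity=0.89):
    """Map sensor counts to surface temperature in kelvin.

    Assumes spectral radiance L = gain * counts / emissivity (placeholder
    linear calibration), then solves Planck's law for temperature.
    """
    L = gain * np.asarray(counts, dtype=float) / emissivity
    c1 = 2.0 * H * C**2 / wavelength**5          # radiance prefactor, W sr^-1 m^-3
    c2 = H * C / (wavelength * K)                # second radiation term, K
    return c2 / np.log1p(c1 / L)
```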

    Prenatal and early life influences on epigenetic age in children:a study of mother-offspring pairs from two cohort studies

    DNA methylation-based biomarkers of aging are highly correlated with actual age. Departures of methylation-estimated age from actual age can be used to define epigenetic measures of child development or age acceleration (AA) in adults. Very little is known about the genetic or environmental determinants of these epigenetic measures of aging. We obtained DNA methylation profiles using Infinium HumanMethylation450 BeadChips across five time points in 1018 mother-child pairs from the Avon Longitudinal Study of Parents and Children. Using the Horvath age estimation method, we calculated epigenetic age for these samples. AA was defined as the residuals from regressing epigenetic age on actual age. AA was tested for associations with cross-sectional clinical variables in children. We identified associations between AA and sex, birth weight, birth by caesarean section, and several maternal characteristics in pregnancy, namely smoking, weight, BMI, and selenium and cholesterol levels. Offspring of non-drinkers had higher AA on average, but this difference appeared to resolve during childhood. The associations between sex, birth weight and AA found in ARIES were replicated in an independent cohort (GOYA). In children, epigenetic AA measures are associated with several clinically relevant variables, and early life exposures appear to be associated with changes in AA during adolescence. Further research into epigenetic aging, including the use of causal inference methods, is required to improve our understanding of aging.
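
    The age-acceleration definition used here, residuals from regressing epigenetic age on actual age, reduces to a few lines of code. This is a minimal sketch with illustrative variable names, not the study's analysis code.

```python
# Minimal sketch of the age-acceleration (AA) definition quoted above:
# residuals from regressing epigenetic age on chronological age.
import numpy as np

def age_acceleration(epigenetic_age, chronological_age):
    """AA = residuals of a least-squares fit of epigenetic age on actual age."""
    slope, intercept = np.polyfit(chronological_age, epigenetic_age, deg=1)
    fitted = slope * np.asarray(chronological_age) + intercept
    return np.asarray(epigenetic_age) - fitted
```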

    Comparison of genetic association strategies in the presence of rare alleles

    In the quest for the missing heritability of most complex diseases, rare variants have received increased attention. Advances in large-scale sequencing have led to a shift from the common disease/common variant hypothesis to the common disease/rare variant hypothesis, or have at least reopened the debate about the relevance and importance of rare variants for gene discovery. The investigation of modeling and testing approaches to identify significant disease/rare variant associations is in full motion. New methods to better deal with parameter estimation instabilities, convergence problems, or multiple testing corrections in the presence of rare variants or effect modifiers of rare variants are in their infancy. Using a recently developed semiparametric strategy to detect causal variants, we investigate the performance of the model-based multifactor dimensionality reduction (MB-MDR) technique in terms of power and family-wise error rate (FWER) control in the presence of rare variants, using population-based and family-based data (FAM-MDR). We compare family-based results obtained from MB-MDR analyses to screening findings from a quantitative-trait pedigree-based association test (PBAT). Population-based data were further examined using penalized regression models. We restrict attention to all available single-nucleotide polymorphisms on chromosome 4 and consider Q1 as the outcome of interest. The considered family-based methods identified marker C4S4935 in the VEGFC gene, with estimated power not exceeding 0.35 (FAM-MDR) when FWER was kept under control. The considered population-based methods gave rise to highly inflated FWERs (up to 90% for PBAT screening).
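
    For readers unfamiliar with the two evaluation criteria, the sketch below shows how empirical power and FWER are conventionally estimated from simulation replicates. The array layout and names are illustrative assumptions and are unrelated to the MB-MDR, FAM-MDR, or PBAT software.

```python
# Conventional simulation-based estimates of power and family-wise error
# rate (FWER); layout and names are illustrative assumptions only.
import numpy as np

def empirical_power_and_fwer(p_causal, p_null, alpha=0.05):
    """p_causal: (replicates,) p-values at the causal marker under H1.
    p_null: (replicates, markers) p-values simulated under the global null.
    """
    power = np.mean(np.asarray(p_causal) <= alpha)               # true detections
    fwer = np.mean(np.any(np.asarray(p_null) <= alpha, axis=1))  # >=1 false positive
    return power, fwer
```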

    ORAMA project deliverable 1.2. Final analysis and recommendations for the improvement of statistical data collection methods in Europe for primary raw materials

    This report brings together the outputs of Tasks 1.1 and 1.2 of Work Package 1 of the ORAMA project. Task 1.1 aims to produce an inventory of how minerals data are collected within Europe, via a survey of data providers, and Task 1.2 aims to review previous work from past projects, working groups and professional organisations in this subject area. Together this has built a comprehensive understanding of how minerals data are collected in Europe, what data gaps exist, what the issues are with regard to creating harmonised European datasets for minerals information, and what good-practice examples exist from which lessons can be learnt. The results of the survey show that countries with a clear legal and regulatory procedure for collecting data often have the most robust systems in place. These countries often also have a strong motivation for collecting such data, such as receiving a significant income from mineral royalties as a result of state ownership of minerals, although resource management or land-use planning also provide motivation for the collection of data. The results of the survey also showed that there is large variety in the way data are collected within Europe. This variety is not necessarily an issue with regard to harmonisation, as long as data providers adhere to common data standards and classification systems, such as INSPIRE or UNFC, when providing data for aggregation at a European level. The review of previous projects showed the breadth of work that has gone into the improvement of statistical datasets over the last few years, especially from projects such as Minventory and Minerals4EU, which provide a clear roadmap for harmonising European minerals datasets, and the work of the EGS MREG (EuroGeoSurveys Mineral Resources Expert Group) towards the harmonisation of resource and reserve codes within Europe. A common theme of many of these recommendations is the need for common standards to be adhered to, and that in some instances these specifications may need to be adapted to accommodate statistical data for mineral resources that are aggregated at a national scale.

    The forgotten drought of 1765–1768: Reconstructing and re-evaluating historical droughts in the British and Irish Isles

    Historical precipitation records are fundamental for the management of water resources, yet rainfall observations typically span 100–150 years at most, with considerable uncertainties surrounding earlier records. Here, we analyse some of the longest available precipitation records globally, for England and Wales, Scotland and Ireland. To assess the credibility of these records and extend them further back in time, we statistically reconstruct (using independent predictors) monthly precipitation series representing these regions for the period 1748–2000. By applying the Standardized Precipitation Index at 12-month accumulations (SPI-12) to the observed and our reconstructed series, we re-evaluate historical meteorological droughts. We find strong agreement between observed and reconstructed drought chronologies in post-1870 records, but divergence in earlier series due to biases in early precipitation observations. Hence, the 1800s decade was less drought prone in our reconstructions relative to observations. Overall, the drought of 1834–1836 was the most intense SPI-12 event in our reconstruction for England and Wales. Newspaper accounts and documentary sources confirm the extent of impacts across England in particular. We also identify a major, "forgotten" drought in 1765–1768 that affected the British and Irish Isles. This was the most intense event in our reconstructions for Ireland and Scotland, and ranks first for accumulated deficits across all three regional series. Moreover, the 1765–1768 event was also the most extreme multi-year drought across all regional series when considering 36-month accumulations (SPI-36). Newspaper and other sources confirm the occurrence and major socio-economic impact of this drought, such as major rivers like the Shannon being fordable by foot. Our results provide new insights into historical droughts across the British and Irish Isles. Given the importance of historical droughts for stress-testing the resilience of water resources, drought plans and supply systems, the forgotten drought of 1765–1768 offers perhaps the most extreme benchmark scenario in more than 250 years.
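
    For context, SPI-k standardizes k-month precipitation accumulations against a fitted distribution so that droughts are comparable across regions and seasons. The sketch below is a simplified illustration assuming a gamma fit per calendar month and a series starting in January; it omits the zero-precipitation mixture term used in full SPI implementations.

```python
# Simplified SPI-k illustration: fit a gamma distribution to k-month
# accumulations grouped by calendar month, then transform the fitted
# cumulative probabilities to standard normal scores.
import numpy as np
from scipy import stats

def spi(monthly_precip, k=12):
    """monthly_precip: 1D array of monthly totals; returns the SPI-k series."""
    x = np.convolve(monthly_precip, np.ones(k), mode="valid")  # k-month sums
    out = np.full(len(x), np.nan)
    for m in range(12):                  # group accumulations by calendar month
        idx = np.arange(m, len(x), 12)
        a, loc, scale = stats.gamma.fit(x[idx], floc=0)
        cdf = stats.gamma.cdf(x[idx], a, loc=loc, scale=scale)
        out[idx] = stats.norm.ppf(np.clip(cdf, 1e-6, 1.0 - 1e-6))
    return out
```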

    A meta-analysis of genome-wide association studies of epigenetic age acceleration

    Funding: Generation Scotland received core support from the Chief Scientist Office of the Scottish Government Health Directorates (CZD/16/6) and the Scottish Funding Council (HR03006). Genotyping and DNA methylation profiling of the GS samples was carried out by the Genetics Core Laboratory at the Wellcome Trust Clinical Research Facility, Edinburgh, Scotland, and was funded by the Medical Research Council UK and the Wellcome Trust (Wellcome Trust Strategic Award "STratifying Resilience and Depression Longitudinally" (STRADL), Reference 104036/Z/14/Z). Funding details for the cohorts included in the study by Lu et al. (2018) can be found in their publication. HCW is supported by a JMAS SIM fellowship from the Royal College of Physicians of Edinburgh and by an ESAT College Fellowship from the University of Edinburgh. AMM and HCW acknowledge the support of the Dr. Mortimer and Theresa Sackler Foundation. SH acknowledges support from grant 1U01AG060908-01. REM is supported by Alzheimer's Research UK major project grant ARUK-PG2017B-10. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Data Availability: Summary statistics from the research reported in the manuscript will be made available immediately following publication on the Edinburgh DataShare portal with a permanent digital object identifier (DOI). According to the terms of consent for Generation Scotland participants, requests for access to the individual-level data must be reviewed by the GS Access Committee ([email protected]). Individual-level data are not immediately available, due to confidentiality considerations and our legal obligation to protect personal information. These data will, however, be made available upon request and after review by the GS Access Committee, once ethical and data governance concerns regarding personal data have been addressed by the receiving institution through a Data Transfer Agreement.