3,364 research outputs found

    DESI Commissioning Instrument Metrology

    The Dark Energy Spectroscopic Instrument (DESI) is under construction to measure the expansion history of the Universe using the Baryon Acoustic Oscillation technique. The spectra of 35 million galaxies and quasars over 14,000 sq deg will be measured during the life of the experiment. A new prime focus corrector for the KPNO Mayall telescope will deliver light to 5000 fiber-optic positioners, which in turn feed ten broad-band spectrographs. We describe the methods and results of the commissioning instrument metrology program, whose primary goals are to calculate the coordinate transformations, and to further develop the systems, that will place fibers within 5 µm RMS of their target positions. The program measures the absolute three-axis Cartesian coordinates of the five CCDs and 22 illuminated fiducials on the commissioning instrument.
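    The core computation implied by such a program, solving for a rigid transformation that maps measured fiducial coordinates onto their nominal positions and checking the RMS residual against the 5 µm budget, can be sketched as follows. This is a minimal illustration and not the DESI pipeline: the fiducial layout, offset, rotation, and noise level are invented, and a real program would also fit scale and higher-order distortion terms.

```python
import numpy as np

def fit_rigid_transform(measured, nominal):
    """Kabsch-style least-squares fit of R, t such that R @ p + t ~ q."""
    mu_m, mu_n = measured.mean(axis=0), nominal.mean(axis=0)
    H = (measured - mu_m).T @ (nominal - mu_n)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.eye(3)
    D[-1, -1] = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ D @ U.T
    return R, mu_n - R @ mu_m

# Hypothetical setup: 22 fiducials (the count from the abstract) with nominal
# positions in mm, displaced by an unknown rigid transform plus ~3 um noise.
rng = np.random.default_rng(0)
nominal = rng.uniform(-200.0, 200.0, size=(22, 3))
true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(true_R) < 0:
    true_R[:, 0] *= -1                              # ensure a proper rotation
measured = (nominal - np.array([1.0, -2.0, 0.5])) @ true_R
measured += rng.normal(scale=0.003, size=measured.shape)

R, t = fit_rigid_transform(measured, nominal)
residuals = measured @ R.T + t - nominal
rms_um = np.sqrt((residuals ** 2).sum(axis=1).mean()) * 1000.0
print(f"RMS residual: {rms_um:.1f} um")
```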

    Ant algorithm hyperheuristic approaches for scheduling problems

    For decades, optimisation research has investigated methods for finding optimal solutions to many problems in the fields of scheduling, timetabling and rostering. A family of abstract methods known as metaheuristics has been developed and applied to many of these problems, but applying them to a specific problem requires problem-specific coding and parameter adjustment to produce the best results for that problem. Such specialisation makes code difficult to adapt to new problem instances or new problems. One methodology intended to increase the generality of state-of-the-art algorithms is known as hyperheuristics. Hyperheuristics are algorithms which construct algorithms: using "building block" heuristics, the higher-level algorithm chooses between heuristics to move around the solution space, learning how to use the heuristics to find better solutions. We introduce a new hyperheuristic based upon the well-known ant algorithm metaheuristic, and apply it to several real-world problems without parameter tuning, producing results that are competitive with other hyperheuristic methods and with established bespoke metaheuristic techniques.
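    A minimal sketch of the central mechanism may help: instead of laying pheromone on components of a solution, the ant lays pheromone on transitions between low-level heuristics, so the colony learns useful orderings of heuristics rather than a problem-specific encoding. Everything below is an assumption for illustration: the toy weighted max-ones objective and the three hand-rolled heuristics stand in for the real scheduling problems and heuristic sets.

```python
import random

random.seed(1)
N = 40
weights = [random.uniform(0.5, 2.0) for _ in range(N)]

def score(sol):                        # toy objective: weighted max-ones
    return sum(w for w, bit in zip(weights, sol) if bit)

# Low-level "building block" heuristics: each perturbs a candidate solution.
def flip_random(sol):
    sol = sol[:]
    sol[random.randrange(N)] ^= 1
    return sol

def set_heaviest_zero(sol):
    zeros = [i for i, b in enumerate(sol) if not b]
    if zeros:
        sol = sol[:]
        sol[max(zeros, key=lambda i: weights[i])] = 1
    return sol

def clear_lightest_one(sol):
    ones = [i for i, b in enumerate(sol) if b]
    if ones:
        sol = sol[:]
        sol[min(ones, key=lambda i: weights[i])] = 0
    return sol

heuristics = [flip_random, set_heaviest_zero, clear_lightest_one]
H = len(heuristics)
pher = [[1.0] * H for _ in range(H)]   # pheromone on heuristic->heuristic edges

def run_ant(steps=15):
    sol = [random.randint(0, 1) for _ in range(N)]
    h, path = 0, []
    for _ in range(steps):
        nxt = random.choices(range(H), weights=pher[h])[0]
        sol = heuristics[nxt](sol)
        path.append((h, nxt))
        h = nxt
    return score(sol), path

best = 0.0
for _ in range(50):                    # colony iterations
    ants = [run_ant() for _ in range(10)]
    fit, path = max(ants)
    best = max(best, fit)
    for row in pher:                   # evaporation
        for b in range(H):
            row[b] *= 0.9
    for a, b in path:                  # reinforce the best ant's heuristic path
        pher[a][b] += fit / sum(weights)

print(f"best score: {best:.2f} of {sum(weights):.2f}")
```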

    The Remarkably Featureless High Resolution X-ray Spectrum of Mrk 478

    An observation of Mrk 478 using the Chandra Low Energy Transmission Grating Spectrometer is presented. The source exhibited 30-40% flux variations on timescales of order 10000 s, together with a slow decline in spectral softness over the full 80 ks observation. The 0.15-3.0 keV spectrum is well fitted by a single power law with photon index Gamma = 2.91 +/- 0.03. Combined with high-energy data from BeppoSAX, the spectrum from 0.15 to 10 keV is well fit by the sum of two power laws: one with Gamma = 3.03 +/- 0.04, which dominates below 2 keV, and one with Gamma = 1.4 +/- 0.2, which dominates above 2 keV (quoting 90% confidence uncertainties). No significant emission or absorption features are detected in the high resolution spectrum, supporting our previous findings using the Extreme Ultraviolet Explorer but contradicting the claims of emission lines by Hwang & Bowyer (1997). There is no evidence of a warm absorber, as found in the high resolution spectra of many Sy 1 galaxies, including others classified, like Mrk 478, as narrow-line Sy 1 galaxies. We suggest that the X-ray continuum may result from Comptonization of disk thermal emission in a hot corona through a range of optical depths. Comment: 21 pages, 7 figures; accepted for publication in the Astronomical Journal.
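    As a concrete illustration of the quoted continuum model, the sketch below generates synthetic fluxes from two power laws with the abstract's photon indices (3.03 dominating below 2 keV, 1.4 above) and recovers them by least squares. The normalisations, noise level, and binning are invented; this is not the Chandra or BeppoSAX reduction.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_power_laws(E, n_soft, g_soft, n_hard, g_hard):
    """Photon flux density as the sum of two power laws in energy E (keV)."""
    return n_soft * E ** (-g_soft) + n_hard * E ** (-g_hard)

rng = np.random.default_rng(42)
E = np.geomspace(0.15, 10.0, 60)             # the 0.15-10 keV band
truth = (1.0, 3.03, 0.3, 1.4)                # indices from the abstract
flux = two_power_laws(E, *truth)
obs = flux * rng.normal(1.0, 0.05, size=E.size)   # 5% synthetic noise

popt, pcov = curve_fit(two_power_laws, E, obs, p0=(1.0, 2.5, 0.1, 1.8),
                       sigma=0.05 * flux, absolute_sigma=True)
err = np.sqrt(np.diag(pcov))
print(f"soft photon index: {popt[1]:.2f} +/- {err[1]:.2f}")
print(f"hard photon index: {popt[3]:.2f} +/- {err[3]:.2f}")
```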

    Proteomic study of proteolysis during ripening of cheddar cheese made from milk over a lactation cycle

    Milk for cheese production in Ireland is predominantly produced by pasture-fed spring-calving herds. Consequently, there are marked seasonal changes in milk composition, which arise from interacting lactational, dietary and environmental factors. In this study, Cheddar cheese was manufactured on a laboratory scale from milk taken from a spring-calving herd over a 9-month lactation cycle, between early April and early December. Plasmin activity of 6-month-old Cheddar cheese samples generally decreased over ripening time. One-dimensional urea-polyacrylamide gel electrophoresis (PAGE) of cheese samples taken after 6 months of ripening showed extensive hydrolysis of the caseins, with the fastest hydrolysis of αs1-casein in cheeses made in August. A proteomic comparison between cheeses produced from milk taken in April, August and December showed a reduction in levels of β-casein and the appearance of additional products corresponding to low-molecular-weight hydrolysis products of the caseins. This study has demonstrated that a seasonal milk supply causes compositional differences in Cheddar cheese, and that proteomic tools are helpful in understanding the impact of those differences.

    Has working-age morbidity been declining? Changes over time in survey measures of general health, chronic diseases, symptoms and biomarkers in England 1994-2014

    Objectives: As life expectancy has increased in high-income countries, there has been a global debate about whether additional years of life are free from ill-health/disability. However, little attention has been given to changes over time in morbidity in the working-age population, particularly outside the US, despite its importance for health monitoring and social policy. This study therefore asks: what are the changes over time in working-age morbidity in England over two decades? Design, setting and participants: We use a high-quality annual cross-sectional survey, the Health Survey for England (‘HSE’) 1994-2014. HSE uses a random sample of the English household population, with a combined sample size of over 140,000 people. We produce a newly-harmonised version of HSE that maximises comparability over time, including new non-response weights. While HSE is used for monitoring population health, it has hitherto not been used for investigating morbidity as a whole. Outcome measures: We analyse all 39 measures that are fully comparable over time – including chronic disease diagnoses, symptomatology and a number of biomarkers – adjusting for gender and age. Results: We find a mixed picture: we see improving cardiovascular and respiratory health, but deteriorations in obesity, diabetes, some biomarkers, and feelings of extreme anxiety/depression, alongside stability in moderate mental ill-health and musculoskeletal-related health. In several domains we also see stable or rising chronic disease diagnoses even where symptomatology has declined. While data limitations make it challenging to combine these measures into a single morbidity index, there is little systematic trend for declining morbidity to be seen in the measures that predict self-reported health most strongly. Conclusions: Despite considerable falls in working-age mortality – and the assumptions of many policymakers that morbidity will follow mortality – there is no systematic improvement in overall working-age morbidity in England from 1994 to 2014.
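    The adjustment described, reading a year trend net of demographic change, amounts to regressing each morbidity indicator on survey year while controlling for age and sex. The sketch below does this for one synthetic indicator; the data-generating numbers are invented stand-ins for HSE respondents, and the real analysis additionally applies non-response weights.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 20000
year = rng.integers(1994, 2015, n)     # survey year, 1994-2014
age = rng.uniform(16, 64, n)           # working-age respondents
female = rng.integers(0, 2, n)

# Invented data-generating process: prevalence rises with age, drifts up by year.
logit = -3.0 + 0.03 * (age - 40.0) + 0.10 * female + 0.015 * (year - 1994)
sick = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Logistic model: the year coefficient is the trend net of age and sex.
X = sm.add_constant(np.column_stack([year - 1994, age, female]))
fit = sm.GLM(sick, X, family=sm.families.Binomial()).fit()
lo, hi = fit.conf_int()[1]
print(f"adjusted log-odds change per year: {fit.params[1]:.3f} "
      f"(95% CI {lo:.3f} to {hi:.3f})")
```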

    The highly variable X-ray spectrum of the luminous Seyfert 1 galaxy 1H 0419-577

    An XMM-Newton observation of the luminous Seyfert 1 galaxy 1H 0419-577 is presented. We find that the spectrum is well fitted by a power law of canonical slope (Gamma ~ 1.9) plus three blackbody components (to model the strong soft excess). The XMM data are compared and contrasted with observations by ROSAT in 1992 and by ASCA and BeppoSAX in 1996. We find that the overall X-ray spectrum has changed substantially over this period, and suggest that the changes are driven by the soft X-ray component. When the source is bright, as in our XMM-Newton observation, the enhanced soft flux appears to cool the Comptonising corona, causing the 2-10 keV power law to assume a 'typical' slope, in contrast to the unusually hard ('photon-starved') spectra observed by ASCA and BeppoSAX four years earlier. Comment: 5 pages, 4 figures; accepted by MNRAS.
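    To make the quoted spectral decomposition concrete, the sketch below evaluates a Gamma ~ 1.9 power law plus three blackbody components and reports how much of the band energy flux falls below 2 keV. The temperatures and normalisations are illustrative assumptions, not the fitted values from the paper.

```python
import numpy as np

def integrate(y, x):
    """Trapezoidal integral of samples y over grid x."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def bbody(E, kT, norm):
    """Blackbody-shaped photon spectrum ~ E^2 / (exp(E/kT) - 1); E, kT in keV."""
    with np.errstate(over="ignore"):   # exp overflow -> inf -> flux 0, as intended
        return norm * E ** 2 / np.expm1(E / kT)

def model(E):
    power_law = E ** -1.9              # canonical-slope power law
    soft_excess = sum(bbody(E, kT, norm) for kT, norm in
                      [(0.06, 30000.0), (0.12, 5000.0), (0.25, 200.0)])
    return power_law + soft_excess

E = np.geomspace(0.2, 10.0, 400)       # keV
energy_flux = model(E) * E
mask = E < 2.0
soft_frac = integrate(energy_flux[mask], E[mask]) / integrate(energy_flux, E)
print(f"fraction of 0.2-10 keV energy flux below 2 keV: {soft_frac:.0%}")
```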

    The complex relationship between pediatric cardiac surgical case volumes and mortality rates in a national clinical database

    Objective: We sought to determine the association between pediatric cardiac surgical volume and mortality using sophisticated case-mix adjustment and a national clinical database. Methods: Patients 18 years of age or less who had a cardiac operation between 2002 and 2006 were identified in the Society of Thoracic Surgeons Congenital Heart Surgery Database (32,413 patients from 48 programs). Programs were grouped by yearly pediatric cardiac surgical volume (small, <150; medium, 150–249; large, 250–349; and very large, ≥350 cases per year). Logistic regression was used to adjust mortality rates for volume, surgical case mix (Aristotle Basic Complexity and Risk Adjustment for Congenital Heart Surgery, Version 1 categories), patient risk factors, and year of operation. Results: With adjustment for patient-level risk factors and surgical case mix, there was an inverse relationship between overall surgical volume as a continuous variable and mortality (P = .002). When the data were displayed graphically, there appeared to be an inflection point between 200 and 300 cases per year. When volume was analyzed as a categorical variable, the relationship was most apparent for difficult operations (Aristotle technical difficulty component score >3.0), for which mortality decreased from 14.8% (60/406) at small programs to 8.4% (157/1858) at very large programs (P = .02). The same was true for the subgroup of patients who underwent Norwood procedures (36.5% [23/63] vs 16.9% [81/479], P < .0001). After risk adjustment, all groups performed similarly for low-difficulty operations. Conversely, for difficult procedures, small programs performed significantly worse. For Norwood procedures, very large programs outperformed all other groups. Conclusion: There was an inverse association between pediatric cardiac surgical volume and mortality that became increasingly important as case complexity increased. Although volume was not associated with mortality for low-complexity cases, lower-volume programs underperformed larger programs as case complexity increased.
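    The headline contrast for difficult operations can be checked directly from the counts quoted above (60/406 vs 157/1858). The sketch below recomputes the crude rates and a simple two-proportion z-test; note this is unadjusted, whereas the paper's P = .02 comes from the risk-adjusted categorical model.

```python
from math import sqrt, erf

def two_proportion_z(d1, n1, d2, n2):
    """Crude two-proportion z-test for a difference in mortality rates."""
    p1, p2 = d1 / n1, d2 / n2
    pooled = (d1 + d2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tails
    return p1, p2, z, p_two_sided

# Counts from the abstract: deaths/cases for difficult operations,
# small programs vs very large programs.
p1, p2, z, p = two_proportion_z(60, 406, 157, 1858)
print(f"small: {p1:.1%}  very large: {p2:.1%}  z = {z:.2f}  p = {p:.2g}")
```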