
    Quantifying correlations between galaxy emission lines and stellar continua

    We analyse the correlations between continuum properties and emission line equivalent widths of star-forming and active galaxies from the Sloan Digital Sky Survey. Since upcoming large sky surveys will make broad-band observations only, including strong emission lines in the theoretical modelling of spectra will be essential for estimating the physical properties of photometric galaxies. We show that emission line equivalent widths can be fairly well reconstructed from the stellar continuum using local multiple linear regression in the continuum principal component analysis (PCA) space. Line reconstruction is good for star-forming galaxies and reasonable for galaxies with active nuclei. We propose a practical method to combine stellar population synthesis models with empirical modelling of emission lines. The technique will help generate more accurate model spectra and mock catalogues of galaxies to fit observations of the new surveys. More accurate modelling of emission lines is also expected to improve template-based photometric redshift estimation methods. We also show that, by combining PCA coefficients from the pure continuum and the emission lines, hosts of weak active galactic nuclei (AGNs) can be automatically distinguished from quiescent star-forming galaxies. The classification method is based on a training set consisting of high-confidence starburst galaxies and AGNs, and separates active from star-forming galaxies much as the empirical curve found by Kauffmann et al. does. We demonstrate the use of three important machine learning algorithms in the paper: k-nearest neighbour finding, k-means clustering and support vector machines. Comment: 14 pages, 14 figures. Accepted by MNRAS on 2015 December 22. The paper's website with data and code is at http://www.vo.elte.hu/papers/2015/emissionlines
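
    As a rough illustration of the reconstruction idea described above (local multiple linear regression in PCA space), the sketch below projects continuum spectra onto a PCA basis and predicts an emission line equivalent width from the k nearest neighbours of a target spectrum. All data, array names and parameter values are placeholders, not the authors' pipeline.

```python
# Hypothetical sketch: predict an emission-line equivalent width from the
# stellar continuum via local linear regression in PCA space.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
continua = rng.random((5000, 1000))   # training continuum spectra (placeholder data)
eq_widths = rng.random(5000)          # equivalent widths of one emission line
target = rng.random((1, 1000))        # continuum whose line strength we want to predict

pca = PCA(n_components=5).fit(continua)
coeffs = pca.transform(continua)      # training set in PCA space
query = pca.transform(target)

# Local multiple linear regression: fit only on the k nearest neighbours of the query.
k = 50
_, idx = NearestNeighbors(n_neighbors=k).fit(coeffs).kneighbors(query)
local_fit = LinearRegression().fit(coeffs[idx[0]], eq_widths[idx[0]])
predicted_ew = local_fit.predict(query)[0]
print(predicted_ew)
```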

    Oral rivaroxaban versus standard therapy for the treatment of symptomatic venous thromboembolism: a pooled analysis of the EINSTEIN-DVT and PE randomized studies

    Background: Standard treatment for venous thromboembolism (VTE) consists of a heparin combined with vitamin K antagonists. Direct oral anticoagulants have been investigated for acute and extended treatment of symptomatic VTE; their use could avoid parenteral treatment and/or laboratory monitoring of anticoagulant effects. Methods: A prespecified pooled analysis of the EINSTEIN-DVT and EINSTEIN-PE studies compared the efficacy and safety of rivaroxaban (15 mg twice daily for 21 days, followed by 20 mg once daily) with standard therapy (enoxaparin 1.0 mg/kg twice daily and warfarin or acenocoumarol). Patients were treated for 3, 6, or 12 months and followed for suspected recurrent VTE and bleeding. The prespecified noninferiority margin was 1.75. Results: A total of 8282 patients were enrolled; 4151 received rivaroxaban and 4131 received standard therapy. The primary efficacy outcome occurred in 86 rivaroxaban-treated patients (2.1%) compared with 95 (2.3%) standard-therapy-treated patients (hazard ratio, 0.89; 95% confidence interval [CI], 0.66-1.19; p for noninferiority < 0.001). Major bleeding was observed in 40 (1.0%) and 72 (1.7%) patients in the rivaroxaban and standard-therapy groups, respectively (hazard ratio, 0.54; 95% CI, 0.37-0.79; p = 0.002). In key subgroups, including fragile patients, cancer patients, patients presenting with large clots, and those with a history of recurrent VTE, the efficacy and safety of rivaroxaban were similar to those of standard therapy. Conclusion: The single-drug approach with rivaroxaban resulted in efficacy similar to that of standard therapy and was associated with a significantly lower rate of major bleeding. Efficacy and safety results were consistent among key patient subgroups.
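
    The noninferiority claim above rests on comparing the upper confidence limit of the hazard ratio with the prespecified margin of 1.75. A minimal check of that arithmetic, using only the figures reported in the abstract (the decision rule below is a common noninferiority criterion, not quoted from the paper):

```python
# Noninferiority check: the upper bound of the 95% CI for the hazard ratio
# must fall below the prespecified margin.
def is_noninferior(ci_upper: float, margin: float = 1.75) -> bool:
    return ci_upper < margin

# Primary efficacy outcome: HR 0.89, 95% CI 0.66-1.19
print(is_noninferior(1.19))  # True, consistent with p for noninferiority < 0.001
```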

    Reliable Eigenspectra for New Generation Surveys

    We present a novel technique to overcome the limitations of applying Principal Component Analysis to typical real-life data sets, especially astronomical spectra. Our new approach addresses the issues of outliers, missing information, large numbers of dimensions and the vast amount of data by combining elements of robust statistics and recursive algorithms that provide improved eigensystem estimates step by step. We develop a generic mechanism for deriving reliable eigenspectra without manual data censoring, while utilising all the information contained in the observations. We demonstrate the power of the methodology on the collection of VIMOS VLT Deep Survey spectra, which exhibits most of these challenges, and highlight the improvements over previous workarounds, as well as the scalability of our approach to collections with sizes of the Sloan Digital Sky Survey and beyond. Comment: 7 pages, 3 figures, accepted to MNRAS
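
    The toy sketch below illustrates the general strategy of robust, iterative eigensystem estimation (fit, flag the worst-reconstructed spectra as outliers, refit). It is not the paper's actual algorithm; the data, trimming rule and parameter values are assumptions for illustration only.

```python
# Toy robust/iterative PCA: repeatedly fit, then drop the spectra with the
# largest reconstruction errors before refitting.
import numpy as np
from sklearn.decomposition import PCA

def robust_eigenspectra(spectra, n_components=5, n_iter=3, trim_fraction=0.05):
    mask = np.ones(len(spectra), dtype=bool)
    for _ in range(n_iter):
        pca = PCA(n_components=n_components).fit(spectra[mask])
        resid = spectra - pca.inverse_transform(pca.transform(spectra))
        errors = np.sum(resid ** 2, axis=1)
        # keep everything except the worst-fitting fraction for the next pass
        cutoff = np.quantile(errors[mask], 1.0 - trim_fraction)
        mask = errors <= cutoff
    return pca.components_  # the estimated eigenspectra

eigenspectra = robust_eigenspectra(np.random.default_rng(0).random((2000, 500)))
print(eigenspectra.shape)
```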

    Aerothermodynamic Analysis of a Reentry Brazilian Satellite

    This work deals with a computational investigation of the small ballistic reentry Brazilian vehicle SARA (acronym for Satélite de Reentrada Atmosférica). Hypersonic flows over the vehicle SARA at zero-degree angle of attack, in chemical equilibrium and thermal non-equilibrium, are modeled by the Direct Simulation Monte Carlo (DSMC) method, which has become the main technique for studying complex multidimensional rarefied flows and which properly accounts for the non-equilibrium aspects of the flows. The emphasis of this paper is to examine the behavior of the primary flow properties during the high-altitude portion of the SARA reentry. In this way, the velocity, density, pressure and temperature fields are investigated for altitudes of 100, 95, 90, 85 and 80 km. In addition, comparisons based on geometry are made between axisymmetric and planar two-dimensional configurations. Some significant differences between these configurations were noted in the flowfield structure along the reentry trajectory. The analysis showed that the flow disturbances influence velocity, density, pressure and temperature differently along the stagnation streamline ahead of the capsule nose. It was found that the stagnation region is a thermally stressed zone, as well as a zone of strong compression and high wall pressure. Wall pressure distributions are compared with available experimental data, and good agreement is found along the spherical nose for the altitude range investigated. Comment: The paper will be published in Vol. 42 of the Brazilian Journal of Physics

    Nutritional soil heterogeneity and Mycorrhiza as determinants of plant species diversity

    Patterns in nutrient availability often vary both in space and time (e.g. Pegtel 1987; Stark 1994; Marschner 1995), and small differences can have large consequences for the ecophysiological responses and competitive abilities of plant species (Fitter 1982; Wedin & Tilman 1990; Grime 1994). Nevertheless, the spatial scale and degree of this heterogeneity, and how they might differ among communities, are poorly understood. Most plant individuals experience nutrient availability through their fungal partner. Different mycorrhiza types show important ecological and physiological differences that may have important consequences for the competitive abilities of the associated plants. The impact of mycorrhizal networks relative to other mechanisms on the interaction between plant species remains unclear, but several experiments suggest a considerable influence, at least in some terrestrial ecosystems. In this review, we will discuss causes and consequences of soil heterogeneity, at spatial scales ranging from individual plants to the level of plant communities, in view of their impact on competition and coexistence. A similar approach will be applied to the effects of mycorrhizal fungi, asking whether they intensify interspecific competition or facilitate coexistence.

    Ranging patterns and site fidelity of Snubfin Dolphins in Yawuru Nagulagun/Roebuck Bay, Western Australia

    For long-lived species such as marine mammals, obtaining sufficient data on ranging patterns and space use on a timescale suitable for population management and conservation can be difficult. Yawuru Nagulagun/Roebuck Bay in the northwest of Western Australia supports one of the largest known populations of Australian snubfin dolphins (Orcaella heinsohni), a species with a limited distribution, vulnerable conservation status, and high cultural value. Understanding the species' use of this area will inform management for its long-term conservation. We combined 11 years of data collected from a variety of sources between 2007 and 2020 to assess the ranging patterns and site fidelity of this population. Ranging patterns were estimated using minimum convex polygons (MCPs) and fixed kernel densities (weighted to account for survey effort) to estimate core and representative areas of use for both the population and for individuals. We estimated the population to range over a small area within the bay (103.05 km²). The mean individual representative area of use (95% kernel density contour) was estimated as 39.88 km² (±32.65 SD) and the mean individual core area of use (50% kernel density contour) was estimated as 21.66 km² (±18.85 SD), with the majority of sightings located in the northern part of the bay, less than 10 km from the coastline. Most individuals (56%) showed moderate to high levels of site fidelity (i.e., part-time or long-term residency) when individual re-sight rates were classified using agglomerative hierarchical clustering (AHC). These results emphasize the importance of the area to this vulnerable species, particularly the area within the Port of Broome that has been identified within the population's core range. The pressures associated with coastal development and exposure to vessel traffic, noise, and humans will need to be considered in ongoing management efforts. Analyzing datasets from multiple studies and across time could be beneficial for threatened species for which little is known about ranging patterns and site fidelity. Combined datasets can provide larger sample sizes over an extended period of time, fill knowledge gaps, highlight data limitations, and identify future research needs to be addressed with dedicated studies.
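
    One of the ranging-pattern measures described above, the minimum convex polygon, is straightforward to compute from sighting coordinates. The sketch below uses hypothetical, projected (metric) positions; it is only meant to show the calculation, not to reproduce the study's effort-weighted kernel density analysis.

```python
# Minimum convex polygon (MCP) area from a set of sighting locations.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)
sightings = rng.random((200, 2)) * 10_000   # placeholder x/y positions in metres

hull = ConvexHull(sightings)
mcp_area_km2 = hull.volume / 1e6   # for 2-D input, ConvexHull.volume is the enclosed area
print(f"MCP area: {mcp_area_km2:.2f} km^2")
```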

    Antipsychotics and Torsadogenic Risk: Signals Emerging from the US FDA Adverse Event Reporting System Database

    Background: Drug-induced torsades de pointes (TdP) and related clinical entities represent a current regulatory and clinical burden. Objective: As part of the FP7 ARITMO (Arrhythmogenic Potential of Drugs) project, we explored the publicly available US FDA Adverse Event Reporting System (FAERS) database to detect signals of torsadogenicity for antipsychotics (APs). Methods: Four groups of events, in decreasing order of drug-attributable risk, were identified: (1) TdP, (2) QT-interval abnormalities, (3) ventricular fibrillation/tachycardia, and (4) sudden cardiac death. The reporting odds ratio (ROR) with 95% confidence interval (CI) was calculated through a cumulative analysis from group 1 to group 4. For groups 1+2, the ROR was adjusted for age, gender, and concomitant drugs (e.g., antiarrhythmics) and stratified for AZCERT drugs, lists I and II (http://www.azcert.org, as of June 2011). A potential signal of torsadogenicity was defined if a drug met all of the following criteria: (a) four or more cases in group 1+2; (b) significant ROR in group 1+2 that persists through the cumulative approach; (c) significant adjusted ROR for group 1+2 in the stratum without AZCERT drugs; (d) not included in the AZCERT lists (as of June 2011). Results: Over the 7-year period, 37 APs were reported in 4,794 cases of arrhythmia: 140 (group 1), 883 (group 2), 1,651 (group 3), and 2,120 (group 4). Based on our criteria, the following potential signals of torsadogenicity were found: amisulpride (25 cases; adjusted ROR in the stratum without AZCERT drugs = 43.94, 95% CI 22.82-84.60), cyamemazine (11; 15.48, 6.87-34.91), and olanzapine (189; 7.74, 6.45-9.30). Conclusions: This pharmacovigilance analysis of the FAERS found three potential signals of torsadogenicity for drugs not previously known to carry this risk.
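
    The reporting odds ratio used above is computed from a 2×2 contingency table of spontaneous reports (target event vs. other events, drug of interest vs. all other drugs), typically with a Wald-type 95% confidence interval. A minimal sketch of that calculation follows; the counts are purely illustrative, not FAERS figures, and the adjustment and stratification steps described in the abstract are not reproduced.

```python
# Reporting odds ratio (ROR) with a Wald 95% confidence interval.
import math

def reporting_odds_ratio(a, b, c, d):
    """a: target event reports with the drug, b: other reports with the drug,
    c: target event reports with all other drugs, d: other reports with other drugs."""
    ror = (a / b) / (c / d)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(ror) - 1.96 * se_log)
    upper = math.exp(math.log(ror) + 1.96 * se_log)
    return ror, lower, upper

print(reporting_odds_ratio(25, 975, 4800, 995_000))  # illustrative counts only
```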

    Argumentation in school science: Breaking the tradition of authoritative exposition through a pedagogy that promotes discussion and reasoning

    The value of argumentation in science education has become internationally recognised and has been the subject of many research studies in recent years. Successful introduction of argumentation activities in learning contexts involves extending teaching goals beyond the understanding of facts and concepts to include an emphasis on cognitive and metacognitive processes, epistemic criteria and reasoning. The authors focus on the difficulties inherent in shifting a tradition of teaching from one dominated by authoritative exposition to one that is more dialogic, involving small-group discussion based on tasks that stimulate argumentation. The paper builds on previous research on enhancing the quality of argument in school science to examine how argumentation activities have been designed, with appropriate strategies, resources and modelling, for pedagogical purposes. The paper analyses design frameworks, their contexts and lesson plans to evaluate their potential for enhancing reasoning through foregrounding the processes of argumentation. Examples of classroom dialogue where teachers adopt the frameworks and plans are analysed to show how argumentation processes are scaffolded. The analysis shows that several layers of interpretation are needed and that these layers must be aligned for successful implementation. The analysis serves to highlight the potential and limitations of the design frameworks.

    The development and application of a new tool to assess the adequacy of the content and timing of antenatal care

    Background: Current measures of antenatal care use are limited to initiation of care and number of visits. This study aimed to describe the development and application of a tool to assess the adequacy of the content and timing of antenatal care. Methods: The Content and Timing of care in Pregnancy (CTP) tool was developed based on clinical relevance for ongoing antenatal care and on recommendations in national and international guidelines. The tool reflects the minimal care recommended in every pregnancy, regardless of parity or risk status. CTP measures timing of initiation of care, content of care (number of blood pressure readings, blood tests and ultrasound scans) and whether the interventions were received at an appropriate time. Antenatal care trajectories for 333 pregnant women were then described using a standard tool (the APNCU index), which measures only the quantity of care, and the new CTP tool. Both tools categorise care into four categories, from ‘Inadequate’ (both tools) to ‘Adequate plus’ (APNCU) or ‘Appropriate’ (CTP). Participants recorded the timing and content of their antenatal care prospectively using diaries. The analysis included an examination of similarities and differences in the categorisation of care episodes between the tools. Results: According to the CTP tool, the care trajectory of 10.2% of the women was classified as inadequate, 8.4% as intermediate, 36% as sufficient and 45.3% as appropriate. The assessment of quality of care differed significantly between the two tools. Seventeen care trajectories classified as ‘Adequate’ or ‘Adequate plus’ by the APNCU were deemed ‘Inadequate’ by the CTP. This suggests that, despite a high number of visits, these women did not receive the minimal recommended content and timing of care. Conclusions: The CTP tool provides a more detailed assessment of the adequacy of antenatal care than the current standard index. However, guidelines for the content of antenatal care vary, and the tool does not at the moment grade over-use of interventions as ‘Inappropriate’. Further work needs to be done to refine the content items prior to larger-scale testing of the impact of the new measure.
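
    The comparison between the two tools described above amounts to cross-tabulating how the same care trajectories are categorised by each index. A small, hypothetical sketch of that kind of comparison is shown below; the category labels follow the abstract, but the individual assignments are invented for illustration.

```python
# Cross-tabulation of care-trajectory categories assigned by two tools.
import pandas as pd

trajectories = pd.DataFrame({
    "APNCU": ["Adequate plus", "Adequate", "Intermediate", "Inadequate", "Adequate"],
    "CTP":   ["Appropriate", "Inadequate", "Sufficient", "Inadequate", "Appropriate"],
})

# Rows: APNCU category; columns: CTP category; cells: number of women.
print(pd.crosstab(trajectories["APNCU"], trajectories["CTP"]))
```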