
    The Radio to Gamma Ray Connection of EGRET Blazars: Correlation, Regression and Monte Carlo Analysis

    A comprehensive statistical analysis of the broadband properties of EGRET blazars is presented. The analysis covers sources identified as blazars in the Sowards-Emmerd publications. Using this sample of 122 sources, we find a relationship $L_\gamma \propto L_r^{0.77 \pm 0.03}$, a correlation between $\alpha_{og}$ and $\alpha_{ro}$, and a correlation between radio luminosity and $\alpha_{og}$. Using Monte Carlo simulations, we can reproduce the observed luminosity relationship if a synchrotron self-Compton model is assumed, but not if an external Compton scattering model is assumed; these differences are primarily due to beaming effects. In addition, we find that the intrinsic radio luminosity of the parent sample falls in the range $10^{21} < L < 10^{30}\,\mathrm{W\,Hz^{-1}}$ and that the bulk Lorentz factors of the sources are in the range $1 < \Gamma < 30$, in agreement with VLBI observations. Finally, we discuss implications for GLAST, successfully launched in June 2008.
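
    A minimal sketch (not the authors' code) of how such a power-law luminosity relation can be estimated: fitting a straight line in log-log space makes the slope the exponent in $L_\gamma \propto L_r^{k}$. The synthetic sample and the assumed true slope of 0.77 below are placeholders for illustration only.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-in for the 122-source sample (illustrative only).
        log_Lr = rng.uniform(24, 28, size=122)                      # log10 radio luminosity [W/Hz]
        log_Lg = 0.77 * log_Lr + 10 + rng.normal(0, 0.3, size=122)  # assumed relation plus scatter

        # Ordinary least squares in log-log space: the slope is the power-law exponent.
        slope, intercept = np.polyfit(log_Lr, log_Lg, 1)
        resid = log_Lg - (slope * log_Lr + intercept)
        s2 = np.sum(resid**2) / (len(log_Lr) - 2)
        slope_err = np.sqrt(s2 / np.sum((log_Lr - log_Lr.mean())**2))

        print(f"L_gamma ∝ L_r^({slope:.2f} ± {slope_err:.2f})")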

    Not fitting in and getting out: psychological type and congregational satisfaction among Anglican churchgoers in England

    Listening to the motivations reported by individuals for ceasing church attendance and becoming church leavers, Francis and Richter identified the sense of "not fitting in" as high on the list. Drawing on psychological type theory, several recent studies have documented the way in which some psychological types are over-represented in church congregations and other psychological types are under-represented. Bringing these two observations together, the present study tested the hypothesis that church congregations have created type-alike communities within which individuals displaying the opposite type preferences are more likely to feel marginalised and to display lower levels of satisfaction with the congregations they attend. Data were provided by 1867 churchgoers who completed a measure of psychological type, together with measures of frequency of attendance and congregational satisfaction. These data confirmed that congregations were weighted towards preferences for introversion, sensing, feeling and judging, and that individuals displaying the opposite preferences (especially intuition, thinking and perceiving) recorded lower levels of congregational satisfaction. The implications of these findings are discussed for promoting congregational retention by enhancing awareness of psychological type preferences among those who attend.
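
    A minimal sketch, on made-up numbers, of the kind of group comparison that underlies a finding like "opposite-preference churchgoers report lower satisfaction"; the group sizes, score scale and means are assumptions, not data from the study.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Hypothetical satisfaction scores: churchgoers whose type matches the congregation's
        # dominant preferences (introversion, sensing, feeling, judging) vs. the opposite types.
        matching = rng.normal(4.0, 0.8, size=1500)   # assumed mean satisfaction on a 1-5 scale
        opposite = rng.normal(3.7, 0.8, size=367)    # assumed lower mean for opposite preferences

        t, p = stats.ttest_ind(matching, opposite, equal_var=False)  # Welch's t-test
        print(f"Welch t = {t:.2f}, p = {p:.3g}")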

    Contribution of the Gamma-ray Loud Radio Galaxies Core Emissions to the Cosmic MeV and GeV Gamma-Ray Background Radiation

    The Fermi gamma-ray satellite has recently detected gamma-ray emission from radio galaxy cores. From these samples, we first examine the correlation between the luminosities of these gamma-ray loud radio galaxies at 5 GHz, $L_{\rm 5\,GHz}$, and at 0.1-10 GeV, $L_\gamma$. Based on a partial correlation analysis, we find that the correlation is significant, with $L_\gamma \propto L_{\rm 5\,GHz}^{1.16}$. Using this correlation and the radio luminosity function (RLF) of radio galaxies, we further explore the contribution of gamma-ray loud radio galaxies to the unresolved extragalactic gamma-ray background (EGRB). The gamma-ray luminosity function is obtained by normalizing the RLF to reproduce the source count distribution of the Fermi gamma-ray loud radio galaxies. We find that gamma-ray loud radio galaxies will explain ~25% of the unresolved Fermi EGRB flux above 100 MeV and will also make a significant contribution to the EGRB in the 1-30 MeV energy band. Since blazars explain 22% of the EGRB above 100 MeV, the radio-loud active galactic nucleus (AGN) population explains ~47% of the unresolved EGRB. We further discuss the origin of the EGRB. The observed EGRB spectrum at 0.2-100 GeV does not show an absorption signature from the extragalactic background light; thus, the dominant contributors to the EGRB at very high energies (>30 GeV) might be nearby gamma-ray emitting sources or sources with very hard gamma-ray spectra. Comment: 9 pages, accepted for publication in Ap
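
    A minimal sketch, not the paper's actual calculation, of the change of variables that maps a radio luminosity function into a gamma-ray luminosity function through a correlation of the form log L_gamma = a + 1.16 log L_5GHz; the toy RLF shape and the normalization a below are placeholders.

        import numpy as np

        # Toy radio luminosity function: number density per dex of log L_5GHz (placeholder shape).
        def rlf(log_Lr):
            return 1e-6 * 10 ** (-0.6 * (log_Lr - 24.0))

        slope, a = 1.16, 16.0   # slope from the quoted correlation; 'a' is a placeholder normalization

        # Change of variables: phi_gamma(log Lg) = phi_r(log Lr) / slope, with log Lr = (log Lg - a) / slope.
        def glf(log_Lg):
            log_Lr = (log_Lg - a) / slope
            return rlf(log_Lr) / slope

        # Consistency check: the density per dex is reduced by the Jacobian factor 1/slope.
        log_Lr0 = 24.0
        print(rlf(log_Lr0), glf(a + slope * log_Lr0))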

    Local Difference Measures between Complex Networks for Dynamical System Model Evaluation

    Acknowledgments: We thank Reik V. Donner for inspiring suggestions that initialized the work presented herein. Jan H. Feldhoff is credited for providing us with the STARS simulation data and for his contributions to fruitful discussions. Comments by the anonymous reviewers are gratefully acknowledged, as they led to substantial improvements of the manuscript.

    Speech and language therapy for aphasia following stroke

    Background: Aphasia is an acquired language impairment following brain damage that affects some or all language modalities: expression and understanding of speech, reading, and writing. Approximately one third of people who have a stroke experience aphasia.

    Objectives: To assess the effects of speech and language therapy (SLT) for aphasia following stroke.

    Search methods: We searched the Cochrane Stroke Group Trials Register (last searched 9 September 2015), CENTRAL (2015, Issue 5) and other Cochrane Library databases (CDSR, DARE, HTA, to 22 September 2015), MEDLINE (1946 to September 2015), EMBASE (1980 to September 2015), CINAHL (1982 to September 2015), AMED (1985 to September 2015), LLBA (1973 to September 2015), and SpeechBITE (2008 to September 2015). We also searched major trials registers for ongoing trials, including ClinicalTrials.gov (to 21 September 2015), the Stroke Trials Registry (to 21 September 2015), Current Controlled Trials (to 22 September 2015), and WHO ICTRP (to 22 September 2015). In an effort to identify further published, unpublished, and ongoing trials we also handsearched the International Journal of Language and Communication Disorders (1969 to 2005) and reference lists of relevant articles, and we contacted academic institutions and other researchers. There were no language restrictions.

    Selection criteria: Randomised controlled trials (RCTs) comparing SLT (a formal intervention that aims to improve language and communication abilities, activity and participation) versus no SLT; social support or stimulation (an intervention that provides social support and communication stimulation but does not include targeted therapeutic interventions); or another SLT intervention (differing in duration, intensity, frequency, intervention methodology or theoretical approach).

    Data collection and analysis: We independently extracted the data and assessed the quality of included trials. We sought missing data from investigators.

    Main results: We included 57 RCTs (74 randomised comparisons) involving 3002 participants in this review (some appearing in more than one comparison). Twenty-seven randomised comparisons (1620 participants) assessed SLT versus no SLT; SLT resulted in clinically and statistically significant benefits to patients' functional communication (standardised mean difference (SMD) 0.28, 95% confidence interval (CI) 0.06 to 0.49, P = 0.01), reading, writing, and expressive language, but (based on smaller numbers) benefits were not evident at follow-up. Nine randomised comparisons (447 participants) assessed SLT versus social support and stimulation; meta-analyses found no evidence of a difference in functional communication, but more participants withdrew from social support interventions than from SLT. Thirty-eight randomised comparisons (1242 participants) assessed two approaches to SLT. Functional communication was significantly better in people with aphasia who received therapy at a high intensity, high dose, or over a long duration compared with those who received therapy at a lower intensity, lower dose, or over a shorter period of time. The benefits of a high intensity or a high dose of SLT were confounded by a significantly higher dropout rate in these intervention groups. Generally, trials randomised small numbers of participants across a range of characteristics (age, time since stroke, and severity profiles), interventions, and outcomes.

    Authors' conclusions: Our review provides evidence of the effectiveness of SLT for people with aphasia following stroke in terms of improved functional communication, reading, writing, and expressive language compared with no therapy. There is some indication that therapy at high intensity, high dose or over a longer period may be beneficial, but high-intensity and high-dose interventions may not be acceptable to all.
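
    A minimal sketch, on made-up group summaries, of how a standardised mean difference and its 95% confidence interval (the effect measure quoted above, e.g. SMD 0.28, 95% CI 0.06 to 0.49) is typically computed for a single comparison; this is not the review's actual meta-analysis code, and the group sizes, means and SDs are assumptions.

        import math

        # Hypothetical summary data for one SLT vs. no-SLT comparison (illustrative only).
        n1, mean1, sd1 = 80, 24.0, 9.0   # SLT group: n, mean functional-communication score, SD
        n2, mean2, sd2 = 78, 21.5, 9.5   # no-SLT group

        # Pooled SD and Cohen's d, a common standardised mean difference.
        sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
        d = (mean1 - mean2) / sd_pooled

        # Approximate standard error of d and a 95% confidence interval.
        se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
        lo, hi = d - 1.96 * se, d + 1.96 * se
        print(f"SMD = {d:.2f}, 95% CI {lo:.2f} to {hi:.2f}")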

    Scientific Application Requirements for Leadership Computing at the Exascale

    The Department of Energy's Leadership Computing Facility, located at Oak Ridge National Laboratory's National Center for Computational Sciences, recently polled scientific teams that had large allocations at the center in 2007, asking them to identify computational science requirements for future exascale systems (capable of an exaflop, or 10^18 floating point operations per second). These requirements are necessarily speculative, since an exascale system will not be realized until the 2015-2020 timeframe, and are expressed where possible relative to a recent petascale requirements analysis of similar science applications [1]. Our initial findings, which call for further data collection, validation, and analysis, did in fact align with many of our expectations and existing petascale requirements, yet they also contained some surprises, complete with new challenges and opportunities. First and foremost, the breadth and depth of science prospects and benefits on an exascale computing system are striking. Without a doubt, they justify a large investment, even with its inherent risks. The possibilities for return on investment (by any measure) are too large to let us ignore this opportunity. The software opportunities and challenges are enormous. In fact, as one notable computational scientist put it, the scale of questions being asked at the exascale is tremendous and the hardware has gotten way ahead of the software. We are in grave danger of failing because of a software crisis unless concerted investments and coordinating activities are undertaken to reduce and close this hardware-software gap over the next decade. Key to success will be a rigorous requirement for natural mapping of algorithms to hardware in a way that complements (rather than competes with) compilers and runtime systems. The level of abstraction must be raised, and more attention must be paid to functionalities and capabilities that incorporate intent into data structures, are aware of memory hierarchy, possess fault tolerance, exploit asynchronism, and are power-consumption aware. On the other hand, we must also provide application scientists with the ability to develop software without having to become experts in the computer science components. Numerical algorithms are scattered broadly across science domains, with no one particular algorithm being ubiquitous and no one algorithm going unused. Structured grids and dense linear algebra continue to dominate, but other algorithm categories will become more common. A significant increase is projected for Monte Carlo algorithms, unstructured grids, sparse linear algebra, and particle methods, and a relative decrease is foreseen in fast Fourier transforms. These projections reflect the expectation of much higher architecture concurrency and the resulting need for very high scalability. The new algorithm categories that application scientists expect to be increasingly important in the next decade include adaptive mesh refinement, implicit nonlinear systems, data assimilation, agent-based methods, parameter continuation, and optimization. The attributes of leadership computing systems expected to increase most in priority over the next decade are (in order of importance) interconnect bandwidth, memory bandwidth, mean time to interrupt, memory latency, and interconnect latency. The attributes expected to decrease most in relative priority are disk latency, archival storage capacity, disk bandwidth, wide area network bandwidth, and local storage capacity.

    These choices by application developers reflect the expected needs of applications or the expected reality of available hardware. One interpretation is that the increasing priorities reflect the desire to increase computational efficiency to take advantage of increasing peak flops [floating point operations per second], while the decreasing priorities reflect the expectation that computational efficiency will not increase. Per-core requirements appear to be relatively static, while aggregate requirements will grow with the system. This projection is consistent with a relatively small increase in performance per core combined with a dramatic increase in the number of cores. Leadership system software must face and overcome issues that will undoubtedly be exacerbated at the exascale. The operating system (OS) must be as unobtrusive as possible and possess more stability, reliability, and fault tolerance during application execution. As applications will be more likely at the exascale to experience loss of resources during an execution, the OS must mitigate such a loss with a range of responses. New fault tolerance paradigms must be developed and integrated into applications. Just as application input and output must not be an afterthought in hardware design, job management, too, must not be an afterthought in system software design. Efficient scheduling of those resources will be a major obstacle faced by leadership computing centers at the exascale.
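
    A minimal sketch of the scaling point above (per-core requirements roughly static, aggregate requirements growing with the machine); the per-core figures and core counts are illustrative assumptions, not numbers from the report.

        # Illustrative only: aggregate requirements scale with core count while per-core
        # requirements stay roughly fixed (assumed per-core values, not from the report).
        per_core_mem_gb = 0.5        # assumed memory per core [GB]
        per_core_bw_gbs = 2.0        # assumed memory bandwidth per core [GB/s]

        for cores in (10**5, 10**6, 10**7):   # petascale-era to exascale-era core counts
            agg_mem_pb = cores * per_core_mem_gb / 1e6     # GB -> PB
            agg_bw_tbs = cores * per_core_bw_gbs / 1e3     # GB/s -> TB/s
            print(f"{cores:>10,} cores: ~{agg_mem_pb:6.1f} PB memory, ~{agg_bw_tbs:9.1f} TB/s bandwidth")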

    Photoferrotrophy: Remains of an ancient photosynthesis in modern environments

    © 2017 Camacho, Walter, Picazo and Zopfi. Photoferrotrophy, the process by which inorganic carbon is fixed into organic matter using light as an energy source and reduced iron [Fe(II)] as an electron donor, has been proposed as one of the oldest photoautotrophic metabolisms on Earth. Under the iron-rich (ferruginous) but sulfide-poor conditions dominating the Archean ocean, this type of metabolism could have accounted for most of the primary production in the photic zone. Here we review the current knowledge of biogeochemical, microbial and phylogenetic aspects of photoferrotrophy, and evaluate the ecological significance of this process in ancient and modern environments. From the ferruginous conditions that prevailed during most of the Archean, the ancient ocean evolved toward euxinic (anoxic and sulfide-rich) conditions and, finally, long after the advent of oxygenic photosynthesis, to a predominantly oxic environment. Under these new conditions photoferrotrophs lost importance as primary producers, and photoferrotrophy now remains a vestige of a formerly relevant photosynthetic process. Apart from the geological record and other biogeochemical markers, modern environments resembling the redox conditions of these ancient oceans can offer insights into the past significance of photoferrotrophy and help to explain how this metabolism operated as an important source of organic carbon for the early biosphere. Iron-rich meromictic (permanently stratified) lakes can be considered modern analogs of the ancient Archean ocean, as they present anoxic ferruginous water columns where light can still be available at the chemocline, thus offering suitable niches for photoferrotrophs. A few bacterial strains of purple bacteria as well as of green sulfur bacteria have been shown to possess photoferrotrophic capacities, and hence could thrive in these modern Archean ocean analogs. Studies addressing the occurrence and the biogeochemical significance of photoferrotrophy in ferruginous environments have so far been conducted in lakes Matano, Pavin, La Cruz, and Kabuno Bay of Lake Kivu. To date, a biogeochemical role of photoferrotrophs has been confirmed only in the latter two. In this review we critically summarize the current knowledge of iron-driven photosynthesis as a remnant of ancient Earth biogeochemistry.
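
    For reference, the overall stoichiometry commonly written for anoxygenic photoferrotrophy (light-driven carbon fixation with Fe(II) as the electron donor); this is the standard reaction cited in the literature rather than an equation taken from the article itself:

        % Commonly cited overall reaction of photoferrotrophy
        \[
          4\,\mathrm{Fe^{2+}} + \mathrm{CO_2} + 11\,\mathrm{H_2O}
          \;\xrightarrow{h\nu}\;
          \mathrm{[CH_2O]} + 4\,\mathrm{Fe(OH)_3} + 8\,\mathrm{H^+}
        \]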

    Investigating the relationships among the South Atlantic Magnetic Anomaly, southern nighttime midlatitude trough, and nighttime Weddell Sea Anomaly during southern summer

    This study utilized the multi-instrument data of the Defense Meteorological Satellite Program to investigate the evening/nighttime topside ionosphere during the 1996/1997 southern summer. A series of regional surface maps were constructed and permitted the tracking of the topside ionosphere's plasma density features, plasma composition, thermal structures, and vertical and horizontal plasma flows. These maps tracked a complete nighttime Weddell Sea Anomaly (WSA) and strong horizontal plasma flows that registered the high-conductivity regions of the South Atlantic Magnetic Anomaly (SAMA). These regions developed over the southeastern Pacific, just equatorward of the WSA, and over the South Atlantic. A heavy-ion stagnation trough developed poleward of the SAMA-affected regions; thus, the trough appeared on the WSA's equatorward side. During periods of increasing magnetic activity, the plasmapause was the WSA's poleward boundary. A statistical study modeled the trough's magnetic activity dependence and revealed a strong east-west hemispherical difference that was due to the SAMA effects. When the AE index was 0 nT, the trough appeared at (57.49 ± 2.82)°S (geomagnetic) over the southwestern hemisphere. Owing to the SAMA's special electrodynamic effects, the trough developed at lower latitudes, (42.39 ± 3.04)°S, over the southeastern hemisphere. Meanwhile, the plasmapause occurred at ~(62.5 ± 4)°S, and the WSA's peak appeared at ~(56.2 ± 4)°S. Hence, there was a ~20° (lat) separation between the trough and the plasmapause over the southeastern hemisphere. Between 210°E and 330°E (geographic), the WSA filled this gap. With increasing magnetic activity, the trough in the SAMA-affected regions moved poleward at a rate of (0.0157 ± 0.004)°S/nT. Elsewhere, it moved equatorward at a rate of (0.0196 ± 0.002)°S/nT.
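
    A minimal sketch of the linear magnetic-activity dependence quoted above, using the reported intercepts and slopes; reading the statistical model as lat(AE) = lat(0) + slope × AE is an assumption about its functional form.

        # Trough geomagnetic latitude (°S) vs. magnetic activity (AE, nT), using the values
        # quoted in the abstract; the linear form is an assumed reading of the model.

        def trough_lat_southeast(ae_nt):
            # SAMA-affected (southeastern) hemisphere: 42.39°S at AE = 0, moving poleward with AE.
            return 42.39 + 0.0157 * ae_nt

        def trough_lat_southwest(ae_nt):
            # Southwestern hemisphere: 57.49°S at AE = 0, moving equatorward with AE.
            return 57.49 - 0.0196 * ae_nt

        for ae in (0, 200, 500):
            print(f"AE = {ae:3d} nT: SE trough ~{trough_lat_southeast(ae):.1f}°S, "
                  f"SW trough ~{trough_lat_southwest(ae):.1f}°S")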