
    The spectral evolution of impulsive solar X-ray flares

    The time evolution of the spectral index and the non-thermal flux in 24 impulsive solar hard X-ray flares of GOES class M was studied in RHESSI observations. The high spectral resolution allows a clean separation of the thermal and non-thermal components in the 10-30 keV range, where most of the non-thermal photons are emitted. The spectral index and flux can thus be determined with much better accuracy than before. The spectral soft-hard-soft behavior in the rise-peak-decay phases is found not only in the general flare development, but even more pronouncedly in subpeaks. An empirically found power-law dependence between the spectral index and the normalization of the non-thermal flux holds during the rise and decay phases of the emission peaks, and it is still present in the combined set of all flares. We find an asymmetry in this dependence between the rise and decay phases of the non-thermal emission. There is no delay between the flux peak and the spectral index minimum. The soft-hard-soft behavior appears to be an intrinsic signature of the elementary electron acceleration process. (Comment: 10 pages, 7 figures. Accepted for publication by A&)
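    The empirical power-law dependence between spectral index and flux normalization described above is linear in log-log space, so it can be recovered by simple linear regression. The sketch below illustrates this with purely synthetic numbers (the relation gamma = A * F^(-alpha), the values of A and alpha, and the noise level are all made up for illustration; they are not RHESSI measurements).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "observations": a power-law relation between the photon
    # spectral index gamma and the non-thermal flux normalization F,
    #   gamma = A * F**(-alpha),
    # with a small multiplicative scatter. A and alpha are illustrative.
    true_alpha, true_A = 0.2, 6.0
    flux = np.logspace(0, 2, 50)                  # flux normalization (arb. units)
    gamma = true_A * flux**(-true_alpha)
    gamma *= rng.normal(1.0, 0.01, flux.size)     # 1% multiplicative scatter

    # A power law is a straight line in log-log space:
    #   log(gamma) = log(A) - alpha * log(F)
    slope, intercept = np.polyfit(np.log(flux), np.log(gamma), 1)
    alpha_fit, A_fit = -slope, np.exp(intercept)

    print(f"alpha = {alpha_fit:.3f}, A = {A_fit:.3f}")
    ```

    With clean data the fit recovers the input exponent closely; in practice one would fit the rise and decay phases separately, since the abstract reports an asymmetry between them.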

    A Comment on “Is Information Systems a Science?”

    In this paper, we respond to McBride’s (2018) paper on whether information systems is a science. We first argue that information systems is indeed a science in that it draws from and creates knowledge in a form similar to many other disciplines, including psychology, sociology, mathematics, economics, computer science, and engineering. We counter the flawed logic of methodological extremists who believe that their approach represents the best or only path to knowledge. Specifically, we argue that many different methods of inquiry and discovery are appropriate in information systems and that each has its strengths and weaknesses.

    B743: Primary Health Care and the Developmentally Disabled: An Analysis of the Normalization Principle in the State of Maine

    At the time of publication, an estimated 10 million American people were defined as developmentally disabled. Stimulated in part by the often-observed dehumanizing environment of institutional arrangements for the mentally disabled, the search for more humane treatment and management alternatives has pointed in the direction of what has been termed 'normalization.' In 1969, the Danish Mental Retardation Service defined normalization as "letting the mentally retarded obtain an existence as close to normal as possible." The focus of this study is on barriers to the normalization principle in the provision of primary health care to the developmentally disabled in the State of Maine. Possible barriers include attitudes toward the developmentally disabled, the accessibility and quality of community-based services, and the lack of viable coordination mechanisms. Since 1971, the Maine Department of Mental Health and Corrections has made a concerted effort to encourage services based upon the principle of normalization. As pressures for normalization intensify, it seems warranted that those community-based structures which carry out the concept be examined as to their receptivity and the feasibility of further efforts in this direction. Although the principle of normalization has demonstrated its usefulness and potential, it is not without its limitations (Mesibov 1976). This study made no attempt to examine these limitations of the principle itself.

    On Silicon Group Elements Ejected by Supernovae Type Ia

    There is compelling evidence that the peak brightness of a Type Ia supernova is affected by the electron fraction Ye at the time of the explosion. The electron fraction is set by the aboriginal composition of the white dwarf and by the reactions that occur during the pre-explosive convective burning. To date, determining the makeup of the white dwarf progenitor has relied on indirect proxies, such as the average metallicity of the host stellar population. In this paper, we present analytical calculations supporting the idea that the electron fraction of the progenitor systematically influences the nucleosynthesis of silicon-group ejecta in Type Ia supernovae. In particular, we suggest that the abundances generated in quasi nuclear statistical equilibrium are preserved during the subsequent freezeout. This allows the potential recovery of Ye at explosion from the abundances recovered from an observed spectrum. We show that measurements of the 28Si, 32S, 40Ca, and 54Fe abundances can be used to construct Ye in the silicon-rich regions of the supernova. If these four abundances are determined exactly, they are sufficient to recover Ye to 6 percent. This is because these isotopes dominate the composition of silicon-rich and iron-rich material in quasi nuclear statistical equilibrium. Analytical analysis shows that the 28Si abundance is insensitive to Ye, the 32S abundance has a nearly linear trend with Ye, and the 40Ca abundance has a nearly quadratic trend with Ye. We verify these trends with post-processing of 1D models and show that these trends are reflected in model synthetic spectra. (Comment: Submitted to the Ap)
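    The electron fraction of a composition follows from the standard definition Ye = sum_i (Z_i / A_i) X_i, where X_i are mass fractions. The sketch below evaluates this for the four isotopes the abstract singles out; the mass fractions used are made-up illustrative values, not the paper's model output, and the paper's actual inversion relies on the reported abundance trends rather than this direct sum.

    ```python
    # (Z, A) for the four isotopes named in the abstract
    isotopes = {
        "28Si": (14, 28),
        "32S":  (16, 32),
        "40Ca": (20, 40),
        "54Fe": (26, 54),
    }

    def electron_fraction(mass_fractions):
        """Ye = sum over species of (Z/A) * X_i for the listed isotopes."""
        return sum(z / a * mass_fractions[name]
                   for name, (z, a) in isotopes.items())

    # Illustrative silicon-rich composition (mass fractions sum to 1 here)
    X = {"28Si": 0.55, "32S": 0.30, "40Ca": 0.05, "54Fe": 0.10}
    ye = electron_fraction(X)
    print(f"Ye = {ye:.4f}")  # slightly below 0.5
    ```

    Note that 28Si, 32S, and 40Ca all have Z/A exactly 0.5, while 54Fe has Z/A of about 0.4815; it is the neutron-rich 54Fe component that pulls Ye below 0.5, which is why its abundance is a sensitive tracer of the neutron excess.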

    Replication research: opportunities, experiences and challenges

    Replication is one of the main principles of the scientific method. In the physical sciences, new knowledge is often not considered valid until the original study has been replicated in other labs and the original results are not refuted. Replication will either improve confidence in our research findings or identify important boundary conditions. Replications also enhance various scientific processes and offer methodological and educational improvements. The purpose of this panel is twofold: first, to explore the opportunities for scientific development that replication research enables by reflecting on the experiences of encouraging, doing, and publishing replication studies; second, to explore the various challenges that replication research raises about its value to individual scholars as well as to our collective understanding of phenomena within the information systems field.

    Clinical and hemodynamic follow-up of left ventricular to aortic conduits in patients with aortic stenosis

    To assess the long-term results of left ventricular outflow tract reconstruction utilizing an apical left ventricular to aortic valved (porcine) conduit, the clinical and hemodynamic data were reviewed from 24 patients who had placement of an apico-aortic conduit. Eighteen of the patients are asymptomatic and taking no cardiac medications. Three patients were reoperated on: one patient 1.5 years after his original operation for subacute bacterial endocarditis, and two patients 3 to 4 years after their original operation for severe conduit valve insufficiency. None of the patients is taking anticoagulants, and no thromboembolic events have occurred. Postoperative catheterization has been performed 1 to 1.5 years (mean 1.2) after repair in 15 of 21 patients. The resting left ventricular outflow tract gradient has decreased from 102.5 ± 20 mm Hg preoperatively to 14.8 ± 9.9 mm Hg postoperatively (probability [p] < 0.001). Some degree of conduit obstruction was demonstrated by catheter passage in 11 of the 15 patients. In these 11 patients, the obstruction occurred at three distinct sites: at the egress of the left ventricle in 9, at the porcine valve in 5, and at the aortic to conduit junction in 1. Isometric exercise in five patients and supine bicycle exercise in six patients increased the left ventricular outflow tract gradient by 2.5 ± 1.1 and 20.8 ± 11.8 mm Hg, respectively, despite an increase in cardiac index of 1 ± 0.3 and 3.7 ± 0.4 liters/min per m2, respectively. The data suggest that a left ventricular to aortic conduit is an effective form of therapy for severe left ventricular outflow tract obstruction.

    Multi-wavelength analysis of high energy electrons in solar flares: a case study of August 20, 2002 flare

    A multi-wavelength spatial and temporal analysis of solar high energy electrons is conducted using the August 20, 2002 flare, which had an unusually flat (gamma=1.8) hard X-ray spectrum. The flare is studied using RHESSI, Halpha, radio, TRACE, and MDI observations with advanced methods and techniques never previously applied in the solar flare context. A new method to account for X-ray Compton backscattering in the photosphere (photospheric albedo) has been used to deduce the primary X-ray flare spectra. The mean electron flux distribution has been analysed using both forward-fitting and model-independent inversion methods of spectral analysis. We show that the contribution of the photospheric albedo to the photon spectrum modifies the calculated mean electron flux distribution, mainly at energies below 100 keV. The positions of the Halpha emission and hard X-ray sources with respect to the current-free extrapolation of the MDI photospheric magnetic field, together with the characteristics of the radio emission, provide evidence of the closed geometry of the magnetic field structure and of the flare process occurring in low-altitude magnetic loops. In agreement with the predictions of some solar flare models, the hard X-ray sources are located on the external edges of the Halpha emission and show chromospheric plasma heated by the non-thermal electrons. The fast changes of Halpha intensities are located not only inside the hard X-ray sources, as expected if they are the signatures of the chromospheric response to the electron bombardment, but also away from them. (Comment: 26 pages, 9 figures, accepted to Solar Physic)