The implementation and use of Ada on distributed systems with reliability requirements
The issues involved in the use of the programming language Ada on distributed systems are discussed. The effects of Ada programs on hardware failures such as loss of a processor are emphasized. It is shown that many Ada language elements are not well suited to this environment. Processor failure can easily lead to difficulties on those processors which remain. As an example, the calling task in a rendezvous may be suspended forever if the processor executing the serving task fails. A mechanism for detecting failure is proposed and changes to the Ada run time support system are suggested which avoid most of the difficulties. Ada program structures are defined which allow programs to reconfigure and continue to provide service following processor failure
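The rendezvous hazard described above (a caller suspended forever when the serving task's processor fails) can be illustrated with a bounded wait. Below is a minimal Python sketch, not Ada and not the paper's actual mechanism; the names `Entry`, `call`, and `accept` are hypothetical. A timeout stands in for the proposed failure-detection mechanism:

```python
import queue
import threading

class RemoteTaskFailure(Exception):
    """Raised when the serving task does not respond within the timeout."""

class Entry:
    """Toy stand-in for an Ada entry: callers block until the server accepts.

    The bounded wait in call() plays the role of a failure-detection
    mechanism: instead of suspending forever when the serving processor
    fails, the caller raises RemoteTaskFailure and can reconfigure.
    """
    def __init__(self):
        self._requests = queue.Queue()

    def call(self, payload, timeout):
        reply = queue.Queue(maxsize=1)
        self._requests.put((payload, reply))
        try:
            return reply.get(timeout=timeout)  # bounded wait, never forever
        except queue.Empty:
            raise RemoteTaskFailure("serving task presumed failed")

    def accept(self):
        """One rendezvous: take a request, perform a trivial service, reply."""
        payload, reply = self._requests.get()
        reply.put(payload * 2)

# A live server completes the rendezvous...
live = Entry()
threading.Thread(target=live.accept, daemon=True).start()
result = live.call(21, timeout=1.0)  # returns 42

# ...while a failed (never-accepting) server is detected by the timeout.
dead = Entry()
try:
    dead.call(21, timeout=0.2)
    detected = False
except RemoteTaskFailure:
    detected = True
```

In Ada itself the analogous construct is a timed entry call (`select ... or delay ...`), though, as the abstract notes, language-level changes were judged necessary to handle processor failure cleanly.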
The implementation and use of Ada on distributed systems with high reliability requirements
The use and implementation of Ada in distributed environments in which reliability is the primary concern were investigated. In particular, the concept that a distributed system may be programmed entirely in Ada so that the individual tasks of the system are unconcerned with which processors they are executing on, and that failures may occur in the software or underlying hardware was examined. Progress is discussed for the following areas: continued development and testing of the fault-tolerant Ada testbed; development of suggested changes to Ada so that it might more easily cope with the failure of interest; and design of new approaches to fault-tolerant software in real-time systems, and integration of these ideas into Ada
Molecular Line Emission Towards High-Mass Clumps: The MALT90 Catalogue
The Millimetre Astronomy Legacy Team 90 GHz (MALT90) survey aims to characterise the physical and chemical evolution of high-mass clumps. Recently completed, it mapped 90 GHz line emission toward 3246 high-mass clumps identified from the ATLASGAL 870 μm Galactic plane survey. By utilising the broad frequency coverage of the Mopra telescope's spectrometer, maps in 16 different emission lines were obtained simultaneously. Here we describe the first line catalogue of the detected emission, generated by Gaussian profile fitting to spectra extracted toward each clump's dust peak. Synthetic spectra show that the catalogue has a completeness of >95%, a probability of a false-positive detection of <0.3%, and a relative uncertainty in the measured quantities of <20% over the range of detection criteria. We find that the detection rates are highest for the (1-0) molecular transitions of HCO+, HNC, N2H+, and HCN (72-88%). The majority of clumps (~95%) are detected in at least one of the molecular transitions, just under half of the clumps (~48%) are detected in 4 or more of the transitions, while only 2 clumps are detected in 13 or more transitions. We find several striking trends in the ensemble of properties for the different molecular transitions when plotted as a function of the clumps' evolutionary state. In particular, the optically thickest HCO+ emission shows a 'blue-red asymmetry' indicating overall collapse that monotonically decreases as the clumps evolve. This catalogue represents the largest compiled database of molecular line emission toward high-mass clumps and is a valuable data set for detailed studies of these objects
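The catalogue above is built by Gaussian profile fitting to spectra extracted at each clump's dust peak. A minimal sketch of the idea, not the survey's actual pipeline, using intensity-weighted moments (a common first estimate before nonlinear least-squares fitting) on a synthetic single-line spectrum:

```python
import math

def gaussian(v, amp, v0, sigma):
    """Gaussian line profile: T(v) = amp * exp(-(v - v0)^2 / (2 sigma^2))."""
    return amp * math.exp(-((v - v0) ** 2) / (2.0 * sigma ** 2))

def moment_fit(velocities, intensities):
    """Estimate (amp, v0, sigma) from intensity-weighted moments.

    Assumes a single, roughly Gaussian line with negligible baseline;
    real pipelines refine these estimates with least-squares fitting.
    """
    total = sum(intensities)
    v0 = sum(v * t for v, t in zip(velocities, intensities)) / total
    var = sum(((v - v0) ** 2) * t for v, t in zip(velocities, intensities)) / total
    sigma = math.sqrt(var)
    amp = max(intensities)
    return amp, v0, sigma

# Synthetic HCO+-like line: amplitude 2.0 K, centred at 40 km/s, sigma 1.5 km/s
vel = [30 + 0.1 * i for i in range(201)]
spec = [gaussian(v, 2.0, 40.0, 1.5) for v in vel]
amp, v0, sigma = moment_fit(vel, spec)  # recovers the input parameters
```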
Evaluation of outreach services for primary care and mental health; assessing the impact
Objectives: This paper reports an evaluation, carried out for London Health Libraries, of the impact of outreach services to primary care and mental health workers in thirteen different settings. The main aims of the project were to identify the impact being made by the service, and to produce best practice guidelines for outreach services in this kind of ‘difficult’ community setting.
Methods: Methods used were: analysis of documents (all 13 services); analysis of any evaluation already performed by or for the service (all 13 services); interviews with outreach librarians (11 services); questionnaire survey of a representative sample of users (8 services, with 66 returned questionnaires, 35% response rate). The services evaluated were very diverse, in terms of setting, structure, functions and activities, and extent and nature of self-evaluation and reporting. The evaluation was therefore largely qualitative, in order to deal with the lack of a consistent ‘template’ for analysis. Emphasis was placed on trying to identify critical incidents, where it could be shown unambiguously that the outreach services made a difference to practice.
Study limitations included the difficulty of summarising and comparing very different situations and diverse services, difficulty in identifying critical incidents, and an inability to study ‘non-users’.
Findings: Service recipients felt better informed, more up-to-date, more aware of resources, more confident and supported in their work, and saved time. Services contributed to a richer information environment. Direct impacts, such as demonstrably improved patient care and cost savings, were more difficult to establish
X-ray and radio observations of central black holes in nearby low-mass early-type galaxies: Preliminary evidence for low Eddington fractions
We present new radio and X-ray observations of two nearby low-mass early-type
galaxies with dynamically-confirmed central black holes: NGC 5102 and NGC 205.
NGC 5102 shows a weak nuclear X-ray source and has no core radio emission.
However, for the first time we demonstrate that it shows luminous extended
radio continuum emission in low-resolution, low-frequency data, consistent
with jet lobes on pc scales formed from
past accretion and jet activity. By contrast, in new, extremely deep,
strictly-simultaneous Very Large Array and Chandra observations, no radio or
X-ray emission is detected from the black hole in NGC 205. We consider these
measurements and upper limits in the context of the few other low-mass
early-type galaxies with dynamically-confirmed black holes, and show that the
mean ratio of bolometric to Eddington luminosity in this sample is very low. These Eddington
ratios are lower than typical in a comparison sample of more massive early-type
galaxies, though this conclusion is quite tentative due to our small sample of
low-mass galaxies and potential biases in the comparison sample. This
preliminary result is in mild tension with previous work using less sensitive
observations of more distant galaxies, which predict higher X-ray luminosities
than we observe for low-mass galaxies. If it is confirmed that central black
holes in low-mass galaxies typically have low Eddington ratios, this presents a
challenge to measuring the occupation fraction of central black holes with
standard optical emission line, X-ray, or radio surveys.
Comment: 13 pages, 4 figures, 3 tables. Accepted for publication in ApJ
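The Eddington ratio discussed above is L_bol / L_Edd, where the Eddington luminosity for hydrogen is L_Edd ≈ 1.26 × 10^38 (M_BH / M_sun) erg/s. A minimal sketch of the calculation; the input values below are hypothetical, not measurements from this paper:

```python
L_EDD_COEFF = 1.26e38  # erg/s per solar mass (Thomson scattering, hydrogen)

def eddington_ratio(l_bol_erg_s, m_bh_msun):
    """Bolometric luminosity as a fraction of the Eddington luminosity."""
    return l_bol_erg_s / (L_EDD_COEFF * m_bh_msun)

# Hypothetical example: a 1e6 Msun central black hole radiating 1e38 erg/s
ratio = eddington_ratio(1e38, 1e6)  # ~8e-7: deeply sub-Eddington
```

Ratios this far below unity are what make such nuclei hard to pick out with standard optical, X-ray, or radio surveys.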
Ionisation-induced star formation III: Effects of external triggering on the IMF in clusters
We report on Smoothed Particle Hydrodynamics (SPH) simulations of the impact
of irradiation by an external source of ionizing photons on a turbulent
star-forming molecular cloud. We find that the
ionizing radiation has a significant effect on the gas morphology, but plays
a less important role in triggering star formation. The rate and morphology of star formation
are largely governed by the structure in the gas generated by the turbulent
velocity field, and feedback has no discernible effect on the stellar initial
mass function. Although many young stars are to be found in dense gas located
near an ionization front, most of these objects also form when feedback is
absent. Ionization has a stronger effect in diffuse regions of the cloud by
sweeping up low-density gas that would not otherwise form stars into
gravitationally unstable clumps. However, even in these regions, dynamical
interactions between the stars rapidly erase the correlations between their
positions and velocities and those of the ionization front.
Comment: 12 pages, 16 figures (some downgraded to fit on astro-ph), accepted
for publication in MNRAS
Compliance assessment of ambulatory Alzheimer patients to aid therapeutic decisions by healthcare professionals
Background: Compliance represents a major determinant of the effectiveness of pharmacotherapy. Compliance reports summarising electronically compiled compliance data qualify healthcare needs and can be utilised as part of a compliance-enhancing intervention. Nevertheless, evidence-based information on what constitutes a sufficient level of compliance is scarce, complicating the interpretation of compliance reports. The purpose of our pilot study was to determine the compliance of ambulatory Alzheimer patients to antidementia drugs under routine therapeutic use, using electronic monitoring. In addition, the forgiveness of donepezil (i.e. its ability to sustain adequate pharmacological response despite suboptimal compliance) was characterised, and we aimed to develop evidence-based guidance for the interpretation of compliance reports.
Methods: We determined the compliance to four different antidementia drugs by electronic monitoring in 31 patients over six months. All patients were recruited from the gerontopsychiatric clinic of a university hospital as part of a pilot study. The so-called medication event monitoring system (MEMS) was employed, consisting of a vial with a microprocessor in the lid which records the time (date, hour, minute) of every opening. Daily compliance served as the primary outcome measure, defined as the percentage of days with correctly administered doses of medication. In addition, the pharmacokinetics and pharmacodynamics of donepezil were simulated to systematically assess therapeutic undersupply, also incorporating study compliance patterns. Statistical analyses were performed with SPSS and Microsoft Excel.
Results: Median daily compliance was 94% (range 48%-99%). Ten patients (32%) were non-compliant for at least one month. One-sixth of patients taking donepezil displayed periods of therapeutic undersupply. For 10 mg and 5 mg donepezil once-daily dosing, the estimated forgiveness of donepezil was 80% and 90% daily compliance, or two and one dosage omissions at steady state, respectively. Based on the simulation findings we developed rules for the evidence-based interpretation of donepezil compliance reports.
Conclusions: Compliance in ambulatory Alzheimer patients was for the first time assessed under routine conditions using electronic monitoring. On average, compliance was relatively high but variable between patients. The approach of pharmacokinetic/pharmacodynamic in silico simulations was suitable for characterising the forgiveness of donepezil, suggesting evidence-based recommendations for the interpretation of compliance reports.
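The primary outcome above, daily compliance, is the percentage of days with correctly administered doses, computed from MEMS opening timestamps. A minimal sketch for a once-daily regimen with hypothetical data; the study's actual MEMS analysis is more involved:

```python
from datetime import date, datetime

def daily_compliance(openings, start, end, doses_per_day=1):
    """Percentage of days in [start, end] with exactly the prescribed
    number of vial openings (MEMS events)."""
    counts = {}
    for ts in openings:
        d = ts.date()
        if start <= d <= end:
            counts[d] = counts.get(d, 0) + 1
    n_days = (end - start).days + 1
    correct = sum(1 for c in counts.values() if c == doses_per_day)
    return 100.0 * correct / n_days

# Hypothetical 5-day record: the dose on Jan 3 is missed
events = [datetime(2024, 1, 1, 8), datetime(2024, 1, 2, 8),
          datetime(2024, 1, 4, 9), datetime(2024, 1, 5, 8)]
pct = daily_compliance(events, date(2024, 1, 1), date(2024, 1, 5))  # 80.0
```

Against the forgiveness estimates above, 80% daily compliance would sit right at the estimated threshold for the 10 mg once-daily regimen.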
Characteristics of Fibromyalgia Independently Predict Poorer Long‐Term Analgesic Outcomes Following Total Knee and Hip Arthroplasty
Peer reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/111198/1/art39051.pd
On Tackling the Limits of Resolution in SAT Solving
The practical success of Boolean Satisfiability (SAT) solvers stems from the
CDCL (Conflict-Driven Clause Learning) approach to SAT solving. However, from a
propositional proof complexity perspective, CDCL is no more powerful than the
resolution proof system, for which many hard examples exist. This paper
proposes a new problem transformation, which enables reducing the decision
problem for formulas in conjunctive normal form (CNF) to the problem of solving
maximum satisfiability over Horn formulas. Given the new transformation, the
paper proves a polynomial bound on the number of MaxSAT resolution steps for
pigeonhole formulas. This result is in clear contrast with earlier results on
the length of proofs of MaxSAT resolution for pigeonhole formulas. The paper
also establishes the same polynomial bound in the case of modern core-guided
MaxSAT solvers. Experimental results, obtained on CNF formulas known to be hard
for CDCL SAT solvers, show that these can be efficiently solved with modern
MaxSAT solvers
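The pigeonhole formulas discussed above encode the claim that n+1 pigeons fit into n holes with no hole shared; they are unsatisfiable, and their resolution proofs are exponentially long, which is what makes them hard for CDCL solvers. A minimal generator of these CNF formulas as signed-integer clause lists (a sketch of the standard encoding, not the paper's transformation):

```python
from itertools import combinations

def pigeonhole_cnf(n):
    """CNF for PHP over n+1 pigeons and n holes.

    Variable var(p, h) is true iff pigeon p sits in hole h.
    Returns a list of clauses, each a list of signed integer literals
    (DIMACS-style). Unsatisfiable for every n >= 1.
    """
    def var(p, h):
        return p * n + h + 1  # pigeons p in 0..n, holes h in 0..n-1

    clauses = []
    for p in range(n + 1):            # every pigeon is placed in some hole
        clauses.append([var(p, h) for h in range(n)])
    for h in range(n):                # no hole holds two pigeons
        for p1, p2 in combinations(range(n + 1), 2):
            clauses.append([-var(p1, h), -var(p2, h)])
    return clauses

cnf = pigeonhole_cnf(3)  # 4 pigeons, 3 holes: 4 + 3*C(4,2) = 22 clauses
```

Feeding such instances to a CDCL solver versus a core-guided MaxSAT solver (after a transformation of the kind the paper proposes) exhibits the gap the experiments measure.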