Information System Development for Web Based Routine Reporting of Pneumonia in Acute Respiratory Infection Control Program at Semarang District Health Office
Evaluation activities for the Acute Respiratory Tract Infection Disease Control program at Semarang District Health Office were not optimal because the information required from routine reports could not be used to support evaluation. The existing information system for routine pneumonia reporting had several problems: officers had difficulty changing or re-accessing pneumonia information, the resulting information was incomplete and unclear, and reports were not submitted on time. This research aimed to develop a web-based information system for routine pneumonia reporting in the Acute Respiratory Tract Infection Disease Control program at Semarang District Health Office.

The research design was pre-experimental with a one-group pretest-posttest approach. System development used the Framework for the Application of System Techniques (FAST) method. Subjects consisted of a system user at the District Health Office and four pilot health centers. Data were collected through observation, in-depth interviews, and a closed-ended questionnaire, and were analyzed using content analysis and the Wilcoxon test.

This research produced a web-based information system for routine pneumonia reporting in the Acute Respiratory Tract Infection Disease Control program that addresses the problems of the old system. The Wilcoxon test revealed significant differences in information quality in terms of ease of use (p=0.0001), completeness (p=0.0001), clarity (p=0.0001), and timeliness (p=0.0001) before and after the system was developed.

Semarang District Health Office is advised to commit to operating the new system optimally. Additionally, data reported from health centers to the District Health Office must be accurate and reflect conditions in the field.
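The paired pretest-posttest comparison with the Wilcoxon test can be sketched as follows; the Likert-scale scores below are hypothetical placeholders, not the study's data:

```python
from scipy.stats import wilcoxon

# Hypothetical paired questionnaire scores (1-5 Likert scale) for one
# information-quality aspect, rated before and after system development.
# Values are illustrative only, not taken from the study.
before = [2, 3, 2, 2, 3, 2, 1, 3, 2, 2]
after  = [4, 5, 4, 3, 5, 4, 4, 4, 5, 4]

# Wilcoxon signed-rank test on the paired differences
stat, p = wilcoxon(before, after)
print(f"W = {stat}, p = {p:.4f}")
```

With every rating improving after development, the signed-rank statistic is driven entirely by positive differences and the p-value falls well below 0.05, mirroring the direction of the study's reported results.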
Plasma Levels of Tumor Necrosis Factor-Alpha and Interleukin-6 in Obsessive Compulsive Disorder
Aim. Recent research has implicated an immune mechanism in the pathophysiology of obsessive-compulsive disorder (OCD). Despite increasing evidence of cytokine involvement in OCD, the results of studies are inconsistent. The aim of this study was to evaluate the plasma levels of the cytokines tumor necrosis factor-alpha (TNF-α) and interleukin-6 (IL-6) in OCD patients. Methods. Plasma concentrations of TNF-α and IL-6 were measured in 31 drug-free outpatients with OCD and 31 age- and sex-matched healthy controls. TNF-α and IL-6 concentrations in blood were determined by enzyme-linked immunosorbent assay (ELISA). Results. Both TNF-α and IL-6 levels showed statistically significant increases in OCD patients compared to controls (P < .000 and P < .001, resp.). In addition, age of onset was negatively correlated with TNF-α level (r = −.402, P = .025) and duration of illness was weakly correlated with IL-6 level (r = .357, P = .048) in the patient group. Conclusion. OCD patients showed increased TNF-α and IL-6 levels compared to healthy controls. This study provides evidence for alterations in proinflammatory cytokines, suggesting involvement of the immune system in the pathophysiology of OCD
Constitutional Analogies in the International Legal System
This Article explores issues at the frontier of international law and constitutional law. It considers five key structural and systemic challenges that the international legal system now faces: (1) decentralization and disaggregation; (2) normative and institutional hierarchies; (3) compliance and enforcement; (4) exit and escape; and (5) democracy and legitimacy. Each of these issues raises questions of governance, institutional design, and allocation of authority paralleling the questions that domestic legal systems have answered in constitutional terms. For each of these issues, I survey the international legal landscape and consider the salience of potential analogies to domestic constitutions, drawing upon and extending the writings of international legal scholars and international relations theorists. I also offer some preliminary thoughts about why some treaties and institutions, but not others, more readily lend themselves to analysis in constitutional terms. And I distinguish those legal and political issues that may generate useful insights for scholars studying the growing intersections of international and constitutional law from other areas that may be more resistant to constitutional analogies
Analysis of design parameters in SIL-4 safety-critical computer
Nowadays, safety-critical computers are extensively used in many civil domains, including transportation (railways, avionics, and automotive). We noticed that in some previous designs, critical safety parameters such as failure diagnostic coverage (DC) or the common cause failure (CCF) ratio have not been seriously taken into account. Moreover, in some cases safety has not been compared with standard safety levels (IEC 61508 SIL1-SIL4), or has not even met them. Most often, it is not clear which part of the system is the Achilles' heel and how the design can be improved to reach standard safety levels. Motivated by such design ambiguities, we study the effect of various design parameters on safety in two prevalent safety configurations, 1oo2 and 2oo3, with 1oo1 used as a reference. By employing Markov modeling, the sensitivity of safety to each of the following critical design parameters is analyzed: failure rate of the processing element, failure diagnostic coverage, common cause failures, and repair rates. This study gives a deeper sense of the influence of variations in design parameters on safety. Consequently, to meet an appropriate safety integrity level, instead of improving some system parts blindly, it becomes possible to make an informed decision on the more relevant parameters. © 2017 IEEE
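A minimal sketch of this kind of Markov sensitivity analysis, for the 1oo1 reference case only; all rates below are hypothetical, not taken from the paper, and the 1oo2/2oo3 models would add states for redundant channels and a CCF factor:

```python
import numpy as np

# Hypothetical parameters (illustrative only)
lam = 1e-5        # per-hour failure rate of the processing element
mu_r = 1 / 8.0    # repair rate for detected failures (8 h MTTR)
mu_p = 1 / 4380.0 # proof-test rate for undetected dangerous failures

def dangerous_unavailability(dc):
    """Steady-state probability of the dangerous-undetected state
    in a 3-state 1oo1 CTMC, as a function of diagnostic coverage."""
    # States: 0 = OK, 1 = failed-detected (safe), 2 = failed-undetected (dangerous)
    Q = np.array([
        [-lam,        lam * dc,  lam * (1 - dc)],
        [ mu_r,      -mu_r,      0.0           ],
        [ mu_p,       0.0,      -mu_p          ],
    ])
    # Solve pi Q = 0 with the normalization sum(pi) = 1
    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi[2]

for dc in (0.60, 0.90, 0.99):
    print(f"DC = {dc:.2f} -> dangerous unavailability = {dangerous_unavailability(dc):.2e}")
```

Sweeping DC this way shows how strongly the dangerous-undetected unavailability depends on diagnostic coverage, which is the kind of parameter sensitivity the paper analyzes across configurations.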
Impact of particles on the Planck HFI detectors: Ground-based measurements and physical interpretation
The Planck High Frequency Instrument (HFI) surveyed the sky continuously from August 2009 to January 2012. Its noise and sensitivity performance were excellent, but the rate of cosmic ray impacts on the HFI detectors was unexpectedly high. Furthermore, collisions of cosmic rays with the focal plane produced transient signals in the data (glitches) with a wide range of characteristics. A study of cosmic ray impacts on the HFI detector modules has been undertaken to categorize and characterize the glitches, to correct the HFI time-ordered data, and to understand the residual effects on Planck maps and data products. This paper presents an evaluation of the physical origins of glitches observed by the HFI detectors. In order to better understand the glitches observed by HFI in flight, several ground-based experiments were conducted with flight-spare HFI bolometer modules. The experiments were conducted between 2010 and 2013 with HFI test bolometers in different configurations using varying particles and impact energies. The bolometer modules were exposed to 23 MeV protons from the Orsay IPN TANDEM accelerator, and to Am and Cm α-particle and Fe radioactive X-ray sources. The calibration data from the HFI ground-based preflight tests were used to further characterize the glitches and compare glitch rates with statistical expectations under laboratory conditions. Test results provide strong evidence that the dominant family of glitches observed in flight are due to cosmic ray absorption by the silicon die substrate on which the HFI detectors reside. Glitch energy is propagated to the thermistor by ballistic phonons, while there is also a thermal diffusion contribution. The implications of these results for future satellite missions, especially those in the far-infrared to sub-millimetre and millimetre regions of the electromagnetic spectrum, are discussed.
Extracorporeal Membrane Oxygenation for Graft Dysfunction Early After Heart Transplantation: A Systematic Review and Meta-analysis
Introduction: Venoarterial extracorporeal membrane oxygenation (VA-ECMO) is a prevailing option for the management of severe early graft dysfunction. This systematic review and individual patient data (IPD) meta-analysis aims to evaluate (1) mortality, (2) rates of major complications, (3) prognostic factors, and (4) the effect of different VA-ECMO strategies on outcomes in adult heart transplant (HT) recipients supported with VA-ECMO. Methods and Results: We conducted a systematic search and included studies of adults (≥18 years) who received VA-ECMO during their index hospitalization after HT and reported on mortality at any timepoint. We pooled data using random effects models. To identify prognostic factors, we analysed IPD using mixed effects logistic regression. We assessed the certainty in the evidence using the GRADE framework. We included 49 observational studies of 1477 patients who received VA-ECMO after HT, of which 15 studies provided IPD for 448 patients. There were no differences in mortality estimates between IPD and non-IPD studies. The short-term (30-day/in-hospital) mortality estimate was 33% (moderate certainty, 95% confidence interval [CI] 28%-39%) and the 1-year mortality estimate was 50% (moderate certainty, 95% CI 43%-57%). Recipient age (odds ratio 1.02, 95% CI 1.01-1.04) and prior sternotomy (OR 1.57, 95% CI 0.99-2.49) are associated with increased short-term mortality. There is low certainty evidence that early intraoperative cannulation and peripheral cannulation reduce the risk of short-term death. Conclusions: One-third of patients who receive VA-ECMO for early graft dysfunction do not survive 30 days or to hospital discharge, and one-half do not survive to 1 year after HT. Improving outcomes will require ongoing research focused on optimizing VA-ECMO strategies and care in the first year after HT
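The random-effects pooling step can be sketched with a DerSimonian-Laird estimator on logit-transformed proportions; the study counts below are hypothetical placeholders, not data from the review:

```python
import numpy as np

# Hypothetical (deaths, total) counts for five studies — illustrative only
studies = [(10, 30), (12, 40), (8, 20), (15, 50), (20, 55)]

# Logit-transform proportions so the model works on an approximately normal scale
y = np.array([np.log(e / (n - e)) for e, n in studies])
v = np.array([1.0 / e + 1.0 / (n - e) for e, n in studies])  # within-study variances

# DerSimonian-Laird estimate of the between-study variance tau^2
w = 1.0 / v
ybar = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - ybar) ** 2)
k = len(y)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (k - 1)) / c)

# Random-effects pooled estimate and 95% CI, back-transformed to a proportion
w_re = 1.0 / (v + tau2)
mu = np.sum(w_re * y) / np.sum(w_re)
se = 1.0 / np.sqrt(np.sum(w_re))
inv = lambda x: 1.0 / (1.0 + np.exp(-x))
print(f"pooled proportion {inv(mu):.2f} "
      f"(95% CI {inv(mu - 1.96*se):.2f}-{inv(mu + 1.96*se):.2f})")
```

The pooled estimate always lands between the smallest and largest study proportions, and a larger tau² widens the interval, which is how between-study heterogeneity is reflected in the pooled mortality CI.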
Understanding single-station ground motion variability and uncertainty (sigma) - Lessons learnt from EUROSEISTEST
Accelerometric data from the well-studied EUROSEISTEST valley are used to investigate ground motion uncertainty and variability. We define a simple local ground motion prediction equation (GMPE) and investigate changes in standard deviation (σ) and its components, the between-event variability (τ) and within-event variability (φ). Improving seismological metadata significantly reduces τ (30-50%), which in turn reduces the total σ. Improving site information reduces the systematic site-to-site variability, φS2S (20-30%), in turn reducing φ and, ultimately, σ. Our values of standard deviations are lower than global values from the literature, and closer to path-specific than site-specific values. However, our data have insufficient azimuthal coverage for single-path analysis. Certain stations have higher ground-motion variability, possibly due to topography, basin-edge or downgoing-wave effects. Sensitivity checks show that 3 recordings per event is a sufficient data selection criterion; however, one of the dataset's advantages is the large number of recordings per station (9-90) that yields good site term estimates. We examine uncertainty components, binning our data by magnitude, over periods from 0.01 to 2 s; at smaller magnitudes, τ decreases and φSS increases, possibly due to κ and source-site trade-offs. Finally, we investigate the alternative approach of computing φSS using existing GMPEs instead of creating an ad hoc local GMPE. This is important where data are insufficient to create one, or when site-specific PSHA is performed. We show that global GMPEs may still capture φSS, provided that: 1. the magnitude scaling errors are accommodated by the event terms; 2. there are no distance scaling errors (use of a regionally applicable model). Site terms (φS2S) computed by different global GMPEs (using different site proxies) vary significantly, especially for hard-rock sites. This indicates that GMPEs may be poorly constrained where they are sometimes most needed, i.e. for hard rock
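The variance decomposition behind these components (σ² = τ² + φS2S² + φSS²) can be illustrated with a method-of-moments sketch on synthetic residuals; published studies use mixed-effects regression, and every number below is simulated, not from the dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic total residuals: event term + site term + single-station residual
n_events, n_stations = 40, 15
event_terms = rng.normal(0, 0.30, n_events)        # true tau = 0.30
site_terms = rng.normal(0, 0.25, n_stations)       # true phi_S2S = 0.25
eps = rng.normal(0, 0.20, (n_events, n_stations))  # true phi_SS = 0.20
resid = event_terms[:, None] + site_terms[None, :] + eps

# Method-of-moments decomposition (a simplification of mixed-effects regression)
dBe = resid.mean(axis=1)                  # event terms
tau = dBe.std(ddof=1)                     # between-event variability
within = resid - dBe[:, None]             # within-event residuals
dS2S = within.mean(axis=0)                # site terms
phi_s2s = dS2S.std(ddof=1)                # site-to-site variability
phi_ss = (within - dS2S[None, :]).std(ddof=1)  # single-station variability
sigma = np.sqrt(tau**2 + phi_s2s**2 + phi_ss**2)
print(f"tau={tau:.3f} phi_S2S={phi_s2s:.3f} phi_SS={phi_ss:.3f} sigma={sigma:.3f}")
```

Removing the repeatable site terms (φS2S) from the within-event residuals is exactly what makes the single-station φSS smaller than the ergodic φ, which is the motivation for site-specific PSHA.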
Heterogeneity of ubiquitin pathology in frontotemporal lobar degeneration: classification and relation to clinical phenotype
We have investigated the extent and pattern of immunostaining for ubiquitin protein (UBQ) in 60 patients with frontotemporal lobar degeneration (FTLD) with ubiquitin-positive, tau-negative inclusions (FTLD-U), 37 of whom were ascertained in Manchester, UK and 23 in Newcastle-Upon-Tyne, UK. There were three distinct histological patterns according to the form and distribution of the UBQ pathology. Histological type 1 was present in 19 patients (32%) and characterised by the presence of a moderate number, or numerous, UBQ immunoreactive neurites and intraneuronal cytoplasmic inclusions within layer II of the frontal and temporal cerebral cortex, and cytoplasmic inclusions within granule cells of the dentate gyrus; neuronal intranuclear inclusions (NII) of a "cat's eye" or "lentiform" appearance were present in 17 of these patients. In histological type 2 (16 patients, 27%), UBQ neurites were predominantly, or exclusively, present with few intraneuronal cytoplasmic inclusions within layer II of the cerebral cortex, while in histological type 3 (25 patients, 42%), UBQ intraneuronal cytoplasmic inclusions either within the cortical layer II or in the granule cells of the dentate gyrus, with few or no UBQ neurites, were seen. In neither of these latter two groups were NII present. The influence of histological type on clinical phenotype was highly significant, with type 1 histology being associated clinically with cases of frontotemporal dementia (FTD) or progressive non-fluent aphasia (PNFA), type 2 histology with semantic dementia (SD), and type 3 histology with FTD, or FTD and motor neurone disease (MND)
Derivation of consistent hard rock (1000<Vs<3000 m/s) GMPEs from surface and down-hole recordings: Analysis of KiK-net data
A key component in seismic hazard assessment is the estimation of ground motion for hard rock sites, either for applications to installations built on this site category, or as an input motion for site response computation. Empirical ground motion prediction equations (GMPEs) are the traditional basis for estimating ground motion, while VS30 is the basis for accounting for site conditions. As current GMPEs are poorly constrained for VS30 larger than 1000 m/s, the presently used approach for estimating hazard on hard rock sites consists of "host-to-target" adjustment techniques based on VS30 and κ0 values. The present study investigates alternative methods on the basis of a KiK-net dataset corresponding to stiff and rocky sites with 500 < VS30 < 1350 m/s. The existence of sensor pairs (one at the surface and one at depth) and the availability of P- and S-wave velocity profiles allow deriving two "virtual" datasets associated with outcropping hard rock sites with VS in the range [1000, 3000] m/s, using two independent corrections: (1) down-hole recordings modified from within motion to outcropping motion with a depth correction factor; (2) surface recordings deconvolved from their specific site response derived through 1D simulation. GMPEs with simple functional forms are then developed, including a VS30 site term. They lead to consistent and robust hard-rock motion estimates, which prove to be significantly lower than host-to-target adjustment predictions. The difference can reach a factor of up to 3-4 beyond 5 Hz for very hard rock, but decreases with decreasing frequency until vanishing below 2 Hz