
    Novel Topology for Four-Quadrant Converter

    Particle accelerators such as the LHC (Large Hadron Collider) use true bipolar power converters to feed superconducting magnets. Moreover, the LHC requires that most converters be installed underground, a constraint that demands high efficiency and reduced volume for all the power converters. In this paper, the authors present a novel four-quadrant topology composed of a ZVS inverter associated with a ZCS rectifier. This DC-AC-DC converter is fully reversible, and soft-switching operation is achieved for all switches over the full operating range. After a thorough analysis of the prototype design [±600 A, ±10 V], simulation and experimental results confirm the overall performance of this power structure.

    Undocumented Migrants in Switzerland: Geographical Origin Versus Legal Status as Risk Factor for Tuberculosis

    Undocumented migrants, meaning migrants without a legal residency permit, come to Geneva from countries with a high tuberculosis (TB) incidence. We estimate here whether being undocumented is a determinant of TB, independently of origin. This cross-sectional study included undocumented migrants in a TB screening program in 2002; results were compared to those of 12,904 age- and frequency-matched participants in a general TB screening program conducted at various workplaces in Geneva, Switzerland, from 1992 to 2002. A total of 206 undocumented migrants (36% male, 64% female, mean age 37.8 years (SD 11.8), 82.5% from Latin America) participated in the TB screening program. Compared to legal residents, undocumented migrants had an adjusted OR for TB-related fibrotic signs of 1.7 (95% CI 0.8-3.7). The OR of TB-related fibrotic signs for Latin American (vs. other) origin was 2.7 (95% CI 1.6-4.7) among legal residents and 5.5 (95% CI 2.8-10.8) among undocumented migrants. Chest X-ray screening identified a higher proportion of TB-related fibrotic signs among Latin Americans, independently of their residency status.
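    The adjusted odds ratios above come from a multivariable model; as a simple illustration of the quantity itself, the sketch below computes a crude odds ratio with a Wald 95% confidence interval from a hypothetical 2x2 table (the counts are invented, not the study's data).

```python
import math

# Crude odds ratio from a hypothetical 2x2 table (counts are illustrative,
# not the study's data): exposure = undocumented status, outcome = fibrotic signs.
a, b = 12, 194      # undocumented: fibrotic signs present / absent
c, d = 450, 12454   # legal residents: fibrotic signs present / absent

or_crude = (a / b) / (c / d)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)                   # SE of log(OR), Wald method
lo, hi = (math.exp(math.log(or_crude) + s * 1.96 * se) for s in (-1, 1))
print(round(or_crude, 2), round(lo, 2), round(hi, 2))   # → 1.71 0.95 3.09
```

    A confidence interval spanning 1 (as here) would mean the crude association is not statistically significant at the 5% level, which mirrors the paper's adjusted CI of 0.8-3.7.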

    Proposal of a framework for evaluating military surveillance systems for early detection of outbreaks on duty areas

    Background: In recent years a wide variety of epidemiological surveillance systems have been developed to provide early identification of outbreaks of infectious disease. Each system has its own strengths and weaknesses. In 2002 a Working Group of the Centers for Disease Control and Prevention (CDC) produced a framework for evaluation, which proved suitable for many public health surveillance systems. However, it did not adapt easily to the military setting, where by necessity different parameters are assessed, different constraints are placed on the systems, and different objectives are required. This paper describes a proposed framework for evaluating military syndromic surveillance systems designed to detect outbreaks of disease on operational deployments.
    Methods: The new framework described in this paper was developed from the cumulative experience of British and French military syndromic surveillance systems. The methods included a general assessment framework (CDC), followed by more specific methods of conducting evaluation: Knowledge/Attitude/Practice (KAP) surveys, technical audits, ergonomic studies, simulations, and multi-national exercises. A variety of military constraints had to be integrated into the evaluation, including the variability of geographical conditions in the field, deployment to areas without prior knowledge of naturally occurring disease patterns, differences in field sanitation between locations and over the length of deployment, the mobility of military forces, turnover of personnel, continuity of surveillance across different locations, integration with surveillance systems of other nations working alongside each other, compatibility with non-medical information systems, and security.
    Results: A framework for evaluation has been developed that can be applied to military surveillance systems in a staged manner, consisting of initial, intermediate, and final evaluations. For each stage of the process, parameters for assessment have been defined and methods identified.
    Conclusion: The combined experiences of French and British syndromic surveillance systems developed for use in deployed military forces have allowed the development of a specific evaluation framework. The tool is suitable for use by any nation wishing to evaluate syndromic surveillance in its own military forces. It could also be useful for civilian mobile systems or for national security surveillance systems.

    Beyond traditional surveillance: applying syndromic surveillance to developing settings – opportunities and challenges

    Background: All countries need effective disease surveillance systems for early detection of outbreaks. The revised International Health Regulations (IHR), which entered into force for all 194 World Health Organization member states in 2007, have expanded traditional infectious disease notification to include surveillance for public health events of potential international importance, even if the causative agent is not yet known. However, there are no clearly established guidelines for how countries should conduct this surveillance or which types of emerging disease syndromes should be reported, nor any means of enforcement.
    Discussion: The commonly established concept of syndromic surveillance in developed regions encompasses the use of pre-diagnostic information, in near real time, for further investigation and public health action. Syndromic surveillance is widely used in North America and Europe, and is typically thought of as a highly complex, technology-driven, automated tool for early detection of outbreaks. Nonetheless, low-technology applications of syndromic surveillance are being used worldwide to augment traditional surveillance.
    Summary: In this paper, we review examples of these novel applications in the detection of vector-borne diseases, foodborne illness, and sexually transmitted infections. We hope to demonstrate that syndromic surveillance in its basic version is a feasible and effective tool for surveillance in developing countries and may facilitate compliance with the new IHR guidelines.

    High Performance Multicell Series Inverter-Fed Induction Motor Drive

    This document is the Accepted Manuscript version of the following article: M. Khodja, D. Rahiel, M. B. Benabdallah, H. Merabet Boulouiha, A. Allali, A. Chaker, and M. Denai, ‘High-performance multicell series inverter-fed induction motor drive’, Electrical Engineering, Vol. 99 (3): 1121-1137, September 2017. The final publication is available at Springer via DOI: https://doi.org/10.1007/s00202-016-0472-4.
    The multilevel voltage-source inverter (VSI) topology of the series multicell converter developed in recent years has led to improved converter performance in terms of power density and efficiency. This converter reduces the voltage constraints on all cells, which results in lower transmission losses, high switching frequencies, and improved output voltage waveforms. This paper proposes an improved topology of the series multicell inverter which minimizes harmonics and reduces torque ripple and losses in a variable-speed induction motor drive. The flying-capacitor multilevel inverter topology, based on the classical and modified phase-shift pulse width modulation (PSPWM, MPSPWM) techniques, is applied in this paper to minimize harmonic distortion at the inverter output. Simulation results are presented for a 2-kW induction motor drive; they demonstrate reduced harmonics, improved transient response, and better reference tracking of the voltage in the induction motor, and consequently reduced torque ripple. Peer reviewed.
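    The phase-shift carrier principle this abstract refers to can be sketched numerically: p triangular carriers, each shifted by 2π/p, are compared against one sinusoidal reference, and the number of cells switched on at each instant yields a (p+1)-level output. The parameters below (3 cells, 1 kHz carriers, 50 Hz reference, m = 0.8) are illustrative choices, not taken from the paper.

```python
import numpy as np

def phase_shift_pwm(p=3, f_sw=1000.0, f_ref=50.0, m=0.8, n=20000):
    """Phase-shift PWM for a p-cell series multicell leg (illustrative values).

    Each cell compares one sinusoidal reference against its own triangular
    carrier, shifted by 2*pi/p per cell; the summed cell states give a
    (p+1)-level output with an apparent switching frequency of p * f_sw.
    """
    t = np.linspace(0.0, 1.0 / f_ref, n, endpoint=False)
    ref = m * np.sin(2 * np.pi * f_ref * t)             # modulating signal
    on_cells = np.zeros(n)
    for k in range(p):
        x = (f_sw * t + k / p) % 1.0                    # carrier phase, shifted k/p
        carrier = 2.0 * np.abs(2.0 * x - 1.0) - 1.0     # triangle in [-1, 1]
        on_cells += (ref > carrier)                     # this cell's switch state
    return t, on_cells / p                              # normalized output, 0..1

t, v = phase_shift_pwm()
print(len(set(np.round(v, 3))))                         # p+1 = 4 distinct levels
```

    The phase shift is what pushes the dominant harmonics up to p times the per-cell switching frequency, which is the harmonic-distortion benefit the paper exploits.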

    Diversity of experimentation by farmers engaged in agroecology

    Agroecology questions the production of generic knowledge. Rather than searching for best practices for large-scale transfer, it may be more efficient to help farmers find their own solutions. A promising activity for farmers is experimentation, because it answers their needs and helps them learn. However, how agroecological practices are tested by farmers in their own experiments is still poorly known. In this study, we examined short-term experimental activity, i.e., experiments carried out at a yearly scale in pre-defined fields. Seventeen farmers in southeastern France were surveyed. The farmers practiced conventional or organic farming and cultivated either arable or market-garden crops. Experiments on agroecological practices were characterized, located along a timeline, and discussed with the farmers. To conduct the interviews, each experiment was described in three stages: (1) designing the experiment, (2) managing it in real time, and (3) evaluating its results. The data collected in the interviews were first analyzed to build a descriptive framework of farmers' experiments, after which hierarchical cluster analysis was used to analyze their diversity. Here, we propose for the first time a generic framework to describe farmers' experiments at a short time scale based on the consistency between the Design, Management, and Evaluation stages. We used the framework to characterize the diversity of farmers' experiments and identified four clusters. The originality of this work lies both in building a descriptive framework from in-depth analyses of farmers' discourse and in using statistical tools to identify and interpret the groups of experiments. Our results provide a better understanding of farmers' experiments and suggest tools and methods to help farmers experiment, a major challenge in promoting a large-scale agroecological transition.
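    As a toy illustration of the hierarchical-cluster-analysis step, the sketch below runs a naive single-linkage agglomerative clustering over a small binary feature matrix. The matrix and its feature coding are invented for illustration and do not reproduce the study's survey data.

```python
import numpy as np

# Toy stand-in for coded experiment descriptors: rows = farmer experiments,
# columns = binary features loosely inspired by the Design/Management/
# Evaluation framework. Data and coding are invented for illustration.
X = np.array([
    [1, 1, 0, 0], [1, 1, 1, 0],
    [0, 0, 1, 1], [0, 1, 1, 1],
    [1, 0, 0, 0], [1, 1, 0, 1],
], dtype=float)

def agglomerative(X, k):
    """Naive single-linkage agglomerative clustering down to k clusters."""
    clusters = [[i] for i in range(len(X))]
    while len(clusters) > k:
        best = (0, 1, np.inf)
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(np.linalg.norm(X[a] - X[b])
                        for a in clusters[i] for b in clusters[j])
                if d < best[2]:
                    best = (i, j, d)
        i, j, _ = best
        clusters[i] += clusters.pop(j)      # merge the two closest clusters
    return clusters

print(agglomerative(X, 2))                  # → [[0, 1, 4, 5], [2, 3]]
```

    In practice one would use a library routine and choose the cut height from the dendrogram; the point here is only that similar experiment profiles end up grouped together.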

    Evidence for a lack of a direct transcriptional suppression of the iron regulatory peptide hepcidin by hypoxia-inducible factors.

    BACKGROUND: Hepcidin is a major regulator of iron metabolism and plays a key role in anemia of chronic disease, reducing intestinal iron uptake and release from body iron stores. Hypoxia and chemical stabilizers of the hypoxia-inducible transcription factor (HIF) have been shown to suppress hepcidin expression. We therefore investigated the role of HIF in hepcidin regulation. METHODOLOGY/PRINCIPAL FINDINGS: Hepcidin mRNA was down-regulated in hepatoma cells by chemical HIF stabilizers and iron chelators. In contrast, the response to hypoxia was variable. The decrease in hepcidin mRNA was not reversed by HIF-1alpha or HIF-2alpha knock-down or by depletion of the HIF and iron regulatory protein (IRP) target transferrin receptor 1 (TfR1). However, the response of hepcidin to hypoxia and chemical HIF inducers paralleled the regulation of transferrin receptor 2 (TfR2), one of the genes critical to hepcidin expression. Hepcidin expression was also markedly and rapidly decreased by serum deprivation, independent of transferrin-bound iron, and by the phosphatidylinositol 3 (PI3) kinase inhibitor LY294002, indicating that growth factors are required for hepcidin expression in vitro. Hepcidin promoter constructs mirrored the response of mRNA levels to interleukin-6 and bone morphogenetic proteins, but not consistently to hypoxia or HIF stabilizers, and deletion of the putative HIF binding motifs did not alter the response to different hypoxic stimuli. In mice exposed to carbon monoxide, hypoxia, or the chemical HIF inducer N-oxalylglycine, liver hepcidin 1 mRNA was elevated rather than decreased. CONCLUSIONS/SIGNIFICANCE: Taken together, these data indicate that hepcidin is neither a direct target of HIF nor indirectly regulated by HIF through induction of TfR1 expression. Hepcidin mRNA expression in vitro is highly sensitive to the presence of serum factors and PI3 kinase inhibition, and parallels TfR2 expression.

    Comparison of HIV-1 Genotypic Resistance Test Interpretation Systems in Predicting Virological Outcomes Over Time

    Background: Several decision support systems have been developed to interpret HIV-1 drug resistance genotyping results. This study compares the ability of the most commonly used systems (ANRS, Rega, and Stanford's HIVdb) to predict virological outcome at 12, 24, and 48 weeks. Methodology/Principal Findings: Included were 3763 treatment-change episodes (TCEs) for which an HIV-1 genotype was available at the time of changing treatment, with at least one follow-up viral load measurement. Genotypic susceptibility scores for the active regimens were calculated using the scores defined by each interpretation system. Using logistic regression, we determined the association between the genotypic susceptibility score and the proportion of TCEs with an undetectable viral load (<50 copies/ml) at 12 (8-16) weeks (2152 TCEs), 24 (16-32) weeks (2570 TCEs), and 48 (44-52) weeks (1083 TCEs). The area under the ROC curve was calculated using 10-fold cross-validation to compare the interpretation systems' sensitivity and specificity in predicting undetectable viral load. The mean genotypic susceptibility score was slightly smaller for HIVdb, at 1.92±1.17, compared to Rega and ANRS, at 2.22±1.09 and 2.23±1.05, respectively. However, similar odds ratios were found for the association between each one-unit increase in genotypic susceptibility score and undetectable viral load at week 12: 1.6 [95% confidence interval 1.5-1.7] for HIVdb, 1.7 [1.5-1.8] for ANRS, and 1.7 [1.6-1.9] for Rega. Odds ratios increased over time but remained comparable (odds ratios ranging between 1.9-2.1 at 24 weeks and 1.9-2.
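    The core of this analysis design, regressing a binary "undetectable at follow-up" outcome on the genotypic susceptibility score (GSS) and summarizing discrimination with the area under the ROC curve, can be sketched on synthetic data. The score range, coefficients, and sample size below are assumptions for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the TCE data: one genotypic susceptibility score (GSS)
# per treatment-change episode and a binary "undetectable at week 12" outcome.
# Score range and coefficients are assumptions for illustration.
n = 2000
gss = rng.uniform(0, 4, n)
true_b0, true_b1 = -1.5, 0.55                      # assumed log-odds per GSS unit
p = 1 / (1 + np.exp(-(true_b0 + true_b1 * gss)))
undetectable = (rng.random(n) < p).astype(float)

def fit_logistic(x, y, iters=4000, lr=0.5):
    """Plain gradient-ascent fit of logit P(y=1) = b0 + b1*x."""
    xc = x - x.mean()                              # center x for stable convergence
    b0 = b1 = 0.0
    for _ in range(iters):
        pred = 1 / (1 + np.exp(-(b0 + b1 * xc)))
        b0 += lr * np.mean(y - pred)
        b1 += lr * np.mean((y - pred) * xc)
    return b0 - b1 * x.mean(), b1                  # undo the centering on b0

b0, b1 = fit_logistic(gss, undetectable)
odds_ratio = np.exp(b1)                            # OR per one-unit GSS increase

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    pos = labels.astype(bool)
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

print(round(float(odds_ratio), 2), round(float(auc(gss, undetectable)), 2))
```

    The study additionally estimated AUC under 10-fold cross-validation; that step only changes how the data are split, not the quantities computed above.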

    Value of syndromic surveillance within the Armed Forces for early warning during a dengue fever outbreak in French Guiana in 2006

    Background: A dengue fever outbreak occurred in French Guiana in 2006. The objectives were to study the value of a syndromic surveillance system set up within the armed forces, compared to the traditional clinical surveillance system, during this outbreak; to highlight issues involved in comparing military and civilian surveillance systems; and to discuss the value of syndromic surveillance for public health response.
    Methods: Military syndromic surveillance covers suspected dengue fever cases among the 3,000 armed forces personnel. Within the same population, clinical surveillance uses several case definition criteria for dengue fever, depending on the epidemiological situation. Civilian laboratory surveillance covers biologically confirmed cases within the 200,000 inhabitants.
    Results: Syndromic surveillance detected the dengue fever outbreak several weeks before clinical surveillance, allowing quick and effective enhancement of vector control within the armed forces. Syndromic surveillance was also found to have detected the outbreak before civilian laboratory surveillance.
    Conclusion: Military syndromic surveillance allowed an early warning for this outbreak to be issued, enabling a quicker public health response by the armed forces. The civilian surveillance system has since introduced syndromic surveillance as part of its surveillance strategy. This should enable quicker public health responses in the future.
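    Early-warning systems of this kind typically run a change-detection statistic over daily syndrome counts. The sketch below uses an upper CUSUM detector on invented daily counts of suspected dengue syndromes; the baseline, slack, and threshold values are illustrative assumptions, not parameters of the military system described here.

```python
# Upper CUSUM detector on daily counts of suspected dengue syndromes.
# Counts, baseline, slack, and threshold are invented illustration values.
def cusum_alarm(counts, baseline=2.0, slack=0.5, threshold=5.0):
    """Return the first day the upper CUSUM statistic exceeds the threshold."""
    s = 0.0
    for day, c in enumerate(counts):
        s = max(0.0, s + (c - baseline - slack))   # accumulate only excess cases
        if s > threshold:
            return day
    return None

counts = [2, 1, 3, 2, 0, 2, 3, 1, 2, 2,   # endemic background
          7, 6, 8, 5, 9]                  # outbreak begins on day 10
print(cusum_alarm(counts))                # → 11 (alarm one day into the outbreak)
```

    Because the statistic resets to zero whenever counts run below baseline, isolated high days do not trigger an alarm, while a sustained excess does, which is the property that let the syndromic system flag the outbreak early.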