
    Can "presumed consent" justify the duty to treat infectious diseases? An analysis

    Background: AIDS, SARS, and the recent epidemics of avian influenza have all served to remind us of the debate over the limits of the moral duty to care. It is important first to consider whether the "duty to treat" might be subject to contextual constraints. The purpose of this study was to investigate the opinions and beliefs held by physicians and dentists regarding the occupational risks of infectious diseases, and to analyze the argument that the notion of "presumed consent" on the part of professionals may be grounds for supporting the duty to treat.
    Methods: For this cross-sectional survey, the study population was selected from among physicians and dentists in Ankara. All 373 participants were given a self-administered questionnaire.
    Results: In total, 79.6% of the participants said that they either had some degree of knowledge about the risks when they chose their profession or learned of the risks later during their education and training. Of the participants, 5.2% said that they would not have chosen this profession if they had been informed of the risks. It was found that 57% of the participants believed that there is a standard level of risk, and 52% stated that certain diseases would exceed the level of acceptable risk unless specific protective measures were implemented.
    Conclusion: If we use the presumed consent argument to establish the duty of health care workers (HCWs) to provide care, we are confronted with several problems: the difficulty of choosing a profession autonomously, the constant level of uncertainty present in the medical profession, the near-impossibility of retrospectively evaluating whether every individual was informed, and the seemingly inescapable problem that this practice would legitimize, and perhaps even foster, discrimination against patients with certain diseases. Our findings suggest that another problem can be added to the list: one-fifth of the participants in this study either lacked adequate knowledge of the occupational risks when they chose the medical profession or were not sufficiently informed of these risks during their faculty education and training. Furthermore, in terms of the moral duty to provide care, it seems that most HCWs are more concerned about the availability of protective measures than about whether they had been informed of a particular risk beforehand. For all these reasons, the presumed consent argument is not persuasive enough and cannot be used to justify the duty to provide care. It is therefore more useful to emphasize justifications other than presumed consent when defining the duty of HCWs to provide care, such as the social contract between society and the medical profession and the fact that HCWs have a greater ability to provide medical aid.

    Evolutionary approaches for the reverse-engineering of gene regulatory networks: A study on a biologically realistic dataset

    Background: Inferring gene regulatory networks from data requires the development of algorithms devoted to structure extraction. When only static data are available, gene interactions may be modelled by a Bayesian Network (BN) that represents direct interactions from regulators to their targets through conditional probability distributions. We used enhanced evolutionary algorithms to stochastically evolve a set of candidate BN structures and to find the model that best fits the data without prior knowledge.
    Results: We proposed various evolutionary strategies suitable for the task and tested our choices using simulated data drawn from a given bio-realistic network of 35 nodes, the so-called insulin network, which has been used in the literature for benchmarking. We assessed the inferred models against this reference to obtain statistical performance results. We then compared the performance of evolutionary algorithms using two kinds of recombination operators that operate at different scales in the graphs. We introduced a niching strategy that reinforces diversity across the population and avoids trapping the algorithm in a single local minimum in the early steps of learning. We show the limited effect of the mutation operator when niching is applied. Finally, we compared our best evolutionary approach with various well-known learning algorithms (MCMC, K2, greedy search, TPDA, MMHC) devoted to BN structure learning.
    Conclusion: We studied the behaviour of an evolutionary approach enhanced by niching for the learning of gene regulatory networks with BNs. We show that this approach outperforms classical structure learning methods in elucidating the original model. These results were obtained for the learning of a bio-realistic network and, more importantly, on various small datasets. This is a suitable approach for learning transcriptional regulatory networks from real datasets without prior knowledge.
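
    The abstract describes the algorithmic approach only in outline. As a rough illustration of the general technique (not the authors' implementation), the following Python sketch evolves Bayesian-network structures over synthetic data: candidates are scored with a per-node Gaussian BIC, a simple niching penalty discourages crowding around a single optimum, and mutation flips individual edges while preserving acyclicity. The scoring function, niche radius, penalty, and population sizes are illustrative assumptions.

        # Hypothetical sketch, not the authors' code: evolutionary BN structure search with niching.
        import numpy as np

        rng = np.random.default_rng(0)
        N_GENES, N_SAMPLES = 8, 60
        data = rng.normal(size=(N_SAMPLES, N_GENES))        # stand-in for expression data

        def is_acyclic(adj):
            # Kahn-style cycle check; adj[i, j] = 1 means an edge i -> j.
            adj, indeg = adj.copy(), adj.sum(axis=0)
            stack, seen = [v for v in range(N_GENES) if indeg[v] == 0], 0
            while stack:
                v = stack.pop()
                seen += 1
                for w in np.flatnonzero(adj[v]):
                    adj[v, w] = 0
                    indeg[w] -= 1
                    if indeg[w] == 0:
                        stack.append(w)
            return seen == N_GENES

        def bic_score(adj):
            # Sum of per-node Gaussian BIC terms given each node's parents (higher is better).
            score = 0.0
            for j in range(N_GENES):
                parents = np.flatnonzero(adj[:, j])
                X = np.column_stack([np.ones(N_SAMPLES)] + [data[:, p] for p in parents])
                resid = data[:, j] - X @ np.linalg.lstsq(X, data[:, j], rcond=None)[0]
                score += -0.5 * N_SAMPLES * np.log(max(resid.var(), 1e-9)) - 0.5 * X.shape[1] * np.log(N_SAMPLES)
            return score

        def mutate(adj):
            # Flip one random off-diagonal edge while keeping the graph acyclic.
            child = adj.copy()
            for _ in range(20):
                i, j = rng.integers(N_GENES, size=2)
                if i == j:
                    continue
                child[i, j] ^= 1
                if is_acyclic(child):
                    return child
                child[i, j] ^= 1
            return child

        def niched_fitness(pop, scores):
            # Penalise individuals whose edge set is too close to that of a fitter one.
            fit = scores.astype(float)
            order = np.argsort(scores)[::-1]
            for rank, i in enumerate(order):
                if any(np.abs(pop[i] - pop[b]).sum() < 3 for b in order[:rank]):  # niche radius (assumption)
                    fit[i] -= 50.0                                                # sharing penalty (assumption)
            return fit

        pop = [np.zeros((N_GENES, N_GENES), dtype=int) for _ in range(20)]
        for _ in range(100):
            scores = np.array([bic_score(g) for g in pop])
            survivors = [pop[i] for i in np.argsort(niched_fitness(pop, scores))[-10:]]
            pop = survivors + [mutate(survivors[rng.integers(len(survivors))]) for _ in range(10)]
        print("best BIC found:", round(max(bic_score(g) for g in pop), 2))

    In the paper's setting, recombination operators acting at different scales of the graph would be used alongside the mutation step; they are omitted here for brevity.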

    Inhibition of IL-10 Production by Maternal Antibodies against Group B Streptococcus GAPDH Confers Immunity to Offspring by Favoring Neutrophil Recruitment

    Group B Streptococcus (GBS) is the leading cause of neonatal pneumonia, septicemia, and meningitis. We have previously shown that, in adult mice, the GBS glycolytic enzyme glyceraldehyde-3-phosphate dehydrogenase (GAPDH) is an extracellular virulence factor that induces production of the immunosuppressive cytokine interleukin-10 (IL-10) by the host early upon bacterial infection. Here, we investigate whether immunity to neonatal GBS infection could be achieved through maternal vaccination against bacterial GAPDH. Female BALB/c mice were immunized with rGAPDH and the progeny was infected with a lethal inoculum of GBS strains. Neonatal mice born from mothers immunized with rGAPDH were protected against infection with GBS strains, including the ST-17 highly virulent clone. A similar protective effect was observed in newborns passively immunized with anti-rGAPDH IgG antibodies or F(ab')2 fragments, indicating that protection achieved with rGAPDH vaccination is independent of opsonophagocytic killing of bacteria. Protection against lethal GBS infection through rGAPDH maternal vaccination was due to neutralization of IL-10 production soon after infection. Consequently, IL-10-deficient (IL-10−/−) mouse pups were as resistant to GBS infection as pups born from vaccinated mothers. We observed that protection was correlated with increased neutrophil trafficking to infected organs. Thus, anti-rGAPDH or anti-IL-10R treatment of mouse pups before GBS infection resulted in increased neutrophil numbers and lower bacterial load in infected organs, as compared to newborn mice treated with the respective control antibodies. We showed that mothers immunized with rGAPDH produce neutralizing antibodies that are sufficient to decrease IL-10 production and induce neutrophil recruitment into infected tissues in newborn mice. These results uncover a novel mechanism for GBS virulence in a neonatal host that could be neutralized by vaccination or immunotherapy. As GBS GAPDH is a structurally conserved enzyme that is metabolically essential for bacterial growth in media containing glucose as the sole carbon source (i.e., the blood), this protein constitutes a powerful candidate for the development of a human vaccine against this pathogen.

    A "Candidate-Interactome" Aggregate Analysis of Genome-Wide Association Data in Multiple Sclerosis

    Though difficult, the study of gene-environment interactions in multifactorial diseases is crucial for interpreting the relevance of non-heritable factors and prevents overlooking genetic associations with small but measurable effects. We propose a “candidate interactome” (i.e. a group of genes whose products are known to physically interact with environmental factors that may be relevant for disease pathogenesis) analysis of genome-wide association data in multiple sclerosis. We looked for statistical enrichment of associations among interactomes that, at the current state of knowledge, may be representative of gene-environment interactions of potential, uncertain or unlikely relevance for multiple sclerosis pathogenesis: Epstein-Barr virus, human immunodeficiency virus, hepatitis B virus, hepatitis C virus, cytomegalovirus, HHV8-Kaposi sarcoma, H1N1-influenza, JC virus, the human innate immunity interactome for type I interferon, autoimmune regulator, vitamin D receptor, aryl hydrocarbon receptor, and a panel of proteins targeted by 70 innate immune-modulating viral open reading frames from 30 viral species. Interactomes were either obtained from the literature or manually curated. The P values of all single nucleotide polymorphisms mapping to a given interactome were obtained from the latest genome-wide association study of the International Multiple Sclerosis Genetics Consortium and the Wellcome Trust Case Control Consortium 2. The interaction between genotype and Epstein-Barr virus emerges as relevant for multiple sclerosis etiology. However, in line with recent data on the coexistence of common and unique strategies used by viruses to perturb the human molecular system, other viruses also have a similar potential, though one that is probably less relevant in epidemiological terms.
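
    To make the aggregate test concrete, here is a hypothetical Python sketch of the general idea (the statistic and null model are assumptions, not the consortium's actual pipeline): each gene is summarised by its best SNP P value, an interactome gene set is scored by the mean -log10 P of its members, and that score is compared with same-size random gene sets to obtain an empirical enrichment P value.

        # Hypothetical sketch: gene-set enrichment of GWAS signal for a candidate interactome.
        import numpy as np

        rng = np.random.default_rng(1)

        # Stand-in data: one representative (best-SNP) P value per gene.
        genes = [f"gene{i:04d}" for i in range(5000)]
        gene_p = dict(zip(genes, rng.uniform(1e-6, 1.0, size=len(genes))))

        # Hypothetical "EBV interactome": here just a random subset of genes.
        ebv_interactome = set(rng.choice(genes, size=150, replace=False))

        def set_score(gene_set):
            # Aggregate association signal of a gene set (mean -log10 P).
            return np.mean([-np.log10(gene_p[g]) for g in gene_set])

        observed = set_score(ebv_interactome)
        null = np.array([set_score(rng.choice(genes, size=len(ebv_interactome), replace=False))
                         for _ in range(2000)])
        empirical_p = (np.sum(null >= observed) + 1) / (len(null) + 1)
        print(f"enrichment score = {observed:.3f}, empirical P = {empirical_p:.4f}")

    A real analysis would also account for linkage disequilibrium, gene size, and SNP density when building the null distribution; those corrections are omitted here.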

    Search for dark matter produced in association with a single top quark or a top quark pair in proton-proton collisions at √s = 13 TeV

    A search has been performed for heavy resonances decaying to ZZ or ZW in 2ℓ2q final states, with two charged leptons (ℓ = e, μ) produced by the decay of a Z boson, and two quarks produced by the decay of a W or Z boson. The analysis is sensitive to resonances with masses in the range from 400 to 4500 GeV. Two categories are defined based on the merged or resolved reconstruction of the hadronically decaying vector boson, optimized for high- and low-mass resonances, respectively. The search is based on data collected during 2016 by the CMS experiment at the LHC in proton-proton collisions with a center-of-mass energy of √s = 13 TeV, corresponding to an integrated luminosity of 35.9 fb⁻¹. No excess is observed in the data above the standard model background expectation. Upper limits on the production cross section of heavy, narrow spin-1 and spin-2 resonances are derived as a function of the resonance mass, and exclusion limits on the production of W' bosons and bulk graviton particles are calculated in the framework of the heavy vector triplet model and warped extra dimensions, respectively.
    A search for dark matter produced in association with top quarks in proton-proton collisions at a center-of-mass energy of 13 TeV is presented. The data set used corresponds to an integrated luminosity of 35.9 fb⁻¹ recorded with the CMS detector at the LHC. Whereas previous searches for neutral scalar or pseudoscalar mediators considered dark matter production in association with a top quark pair only, this analysis also includes production modes with a single top quark. The results are derived from the combination of multiple selection categories that are defined to target either the single top quark or the top quark pair signature. No significant deviations with respect to the standard model predictions are observed. The results are interpreted in the context of a simplified model in which a scalar or pseudoscalar mediator particle couples to a top quark and subsequently decays into dark matter particles. Scalar and pseudoscalar mediator particles with masses below 290 and 300 GeV, respectively, are excluded at 95% confidence level, assuming a dark matter particle mass of 1 GeV and mediator couplings to fermions and dark matter particles equal to unity.
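
    For readers unfamiliar with how such exclusions are quoted, the sketch below shows a deliberately simplified, single-bin counting-experiment version of a 95% confidence level upper limit using the CLs criterion. The event counts are invented and the CMS analyses use full binned likelihoods with systematic uncertainties, so this is only an illustration of the statistical concept.

        # Illustrative only: one-bin Poisson counting experiment, CLs-style 95% CL upper limit
        # on the signal strength mu (signal yield relative to a nominal cross section).
        import numpy as np
        from scipy.stats import poisson

        n_obs, b, s = 52, 50.0, 10.0     # hypothetical observed events, expected background, nominal signal

        def cls(mu):
            # CLs = P(n <= n_obs | mu*s + b) / P(n <= n_obs | b) for the simple counting case.
            return poisson.cdf(n_obs, mu * s + b) / poisson.cdf(n_obs, b)

        # Scan mu until CLs drops below 0.05 -> 95% CL upper limit on mu.
        mu_grid = np.linspace(0.0, 5.0, 501)
        upper = next(mu for mu in mu_grid if cls(mu) < 0.05)
        print(f"95% CL upper limit: mu < {upper:.2f}")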

    Search for the pair production of light top squarks in the e±μ∓ final state in proton-proton collisions at √s = 13 TeV

    A search for the production of a pair of top squarks at the LHC is presented. This search targets a region of parameter space where the kinematics of top squark pair production and top quark pair production are very similar, because the mass difference between the top squark and the neutralino is close to the top quark mass. The search is performed with 35.9 fb⁻¹ of proton-proton collisions at a centre-of-mass energy of √s = 13 TeV, collected by the CMS detector in 2016, using events containing one electron-muon pair with opposite charge. The search is based on a precise estimate of the top quark pair background and on the MT2 variable, which combines the transverse mass of each lepton with the missing transverse momentum. No excess of events is found over the standard model predictions. Exclusion limits are placed at 95% confidence level on the production of top squarks up to masses of 208 GeV for models with a mass difference between the top squark and the lightest neutralino close to that of the top quark.
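
    For reference, the MT2 ("stransverse mass") variable mentioned above is conventionally defined, written here in LaTeX in its simplified massless form, as

        M_{T2} = \min_{\vec{q}_{T}^{\,(1)} + \vec{q}_{T}^{\,(2)} = \vec{p}_{T}^{\,\mathrm{miss}}} \Big[ \max\big( m_T(\vec{p}_T^{\,\ell_1}, \vec{q}_T^{\,(1)}),\ m_T(\vec{p}_T^{\,\ell_2}, \vec{q}_T^{\,(2)}) \big) \Big], \qquad m_T^2(\vec{p}_T, \vec{q}_T) = 2\, p_T\, q_T\, (1 - \cos\Delta\phi),

    where the minimisation runs over all ways of splitting the missing transverse momentum between the two undetected particles; the exact mass hypotheses used in the analysis may differ from this simplified form.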

    Search for strongly interacting massive particles generating trackless jets in proton-proton collisions at √s = 13 TeV

    A search for dark matter in the form of strongly interacting massive particles (SIMPs) using the CMS detector at the LHC is presented. The SIMPs would be produced in pairs that manifest themselves as pairs of jets without tracks. The energy fraction of jets carried by charged particles is used as a key discriminator to efficiently suppress the large multijet background, and the remaining background is estimated directly from data. The search is performed using proton-proton collision data corresponding to an integrated luminosity of 16.1 fb⁻¹, collected with the CMS detector in 2016. No significant excess of events is observed above the expected background. For the simplified dark matter model under consideration, SIMPs with masses up to 100 GeV are excluded, and further sensitivity is explored towards higher masses.
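
    The charged-energy-fraction discriminator mentioned above can be illustrated with a small, purely hypothetical Python example (toy jet constituents, not CMS reconstruction code): a QCD-like jet carries most of its energy in charged hadrons, whereas a SIMP-like trackless jet would sit near zero.

        def charged_energy_fraction(constituents):
            # constituents: list of (energy, charge) tuples for one jet (toy representation).
            total = sum(e for e, _ in constituents)
            charged = sum(e for e, q in constituents if q != 0)
            return charged / total if total > 0 else 0.0

        qcd_jet = [(35.0, +1), (20.0, -1), (15.0, 0), (10.0, +1)]   # mostly charged hadrons
        simp_jet = [(40.0, 0), (25.0, 0), (12.0, 0), (3.0, +1)]     # almost trackless
        print(charged_energy_fraction(qcd_jet), charged_energy_fraction(simp_jet))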