315 research outputs found

    Primary Prevention of Child Abuse: Is It Really Possible?

    Despite the growing interest in child abuse and its prevention, to date no systematic research has been conducted to determine the usefulness of instruments used to identify and predict abuse or neglect. The present study is a review and analysis of predictive instruments of abuse or neglect with the goal of identifying the predictive efficiency of the instruments. Analysis reveals a variety of problems with predictive efficiency, particularly as predicting individual risk of abuse or neglect relates to primary prevention. Implications of the findings and suggestions for practice are discussed.
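
    The predictive-efficiency problem flagged here is largely a base-rate problem, and a short calculation makes it concrete (a minimal sketch with hypothetical accuracy figures, not numbers from the review): even an instrument with high sensitivity and specificity flags mostly false positives when abuse or neglect is rare in the screened population, which is the situation primary prevention programmes face.

```python
# Illustrative only: hypothetical sensitivity/specificity/prevalence values,
# not figures taken from the review.
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(true case | instrument flags the case), by Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A seemingly strong instrument applied to a low-prevalence population:
ppv = positive_predictive_value(sensitivity=0.90, specificity=0.90, prevalence=0.03)
print(f"PPV = {ppv:.2f}")  # ~0.22: roughly 4 of 5 flagged families would be false positives
```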

    Voice, autonomy and utopian desire in participatory film-making with young refugees

    This article is a reflection on what reflexive documentary scholars call the ‘moral dimension’ (Nash 2012: 318) of a participatory filmmaking project with refugee young people, who wanted to make a film to support other new young arrivals in the process of making home in Scotland. In the first part, we highlight some of the challenges of collaborating with refugee young people, in light of the often de-humanising representations of refugees in mainstream media and the danger of the triple conflation of authenticity-voice-pain in academic narratives about refugees. In the second part, we show how honouring young people’s desire to convey the hopeful aspects of making home emerged as a key pedagogical strategy to affirm their expert position and encourage their participation in the project. Revisiting key moments of learning and interaction, we demonstrate how young people’s process of ‘finding a voice’ in moment-by-moment filmmaking practice was not a linear, developmental process towards ‘pure’ individual empowerment and singular artistic expression. Their participation in shaping their visual (self-)representation in the final film was embedded in the dialogical process and pragmatic requirements of a collaborative film production, in which voice, autonomy and teacher authority were negotiated on a moment-by-moment basis. We conclude that it is vital for reflexive practice and research not to gloss over, in the name of progressive ideals, the moral dilemmas that arise when representations are co-created by project filmmakers/educators, but to embrace these deliberations as part of the ‘fascinating collaborative matrix’ (Chambers 2019: 29) of participatory filmmaking.

    Daptomycin versus standard therapy for bacteremia and endocarditis caused by Staphylococcus aureus.

    BACKGROUND: Alternative therapies for Staphylococcus aureus bacteremia and endocarditis are needed. METHODS: We randomly assigned 124 patients with S. aureus bacteremia with or without endocarditis to receive 6 mg of daptomycin intravenously per kilogram of body weight daily and 122 to receive initial low-dose gentamicin plus either an antistaphylococcal penicillin or vancomycin. The primary efficacy end point was treatment success 42 days after the end of therapy. RESULTS: Forty-two days after the end of therapy in the modified intention-to-treat analysis, a successful outcome was documented for 53 of 120 patients who received daptomycin as compared with 48 of 115 patients who received standard therapy (44.2 percent vs. 41.7 percent; absolute difference, 2.4 percent; 95 percent confidence interval, -10.2 to 15.1 percent). Our results met prespecified criteria for the noninferiority of daptomycin. The success rates were similar in subgroups of patients with complicated bacteremia, right-sided endocarditis, and methicillin-resistant S. aureus. Daptomycin therapy was associated with a higher rate of microbiologic failure than was standard therapy (19 vs. 11 patients, P=0.17). In 6 of the 19 patients with microbiologic failure in the daptomycin group, isolates with reduced susceptibility to daptomycin emerged; similarly, a reduced susceptibility to vancomycin was noted in isolates from patients treated with vancomycin. As compared with daptomycin therapy, standard therapy was associated with a nonsignificantly higher rate of adverse events that led to treatment failure due to the discontinuation of therapy (17 vs. 8, P=0.06). Clinically significant renal dysfunction occurred in 11.0 percent of patients who received daptomycin and in 26.3 percent of patients who received standard therapy (P=0.004). CONCLUSIONS: Daptomycin (6 mg per kilogram daily) is not inferior to standard therapy for S. aureus bacteremia and right-sided endocarditis. (ClinicalTrials.gov number, NCT00093067.)
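
    As a quick arithmetic check of the figures reported above, the risk difference and its 95 percent confidence interval can be recomputed from the stated success counts with a conventional Wald interval for two proportions (a sketch for illustration; the trial's own statistical analysis plan may have used a different method):

```python
from math import sqrt

# Success counts from the modified intention-to-treat analysis reported above.
x_dap, n_dap = 53, 120   # daptomycin
x_std, n_std = 48, 115   # standard therapy

p1, p2 = x_dap / n_dap, x_std / n_std
diff = p1 - p2
se = sqrt(p1 * (1 - p1) / n_dap + p2 * (1 - p2) / n_std)
lower, upper = diff - 1.96 * se, diff + 1.96 * se

print(f"success rates: {p1:.1%} vs {p2:.1%}")                       # 44.2% vs 41.7%
print(f"difference: {diff:.1%} (95% CI {lower:.1%} to {upper:.1%})")  # 2.4% (-10.2% to 15.1%)
```

    Noninferiority is judged by whether the lower confidence bound stays above the prespecified margin; the margin itself is not given in the abstract, so it is not reproduced here.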

    The state of the Martian climate

    The average annual surface air temperature (SAT) anomaly for 2016 for land stations north of 60°N was +2.0°C, relative to the 1981–2010 average value (Fig. 5.1). This marks a new high for the record starting in 1900, and is a significant increase over the previous highest value of +1.2°C, which was observed in 2007, 2011, and 2015. Average global annual temperatures also showed record values in 2015 and 2016. Currently, the Arctic is warming at more than twice the rate of lower latitudes.

    Teratology Primer-2nd Edition (7/9/2010)

    Foreword: What is Teratology? “What a piece of work is an embryo!” as Hamlet might have said. “In form and moving how express and admirable! In complexity how infinite!” It starts as a single cell, which by repeated divisions gives rise to many genetically identical cells. These cells receive signals from their surroundings and from one another as to where they are in this ball of cells—front or back, right or left, headwards or tailwards, and what they are destined to become. Each cell commits itself to being one of many types; the cells migrate, combine into tissues, or get out of the way by dying at predetermined times and places. The tissues signal one another to take their own pathways; they bend, twist, and form organs. An organism emerges. This wondrous transformation from single-celled simplicity to myriad-celled complexity is programmed by genes that, in the greatest mystery of all, are turned on and off at specified times and places to coordinate the process. It is a wonder that this marvelously emergent operation, where there are so many opportunities for mistakes, ever produces a well-formed and functional organism. And sometimes it doesn’t. Mistakes occur. Defective genes may disturb development in ways that lead to death or to malformations. Extrinsic factors may do the same. “Teratogenic” refers to factors that cause malformations, whether they be genes or environmental agents. The word comes from the Greek “teras,” for “monster,” a term applied in ancient times to babies with severe malformations, which were considered portents or, in the Latin, “monstra.” Malformations can happen in many ways. For example, when the neural plate rolls up to form the neural tube, it may not close completely, resulting in a neural tube defect—anencephaly if the opening is in the head region, or spina bifida if it is lower down. The embryonic processes that form the face may fail to fuse, resulting in a cleft lip. Later, the shelves that will form the palate may fail to move from the vertical to the horizontal, where they should meet in the midline and fuse, resulting in a cleft palate. Or they may meet, but fail to fuse, with the same result. The forebrain may fail to induce the overlying tissue to form the eye, so there is no eye (anophthalmia). The tissues between the toes may fail to break down as they should, and the toes remain webbed. Experimental teratology flourished in the 19th century, and embryologists knew well that the development of bird and frog embryos could be deranged by environmental “insults,” such as lack of oxygen (hypoxia). But the mammalian uterus was thought to be an impregnable barrier that would protect the embryo from such threats. By exclusion, mammalian malformations must be genetic, it was thought. In the early 1940s, several events changed this view. In Australia, an astute ophthalmologist, Norman Gregg, established a connection between maternal rubella (German measles) and the triad of cataracts, heart malformations, and deafness. In Cincinnati, Josef Warkany, an Austrian pediatrician, showed that depriving female rats of vitamin B2 (riboflavin) could cause malformations in their offspring—one of the early experimental demonstrations of a teratogen. Warkany was trying to produce congenital cretinism by putting the rats on an iodine-deficient diet. The diet did indeed cause malformations, but not because of the iodine deficiency; depleting the diet of iodine had also depleted it of riboflavin!
Several other teratogens were found in experimental animals, including nitrogen mustard (an anticancer drug), trypan blue (a dye), and hypoxia (lack of oxygen). The pendulum was swinging back; it seemed that malformations were not genetically but environmentally caused. In Montreal, in the early 1950s, Clarke Fraser’s group wanted to bring genetics back into the picture. They had found that treating pregnant mice with cortisone caused cleft palate in the offspring, and showed that the frequency was high in some strains and low in others. The only difference was in the genes. So began “teratogenetics,” the study of how genes influence the embryo’s susceptibility to teratogens. The McGill group went on to develop the idea that an embryo’s genetically determined normal pattern of development could influence its susceptibility to a teratogen—the multifactorial threshold concept. For instance, an embryo must move its palate shelves from vertical to horizontal before a certain critical point or they will not meet and fuse. A teratogen that causes cleft palate by delaying shelf movement beyond this point is more likely to do so in an embryo whose genes normally move its shelves late. As studies of the basis for abnormal development progressed, patterns began to appear, and the principles of teratology were developed. These stated, in summary, that the probability of a malformation being produced by a teratogen depends on the dose of the agent, the stage at which the embryo is exposed, and the genotype of the embryo and mother. The number of mammalian teratogens grew, and those who worked with them began to meet from time to time, to talk about what they were finding, leading, in 1960, to the formation of the Teratology Society. There were, of course, concerns about whether these experimental teratogens would be a threat to human embryos, but it was thought, by me at least, that they were all “sledgehammer blows” that would be teratogenic in people only at doses far above those to which human embryos would be exposed. So not to worry, or so we thought. Then came thalidomide, a totally unexpected catastrophe. The discovery that ordinary doses of this supposedly “harmless” sleeping pill and anti-nauseant could cause severe malformations in human babies galvanized this new field of teratology. Scientists who had been quietly working in their laboratories suddenly found themselves spending much of their time in conferences and workshops, sitting on advisory committees, acting as consultants for pharmaceutical companies, regulatory agencies, and lawyers, as well as redesigning their research plans. The field of teratology and developmental toxicology expanded rapidly. The following pages will show how far we have come, and how many important questions still remain to be answered. A lot of effort has gone into developing ways to predict how much of a hazard a particular experimental teratogen would be to the human embryo (chapters 9–19). It was recognized that animal studies might not prove a drug was “safe” for the human embryo (in spite of great pressure from legislators and the public to do so), since species can vary in their responses to teratogenic exposures. A number of human teratogens have been identified, and some, suspected of teratogenicity, have been exonerated—at least of a detectable risk (chapters 21–32). Regulations for testing drugs before market release have greatly improved (chapter 14).
Other chapters deal with how much such things as population studies (chapter 11), post-marketing surveillance (chapter 13), and systems biology (chapter 16) add to our understanding. And, in a major advance, the role of maternal folate in preventing neural tube defects and other birth defects is being exploited (chapter 32). Encouraging women to take folic acid supplements and adding folate to flour have produced dramatic falls in the frequency of neural tube defects in many parts of the world. Progress has been made not only in using animal studies to predict human risks, but also in using them to illumine how, and under what circumstances, teratogens act to produce malformations (chapters 2–8). These studies have contributed greatly to our knowledge of abnormal and also normal development. Now we are beginning to see exactly when and where the genes turn on and off in the embryo, to appreciate how they guide development and to gain exciting new insights into how genes and teratogens interact. The prospects for progress in the war on birth defects have never been brighter. F. Clarke Fraser, McGill University (Emeritus), Montreal, Quebec, Canada

    Noise-Driven Stem Cell and Progenitor Population Dynamics

    BACKGROUND: The balance between maintenance of the stem cell state and terminal differentiation is influenced by the cellular environment. The switching between these states has long been understood as a transition between attractor states of a molecular network. In this view, stochastic fluctuations are either suppressed or can trigger the transition, but they do not actually determine the attractor states. METHODOLOGY/PRINCIPAL FINDINGS: We present a novel mathematical concept in which stem cell and progenitor population dynamics are described as a probabilistic process that arises from cell proliferation and small fluctuations in the state of differentiation. These state fluctuations reflect random transitions between different activation patterns of the underlying regulatory network. Importantly, the associated noise amplitudes are state-dependent and set by the environment. Their variability determines the attractor states, and thus actually governs population dynamics. This model quantitatively reproduces the observed dynamics of differentiation and dedifferentiation in promyelocytic precursor cells. CONCLUSIONS/SIGNIFICANCE: Consequently, state-specific noise modulation by external signals can be instrumental in controlling stem cell and progenitor population dynamics. We propose follow-up experiments for quantifying the imprinting influence of the environment on cellular noise regulation.
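
    To make the central idea tangible, the toy simulation below (a hedged sketch, not the authors' model or parameters) moves a cell population along a one-dimensional differentiation coordinate with no deterministic drift at all; only the state-dependent noise amplitude, standing in for the environment, differs between the two runs, and it alone decides where the population accumulates.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(noise_amp, n_cells=5000, steps=20000, dt=0.01):
    """Toy model: x in [0, 1] is a differentiation coordinate (0 = stem-like,
    1 = differentiated). There is no deterministic drift; cells move only through
    a state-dependent noise term whose amplitude noise_amp(x) stands in for
    environment-controlled fluctuations. Boundaries are reflecting."""
    x = rng.uniform(0.0, 1.0, n_cells)
    for _ in range(steps):
        x += noise_amp(x) * np.sqrt(dt) * rng.standard_normal(n_cells)
        x = np.abs(x)                # reflect at x = 0
        x = 1.0 - np.abs(1.0 - x)    # reflect at x = 1
    return x

# Two hypothetical "environments" that differ only in where fluctuations are large;
# the population piles up wherever the noise amplitude is smallest.
quiet_stem_end = simulate(lambda x: 0.05 + 0.45 * x)   # fluctuations grow with differentiation
quiet_diff_end = simulate(lambda x: 0.50 - 0.45 * x)   # fluctuations largest in stem-like states

print("stem-like fraction (x < 0.3):",
      round(float(np.mean(quiet_stem_end < 0.3)), 2), "vs",
      round(float(np.mean(quiet_diff_end < 0.3)), 2))
```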

    [Comment] Redefine statistical significance

    The lack of reproducibility of scientific studies has caused growing concern over the credibility of claims of new discoveries based on “statistically significant” findings. There has been much progress toward documenting and addressing several causes of this lack of reproducibility (e.g., multiple testing, P-hacking, publication bias, and underpowered studies). However, we believe that a leading cause of non-reproducibility has not yet been adequately addressed: Statistical standards of evidence for claiming discoveries in many fields of science are simply too low. Associating “statistically significant” findings with P < 0.05 results in a high rate of false positives even in the absence of other experimental, procedural, and reporting problems. For fields where the threshold for defining statistical significance is P < 0.05, we propose a change to P < 0.005. This simple step would immediately improve the reproducibility of scientific research in many fields. Results that would currently be called “significant” but do not meet the new threshold should instead be called “suggestive.” While statisticians have known the relative weakness of using P ≈ 0.05 as a threshold for discovery, and the proposal to lower it to 0.005 is not new (1, 2), a critical mass of researchers now endorse this change. We restrict our recommendation to claims of discovery of new effects. We do not address the appropriate threshold for confirmatory or contradictory replications of existing claims. We also do not advocate changes to discovery thresholds in fields that have already adopted more stringent standards (e.g., genomics and high-energy physics research; see Potential Objections below). We also restrict our recommendation to studies that conduct null hypothesis significance tests. We have diverse views about how best to improve reproducibility, and many of us believe that other ways of summarizing the data, such as Bayes factors or other posterior summaries based on clearly articulated model assumptions, are preferable to P-values. However, changing the P-value threshold is simple and might quickly achieve broad acceptance.
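
    The claim that a P < 0.05 threshold produces a high false positive rate can be illustrated with a short calculation over assumed study parameters (the prior odds that a tested effect is real and the statistical power are hypothetical choices here, not values from the comment, and power is held fixed across thresholds for simplicity):

```python
def false_positive_rate(alpha, power, prior_odds_true):
    """Among studies that reach P < alpha, the fraction whose null hypothesis is true.
    prior_odds_true = (true effects) : (null effects) among hypotheses being tested."""
    p_true = prior_odds_true / (1 + prior_odds_true)
    p_null = 1 - p_true
    false_pos = alpha * p_null    # nulls that cross the threshold by chance
    true_pos = power * p_true     # real effects that are detected
    return false_pos / (false_pos + true_pos)

# Hypothetical scenario: 1 in 11 tested hypotheses is a real effect, 80% power.
for alpha in (0.05, 0.005):
    fpr = false_positive_rate(alpha, power=0.80, prior_odds_true=1 / 10)
    print(f"alpha = {alpha}: ~{fpr:.0%} of 'significant' findings are false positives")
```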