
    Analysis of the first gigantic jet recorded over continental North America

    Two low-light cameras near Marfa, Texas, recorded a gigantic jet over northern Mexico on 13 May 2005 at approximately 0423:50 UTC. Assuming that the farther of the two candidate storm systems was its source, the bright lower channel ended in a fork at around 50–59 km height, while the very dim upper branches extended to 69–80 km altitude. During the time window containing the jet, extremely low frequency magnetic field recordings show no fast charge moment change larger than 50 coulomb-kilometers (C km), but a larger and slower charge moment change of 520 C km over 70 ms. The likely parent thunderstorm was a high-precipitation supercell cluster containing a persistent mesocyclone, with radar echo tops of at least 17 km. However, photogrammetric analysis suggests that the gigantic jet occurred over the forward flank downdraft region, where echo tops reached 14 km. This part of the supercell may have had an inverted-polarity charge configuration, as evidenced by positive cloud-to-ground lightning flashes (+CG) dominating over negative flashes (-CG), while -CGs occurred under the downwind anvil. Four minutes before the gigantic jet, -CG activity practically ceased in this area while +CG rates increased, culminating during the 20 s leading up to the gigantic jet with four National Lightning Detection Network-detected +CGs. A relative lull in lightning activity of both polarities was observed for up to 1.5 min after the gigantic jet. The maturing storm subsequently produced 30 sprites between 0454 and 0820 UTC, some associated with extremely large impulse charge moment change values.
    Peer reviewed. Postprint (published version).
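    As a back-of-the-envelope check, the figures quoted in the abstract (520 C km over 70 ms) imply an average current moment during the slow change; treating the change as uniform over the 70 ms window is an assumption made here for illustration only.

```python
# Values taken from the abstract above.
charge_moment_C_km = 520.0   # slower charge moment change, in C km
duration_s = 0.070           # duration of the change: 70 ms

# Average current moment over the window, in A km (1 C/s = 1 A).
# Assumes a uniform rate of change, which is a simplification.
current_moment_A_km = charge_moment_C_km / duration_s
print(round(current_moment_A_km))  # 7429, i.e. roughly 7.4 kA km
```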

    Trace metal fluxes to the ocean: The importance of high‐standing oceanic islands

    Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/94592/1/grl16149.pd

    Tissue-preserving approach to extracting DNA from paraffin-embedded specimens using tissue microarray technology

    Background. DNA extracted from tumor cells or normal cells contained in formalin-fixed, paraffin-embedded tissues is widely used in many laboratories. The 2 most common procedures for isolating cells for DNA extraction from paraffin-embedded tissues are scalpel microdissection and laser capture microdissection. A new tissue- and time-conserving method for rapid DNA isolation from small cores taken from paraffin-embedded tissue blocks is described in this report. Methods. DNA was extracted from small tissue cores collected from paraffin-embedded tissue blocks at the time of tissue microarray construction. The quality and quantity of the extracted DNA were compared to those of DNA collected by scalpel microdissection. DNA collected from tissue cores was used in polymerase chain reaction (PCR) and loss of heterozygosity (LOH) analysis. Results. The quality and quantity of DNA obtained using tissue cores were comparable to those of DNA obtained by traditional methods. The tissue core method of DNA extraction preserves the tissue blocks from which the cores are extracted for future use. Adequate quantities of DNA can be successfully extracted from small segments of tissue cores and used for PCR. DNA isolated by tissue microdissection and by the tissue core method was comparable when used to assess allelic heterozygosity on chromosome arm 18q. Conclusion. The tissue core method of DNA isolation is reliable, tissue conserving, and time effective. Tissue cores for DNA extraction can be harvested at the same time as tissue microarray construction. The technique has the advantage of preserving the original tissue blocks for additional study, as only tiny cores are removed. © 2007 Wiley Periodicals, Inc. Head Neck, 2007.
    Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/56021/1/20547_ftp.pd

    Investigating the health implications of social policy initiatives at the local level: study design and methods

    Background: In this paper we present the research design and methods of a study that seeks to capture local-level responses to an Australian national social policy initiative aimed at reducing inequalities in the social determinants of health.
    Methods/Design: The study takes a policy-to-practice approach and combines policy and stakeholder interviewing with a comparative case study analysis of two not-for-profit organisations involved in the delivery of federal government policy.
    Discussion: Before the health impacts of broad-scale policies, such as the one described in this study, can be assessed at the population level, we need to understand the implementation process. This is consistent with current thinking in political science and social policy, which has emphasised the importance of investigating how, and if, policies are translated into operational realities.

    Support and Assessment for Fall Emergency Referrals (SAFER 1) trial protocol. Computerised on-scene decision support for emergency ambulance staff to assess and plan care for older people who have fallen: evaluation of costs and benefits using a pragmatic cluster randomised trial

    Background: Many emergency ambulance calls are for older people who have fallen. As half of them are left at home, a community-based response may often be more appropriate than hospital attendance. The SAFER 1 trial will assess the costs and benefits of a new healthcare technology - hand-held computers with computerised clinical decision support (CCDS) software - to help paramedics decide who needs hospital attendance, and who can be safely left at home with referral to community falls services. Methods/Design: Pragmatic cluster randomised trial with a qualitative component. We shall allocate 72 paramedics ('clusters') at random between receiving the intervention and a control group delivering care as usual, of whom we expect 60 to complete the trial. Patients are eligible if they are aged 65 or older, live in the study area but not in residential care, and are attended by a study paramedic following an emergency call for a fall. Seven to 10 days after the index fall we shall offer patients the opportunity to opt out of further follow-up. Continuing participants will receive questionnaires after 1 and 6 months, and we shall monitor their routine clinical data for 6 months. We shall interview 20 of these patients in depth. We shall conduct focus groups or semi-structured interviews with paramedics and other stakeholders. The primary outcome is the interval to the first subsequent reported fall (or death). We shall analyse this and other measures of outcome, process, and cost by 'intention to treat'. We shall analyse qualitative data thematically. Discussion: Since the SAFER 1 trial received funding in August 2006, implementation has come to terms with ambulance service reorganisation and a new national electronic patient record in England. In response to these hurdles the research team has adapted the research design, including aspects of the intervention, to meet the needs of the ambulance services.
    In conclusion, this complex emergency care trial will provide rigorous evidence on the clinical and cost effectiveness of CCDS for paramedics in the care of older people who have fallen.
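    The cluster allocation described above (72 paramedic 'clusters' randomised between intervention and usual care) can be sketched as follows. The exact even split, the fixed seed, and the function name are illustrative assumptions, not the trial's actual randomisation procedure.

```python
import random

def allocate_clusters(cluster_ids, seed=0):
    """Randomly split clusters (here, paramedics) into two arms.

    An even 50:50 split and a fixed seed are assumptions for this sketch;
    real trials typically use concealed, often stratified, allocation.
    """
    rng = random.Random(seed)
    ids = list(cluster_ids)
    rng.shuffle(ids)           # random permutation of the clusters
    half = len(ids) // 2
    return {"intervention": ids[:half], "control": ids[half:]}

arms = allocate_clusters(range(1, 73))  # 72 paramedic 'clusters'
print(len(arms["intervention"]), len(arms["control"]))  # 36 36
```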

    Facts, values, and Attention-Deficit Hyperactivity Disorder (ADHD): an update on the controversies

    The Hastings Center, a bioethics research institute, is holding a series of 5 workshops to examine the controversies surrounding the use of medication to treat emotional and behavioral disturbances in children. These workshops bring together clinicians, researchers, scholars, and advocates with diverse perspectives and from diverse fields. Our first commentary in CAPMH, which grew out of our first workshop, explained our method and explored the controversies in general. This commentary, which grows out of our second workshop, explains why informed people can disagree about ADHD diagnosis and treatment. Based on what workshop participants said and our understanding of the literature, we make 8 points. (1) The ADHD label is based on the interpretation of a heterogeneous set of symptoms that cause impairment. (2) Because symptoms and impairments are dimensional, there is an inevitable "zone of ambiguity," which reasonable people will interpret differently. (3) Many other variables, from different systems and tools of diagnosis to different parenting styles and expectations, also help explain why behaviors associated with ADHD can be interpreted differently. (4) Because people hold competing views about the proper goals of psychiatry and parenting, some people will be more, and others less, concerned about treating children in the zone of ambiguity. (5) To recognize that nature has written no bright line between impaired and unimpaired children, and that it is the responsibility of humans to choose who should receive a diagnosis, does not diminish the significance of ADHD. (6) Once ADHD is diagnosed, the facts surrounding the most effective treatment are complicated and incomplete; contrary to some popular wisdom, behavioral treatments, alone or in combination with low doses of medication, can be effective in the long-term reduction of core ADHD symptoms and in improving many aspects of overall functioning. (7) Especially when a child occupies the zone of ambiguity, different people will emphasize different values embedded in the pharmacological and behavioral approaches. (8) Truly informed decision-making requires that parents (and, to the extent they are able, children) have some sense of the complicated and incomplete facts regarding the diagnosis and treatment of ADHD.

    Emergence of Spatial Structure in Cell Groups and the Evolution of Cooperation

    On its own, a single cell cannot exert more than a microscopic influence on its immediate surroundings. However, via strength in numbers and the expression of cooperative phenotypes, such cells can enormously impact their environments. Simple cooperative phenotypes appear to abound in the microbial world, but explaining their evolution is challenging because they are often subject to exploitation by rapidly growing, non-cooperative cell lines. Population spatial structure may be critical for this problem because it influences the extent of interaction between cooperative and non-cooperative individuals. It is difficult for cooperative cells to succeed in competition if they become mixed with non-cooperative cells, which can exploit the public good without themselves paying a cost. However, if cooperative cells are segregated in space and preferentially interact with each other, they may prevail. Here we use a multi-agent computational model to study the origin of spatial structure within growing cell groups. Our simulations reveal that the spatial distribution of genetic lineages within these groups is linked to a small number of physical and biological parameters, including cell growth rate, nutrient availability, and nutrient diffusivity. Realistic changes in these parameters qualitatively alter the emergent structure of cell groups, and thereby determine whether cells with cooperative phenotypes can locally and globally outcompete exploitative cells. We argue that cooperative and exploitative cell lineages will spontaneously segregate in space under a wide range of conditions and, therefore, that cellular cooperation may evolve more readily than naively expected.
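    The segregation mechanism described above can be illustrated with a toy one-dimensional growth model. This is an assumption-laden sketch, not the authors' multi-agent simulation: cells divide locally and daughters stay next to their parents, so an initially well-mixed set of lineages collapses into contiguous same-lineage sectors as the group grows.

```python
import random

def expand(n_lineages=10, final_size=200, seed=1):
    """Grow a 1-D cell group from a well-mixed founding population.

    Each cell carries the label of its founding lineage. Growth is purely
    local: a randomly chosen cell divides and its daughter is inserted
    right next to it, pushing neighbours outward with no long-range mixing.
    """
    rng = random.Random(seed)
    # Well-mixed founders: lineage labels drawn at random.
    front = [rng.randrange(n_lineages) for _ in range(n_lineages)]
    while len(front) < final_size:
        i = rng.randrange(len(front))
        front.insert(i, front[i])  # daughter placed adjacent to parent
    return front

def n_sectors(front):
    """Count contiguous same-lineage runs; fewer runs = more segregation."""
    return 1 + sum(a != b for a, b in zip(front, front[1:]))

pop = expand()
print(len(pop), n_sectors(pop))
```

Because a daughter is always inserted beside an identical parent, the number of lineage boundaries can never increase during growth, so the 200-cell group still contains at most 10 sectors: average sector size grows with the population, which is the spontaneous spatial segregation the abstract argues can shelter cooperators from exploiters.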