
    The dynamic cusp at low altitudes: A case study combining Viking, DMSP, and Sondrestrom incoherent scatter radar observations

    A case study combining data from three satellites and a ground-based radar is presented, focusing on a detailed discussion of observations of the dynamic cusp made on 24 September 1986 in the dayside high-latitude ionosphere and interior magnetosphere. The relevant data from space-borne and ground-based sensors are presented, including in-situ particle and field measurements from the DMSP-F7 and Viking spacecraft and Sondrestrom radar observations of the ionosphere. These data are augmented by observations of the IMF and the solar wind plasma. The observations are compared with predictions of the ionospheric response to the observed particle precipitation, obtained from an auroral model. Observations and model calculations agree well and provide a picture of the ionospheric footprint of the cusp in an invariant latitude versus local time frame. The combination of Viking, Sondrestrom radar, and IMP-8 data suggests that we observed an ionospheric signature of the dynamic cusp, whose spatial variation over time, which appeared closely related to the southward component of the IMF, was monitored

    ULF waves in the low‐latitude boundary layer and their relationship to magnetospheric pulsations: A multisatellite observation

    On April 30 (day 120), 1985, the magnetosphere was compressed at 0923 UT and the subsolar magnetopause remained near 7 RE geocentric for ∼2 hours, during which the four spacecraft Spacecraft Charging At High Altitude (SCATHA), GOES 5, GOES 6, and Active Magnetospheric Particle Tracer Explorers (AMPTE) CCE were all in the magnetosphere on the morning side. SCATHA was in the low-latitude boundary layer (LLBL) in the second half of this period. The interplanetary magnetic field was inferred to be northward from the characteristics of precipitating particle fluxes observed by the low-altitude satellite Defense Meteorological Satellite Program (DMSP) F7 and also from the absence of substorms. We used magnetic field and particle data from this unique interval to study ULF waves in the LLBL and their relationship to magnetic pulsations in the magnetosphere. The LLBL was identified from the properties of particles, including bidirectional field-aligned electron beams at ∼200 eV. In the boundary layer the magnetic field exhibited both a 5–10 min irregular compressional oscillation and broadband (Δƒ/ƒ ∼ 1), primarily transverse oscillations with a mean period of ∼50 s and a left-hand sense of polarization about the mean field. The former can be observed by other satellites and is likely due to pressure variations in the solar wind, while the latter is likely due to a Kelvin-Helmholtz (K.-H.) instability occurring in the LLBL or on the magnetopause. A strongly transverse ∼3-s oscillation was also observed in the LLBL. The magnetospheric pulsations, which exhibited position-dependent frequencies, may be explained in terms of field line resonance with a broadband source wave, that is, either the pressure-induced compressional wave or the K.-H. wave generated in or near the boundary layer

    Spatial variation in breeding habitat selection by Cerulean Warblers (Setophaga cerulea) throughout the Appalachian Mountains

    Studies of habitat selection are often of limited utility because they focus on small geographic areas, fail to examine behavior at multiple scales, or lack an assessment of the fitness consequences of habitat decisions. These limitations can hamper the identification of successful site-specific management strategies, which are urgently needed for severely declining species like Cerulean Warblers (Setophaga cerulea). We assessed how breeding habitat decisions made by Cerulean Warblers at multiple scales, and the subsequent effects of these decisions on nest survival, varied across the Appalachian Mountains. Selection for structural habitat features varied substantially among areas, particularly at the territory scale. Males within the least-forested landscapes selected microhabitat features that reflected more closed-canopy forest conditions, whereas males in highly forested landscapes favored features associated with canopy disturbance. Selection of nest-patch and nest-site attributes by females was more consistent across areas, with females selecting for increased tree size and understory cover and decreased basal area and midstory cover. Floristic preferences were similar across study areas: White Oak (Quercus alba), Cucumber-tree (Magnolia acuminata), and Sugar Maple (Acer saccharum) were preferred as nest trees, whereas red oak species (subgenus Erythrobalanus) and Red Maple (A. rubrum) were avoided. The habitat features that were related to nest survival also varied among study areas, and preferred features were negatively associated with nest survival at one area. Thus, our results indicate that large-scale spatial heterogeneity may influence local habitat-selection behavior and that it may be necessary to articulate site-specific management strategies for Cerulean Warblers

    Breeding season concerns and response to forest management: can forest management produce more breeding birds?

    Cerulean Warblers (Setophaga cerulea), one of the fastest declining avian species in North America, are associated with heterogeneous canopies in mature hardwood forests. However, the age of most second- and third-growth forests in eastern North America is not sufficient for natural tree mortality to maintain structurally diverse canopies. Previous research suggests that forest management through timber harvest may also create conditions suitable as Cerulean Warbler breeding habitat. We conducted a multistate study that examined Cerulean Warbler response to varying degrees of canopy disturbance created by operational timber harvest. Specifically, 3 harvest treatments and an un-harvested reference plot were replicated on 7 study areas in 4 Appalachian states in 2005-2010. We compared the pre-harvest and four-year post-harvest demographic responses of Cerulean Warblers. Over all study areas, Cerulean Warbler territory density remained stable in un-harvested reference plots and increased significantly in the first year post-harvest on intermediate harvest plots. By year 3 post-harvest, territory density remained significantly greater for intermediate harvests than reference plots, and marginally greater for light and heavy harvests than reference plots. However, un-harvested reference plots had greater nest survival than most harvest treatments; the one exception was that nest survival did not differ between reference plots and the intermediate harvest on the northern study areas. Our results indicate that intermediate harvests likely benefit Cerulean Warblers in some portions of the species’ breeding range. However, additional research is needed to better examine the fitness consequences of timber harvests and to estimate population-level implications. In particular, does the greater number of nesting individuals, particularly in intermediate harvests, compensate for lower nesting success? 
Until researchers provide such insight, we recommend management decisions be based on local conditions, particularly in forests where Cerulean Warbler populations are high

    Emulating Natural Disturbances for Declining Late- Successional Species: A Case Study of the Consequences for Cerulean Warblers (Setophaga cerulea)

    Forest cover in the eastern United States has increased over the past century, and while some late-successional species have benefited from this process as expected, others have experienced population declines. These declines may be related in part to contemporary reductions in small-scale forest interior disturbances such as fire, windthrow, and treefalls. To mitigate the negative impacts of disturbance alteration and suppression on some late-successional species, strategies that emulate natural disturbance regimes are often advocated, but large-scale evaluations of these practices are rare. Here, we assessed the consequences of experimental disturbance (using partial timber harvest) for a severely declining late-successional species, the cerulean warbler (Setophaga cerulea), across the core of its breeding range in the Appalachian Mountains. We measured numerical (density), physiological (body condition), and demographic (age structure and reproduction) responses to three levels of disturbance and explored the potential impacts of disturbance on source-sink dynamics. Breeding densities of warblers increased one to four years after all canopy disturbances (vs. controls), and males occupying territories on treatment plots were in better condition than those on control plots. However, these beneficial effects of disturbance did not correspond to improvements in reproduction; nest success was lower on all treatment plots than on control plots in the southern region and marginally lower on light disturbance plots in the northern region. Our data suggest that only habitats in the southern region acted as sources, and that interior disturbances in this region have the potential to create ecological traps at a local scale but act as sources when viewed at broader scales. 
Thus, cerulean warblers would likely benefit from management that strikes a landscape-level balance between emulating natural disturbances in order to attract individuals into areas where current structure is inappropriate, and limiting anthropogenic disturbance in forests that already possess appropriate structural attributes in order to maintain maximum productivity

    Validation of 2006 WHO Prediction Scores for True HIV Infection in Children Less than 18 Months with a Positive Serological HIV Test

    All infants born to HIV-positive mothers have maternal HIV antibodies, which sometimes persist for up to 18 months. When Polymerase Chain Reaction (PCR) testing is not available, the August 2006 World Health Organization (WHO) recommendations suggest that clinical criteria may be used for starting antiretroviral treatment (ART) in HIV-seropositive children <18 months. The predictors are at least two of sepsis, severe pneumonia, and thrush, or any stage 4 defining clinical finding according to the WHO staging system. From January 2005 to October 2006, we conducted a prospective study on 236 hospitalized children <18 months old with a positive HIV serological test at the national reference hospital in Kigali. The following data were collected: PCR, clinical signs, and CD4 cell count. The currently proposed clinical criteria were present in 148 of 236 children (62.7%) and in 95 of 124 infected children, resulting in 76.6% sensitivity and 52.7% specificity. For 87 children (59.0%), clinical diagnosis was made based on severe unexplained malnutrition (stage 4 clinical WHO classification), of whom only 44 (50.5%) were PCR positive. Low CD4 count had a sensitivity of 55.6% and a specificity of 78.5%. As PCR is not yet widely available, clinical diagnosis is often necessary, but these criteria have poor specificity and therefore limited use for HIV diagnosis. Unexplained malnutrition is not defined clearly enough in the WHO recommendations. Extrapulmonary tuberculosis (TB), almost impossible to prove in young children, may often be the cause of malnutrition, especially in HIV-affected families, who are more often exposed to TB. Food supplementation and TB treatment should be initiated before starting ART in children who are staged based only on severe malnutrition

    Assessing smoking status in disadvantaged populations: is computer administered self report an accurate and acceptable measure?

    Background: Self report of smoking status is potentially unreliable in certain situations and in high-risk populations. This study aimed to determine the accuracy and acceptability of computer administered self report of smoking status among a low socioeconomic status (SES) population. Methods: Clients attending a community service organisation for welfare support were invited to complete a cross-sectional touch screen computer health survey. Following survey completion, participants were invited to provide a breath sample to measure exposure to tobacco smoke in expired air. Sensitivity, specificity, positive predictive value and negative predictive value were calculated. Results: Three hundred and eighty three participants completed the health survey, and 330 (86%) provided a breath sample. Of the participants included in the validation analysis, 59% reported being daily or occasional smokers. Sensitivity was 94.4% and specificity 92.8%. The positive and negative predictive values were 94.9% and 92.0% respectively. The majority of participants reported that the touch screen survey was both enjoyable (79%) and easy (88%) to complete. Conclusions: Computer administered self report is both acceptable and accurate as a method of assessing smoking status among low SES smokers in a community setting. Routine collection of health information using touch screen computers has the potential to identify smokers and increase provision of support and referral in the community setting
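    The four validation figures reported above (sensitivity 94.4%, specificity 92.8%, PPV 94.9%, NPV 92.0%) all derive from a 2×2 confusion matrix of self-reported smoking status against the breath-test reference. A minimal sketch of that arithmetic follows; the cell counts used in the example are hypothetical, since the abstract does not report the raw matrix:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Validation metrics from a 2x2 confusion matrix.

    tp: self-reported smokers confirmed by the breath test
    fp: self-reported smokers not confirmed by the breath test
    fn: breath-test smokers who did not report smoking
    tn: non-smokers correctly self-reported
    """
    return {
        "sensitivity": tp / (tp + fn),  # smokers correctly identified
        "specificity": tn / (tn + fp),  # non-smokers correctly identified
        "ppv": tp / (tp + fp),          # P(true smoker | reported smoker)
        "npv": tn / (tn + fn),          # P(true non-smoker | reported non-smoker)
    }

# Hypothetical counts for illustration only (not from the study):
m = diagnostic_metrics(tp=180, fp=12, fn=10, tn=128)
```

    Note that sensitivity and specificity are properties of the test itself, while PPV and NPV also depend on the prevalence of smoking in the sample, which is why all four are reported.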

    Explaining the effects of a multifaceted intervention to improve inpatient care in rural Kenyan hospitals -- interpretation based on retrospective examination of data from participant observation, quantitative and qualitative studies

    Background: We have reported the results of a cluster randomized trial in rural Kenyan hospitals evaluating the effects of an intervention to introduce care based on best-practice guidelines. In parallel work we described the context of the study, explored the process and perceptions of the intervention, and undertook a discrete study on health worker motivation, because motivation was felt likely to be an important contributor to poor performance in Kenyan public sector hospitals. Here, we use data from these multiple studies, and insights gained from being participants in and observers of the intervention process, to explain how the intervention effects were achieved, as part of an effort to better understand implementation in low-income hospital settings. Methods: Initial hypotheses were generated to explain the variation in intervention effects across place, time, and effect measure (indicator), based on our understanding of theory and informed by our implementation experience and participant observations. All data sources available for the hospitals considered as case studies were then examined to determine whether the hypotheses were supported, rejected, or required modification. Data included transcriptions of interviews and group discussions, field notes, and data from the detailed longitudinal quantitative investigation. Potentially useful explanatory themes were identified, discussed by the implementing and research teams, revised, and merged as part of an iterative process aimed at building more generic explanatory theory. At the end of this process, findings were mapped against a recently reported comprehensive framework for implementation research. Results: A normative re-educative intervention approach evolved that sought to reset norms and values concerning good practice and to promote 'grass-roots' participation to improve delivery of correct care. Maximal effects were achieved when this strategy, together with external support supervision, helped create a soft contract with senior managers clarifying roles and expectations around desired performance. This, combined with the support of facilitators acting as an expert resource and 'shop-floor' change agent, led to improvements in leadership, accountability, and resource allocation that enhanced workers' commitment and capacity and improved clinical microsystems. Provision of correct care was then particularly likely if tasks were simple and fitted well with existing professional routines. Our findings were in broad agreement with those defined as part of recent work articulating a comprehensive framework for implementation research. Conclusions: Using data from multiple studies can provide valuable insight into how an intervention works and what factors may explain variability in its effects. Our findings clearly suggest that major intervention strategies aimed at improving child and newborn survival in low-income settings should go well beyond the fixed inputs (training, guidelines, and job aids) that are typical of many major programmes. Strategies for delivering good care in low-income settings should recognize that it will need to be co-produced through engagement, often over prolonged periods, as part of a directive but adaptive, participatory, information-rich, and reflective process