
    Gentamicin pharmacokinetics during slow daily home hemodialysis

    Background: Gentamicin is commonly used in hemodialysis patients, and its pharmacokinetics during traditional hemodialysis have been described. Use of slow daily home (SDH) hemodialysis (7 to 9 hours a day, 6 days a week) is increasing because of the benefits observed with increased hemodialysis. We determined gentamicin pharmacokinetics in SDH hemodialysis patients. Methods: Eight patients (four male and four female) received a single intravenous dose of 0.6 mg/kg gentamicin post-hemodialysis. Blood samples were collected at 5, 10, 15, 30, and 60 minutes after the dose. The next day, patients underwent a typical SDH hemodialysis session (high-flux F50NR dialyzer). Blood samples were taken at 0, 5, 15, 60, 120, 240, 360, and 480 minutes during hemodialysis and at 15, 30, and 60 minutes post-hemodialysis. Baseline and 24-hour urine samples were collected. Pharmacokinetic parameters were calculated assuming a one-compartment model. Results: Patients were 42.5 ± 13.1 years old (mean ± SD). Inter-, intra-, and post-hemodialysis collection periods were 17.0 ± 2.1 hours, 8.1 ± 0.4 hours, and 1.1 ± 0.1 hours, respectively. Intra- and interdialytic gentamicin half-lives differed (intradialytic, 3.7 ± 0.8 hours; interdialytic, 20.4 ± 4.7 hours; P < 0.0001). Hemodialysis clearance accounted for 70.5% of total gentamicin clearance. Renal clearance correlated with glomerular filtration rate (GFR) (renal clearance = 1.2 GFR; r2 = 0.98; P < 0.001). Mean peak and trough hemodialysis concentrations were 1.8 ± 0.6 μg/mL and 0.5 ± 0.2 μg/mL, respectively. Post-hemodialysis rebound was 3.1 ± 8.8% at 1 hour. Conclusion: The pharmacokinetic model predicts that a 2.0 to 2.5 mg/kg gentamicin dose given post-hemodialysis would provide peak (1 hour post-dose) and trough (end of SDH hemodialysis session) concentrations of 6.0 to 7.5 μg/mL and 0.7 to 0.8 μg/mL, respectively. This would provide adequate coverage for most gram-negative organisms in SDH hemodialysis patients.
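
    The dose projection in the Conclusion follows from standard one-compartment, first-order kinetics with separate intra- and interdialytic elimination rates. The Python sketch below illustrates that kind of projection using the half-lives reported above; the volume of distribution and the interval lengths are assumed placeholder values, not the study's fitted parameters, so this is not the authors' model.

```python
import math

# One-compartment projection using the half-lives reported in the abstract.
# The volume of distribution and interval lengths are illustrative
# assumptions, not values estimated in the study.
T_HALF_INTER_H = 20.4    # interdialytic half-life (h), from abstract
T_HALF_INTRA_H = 3.7     # intradialytic half-life (h), from abstract
VD_L_PER_KG = 0.3        # assumed volume of distribution (L/kg)

def conc_after(c0_ug_ml: float, hours: float, t_half_h: float) -> float:
    """First-order decay: C(t) = C0 * exp(-ln(2) / t_half * t)."""
    k = math.log(2) / t_half_h
    return c0_ug_ml * math.exp(-k * hours)

def project(dose_mg_per_kg: float, interdialytic_h: float = 16.0,
            dialysis_h: float = 8.0) -> tuple[float, float]:
    """Project the 1-hour post-dose peak and the end-of-dialysis trough."""
    c0 = dose_mg_per_kg / VD_L_PER_KG             # ug/mL, since mg/L == ug/mL
    peak = conc_after(c0, 1.0, T_HALF_INTER_H)    # 1 h after the post-dialysis dose
    pre_hd = conc_after(c0, interdialytic_h, T_HALF_INTER_H)
    trough = conc_after(pre_hd, dialysis_h, T_HALF_INTRA_H)
    return peak, trough

if __name__ == "__main__":
    for dose in (2.0, 2.5):
        peak, trough = project(dose)
        print(f"{dose} mg/kg -> peak ~{peak:.1f} ug/mL, trough ~{trough:.1f} ug/mL")
```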

    Pastoralist Use of Reseeded Fields for Additional Resilience to Climate Variability through Alternate Livelihood Activities

    Land degradation, climate variability, socio-economic changes and population increase are some of the factors that shrink grazing lands, making forage availability less predictable and reducing the resilience of pastoralist communities. Reseeding as an intervention can rehabilitate degraded lands for profitable use by offering pastoralists alternative sources of livelihood and encouraging livelihood diversification. This study evaluated two pastoral communities, the Tugen and the Njemps from Baringo County, Kenya, who have owned, managed and used reseeded fields, in order to study the dynamics of reseeded land. A total of 193 pastoralist households were interviewed. These two pastoral communities used their fields for dry season grazing, grass seed harvesting, cutting thatching grass, milking (mostly of cattle), livestock fattening programs, baling of hay, and leasing of land. On average, each household engaged in two to three activities, with a maximum of five activities combined. The most common activity was dry season grazing, practiced by 82% of respondents, followed by grass seed harvesting. These activities can be performed in sequence, but proper planning is required to ensure that the field remains sustainably productive. Reseeding appears to have high potential for implementation in other dryland counties of Kenya and in Sub-Saharan Africa as an intervention that improves resilience to climate variability by providing additional grazing ground for livestock.

    Forum: Critical Decision Dates for Drought Management in Central and Northern Great Plains Rangeland

    Ranchers and other land managers of central and northern Great Plains rangelands face recurrent droughts that reduce economic returns and degrade environmental resources for ranching enterprises. Accurately estimating annual forage production and proactively initiating drought decision-making early in the growing season are both critical to minimizing financial losses and degradation of rangeland soil and plant resources. Long-term forage production data sets from Alberta, Kansas, Montana, Nebraska, North Dakota, South Dakota, and Wyoming demonstrated that precipitation in April, May, and June (or some combination of these months) robustly predicts annual forage production. Growth curves from clipping experiments and ecological site descriptions (ESDs) indicate that maximum monthly forage growth rates occur 1 month after the best spring-month (April to June) precipitation prediction variable. The key point for rangeland managers is that the probability of receiving sufficient precipitation after 1 July to compensate for earlier spring precipitation deficits is extremely low. The complexity of the human dimensions of drought decision-making necessitates that forage prediction tools account for uncertainty in matching animal demand to forage availability, and that continued advances in remote sensing applications address both spatial and temporal relationships in forage production to inform critical decision dates for drought management in these rangeland ecosystems.
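
    As a rough illustration of the spring-precipitation approach described above, the Python sketch below fits an ordinary least squares regression of annual forage production on April-June precipitation and uses it to estimate production for a dry spring. All numbers are hypothetical placeholders, not the long-term station data cited in the forum.

```python
import numpy as np

# April-June precipitation (mm) and annual forage production (kg/ha).
# These values are made-up examples, not the Great Plains data sets cited above.
apr_jun_precip_mm = np.array([120., 150., 90., 200., 170., 110., 140., 180.])
forage_kg_ha      = np.array([900., 1150., 700., 1600., 1350., 850., 1050., 1450.])

# Fit forage = b0 + b1 * precip by ordinary least squares.
X = np.column_stack([np.ones_like(apr_jun_precip_mm), apr_jun_precip_mm])
b0, b1 = np.linalg.lstsq(X, forage_kg_ha, rcond=None)[0]

# Estimate production for a dry spring, known by 1 July, after which the
# abstract notes that compensating precipitation is unlikely.
dry_spring_mm = 95.0
predicted = b0 + b1 * dry_spring_mm
print(f"forage ~ {b0:.0f} + {b1:.2f} * precip; dry-spring estimate ~ {predicted:.0f} kg/ha")
```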

    Convergence of dynamic vegetation net productivity responses to precipitation variability from 10 years of MODIS EVI

    According to global climate models (GCMs), extreme precipitation events will occur more frequently in the future. This raises important challenges related to climate variability, chiefly understanding how ecosystems respond to changes in precipitation patterns. Previous studies have found that above-ground net primary productivity (ANPP) is positively related to increases in annual precipitation and that this relationship may converge across biomes during dry years. One challenge in studying this ecosystem response at the continental scale is the lack of ANPP field measurements over extended areas. In this study, the MODIS EVI was used as a surrogate for ANPP and combined with precipitation datasets from twelve experimental sites across the United States over a 10-year period. Results from this analysis confirmed that integrated EVI for different biomes converged toward a common precipitation use efficiency during water-limited periods and may be a viable surrogate for ANPP measurements in further ecological research.
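
    The convergence test above rests on a simple ratio: growing-season integrated EVI (the ANPP surrogate) divided by precipitation gives a precipitation use efficiency per site. The Python sketch below computes that ratio for a few biomes; the EVI series and precipitation totals are invented placeholders, not MODIS retrievals from the twelve study sites.

```python
import numpy as np

# Precipitation use efficiency (PUE) with integrated EVI as the ANPP surrogate.
# The per-site EVI composites and annual precipitation totals are placeholders.
sites = {
    "grassland": (np.array([0.15, 0.25, 0.40, 0.35, 0.20]), 350.0),  # (16-day EVI composites, mm/yr)
    "shrubland": (np.array([0.12, 0.18, 0.28, 0.26, 0.16]), 250.0),
    "forest":    (np.array([0.30, 0.45, 0.60, 0.55, 0.40]), 900.0),
}

for name, (evi, precip_mm) in sites.items():
    i_evi = evi.sum() * 16.0          # integrate EVI over 16-day composite periods
    pue = i_evi / precip_mm           # integrated EVI per mm of precipitation
    print(f"{name:9s}  iEVI = {i_evi:5.1f}  PUE = {pue:.3f} per mm")
```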

    Sugar maple (Acer saccharum Marsh.) growth is influenced by close conspecifics and skid trail proximity following selection harvest

    In this study, we quantified the effects of local neighbourhood competition, light availability, and proximity to skid trails on the growth of sugar maple (Acer saccharum Marsh.) trees following selection harvest. We hypothesized that growth would increase with decreasing competition and increasing light availability, but that proximity to skid trails would negatively affect growth. A total of 300 sugar maples were sampled 10 years after selection harvesting in 18 stands in Témiscamingue (Québec, Canada). Detailed tree and skid trail maps were obtained in one 0.4 ha plot per stand. Square-root transformed radial growth data were fitted to a linear mixed model that included tree diameter, crown position, a neighbourhood competition index, light availability (estimated using the SORTIE light model), and distance to the nearest skid trail as explanatory variables. We considered various distance-dependent or -independent indices based on neighbourhood radii ranging from 6 to 12 m. The competition index that provided the best fit to the data was a distance-dependent index computed in a 6 m search radius, but a distance-independent version of the competition index provided an almost equivalent fit. Models corresponding to all combinations of main effects were fit to the data using maximum likelihood, and weighted averages of parameter estimates were obtained using multimodel inference. All predictors had an influence on growth, with the exception of light. Radial growth decreased with increasing tree diameter, level of competition and proximity to skid trails, and varied among crown positions, with trees in suppressed and intermediate positions having lower growth rates than codominants and dominants. Our results indicate that in selection-managed stands, the radial growth of sugar maple trees depends on competition from close (6 m) conspecific neighbours and is still affected by proximity to skid trails 10 years after harvesting. Such results underscore the importance of minimizing the extent of skid trail networks by careful pre-harvest planning of trail layout. We also conclude that the impact of heterogeneity among individual-tree neighbourhoods, such as that resulting from alternative spatial patterns of harvest, can usefully be integrated into models of post-harvest tree growth.
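
    One widely used form of a distance-dependent neighbourhood competition index is a Hegyi-type ratio summed over neighbours within a fixed search radius. The Python sketch below illustrates that form with a 6 m radius, matching the best-fitting radius reported above; the tree coordinates and diameters are hypothetical, and this is not necessarily the exact index used in the study.

```python
import math
from dataclasses import dataclass

@dataclass
class Tree:
    x: float        # stem position (m)
    y: float
    dbh_cm: float   # diameter at breast height (cm)

def competition_index(subject: Tree, neighbours: list[Tree], radius_m: float = 6.0) -> float:
    """Hegyi-type distance-dependent index: sum over neighbours j within the
    search radius of (DBH_j / DBH_i) / distance_ij."""
    ci = 0.0
    for n in neighbours:
        d = math.hypot(n.x - subject.x, n.y - subject.y)
        if 0.0 < d <= radius_m:
            ci += (n.dbh_cm / subject.dbh_cm) / d
    return ci

# Hypothetical focal tree and neighbours; positions and diameters are invented.
focal = Tree(0.0, 0.0, 30.0)
others = [Tree(2.0, 1.0, 25.0), Tree(4.5, -3.0, 40.0), Tree(9.0, 2.0, 35.0)]
print(f"CI (6 m radius) = {competition_index(focal, others):.2f}")
```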

    Factors influencing the adoption of an innovation: An examination of the uptake of the Canadian Heart Health Kit (HHK)

    Background: There is an emerging knowledge base on the effectiveness of strategies to close the knowledge-practice gap. However, less is known about how attributes of an innovation and other contextual and situational factors facilitate or impede an innovation's adoption. The Healthy Heart Kit (HHK) is a risk management and patient education resource for the prevention of cardiovascular disease (CVD) and the promotion of cardiovascular health. Although previous studies have demonstrated the HHK's content validity and practical utility, no published study has examined physicians' uptake of the HHK and the factors that shape its adoption. Objectives: Conceptually informed by Rogers' Diffusion of Innovations theory and the Theory of Planned Behaviour, this study had two objectives: (1) to determine whether specific attributes of the HHK as well as contextual and situational factors are associated with physicians' intention to use and actual usage of the HHK; and (2) to determine whether any contextual and situational factors are associated with individual or environmental barriers that prevent uptake of the HHK among those physicians who do not plan to use the kit. Methods: A sample of 153 physicians who responded to an invitation letter sent to all family physicians in the province of Alberta, Canada, was recruited for the study. Participating physicians were sent an HHK, and two months later a study questionnaire assessed the physicians' clinical practice, attributes of the HHK (relative advantage, compatibility, complexity, trialability, observability), confidence and control in using the HHK, barriers to use, and individual attributes. All measures were used in a path analysis employing a causal model based on Rogers' Diffusion of Innovations theory and the Theory of Planned Behaviour. Results: 115 physicians (a follow-up rate of 75%) completed the questionnaire. Use of the HHK was associated with intention to use the HHK, relative advantage, and years of experience. Relative advantage and the observability of the HHK's benefits were also significantly associated with physicians' intention to use the HHK. Physicians working in solo medical practices reported experiencing more individual and environmental barriers to using the HHK. Conclusion: The results of this study suggest that future information innovations must demonstrate an advantage over current resources, and the research evidence supporting the innovation must be clearly visible. The findings also suggest that the innovation adoption process has a social element, and that collegial interactions and discussions may facilitate that process. These results could be valuable for knowledge translation researchers and health promotion developers in future innovation adoption planning.
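
    The path analysis described in the Methods can be approximated as a pair of regressions following the reported structure: predictors of intention, then predictors of actual use. The Python sketch below, using statsmodels with made-up survey data and variable names chosen to mirror the constructs in the abstract, is an illustrative approximation rather than the authors' fitted model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey data: each row is one physician. The variable names
# mirror constructs from the abstract; the values are placeholders.
df = pd.DataFrame({
    "relative_advantage": [4, 5, 3, 4, 2, 5, 3, 4, 5, 2],
    "observability":      [3, 5, 2, 4, 2, 4, 3, 5, 4, 2],
    "years_experience":   [5, 20, 12, 8, 25, 15, 3, 18, 10, 22],
    "intention":          [4, 5, 2, 4, 2, 5, 3, 5, 4, 2],
    "use":                [1, 1, 0, 1, 0, 1, 0, 1, 1, 0],
})

# Path model estimated as two ordinary least squares regressions,
# mirroring the associations reported in the Results.
intention_model = smf.ols("intention ~ relative_advantage + observability", data=df).fit()
use_model = smf.ols("use ~ intention + relative_advantage + years_experience", data=df).fit()

print("Paths into intention:\n", intention_model.params, sep="")
print("Paths into use:\n", use_model.params, sep="")
```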

    Low doses of caffeine reduce heart rate during submaximal cycle ergometry

    Background: The purpose of this study was to examine the cardiovascular effects of two low levels of caffeine ingestion in non-habitual caffeine users at various submaximal and maximal exercise intensities. Methods: Nine male subjects (19-25 yr; 83.3 ± 3.1 kg; 184 ± 2 cm) underwent three testing sessions administered in a randomized and double-blind fashion. During each session, subjects were provided 4 oz of water and a gelatin capsule containing a placebo, 1.5 mg/kg caffeine, or 3.0 mg/kg caffeine. After thirty minutes of rest and a warm-up (30 watts for 2 min), a pedal rate of 60 rpm was maintained at a steady-state output of 60 watts for five minutes, then increased to 120 watts for five minutes and to 180 watts for five minutes. After a 2 min rest, the workload was set at 180 watts for one minute and increased by 30 watts every minute until exhaustion. Heart rate (HR) was measured during the last 15 seconds of each minute of submaximal exercise. Systolic blood pressure (BP) was measured at rest and during each of the three submaximal steady-state power outputs. Minute ventilation (VE), tidal volume (VT), breathing frequency (Bf), rating of perceived exertion (RPE), respiratory exchange ratio (RER), and oxygen consumption (VO2) were measured at rest and during each minute of exercise. Results: Caffeine at 1.5 and 3.0 mg/kg body weight significantly lowered (p < 0.05) HR during all three submaximal exercise intensities compared to placebo (4 to 7 bpm lower) but not at rest or during maximal exercise. BP was significantly higher (p < 0.05) at rest after the 3 mg/kg caffeine dose than after placebo (placebo 116 ± 13 vs caffeine 123 ± 10 mm Hg). Neither dose of caffeine had any effect on BP during submaximal exercise. Caffeine had no effect on VE, VT, VO2, RPE, maximal power output or time to exhaustion. Conclusion: In non-habitual caffeine users, consuming a caffeine pill (1.5 and 3.0 mg/kg) at a dose comparable to 1-3 cups of coffee appears to lower heart rate during submaximal exercise but not at near-maximal or maximal exercise. In addition, these caffeine doses appear to affect systolic blood pressure at rest but not during cycling exercise.
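
    For context, the absolute caffeine doses implied by the protocol follow directly from body mass. The short Python sketch below uses the mean body mass reported above; the per-cup caffeine content is an assumed rough figure, so the cup equivalents are only approximate.

```python
# Absolute caffeine dose (mg) for each condition, computed from body mass.
# 83.3 kg is the mean body mass reported in the abstract; ~95 mg per cup of
# coffee is an assumed typical value, used only to give a rough equivalent.
MEAN_MASS_KG = 83.3
MG_PER_CUP = 95.0

for dose_mg_per_kg in (1.5, 3.0):
    total_mg = dose_mg_per_kg * MEAN_MASS_KG
    print(f"{dose_mg_per_kg} mg/kg -> {total_mg:.0f} mg caffeine "
          f"(~{total_mg / MG_PER_CUP:.1f} cups of coffee)")
```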

    Ecosystem resilience despite large-scale altered hydroclimatic conditions

    Climate change is predicted to increase both drought frequency and duration, and when coupled with substantial warming will establish a new hydroclimatological model for many regions. Large-scale, warm droughts have recently occurred in North America, Africa, Europe, Amazonia and Australia, with major effects on terrestrial ecosystems, carbon balance and food security. Here we compare the functional response of above-ground net primary production to contrasting hydroclimatic periods in the late twentieth century (1975-1998) and to drier, warmer conditions in the early twenty-first century (2000-2009) in the Northern and Southern Hemispheres. We find a common ecosystem water-use efficiency (WUEe: above-ground net primary production / evapotranspiration) across biomes ranging from grassland to forest, indicating an intrinsic system sensitivity to water availability across rainfall regimes, regardless of hydroclimatic conditions. WUEe was higher in drier years, increasing significantly with drought to a maximum WUEe across all biomes, with a minimum native state in wetter years that was common across hydroclimatic periods. This indicates biome-scale resilience to the interannual variability associated with the early twenty-first century drought; that is, the capacity to tolerate low annual precipitation and to respond to subsequent periods of favourable water balance. These findings provide a conceptual model of ecosystem properties at the decadal scale applicable to the widespread altered hydroclimatic conditions predicted for later this century. Understanding the hydroclimatic threshold that will break down ecosystem resilience and alter maximum WUEe may allow us to predict land-surface consequences as large regions become more arid, starting with water-limited, low-productivity grasslands.
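
    The central quantity above, ecosystem water-use efficiency, is a simple ratio of production to water loss. The sketch below computes WUEe for two hypothetical hydroclimatic periods; the ANPP and evapotranspiration values are invented placeholders chosen only to illustrate the higher-WUEe-in-dry-years pattern, not data from the paper.

```python
# Ecosystem water-use efficiency: WUE_e = ANPP / ET (g m^-2 per mm of water).
# ANPP and evapotranspiration values are illustrative placeholders, not data
# from the sites analysed in the paper.
periods = {
    "wetter period (1975-1998)": {"anpp_g_m2": 420.0, "et_mm": 700.0},
    "drier period (2000-2009)":  {"anpp_g_m2": 300.0, "et_mm": 430.0},
}

for name, v in periods.items():
    wue_e = v["anpp_g_m2"] / v["et_mm"]
    print(f"{name}: WUE_e = {wue_e:.2f} g m^-2 mm^-1")
```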

    Effects of Grassland Management Practices on Ant Functional Groups in Central North America

    Tallgrass prairies of central North America have experienced disturbances, including fire and grazing, for millennia. Little is known about the effects of these disturbances on prairie ants, even though ants are thought to play major roles in ecosystem maintenance. We implemented three management treatments on remnant and restored grassland tracts in the central U.S. and compared the effects of treatment on the abundance of ant functional groups. The management treatments were: (1) patch-burn graze: rotational burning of three spatially distinct patches within a fenced tract, plus growing-season cattle grazing; (2) graze-and-burn: burning the entire tract every 3 years, plus growing-season cattle grazing; and (3) burn-only: burning the entire tract every 3 years, with no cattle grazing. Ant species were classified into one of four functional groups. Opportunist ants and the dominant ant species, Formica montana, were more abundant in burn-only tracts than in tracts managed with either of the grazing treatments. Generalists were more abundant in graze-and-burn tracts than in burn-only tracts. Abundance of F. montana was negatively associated with pre-treatment time since fire, whereas generalist ant abundance was positively associated with it. F. montana were more abundant in restored tracts than in remnants, whereas the opposite was true for subdominants and opportunists. In summary, abundance of the dominant F. montana increased in response to intense disturbances that were followed by quick recovery of plant biomass. Generalist ant abundance decreased in response to those disturbances, which we attribute to the competitive dominance of F. montana over the generalists.