
    Spatiotemporal Patterns and Burden of Myocardial Infarction in Florida

    Knowledge of spatiotemporal disparities in myocardial infarction (MI) risk and the determinants of those disparities is critical for guiding health planning and resource allocation. Therefore, the aims of this study were to: (i) investigate the spatial distribution and clusters of MI hospitalization (MIHosp) and MI mortality (MIMort) risks in Florida over time to identify communities with consistently high MI burdens, (ii) assess temporal trends in geographic disparities in MIHosp and MIMort risks, and (iii) identify predictors of MIHosp risks. Retrospective MIHosp and MIMort data for Florida for the 2005-2014 and 2000-2014 periods, respectively, were used. Kulldorff’s circular and Tango’s flexible spatial scan statistics were used to identify spatial clusters, and counties with persistently high or low MIHosp and MIMort risks were identified. Global and local negative binomial models were used to identify predictors of MIHosp risks. MIHosp and MIMort risks declined by 15%-20% and 48%, respectively, but there were substantial disparities across space and over time. Persistent clustering of high MIHosp risks occurred in the Big Bend area, South Central and Southeast Florida. Persistent clustering of low risks occurred in southeast and southwest Florida. Clustering of high or low MIMort risks occurred in the same areas as MIHosp risks, except that there was no clustering of high MIMort risks in South Central Florida. Overall, risks declined in all clusters over the study period. However, they decreased more rapidly in high-risk clusters during the first 4-8 years of the study, reducing disparities in the short term. Nevertheless, MI risks in high-risk clusters lagged behind those in low-risk clusters by at least a decade. Significant predictors of MIHosp risks included race, marital status, education level, rural residence, and lack of health insurance. The impacts of education level and lack of health insurance varied geographically, with the strongest associations in southern Florida. In conclusion, MI interventions need to target high-risk clusters to reduce the MI burden and improve population health in Florida. Moreover, the interventions need to consider social contexts, allocating resources based on empirical evidence from global and local models to maximize their efficiency and effectiveness.
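
    The "global" negative binomial model described above lends itself to a short sketch. The code below is a hedged illustration only, not the authors' code: it fits a county-level negative binomial regression with a log-population offset so exponentiated coefficients read as risk ratios. All column names and data are synthetic placeholders.

    ```python
    # Hedged sketch of a "global" negative binomial model of county-level
    # MI hospitalization counts. Data and covariate names are placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 67  # Florida has 67 counties
    counties = pd.DataFrame({
        "population": rng.integers(10_000, 2_000_000, n),
        "pct_uninsured": rng.uniform(5, 30, n),  # hypothetical covariates
        "pct_rural": rng.uniform(0, 100, n),
    })
    # synthetic counts roughly proportional to population
    counties["mi_hosp"] = rng.poisson(counties["population"] * 2e-3)

    model = smf.glm(
        "mi_hosp ~ pct_uninsured + pct_rural",
        data=counties,
        family=sm.families.NegativeBinomial(),  # fixed dispersion alpha=1 for simplicity
        offset=np.log(counties["population"]),  # log-population offset -> models risk
    ).fit()
    print(np.exp(model.params))  # exponentiated coefficients = risk ratios
    ```

    The "local" counterpart would let these coefficients vary across space (e.g., a geographically weighted regression), which is how effects such as education and lack of insurance can emerge as strongest in southern Florida.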

    Exploratory investigation of region level risk factors of Ebola Virus Disease in West Africa

    Background. Ebola Virus Disease (EVD) is a highly infectious disease that has produced over 25,000 cases in the past 50 years. While many past outbreaks resulted in relatively few cases, the 2014 outbreak in West Africa was the deadliest occurrence of EVD to date, producing over 15,000 confirmed cases. Objective. In this study, we investigated population-level predictors of EVD risk at the regional level in Sierra Leone, Liberia, and Guinea. Methods. Spatial and descriptive analyses were conducted to assess the distribution of EVD cases. Choropleth maps showing the spatial distribution of EVD risk across the study area were generated in ArcGIS. Poisson and negative binomial models were then used to investigate population and regional predictors of EVD risk. Results. Results indicated that the risk of EVD was significantly lower in areas with higher proportions of: (a) the population living in urban areas, (b) households with low-quality or no toilets, and (c) married men working in blue-collar jobs. However, the risk of EVD was significantly higher in areas with high mean years of education. Conclusions. The identified significant predictors of high risk were associated with areas with higher levels of urbanization. This may be due to higher population densities in the more urban centers and hence a higher potential for infectious contact. However, there is a need to better understand the role of urbanization and individual contact structure in an Ebola outbreak. We discuss shortcomings in the available data and emphasize the need to consider spatial scale in future data collection and epidemiological studies.
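
    A common workflow behind the "Poisson and negative binomial models" mentioned above is to fit the Poisson model first and switch to a negative binomial when the counts are overdispersed. The sketch below illustrates that check under stated assumptions: the variable names (evd_cases, pct_urban, mean_yrs_edu) and the data are invented, not from the paper.

    ```python
    # Illustrative only: Poisson first, negative binomial if overdispersed.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 60
    regions = pd.DataFrame({
        "population": rng.integers(50_000, 1_500_000, n),
        "pct_urban": rng.uniform(5, 90, n),
        "mean_yrs_edu": rng.uniform(1, 10, n),
    })
    regions["evd_cases"] = rng.poisson(regions["population"] * 1e-4)

    formula = "evd_cases ~ pct_urban + mean_yrs_edu"
    offset = np.log(regions["population"])
    poisson = smf.glm(formula, data=regions,
                      family=sm.families.Poisson(), offset=offset).fit()

    # Pearson chi-square / residual df well above 1 indicates overdispersion.
    if poisson.pearson_chi2 / poisson.df_resid > 1.5:
        nb = smf.glm(formula, data=regions,
                     family=sm.families.NegativeBinomial(), offset=offset).fit()
        print(np.exp(nb.params))
    else:
        print(np.exp(poisson.params))
    ```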

    Emergency medical services transport delays for suspected stroke and myocardial infarction patients

    Background Prehospital delays in receiving emergency care for suspected stroke and myocardial infarction (MI) patients have significant impacts on health outcomes. Use of Emergency Medical Services (EMS) has been shown to reduce these delays. However, disparities in EMS transport delays are thought to exist. Therefore, the objective of this study was to investigate and identify disparities in EMS transport times for suspected stroke and MI patients. Methods Over 3,900 records of suspected stroke and MI patients, reported during 2006–2009, were obtained from two EMS agencies (EMS 1 & EMS 2) in Tennessee. Summary statistics of transport time intervals were computed. Multivariable logistic models were used to identify predictors of time intervals exceeding EMS guidelines. Results Only 66% and 10% of suspected stroke patients were taken to stroke centers by EMS 1 and EMS 2, respectively. Most (80–83%) emergency calls had response times within the recommended 10 min. However, over a third of the calls had on-scene times exceeding the recommended 15 min. Predictors of time intervals exceeding EMS guidelines were EMS agency, patient age, season, and whether or not patients were taken to a specialty center. The odds of total transport time exceeding EMS guidelines were significantly lower for patients not taken to specialty centers. Noteworthy was the 72% lower odds of total time exceeding guidelines for stroke patients served by EMS 1 compared to those served by EMS 2. Additionally, for every decade increase in patient age, the odds of on-scene time exceeding guidelines increased by 15% and 19% for stroke and MI patients, respectively. Conclusion In this study, prehospital delay, as measured by total transport time exceeding guidelines, was influenced by season, the EMS agency responsible, patient age, and whether or not the patient was transported to a specialty center. The magnitude of the delays associated with some of these factors is large enough to be clinically important, although others, though statistically significant, may not be. These findings should be useful for guiding future studies and local health initiatives that seek to reduce disparities in prehospital delays so as to improve health services and outcomes for stroke and MI patients.
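
    As a minimal sketch of the multivariable logistic model described above, the code below regresses a binary "exceeded the guideline" outcome on agency, age, season, and specialty-center transport. Column names and data are illustrative assumptions, not the study's variables.

    ```python
    # Hedged sketch: logistic model of total transport time exceeding guidelines.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(7)
    n = 3900
    runs = pd.DataFrame({
        "agency": rng.choice(["EMS1", "EMS2"], n),
        "age_decades": rng.uniform(2, 9, n),  # patient age in decades
        "season": rng.choice(["winter", "spring", "summer", "fall"], n),
        "specialty_center": rng.choice([0, 1], n),
    })
    # synthetic outcome loosely tied to age so the fit has signal
    logit_p = -1.5 + 0.14 * runs["age_decades"]
    runs["exceeded"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    model = smf.logit(
        "exceeded ~ C(agency) + age_decades + C(season) + C(specialty_center)",
        data=runs,
    ).fit(disp=False)
    # Odds ratios; cf. the ~15% per-decade increase reported in the abstract.
    print(np.exp(model.params))
    ```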

    Memory and food intake in sheep: Effects of previous exposure to straw on intake and behaviour later in life

    The ban on open-air burning of agricultural by-products by the European Union created disposal problems on many farms. Attempts at feeding agricultural by-products such as cereal straws to previously grazed livestock have had limited success. A similar initial reluctance to accept unfamiliar feeds has been reported when livestock were fed whole-grain cereals during drought, or when grazed on new pastures and shrubs. It has been suggested that previous exposure to a feed might speed up the rate at which it is accepted, particularly if such exposure takes place before weaning. This study aimed to establish the veracity of this assertion, and whether an early learning experience is carried over into adulthood. Two feeding trials were carried out with lambs not exposed (NE) to straw and lambs given access to straw at 12 weeks of age for either 10 (E-10) or 28 (E-28) days. At 24 weeks (Experiment 1), 10 lambs from each of the three treatment groups were tested, over 21 days, on their readiness to accept straw as feed. At 36 weeks (Experiment 2), another batch of lambs (from the E-28 and NE groups only) was similarly tested. In each experiment, the lambs were penned individually (in view of lambs from their own treatment group) and also offered a concentrate supplement to meet daily nutrient requirements. In both experiments, intake of straw organic matter (OM), nitrogen (N) and digestible organic matter (DOM), as well as the leaf-to-stem ratio in rejected straw, were assessed for each penned lamb. Behaviour was monitored once every 5 min over an 8-h period immediately after first confinement. Frequencies of eating, idling, ruminating, and drinking were all found to be significantly greater (
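
    The scan-sampling design above (one observation every 5 min over 8 h, i.e. 96 scans per lamb) yields behaviour counts by treatment group. The abstract does not name the test used for these frequencies; one plausible approach, shown below purely as a hedged sketch with invented counts, is a chi-square test on the behaviour-by-group contingency table.

    ```python
    # Sketch only: chi-square comparison of scan-sampling behaviour counts.
    # Each row sums to 96 scans (8 h at one scan per 5 min); counts are invented.
    import numpy as np
    from scipy.stats import chi2_contingency

    # rows: NE, E-10, E-28; columns: eating, idling, ruminating, drinking
    counts = np.array([
        [30, 40, 20, 6],
        [44, 28, 18, 6],
        [52, 22, 16, 6],
    ])
    chi2, p, dof, _ = chi2_contingency(counts)
    print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
    ```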

    Ectozoochory by hares (Lepus crawshayi) in Queen Elizabeth National Park, Uganda

    Volume: 7

    Effect of different salinity levels in drinking water on growth of broiler chickens

    During breaks in the supply of treated water, farmers turn to surface and underground sources, such as wells and boreholes. Though seemingly wholesome, such water usually contains dissolved salts of various kinds that may affect productivity in poultry and other farm livestock. Fifteen 2-week-old, imported hybrid broiler chicks were fed a common ration but offered drinking water from one of three sources, for 21 days, to investigate any effects of water quality on productivity. Three treatments (water source), each with five replicates (individually penned birds), were tested in a completely randomised design. The treatments were (i) water from the tap (TAP), (ii) water from a borehole (BH1), and (iii) water from a second borehole (BH2). Birds were raised in battery cages, given water and feed ad libitum, and weighed weekly. Water samples from the three sources were analyzed weekly for quality (i.e. conductivity, salinity, dissolved oxygen, pH, and total dissolved solids). Mean water salinity levels were 0.00, 0.07 and 3.80 per cent for TAP, BH1, and BH2, respectively. Water treatment had no significant effects (P>0.05) on feed intake (110.8, 95.3 and 106.1 g per bird per day), weight gain (45.0, 43.6 and 43.0 g per bird per day), feed conversion ratio (46.8, 50.0 and 47.2%), or final weight of birds after 21 days (1.33, 1.30 and 1.32 kg), for TAP, BH1, and BH2, respectively. However, water intake by birds was significantly (
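
    The completely randomised design above (three water sources, five replicate birds each) is typically analysed with a one-way ANOVA. The sketch below shows that analysis under stated assumptions; the daily weight gains are synthetic placeholders, not data from the trial.

    ```python
    # Hedged sketch: one-way ANOVA for a completely randomised design with
    # three water-source treatments and five replicate birds each.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(3)
    birds = pd.DataFrame({
        "source": np.repeat(["TAP", "BH1", "BH2"], 5),
        "gain_g_day": rng.normal(44, 3, 15),  # synthetic daily weight gains
    })
    fit = smf.ols("gain_g_day ~ C(source)", data=birds).fit()
    print(anova_lm(fit))  # F test for the water-source effect (P>0.05 expected here)
    ```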

    The Incidence Risk, Clustering, and Clinical Presentation of La Crosse Virus Infections in the Eastern United States, 2003–2007

    BACKGROUND: Although La Crosse virus (LACV) is one of the most common causes of pediatric arboviral infections in the United States, little has been done to assess its geographic distribution, identify areas at higher risk of disease, and provide a national picture of its clinical presentation. Therefore, the objectives of this study were to investigate the geographic distribution of LACV infections reported in the United States, to identify hot-spots of infection, and to present its clinical picture. METHODS AND FINDINGS: Descriptive and cluster analyses were performed on probable and confirmed cases of LACV infection reported to the Centers for Disease Control and Prevention from 2003-2007. A total of 282 patients had reported confirmed LACV infections during the study period. Of these cases, the majority (81 percent) presented during the summer, occurred in children 15 years and younger (83.3 percent), and occurred in male children (64.9 percent). Clinically, the infections presented as meningoencephalitis (56.3 percent), encephalitis (20.7 percent), meningitis (17.2 percent), or uncomplicated fever (5 percent). Deaths occurred in 1.9 percent of confirmed cases and in 8.6 percent of patients suffering from encephalitis. The majority of these deaths were in patients 15 years and younger. The county-level incidence risk among counties (n = 136) reporting both probable and confirmed cases in children 15 years and younger (n = 355) ranged from 0.2 to 228.7 per 100,000 persons. The southern United States experienced a significantly higher (p<0.05) incidence risk during the months of June, July, August, and October than the northern United States. There was significant (p<0.05) clustering of high risk in several geographic regions, with three deaths attributed to complications from LAC encephalitis occurring in two of these hot-spots of infection. CONCLUSIONS: Both the incidence risk and case fatality rates were found to be higher than previously reported. We detected clustering in four geographic regions, a shift from prior geographic distributions, and developed maps identifying high-risk areas. These findings are useful for raising awareness among health care providers regarding areas at high risk of infection and for guiding targeted multifaceted interventions by public health officials.
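
    The county-level incidence risk quoted above is straightforward arithmetic: cases among children 15 years and younger divided by the matching population, scaled to 100,000. The short example below shows the computation; the county names and numbers are made up, not the study's data.

    ```python
    # Worked example: incidence risk per 100,000 = cases / population * 100,000.
    import pandas as pd

    counties = pd.DataFrame({
        "county": ["A", "B", "C"],
        "cases": [12, 3, 41],
        "pop_under16": [52_000, 18_000, 21_000],
    })
    counties["risk_per_100k"] = counties["cases"] / counties["pop_under16"] * 100_000
    # county C works out to ~195 per 100,000, near the top of the reported range
    print(counties)
    ```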

    Prevalence and Predictors of Pre-Diabetes and Diabetes among Adults 18 Years or Older in Florida: A Multinomial Logistic Modeling Approach

    Background: Individuals with pre-diabetes and diabetes have increased risks of developing macro-vascular complications, including heart disease and stroke, which are among the leading causes of death globally. The objective of this study was to estimate the prevalence of pre-diabetes and diabetes, and to investigate their predictors, among adults ≥18 years in Florida. Methods: Data covering January-December 2013 were obtained from Florida’s Behavioral Risk Factor Surveillance System (BRFSS). The survey design was declared using the SVYSET statement in STATA 13.1. Descriptive analyses were performed to estimate the prevalence of pre-diabetes and diabetes. Predictors of pre-diabetes and diabetes were investigated using a multinomial logistic regression model. Model goodness-of-fit was evaluated using both the multinomial goodness-of-fit test proposed by Fagerland, Hosmer, and Bofin and the Hosmer-Lemeshow goodness-of-fit test. Results: Approximately 2,983 (7.3%) and 5,189 (12.1%) adults in Florida had been diagnosed with pre-diabetes and diabetes, respectively. Over half of the study respondents were white, married, and over the age of 45 years, while 36.4% were physically inactive, 36.4% were overweight, 26.4% were obese, 34.6% were hypertensive, 40.3% were hypercholesterolemic, and 26% were arthritic. Based on the final multivariable multinomial model, only being overweight (Relative Risk Ratio [RRR] = 1.85, 95% Confidence Interval [95% CI] = 1.41, 2.42), obese (RRR = 3.41, 95% CI = 2.61, 4.45), hypertensive (RRR = 1.69, 95% CI = 1.33, 2.15), hypercholesterolemic (RRR = 1.94, 95% CI = 1.55, 2.43), and arthritic (RRR = 1.24, 95% CI = 1.00, 1.55) had significant associations with pre-diabetes. However, more predictors had significant associations with diabetes, and the strengths of those associations tended to be greater than for pre-diabetes. For instance, the relative risk ratios for the associations between diabetes and being overweight (RRR = 2.00, 95% CI = 1.55, 2.57), obese (RRR = 4.04, 95% CI = 3.22, 5.07), hypertensive (RRR = 2.66, 95% CI = 2.08, 3.41), hypercholesterolemic (RRR = 1.98, 95% CI = 1.61, 2.45), and arthritic (RRR = 1.28, 95% CI = 1.04, 1.58) were all further from the null than their associations with pre-diabetes. Moreover, a number of variables, such as age, income level, sex, and level of physical activity, had significant associations with diabetes but not pre-diabetes. The risk of diabetes increased with increasing age, lower income, male sex, and complete physical inactivity, although insufficient physical activity (activity below recommended levels, as distinct from complete inactivity) had no significant association with the risk of diabetes or pre-diabetes. Conclusions: There is evidence of differences in the strength of association of the predictors across levels of diabetes status (pre-diabetes and diabetes) among adults ≥18 years in Florida. It is important to monitor populations at high risk of pre-diabetes and diabetes, so as to help guide health programming decisions and resource allocations to control these conditions.
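
    The RRRs above come from a multinomial logistic model with a three-level outcome (none / pre-diabetes / diabetes): exponentiated coefficients are relative risk ratios, one set per non-reference outcome. The sketch below illustrates this under stated assumptions with synthetic data; a faithful replication would also apply the BRFSS survey weights (the STATA svyset step), which this plain MNLogit sketch omits.

    ```python
    # Hedged sketch: multinomial logit with exponentiated coefficients = RRRs.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    n = 5000
    brfss = pd.DataFrame({
        "obese": rng.choice([0, 1], n, p=[0.74, 0.26]),
        "hypertensive": rng.choice([0, 1], n, p=[0.65, 0.35]),
    })
    # synthetic 3-level outcome (0=none, 1=pre-diabetes, 2=diabetes)
    score = 0.8 * brfss["obese"] + 0.5 * brfss["hypertensive"] + rng.normal(0, 1, n)
    brfss["dm_status"] = pd.cut(score, [-np.inf, 0.8, 1.6, np.inf],
                                labels=[0, 1, 2]).astype(int)

    model = smf.mnlogit("dm_status ~ obese + hypertensive", data=brfss).fit(disp=False)
    print(np.exp(model.params))  # one RRR column per non-reference outcome level
    ```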

    Cardiovascular Effects of Listening to Fast Speech and Normal Speech

    Background: Previous work on the psychological impact of speech on the cardiovascular system has focused mainly on the speaker as the individual in whom clinical outcomes are measured. There are limited data on the effects of listening to fast speech on cardiovascular responses. Aim: The aim of this study was to compare blood pressure and heart rate changes while listening to normal and fast speech. Method: A total of 88 normotensive adults (22 females and 66 males) were recruited for the study from a university population. All subjects listened to two different 13-minute audio recordings: normal speech (a news commentary) and fast speech (a radio sports presentation). Blood pressure and pulse rate were measured at 4-minute intervals while subjects listened to the recordings. Based on their enthusiasm for and patronage of the sports program, participants were classified as "Regular" or "Non-regular" listeners. Blood pressure and pulse rate changes were calculated as the mean net area under the curve response, and differences were analysed with analysis of variance. Results: Systolic, diastolic, and pulse rate responses were significantly higher in both the Regular and Non-regular listener groups while listening to the fast-speech presentation than to the news commentary. Conclusion: Although data are limited, listening to fast speech may itself act as a psychosocial stressor that predisposes to an increased cardiovascular response, manifested as higher blood pressure and heart rate.
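
    The "net area under the curve" summary above can be computed by expressing each reading relative to baseline and integrating over the recording, commonly with the trapezoidal rule; the abstract does not specify the integration method, so the sketch below is an assumption. The sampling times and readings are invented.

    ```python
    # Hedged sketch: net AUC of blood pressure relative to the first reading,
    # using readings on an assumed 4-minute grid over a 13-minute recording.
    import numpy as np
    from scipy.integrate import trapezoid

    times = np.array([0, 4, 8, 12])               # minutes (assumed grid)
    sbp = np.array([118.0, 126.0, 130.0, 127.0])  # systolic BP readings, mmHg
    net_auc = trapezoid(sbp - sbp[0], times)      # mmHg*min above baseline
    print(f"net AUC = {net_auc:.1f} mmHg*min")
    ```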