
    Impact of Heart Rate Intensity on Shooting Accuracy during Games in NCAA Division I Women Basketball Players

    Shooting accuracy in basketball is key to winning games. While many factors determine whether a team makes or misses its shots, the intensity of play is likely one of them. A player who has played the majority of the game will likely have a higher, more intense heart rate (HR), and depending on the athlete, this could impact shooting accuracy. The relationship between HR intensity and shooting accuracy has not previously been examined in a real game setting. Therefore, we set out to determine the impact of HR intensity on shooting accuracy during games. Purpose: The purpose of this study was to determine the impact of heart rate intensity on shooting accuracy in a game setting in NCAA Division I female basketball players. Methods: We examined the team stats for shooting accuracy from overall attempts, three-point attempts, and free throws during five games. During games, players wore HR monitors that transmitted to a mobile app displaying their HR in real time. Every time a shot was attempted, we recorded the kind of shot, where on the floor it came from, whether it was made or missed, and the HR zone the athlete was in when it took place. The HR zones compared were 1) 70-80% HR max, 2) 80-90% HR max, and 3) 90-100% HR max. These data were input into a spreadsheet to calculate the average team shooting percentage across these three HR zones for overall shooting, free throws, and 3-pointers. Results: As indicated in the table, the team shooting percentage was highest for all types of shooting when players were at the lowest HR intensity. Shooting accuracy declined at higher HR intensities.
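
    The zone-by-zone tally described in the Methods can be illustrated with a short script. This is a minimal sketch, assuming a hypothetical shot log in which each attempt carries a shot type, an HR-zone label, and a made/missed flag; the study's actual spreadsheet layout is not specified.

```python
from collections import defaultdict

# Hypothetical shot log: one (shot_type, hr_zone, made) record per attempt.
shots = [
    ("3PT", "70-80%", True),
    ("FT", "80-90%", False),
    ("2PT", "90-100%", True),
    # ...one record for every attempt across the five games
]

made = defaultdict(int)
attempts = defaultdict(int)
for shot_type, zone, was_made in shots:
    attempts[zone] += 1
    made[zone] += was_made  # True counts as 1, False as 0

# Average team shooting percentage in each HR zone.
for zone in ("70-80%", "80-90%", "90-100%"):
    if attempts[zone]:
        pct = 100 * made[zone] / attempts[zone]
        print(f"{zone} HRmax: {pct:.1f}% ({made[zone]}/{attempts[zone]})")
```

    The same tally can be restricted to free throws or 3-pointers by first filtering the records on shot_type.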

    Comparison of Heart Rate Intensity in Practice, Conditioning, and Games in NCAA Division I Women Basketball Players

    Background: An athlete’s heart rate (HR) is an important variable in quantifying the intensity of exercise. Workouts that increase HR are an important stimulus for training adaptations and conditioning. At other times, workouts that do not overly stress the HR may be desired to allow for recovery. The principle of specificity emphasizes that athletes should train specific to the way they will need to perform in competition. Because of this, monitoring HR during training and competition can be a useful tool. While exercise intensity in endurance sports has been previously investigated, less is known regarding the HR response in team sports, particularly women’s basketball. Purpose: To compare the average HR response to basketball training and competition in: 1) open gym 5-on-5 scrimmages, 2) actual games against other opponents, and 3) conditioning sessions. Methods: An NCAA Division I women’s basketball team wore heart rate monitors for open gym scrimmages, actual games, and conditioning practices. For the open gym sessions, the team scrimmaged against each other 5v5 for ~90 minutes, and the average HR over 4 open gym sessions was determined. For the actual games against other opponents, the average HR response for the team was averaged over 3 games. The conditioning sessions consisted of repeated, intermittent short sprint efforts over the course of 30-60 minutes, and the average HR over 7 conditioning sessions was calculated. The collected data were entered into a spreadsheet to calculate the team’s average HR for the scrimmages, games, and conditioning sessions. Results: During open gym scrimmages and conditioning sessions, the team’s average HR was higher than during games; games produced the lowest average HR of the three conditions.

    The Relationship between Objective and Subjective Markers of Training Stress in NCAA Division I Women Basketball Players

    An athlete’s training stress score (TSS) is an objective marker of overall training volume and can be determined by tracking total time spent in specific heart rate (HR) zones. Additionally, an athlete’s power factor (PF), or explosive strength, is an important marker of performance and can be measured objectively with power testing equipment. While these measures of training stress and performance are important, a coach with limited resources may not have access to the equipment or expertise to measure these variables. On a subjective level, perceived recovery status (PRS) prior to practice and the rating of perceived exertion (RPE) during a practice can be used to measure the stress of training. While the relationship between these objective and subjective markers of training stress has been studied in endurance sports, less descriptive data are available for these responses in intermittent team sports. We based our research on women’s basketball athletes due to the lack of studies in this demographic. Purpose: To determine the relationship between PRS and PF, PRS and TSS, and PRS and RPE in NCAA Division I female basketball athletes. Methods: Data were collected over several weeks during both the off-season and competition season in 12 NCAA Division I women’s basketball players. Prior to practices at the end of the week, PF was measured by performing a 4-jump test on a jump mat; increased PF values indicate more explosive strength. The players also indicated their subjective rating of recovery on the PRS index before practice, with higher values indicating the player felt more recovered. RPE was measured after each practice as a rating of how hard the player felt practice was, with higher values indicating a more stressful practice. Finally, TSS was calculated for the entire week by measuring heart rates and time spent in specific HR zones. The relationships between PRS-PF, PRS-TSS, and PRS-RPE were then calculated by Pearson correlations. Results: Comparing PRS-PF, there was a weak positive correlation (r = .305) on average for the team, while seven of the twelve players (58%) had at least a moderately positive correlation (r > .4). PRS-TSS displayed a very weak negative correlation (r = -.077). PRS-RPE showed a very weak positive relationship (r = .141). Conclusion: We hypothesized that as the athlete felt more recovered (higher PRS), their explosive strength measured by the jump test would also increase (higher PF). Over half of the players observed could provide an accurate subjective measure of how prepared they were for practice that correlated with their actual explosive strength prior to practice. For these athletes, the PRS might be a useful surrogate for daily power testing. This would allow the coach to adjust practice accordingly without the need for special equipment or additional testing. Examining the other relationships, PRS vs TSS and PRS vs RPE, we did not see a strong relationship in either. This might indicate that quantifying training stress by HR measurement may not be easily replaced by subjective measures.
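
    The per-player correlations reported above can be reproduced with a standard Pearson test. A minimal sketch, assuming hypothetical weekly PRS and PF values for a single player (the study's raw data are not shown):

```python
from scipy.stats import pearsonr

# Hypothetical weekly values for one player.
prs = [6, 8, 5, 7, 9, 4, 7, 8]                          # perceived recovery status
pf = [41.2, 44.0, 39.5, 42.1, 45.3, 38.7, 42.8, 43.9]   # power factor (4-jump test)

r, p = pearsonr(prs, pf)
print(f"PRS-PF: r = {r:.3f}, p = {p:.3f}")

# The abstract's threshold for "at least moderately positive" is r > .4.
print("moderately positive" if r > 0.4 else "weaker than moderate")
```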

    Relations Between Trajectories of Peer Victimization and Measures of Psychosocial Adjustment

    Background: Peer victimization has been consistently associated with a host of negative outcomes including aggression, depressive symptoms, and academic difficulties. However, few studies have examined how individual changes in victimization over time, or trajectories of victimization, are related to these outcomes. Objectives: The current study aimed to identify different trajectories of physical and relational victimization in third through fifth grade. Additionally, relations between peer victimization trajectories and a range of psychosocial outcomes, including proactive and reactive aggression, depressive symptoms, and academic difficulties, were examined. Finally, the impact of gender on the associations between trajectories of peer victimization and psychosocial adjustment was considered. Methods: Third through fifth grade teachers and students completed study measures over the course of three years, resulting in a total sample of 670 elementary-school-aged youth. Hypotheses: Consistent with previous research, four trajectories were expected to emerge from the data. Trajectories characterized by high levels of victimization were expected to be positively associated with reactive aggression, depressive symptoms, and academic difficulties. Finally, victimized boys were expected to exhibit aggressive outcomes, whereas girls were expected to exhibit more depressive symptoms in response to victimization. Results: Three trajectories emerged for both physical and relational victimization, for both boys and girls. Intercepts and slopes of victimization remained largely unrelated to all psychosocial outcomes. Gender did not impact relations between trajectories of victimization and psychosocial outcomes. Conclusions: The current study suggests that three similar trajectory groups can be identified for both physical and relational victimization in children in third through fifth grade. Findings regarding the relations between psychosocial outcomes and gender are discussed.

    Youth Perceptions of Staff as a Predictor of Restrictive Housing and Recidivism in Juvenile Detention Facilities

    Background: Youth perceptions of detention center staff may be particularly important for achieving desired outcomes both within juvenile detention centers and after youth are released. In order to fill gaps in the literature, the current study aimed to determine the role of youth perceptions of staff by (a) establishing the appropriate use of a youth perceptions of staff measure, (b) examining the relationship between youth perceptions of staff and restrictive housing and recidivism, and (c) evaluating the moderating role of callous-unemotional (CU) traits. Methods: Youth admitted into two juvenile detention facilities in the Midwestern United States assented to participate in the research and were administered questionnaires, resulting in a sample of 228 youth. Hypotheses: It was expected that a one-factor model would best characterize the youth perceptions of staff measure. Further, it was anticipated that more negative perceptions of staff would be related to increased risk for and incidents of restrictive housing and detainment over the course of one year. High levels of CU traits were expected to moderate the associations between youth perceptions of staff and outcomes of interest. Results: Findings differed between the two facilities. Youth perceptions of staff emerged as a significant predictor of risk for recidivism in facility one but not facility two. Further, youth perceptions of staff were a significant predictor of risk for and frequency of restrictive housing in facility two but not in facility one. Additionally, for youth exhibiting higher levels of CU traits, more negative perceptions of staff were associated with increases in the frequency of restrictive housing in facility two. Conclusions: The current study suggests that youth perceptions of staff may be an important factor to consider within juvenile detention facilities, and that these perceptions may be particularly important for youth exhibiting CU traits. Further, it appears that the implementation of universal interventions is important to consider in these associations.
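
    The abstract does not detail the statistical models, but moderation of this kind is commonly tested with an interaction term in a regression. A minimal sketch under that assumption, with placeholder variable names and made-up values, not the study's data:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data; column names and values are illustrative placeholders.
df = pd.DataFrame({
    "restrictive_housing": [0, 2, 1, 0, 3, 1, 0, 2],                # incidents in one year
    "staff_perceptions": [3.2, 1.8, 2.5, 3.9, 1.2, 2.8, 3.5, 1.5],  # higher = more positive
    "cu_traits": [10, 28, 18, 8, 31, 20, 12, 25],
})

# "staff_perceptions * cu_traits" expands to both main effects plus their
# interaction; a significant interaction coefficient indicates moderation.
model = smf.ols("restrictive_housing ~ staff_perceptions * cu_traits", data=df).fit()
print(model.summary())
```

    A count outcome such as incident frequency would more properly use a Poisson model, but the moderation logic via the interaction term is the same.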

    Past and future drought in Mongolia

    The severity of recent droughts in semiarid regions is increasingly attributed to anthropogenic climate change, but it is unclear whether these moisture anomalies exceed those of the past and how past variability compares to future projections. On the Mongolian Plateau, a recent decade-long drought that exceeded the variability in the instrumental record was associated with economic, social, and environmental change. We evaluate this drought using an annual reconstruction of the Palmer Drought Severity Index (PDSI) spanning the last 2060 years in concert with simulations of past and future drought through the year 2100 CE. We show that although the most recent drought and pluvial were highly unusual in the last 2000 years, exceeding the 900-year return interval in both cases, these events were not unprecedented in the 2060-year reconstruction, and events of similar duration and severity occur in paleoclimate, historical, and future climate simulations. The Community Earth System Model (CESM) ensemble suggests a drying trend until at least the middle of the 21st century, when this trend reverses as a consequence of elevated precipitation. Although the potential direct effects of elevated CO2 on plant water-use efficiency exacerbate uncertainties about future hydroclimate trends, these results suggest that future drought projections for Mongolia are unlikely to exceed those of the last two millennia, despite projected warming.
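
    An empirical return interval of the kind cited above can be estimated directly from an annual reconstruction: the length of the record divided by the number of years exceeding a severity threshold. A minimal sketch with synthetic data standing in for the 2060-year PDSI series; the threshold and the single-year event definition are assumptions here, not the authors' method (their events are multi-year droughts):

```python
import numpy as np

rng = np.random.default_rng(0)
pdsi = rng.normal(0.0, 2.0, size=2060)  # synthetic stand-in for the reconstruction

threshold = -4.0  # PDSI <= -4 is conventionally labelled "extreme drought"
n_exceedances = int(np.sum(pdsi <= threshold))
if n_exceedances:
    return_interval = len(pdsi) / n_exceedances
    print(f"{n_exceedances} years at PDSI <= {threshold}: "
          f"~{return_interval:.0f}-year return interval")
```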

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study

    Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1-associated morbidity and mortality. However, HIV-1-infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1-infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1-infected, ART-treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART-treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miR-21, miR-122 and miR-200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miR-31, miR-150 and miR-223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found with any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality or of the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.
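
    The mortality analysis described above is a standard logistic regression of case status on miRNA abundance. A minimal sketch with simulated values; the variable names and data are illustrative, not the study's:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 373                                # 126 cases + 247 controls, as in the study
mir_abundance = rng.normal(0, 1, n)    # hypothetical normalised miR-21 level
died = rng.integers(0, 2, n)           # 1 = case (died on therapy), 0 = control

X = sm.add_constant(mir_abundance)
fit = sm.Logit(died, X).fit(disp=0)

odds_ratio = np.exp(fit.params[1])     # OR per unit change in abundance
print(f"OR = {odds_ratio:.2f}, p = {fit.pvalues[1]:.3f}")
```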

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    M. Ristola is a member of the following working groups: DAD Study Grp; Royal Free Hosp Clin Cohort; INSIGHT Study Grp; SMART Study Grp; ESPRIT Study Grp. Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (> 3 mo apart) eGFR < 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with progressively greater risk in the medium and high risk (risk score >= 5, 505 events) groups. Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians in weighing the benefits of certain antiretrovirals against the risk of CKD and in identifying those at greatest risk of CKD.
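
    The NNTH figures quoted above follow from a simple relationship: the number needed to harm is the reciprocal of the absolute risk increase attributable to the drug. A short illustration using the abstract's low-risk-group numbers; the drug-attributable risk here is back-calculated from the reported NNTH, not taken from the paper's tables:

```python
def nnth(risk_with_drug: float, risk_without_drug: float) -> float:
    """Number needed to harm = 1 / absolute risk increase."""
    return 1.0 / (risk_with_drug - risk_without_drug)

baseline_5y_risk = 1 / 393   # low risk group, from the abstract
risk_increase = 1 / 1702     # implied by the reported NNTH of 1,702
print(round(nnth(baseline_5y_risk + risk_increase, baseline_5y_risk)))  # -> 1702
```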

    A workplace intervention designed to interrupt prolonged occupational sitting: Self-reported perceptions of health from a cohort of desk-based employees over 26 weeks

    Purpose – The purpose of this paper is to investigate the effectiveness of a workplace intervention designed to interrupt prolonged occupational sitting time (POST) and its impact on the self-reported health of a cohort of desk-based employees. Design/methodology/approach – In total, 43 participants received an interactive computer-based software intervention for 26 weeks. For the first 13 weeks, the intervention passively prompted the participants to interrupt POST and perform brief bouts of non-purposeful movement. For the second 13 weeks, the passive prompts were removed and the intervention was accessible only when participants chose to engage with it voluntarily. This approach was adopted to determine whether the intervention produced a sustainable change in workplace health behaviour. Findings – ANOVA results revealed a significant interaction between group and test occasion, F(2, 42) = 2.79, p < 0.05, d = 0.37. Research limitations/implications – An action research approach was implemented for this study, and hence the participants were organised into one group. Based on a communitarian model, the intervention aimed to monitor how desk-based employees adapted to specific health behaviours, and therefore a control group was not included. Practical implications – Passively prompting desk-based employees to interrupt POST and perform non-purposeful movement at work improved self-reported health. Participant perceptions of health were maintained following the removal of the passive feature of the intervention. Social implications – Interventions predicated on a social ecological model that modify how employees interact with the workplace environment might provide a framework for health behaviour change in populations where sitting is customary. Originality/value – The passive approach used in this study removed the individual decision-making process to engage in health behaviour change, and established a sustainable effect on participant health.
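
    The effect size reported above (d = 0.37) is a Cohen's d. A minimal sketch of the pooled-standard-deviation calculation, with made-up pre/post self-reported health scores rather than the study's data:

```python
import numpy as np

# Hypothetical self-reported health scores before and after the intervention.
pre = np.array([6.1, 5.8, 6.4, 5.9, 6.0, 6.3])
post = np.array([6.6, 6.2, 6.9, 6.1, 6.5, 6.8])

# Cohen's d: mean difference divided by the pooled standard deviation.
pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
d = (post.mean() - pre.mean()) / pooled_sd
print(f"d = {d:.2f}")
```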