
    Risk factors for presentation to hospital with severe anaemia in Tanzanian children: a case-control study.

    In malaria-endemic areas, anaemia is a usually silent condition that nevertheless places a considerable burden on health services. Cases of severe anaemia often require hospitalization and blood transfusions. The objective of this study was to assess risk factors for admission with anaemia, to facilitate the design of anaemia control programmes. We conducted a prospective case-control study of children aged 2-59 months admitted to a district hospital in southern Tanzania. There were 216 cases of severe anaemia [packed cell volume (PCV) < 25%] and 234 age-matched controls (PCV ≥ 25%). Most cases [55.6% (n = 120)] were < 1 year of age. Anaemia was significantly associated with the educational level of parents, type of accommodation, health-seeking behaviour, the child's nutritional status, and recent and current medical history. Of these, the single most important factor was Plasmodium falciparum parasitaemia [OR 4.3, 95% confidence interval (CI) 2.9-6.5, P < 0.001]. Multivariate analysis showed that increased recent health expenditure [OR 2.2 (95% CI 1.3-3.9), P = 0.005], malnutrition [OR 2.4 (95% CI 1.3-4.3), P < 0.001], living > 10 km from the hospital [OR 3.0 (95% CI 1.9-4.9), P < 0.001], a history of previous blood transfusion [OR 3.8 (95% CI 1.7-9.1), P < 0.001] and P. falciparum parasitaemia [OR 9.5 (95% CI 4.3-21.3), P < 0.001] were independently related to the risk of being admitted with anaemia. These findings are considered in terms of the pathophysiological pathway leading to anaemia. The concentration of anaemia in infants, and problems of access to health services and adequate case management, underline the need for targeted preventive strategies for anaemia control.
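    The abstract reports crude and adjusted odds ratios with 95% confidence intervals. As a reminder of how such figures are derived, here is a minimal Python sketch computing a crude OR and a Woolf-method CI from a 2x2 table; the counts are hypothetical (chosen only to match the study's group sizes), not the study's data.

```python
# Crude odds ratio and 95% CI from a hypothetical 2x2 case-control table.
# Exposure = P. falciparum parasitaemia; counts are illustrative only.
import math

a, b = 120, 96    # exposed cases, unexposed cases (sum to 216 cases)
c, d = 60, 174    # exposed controls, unexposed controls (sum to 234 controls)

or_ = (a * d) / (b * c)                   # odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of ln(OR), Woolf method
lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```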

    Goal setting and self-efficacy among delinquent, at-risk and not at-risk adolescents

    Setting clear, achievable goals that enhance self-efficacy and reputational status directs the energies of adolescents into socially conforming or non-conforming activities. The present study investigates the characteristics of, and relationships between, goal setting and self-efficacy among a matched sample of 88 delinquent (18% female), 97 at-risk (20% female), and 95 not at-risk adolescents (20% female). Four related hypotheses were tested. Findings revealed that delinquent adolescents reported the fewest goals, set fewer challenging goals, had lower commitment to their goals, and reported lower levels of academic and self-regulatory efficacy than those in the at-risk and not at-risk groups. Discriminant function analysis indicated that adolescents who reported high delinquency goals and low educational and interpersonal goals were likely to belong to the delinquent group, while adolescents who reported high educational and interpersonal goals and low delinquency goals were likely to belong to the not at-risk group. The at-risk and not at-risk groups could not be differentiated. A multinomial logistic regression also revealed that adolescents were more likely to belong to the delinquent group if they reported lower self-regulatory efficacy and lower goal commitment. These findings have important implications for the development of prevention and intervention programs, particularly for adolescents on a trajectory to delinquency. Specifically, programs should focus on assisting adolescents to develop clear, self-set, achievable goals and support them through the process of attaining them, particularly if the trajectory towards delinquency is to be addressed.
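    The multinomial logistic regression mentioned above predicts membership in one of three groups from continuous predictors. A minimal sketch of that kind of model, using simulated data (the group means, per-group sample size, and predictor scales below are invented, not the study's):

```python
# Multinomial logistic regression: predict group (delinquent / at-risk /
# not at-risk) from self-regulatory efficacy and goal commitment scores.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 90  # per group (hypothetical; the study had 88/97/95)
X = np.vstack([
    rng.normal([3.0, 3.0], 0.6, (n, 2)),  # delinquent: lower on both
    rng.normal([3.8, 3.7], 0.6, (n, 2)),  # at-risk
    rng.normal([4.0, 3.9], 0.6, (n, 2)),  # not at-risk
])
y = np.repeat(["delinquent", "at-risk", "not-at-risk"], n)

# With a multiclass target, the default lbfgs solver fits a multinomial model
model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.classes_)
print(model.coef_)  # one row of coefficients per group
```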

    Outcomes in randomised controlled trials in prevention and management of carious lesions:a systematic review

    Background: Inconsistent outcome reporting is a significant hurdle to combining results from trials in systematic reviews. Core outcome sets (COS) can reduce this barrier. The aim of this review was to map the outcomes reported in randomised controlled trials (RCTs) of caries prevention and management as a first step towards COS development. We also investigated RCT characteristics and the reporting of primary outcomes and sample size calculations.
    Methods: PubMed, Embase, Web of Knowledge and Cochrane CENTRAL were systematically searched (1 January 1968 to 25 August 2015). Inclusion criteria: RCTs comparing any technique for the prevention or management of caries with another technique or placebo, and RCTs comparing interventions to support patients undergoing treatment of caries (without setting, dentition or age restrictions). Categories were developed through piloting and group consensus, and outcomes were grouped accordingly.
    Results: Of 4773 search results, 764 were potentially relevant; full text was available for 731 papers, and 605 publications met the inclusion criteria and were included. Between the periods 1968–1980 and 2001–2010, reporting of the outcome 'caries experience' fell from 39% to 18%, while reporting of 'clinical performance of the restoration' rose from 33% to 42%, before falling back to 22% in 2011–2015. Emerging outcome domains include 'lesion activity' and 'pulp health-related outcomes', which accounted for 1% and 0%, respectively, during 1968–1980 and 10% and 4% during 2011–2015. Reporting of 'resource efficiency' and 'quality of life' measures has remained at a low level. No publications reported tooth survival independent of an index such as DMFT or equivalent. Primary outcomes were identified as such in only 414 (68%) of the reports.
    Conclusions: Over the past 50 years, outcome reporting in trials on the prevention and management of carious lesions has tended to focus on outcomes measuring caries experience and the clinical performance of restorations, with lesion activity and cost-effectiveness increasingly being reported. Patient-reported and patient-focused outcomes are becoming more common (although as secondary outcomes) but remain little used. The challenge in developing a COS will be balancing commonly reported past outcomes against those more relevant for the future.
    Trial registration: PROSPERO, CRD42015025310. Registered on 14 August 2015, in Trials (Schwendicke et al., Trials 16:397, 2015) and with the COMET initiative online (COMET, 2017).
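    A sketch of the outcome-mapping tabulation described in the Methods: group each report's outcomes into domains, then compute the share of reports per domain within each publication period. The toy records below are invented; the review's actual data set comprised 605 reports.

```python
# Share of trial reports using each outcome domain, by publication period.
import pandas as pd

records = pd.DataFrame({
    "period":  ["1968-1980", "1968-1980", "2001-2010", "2011-2015", "2011-2015"],
    "outcome": ["caries experience", "restoration performance",
                "restoration performance", "lesion activity", "pulp health"],
})

# Row-normalised crosstab: percentage of reports per domain in each period
share = pd.crosstab(records["period"], records["outcome"], normalize="index")
print((share * 100).round(1))
```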

    Baby food pouches and Baby-Led Weaning: Associations with energy intake, eating behaviour and infant weight status.

    Although concern is frequently expressed regarding the potential impact of baby food pouch use and Baby-Led Weaning (BLW) on infant health, research is scarce. Data on pouch use, BLW, energy intake, eating behaviour and body mass index (BMI) were obtained for 625 infants aged 7-10 months in the First Foods New Zealand study. Frequent pouch use was defined as ≥5 times/week during the past month. Traditional spoon-feeding (TSF), "partial" BLW and "full" BLW referred to the relative proportions of spoon-feeding versus infant self-feeding, assessed at 6 months (retrospectively) and at current age. Daily energy intake was determined using two 24-h dietary recalls, and caregivers reported on a variety of eating behaviours. Researchers measured infant length and weight, and BMI z-scores were calculated (World Health Organization Child Growth Standards). In total, 28% of infants consumed food from pouches frequently. Frequent pouch use was not significantly related to BMI z-score (mean difference, 0.09; 95% CI -0.09, 0.27) or energy intake (92 kJ/day; -19, 202), but was associated with greater food responsiveness (standardised mean difference, 0.3; 95% CI 0.1, 0.4), food fussiness (0.3; 0.1, 0.4) and selective/restrictive eating (0.3; 0.2, 0.5). Compared with TSF, full BLW was associated with greater daily energy intake (BLW at 6 months: mean difference 150 kJ/day; 95% CI 4, 297; BLW at current age: 180 kJ/day; 62, 299) and with a range of eating behaviours, including greater satiety responsiveness, but not with BMI z-score (6 months: 0.06 (-0.18, 0.30); current age: 0.06 (-0.13, 0.26)). In conclusion, neither feeding approach was associated with infant weight, despite BLW being associated with greater energy intake than TSF. However, infants who consumed pouches frequently displayed higher food fussiness and more selective eating.
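    The BMI z-scores referred to above come from the WHO Child Growth Standards, which express each measurement relative to age- and sex-specific LMS parameters: z = ((X/M)^L - 1) / (L * S) for L != 0. A minimal sketch, with placeholder L, M and S values (real values must be taken from the published WHO reference tables):

```python
# BMI-for-age z-score via the LMS method (no WHO tail adjustment applied).
from math import log

def lms_zscore(x: float, L: float, M: float, S: float) -> float:
    """z-score of measurement x given LMS reference parameters."""
    if L == 0:
        return log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Hypothetical infant BMI of 17.2 against assumed reference parameters
print(round(lms_zscore(17.2, L=-0.15, M=17.0, S=0.08), 2))
```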

    A neural circuit model of decision uncertainty and change-of-mind

    Decision-making is often accompanied by a degree of confidence that a choice is correct. Decision uncertainty, or lack of confidence, may lead to change-of-mind. Studies have identified the behavioural characteristics associated with decision confidence or change-of-mind, and their neural correlates. Although several theoretical accounts have been proposed, there is no neural model that can compute decision uncertainty and explain its effects on change-of-mind. We propose a neuronal circuit model that computes decision uncertainty while accounting for a variety of behavioural and neural data on decision confidence and change-of-mind, including testable model predictions. Our theoretical analysis suggests that change-of-mind occurs due to the presence of a transient, uncertainty-induced, choice-neutral stable steady state and noisy fluctuation within the neuronal network. Our distributed network model indicates that the neural basis of change-of-mind is more distinctively identified in motor-based neurons. Overall, our model provides a framework that unifies decision confidence and change-of-mind.
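    To make the change-of-mind mechanism concrete: in attractor models of decision-making, noise can transiently flip which of two competing neural populations is winning. The following is a generic winner-take-all sketch, not the authors' actual circuit model; all parameter values are invented for illustration.

```python
# Two mutually inhibiting rate populations with noisy Euler integration.
# A flip in the leading population is a toy analogue of a change-of-mind.
import numpy as np

rng = np.random.default_rng(1)
dt, tau, T = 1e-3, 0.02, 1.0        # step (s), time constant (s), duration (s)
w_self, w_inh = 0.8, 0.5            # self-excitation, cross-inhibition
I1, I2, sigma = 0.52, 0.48, 0.8     # inputs (weak evidence for option 1), noise

f = lambda x: max(x, 0.0)           # threshold-linear transfer function
r1, r2, leader = 0.1, 0.1, []
for _ in range(int(T / dt)):
    dr1 = (-r1 + f(w_self * r1 - w_inh * r2 + I1)) * dt / tau
    dr2 = (-r2 + f(w_self * r2 - w_inh * r1 + I2)) * dt / tau
    r1 = max(r1 + dr1 + sigma * np.sqrt(dt) * rng.normal(), 0.0)
    r2 = max(r2 + dr2 + sigma * np.sqrt(dt) * rng.normal(), 0.0)
    leader.append(r1 > r2)

flips = sum(a != b for a, b in zip(leader, leader[1:]))
print(f"leading population changed {flips} time(s)")
```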

    Baby Food Pouches, Baby-Led Weaning, and Iron Status in New Zealand Infants: An Observational Study.

    Iron deficiency in infants can impact development, and there are concerns that the use of baby food pouches and baby-led weaning may impair iron status. First Foods New Zealand (FFNZ) was an observational study of 625 New Zealand infants aged 6.9 to 10.1 months. Feeding methods were defined based on parental reports of infant feeding at "around 6 months of age": "frequent" baby food pouch use (five or more times per week) and "full baby-led weaning" (the infant primarily self-feeds). Iron status was assessed using a venepuncture blood sample. The estimated prevalence of suboptimal iron status was 23%, but neither feeding method significantly predicted body iron concentration or the odds of iron sufficiency after controlling for potential confounding factors, including infant formula intake. Adjusted ORs for iron sufficiency were 1.50 (95% CI: 0.67-3.39) for frequent pouch users compared with non-pouch users and 0.91 (95% CI: 0.45-1.87) for baby-led weaning compared with traditional spoon-feeding. Contrary to concerns, there was no evidence that baby food pouch use or baby-led weaning, as currently practised in New Zealand, was associated with poorer iron status in this age group. However, the notable level of suboptimal iron status, regardless of feeding method, emphasises the ongoing need to pay attention to infant iron nutrition.
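    The adjusted odds ratios quoted above come from a logistic regression with confounders included in the model; exponentiating a fitted coefficient gives the OR for that variable while holding the others fixed. A minimal sketch with simulated data (the variables and effect sizes are placeholders, not the study's):

```python
# Adjusted odds ratios for iron sufficiency from a logistic regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 625
pouch = rng.binomial(1, 0.28, n)        # frequent pouch use (hypothetical)
formula_ml = rng.normal(300, 150, n)    # formula intake, a confounder
logit_p = -0.8 + 0.2 * pouch + 0.002 * formula_ml   # invented true model
sufficient = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([pouch, formula_ml]))
fit = sm.Logit(sufficient, X).fit(disp=False)
print(np.exp(fit.params[1:]))           # adjusted ORs: pouch, formula
print(np.exp(fit.conf_int()[1:]))       # their 95% CIs
```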

    Nutritional Implications of Baby-Led Weaning and Baby Food Pouches as Novel Methods of Infant Feeding: Protocol for an Observational Study

    BACKGROUND: The complementary feeding period is a time of unparalleled dietary change for every human, during which, in less than a year, the diet changes from one that is 100% milk to one that resembles the usual diet of the wider family. Despite this major dietary shift, we know relatively little about food and nutrient intake in infants worldwide, and virtually nothing about the impact of baby food "pouches" and "baby-led weaning" (BLW), infant feeding approaches that are becoming increasingly popular. Pouches are squeezable containers with a plastic spout that have great appeal for parents, as evidenced by their extraordinary market share worldwide. BLW is an alternative approach to introducing solids that promotes infant self-feeding of whole foods rather than feeding with purées, and it is popular and widely advocated on social media. The nutritional and health impacts of these novel methods of infant feeding have not yet been determined.
    OBJECTIVE: The aim of the First Foods New Zealand study is to determine the iron status, growth, food and nutrient intakes, breast milk intake, eating and feeding behaviors, dental health, oral motor skills, and choking risk of New Zealand infants in general, and of those who are using pouches or BLW compared with those who are not.
    METHODS: Dietary intake (two 24-hour recalls supplemented with food photographs), iron status (hemoglobin, plasma ferritin, and soluble transferrin receptor), weight status (BMI), food pouch use and extent of BLW (questionnaire), breast milk intake (deuterium oxide "dose-to-mother" technique), eating and feeding behaviors (questionnaires and video recording of an evening meal), dental health (photographs of upper and lower teeth for counting caries and developmental defects of enamel), oral motor skills (questionnaires), and choking risk (questionnaire) will be assessed in 625 infants aged 7.0 to 9.9 months. Propensity score matching will be used to address bias caused by demographic differences between groups, so that the results more closely represent a potential causal effect.
    RESULTS: This observational study has full ethical approval from the Health and Disability Ethics Committees New Zealand (19/STH/151) and was funded in May 2019 by the Health Research Council (HRC) of New Zealand (grant 19/172). Data collection commenced in July 2020, and the first results are expected to be submitted for publication in 2022.
    CONCLUSIONS: This large study will provide much-needed data on the nutritional and health implications of using baby food pouches and BLW in infancy.
    TRIAL REGISTRATION: Australian New Zealand Clinical Trials Registry ACTRN12620000459921; http://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=379436
    INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): DERR1-10.2196/29048
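    Propensity score matching, named in the Methods above, pairs each infant in one feeding group with a demographically similar infant from the comparison group, using the modelled probability of group membership as a one-number similarity score. A minimal sketch with simulated data (all variables below are invented):

```python
# Greedy 1:1 nearest-neighbour propensity score matching.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 625
demo = rng.normal(size=(n, 3))      # hypothetical demographic covariates
grp = rng.binomial(1, 1 / (1 + np.exp(-demo[:, 0])))  # e.g. pouch use

# Propensity score: predicted probability of being in the exposed group
ps = LogisticRegression().fit(demo, grp).predict_proba(demo)[:, 1]

controls = np.flatnonzero(grp == 0)
pairs, used = [], set()
for t in np.flatnonzero(grp == 1):
    free = [c for c in controls if c not in used]
    if not free:
        break
    c = min(free, key=lambda j: abs(ps[j] - ps[t]))  # closest score wins
    pairs.append((t, c))
    used.add(c)
print(f"matched {len(pairs)} exposed-control pairs")
```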

    Network-Based Approach for Modeling and Analyzing Coronary Angiography

    Significant intra-observer and inter-observer variability in the interpretation of coronary angiograms is reported. This variability is partly due to common practice relying on visual inspection by specialists (e.g., judging the thickness of coronary arteries). Quantitative Coronary Angiography (QCA) approaches are emerging to minimize observer error and, furthermore, to perform prediction and analysis on angiography images. However, QCA approaches suffer from the same problem, as they still rely mainly on visual inspection supported by image-processing techniques. In this work, we propose an approach to model and analyze the entire cardiovascular tree as a complex network derived from coronary angiography images. This approach enables analysis of the graph structure of the coronary arteries. We conduct assessments of network integration, degree distribution, and controllability on a healthy and a diseased coronary angiogram. Through our discussion and assessments, we argue that modeling the cardiovascular system as a complex network is an essential step towards fully automating the interpretation of coronary angiographic images. We show how network science can provide a new perspective on coronary angiograms.
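    The paper's general idea, representing a vessel tree as a graph and computing network measures such as integration and degree distribution, can be sketched in a few lines. The toy topology below is a simple bifurcating tree, not an actual angiogram-derived network:

```python
# Network measures on a toy bifurcating "coronary tree".
import networkx as nx

G = nx.balanced_tree(r=2, h=4)   # each segment splits in two, four levels deep

degrees = sorted({d for _, d in G.degree()})
print("distinct node degrees:", degrees)                       # leaves, root, branch points
print("global efficiency:", round(nx.global_efficiency(G), 3)) # an integration measure
print("avg shortest path:", round(nx.average_shortest_path_length(G), 3))
```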

    Athlete and coach perceptions of technology needs for evaluating running performance

    This article was published in the journal Sports Engineering [Springer Verlag / © International Sports Engineering Association]. The definitive version is available at: http://dx.doi.org/10.1007/s12283-010-0049-9
    Athletes and their support teams utilise technology to measure and evaluate technique and athletic performance. Existing techniques for motion and propulsion measurement and analysis include a combination of indirect methods (high-speed video) and direct methods (force plates and pressure systems). These methods are largely limited to controlled laboratory environments (a small area relative to the competition environment), require expert advice and support, and can take significant time for data evaluation. Consequently, the more advanced measurement techniques are restricted to specific coaching sessions, or to periods in the year leading up to competition, when the time and expertise of additional support staff are available. The more widely used, simpler devices for monitoring 'performance' during running include stopwatches, GPS tracking and accelerometer-based systems that count strides. These provide useful information on running duration, distance and velocity, but lack detailed information on many key aspects of running technique. To begin developing more innovative technologies for routine use by athletes and coaches, a study was required to improve the understanding of athletes' and coaches' perceptions of their requirements from measurement technology. This study outlines a systematic approach to eliciting and evaluating those perceptions, and presents the findings from interviews and a questionnaire. The qualitative data are presented as a hierarchical graphical plot (structured relationship model) showing six general dimensions (technique, footwear and surface, environment, performance, injury and cardiovascular) and the development of these general dimensions from the interviewee quotations. The quantitative questionnaire data enhance the study by further ranking the characteristics arising from the interviews. A contrast is shown between short- and longer-distance runner groups, as might be expected. The current technology available to elite runners is briefly reviewed in relation to the 22 characteristics identified as important to measure. The conclusions highlight the need for newer technologies to measure aspects of running style and performance in a portable and integrated manner, with suggestions as to the size and weight likely to be acceptable to users of emerging devices. © 2010 International Sports Engineering Association