
    The Changes in the Mortality Rates of Low Birth Weight Infant and Very Low Birth Weight Infant in Korea over the Past 40 Years

    A total of 36 reports on the mortality rates (MRs) of low birth weight infants (LBWI) and very LBWI (VLBWI) in Korea from 1967 through 2001 were analyzed. We compared the changes in the MRs at 5- and 10-yr intervals. The MRs observed at 5-yr intervals from the early 1960s through the 1990s decreased drastically. The MRs of LBWI were as follows: 23.1% and 23.6% in the 1960s, 17.3% and 16.8% in the 1970s, 14.1% and 14.4% in the 1980s, and 8.1% in the early 1990s. The MRs of VLBWI have also fallen and were reported as follows: 68.2% and 63.7% in the 1960s, 55.8% and 57.6% in the 1970s, 56.2% and 48.1% in the 1980s, 33.5% and 24.5% in the 1990s, and 11.7% in the early 2000s. By 10-yr period, the MRs of LBWI decreased from 23.4% in the 1960s to 17.0% in the 1970s, 14.2% in the 1980s, and 8.1% in the 1990s. The MRs of VLBWI likewise decreased from 66.2% in the 1960s to 56.7% in the 1970s, 50.8% in the 1980s, 32.9% in the 1990s, and 11.7% in the 2000s. The MRs of LBWI and VLBWI have declined remarkably owing to improvements in neonatology in Korea, as shown above

    Complementary feeding: a Global Network cluster randomized controlled trial

    Background: Inadequate and inappropriate complementary feeding are major factors contributing to excess morbidity and mortality in young children in low-resource settings. Animal-source foods in particular are cited as essential to achieve micronutrient requirements. The efficacy of the recommendation for regular meat consumption, however, has not been systematically evaluated. Methods/Design: A cluster randomized efficacy trial was designed to test the hypothesis that 12 months of daily intake of beef added as a complementary food would result in greater linear growth velocity than a micronutrient-fortified equi-caloric rice-soy cereal supplement. The study is being conducted in 4 sites of the Global Network for Women's and Children's Health Research located in Guatemala, Pakistan, the Democratic Republic of the Congo (DRC) and Zambia, in communities with toddler stunting rates of at least 20%. Five clusters per country were randomized to each of the food arms, with 30 infants in each cluster. The daily meat or cereal supplement was delivered to the home by community coordinators, starting when the infants were 6 months of age and continuing through 18 months. All participating mothers received nutrition education messages to enhance complementary feeding practices, delivered by study coordinators and through posters at the local health center. Outcome measures, obtained at 6, 9, 12, and 18 months by a separate assessment team, included anthropometry, dietary variety and diversity scores, biomarkers of iron, zinc and vitamin B(12) status (18 months), neurocognitive development (12 and 18 months), and incidence of infectious morbidity throughout the trial. The trial was supervised by a trial steering committee, and an independent data monitoring committee provided oversight for the safety and conduct of the trial.
Discussion: Findings from this trial will test the efficacy of daily intake of meat commencing at age 6 months and, if beneficial, will provide a strong rationale for global efforts to enhance local supplies of meat as a complementary food for young children

    The role of the General Practitioner in weight management in primary care – a cross sectional study in General Practice

    BACKGROUND: Obesity has become a global pandemic, considered the sixth leading cause of mortality by the WHO. As gatekeepers to the health system, General Practitioners are ideally placed to manage obesity. Yet very few consultations address weight management. This study aims to explore why patients attending General Practice appointments are not engaging with their General Practitioner (GP) for weight management, and their perception of the role of the GP in managing their weight. METHODS: In February 2006, 367 participants aged between 17 and 64 were recruited from three General Practices in Melbourne to complete a waiting-room self-administered questionnaire. Questions included basic demographics, the role of the GP in weight management, the likelihood of raising weight management with their GP and reasons why they would not, and their nominated ideal person to consult for weight management. Physical measurements to determine weight status were then completed. Statistical methods included means and standard deviations to summarise continuous variables such as weight and height. Subgroups of weight status and questionnaire answers were analysed using the chi-squared test, with p < 0.05 taken as significant. RESULTS: The population sample had obesity co-morbidity rates similar to the National Heart Foundation data. 74% of patients were not likely to bring up weight management when visiting their GP. The main reasons given were time limitations on both the patient's and the doctor's part, and the doctor lacking experience. The GP was the least likely person to have told a patient to lose weight, ranking after partner, family and friends. Of the 14% who had been told by their GP to lose weight, 90% had cardiovascular obesity-related co-morbidities.
GPs (15%) ranked fourth in the list of ideal persons to manage weight, after the personal trainer. CONCLUSION: Patients do not have confidence in their GPs for weight management, preferring other health professionals who may lack evidence-based training. Concurrently, GPs target only those with obesity-related co-morbidities. Further studies evaluating GPs' opinions about weight management, effective strategies that can be implemented in primary care, and the co-ordination of a team approach are needed

    The impact of timing of maternal influenza immunization on infant antibody levels at birth

    Pregnant women and infants are at an increased risk of severe disease after influenza infection. Maternal immunization is a potent tool to protect both these at‐risk groups. While the primary aim of maternal influenza vaccination is to protect the mother, a secondary benefit is the transfer of protective antibodies to the infant. A recent study using the tetanus, diphtheria and acellular pertussis (Tdap) vaccine indicated that children born to mothers immunized in the second trimester of pregnancy had the highest antibody titres compared to children born to mothers immunized in the third trimester. The aim of the current study was to investigate how the timing of maternal influenza immunization impacts infant antibody levels at birth. Antibody titres were assessed in maternal and cord blood samples by both immunoglobulin (Ig)G‐binding enzyme‐linked immunosorbent assay (ELISA) and haemagglutination inhibition assay (HAI). Antibody titres to the H1N1 component were significantly higher in infants born to mothers vaccinated in either the second or third trimester than in infants born to unvaccinated mothers. HAI levels in the infant were significantly lower when maternal immunization was performed less than 4 weeks before birth. These studies confirm that immunization during pregnancy increases the antibody titre in infants. Importantly, antibody levels in cord blood were significantly higher when the mother was vaccinated in either trimester 2 or 3, although titres were significantly lower if the mother was immunized less than 4 weeks before birth. Based on these data, seasonal influenza vaccination should continue to be given in pregnancy as soon as it becomes available

    Topography as a modifier of breeding habitats and concurrent vulnerability to malaria risk in the western Kenya highlands

    <p>Abstract</p> <p>Background</p> <p>Topographic parameters such as elevation, slope, aspect, and ruggedness play an important role in malaria transmission in highland areas. They affect biological systems such as the presence and productivity of larval habitats for malaria mosquitoes. This study investigated whether the local spatial distribution of malaria vectors and the risk of infection with malaria parasites in the highlands are related to topography.</p> <p>Methods</p> <p>Four villages, each measuring 9 km<sup>2</sup> and lying between 1,400 and 1,700 m above sea level in the western Kenya highlands, were categorized into pairs of broad- and narrow-valley terrain sites. Surveys of larvae, indoor-resting adult malaria vectors, and infections were conducted from the valley bottom to the hilltop on both sides of the valley during the rainy and dry seasons. Data collected at a distance of ≤500 m from the main river/stream were categorized as valley bottom and those above as uphill. Larval surveys were categorized by habitat location, and vectors and infections by house location.</p> <p>Results</p> <p>Overall, broad flat-bottomed valleys had a significantly higher number of Anopheles larvae/dip in their habitats than narrow valleys during both the dry (1.89 versus 0.89 larvae/dip) and the rainy season (1.66 versus 0.89 larvae/dip). Similarly, adult vector densities/house in broad-valley villages were higher than those in narrow-valley houses during both the dry (0.64 versus 0.40) and the rainy season (0.96 versus 0.09). Asymptomatic malaria prevalence was significantly higher in participants residing in broad-valley than in narrow-valley villages during the dry (14.55% vs. 7.48%) and rainy (17.15% vs. 1.20%) seasons.
Malaria infections were widespread in broad-valley villages during both the dry and rainy seasons, whereas over 65% of infections were clustered at the valley bottom in narrow-valley villages during both seasons.</p> <p>Conclusion</p> <p>Despite being in the highlands, local areas with low-gradient topography characterized by broad valley bottoms have stable and significantly high malaria risk, unlike those with steep-gradient topography, which exhibit seasonal variation. Topographic parameters could therefore be considered in the identification of high-risk malaria foci to help enhance surveillance or targeted control activities in the regions where they are most needed.</p>

    Insecticide-treated net (ITN) ownership, usage, and malaria transmission in the highlands of western Kenya

    <p>Abstract</p> <p>Background</p> <p>Insecticide-treated bed nets (ITNs) are known to be highly effective in reducing malaria morbidity and mortality. However, usage varies among households, and such variation in actual usage may seriously limit the potential impact of nets and cause spatial heterogeneity in malaria transmission. This study examined ITN ownership, the underlying factors for among-household variation in use, and malaria transmission in two highland regions of western Kenya.</p> <p>Methods</p> <p>Cross-sectional surveys were conducted on ITN ownership (possession), compliance (actual usage among those who own ITNs), and malaria infections in occupants of randomly sampled houses in the dry and rainy seasons of 2009.</p> <p>Results</p> <p>Despite ITN ownership reaching more than 71%, compliance was low at 56.3%. The compliance rate was significantly higher during the rainy season than during the dry season (62% vs. 49.6%). Both malaria parasite prevalence (11.8% vs. 5.1%) and vector densities (1.0 vs. 0.4 females/house/night) were significantly higher during the rainy season than during the dry season. Other important factors affecting the use of ITNs included a household education level of at least primary school, significantly high numbers of nuisance mosquitoes, and low indoor temperatures. Malaria prevalence in the rainy season was about 30% lower in ITN users than in non-users, but this difference was not significant during the dry season.</p> <p>Conclusion</p> <p>In the malaria hypo-mesoendemic highland regions of western Kenya, the gap between ITN ownership and usage is generally high, with greater usage recorded during the high-transmission season. Because of the low compliance among those who own ITNs, there is a need to sensitize households to sustained use of ITNs in order to optimize their role as a malaria control tool.</p>

    Biomarkers of Nutrition for Development (BOND)—Iron Review

    This is the fifth in the series of reviews developed as part of the Biomarkers of Nutrition for Development (BOND) program. The BOND Iron Expert Panel (I-EP) reviewed the extant knowledge regarding iron biology, public health implications, and the relative usefulness of currently available biomarkers of iron status from deficiency to overload. Approaches to assessing intake, including bioavailability, are also covered. The report also covers technical and laboratory considerations for the use of available biomarkers of iron status, and concludes with a description of research priorities along with a brief discussion of new biomarkers with potential for use across the spectrum of activities related to the study of iron in human health. The I-EP concluded that current iron biomarkers are reliable for accurately assessing many aspects of iron nutrition. However, a clear distinction is made between the relative strengths of biomarkers to assess hematological consequences of iron deficiency versus other putative functional outcomes, particularly the relationship between maternal and fetal iron status during pregnancy, birth outcomes, and infant cognitive, motor and emotional development. The I-EP also highlighted the importance of considering the confounding effects of inflammation and infection on the interpretation of iron biomarker results, as well as the impact of life stage. Finally, alternative approaches to the evaluation of the risk for nutritional iron overload at the population level are presented, because the currently designated upper limits for the biomarker generally employed (serum ferritin) may not differentiate between true iron overload and the effects of subclinical inflammation