
    Evaluating the control of HPAIV H5N1 in Vietnam: virus transmission within infected flocks reported before and after vaccination

    Background: The highly pathogenic avian influenza virus (HPAIV) of subtype H5N1 is currently believed to have reached an endemic cycle in Vietnam. We used routine surveillance data on HPAIV H5N1 poultry outbreaks in Vietnam to estimate and compare the within-flock reproduction number (R0) for the period before vaccination (second epidemic wave, 2004–2005; depopulation-based disease control) and the period during vaccination (fourth epidemic wave, beginning 2007; vaccination-based disease control).
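    The abstract does not describe the estimation method, so the following is only an illustrative sketch: under a simple homogeneous SIR model, a within-flock reproduction number can be recovered from the final attack fraction z (the proportion of the flock ultimately infected) through the final-size relation.

        % Illustrative final-size relation; not necessarily the estimator used in the study.
        % z = fraction of the flock infected by the end of the outbreak
        \[ 1 - z = e^{-R_0 z} \quad\Longrightarrow\quad R_0 = \frac{-\ln(1 - z)}{z} \]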

    The emergence of international food safety standards and guidelines: understanding the current landscape through a historical approach

    Following the Second World War, the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) teamed up to construct an international Codex Alimentarius (or 'food code'), which emerged in 1963. The Codex Committee on Food Hygiene (CCFH) was charged with developing microbial hygiene standards, although it found itself embroiled in debate with the WHO over the form these standards should take. The WHO increasingly relied upon the input of biometricians, and especially the International Commission on Microbiological Specifications for Foods (ICMSF), which had developed statistical sampling plans for determining microbial counts in final end products. The CCFH, however, was initially more focused on a qualitative approach that looked at the entire food production system and developed codes of practice as well as more descriptive end-product specifications, which the WHO argued were 'not scientifically correct'. Drawing upon historical archival material (correspondence and reports) from the WHO and FAO, this article examines this debate over microbial hygiene standards and suggests that there are many lessons from history that could shed light on current debates and efforts in international food safety management systems and approaches.
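    For readers unfamiliar with the ICMSF-style sampling plans referred to above, the idea can be stated compactly (this is a generic illustration, not material from the article): in a two-class attributes plan, n sample units are tested against a microbial limit and the lot is accepted if at most c units fail, so the probability of accepting a lot in which a fraction p of units would fail is binomial.

        % Generic two-class attributes sampling plan (illustration only)
        % n = number of sample units, c = maximum allowed failures, p = defective fraction
        \[ P_{\text{accept}} = \sum_{k=0}^{c} \binom{n}{k} p^{k} (1 - p)^{\,n-k} \]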

    Elucidation of the controlled-release behavior of metoprolol succinate from directly compressed xanthan gum-chitosan polymers: computational and experimental studies

    The development and evaluation of a controlled-release (CR) pharmaceutical solid dosage form comprising xanthan gum (XG), low molecular weight chitosan (LCS) and metoprolol succinate (MS) is reported. The research is based, in part, on computational tools, namely molecular dynamics simulations (MD) and response surface methodology (RSM), to underpin design/prediction and to minimize the experimental work required to achieve the desired pharmaceutical outcomes. The capability of the system to control the release of MS was studied as a function of LCS content (% w/w) and the total polymer (LCS and XG) to drug ratio (P:D) at different tablet tensile strengths. MD trajectories, obtained using different ratios of XG:LCS as well as XG and high molecular weight chitosan (HCS), showed that the driving force for the interaction between XG and LCS is electrostatic in nature, that the most favourable complex is formed when LCS is used at 15% (w/w) and, importantly, that the interaction between XG and LCS is more favourable than that between XG and HCS. RSM outputs revealed that the release of the drug from the LCS/XG matrix is highly dependent on both the LCS content and the P:D ratio, and that the required CR effect can be achieved using LCS weight fractions ≤ 20% and P:D ratios ≥ 2.6:1. Results from in vitro drug release and swelling studies on the prepared tablets showed that using LCS at the weight fractions suggested by the MD and RSM data plays a major role in overcoming the high sensitivity of XG's controlled-release effect to the ionic strength and pH of the dissolution media. In addition, polymer relaxation was found to be the major contributor to the release of MS from LCS-XG tablets. Raman spectroscopy showed that MS is localized more in the core of the tablets at the initial stages of dissolution because film formation between LCS and XG on the tablet surface prevents excess water penetration into the matrix. In the later stages of dissolution, the film dissolves/erodes, allowing full tablet hydration and a uniform drug distribution in the swollen tablet.
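    The conclusion that polymer relaxation, rather than diffusion, dominates the release is commonly expressed through the Korsmeyer–Peppas power law; the equation below is a standard textbook form given for orientation and is not quoted from the paper, whose fitted model is not stated in the abstract.

        % Korsmeyer–Peppas power law (illustrative): M_t/M_inf = fraction of drug released at time t
        % For cylindrical tablets, n ~ 0.45 indicates Fickian diffusion and
        % n ~ 0.89 indicates relaxation-controlled (Case II) transport.
        \[ \frac{M_t}{M_\infty} = k\, t^{\,n} \]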

    Associations between attributes of live poultry trade and HPAI H5N1 outbreaks: a descriptive and network analysis study in northern Vietnam

    Background: The structure of contact between individuals plays an important role in the incursion and spread of contagious diseases in both human and animal populations. In the case of avian influenza, the movement of live birds is a well-known risk factor for the geographic dissemination of the virus among poultry flocks. Live bird markets (LBMs) contribute to the epidemiology of avian influenza because of their demographic characteristics and the presence of HPAI H5N1 virus lineages. The relationship between poultry producers and the live poultry traders (LPTs) that operate in LBMs has not been adequately documented in HPAI H5N1-affected Southeast Asian countries. The aims of this study were to document and analyse the flow of live poultry in a poultry trade network in northern Vietnam, and to explore its potential role in the risk for HPAI H5N1 during 2003–2006.
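    A minimal sketch of the kind of directed, weighted trade network such a study builds is shown below; the node names, bird counts and metrics are hypothetical, since the abstract does not specify the data fields or the network measures used.

        import networkx as nx

        # Hypothetical live-poultry movements: (origin, destination, birds moved)
        movements = [
            ("producer_A", "trader_1", 120),
            ("producer_B", "trader_1", 80),
            ("trader_1", "market_1", 200),
            ("producer_C", "trader_2", 60),
            ("trader_2", "market_1", 60),
        ]

        # Directed, weighted network of live-poultry flows
        G = nx.DiGraph()
        for origin, dest, birds in movements:
            G.add_edge(origin, dest, weight=birds)

        # Descriptive metrics often reported in livestock-trade network studies
        birds_received = dict(G.in_degree(weight="weight"))   # birds flowing into each node
        birds_shipped = dict(G.out_degree(weight="weight"))   # birds flowing out of each node
        bridging = nx.betweenness_centrality(G)               # nodes bridging producer-market flows

        print(birds_received, birds_shipped, bridging)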

    Identifying an essential package for school-age child health: economic analysis

    This chapter presents the investment case for providing an integrated package of essential health services for children attending primary schools in low- and middle-income countries (LMICs). In doing so, it builds on chapter 20 in this volume (Bundy, Schultz, and others 2017), which presents a range of relevant health services for the school-age population and the economic rationale for administering them through educational systems. This chapter identifies a package of essential health services that LMICs can aspire to implement through the primary and secondary school platforms. In addition, the chapter considers the design of such programs, including targeting strategies. Upper-middle-income countries and high-income countries (HICs) typically aim to implement such interventions on a larger scale and to include and promote additional health services relevant to their populations. Studies have documented the contribution of school health interventions to a range of child health and educational outcomes, particularly in the United States (Durlak and others 2011; Murray and others 2007; Shackleton and others 2016). Health services selected for the essential package are those that have demonstrated benefits and relevance for children in LMICs. The estimated costs of implementation are drawn from the academic literature. The concept of a package of essential school health interventions and its justification through a cost-benefit perspective was pioneered by Jamison and Leslie (1990). As chapter 20 notes, health services for school-age children can promote educational outcomes, including access, attendance, and academic achievement, by mitigating earlier nutrition and health deprivations and by addressing current infections and nutritional deficiencies (Bundy, Schultz, and others 2017). This age group is particularly at risk for parasitic helminth infections (Jukes, Drake, and Bundy 2008), and malaria has become prevalent in school-age populations as control for younger children delays the acquisition of immunity from early childhood to school age (Brooker and others 2017). Furthermore, school health services are commonly viewed as a means for building and reinforcing healthy habits to lower the risk of noncommunicable disease later in life (Bundy 2011). This chapter focuses on packages and programs to reach school-age children, while the previous chapter, chapter 24 (Horton and Black 2017), focuses on early childhood interventions, and the next chapter, chapter 26 (Horton and others 2017), focuses on adolescent interventions. These packages are all part of the same continuum of care from age 5 years to early adulthood, as discussed in chapter 1 (Bundy, de Silva, and others 2017). A particular emphasis of the economic rationale for targeting school-age children is to promote their health and education while they are in the process of learning; many of the interventions that are part of the package have been shown to yield substantial benefits in educational outcomes (Bundy 2011; Jukes, Drake, and Bundy 2008). They might be viewed as health interventions that leverage the investment in education. Schools are an effective platform through which to deliver the essential package of health and nutrition services (Bundy, Schultz, and others 2017).

    Primary enrollment and attendance rates increased substantially during the Millennium Development Goals era, making schools a delivery platform with the potential to reach large numbers of children equitably. Furthermore, unlike health centers, almost every community has a primary school, and teachers can be trained to deliver simple health interventions, resulting in the potential for high returns at relatively low cost by using the existing infrastructure. This chapter identifies a core set of interventions for children ages 5–14 years that can be delivered effectively through schools. It then simulates the returns to health and education and benchmarks them against the costs of the intervention, drawing on published estimates. The investment returns illustrate the scale of returns provided by school-based health interventions, highlighting the value of integrated health services and the parameters driving costs, benefits, and value for money (the ratio of benefits to costs). Countries seeking to introduce such a package need to undertake context-specific analyses of critical needs to ensure that the package responds to specific local needs.
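    The chapter's 'value for money' measure is a benefit-cost ratio; the sketch below shows how such a ratio is typically computed with discounting. All figures and the discount rate are placeholders, not estimates from the chapter.

        # Illustrative benefit-cost ratio for a school health package.
        # All values are placeholders, not the chapter's estimates.
        def present_value(annual_values, rate):
            """Discount a stream of annual values back to year 0."""
            return sum(v / (1 + rate) ** t for t, v in enumerate(annual_values))

        annual_costs = [2.0] * 10                  # cost per child per year
        annual_benefits = [0.0] * 5 + [6.0] * 5    # benefits realized later in life
        discount_rate = 0.03

        bcr = present_value(annual_benefits, discount_rate) / present_value(annual_costs, discount_rate)
        print(f"benefit-cost ratio: {bcr:.2f}")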

    Exposure Assessment Approaches for Engineered Nanomaterials

    Products based on nanotechnology are rapidly emerging in the marketplace, sometimes with little notice to consumers of their nanotechnology pedigree. This wide variety of nanotechnology products will result, in some cases, in unintentional human exposure to purposely engineered nanoscale materials via the dermal, inhalation, ingestion, and ocular pathways. Occupational, consumer, and environmental exposure to these nanomaterials should be characterized during the entire product lifecycle: manufacture, use, and disposal. Monitoring the fate and transport of engineered nanomaterials is complicated by the lack of detection techniques and the lack of a defined set of standardized metrics to be measured consistently. New exposure metrics may be required for engineered nanomaterials, but progress is possible by building on existing tools. An exposure metric matrix could organize existing data by relating likely exposure pathways (dermal, inhalation, ocular, ingestion) to existing measurements of important characteristics of nanoscale materials (particle number, mass, size distribution, charge). Nanomaterial characteristics that are not commonly measured but have been shown to initiate a biological response during toxicity testing signal a need for further research, such as the pressing need to develop monitoring devices capable of measuring those aspects of engineered nanomaterials that result in biological responses in humans. Modeling the behavior of nanoparticles may require new types of exposure models that individually track particles through the environment while keeping track of particle shape, surface area, and other surface characteristics as the nanoparticles are transformed or become reactive. Lifecycle analysis could also be used to develop conceptual models of exposure from engineered nanomaterials.
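    The exposure metric matrix described above is essentially a cross-tabulation of exposure pathways against measurable nanomaterial characteristics; a hypothetical sketch of such a structure follows (pathway and metric names are taken from the abstract, but the data entries and layout are invented for illustration).

        # Hypothetical exposure metric matrix: pathways x characteristics.
        # Entries record where measurement data exist; values here are invented.
        pathways = ["dermal", "inhalation", "ocular", "ingestion"]
        metrics = ["particle_number", "mass", "size_distribution", "charge"]

        matrix = {p: {m: None for m in metrics} for p in pathways}

        # Record one (invented) data source for a single cell
        matrix["inhalation"]["size_distribution"] = "workplace aerosol survey"

        # Cells without data flag where further measurement research is needed
        gaps = [(p, m) for p in pathways for m in metrics if matrix[p][m] is None]
        print(f"{len(gaps)} pathway-metric combinations lack data")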

    Managed Aquifer Recharge as a Tool to Enhance Sustainable Groundwater Management in California

    A growing population and an increased demand for water resources have resulted in a global trend of groundwater depletion. Arid and semi-arid regions are particularly susceptible, often relying on groundwater to support large population centers or irrigated agriculture in the absence of sufficient surface water resources. In an effort to increase the security of groundwater resources, managed aquifer recharge (MAR) programs have been developed and implemented globally. MAR is the approach of intentionally harvesting and infiltrating water to recharge depleted aquifer storage. California is a prime example of this growing problem, with three cities of over a million residents and an agricultural industry valued at 47 billion dollars in 2015. The present-day groundwater overdraft of over 100 km³ (accumulated since 1962) indicates a clear disparity between surface water supply and water demand within the state. In the face of groundwater overdraft and the anticipated effects of climate change, many new MAR projects are being constructed or investigated throughout California, adding to those that have existed for decades. Common MAR types used in California include injection wells, infiltration basins (also known as spreading basins, percolation basins, or recharge basins), and low-impact development. An emerging MAR type under active investigation is the winter flooding of agricultural fields using existing irrigation infrastructure and excess surface water, known as agricultural MAR. California therefore provides an excellent case study for examining the historical use and performance of MAR, ongoing and emerging challenges, novel MAR applications, and the potential for expansion of MAR. Effective MAR projects are an essential tool for increasing groundwater security, both in California and on a global scale. This chapter aims to provide an overview of the most common MAR types and applications within the State of California and neighboring semi-arid regions.
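    For a sense of scale, the yield of a single infiltration basin follows directly from its area, infiltration rate and operating time; the back-of-the-envelope sketch below uses invented values, not figures from the chapter.

        # Rough annual recharge for a hypothetical infiltration (spreading) basin.
        # All parameter values are illustrative.
        basin_area_m2 = 40_000                 # a 4-hectare spreading basin
        infiltration_rate_m_per_day = 0.5      # long-term average infiltration rate
        wet_days_per_year = 120                # days per year with water available

        annual_recharge_m3 = basin_area_m2 * infiltration_rate_m_per_day * wet_days_per_year
        print(f"annual recharge: {annual_recharge_m3 / 1e6:.1f} million m^3")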

    The roles and values of wild foods in agricultural systems

    Almost every ecosystem has been amended so that plants and animals can be used as food, fibre, fodder, medicines, traps and weapons. Historically, wild plants and animals were the sole dietary components for hunter–gatherer and forager cultures. Today, they remain key to many agricultural communities. The mean use of wild foods by agricultural and forager communities in 22 countries of Asia and Africa (36 studies) is 90–100 species per location. Aggregate country estimates can reach 300–800 species (e.g. India, Ethiopia, Kenya). For indigenous communities in both industrialized and developing countries, the mean use is 120 wild species per community. Many of these wild foods are actively managed, suggesting there is a false dichotomy around ideas of the agricultural and the wild: hunter–gatherers and foragers farm and manage their environments, and cultivators use many wild plants and animals. Yet provision of and access to these sources of food may be declining as natural habitats come under increasing pressure from development, conservation exclusions and agricultural expansion. Despite their value, wild foods are excluded from official statistics on the economic values of natural resources. It is clear that wild plants and animals continue to form a significant proportion of the global food basket, and while a variety of social and ecological drivers are acting to reduce wild food use, their importance may be set to grow as pressures on agricultural productivity increase.