16 research outputs found

    Aquatic food security: insights into challenges and solutions from an analysis of interactions between fisheries, aquaculture, food safety, human health, fish and human welfare, economy and environment

    Fisheries and aquaculture production, imports, exports and equitability of distribution determine the supply of aquatic food to people. Aquatic food security is achieved when a food supply is sufficient, safe, sustainable, shock-proof and sound: sufficient, to meet the needs and preferences of people; safe, to provide nutritional benefit while posing minimal health risks; sustainable, to provide food now and for future generations; shock-proof, to provide resilience to shocks in production systems and supply chains; and sound, to meet legal and ethical standards for the welfare of animals, people and environment. Here, we present an integrated assessment of these elements of the aquatic food system in the United Kingdom, a system linked to dynamic global networks of producers, processors and markets. Our assessment addresses sufficiency of supply from aquaculture, fisheries and trade; safety of supply given biological, chemical and radiation hazards; social, economic and environmental sustainability of production systems and supply chains; system resilience to social, economic and environmental shocks; welfare of fish, people and environment; and the authenticity of food. Conventionally, these aspects of the food system are not assessed collectively, so the information supporting our assessment is widely dispersed. Our assessment reveals trade-offs and challenges in the food system that are easily overlooked in sectoral analyses of fisheries, aquaculture, health, medicine, human and fish welfare, safety and environment. We highlight the potential benefits of an integrated, systematic and ongoing process to assess security of the aquatic food system and to predict the impacts of social, economic and environmental change on food supply and demand.

    Use of a chemical probe to increase safety for human volunteers in toxicokinetic studies

    To avoid interspecies extrapolation in toxicokinetics and drug development, it is preferable to obtain human data directly. In that case, the exposure dose should pose null or negligible risk to the exposed individual, yet still be high enough to allow quantification. We propose to reduce the dose received by human volunteers during exposure and to compensate for the loss of information by exposing the same volunteers to a nontoxic agent. This method was applied to develop 1,3-butadiene (BD) exposure protocols for humans. To study the potential of such a procedure, we worked with simulated data. Three exposure times (20, 10, and 5 minutes) and four exposure concentrations (2, 1, 0.8, and 0.5 ppm) were used to define 12 inhalation exposure scenarios for BD. Isoflurane was used as a probe, with simulated exposure of 20 subjects to 20 ppm isoflurane for 15 minutes. Exhaled-air concentrations of isoflurane or BD were assumed to be measured 10 times. A three-compartment physiological toxicokinetic model was used to jointly describe the BD and isoflurane data. For each subject, the BD data were analyzed in a Bayesian framework, either alone or together with the isoflurane data. The precision of the BD metabolic rate constant or fraction metabolized was increased, and bias reduced, when BD and probe data were considered jointly. An exposure to 10 ppm·min BD and 300 ppm·min isoflurane gave precision and bias equivalent to a single exposure to 40 ppm·min BD. The BD dose received by volunteers could therefore be at least quartered if BD exposure were supplemented with that of a probe.
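
    The protocol evaluation above hinges on forward simulation of a three-compartment physiological toxicokinetic model for the inhaled compounds. As a rough illustration of what such a forward model looks like, the Python sketch below simulates inhalation uptake and first-order metabolism in a generic three-compartment system (fat, well-perfused, and poorly perfused tissues); the compartment structure, every parameter value, and the exposure level are illustrative assumptions, not the model or numbers used in the study.

        # Minimal sketch (not the study's model): three-compartment
        # physiological toxicokinetic model for an inhaled compound, with
        # steady-state alveolar gas exchange and first-order metabolism in
        # the well-perfused compartment.  All values are placeholders.
        import numpy as np
        from scipy.integrate import solve_ivp

        Q_alv = 5.0                                    # alveolar ventilation, L/min
        Q_c = 6.0                                      # cardiac output, L/min
        Q = {"fat": 0.3, "well": 4.4, "poor": 1.3}     # tissue blood flows, L/min
        V = {"fat": 10.0, "well": 15.0, "poor": 35.0}  # tissue volumes, L
        P = {"fat": 20.0, "well": 1.0, "poor": 2.0}    # tissue:blood partition coeffs
        P_ba = 1.5                                     # blood:air partition coeff
        k_met = 0.1                                    # metabolic rate constant, 1/min

        def rhs(t, y, c_inh):
            """dA/dt (mg/min) for the three tissue compartments."""
            c = {name: y[i] / V[name] for i, name in enumerate(("fat", "well", "poor"))}
            # flow-weighted mixed venous concentration returning to the lung
            c_ven = sum(Q[name] * c[name] / P[name] for name in Q) / Q_c
            # arterial concentration from steady-state alveolar gas exchange
            c_art = (Q_alv * c_inh + Q_c * c_ven) / (Q_c + Q_alv / P_ba)
            dA = []
            for i, name in enumerate(("fat", "well", "poor")):
                uptake = Q[name] * (c_art - c[name] / P[name])
                if name == "well":
                    uptake -= k_met * y[i]             # metabolic loss
                dA.append(uptake)
            return dA

        c_exp = 0.004                                  # inhaled concentration, mg/L
        exposure = solve_ivp(rhs, (0.0, 20.0), [0.0, 0.0, 0.0], args=(c_exp,))
        washout = solve_ivp(rhs, (20.0, 60.0), exposure.y[:, -1], args=(0.0,))
        print("tissue amounts after washout (mg):", washout.y[:, -1])

    In the study itself, a forward model of this kind sits inside a Bayesian analysis, so that subject-specific parameters such as the metabolic rate constant are estimated jointly from the BD and isoflurane exhaled-air measurements.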

    Toxicity Testing in the 21st Century: A Vision and a Strategy

    With the 2007 release of the landmark report Toxicity Testing in the 21st Century: A Vision and a Strategy, the U.S. National Academy of Sciences precipitated a major change in the way toxicity testing is conducted. The report envisions increased efficiency in toxicity testing and decreased animal use by transitioning from current expensive and lengthy in vivo testing with qualitative endpoints to in vitro toxicity-pathway assays on human cells or cell lines, using robotic high-throughput screening with mechanistic quantitative parameters. Risk assessment in the exposed human population would focus on avoiding significant perturbations of these toxicity pathways. Computational systems-biology models would be implemented to determine dose-response models for perturbations of pathway function. Extrapolation of in vitro results to in vivo human blood and tissue concentrations would be based on pharmacokinetic models for the given exposure conditions. This practice would enhance the human relevance of test results and would allow broader coverage of test agents compared with traditional toxicological testing strategies. As all the tools necessary to implement the vision are currently available or in an advanced stage of development, the key prerequisites to achieving this paradigm shift are a commitment to change in the scientific community, which could be facilitated by a broad discussion of the vision, and the resources needed to enhance current knowledge of pathway perturbations and pathway assays in humans and to implement computational systems-biology models. Implementation of these strategies would result in a new toxicity testing paradigm firmly based on human biology.
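
    The quantitative core of the vision is fitting dose-response models to pathway-perturbation data from high-throughput in vitro assays. As a minimal sketch, assuming synthetic assay data and a standard Hill model (the report does not prescribe a particular functional form), the Python code below fits such a curve and reads off a low-perturbation concentration.

        # Minimal sketch: fit a Hill dose-response model to hypothetical
        # pathway-perturbation data from a high-throughput assay.
        # All data values below are invented for illustration.
        import numpy as np
        from scipy.optimize import curve_fit

        def hill(dose, top, ec50, n):
            """Fractional pathway response as a function of concentration."""
            return top * dose**n / (ec50**n + dose**n)

        dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])  # uM
        resp = np.array([0.02, 0.04, 0.10, 0.22, 0.45, 0.70, 0.85, 0.92])

        (top, ec50, n), _ = curve_fit(hill, dose, resp, p0=[1.0, 1.0, 1.0])
        print(f"top={top:.2f}, EC50={ec50:.2f} uM, Hill slope={n:.2f}")

        # A pathway-perturbation threshold can then be read off the fitted
        # curve, e.g. the concentration giving 10% of the maximal response:
        # top*d^n/(ec50^n + d^n) = 0.1*top  =>  d = ec50*(0.1/0.9)^(1/n)
        c10 = ec50 * (0.1 / 0.9) ** (1.0 / n)
        print(f"concentration at 10% of maximal response: {c10:.3f} uM")

    Avoiding "significant perturbations" of a pathway then becomes a matter of keeping extrapolated human tissue concentrations below a threshold of this kind.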

    Risk Assessment

    Risk assessment for metallic substances usually follows the generally accepted framework for risk assessment of toxic substances, which involves (1) exposure assessment, (2) hazard identification, (3) assessment of dose-response relationships, and (4) risk characterization. The importance of risk communication is also addressed. Risk assessment and risk communication are of particular relevance for metals and metalloids because all living organisms are exposed to these elements. Lead, cadmium, mercury, and the metalloid arsenic have been responsible for many human poisonings and even deaths. It is hence imperative that readers of this Handbook have a firm perspective on the exposure levels of metallic substances that produce adverse health effects and on the various risk assessment approaches that have been used and are evolving to protect the health and well-being of living organisms. Because of the increasing use of nanomaterials, a recent concern is the dose metric for inhaled metallic nanoparticles. Regardless of exposure route, the following risk assessment considerations are important: biomonitoring approaches; identification of the mode of action for toxicity of metallic species for hazard identification; determination of dose-effect relationships; the construction of dose-response curves; and the development of benchmark doses for various metallic species, which are discussed in relation to protecting sensitive subpopulations, because not all individuals within a general population are at equal risk of toxicity. Risk characterization using modern biomarkers capable of detecting early cellular effects of low-dose exposures to metallic substances will play an increasingly important role in assessing risk from this class of toxic substances on an individual or mixture basis. The issue of metal- and metalloid-induced carcinogenesis is of ever-increasing importance because many of the elements associated with this cellular outcome produce a number of early cellular effects, including the formation of reactive oxygen species, modification of apoptosis, and methylation of DNA. Finally, risk communication and risk management are of great importance because they are critical to addressing the health concerns of exposed populations and the practical, ethical, and financial issues related to reducing hazardous exposures to metallic substances.
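
    Of the steps listed above, the derivation of benchmark doses is the easiest to illustrate computationally. The sketch below assumes invented quantal bioassay data and a log-logistic model of our own choosing; it fits the dose-response curve by maximum likelihood and solves for the dose giving 10% extra risk (BMD10). Regulatory practice would also report a lower confidence limit (the BMDL), which is omitted here.

        # Minimal sketch: benchmark dose (BMD) from hypothetical quantal
        # bioassay data, using a background + log-logistic model.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit  # logistic function

        dose = np.array([0.0, 1.0, 3.0, 10.0, 30.0])   # mg/kg-day
        n = np.array([50, 50, 50, 50, 50])             # animals per group
        k = np.array([2, 4, 9, 21, 38])                # responders

        def p_response(d, bg, a, b):
            """p(d) = bg + (1 - bg) * F(d), with F log-logistic in dose."""
            logd = np.log(np.maximum(d, 1e-12))        # guard log(0) at control
            f = np.where(d > 0, expit(a + b * logd), 0.0)
            return bg + (1.0 - bg) * f

        def neg_log_lik(theta):
            bg, a, b = expit(theta[0]), theta[1], theta[2]  # keep bg in (0, 1)
            p = np.clip(p_response(dose, bg, a, b), 1e-9, 1.0 - 1e-9)
            return -np.sum(k * np.log(p) + (n - k) * np.log(1.0 - p))

        fit = minimize(neg_log_lik, x0=[-3.0, -2.0, 1.0], method="Nelder-Mead")
        bg, a, b = expit(fit.x[0]), fit.x[1], fit.x[2]

        # BMD at 10% extra risk: F(d) = 0.10  =>  a + b*ln(d) = logit(0.10)
        bmr = 0.10
        bmd = np.exp((np.log(bmr / (1.0 - bmr)) - a) / b)
        print(f"background={bg:.3f}, BMD10={bmd:.2f} mg/kg-day")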