
    Determining the Importance of Macro and Trace Dietary Minerals on Growth and Nutrient Retention in Juvenile Penaeus monodon

    Twelve minerals were screened to identify the dietary minerals most important for Penaeus monodon growth. The minerals selected were boron, calcium plus phosphorus (assessed in tandem at a 1:1 ratio), cobalt, copper, magnesium, manganese, potassium, selenium, sodium, strontium and zinc. Twelve purified casein/gelatin-based diets were formulated to contain these minerals at two levels: below expected requirements, as supplied by the basal formulation (-), and above expected requirements, by adding inorganic minerals (+). The two levels were allocated to dietary treatments for juvenile prawns according to a Plackett-Burman (PB) design. A two-level screening analysis was employed in which the effect of each mineral at the - or + level across the twelve diets was compared, to identify the minerals most important for culture performance of juvenile prawns. Calcium plus phosphorus (at a 1:1 ratio), magnesium, boron, manganese, selenium and zinc produced the greatest positive effects on weight gain, feed conversion efficiency, biomass gain and nutrient/energy retention. In particular, boron and manganese significantly increased retention of key macronutrients and energy, including gross energy, crude protein and crude lipid. Our study demonstrates the importance of several macro and trace minerals in prawn diets and the pressing need to refine their requirements for P. monodon.
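
    As an illustration of the screening approach, the sketch below builds the standard 12-run Plackett-Burman design and estimates each mineral's main effect as the difference between mean responses at its + and - levels. This is a minimal Python sketch, not the study's analysis: the factor ordering and the weight-gain values are made up for demonstration.

        import numpy as np

        # Standard 12-run Plackett-Burman generator row for up to 11 two-level factors.
        generator = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])

        # Runs 1-11 are cyclic shifts of the generator; run 12 sets every factor low (-).
        design = np.vstack([np.roll(generator, i) for i in range(11)]
                           + [-np.ones(11, dtype=int)])

        # Hypothetical assignment of the 11 factors (Ca+P screened as a single factor).
        factors = ["Ca+P", "Mg", "K", "Na", "Zn", "Mn", "Cu", "Se", "Co", "Sr", "B"]

        # Illustrative weight-gain responses, one per diet; not data from the study.
        rng = np.random.default_rng(42)
        weight_gain = rng.normal(loc=10.0, scale=1.0, size=12)

        # Main effect of a factor = mean response at + minus mean response at -.
        for name, levels in zip(factors, design.T):
            effect = weight_gain[levels == 1].mean() - weight_gain[levels == -1].mean()
            print(f"{name:>4}: {effect:+.2f}")

    In a two-level screen like this, the factors with the largest estimated effects are the candidates carried forward into formal dose-response requirement studies.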

    Mineral nutrition in penaeid shrimp

    This review summarises the current knowledge of mineral nutrition for penaeid shrimp. It examines how the aquatic environment and the lifecycle of shrimp affect requirements, and the role that minerals play in shrimp health. Methods of supplying minerals via either the water or the feed, and novel ways of supplementing minerals within feed, are discussed. The requirements for individual minerals are summarised, with recommendations for minimum levels of dietary inclusion for semi-intensive and intensive commercial shrimp culture presented where data permit. Estimates of dietary requirement remain broad for most minerals for the main shrimp production species (Penaeus vannamei, Penaeus monodon and Penaeus japonicus), and some essential minerals remain unstudied (Table 2 in Section 5.10). Mineral nutrition will become more important as intensification and diversification of production systems present new challenges to shrimp aquaculture.

    Building a framework for process-oriented evaluation of Regional Climate Outlook Forums

    In many regions around the world, Regional Climate Outlook Forums (RCOFs) provide seasonal climate information and forecasts to decision-makers at regional and national levels. Despite two decades of experience, the forums have not been systematically monitored or evaluated. To address this gap, and to better inform nascent and widespread efforts in climate services, the authors propose a process-oriented evaluation framework derived from the literature on decision support and climate communication around the production and use of scientific information. The authors apply this framework to a case study of the Caribbean RCOF (CariCOF), where they have been engaged in a collaborative effort to integrate climate information and decision processes to enhance regional climate resilience. The authors' examination of the CariCOF shows an evolution toward the use of more advanced and more diverse climate products, as well as greater awareness of user feedback. It also reveals shortfalls of the CariCOF, including a lack of diverse stakeholder participation, a need for better understanding of best practices to tailor information, undeveloped market research of climate products, insufficient experimentation and vetting of communication mechanisms, and the absence of a way to steward a diverse network of regional actors. The authors' analysis also provided insights that allowed for improvements to the climate services framework, including mechanisms to respond to changing needs and conditions. The authors' process-oriented framework can serve as a starting point for evaluating RCOFs and other organizations charged with the provision of climate services.

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with the shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
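
    The trial's computer-based 1:1:1 assignment can be illustrated with permuted blocks, a common way to keep group sizes balanced during recruitment. The sketch below is an assumption about implementation, not the actual INTERVAL algorithm; the group labels follow the men's arms and the block size and seed are arbitrary.

        import random

        def permuted_block_assign(n, groups=("12-week", "10-week", "8-week"),
                                  block_size=6, seed=2012):
            # Each block contains every group equally often and is then shuffled,
            # so the allocation stays 1:1:1 even if recruitment stops early.
            assert block_size % len(groups) == 0
            rng = random.Random(seed)
            out = []
            while len(out) < n:
                block = list(groups) * (block_size // len(groups))
                rng.shuffle(block)
                out.extend(block)
            return out[:n]

        # Example: allocate 30 male donors across the three inter-donation intervals.
        alloc = permuted_block_assign(30)
        print({g: alloc.count(g) for g in ("12-week", "10-week", "8-week")})

    Permuted blocks guarantee near-equal group sizes at any interim point, which matters for a trial recruiting donors over a two-year window.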

    The state of the Martian climate

    The average annual surface air temperature (SAT) anomaly for 2016 for land stations north of 60°N was +2.0°C, relative to the 1981–2010 average value (Fig. 5.1). This marks a new high for the record starting in 1900, and is a significant increase over the previous highest value of +1.2°C, which was observed in 2007, 2011, and 2015. Average global annual temperatures also showed record values in 2015 and 2016. Currently, the Arctic is warming at more than twice the rate of lower latitudes.
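
    The headline figure is an anomaly: an annual value minus the mean over a fixed 1981–2010 reference period. A minimal sketch of that calculation, using synthetic temperatures rather than observations:

        import numpy as np

        # Illustrative annual mean SAT series for land stations north of 60N;
        # the values are synthetic, not observations.
        years = np.arange(1900, 2017)
        rng = np.random.default_rng(1)
        sat = 0.01 * (years - 1900) + rng.normal(0.0, 0.3, size=years.size)

        # Anomaly = annual value minus the 1981-2010 reference-period mean.
        baseline = sat[(years >= 1981) & (years <= 2010)].mean()
        anomalies = sat - baseline

        # "New high for the record" = the latest anomaly exceeds all earlier ones.
        print(f"2016 anomaly: {anomalies[-1]:+.2f} C, "
              f"record: {anomalies[-1] > anomalies[:-1].max()}")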

    Longer-term efficiency and safety of increasing the frequency of whole blood donation (INTERVAL): extension study of a randomised trial of 20 757 blood donors

    Background: The INTERVAL trial showed that, over a 2-year period, inter-donation intervals for whole blood donation can be safely reduced to meet blood shortages. We extended the INTERVAL trial for a further 2 years to evaluate the longer-term risks and benefits of varying inter-donation intervals, and to compare routine versus more intensive reminders to help donors keep appointments. Methods: The INTERVAL trial was a parallel group, pragmatic, randomised trial that recruited blood donors aged 18 years or older from 25 static donor centres of NHS Blood and Transplant across England, UK. Here we report on the prespecified analyses after 4 years of follow-up. Participants were whole blood donors who agreed to continue trial participation on their originally allocated inter-donation intervals (men: 12, 10, and 8 weeks; women: 16, 14, and 12 weeks). They were further block-randomised (1:1) to routine versus more intensive reminders using computer-generated random sequences. The prespecified primary outcome was units of blood collected per year analysed in the intention-to-treat population. Secondary outcomes related to safety were quality of life, self-reported symptoms potentially related to donation, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin and other factors. This trial is registered with ISRCTN, number ISRCTN24760606, and has completed. Findings: Between Oct 19, 2014, and May 3, 2016, 20 757 of the 38 035 invited blood donors (10 843 [58%] men, 9914 [51%] women) participated in the extension study. 10 378 (50%) were randomly assigned to routine reminders and 10 379 (50%) were randomly assigned to more intensive reminders. Median follow-up was 1·1 years (IQR 0·7–1·3). Compared with routine reminders, more intensive reminders increased blood collection by a mean of 0·11 units per year (95% CI 0·04–0·17; p=0·0003) in men and 0·06 units per year (0·01–0·11; p=0·0094) in women. During the extension study, each week shorter inter-donation interval increased blood collection by a mean of 0·23 units per year (0·21–0·25) in men and 0·14 units per year (0·12–0·15) in women (both p<0·0001). More frequent donation resulted in more deferrals for low haemoglobin (odds ratio per week shorter inter-donation interval 1·19 [95% CI 1·15–1·22] in men and 1·10 [1·06–1·14] in women), and lower mean haemoglobin (difference per week shorter inter-donation interval −0·84 g/L [95% CI −0·99 to −0·70] in men and −0·45 g/L [–0·59 to −0·31] in women) and ferritin concentrations (percentage difference per week shorter inter-donation interval −6·5% [95% CI −7·6 to −5·5] in men and −5·3% [–6·5 to −4·2] in women; all p<0·0001). No differences were observed in quality of life, serious adverse events, or self-reported symptoms (p>0·0001 for tests of linear trend by inter-donation intervals) other than a higher reported frequency of doctor-diagnosed low iron concentrations and prescription of iron supplements in men (p<0·0001). Interpretation: During a period of up to 4 years, shorter inter-donation intervals and more intensive reminders resulted in more blood being collected without a detectable effect on donors' mental and physical wellbeing. However, donors had decreased haemoglobin concentrations and more self-reported symptoms compared with the initial 2 years of the trial.
Our findings suggest that blood collection services could safely use shorter donation intervals and more intensive reminders to meet shortages, for donors who maintain adequate haemoglobin concentrations and iron stores. Funding: NHS Blood and Transplant, UK National Institute for Health Research, UK Medical Research Council, and British Heart Foundation
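
    Effects reported "per week shorter inter-donation interval" are linear trends across the allocated intervals. The sketch below shows the idea on made-up per-donor data; it is a simple ordinary least-squares illustration, not the trial's actual statistical model.

        import numpy as np

        # Made-up per-donor data: allocated interval (weeks) and units/year collected.
        interval = np.array([12]*5 + [10]*5 + [8]*5)
        units = np.array([3.4, 3.1, 3.6, 3.2, 3.5,
                          3.9, 3.7, 4.0, 3.8, 3.6,
                          4.3, 4.5, 4.2, 4.6, 4.4])

        # OLS slope of units/year on interval; negating it gives the mean gain in
        # units per year for each week SHORTER inter-donation interval.
        slope, intercept = np.polyfit(interval, units, deg=1)
        print(f"{-slope:.2f} units per year per week shorter interval")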

    State of the climate in 2018

    In 2018, the dominant greenhouse gases released into Earth’s atmosphere—carbon dioxide, methane, and nitrous oxide—continued their increase. The annual global average carbon dioxide concentration at Earth’s surface was 407.4 ± 0.1 ppm, the highest in the modern instrumental record and in ice core records dating back 800 000 years. Combined, greenhouse gases and several halogenated gases contribute just over 3 W m−2 to radiative forcing and represent a nearly 43% increase since 1990. Carbon dioxide is responsible for about 65% of this radiative forcing. With a weak La Niña in early 2018 transitioning to a weak El Niño by the year’s end, the global surface (land and ocean) temperature was the fourth highest on record, with only 2015 through 2017 being warmer. Several European countries reported record high annual temperatures. There were also more high, and fewer low, temperature extremes than in nearly all of the 68-year extremes record. Madagascar recorded a record daily temperature of 40.5°C in Morondava in March, while South Korea set its record high of 41.0°C in August in Hongcheon. Nawabshah, Pakistan, recorded its highest temperature of 50.2°C, which may be a new daily world record for April. Globally, the annual lower troposphere temperature was third to seventh highest, depending on the dataset analyzed. The lower stratospheric temperature was approximately fifth lowest. The 2018 Arctic land surface temperature was 1.2°C above the 1981–2010 average, tying for third highest in the 118-year record, following 2016 and 2017. June’s Arctic snow cover extent was almost half of what it was 35 years ago. Across Greenland, however, regional summer temperatures were generally below or near average. Additionally, a satellite survey of 47 glaciers in Greenland indicated a net increase in area for the first time since records began in 1999. Increasing permafrost temperatures were reported at most observation sites in the Arctic, with the overall increase of 0.1°–0.2°C between 2017 and 2018 being comparable to the highest rate of warming ever observed in the region. On 17 March, Arctic sea ice extent marked the second smallest annual maximum in the 38-year record, larger than only 2017. The minimum extent in 2018 was reached on 19 September and again on 23 September, tying 2008 and 2010 for the sixth lowest extent on record. The 23 September date tied 1997 as the latest sea ice minimum date on record. First-year ice now dominates the ice cover, comprising 77% of the March 2018 ice pack compared to 55% during the 1980s. Because thinner, younger ice is more vulnerable to melting out in summer, this shift in sea ice age has contributed to the decreasing trend in minimum ice extent. Regionally, Bering Sea ice extent was at record lows for almost the entire 2017/18 ice season. For the Antarctic continent as a whole, 2018 was warmer than average. On the highest points of the Antarctic Plateau, the automatic weather station Relay (74°S) broke or tied six monthly temperature records throughout the year, with August breaking its record by nearly 8°C. However, cool conditions in the western Bellingshausen Sea and Amundsen Sea sector contributed to a low melt season overall for 2017/18. High SSTs contributed to low summer sea ice extent in the Ross and Weddell Seas in 2018, underpinning the second lowest Antarctic summer minimum sea ice extent on record. 
Despite conducive conditions for its formation, the ozone hole at its maximum extent in September was near the 2000–18 mean, likely due to an ongoing slow decline in stratospheric chlorine monoxide concentration. Across the oceans, globally averaged SST decreased slightly since the record El Niño year of 2016 but was still far above the climatological mean. On average, SST is increasing at a rate of 0.10° ± 0.01°C decade−1 since 1950. The warming appeared largest in the tropical Indian Ocean and smallest in the North Pacific. The deeper ocean continues to warm year after year. For the seventh consecutive year, global annual mean sea level became the highest in the 26-year record, rising to 81 mm above the 1993 average. As anticipated in a warming climate, the hydrological cycle over the ocean is accelerating: dry regions are becoming drier and wet regions rainier. Closer to the equator, 95 named tropical storms were observed during 2018, well above the 1981–2010 average of 82. Eleven tropical cyclones reached Saffir–Simpson scale Category 5 intensity. North Atlantic Major Hurricane Michael’s landfall intensity of 140 kt was the fourth strongest for any continental U.S. hurricane landfall in the 168-year record. Michael caused more than 30 fatalities and $25 billion (U.S. dollars) in damages. In the western North Pacific, Super Typhoon Mangkhut led to 160 fatalities and $6 billion (U.S. dollars) in damages across the Philippines, Hong Kong, Macau, mainland China, Guam, and the Northern Mariana Islands. Tropical Storm Son-Tinh was responsible for 170 fatalities in Vietnam and Laos. Nearly all the islands of Micronesia experienced at least moderate impacts from various tropical cyclones. Across land, many areas around the globe received copious precipitation, notable at different time scales. Rodrigues and Réunion Island near southern Africa each reported their third wettest year on record. In Hawaii, 1262 mm precipitation at Waipā Gardens (Kauai) on 14–15 April set a new U.S. record for 24-h precipitation. In Brazil, the city of Belo Horizonte received nearly 75 mm of rain in just 20 minutes, nearly half its monthly average. Globally, fire activity during 2018 was the lowest since the start of the record in 1997, with a combined burned area of about 500 million hectares. This reinforced the long-term downward trend in fire emissions driven by changes in land use in frequently burning savannas. However, wildfires burned 3.5 million hectares across the United States, well above the 2000–10 average of 2.7 million hectares. Combined, U.S. wildfire damages for the 2017 and 2018 wildfire seasons exceeded $40 billion (U.S. dollars).
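
    A rate such as "0.10° ± 0.01°C decade−1 since 1950" is typically an ordinary least-squares trend scaled to decades, with the uncertainty taken from the standard error of the fitted slope. A minimal sketch on synthetic data:

        import numpy as np

        # Synthetic globally averaged annual SST anomalies since 1950 (illustrative).
        years = np.arange(1950, 2019)
        rng = np.random.default_rng(7)
        sst = 0.010 * (years - 1950) + rng.normal(0.0, 0.05, size=years.size)

        # Least-squares slope in degC/year and its standard error, scaled to decades.
        coeffs, cov = np.polyfit(years, sst, deg=1, cov=True)
        trend, err = coeffs[0] * 10, np.sqrt(cov[0, 0]) * 10
        print(f"trend: {trend:.2f} +/- {err:.2f} C per decade")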

    Effect of anti-interleukin drugs in patients with COVID-19 and signs of cytokine release syndrome (COV-AID): a factorial, randomised, controlled trial.

    BACKGROUND: Infections with SARS-CoV-2 continue to cause significant morbidity and mortality. Interleukin (IL)-1 and IL-6 blockade have been proposed as therapeutic strategies in COVID-19, but study outcomes have been conflicting. We sought to study whether blockade of the IL-6 or IL-1 pathway shortened the time to clinical improvement in patients with COVID-19, hypoxic respiratory failure, and signs of systemic cytokine release syndrome. METHODS: We did a prospective, multicentre, open-label, randomised, controlled trial in hospitalised patients with COVID-19, hypoxia, and signs of a cytokine release syndrome across 16 hospitals in Belgium. Eligible patients had a proven diagnosis of COVID-19 with symptoms for between 6 and 16 days, a ratio of the partial pressure of oxygen to the fraction of inspired oxygen (PaO2:FiO2) of less than 350 mm Hg on room air or less than 280 mm Hg on supplemental oxygen, and signs of a cytokine release syndrome in their serum (either a single ferritin measurement of more than 2000 μg/L and immediately requiring high-flow oxygen or mechanical ventilation; or a ferritin concentration of more than 1000 μg/L that had been increasing over the previous 24 h; or lymphopenia below 800 cells per μL with two of the following criteria: an increasing ferritin concentration of more than 700 μg/L, an increasing lactate dehydrogenase concentration of more than 300 international units per L, an increasing C-reactive protein concentration of more than 70 mg/L, or an increasing D-dimer concentration of more than 1000 ng/mL). The COV-AID trial has a 2 × 2 factorial design to evaluate IL-1 blockade versus no IL-1 blockade and IL-6 blockade versus no IL-6 blockade. Patients were randomly assigned by means of permuted block randomisation with varying block size and stratification by centre. In a first randomisation, patients were assigned to receive subcutaneous anakinra once daily (100 mg) for 28 days or until discharge, or to receive no IL-1 blockade (1:2). In a second randomisation step, patients were allocated to receive a single dose of siltuximab (11 mg/kg) intravenously, or a single dose of tocilizumab (8 mg/kg) intravenously, or to receive no IL-6 blockade (1:1:1). The primary outcome was the time to clinical improvement, defined as the time from randomisation to either an increase of at least two points on a 6-category ordinal scale or discharge from hospital alive. The primary and supportive efficacy endpoints were assessed in the intention-to-treat population. Safety was assessed in the safety population. This study is registered with ClinicalTrials.gov (NCT04330638) and EudraCT (2020-001500-41) and is complete. FINDINGS: Between April 4 and Dec 6, 2020, 342 patients were randomly assigned to IL-1 blockade (n=112) or no IL-1 blockade (n=230) and simultaneously randomly assigned to IL-6 blockade (n=227; 114 for tocilizumab and 113 for siltuximab) or no IL-6 blockade (n=115). Most patients were male (265 [77%] of 342), median age was 65 years (IQR 54-73), and the median Sequential Organ Failure Assessment (SOFA) score at randomisation was 3 (2-4). All 342 patients were included in the primary intention-to-treat analysis. The estimated median time to clinical improvement was 12 days (95% CI 10-16) in the IL-1 blockade group versus 12 days (10-15) in the no IL-1 blockade group (hazard ratio [HR] 0·94 [95% CI 0·73-1·21]).
For the IL-6 blockade group, the estimated median time to clinical improvement was 11 days (95% CI 10-16) versus 12 days (11-16) in the no IL-6 blockade group (HR 1·00 [0·78-1·29]). 55 patients died during the study, but no evidence for differences in mortality between treatment groups was found. The incidence of serious adverse events and serious infections was similar across study groups. INTERPRETATION: Drugs targeting IL-1 or IL-6 did not shorten the time to clinical improvement in this sample of patients with COVID-19, hypoxic respiratory failure, low SOFA score, and low baseline mortality risk. FUNDING: Belgian Health Care Knowledge Center and VIB Grand Challenges program
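
    The serum criteria for a cytokine release syndrome read more clearly as explicit logic. Below is a minimal Python sketch of the three qualifying routes as stated in the abstract; the parameter names are hypothetical, and each "increasing X above Y" criterion is reduced to a boolean the caller supplies.

        def cytokine_release_syndrome(ferritin,             # single measurement, ug/L
                                      ferritin_rising_24h,  # rose over previous 24 h
                                      needs_hfo2_or_mv,     # high-flow O2 or ventilation
                                      lymphocytes,          # cells per uL
                                      rising_ferritin_gt_700,
                                      rising_ldh_gt_300,
                                      rising_crp_gt_70,
                                      rising_ddimer_gt_1000):
            # Route 1: very high ferritin plus immediate respiratory support.
            if ferritin > 2000 and needs_hfo2_or_mv:
                return True
            # Route 2: high ferritin that has been increasing over the previous 24 h.
            if ferritin > 1000 and ferritin_rising_24h:
                return True
            # Route 3: lymphopenia plus at least two of four rising markers.
            minor = (rising_ferritin_gt_700, rising_ldh_gt_300,
                     rising_crp_gt_70, rising_ddimer_gt_1000)
            return lymphocytes < 800 and sum(minor) >= 2

        # Example: route 2 applies (ferritin 1200 ug/L and rising), so this prints True.
        print(cytokine_release_syndrome(1200, True, False, 900,
                                        False, False, False, False))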