
    Analysis of a niche market for farm-raised black sea bass Centropristis striata in North Carolina

    A demand analysis for farm-raised black sea bass (BSB) was conducted in the upscale niche restaurant market of North Carolina (NC) via field surveys of restaurants drawn at random from the population of all NC restaurants. The analysis determines the effects of niche market variables on BSB quantity demanded at the individual restaurant level. Sample results were extrapolated to the full population of NC restaurants to estimate statewide niche market demand for farm-raised BSB. Results indicate that 15.9 percent of sampled restaurants meet the predetermined niche market criteria, producing a statewide NC niche market size estimate of 3,279 restaurants. Most (88 percent) surveyed restaurants serve a suburban rather than tourist or urban/professional clientele. Surveyed niche market restaurant chefs prefer fresh, chilled fish products (88 percent) of moderate fat content (41 percent). Beyond taste and appearance, chefs identified freshness, continuous availability, and fish size as the most important product attributes. Few (7 percent) niche market restaurants currently purchase BSB, but most (76 percent) reported that they would purchase farm-raised BSB similar to those evaluated in the survey if available at a price similar to that of substitute species such as grouper. Some (14 percent) reported problems with ocean-caught BSB availability. A majority (66 percent) had no preference for ocean-caught over farm-raised BSB products. Regression analyses showed that higher prices for substitute species and higher dinner entrée prices increase BSB demand, whereas higher BSB prices decrease it. The only significant seasonal effect was moderately lower demand in winter; effects of geographical location were not significant. For a likely example scenario, estimated NC statewide BSB niche market demand was 179,077 kg (394,798 lb) per year. A potential industry limitation is NC chefs' preference for whole-weight fish products exceeding 908 g (2.0 lb).
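As a rough consistency check, the statewide figures above imply an average per-restaurant demand. This derived number is not reported in the abstract; the sketch below simply divides the two published estimates:

```python
# Back-of-envelope check on the statewide extrapolation (figures taken from
# the abstract; the per-restaurant demand is derived here, not reported in
# the study itself).
niche_restaurants = 3279      # estimated statewide niche market size
annual_demand_kg = 179_077    # estimated statewide demand, kg per year

kg_per_restaurant = annual_demand_kg / niche_restaurants
print(round(kg_per_restaurant, 1))  # roughly 54.6 kg/year per restaurant
```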

    Clustering approaches to improve the performance of low cost air pollution sensors

    Low-cost air pollution sensors have substantial potential for atmospheric research and for the applied control of pollution in the urban environment, including more localized warnings to the public. The current generation of single-chemical gas sensors experiences interference from co-pollutants and is sensitive to environmental factors such as temperature, wind speed and supply voltage. Uncertainties are also introduced by sensor-to-sensor response variability, although this is less well reported. The sensitivity of Metal Oxide Sensors (MOS) to volatile organic compounds (VOCs) changed with relative humidity (RH) by up to a factor of five over the range 19-90% RH, with an uncertainty in the correction of a factor of two at any given RH. The short-term (second-to-minute) stabilities of MOS and electrochemical CO sensor responses were reasonable. During more extended use, inter-sensor quantitative comparability was degraded by unpredictable variability in individual sensor responses (to the measurand, interferences, or both) drifting over timescales of several hours to days. For timescales longer than a week, identical sensors showed slow, often downward, drifts in their responses, which diverged across six CO sensors by up to 30% after two weeks. The measurement derived from the median sensor within clusters of 6, 8 and up to 21 sensors was evaluated against individual sensor performance and external reference values. The clustered approach maintained the cost competitiveness of a sensor device, but the median concentration from the ensemble of sensor signals largely eliminated the randomised hour-to-day response drift seen in individual sensors and excluded the effects of the small number of poorly performing sensors that drifted significantly over longer time periods. The results demonstrate that for individual sensors to be optimally comparable to one another, and to reference instruments, they would likely require frequent calibration. The use of a cluster median value eliminates unpredictable medium-term response changes and other longer-term outlier behaviours, extending the likely period needed between calibrations and making linear interpolation between calibrations more appropriate. Through the use of sensor clusters rather than individual sensors, existing low-cost technologies could deliver significantly improved quality of observations.
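The cluster-median idea can be sketched in a few lines. The readings below are invented for illustration; the point is that the median is robust to individual drifted sensors in a way that a simple mean is not:

```python
import statistics

def cluster_median(readings):
    """Median reading from a cluster of co-located sensors.

    The median suppresses the influence of individual sensors whose
    responses have drifted, unlike a simple mean.
    """
    if not readings:
        raise ValueError("no sensor readings supplied")
    return statistics.median(readings)

# Six co-located CO sensors (ppm, invented values); one sensor has drifted
# high and another low, but the median is barely affected.
readings_ppm = [0.41, 0.43, 0.40, 0.44, 0.58, 0.29]
print(round(cluster_median(readings_ppm), 2))  # 0.42
```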

    UK surface NO2 levels dropped by 42% during the COVID-19 lockdown: Impact on surface O3

    We report changes in surface nitrogen dioxide (NO2) across the UK during the COVID-19 pandemic, when large and rapid emission reductions accompanied a nationwide lockdown (23 March-31 May 2020, inclusive), and compare them with values from the equivalent period over the previous 5 years. Data are from the Automatic Urban and Rural Network (AURN), which forms the basis of checking nationwide compliance with ambient air quality directives. We calculate that NO2 reduced by 42% ± 9.8% on average across all 126 urban AURN sites, with a slightly larger (48% ± 9.5%) reduction at sites close to the roadside (urban traffic). We also find that ozone (O3) increased by 11% on average across the urban background network during the lockdown period. Total oxidant levels (Ox = NO2 + O3) increased only slightly on average (3.2% ± 0.2%), suggesting the majority of this change can be attributed to photochemical repartitioning due to the reduction in NOx. Generally, we find larger, positive Ox changes in southern UK cities, which we attribute to increased UV radiation and temperature in 2020 compared with previous years. The net effect of the NO2 and O3 changes is a sharp decrease in exceedances of the NO2 air quality objective limit for the UK, with only one exceedance in London in 2020 up until the end of May. Concurrent increases in O3 exceedances in London emphasize the potential for O3 to become an air pollutant of concern as NOx emissions are reduced in the next 10-20 years.
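The repartitioning argument rests on the total oxidant Ox = NO2 + O3 being roughly conserved: if Ox barely changes while NO2 falls, O3 must rise. A minimal sketch, with hypothetical concentrations chosen so that Ox is nearly unchanged (the real study averages over many sites):

```python
def ox_change_percent(no2_before, o3_before, no2_after, o3_after):
    """Percentage change in total oxidant Ox = NO2 + O3 (same units, e.g. ppb)."""
    ox_before = no2_before + o3_before
    ox_after = no2_after + o3_after
    return 100.0 * (ox_after - ox_before) / ox_before

# Hypothetical urban-background values (ppb): NO2 falls sharply while O3
# rises, leaving Ox nearly unchanged -- consistent with photochemical
# repartitioning rather than a change in total oxidant.
print(round(ox_change_percent(25.0, 25.0, 14.5, 37.1), 1))  # 3.2
```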

    The impact on passenger car emissions associated with the promotion and demise of diesel fuel

    The promotion and growth in the use of diesel fuel in passenger cars in the UK and Europe over the past two decades led to considerable adverse air quality impacts in urban areas and more widely. In this work, we construct a multi-decade analysis of passenger car emissions in the UK based on real driving emissions data. An important part of the study is the use of extensive vehicle emission remote sensing data covering multiple measurement locations, time periods and environmental conditions, and consisting of over 600,000 measurements. These data are used to consider two scenarios: first, that diesel fuel use was not promoted in the early 2000s for climate mitigation reasons, and second, that there was not a dramatic decline in diesel fuel use following the Dieselgate scandal. The strong growth of diesel fuel use coincided with a time when diesel NOx emissions were high; conversely, the strong decrease in diesel fuel use coincided with a time when diesel vehicle after-treatment systems for NOx control were effective. We estimate that the growth in diesel car use in the UK resulted in excess NOx emissions of 721 kt over a three-decade period, equivalent to over 7 times total annual passenger car NOx emissions, greater than total UK NOx emissions of 681.8 kt in 2021, and with an associated damage cost of £5.875 billion. However, the sharp move away from diesel fuel post-Dieselgate reduced NOx emissions by only 41 kt, owing to the effectiveness of modern diesel after-treatment systems.
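The counterfactual comparison reduces to summing the year-by-year gap between the actual emission trajectory and a scenario trajectory. A toy sketch with invented annual values (the study itself derives trajectories from over 600,000 remote-sensing measurements):

```python
# Toy counterfactual comparison (annual NOx in kt; all values invented for
# illustration). Excess emissions are the cumulative gap between the actual
# trajectory and the no-diesel-promotion scenario.
actual         = [30.0, 32.0, 35.0, 33.0]  # with diesel promotion
counterfactual = [28.0, 27.0, 26.0, 25.0]  # without diesel promotion

excess_kt = sum(a - c for a, c in zip(actual, counterfactual))
print(excess_kt)  # 24.0
```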

    ENIGMA and global neuroscience: A decade of large-scale studies of the brain in health and disease across more than 40 countries

    This review summarizes the last decade of work by the ENIGMA (Enhancing NeuroImaging Genetics through Meta-Analysis) Consortium, a global alliance of over 1400 scientists across 43 countries studying the human brain in health and disease. Building on large-scale genetic studies that discovered the first robustly replicated genetic loci associated with brain metrics, ENIGMA has diversified into over 50 working groups (WGs), pooling worldwide data and expertise to answer fundamental questions in neuroscience, psychiatry, neurology, and genetics. Most ENIGMA WGs focus on specific psychiatric and neurological conditions; other WGs study normal variation due to sex and gender differences, or development and aging; still other WGs develop methodological pipelines and tools to facilitate harmonized analyses of "big data" (i.e., genetic and epigenetic data, multimodal MRI, and electroencephalography data). These international efforts have yielded the largest neuroimaging studies to date in schizophrenia, bipolar disorder, major depressive disorder, post-traumatic stress disorder, substance use disorders, obsessive-compulsive disorder, attention-deficit/hyperactivity disorder, autism spectrum disorders, epilepsy, and 22q11.2 deletion syndrome. More recent ENIGMA WGs have formed to study anxiety disorders, suicidal thoughts and behavior, sleep and insomnia, eating disorders, irritability, brain injury, antisocial personality and conduct disorder, and dissociative identity disorder. Here, we summarize the first decade of ENIGMA's activities and ongoing projects, and describe the successes and challenges encountered along the way. We highlight the advantages of collaborative large-scale coordinated data analyses for testing the reproducibility and robustness of findings, offering the opportunity to identify brain systems involved in clinical syndromes across diverse samples, together with associated genetic, environmental, demographic, cognitive, and psychosocial factors.

    The Decline in Vitamin Research Funding: A Missed Opportunity?

    Background: The National Nutrition Research Roadmap has called for support of greater collaborative, interdisciplinary research in multiple areas of nutrition research. However, a substantial reduction in federal funding makes responding to these calls challenging. Objectives: The objectives of this study were to examine temporal trends in research funding and to discuss the potential consequences of these trends. Methods: We searched the NIH RePORTER database to identify NIH research grants and USASpending to identify National Science Foundation and USDA research grants awarded from 1992 to 2015. We focused on those that pertained to vitamin research. For the years 2000 to 2015, we examined funding trends for different vitamins, including vitamins A, B (one-carbon B-vitamins were considered separately from other B-vitamins), C, D, E, and K. Results: From 1992 to 2015, total federal research spending increased from ~$14 to $45 billion (2016 US dollars). Although vitamin research spending increased from ~$89 to $95 million, the proportion of grants awarded for vitamin research declined by more than two-thirds, from 0.65% in 1992 to 0.2% in 2015. Federal agencies awarded 6035 vitamin research grants over the time period, with vitamin A associated with the most research projects per year on average (n = 115) and vitamin K the fewest (n = 8). Vitamin D research projects were associated with the greatest average yearly project value ($34.8 million). Conclusions: Vitamin research has faced a disproportionate decline in research funding from 1992 to 2015. Insufficient federal research funding streams risk stalling progress in vitamin research and leaving important advancements unrealized.

    Whole exome sequencing identifies genetic variants in inherited thrombocytopenia with secondary qualitative function defects

    Inherited thrombocytopenias are a heterogeneous group of disorders characterised by abnormally low platelet counts which can be associated with abnormal bleeding. Next-generation sequencing has previously been employed in these disorders for the confirmation of suspected genetic abnormalities, and more recently in the discovery of novel disease-causing genes. However, its full potential has not previously been utilised. Over the past 6 years we have sequenced the exomes of 55 patients, including 37 index cases and 18 additional family members, all of whom were recruited to the UK Genotyping and Phenotyping of Platelets study. All patients had inherited or sustained thrombocytopenia of unknown aetiology, with platelet counts ranging from 11 to 186 × 10⁹/L. Of the 51 patients phenotypically tested, 37 (73%) had an additional secondary qualitative platelet defect. Using whole exome sequencing analysis, we have identified "pathogenic" or "likely pathogenic" variants in 46% (17/37) of our index patients with thrombocytopenia. In addition, we report variants of uncertain significance in 12 index cases, including novel candidate genetic variants in previously unreported genes in four index cases. These results demonstrate that whole exome sequencing is an efficient method for elucidating potential pathogenic genetic variants in inherited thrombocytopenia. Whole exome sequencing also has the added benefit of discovering potentially pathogenic genetic variants for further study in novel genes not previously implicated in inherited thrombocytopenia.

    Beyond gene-disease validity: capturing structured data on inheritance, allelic requirement, disease-relevant variant classes, and disease mechanism for inherited cardiac conditions

    Background: As the availability of genomic testing grows, variant interpretation will increasingly be performed by genomic generalists rather than domain-specific experts. Demand is rising for laboratories to accurately classify variants in inherited cardiac condition (ICC) genes, including secondary findings. // Methods: We analyse evidence for inheritance patterns, allelic requirement, disease mechanism and disease-relevant variant classes for 65 ClinGen-curated ICC gene-disease pairs. We present this information for the first time in a structured dataset, CardiacG2P, and assess its application in genomic variant filtering. // Results: For 36/65 gene-disease pairs, loss of function is not an established disease mechanism, and protein-truncating variants are not known to be pathogenic. Using the CardiacG2P dataset as an initial variant filter allows for efficient variant prioritisation whilst maintaining a high sensitivity for retaining pathogenic variants compared with two other variant filtering approaches. // Conclusions: Access to evidence-based structured data representing disease mechanism and allelic requirement aids variant filtering and analysis and is a prerequisite for scalable genomic testing.
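A mechanism-aware first-pass filter of this kind can be sketched as a lookup from gene-disease pair to the variant classes with evidence of pathogenicity. The field names and records below are illustrative only, not the actual CardiacG2P schema; the MYH7/TTN examples reflect the general point that truncating variants are only disease-relevant where loss of function is an established mechanism:

```python
# Hypothetical G2P-style records: for each gene-disease pair, the variant
# classes with evidence of pathogenicity (illustrative, not the real
# CardiacG2P schema or its full curated content).
G2P = {
    ("MYH7", "hypertrophic cardiomyopathy"): {"missense"},
    ("TTN", "dilated cardiomyopathy"): {"protein_truncating", "missense"},
}

def passes_filter(gene, disease, variant_class):
    """Retain a variant only if its class is disease-relevant for the pair."""
    allowed = G2P.get((gene, disease), set())
    return variant_class in allowed

# A truncating variant is deprioritised for a pair where loss of function
# is not an established mechanism, but retained where it is.
print(passes_filter("MYH7", "hypertrophic cardiomyopathy", "protein_truncating"))  # False
print(passes_filter("TTN", "dilated cardiomyopathy", "protein_truncating"))        # True
```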

    Genetic association study of QT interval highlights role for calcium signaling pathways in myocardial repolarization.

    The QT interval, an electrocardiographic measure reflecting myocardial repolarization, is a heritable trait. QT prolongation is a risk factor for ventricular arrhythmias and sudden cardiac death (SCD) and could indicate the presence of the potentially lethal Mendelian long-QT syndrome (LQTS). Using a genome-wide association and replication study in up to 100,000 individuals, we identified 35 common variant loci associated with QT interval that collectively explain ∼8-10% of QT-interval variation and highlight the importance of calcium regulation in myocardial repolarization. Rare variant analysis of 6 new QT interval-associated loci in 298 unrelated probands with LQTS identified coding variants not found in controls but of uncertain causality, therefore requiring validation. Several newly identified loci encode proteins that physically interact with other recognized repolarization proteins. Our integration of common variant association, expression and orthogonal protein-protein interaction screens provides new insights into cardiac electrophysiology and identifies new candidate genes for ventricular arrhythmias, LQTS and SCD.