97 research outputs found

    An Update on Solid Grass Biomass Fuels in Vermont

    Get PDF
    This report documents recent testing involving the densification and combustion of solid grass biomass fuels in a small commercial boiler (342,100 BTU/hr output rating). Fuel briquettes (or “pucks”) were made from Switchgrass, Miscanthus, Reed Canary, Mulch Hay and “Ag Biomass” / Field Residue, as well as mixtures of these feedstocks with ground wood chips. Our findings were: 1. On-farm, small-scale densification of grass and agricultural biomass solid fuels via pucking is feasible, with a conversion (densification) cost of $49-148 per ton and a finished fuel cost in the range of $85-228 per ton ($5.2-14.4 per million BTU). 2. Sustained, reliable combustion of densified grass and agricultural biomass solid fuels in a light commercial boiler (EvoWorld HC100 Eco) is feasible, with 73-90% combustion efficiency and with no ash fusion or clinker development. Longer, sustained overnight runs did result in some combustion chamber clogging with ash and fuel residue, which may be resolved with further boiler tuning and clean-out cycle timing adjustment. 3. The test of the Ag Biomass / Field Residue fuel demonstrated feasibility at a current delivered price of $214 per ton ($13.2 per million BTU), supporting a potential payback period of 3.6 years on the boiler. Higher production volume projects a path to $85 per ton ($5.2 per million BTU) and a potential payback period of 2.4 years.
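The per-ton and per-million-BTU prices above can be sanity-checked with a simple unit conversion. The heating value used here (~16.3 million BTU per ton) is inferred from the report's own price pairs ($214/ton vs. $13.2/MMBTU), not stated in the abstract:

```python
# Back-of-envelope check of the report's fuel-cost figures.
# Heating value is an assumption inferred from the abstract's numbers.

def cost_per_mmbtu(price_per_ton, heating_value_mmbtu_per_ton=16.3):
    """Convert a delivered fuel price ($/ton) to $/million BTU."""
    return price_per_ton / heating_value_mmbtu_per_ton

print(round(cost_per_mmbtu(214), 1))  # close to the reported $13.2/MMBTU
print(round(cost_per_mmbtu(85), 1))   # close to the projected $5.2/MMBTU
```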

    Increasing Supply and Quality of Local Storage Vegetables

    Get PDF
    This project installed environmental monitoring equipment to improve storage conditions and ultimately the quality of 1,736 tons of winter storage crops at 9 farms throughout Vermont. The cumulative market value of these storage crops produced during the 2012-2014 growing seasons was $3.5 million. Improved storage monitoring led to better control of storage conditions, in part through automated notification to farmers when abnormal conditions were occurring. This allowed for prompt correction of problems such as open doors and failing or inoperative cooling equipment. Losses of storage crops (cull rates) were reduced from ~15% to ~5% of stored volume. Sixty-six energy efficiency measures were also implemented at 5 of these farms, saving a total of 40,269 kWh of electricity and $5,800 annually. The systems deployed have increased the confidence of growers to expand their winter storage of Vermont-grown vegetables, leading to an increased supply of local produce outside of the traditional growing and marketing season.
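The automated notification described above amounts to simple threshold alerting on sensor readings. A minimal sketch, with hypothetical storage thresholds not taken from the project:

```python
# Illustrative threshold alerting for storage monitoring.
# The sensor names and limits below are invented for demonstration.
STORAGE_LIMITS = {
    "temp_f": (33.0, 40.0),       # acceptable storage temperature range
    "humidity_pct": (90.0, 98.0), # acceptable relative humidity range
}

def check_reading(sensor, value):
    """Return an alert string if a reading is out of range, else None."""
    lo, hi = STORAGE_LIMITS[sensor]
    if not (lo <= value <= hi):
        return f"ALERT: {sensor}={value} outside {lo}-{hi}"
    return None

print(check_reading("temp_f", 47.5))  # e.g. an open door or failed cooler
```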

    Promoting Adoption of Biomass Fuels for Heating Vegetable Greenhouses in Vermont

    Get PDF
    In the Northeast, early and late season production of food crops using greenhouses requires the addition of heat to maintain temperature and also to control humidity. The heating fuel used is generally propane or other fossil fuels. The use of greenhouses, and of greenhouse heating, is on the increase in Vermont as growers respond to increased demand for local food throughout the year. Greenhouse production is also on the rise because it allows growers to protect against extreme weather events such as heavy rain or drought, and it affords better control of the growing environment, leading to improved yield and quality. However, using fossil fuels to control the growing environment is costly, and these fuels also contribute to greenhouse gas emissions. Vermont greenhouse growers produce $24.5 million in crops using 2.6 million square feet of growing area at an estimated annual heating cost of $1.8 million. Many of these growers are interested in alternatives to fossil fuels for heating in order to improve their profitability and/or reduce their environmental impact. This project demonstrated the use of biomass heating for greenhouse vegetable production at sites across Vermont. From 2008 through 2015, 25 growers received cost-share funds for greenhouse biomass heating systems. The total installed cost of these systems was $312,766; the average cost per system was $12,511 and the average cost-share (i.e. sponsor funding) on these projects was 44% of the total cost. The growers installed a variety of system types depending on desired fuel, heating load and method of heat distribution (hot air or hot water).
The project started in 2008 and the systems have operated for the equivalent of 96 growing seasons in total, with an average of 3.8 growing seasons per system, an average net fuel savings of $2,696 per system per year, and an average payback of 4.8 years (at full cost). From 2008 through 2015 a total of 15.3 billion BTU of biomass energy was provided to these greenhouses, equivalent to 167,000 gallons of propane. The cumulative equivalent carbon dioxide emissions avoided by this substitution of fuel is estimated to be 2.14 million pounds. This is roughly equivalent to the annual emissions from 204 cars, or 2.3 million miles of car travel.
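The avoided-emissions figure can be roughly reproduced from the propane equivalence. The CO2 factor used here (~12.7 lb per gallon of propane) is a standard combustion-emissions value assumed for illustration, not given in the abstract:

```python
# Rough check of the avoided-emissions estimate reported above.
propane_gallons = 167_000        # propane displaced, from the abstract
lb_co2_per_gallon = 12.7         # assumed standard propane CO2 factor

avoided_lb = propane_gallons * lb_co2_per_gallon
print(f"{avoided_lb / 1e6:.2f} million lb CO2")  # close to the reported 2.14
```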

    Addressing the unmet need for visualizing Conditional Random Fields in Biological Data

    Get PDF
    Background: The biological world is replete with phenomena that appear to be ideally modeled and analyzed by one archetypal statistical framework - the Graphical Probabilistic Model (GPM). The structure of GPMs is a uniquely good match for biological problems that range from aligning sequences to modeling the genome-to-phenome relationship. The fundamental questions that GPMs address involve making decisions based on a complex web of interacting factors. Unfortunately, while GPMs ideally fit many questions in biology, they are not an easy solution to apply. Building a GPM is not a simple task for an end user. Moreover, applying GPMs is also impeded by the insidious fact that the complex web of interacting factors inherent to a problem might be easy to define and also intractable to compute upon. Discussion: We propose that the visualization sciences can contribute to many domains of the bio-sciences, by developing tools to address archetypal representation and user interaction issues in GPMs, and in particular a variety of GPM called a Conditional Random Field (CRF). CRFs bring additional power, and additional complexity, because the CRF dependency network can be conditioned on the query data. Conclusions: In this manuscript we examine the shared features of several biological problems that are amenable to modeling with CRFs, highlight the challenges that existing visualization and visual analytics paradigms induce for these data, and document an experimental solution called StickWRLD which, while leaving room for improvement, has been successfully applied in several biological research projects.
Comment: BioVis 2014 conference

    The re-birth of the "beat": A hyperlocal online newsgathering model

    Get PDF
    This is an Author's Accepted Manuscript of an article published in Journalism Practice, 6(5-6), 754 - 765, 2012, copyright Taylor & Francis, available online at: http://www.tandfonline.com/10.1080/17512786.2012.667279. Scholars have long lamented the death of the 'beat' in news journalism. Today's journalists generate more copy than they used to, a deluge of PR releases often keeping them in the office, and away from their communities. Consolidation in the industry has dislodged some journalists from their local sources. Yet hyperlocal online activity is thriving if journalists have the time and inclination to engage with it. This paper proposes an exploratory, normative schema intended to help local journalists systematically map and monitor their own hyperlocal online communities and contacts, with the aim of re-establishing local news beats online as networks. This model is, in part, technologically-independent. It encompasses proactive and reactive news-gathering and forward planning approaches. A schema is proposed, developed upon suggested news-gathering frameworks from the literature. These experiences were distilled into an iterative, replicable schema for local journalism. This model was then used to map out two real-world 'beats' for local news-gathering. Journalists working within these local beats were invited to trial the models created. It is hoped that this research will empower journalists by improving their information auditing, and could help re-define journalists' relationship with their online audiences.

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    Get PDF
    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320--1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe a 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey.
The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.
Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie

    A Simple Standard for Sharing Ontological Mappings (SSSOM).

    Get PDF
    Despite progress in the development of standards for describing and exchanging scientific information, the lack of easy-to-use standards for mapping between different representations of the same or similar objects in different databases poses a major impediment to data integration and interoperability. Mappings often lack the metadata needed to be correctly interpreted and applied. For example, are two terms equivalent or merely related? Are they narrow or broad matches? Or are they associated in some other way? Such relationships between the mapped terms are often not documented, which leads to incorrect assumptions and makes them hard to use in scenarios that require a high degree of precision (such as diagnostics or risk prediction). Furthermore, the lack of descriptions of how mappings were done makes it hard to combine and reconcile mappings, particularly curated and automated ones. We have developed the Simple Standard for Sharing Ontological Mappings (SSSOM) which addresses these problems by: (i) Introducing a machine-readable and extensible vocabulary to describe metadata that makes imprecision, inaccuracy and incompleteness in mappings explicit. (ii) Defining an easy-to-use simple table-based format that can be integrated into existing data science pipelines without the need to parse or query ontologies, and that integrates seamlessly with Linked Data principles. (iii) Implementing open and community-driven collaborative workflows that are designed to evolve the standard continuously to address changing requirements and mapping practices. (iv) Providing reference tools and software libraries for working with the standard. In this paper, we present the SSSOM standard, describe several use cases in detail and survey some of the existing work on standardizing the exchange of mappings, with the goal of making mappings Findable, Accessible, Interoperable and Reusable (FAIR). The SSSOM specification can be found at http://w3id.org/sssom/spec. 
Database URL: http://w3id.org/sssom/spec
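The table-based format described in point (ii) can be consumed with ordinary CSV tooling, no ontology parsing required. The column names and predicate/justification values below follow the SSSOM specification, but the rows themselves are invented for illustration:

```python
import csv
import io

# Illustrative SSSOM-style mapping table (tab-separated). The IDs in the
# rows are made up; only the column layout reflects the standard.
tsv = (
    "subject_id\tpredicate_id\tobject_id\tmapping_justification\n"
    "EX:0001\tskos:exactMatch\tOTHER:9001\tsemapv:ManualMappingCuration\n"
    "EX:0002\tskos:broadMatch\tOTHER:9002\tsemapv:LexicalMatching\n"
)

mappings = list(csv.DictReader(io.StringIO(tsv), delimiter="\t"))

# Filter for exact matches only, e.g. for a high-precision merge pipeline.
exact = [m for m in mappings if m["predicate_id"] == "skos:exactMatch"]
print(len(exact))  # 1
```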

    Characterizing Long COVID: Deep Phenotype of a Complex Condition.

    Get PDF
    BACKGROUND: Numerous publications describe the clinical manifestations of post-acute sequelae of SARS-CoV-2 (PASC, or long COVID), but they are difficult to integrate because of heterogeneous methods and the lack of a standard for denoting the many phenotypic manifestations. Patient-led studies are of particular importance for understanding the natural history of COVID-19, but integration is hampered because they often use different terms to describe the same symptom or condition. This significant disparity in patient versus clinical characterization motivated the proposed ontological approach to specifying manifestations, which will improve capture and integration of future long COVID studies. METHODS: The Human Phenotype Ontology (HPO) is a widely used standard for exchange and analysis of phenotypic abnormalities in human disease but has not yet been applied to the analysis of COVID-19. FINDINGS: We identified 303 articles published before April 29, 2021, curated 59 relevant manuscripts that described clinical manifestations in 81 cohorts three weeks or more following acute COVID-19, and mapped 287 unique clinical findings to HPO terms. We present layperson synonyms and definitions that can be used to link patient self-report questionnaires to standard medical terminology. Long COVID clinical manifestations are not assessed consistently across studies, and most manifestations have been reported with a wide range of synonyms by different authors. Across at least 10 cohorts, authors reported 31 unique clinical features corresponding to HPO terms; the most commonly reported feature was Fatigue (median 45.1%) and the least commonly reported was Nausea (median 3.9%), but the reported percentages varied widely between studies. INTERPRETATION: Translating long COVID manifestations into computable HPO terms will improve analysis, data capture, and classification of long COVID patients.
If researchers, clinicians, and patients share a common language, then studies can be compared/pooled more effectively. Furthermore, mapping lay terminology to HPO will help patients assist clinicians and researchers in creating phenotypic characterizations that are computationally accessible, thereby improving the stratification, diagnosis, and treatment of long COVID. FUNDING: U24TR002306; UL1TR001439; P30AG024832; GBMF4552; R01HG010067; UL1TR002535; K23HL128909; UL1TR002389; K99GM145411
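The proposed linking of patient self-report language to HPO terms can be sketched as a simple lookup table. The two HPO IDs below are real (Fatigue and Nausea, the features named in the abstract), but the lay synonyms are illustrative examples, not the paper's curated list:

```python
# Toy mapping from layperson phrases to HPO terms, as the paper proposes.
# HPO IDs are genuine; the lay synonyms are invented for demonstration.
LAY_TO_HPO = {
    "tiredness": ("HP:0012378", "Fatigue"),
    "exhaustion": ("HP:0012378", "Fatigue"),
    "feeling sick to the stomach": ("HP:0002018", "Nausea"),
}

def to_hpo(lay_term):
    """Map a patient-reported phrase to an (HPO ID, label) pair, or None."""
    return LAY_TO_HPO.get(lay_term.lower())

print(to_hpo("Tiredness"))  # ('HP:0012378', 'Fatigue')
```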

    Increased Incidence of Vestibular Disorders in Patients With SARS-CoV-2

    Get PDF
    OBJECTIVE: Determine the incidence of vestibular disorders in patients with SARS-CoV-2 compared to the control population. STUDY DESIGN: Retrospective. SETTING: Clinical data in the National COVID Cohort Collaborative database (N3C). METHODS: Deidentified patient data from the National COVID Cohort Collaborative database (N3C) were queried based on variant peak prevalence (untyped, alpha, delta, omicron 21K, and omicron 23A) from covariants.org to retrospectively analyze the incidence of vestibular disorders in patients with SARS-CoV-2 compared to a control population, consisting of patients without documented evidence of COVID infection during the same period. RESULTS: Patients testing positive for COVID-19 were significantly more likely to have a vestibular disorder compared to the control population. Compared to control patients, the odds ratio of vestibular disorders was significantly elevated in patients with untyped COVID-19 (odds ratio [OR], 2.39; confidence interval [CI], 2.29-2.50). CONCLUSIONS: The incidence of vestibular disorders differed between COVID-19 variants and was significantly elevated in COVID-19-positive patients compared to the control population. These findings have implications for patient counseling, and further research is needed to discern the long-term effects of these findings.
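The odds ratios reported above come from standard 2x2 contingency-table analysis. A generic sketch of that calculation, using invented counts (the study's actual cohort sizes are not given in the abstract), with a Woolf logit confidence interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for a 2x2 table: exposed cases a, exposed controls b,
    unexposed cases c, unexposed controls d (Woolf logit method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts purely for demonstration.
or_, lo, hi = odds_ratio_ci(120, 880, 55, 945)
print(f"OR={or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```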

    Narratives of Change and Theorisations on Continuity: the Duality of the Concept of Emerging Power in International Relations

    Full text link
