
    A cost-benefit analysis of a pellet boiler with electrostatic precipitator versus conventional biomass technology: A case study of an institutional boiler in Syracuse, New York

    BACKGROUND: Biomass facilities have received increasing attention as a strategy to increase the use of renewable fuels and decrease greenhouse gas emissions from the electric generation and heating sectors, but these facilities can potentially increase local air pollution and associated health effects. Comparing the economic costs and public health benefits of alternative biomass fuel, heating technology, and pollution control technology options provides decision-makers with the information needed to make optimal choices in a given location. METHODS: For a case study of a combined heat and power biomass facility in Syracuse, New York, we used stack testing to estimate emissions of fine particulate matter (PM2.5) for both the deployed technology (staged combustion pellet boiler with an electrostatic precipitator) and a conventional alternative (wood chip stoker boiler with a multicyclone). We used the atmospheric dispersion model AERMOD to calculate the contribution of either fuel-technology configuration to ambient primary PM2.5 in a 10 km × 10 km region surrounding the facility, and we quantified the incremental contribution to population mortality and morbidity. We assigned economic values to health outcomes and compared the health benefits of the lower-emitting technology with the incremental costs. RESULTS: In total, the incremental annualized cost of the lower-emitting pellet boiler was $190,000 greater, driven by the greater cost of the pellet fuel and pollution control technology, offset in part by reduced fuel storage costs. PM2.5 emissions were a factor of 23 lower with the pellet boiler with electrostatic precipitator, with corresponding differences in contributions to ambient primary PM2.5 concentrations. The monetary value of the public health benefits of selecting the pellet-fired boiler technology with electrostatic precipitator was $1.7 million annually, greatly exceeding the differential costs even when accounting for uncertainties. Our analyses also showed complex spatial patterns of health benefits given non-uniform age distributions and air pollution levels. CONCLUSIONS: The incremental investment in a lower-emitting staged combustion pellet boiler with an electrostatic precipitator was well justified by the population health improvements over the conventional wood chip technology with a multicyclone, even given the focus on only primary PM2.5 within a small spatial domain. Our analytical framework could be generalized to other settings to inform optimal strategies for proposed new facilities or populations.
    This research was supported by the New York State Energy Research and Development Authority (NYSERDA), via an award to the Northeast States for Coordinated Air Use Management (Agreement #92229). The SCICHEM work of KMZ was supported by the Electric Power Research Institute (EPRI).
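    The bottom line of the RESULTS section is simple annualized arithmetic. A minimal sketch using only the two figures quoted above (the net benefit and benefit-cost ratio are our own derived illustrations, not values reported in the study):

```python
# Minimal cost-benefit sketch using the figures quoted in the abstract.
# All values are annualized USD; "benefit" is the monetized health gain.

incremental_cost = 190_000     # extra annualized cost of pellet boiler + ESP
health_benefit = 1_700_000     # monetized PM2.5 public health benefit per year

net_benefit = health_benefit - incremental_cost
benefit_cost_ratio = health_benefit / incremental_cost

print(f"Net annual benefit: ${net_benefit:,.0f}")        # $1,510,000
print(f"Benefit-cost ratio: {benefit_cost_ratio:.1f}:1") # ~8.9:1
```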

    ERIGrid Holistic Test Description for Validating Cyber-Physical Energy Systems

    Smart energy solutions aim to modify and optimise the operation of existing energy infrastructure. Such cyber-physical technology must be mature before deployment to the actual infrastructure, and competitive solutions will have to be compliant with standards still under development. Achieving this technology readiness and harmonisation requires reproducible experiments and appropriately realistic testing environments. Such testbeds for multi-domain cyber-physical experiments are complex in and of themselves. This work addresses a method for scoping and designing experiments in which both the testbed and the solution require detailed expertise. This empirical work first revisited present test description approaches, developed a new description method for cyber-physical energy systems testing, and matured it by means of user involvement. The new Holistic Test Description (HTD) method facilitates the conception, deconstruction and reproduction of complex experimental designs in the domains of cyber-physical energy systems. This work develops the background and motivation, offers a guideline and examples for the proposed approach, and summarises experience from three years of its application.
    This work received funding from the European Community's Horizon 2020 Program (H2020/2014-2020) under project "ERIGrid" (Grant Agreement No. 654113).
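    The abstract does not reproduce the HTD template itself. As a loose, hypothetical illustration of the kind of structured, machine-readable test description the paper argues for (all field names and example values here are ours, not the published HTD fields):

```python
from dataclasses import dataclass

# Hypothetical sketch only: the published HTD template defines its own
# structure and terminology; these field names are illustrative.
@dataclass
class TestCase:
    objective: str                   # what the experiment is meant to demonstrate
    system_under_test: str           # the cyber-physical configuration being tested
    functions_under_test: list[str]  # the functions whose behaviour is assessed
    test_criteria: list[str]         # measurable pass/fail criteria

example = TestCase(
    objective="Validate a voltage controller under realistic communication delay",
    system_under_test="LV feeder simulation coupled to controller hardware",
    functions_under_test=["reactive power control"],
    test_criteria=["bus voltages remain within ±10% of nominal"],
)
print(example.objective)
```

    A structured description like this is what makes an experiment reproducible across testbeds: another lab can re-instantiate the same test case on different infrastructure.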

    An Assessment to Benchmark the Seismic Performance of a Code-Conforming Reinforced-Concrete Moment-Frame Building

    This report describes a state-of-the-art performance-based earthquake engineering methodology that is used to assess the seismic performance of a four-story reinforced concrete (RC) office building that is generally representative of low-rise office buildings constructed in highly seismic regions of California. This "benchmark" building is considered to be located at a site in the Los Angeles basin, and it was designed with a ductile RC special moment-resisting frame as its seismic lateral system, in accordance with modern building codes and standards. The building's performance is quantified in terms of structural behavior up to collapse, structural and nonstructural damage and associated repair costs, and the risk of fatalities and their associated economic costs. To account for different building configurations that may be designed in practice to meet requirements of building size and use, eight structural design alternatives are used in the performance assessments. Our performance assessments account for important sources of uncertainty in the ground motion hazard, the structural response, structural and nonstructural damage, repair costs, and life-safety risk. The ground motion hazard characterization employs a site-specific probabilistic seismic hazard analysis and the evaluation of controlling seismic sources (through disaggregation) at seven ground motion levels (encompassing return periods ranging from 7 to 2475 years). Innovative procedures for ground motion selection and scaling are used to develop acceleration time history suites corresponding to each of the seven ground motion levels. Structural modeling utilizes both "fiber" models and "plastic hinge" models. Structural modeling uncertainties are investigated through comparison of these two modeling approaches, and through variations in structural component modeling parameters (stiffness, deformation capacity, degradation, etc.). Structural and nonstructural damage (fragility) models are based on a combination of test data, observations from post-earthquake reconnaissance, and expert opinion. Structural damage and repair costs are modeled for the RC beams, columns, and slab-column connections. Damage and associated repair costs are considered for some nonstructural building components, including wallboard partitions, interior paint, exterior glazing, ceilings, sprinkler systems, and elevators. The risk of casualties and the associated economic costs are evaluated based on the risk of structural collapse, combined with recent models on earthquake fatalities in collapsed buildings and accepted economic modeling guidelines for the value of human life in loss and cost-benefit studies.
    The principal results of this work pertain to the building collapse risk, damage and repair cost, and life-safety risk. These are discussed successively as follows. When accounting for uncertainties in structural modeling and record-to-record variability (i.e., conditional on a specified ground shaking intensity), the structural collapse probabilities of the various designs range from 2% to 7% for earthquake ground motions that have a 2% probability of exceedance in 50 years (2475-year return period). When integrated with the ground motion hazard for the southern California site, the collapse probabilities result in mean annual frequencies of collapse in the range of [0.4 to 1.4]×10^-4 for the various benchmark building designs.
    In the development of these results, we made the following observations that are expected to be broadly applicable: (1) The ground motions selected for performance simulations must consider spectral shape (e.g., through use of the epsilon parameter) and should appropriately account for correlations between motions in both horizontal directions; (2) Lower-bound component models, which are commonly used in performance-based assessment procedures such as FEMA 356, can significantly bias collapse analysis results; it is more appropriate to use median component behavior, including all aspects of the component model (strength, stiffness, deformation capacity, cyclic deterioration, etc.); (3) Structural modeling uncertainties related to component deformation capacity and post-peak degrading stiffness can impact the variability of calculated collapse probabilities and mean annual rates to a similar degree as record-to-record variability of ground motions. Therefore, including the effects of such structural modeling uncertainties significantly increases the mean annual collapse rates; we found this increase to be roughly four to eight times relative to rates evaluated for the median structural model; (4) Nonlinear response analyses revealed at least six distinct collapse mechanisms, the most common of which was a story mechanism in the third story (differing from the multi-story mechanism predicted by nonlinear static pushover analysis); (5) Soil-foundation-structure interaction effects did not significantly affect the structural response, which was expected given the relatively flexible superstructure and stiff soils.
    The potential for financial loss is considerable. Overall, the calculated expected annual losses (EAL) are in the range of $52,000 to $97,000 for the various code-conforming benchmark building designs, or roughly 1% of the replacement cost of the building ($8.8M). These losses are dominated by the expected repair costs of the wallboard partitions (including interior paint) and by the structural members. Loss estimates are sensitive to details of the structural models, especially the initial stiffness of the structural elements. Losses are also found to be sensitive to structural modeling choices, such as ignoring the tensile strength of the concrete (40% change in EAL) or the contribution of the gravity frames to overall building stiffness and strength (15% change in EAL).
    Although there are a number of factors identified in the literature as likely to affect the risk of human injury during seismic events, the casualty modeling in this study focuses on those factors (building collapse, building occupancy, and spatial location of building occupants) that directly inform the building design process. The expected annual number of fatalities is calculated for the benchmark building, assuming that an earthquake can occur at any time of any day with equal probability and using fatality probabilities conditioned on structural collapse and based on empirical data. The expected annual number of fatalities for the code-conforming buildings ranges between 0.05×10^-2 and 0.21×10^-2, and is equal to 2.30×10^-2 for a non-code-conforming design. The expected loss of life during a seismic event is perhaps the decision variable that owners and policy makers will be most interested in mitigating. The fatality estimation carried out for the benchmark building provides a methodology for comparing this important value for various building designs, and enables informed decision making during the design process. The expected annual loss associated with fatalities caused by building earthquake damage is estimated by converting the expected annual number of fatalities into economic terms. Assuming the value of a human life is $3.5M, the fatality rate translates to an EAL due to fatalities of $3,500 to $5,600 for the code-conforming designs, and $79,800 for the non-code-conforming design. Compared to the EAL due to repair costs of the code-conforming designs, which are on the order of $66,000, the monetary value associated with life loss is small, suggesting that the governing factor in this respect will be the maximum permissible life-safety risk deemed by the public (or its representative government) to be appropriate for buildings.
    Although the focus of this report is on one specific building, it can be used as a reference for other types of structures. This report is organized in such a way that the individual core chapters (4, 5, and 6) can be read independently. Chapter 1 provides background on the performance-based earthquake engineering (PBEE) approach. Chapter 2 presents the implementation of the PBEE methodology of the PEER framework, as applied to the benchmark building. Chapter 3 sets the stage for the choices of location and basic structural design. The subsequent core chapters focus on the hazard analysis (Chapter 4), the structural analysis (Chapter 5), and the damage and loss analyses (Chapter 6). Although the report is self-contained, readers interested in additional details can find them in the appendices.
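    The conversion from fatality risk to monetary loss at the end of the abstract is a single multiplication of the expected annual fatality rate by the assumed value of a human life. A minimal sketch with the abstract's numbers for the non-code-conforming design (the small difference from the quoted $79,800 presumably reflects rounding of the reported rate):

```python
# EAL due to fatalities = expected annual fatalities x assumed value of a life.
VALUE_OF_LIFE = 3.5e6    # USD, as assumed in the report
fatality_rate = 2.30e-2  # expected annual fatalities, non-code-conforming design

eal_fatalities = fatality_rate * VALUE_OF_LIFE
print(f"EAL due to fatalities: ${eal_fatalities:,.0f}/yr")
# -> $80,500/yr, close to the quoted $79,800 (the reported rate is rounded)
```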

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.
    Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
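    The jump from a single-visit depth of r ~ 24.5 to a coadded depth of r ~ 27.5 follows from the standard assumption that point-source signal-to-noise over N background-limited visits grows as sqrt(N), so the depth improves by 2.5·log10(sqrt(N)) = 1.25·log10(N) magnitudes. A quick check (the per-band visit count below is our assumption, since the quoted ~800 visits are summed over all six bands):

```python
import math

m_single = 24.5  # 5σ single-visit point-source depth in r (AB), from the abstract
n_visits = 184   # assumed r-band share of the ~800 visits summed over six bands

# Stacking N background-limited visits improves depth by 1.25*log10(N) mag.
m_coadd = m_single + 1.25 * math.log10(n_visits)
print(f"coadded r-band depth ~ {m_coadd:.1f}")  # ~27.3, near the quoted r ~ 27.5
```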

    Bi-dimensional Composition with Domain Specific Languages


    Drought: Economic Consequences and Policies for Mitigation Global Overview

    The natural variation in climate around the world means that periods of severe shortfall of rainfall are inevitable, and sometimes occur on a large geographical scale. Human settlements have adapted to this reality in many different ways, including the development of agricultural systems that are variously robust in the face of drought. As climates change under the influence of modified atmospheric composition, it seems likely that many parts of the world will face an increased incidence of drought, and thus more challenging tasks for farm managers, managers of non-farm enterprises that are sensitive to drought, national policy makers and, last but not least, households in rural areas that are close to subsistence levels even in non-drought seasons. The agricultural economics profession must continue to contribute to better dealing with all these challenges.

    Assistive technology design and development for acceptable robotics companions for ageing years

    © 2013 Farshid Amirabdollahian et al., licensee Versita Sp. z o. o. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivs license, which means that the text may be used for non-commercial purposes, provided credit is given to the author.
    A new stream of research and development responds to changes in life expectancy across the world. It includes technologies that enhance the well-being of individuals, specifically older people. The ACCOMPANY project focuses on home companion technologies and the issues surrounding technology development for assistive purposes. The project responds to some overlooked aspects of technology design, divided into multiple areas such as empathic and social human-robot interaction, robot learning and memory visualisation, and monitoring of persons' activities at home. To bring these aspects together, a dedicated task ensures the technological integration of these multiple approaches on an existing robotic platform, Care-O-Bot® 3, in the context of a smart-home environment utilising a multitude of sensor arrays. Formative and summative evaluation cycles are then used to assess the emerging prototype towards identifying acceptable behaviours and roles for the robot, for example the role of a butler or a trainer, while also comparing user requirements to achieved progress. In a novel approach, the project considers ethical concerns and, by highlighting principles such as autonomy, independence, enablement, safety and privacy, provides a discussion medium where user views on these principles, and the existing tensions between some of them, for example the tension between privacy and autonomy over safety, can be captured and considered in design cycles and throughout project developments.
    Peer reviewed