A new dry biomedical electrode
The electronic circuitry contains a new operational amplifier that incorporates monolithic super-gain transistors. The electrode does not provide voltage amplification; instead, it acts as a current amplifier, making it possible to pick up electrical potentials from the surface of highly resistant dry skin.
Energy and Economic Growth
Physical theory shows that energy is necessary for economic production and therefore growth, but the mainstream theory of economic growth, except for specialized resource economics models, pays no attention to the role of energy. This paper reviews the relevant biophysical theory, mainstream and resource economics models of growth, the critiques of mainstream models, and the various mechanisms that can weaken the links between energy and growth. Finally, we review the empirical literature, which finds that energy used per unit of economic output has declined, but that this is to a large extent due to a shift from poorer-quality fuels such as coal to higher-quality fuels, especially electricity. Furthermore, time series analysis shows that energy and GDP cointegrate and that energy use Granger-causes GDP when additional variables such as energy prices or other production inputs are included. As a result, prospects for further large reductions in energy intensity seem limited.
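The cointegration and Granger-causality tests mentioned here are standard time-series tools. Below is a minimal sketch in Python using statsmodels and synthetic data; it is not the reviewed studies' analysis (which also conditions on additional variables such as energy prices), and all series and parameter values are made up.

```python
# A minimal sketch (synthetic data, not the reviewed studies' analysis) of the
# two tests the abstract describes: cointegration between energy use and GDP,
# and a bivariate Granger-causality test.
import numpy as np
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(42)
n = 60  # e.g. 60 annual observations

# Synthetic data: log GDP as a random walk with drift, log energy cointegrated with it.
log_gdp = np.cumsum(0.02 + 0.01 * rng.standard_normal(n))
log_energy = 0.8 * log_gdp + 0.02 * rng.standard_normal(n)

# Engle-Granger cointegration test: a low p-value rejects "no cointegration".
t_stat, p_value, _ = coint(log_energy, log_gdp)
print(f"cointegration p-value: {p_value:.3f}")

# Granger causality: do lagged energy growth rates help predict GDP growth?
growth = np.column_stack([np.diff(log_gdp), np.diff(log_energy)])
results = grangercausalitytests(growth, maxlag=2, verbose=False)
for lag, (tests, _) in results.items():
    print(f"lag {lag}: F-test p-value = {tests['ssr_ftest'][1]:.3f}")
```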
Carbon Free Boston: Waste Technical Report
Part of a series of reports that includes:
Carbon Free Boston: Summary Report;
Carbon Free Boston: Social Equity Report;
Carbon Free Boston: Technical Summary;
Carbon Free Boston: Buildings Technical Report;
Carbon Free Boston: Transportation Technical Report;
Carbon Free Boston: Energy Technical Report;
Carbon Free Boston: Offsets Technical Report;
Available at http://sites.bu.edu/cfb/
OVERVIEW:
For many people, their most perceptible interaction with their environmental footprint is through the
waste that they generate. On a daily basis people have numerous opportunities to decide whether to
recycle, compost, or throw away. In many cases, such options may not be present or apparent. Even
when such options are available, many lack the knowledge of how to correctly dispose of their waste,
leading to contamination of valuable recycling or compost streams. Once collected, people give little
thought to how their waste is treated. For Boston’s waste, plastic in the disposal stream becomes a
fossil fuel used to generate electricity. Organics in the waste stream have the potential to be used to
generate valuable renewable energy, while metals and electronics can be recycled to offset virgin
materials. However, challenges in global recycling markets are burdening municipalities, which are
experiencing higher costs to maintain their recycling programs.
Solid waste disposal and wastewater treatment both account for a large and visible anthropogenic impact
on human health and the environment. In terms of climate change, landfilling of solid waste and
wastewater treatment generated emissions of 131.5 Mt CO2e in 2016, or about two percent of total
United States GHG emissions that year. The combustion of solid waste contributed an additional 11.0 Mt
CO2e, over half of which (5.9 Mt CO2e) is attributable to the combustion of plastic [1]. In Massachusetts,
the GHG emissions from landfills (0.4 Mt CO2e), waste combustion (1.2 Mt CO2e), and wastewater (0.5
Mt CO2e) accounted for about 2.7 percent of the state’s gross GHG emissions in 2014 [2].
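As a rough back-of-envelope consistency check (not from the report itself), the three Massachusetts figures above imply a gross state total of roughly 78 Mt CO2e:

```python
# Quick arithmetic check of the Massachusetts figures cited above.
landfill, combustion, wastewater = 0.4, 1.2, 0.5  # Mt CO2e, 2014
waste_sector = landfill + combustion + wastewater  # 2.1 Mt CO2e
share = 0.027  # "about 2.7 percent" of gross state GHG emissions
implied_gross = waste_sector / share
print(f"waste sector: {waste_sector:.1f} Mt CO2e")
print(f"implied gross state emissions: ~{implied_gross:.0f} Mt CO2e")
```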
The City of Boston has begun exploring pathways to Zero Waste, a goal that seeks to systematically
redesign the waste management system in a way that simultaneously leads to a drastic reduction in
emissions from waste. The easiest way to achieve zero waste is to not generate it in the first place. This can start at
the source with the decision whether or not to consume a product. This is the intent behind banning
disposable items such as plastic bags that have more sustainable substitutes. When consumption occurs,
products must be designed in such a way that their lifecycle impacts and waste footprint are considered.
This includes making durable products, limiting the use of packaging or using organic packaging
materials, taking back goods at the end of their life, and designing products to ensure compatibility with
recycling systems. When waste generation is unavoidable, efforts to increase recycling and organics
diversion become essential for achieving zero waste. [TRUNCATED]
Selection of neutralizing antibody escape mutants with type A influenza virus HA-specific polyclonal antisera: possible significance for antigenic drift
Ten antisera were produced in rabbits by two or three intravenous injections of inactivated whole influenza type A virions. All contained haemagglutination-inhibition (HI) antibody directed predominantly to an epitope in antigenic site B and, in addition, various amounts of antibodies to an epitope in site A and in site D. The ability of untreated antisera to select neutralization escape mutants was investigated by incubating virus possessing the homologous haemagglutinin with antiserum adjusted to contain anti-B epitope HI titres of 100, 1000 and 10000 HIU/ml. Virus-antiserum mixtures were inoculated into embryonated hen's eggs, and progeny virus examined without further selection. Forty percent of the antisera at a titre of 1000 HIU/ml selected neutralizing antibody escape mutants as defined by their lack of reactivity to MAb HC10 (site B), and unchanged reactivity to other MAbs to site A and site D epitopes. All escape mutant-selecting antisera had a ratio of anti-site B (HC10)-epitope antibody to other antibodies of ≥2.0:1. The antiserum with the highest ratio (7.4:1) selected escape mutants in all eggs tested in four different experiments. No antiserum used at a titre of 10000 HIU/ml allowed multiplication of any virus. All antisera used at a titre of 100 HIU/ml permitted virus growth, but this was wild-type (wt) virus. We conclude that a predominant epitope-specific antibody response, a titre of ≥1000 HIU/ml, and a low absolute titre of other antibodies (≤500 HIU/ml) are three requirements for the selection of escape mutants. None of the antisera in this study could have selected escape mutants without an appropriate dilution factor, so the occurrence of an escape mutant-selecting antiserum in nature is likely to be a rare event.
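As an illustration only, the three requirements stated in the conclusion can be written as a single predicate. The function below is hypothetical; the thresholds are taken from the abstract, and the example values are chosen to match the 7.4:1 ratio it reports.

```python
# A minimal sketch, not from the paper: the three conditions the authors
# conclude are required for an antiserum to select escape mutants.
def can_select_escape_mutants(anti_b_titre: float, other_titre: float) -> bool:
    """Apply the three conditions reported in the abstract.

    anti_b_titre: HI titre (HIU/ml) of the predominant anti-site B antibody.
    other_titre:  combined HI titre (HIU/ml) of all other antibodies.
    """
    predominant = anti_b_titre / other_titre >= 2.0   # ratio >= 2.0:1
    high_enough = anti_b_titre >= 1000                # titre >= 1000 HIU/ml
    others_low = other_titre <= 500                   # other antibodies <= 500 HIU/ml
    return predominant and high_enough and others_low

# Example with a 7.4:1 ratio, as for the strongest antiserum in the study.
print(can_select_escape_mutants(anti_b_titre=1000, other_titre=135))  # True
```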
A Bayesian spatio-temporal model of panel design data: airborne particle number concentration in Brisbane, Australia
This paper outlines a methodology for semi-parametric spatio-temporal
modelling of data which is dense in time but sparse in space, obtained from a
split panel design, the most feasible approach to covering space and time with
limited equipment. The data are hourly averaged particle number concentration
(PNC) and were collected as part of the Ultrafine Particles from Transport
Emissions and Child Health (UPTECH) project. Two weeks of continuous
measurements were taken at each of a number of government primary schools in
the Brisbane Metropolitan Area. The monitoring equipment was taken to each
school sequentially. The school data are augmented by data from long term
monitoring stations at three locations in Brisbane, Australia.
Fitting the model helps describe the spatial and temporal variability at a
subset of the UPTECH schools and the long-term monitoring sites. The temporal
variation is modelled hierarchically with penalised random walk terms, one
common to all sites and a term accounting for the remaining temporal trend at
each site. Parameter estimates and their uncertainty are computed in a
computationally efficient approximate Bayesian inference environment, R-INLA.
The temporal part of the model explains daily and weekly cycles in PNC at the
schools, which can be used to estimate the exposure of school children to
ultrafine particles (UFPs) emitted by vehicles. At each school and long-term
monitoring site, peaks in PNC can be attributed to the morning and afternoon
rush hour traffic and new particle formation events. The spatial component of
the model describes the school to school variation in mean PNC at each school
and within each school ground. It is shown how the spatial model can be
expanded to identify spatial patterns at the city scale with the inclusion of
more spatial locations.
Comment: Draft of this paper presented at ISBA 2012 as a poster; part of the UPTECH project.
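For intuition, the hierarchical temporal structure described above, one penalised random-walk term common to all sites plus a site-specific random-walk remainder, can be simulated in a few lines. This is a toy numpy sketch with made-up variances, not the authors' R-INLA model (where the random walks are penalised through priors on their precisions).

```python
# Toy simulation of the hierarchical temporal structure: a first-order random
# walk (RW1) shared by all sites, plus a site-specific RW1 for the remainder.
import numpy as np

rng = np.random.default_rng(0)
n_hours, n_sites = 24 * 14, 5  # two weeks of hourly data at five sites

def rw1(n, sd, rng):
    """First-order random walk: x[t] = x[t-1] + Normal(0, sd)."""
    return np.cumsum(rng.normal(0.0, sd, size=n))

common = rw1(n_hours, sd=0.10, rng=rng)            # trend shared by all sites
site_terms = np.stack([rw1(n_hours, sd=0.05, rng=rng)
                       for _ in range(n_sites)])   # residual trend per site
noise = rng.normal(0.0, 0.2, size=(n_sites, n_hours))

log_pnc = 9.0 + common + site_terms + noise        # log-scale PNC, per site
print(log_pnc.shape)  # (5, 336)
```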
Reference manual for the Langley Research Center flight simulation computing system
The Langley Research Center Flight Simulation Computing System provides researchers with an advanced real-time digital simulation capability. This capability is controlled at the user interface level by the Real Time Simulation Supervisor. The Supervisor is a group of subprograms loaded with a simulation application program. The Supervisor provides the interface between the application program and the operating system, and coordinates input and output to and from the simulation hardware. The Supervisor also performs various utility functions as required by a simulation application program.
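As a schematic illustration of the Supervisor concept only (not the Langley code), the sketch below shows a fixed-frame real-time loop that mediates between an application's update step and stubbed hardware I/O; all names and the frame rate are hypothetical.

```python
# Schematic sketch of a real-time supervisor loop; names and rates are made up.
import time

class Supervisor:
    """Runs an application's update callback at a fixed real-time frame rate."""
    def __init__(self, frame_seconds: float, read_inputs, write_outputs):
        self.frame_seconds = frame_seconds
        self.read_inputs = read_inputs       # e.g. cockpit controls (stubbed here)
        self.write_outputs = write_outputs   # e.g. instrument displays (stubbed here)

    def run(self, application_step, frames: int) -> None:
        for _ in range(frames):
            start = time.monotonic()
            state = application_step(self.read_inputs())  # the simulation math model
            self.write_outputs(state)
            # Sleep off the remainder of the frame to hold the real-time rate.
            time.sleep(max(0.0, self.frame_seconds - (time.monotonic() - start)))

# Hypothetical usage: a 32 Hz frame with stubbed hardware I/O.
sup = Supervisor(1 / 32, read_inputs=lambda: 0.0, write_outputs=lambda s: None)
sup.run(application_step=lambda u: u + 1.0, frames=3)
```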
Impact of New Madrid Seismic Zone Earthquakes on the Central USA, Vol. 1 and 2
The information presented in this report has been developed to support the Catastrophic Earthquake Planning Scenario workshops held by the Federal Emergency Management Agency. Four FEMA Regions (Regions IV, V, VI and VII) were involved in the New Madrid Seismic Zone (NMSZ) scenario workshops. The four FEMA Regions include eight states, namely Illinois, Indiana, Kentucky, Tennessee, Alabama, Mississippi, Arkansas and Missouri.
The earthquake impact assessment presented hereafter employs an analysis methodology comprising three major components: hazard, inventory and fragility (or vulnerability). The hazard characterizes not only the shaking of the ground but also the consequential transient and permanent deformation of the ground due to strong ground shaking as well as fire and flooding. The inventory comprises all assets in a specific region, including the built environment and population data. Fragility or vulnerability functions relate the severity of shaking to the likelihood of reaching or exceeding damage states (light, moderate, extensive and near-collapse, for example). Social impact models are also included and employ physical infrastructure damage results to estimate the effects on exposed communities. Whereas the modeling software packages used (HAZUS MR3; FEMA, 2008; and MAEviz, Mid-America Earthquake Center, 2008) provide default values for all of the above, most of these default values were replaced by components of traceable provenance and higher reliability than the default data, as described below.
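For illustration, fragility functions of the kind described above are commonly modelled as lognormal CDFs of shaking intensity. The sketch below uses that standard form with made-up medians and dispersion; it is not the MAE Center's or HAZUS's actual curves.

```python
# Generic lognormal fragility curve: P(damage state reached or exceeded | shaking).
from math import log
from statistics import NormalDist

def p_exceed(intensity: float, median: float, beta: float) -> float:
    """Probability of reaching or exceeding a damage state at a given intensity.

    median: intensity (e.g. PGA in g) at which exceedance probability is 50%.
    beta:   lognormal dispersion. All values here are illustrative only.
    """
    return NormalDist().cdf(log(intensity / median) / beta)

# Illustrative damage states with hypothetical medians for one building class.
for state, median in [("light", 0.15), ("moderate", 0.30),
                      ("extensive", 0.60), ("near-collapse", 1.00)]:
    print(f"{state:13s} P(exceed | PGA=0.4g) = {p_exceed(0.4, median, 0.6):.2f}")
```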
The hazard employed in this investigation includes ground shaking for a single scenario event representing the rupture of all three New Madrid fault segments. The NMSZ consists of three fault segments: the northeast segment, the Reelfoot thrust or central segment, and the southwest segment. Each segment is assumed to generate a deterministic magnitude 7.7 (Mw7.7) earthquake caused by a rupture over the entire length of the segment. The US Geological Survey (USGS) approved the magnitude and hazard approach employed. The combined rupture of all three segments simultaneously is designed to approximate the sequential rupture of all three segments over time. The magnitude of Mw7.7 is retained for the combined rupture. Full liquefaction susceptibility maps for the entire region have been developed and are used in this study.
Inventory is enhanced through the use of the Homeland Security Infrastructure Program (HSIP) 2007 and 2008 Gold Datasets (NGA Office of America, 2007). These datasets contain various types of critical infrastructure that are key inventory components for earthquake impact assessment. Transportation and utility facility inventories are improved, while regional natural gas and oil pipelines are added to the inventory alongside high potential loss facility inventories. The National Bridge Inventory (NBI, 2008) and other state and independent data sources are utilized to improve the inventory. New fragility functions derived by the MAE Center are employed in this study for both buildings and bridges, providing more regionally applicable estimates of damage for these infrastructure components. Default fragility values are used to determine damage likelihoods for all other infrastructure components.
The study reports new analysis using MAE Center-developed transportation network flow models that estimate changes in traffic flow and travel time due to earthquake damage. Utility network modeling was also undertaken to provide damage estimates for facilities and pipelines. An approximate flood risk model was assembled to identify areas that are likely to be flooded as a result of dam or levee failure. Social vulnerability identifies portions of the eight-state study region that are especially vulnerable due to various factors such as age, income, disability, and language proficiency. Social impact models include estimates of displaced and shelter-seeking populations as well as commodities and medical requirements. Lastly, search and rescue requirements quantify the number of teams and personnel required to clear debris and search for trapped victims.
The results indicate that Tennessee, Arkansas, and Missouri are most severely impacted. Illinois and Kentucky are also impacted, though not as severely as the previous three states. Nearly 715,000 buildings are damaged in the eight-state study region. About 42,000 search and rescue personnel working in 1,500 teams are required to respond to the earthquakes. Damage to critical infrastructure (essential facilities, transportation and utility lifelines) is substantial in the 140 impacted counties near the rupture zone, including 3,500 damaged bridges and nearly 425,000 breaks and leaks to both local and interstate pipelines. Approximately 2.6 million households are without power after the earthquake. Nearly 86,000 injuries and fatalities result from damage to infrastructure. Nearly 130 hospitals are damaged and most are located in the impacted counties near the rupture zone. There is extensive damage and substantial travel delays in both Memphis, Tennessee, and St. Louis, Missouri, thus hampering search and rescue as well as evacuation. Moreover, roughly 15 major bridges are unusable. Three days after the earthquake, 7.2 million people are still displaced and 2 million people seek temporary shelter. Direct economic losses for the eight states total nearly $300 billion, while indirect losses may be at least twice this amount.
The contents of this report provide the various assumptions used to arrive at the impact estimates, detailed background on the above quantitative consequences, and a breakdown of the figures per sector at the FEMA region and state levels. The information is presented in a manner suitable for personnel and agencies responsible for establishing response plans based on likely impacts of plausible earthquakes in the central USA.
Carbon Free Boston: Offsets Technical Report
Part of a series of reports that includes:
Carbon Free Boston: Summary Report;
Carbon Free Boston: Social Equity Report;
Carbon Free Boston: Technical Summary;
Carbon Free Boston: Buildings Technical Report;
Carbon Free Boston: Transportation Technical Report;
Carbon Free Boston: Waste Technical Report;
Carbon Free Boston: Energy Technical Report;
Available at http://sites.bu.edu/cfb/
OVERVIEW:
The U.S. Environmental Protection Agency defines offsets as a specific activity or set of activities
intended to reduce GHG emissions, increase the storage of carbon, or enhance GHG removals from the
atmosphere [1]. From a city perspective, they provide a mechanism to negate residual GHG emissions—
those the city is unable to reduce directly—by supporting projects that avoid or sequester them outside
of the city’s reporting boundary.
Offsetting GHG emissions is a controversial topic for cities, as the co-benefits of the investment are
typically not realized locally. For this reason, offsetting emissions is considered a last resort, a strategy
option available when the city has exhausted all others. However, offsets are likely to be a necessity to
achieve carbon neutrality by 2050 and promote emissions reductions in the near term. While public and
private sector partners pursue the more complex systems transformation, cities can utilize offsets to
support short-term and relatively cost-effective reductions in emissions. Offsets can be a relatively
simple, certain, and high-impact way to support the transition to a low-carbon world.
This report focuses on carbon offset certificates, more often referred to as offsets. Each offset
represents a metric ton of verified carbon dioxide (CO2) or equivalent emissions that is reduced,
avoided, or permanently removed from the atmosphere (“sequestered”) through an action taken by the
creator of the offset. The certificates can be traded, and retiring (that is, not re-selling) offsets can be a
useful component of an overall voluntary emissions reduction strategy, alongside activities to lower an
organization’s direct and indirect emissions. In the Global Protocol for Community-Scale Greenhouse Gas
Emissions Inventories (GPC), the GHG accounting system used by the City of Boston, any carbon offset
certificates that the City has can be deducted from the City’s total GHG emissions.
Available at http://sites.bu.edu/cfb/files/2019/06/CFB_Offsets_Technical_Report_051619.pdf
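As a minimal sketch of the deduction rule described above, with hypothetical figures rather than Boston's actual inventory:

```python
# GPC-style netting: retired offset certificates are deducted from the
# community total. The figures below are hypothetical, not Boston's.
def net_emissions(gross_mt_co2e: float, retired_offsets_mt_co2e: float) -> float:
    """Net community emissions after deducting retired offsets (Mt CO2e)."""
    return gross_mt_co2e - retired_offsets_mt_co2e

print(net_emissions(gross_mt_co2e=6.0, retired_offsets_mt_co2e=0.5))  # 5.5
```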