
    Reducing Urban Pollution Exposure from Road Transport (RUPERT)

    This paper presents the preliminary results of a two-year study on reducing urban pollution exposure from road transport (RUPERT). The main aim of this project is to develop a new modelling framework for nitrogen dioxide, carbon monoxide and particulate matter to simulate exposures of different population groups across a city, and to assess the impact of roadside concentrations on these exposures. This will be achieved by modelling personal exposure frequency distributions (PEFDs) as a function of urban background and roadside concentrations under different traffic conditions. The modelling approach combines new and existing models relating traffic and air pollution data, with particular emphasis on the impact of congestion, and a probabilistic modelling framework for personal exposure. Modelling of roadside concentrations consists of two main elements: the analysis of concentration patterns at different roadside sites, and of the relationship between traffic conditions and added roadside pollution. Roadside concentrations are predicted using empirically derived relationships: statistical models, novel statistics and artificial neural networks, namely feed-forward and radial basis function networks. The exposure modelling is carried out by linking two models: the INDAIR model, which probabilistically simulates diurnal profiles of air pollutant concentrations in a range of microenvironments, and the EXPAIR model, which simulates population exposure patterns based on population time-activity patterns and a library of microenvironmental concentrations derived from the INDAIR model.
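    As a rough illustration of the probabilistic exposure idea (not the actual INDAIR/EXPAIR implementation), the sketch below builds a PEFD by Monte Carlo: each simulated person draws a concentration for every microenvironment and their daily exposure is the time-weighted average. All microenvironment names, time fractions and distribution parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative microenvironment NO2 concentrations (ug/m3), modelled as
# lognormal distributions; medians/spreads are placeholders, not values
# from the INDAIR model.
MICROENVS = {
    "home":     (20.0, 0.4),   # (median, sigma of log-concentration)
    "office":   (25.0, 0.3),
    "roadside": (60.0, 0.5),   # commuting alongside traffic
}

# Illustrative time-activity pattern: fraction of the day one population
# group spends in each microenvironment.
TIME_FRACTIONS = {"home": 0.6, "office": 0.3, "roadside": 0.1}

def simulate_pefd(n_people=10_000):
    """Monte Carlo sketch of a personal exposure frequency distribution."""
    exposure = np.zeros(n_people)
    for env, (median, sigma) in MICROENVS.items():
        # median of a lognormal is exp(mu), so mu = log(median)
        conc = rng.lognormal(mean=np.log(median), sigma=sigma, size=n_people)
        exposure += TIME_FRACTIONS[env] * conc
    return exposure

pefd = simulate_pefd()
print(f"median daily NO2 exposure: {np.median(pefd):.1f} ug/m3")
print(f"95th percentile:           {np.percentile(pefd, 95):.1f} ug/m3")
```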

    Efficient O-demethylation of lignin monoaromatics using the peroxygenase activity of cytochrome P450 enzymes

    A crucial reaction in harnessing renewable carbon from lignin is O-demethylation. We demonstrate the selective O-demethylation of syringol and guaiacol using different cytochrome P450 enzymes. These can efficiently use hydrogen peroxide which, when compared to nicotinamide cofactor-dependent monooxygenases and synthetic methods, allows for cheap and clean O-demethylation of lignin-derived aromatics.
    Alix C. Harlington, Keith E. Shearwin, Stephen G. Bell and Fiona Whelan
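    The abstract does not spell out the transformation; as a hedged sketch, the overall stoichiometry of a peroxygenase-driven O-demethylation of an aryl methyl ether (guaiacol to catechol shown as the example) would be:

```latex
% Requires amsmath. A sketch of the general aryl methyl ether pattern
% (methyl carbon leaves as formaldehyde, H2O2 supplies the oxygen),
% not a mechanism taken from the paper.
\begin{equation*}
\underbrace{\mathrm{C_6H_4(OH)(OCH_3)}}_{\text{guaiacol}}
+ \mathrm{H_2O_2}
\;\xrightarrow{\ \text{P450 peroxygenase}\ }\;
\underbrace{\mathrm{C_6H_4(OH)_2}}_{\text{catechol}}
+ \mathrm{HCHO} + \mathrm{H_2O}
\end{equation*}
```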

    Combining a ractopamine feeding regime and porcine somatotropin has additive effects on finisher pig performance

    Treatment of finisher pigs with dietary ractopamine (RAC; Paylean®, Elanco Animal Health, NSW) improves daily gain and feed efficiency commensurate with increased protein deposition in finishing pigs (Dunshea et al., 1993). However, the effects of RAC on P2 fat deposition are equivocal. Dunshea et al. (1993) found no change in gilts and barrows, whilst a trend towards reduced P2 depth was observed in boars fed dietary RAC. Exogenous porcine somatotropin (pST; Reporcin®, OzBioPharm Pty Ltd, Victoria) improves daily gain and feed efficiency and increases the ratio of lean to fat in carcases of boars, gilts and barrows (Campbell et al., 1989). As both technologies are applied at the end of the finishing phase, it is of interest to determine whether a combination of RAC and pST has additive effects on pig performance.

    Virtual embedded librarianship for information literacy teaching.

    This paper reports on the planning and preliminary results of an action research project undertaken to redesign an online distance learning information literacy (IL) module on the basis of virtual 'embedded librarianship'. The project brought together the IL module coordinator and an Academic Liaison Librarian, working at different institutions, to collaboratively redesign the assessment and teaching of the module. Data were collected via a qualitative analysis of students' work and a series of open-ended questions addressed to students on the value of the approach followed. Students reacted positively to the embedded librarianship design and engaged constructively in situated learning. Challenges included time-zone differences, the level of student contribution, and a lack of confidence. The paper puts emphasis on educating future information professionals as embedded information literacy partners, promoting the development of transferable skills and a collaborative/sharing online working ethos.

    National-scale analysis of low flow frequency: historical trends and potential future changes

    The potential impact of climate change on hydrological extremes is of increasing concern across the globe. Here, a national-scale grid-based hydrological model is used to investigate historical trends and potential future changes in low flow frequency across Great Britain. The historical analyses use both observational data (1891–2015) and ensemble data from a regional climate model (1900–2006). The results show relatively few significant trends in historical low flows (2- or 20-year return period), whether based on 7- or 30-day annual minima. Significant negative trends seen in some limited parts of the country when using observational data are generally not seen when using climate model data. The future analyses use climate model ensemble data for both near future and far future time periods (2020–2049 and 2070–2099 respectively), which are compared to a baseline sub-period from the historical ensemble (1975–2004). The results show future reductions in low flows, which are generally larger in the south of the country, at the higher (20-year) return period, and for the later time period. Reductions are more limited if the estimates of future potential evaporation include the effect of increased carbon dioxide concentrations on stomatal resistance. Such reductions in river flow could have significant impacts on the aquatic environment and on agriculture, and present a challenge for water managers, especially as reductions in water supply are likely to occur alongside increases in demand.
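    As a minimal sketch of the low-flow indices involved (not the paper's grid-based model), the snippet below derives a 7-day annual minimum series from daily flows and reads off 2- and 20-year return levels as empirical quantiles; the synthetic flow series is an illustrative assumption, and a proper frequency analysis would fit a distribution rather than use raw quantiles.

```python
import numpy as np
import pandas as pd

# Illustrative daily river flow series (m3/s); in practice these would be
# observed or modelled flows for one catchment or grid cell.
rng = np.random.default_rng(1)
dates = pd.date_range("1975-01-01", "2004-12-31", freq="D")
flow = pd.Series(np.exp(rng.normal(1.0, 0.6, len(dates))), index=dates)

def annual_minima(daily_flow, window_days=7):
    """n-day annual minimum series: the minimum of the n-day rolling mean
    within each calendar year (7-day here, matching the paper's 7-day
    annual minima)."""
    rolled = daily_flow.rolling(window_days, min_periods=window_days).mean()
    return rolled.groupby(daily_flow.index.year).min().dropna()

def low_flow_return_level(minima, return_period):
    """Empirical T-year low flow: the quantile of the annual-minimum series
    with non-exceedance probability 1/T (flows drop below this level on
    average once every T years)."""
    return np.quantile(minima, 1.0 / return_period)

am7 = annual_minima(flow)
print(f"2-year 7-day low flow:  {low_flow_return_level(am7, 2):.2f} m3/s")
print(f"20-year 7-day low flow: {low_flow_return_level(am7, 20):.2f} m3/s")
```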

    Weighing black holes with warm absorbers

    We present a new technique for determining an upper limit for the mass of the black hole in active galactic nuclei showing warm absorption features. The method relies on the balance of radiative and gravitational forces acting on outflowing warm absorber clouds. It has been applied to six objects: five Seyfert 1 galaxies (IC 4329a, MCG-6-30-15, NGC 3516, NGC 4051 and NGC 5548) and one radio-quiet quasar (MR 2251-178). We discuss our results in comparison with other methods. The procedure could also be applied to any other radiatively driven optically thin outflow in which the spectral band covering the major absorption is directly observed.
    Comment: 13 pages, 6 figures, 7 tables. MNRAS accepted.
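    The abstract only alludes to the force balance; below is a minimal sketch of the standard Eddington-type argument, assuming an outflowing cloud with an effective cross-section per proton sigma_eff (a placeholder for the paper's detailed absorber opacity).

```latex
% Requires amsmath. Force balance on an outflowing cloud at radius r
% around a black hole of mass M_BH, for a source of luminosity L.
% sigma_eff is an assumed effective cross-section per proton;
% sigma_eff > sigma_T for partially ionized warm-absorber gas, which
% is what makes the resulting limit constraining.
\begin{equation*}
\frac{L\,\sigma_{\mathrm{eff}}}{4\pi r^{2} c}
\;\ge\;
\frac{G M_{\mathrm{BH}}\, m_{\mathrm{p}}}{r^{2}}
\quad\Longrightarrow\quad
M_{\mathrm{BH}} \;\le\; \frac{L\,\sigma_{\mathrm{eff}}}{4\pi G c\, m_{\mathrm{p}}}
\end{equation*}
```

    In this simplified form both forces scale as r^{-2}, so the limit does not depend on the unknown distance of the cloud from the black hole.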

    Implications of quantum automata for contextuality

    We construct zero-error quantum finite automata (QFAs) for promise problems which cannot be solved by bounded-error probabilistic finite automata (PFAs). A summary of our results:
    - There is a promise problem solvable by an exact two-way QFA in exponential expected time, but not by any bounded-error sublogarithmic-space probabilistic Turing machine (PTM).
    - There is a promise problem solvable by an exact two-way QFA in quadratic expected time, but not by any bounded-error o(log log n)-space PTM in polynomial expected time. The same problem can be solved by a one-way Las Vegas (or exact two-way) QFA with quantum head in linear (expected) time.
    - There is a promise problem solvable by a Las Vegas realtime QFA, but not by any bounded-error realtime PFA. The same problem can be solved by an exact two-way QFA in linear expected time, but not by any exact two-way PFA.
    - There is a family of promise problems such that each member can be solved by a two-state exact realtime QFA, but there is no bound on the number of states of realtime bounded-error PFAs solving the members of this family.
    Our results imply that there exist zero-error quantum computational devices with a single qubit of memory that cannot be simulated by any finite-memory classical computational model. This provides a computational perspective on results regarding ontological theories of quantum mechanics [Hardy04], [Montina08]. As a consequence we find that classical automata-based simulation models [Kleinmann11], [Blasiak13] are not sufficiently powerful to simulate quantum contextuality. We conclude by highlighting the interplay between results from automata models and their application to developing a general framework for quantum contextuality.
    Comment: 22 pages.
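    To make the single-qubit claim concrete, here is a hedged sketch of the well-known two-state rotation construction for mod-2^k promise problems (in the spirit of the family the abstract describes; the paper's exact constructions may differ). Each input symbol rotates one qubit by pi/2^k, so under the promise the final measurement is deterministic, while any bounded-error realtime PFA needs ever more states as k grows.

```python
import math

def qfa_accepts(word, k):
    """Simulate a two-state (single-qubit) realtime QFA for the promise
    problem: accept a^m when m = 0 (mod 2^k), reject when
    m = 2^(k-1) (mod 2^k). This construction is a standard example, not
    necessarily the one used in the paper."""
    theta = math.pi / 2 ** k
    angle = 0.0
    for symbol in word:
        assert symbol == "a"
        angle += theta  # apply the rotation unitary for input symbol 'a'
    # Final state is (cos(angle), sin(angle)); measure in the
    # computational basis. Under the promise, p_zero is exactly 1 or 0.
    p_zero = math.cos(angle) ** 2
    return p_zero > 0.5

k = 3
for m in (0, 4, 8, 12, 16):  # alternating yes (0 mod 8) / no (4 mod 8)
    print(m, qfa_accepts("a" * m, k))
```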