
    Pulsed Laval nozzle study of the kinetics of OH with unsaturated hydrocarbons at very low temperatures

    The kinetics of reactions of the OH radical with ethene, ethyne (acetylene), propyne (methyl acetylene) and t-butyl hydroperoxide were studied at temperatures of 69 and 86 K using laser flash photolysis combined with laser-induced fluorescence spectroscopy. A new pulsed Laval nozzle apparatus was used to provide the low-temperature thermalised environment at a single density of ~4 x 10^16 molecule cm^-3 in N2. The density and temperature within the flow were determined from measurements of impact pressure and of rotational populations obtained by laser-induced fluorescence spectroscopy of NO and OH. For ethene, rate coefficients were determined to be k2 = (3.22 +/- 0.46) x 10^-11 and (2.12 +/- 0.12) x 10^-11 cm^3 molecule^-1 s^-1 at T = 69 and 86 K, respectively, in good agreement with a master-equation calculation utilising an ab initio surface recently calculated for this reaction by Cleary et al. (P. A. Cleary, M. T. Baeza Romero, M. A. Blitz, D. E. Heard, M. J. Pilling, P. W. Seakins and L. Wang, Phys. Chem. Chem. Phys., 2006, 8, 5633-5642). For ethyne, no previous data exist below 210 K, and a single measurement at 69 K provided only an approximate upper limit for the rate coefficient of k3 < 1 x 10^-12 cm^3 molecule^-1 s^-1, consistent with the presence of a small activation barrier of ~5 kJ mol^-1 between the reagents and the OH-C2H2 adduct. For propyne, there are no previous measurements below 253 K, and rate coefficients of k4 = (5.08 +/- 0.65), (5.02 +/- 1.11) and (3.11 +/- 0.09) x 10^-12 cm^3 molecule^-1 s^-1 were obtained at T = 69, 86 and 299 K, indicating a much weaker temperature dependence than for ethene. The rate coefficient k1 = (7.8 +/- 2.5) x 10^-11 cm^3 molecule^-1 s^-1 was obtained for the reaction of OH with t-butyl hydroperoxide at T = 86 K.
Studies of the reaction of OH with benzene and toluene yielded complex kinetic profiles of OH from which rate coefficients could not be extracted. Uncertainties are quoted at the 95% confidence limit and include systematic errors.
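In experiments of this kind, the co-reagent is held in large excess, so the OH signal decays as exp(-k't) with pseudo-first-order rate k' = k[reagent], and k' is recovered from the slope of a log-linear fit to the decay trace. A minimal sketch in Python, using the 69 K ethene rate coefficient quoted above; the ethene concentration and time grid are illustrative assumptions, not values from the study:

```python
import math

# Rate coefficient for OH + ethene at 69 K, from the abstract (cm^3 molecule^-1 s^-1)
K2_69K = 3.22e-11
# Hypothetical excess ethene concentration (molecule cm^-3) -- an illustrative assumption
C2H4 = 1.0e14

# Pseudo-first-order decay rate of OH: k' = k2 * [C2H4]
k_prime = K2_69K * C2H4  # s^-1

# Simulate a noiseless OH fluorescence decay trace, [OH](t) proportional to exp(-k' t)
times = [i * 1.0e-5 for i in range(1, 51)]           # s
signal = [math.exp(-k_prime * t) for t in times]

# Recover k' from the slope of ln(signal) versus t by least squares
logs = [math.log(s) for s in signal]
n = len(times)
t_mean = sum(times) / n
y_mean = sum(logs) / n
slope = (sum((t - t_mean) * (y - y_mean) for t, y in zip(times, logs))
         / sum((t - t_mean) ** 2 for t in times))
k_recovered = -slope  # equals k_prime for a noiseless trace
```

Repeating this at several reagent concentrations and plotting k' against concentration gives the bimolecular rate coefficient as the slope, which is how values such as k2 above are obtained in practice.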

    Estuarine research report 41

    A review of management documents and peer-reviewed literature was undertaken to evaluate the level of protection intertidal shellfish are given from vehicle and horse users on sand beaches. Database searches were conducted to find policies relating to vehicle and/or horse management on sand beaches. Using findings from the peer-reviewed literature, policies were assessed for how they could affect shellfish populations. For example, policies that concentrate vehicle traffic into specific areas containing shellfish were considered to have negative impacts, because the literature has shown that heavy traffic has detrimental effects. Internationally, policies controlling vehicle and horse users rely on five common options: complete bans, seasonal closures, permits, area-based designation and zone-based designation. These management options usually focus on preventing erosion and ensuring the safety of users, with little consideration of ecological impacts. Where ecology is considered, attention concentrates on protecting the more visible species (e.g. nesting birds) rather than infaunal biota. Shellfish were not directly mentioned in any management policies that control vehicle and horse users. Shellfish in New Zealand are protected similarly to those in the rest of the world, and no policies are designed to benefit these animals directly. Vehicle and horse users on sand beaches are controlled through bylaws, whose creation and implementation depend on each local authority; management of these users therefore does not occur uniformly across New Zealand regions. Where bylaws are in place, they generally confine vehicle and horse users to the intertidal zone, an area in which shellfish such as tuatua (Paphies donacina) and toheroa (P. ventricosa) are abundant. Seasonal beach restrictions are also generally rare, and the amount and type of traffic on a beach are largely unregulated.
To protect intertidal species such as tuatua successfully, scientific information is needed that identifies and describes their distribution, their vulnerable life stages and the relationship between beach traffic and shellfish vulnerability.

    Triple collisions (e+p+Be7) in solar plasma

    Several nuclear reactions involving the Be7 nucleus, not included in the standard model of the pp-chain, are discussed. A qualitative analysis is given of their possible influence on the fate of Be7 in the solar plasma and of their role in the interpretation of the solar neutrino experiments. As an example, the rate of the nonradiative production of B8 in the triple collision p + e^- + Be7 ---> B8 + e^- is estimated in the framework of the adiabatic approximation. For solar interior conditions, the triple-collision reaction rate is approximately 10^{-4} of that for the binary process p + Be7 ---> B8 + gamma.

    Architectural mismatch tolerance

    The integrity of complex software systems built from existing components increasingly depends on the integrity of the mechanisms used to interconnect these components and, in particular, on the ability of these mechanisms to cope with architectural mismatches that might exist between components. There is a need to detect and handle (i.e. to tolerate) architectural mismatches at runtime, because in the majority of practical situations it is impossible to localize and correct all such mismatches at development time. When developing complex software systems, the problem is not only to identify the appropriate components, but also to make sure that these components are interconnected in a way that allows mismatches to be tolerated. The resulting architectural solution should be a system based on the existing components, which are independent in their nature but able to interact in well-understood ways. To find such a solution we apply general principles of fault tolerance to dealing with architectural mismatches.

    The Effect of Age, Gender, and Previous Gaming Experience on Game Play Performance


    From E_8 to F via T

    We argue that T-duality and F-theory appear automatically in the E_8 gauge bundle perspective of M-theory. The 11-dimensional supergravity four-form determines an E_8 bundle. If we compactify on a two-torus, this data specifies an LLE_8 bundle, where LG denotes a centrally extended loop group of G. If one of the circles of the torus is smaller than sqrt(alpha'), then it is also smaller than a nontrivial circle S in the LLE_8 fiber, and so a dimensional reduction on the total space of the bundle is not valid. We conjecture that S is the circle on which the T-dual type IIB theory is compactified, with the aforementioned torus playing the role of the F-theory torus. As tests we reproduce the T-dualities between NS5-branes and KK-monopoles, as well as between D6- and D7-branes, where we find the desired F-theory monodromy. Using Hull's proposal for massive IIA, this realization of T-duality allows us to confirm that the Romans mass is the central extension of our LE_8. In addition, this construction immediately reproduces the conjectured formula for global topology change from T-duality with H-flux.

    Return-Volatility Relationship: Insights from Linear and Non-Linear Quantile Regression

    The purpose of this paper is to examine the asymmetric relationship between price and implied volatility, and the associated extreme quantile dependence, using linear and non-linear quantile regression approaches. Our goal is to demonstrate, using quantile regressions, that the relationship between volatility and market return as quantified by ordinary least squares (OLS) regression is not uniform across the distribution of the volatility-price return pairs. We examine the bivariate relationship of six volatility-return pairs, viz. CBOE-VIX and S&P-500, FTSE-100 Volatility and FTSE-100, NASDAQ-100 Volatility (VXN) and NASDAQ, DAX Volatility (VDAX) and DAX-30, CAC Volatility (VCAC) and CAC-40, and STOXX Volatility (VSTOXX) and STOXX. The assumption of a normal distribution in the return series is not appropriate when the distribution is skewed, and hence OLS does not capture the complete picture of the relationship. Quantile regression, on the other hand, can be set up with various loss functions, both parametric and non-parametric (the linear case), and can be evaluated with skewed marginal-based copulas (the non-linear case), which helps in evaluating the non-normal and non-linear nature of the relationship between price and volatility. In the empirical analysis we compare the results from linear quantile regression (LQR) and copula-based non-linear quantile regression, known as copula quantile regression (CQR). The discussion of the properties of the volatility series and the empirical findings in this paper have significance for portfolio optimization, hedging strategies, trading strategies and risk management in general.
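Linear quantile regression replaces the squared-error loss of OLS with the tilted-absolute ("pinball") loss, so the fitted line tracks a chosen conditional quantile rather than the conditional mean. A minimal self-contained sketch of that idea, fitted by plain subgradient descent; the learning rate and iteration counts are illustrative assumptions, and a real analysis would use a dedicated solver such as the one in statsmodels:

```python
def pinball_loss(y_true, y_pred, q):
    """Average tilted-absolute (check) loss at quantile q."""
    total = 0.0
    for yt, yp in zip(y_true, y_pred):
        e = yt - yp
        total += q * e if e >= 0 else (q - 1.0) * e
    return total / len(y_true)

def fit_quantile_line(x, y, q, lr=0.5, epochs=4000):
    """Fit y ~ a + b*x at quantile q by subgradient descent on the pinball loss."""
    a, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        ga = gb = 0.0
        for xi, yi in zip(x, y):
            e = yi - (a + b * xi)
            # subgradient of the pinball loss with respect to the prediction
            g = -q if e >= 0 else (1.0 - q)
            ga += g
            gb += g * xi
        a -= lr * ga / n
        b -= lr * gb / n
    return a, b
```

With constant x the intercept converges to the sample q-quantile, which is the defining property of the loss; fitting at, say, q = 0.05 and q = 0.95 traces the lower and upper tails where the return-volatility asymmetry discussed above appears.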

    The toxbox: specific DNA sequence requirements for activation of Vibrio cholerae virulence genes by ToxT

    The Gram-negative, curved rod Vibrio cholerae causes the severe diarrhoeal disease cholera. The two major virulence factors produced by V. cholerae during infection are cholera toxin (CT) and the toxin-coregulated pilus (TCP). Transcription of the genes encoding both CT and the components of the TCP is directly activated by ToxT, a transcription factor in the AraC/XylS family. ToxT binds upstream of the ctxAB genes, encoding CT, and upstream of tcpA, the first gene in a large operon encoding the components of the TCP. The DNA sequences upstream of ctxAB and tcpA that contain ToxT binding sites share no significant similarity other than being AT-rich. Extensive site-directed mutagenesis was performed on the region upstream of tcpA previously shown to be protected by ToxT, and we identified specific base pairs important for activation of tcpA transcription by ToxT. This genetic approach was complemented by copper-phenanthroline footprinting experiments, which showed protection by ToxT of the base pairs identified as most important for transcription activation in the mutagenesis experiments. Based on this new information and on previous work, we propose the presence of a ToxT-binding motif, the 'toxbox', in promoters regulated by ToxT. At tcpA, two toxbox elements are present in a direct-repeat configuration, and both are required for activation of transcription by ToxT. The identity of only a few of the base pairs within the toxbox is important for activation by ToxT, and we term these the core toxbox elements. Lastly, we examined ToxT binding to a mutant having 5 bp inserted between the two toxboxes at tcpA and found that occupancy of both binding sites is retained regardless of the positions of the binding sites relative to each other on the face of the DNA.
This suggests that ToxT binds independently as a monomer to each toxbox in the tcpA direct repeat, in accordance with what we observed previously with the inverted-repeat ToxT sites between acfA and acfD.

    Simple extensions of reflection subgroups of primitive complex reflection groups

    If G is a finite primitive complex reflection group, all reflection subgroups of G and their inclusions are determined up to conjugacy. As a consequence, it is shown that if the rank of G is n and if G can be generated by n reflections, then for every set R of n reflections which generate G, every subset of R generates a parabolic subgroup of G.

    Kernel density classification and boosting: an L2 analysis

    Kernel density estimation is a commonly used approach to classification. However, most of the theoretical results for kernel methods apply to estimation per se and not necessarily to classification. In this paper we show that when estimating the difference between two densities, the optimal smoothing parameters are increasing functions of the sample size of the complementary group, and we provide a small simulation study which examines the relative performance of kernel density methods when the final goal is classification. A relative newcomer to the classification portfolio is “boosting”, and this paper proposes an algorithm for boosting kernel density classifiers. We note that boosting is closely linked to a previously proposed method of bias reduction in kernel density estimation and indicate how it will enjoy similar properties for classification. We show that boosting kernel classifiers reduces the bias whilst only slightly increasing the variance, with an overall reduction in error. Numerical examples and simulations are used to illustrate the findings, and we also suggest further areas of research.
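The kernel density route to classification estimates each class density separately and assigns a point to the class with the larger prior-weighted density estimate. A minimal one-dimensional sketch with a Gaussian kernel; the bandwidths here are fixed by hand for illustration, whereas the paper's point is precisely that the classification-optimal bandwidth for one group grows with the sample size of the other group, which a density-estimation plug-in rule would miss:

```python
import math

def gaussian_kernel(u):
    """Standard normal density, used as the smoothing kernel."""
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def kde(x, sample, h):
    """Kernel density estimate at x from a 1-D sample with bandwidth h."""
    return sum(gaussian_kernel((x - s) / h) for s in sample) / (len(sample) * h)

def kde_classify(x, sample0, sample1, h0, h1):
    """Assign x to the class whose prior-weighted density estimate is larger."""
    n0, n1 = len(sample0), len(sample1)
    prior0 = n0 / (n0 + n1)
    prior1 = n1 / (n0 + n1)
    return 0 if prior0 * kde(x, sample0, h0) >= prior1 * kde(x, sample1, h1) else 1
```

A boosted variant along the lines the abstract describes would reweight or rescale these density estimates over several rounds, driving down the bias of the decision boundary; the classifier above is only the base learner for such a scheme.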