776 research outputs found

    Using exomarkers to assess mitochondrial reactive species in vivo

    Background: The ability to measure the concentrations of small damaging and signalling molecules such as reactive oxygen species (ROS) in vivo is essential to understanding their biological roles. While a range of methods can be applied to in vitro systems, measuring the levels and relative changes in reactive species in vivo is challenging. Scope of review: One approach towards achieving this goal is the use of exomarkers. In this, exogenous probe compounds are administered to the intact organism and are then transformed by the reactive molecules in vivo to produce a diagnostic exomarker. The exomarker and the precursor probe can be analysed ex vivo to infer the identity and amounts of the reactive species present in vivo. This is akin to the measurement of biomarkers produced by the interaction of reactive species with endogenous biomolecules. Major conclusions and general significance: Our laboratories have developed mitochondria-targeted probes that generate exomarkers that can be analysed ex vivo by mass spectrometry to assess levels of reactive species within mitochondria in vivo. We have used one of these compounds, MitoB, to infer the levels of mitochondrial hydrogen peroxide within flies and mice. Here we describe the development of MitoB and expand on this example to discuss how better probes and exomarkers can be developed. This article is part of a Special Issue entitled Current methods to study reactive oxygen species - pros and cons and biophysics of membrane proteins. Guest Editor: Christine Winterbourn. Abbreviations: EPR, electron paramagnetic resonance; GFP, green fluorescent protein; 4-HNE, 4-hydroxynonenal; MitoB, 3-(dihydroxyboronyl)benzyltriphenylphosphonium bromide; MitoP, (3-hydroxybenzyl)triphenylphosphonium bromide; ROS, reactive oxygen species; SOD, superoxide dismutase; TPMP, methyltriphenylphosphonium; TPP, triphenylphosphonium cation
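    The inference step behind a probe like MitoB is simple arithmetic: with the probe in large excess, the exomarker/probe ratio grows roughly linearly with the average reactive-species concentration and exposure time. A minimal sketch of that calculation, where the function name, the rate constant k2, and all numbers are illustrative assumptions rather than values from the abstract:

```python
def h2o2_from_exomarker(mitop: float, mitob: float, k2: float, t: float) -> float:
    """Infer an average H2O2 concentration (M) from a MitoP/MitoB ratio.

    Assumes the probe (MitoB) is in large excess, so its consumption is
    negligible and conversion to the exomarker (MitoP) is pseudo-first-order:
        [MitoP]/[MitoB] ~= k2 * [H2O2] * t
    mitop, mitob : measured amounts ex vivo (same units, e.g. pmol by MS)
    k2           : assumed second-order rate constant (M^-1 s^-1)
    t            : exposure time in seconds
    """
    ratio = mitop / mitob
    return ratio / (k2 * t)
```

With hypothetical inputs (1 pmol exomarker, 1000 pmol probe, k2 = 4 M⁻¹s⁻¹, 6 h exposure) this yields an average concentration on the order of 10⁻⁸ M; the point is the ratiometric structure of the inference, not the specific numbers.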

    Leibniz's Infinitesimals: Their Fictionality, Their Modern Implementations, And Their Foes From Berkeley To Russell And Beyond

    Many historians of the calculus deny significant continuity between the infinitesimal calculus of the 17th century and 20th century developments such as Robinson's theory. Robinson's hyperreals, while providing a consistent theory of infinitesimals, require the resources of modern logic; thus many commentators are comfortable denying a historical continuity. A notable exception is Robinson himself, whose identification with the Leibnizian tradition inspired Lakatos, Laugwitz, and others to consider the history of the infinitesimal in a more favorable light. In spite of his Leibnizian sympathies, Robinson regards Berkeley's criticisms of the infinitesimal calculus as aptly demonstrating the inconsistency of reasoning with historical infinitesimal magnitudes. We argue that Robinson, among others, overestimates the force of Berkeley's criticisms by underestimating the mathematical and philosophical resources available to Leibniz. Leibniz's infinitesimals are fictions, not logical fictions, as Ishiguro proposed, but rather pure fictions, like imaginaries, which are not eliminable by some syncategorematic paraphrase. We argue that Leibniz's defense of infinitesimals is more firmly grounded than Berkeley's criticism thereof. We show, moreover, that Leibniz's system for differential calculus was free of logical fallacies. Our argument strengthens the conception of modern infinitesimals as a development of Leibniz's strategy of relating inassignable to assignable quantities by means of his transcendental law of homogeneity. Comment: 69 pages, 3 figures
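    The transcendental law of homogeneity mentioned above is often illustrated with the product rule: an inassignable term of higher order is discarded relative to the assignable terms. A standard textbook reconstruction of that move (not a derivation taken from this abstract):

```latex
d(uv) = (u + du)(v + dv) - uv
      = u\,dv + v\,du + du\,dv
      \approx u\,dv + v\,du
\quad\text{since } du\,dv \text{ is negligible relative to } u\,dv \text{ and } v\,du.
```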

    Identification and quantification of protein S-nitrosation by nitrite in the mouse heart during ischemia.

    Nitrate (NO3-) and nitrite (NO2-) are known to be cardioprotective and to alter energy metabolism in vivo. NO3- action results from its conversion to NO2- by salivary bacteria, but the mechanism(s) by which NO2- affects metabolism remains obscure. NO2- may act by S-nitrosating protein thiols, thereby altering protein activity. But how this occurs, and the functional importance of S-nitrosation sites across the mammalian proteome, remain largely uncharacterized. Here we analyzed protein thiols within mouse hearts in vivo using quantitative proteomics to determine S-nitrosation site occupancy. We extended the thiol-redox proteomic technique, isotope-coded affinity tag labeling, to quantify the extent of NO2--dependent S-nitrosation of protein thiols in vivo. Using this approach, called SNOxICAT (S-nitrosothiol redox isotope-coded affinity tag), we found that exposure to NO2- under normoxic conditions or exposure to ischemia alone results in minimal S-nitrosation of protein thiols. However, exposure to NO2- in conjunction with ischemia led to extensive S-nitrosation of protein thiols across all cellular compartments. Several mitochondrial protein thiols exposed to the mitochondrial matrix were selectively S-nitrosated under these conditions, potentially contributing to the beneficial effects of NO2- on mitochondrial metabolism. The permeability of the mitochondrial inner membrane to HNO2, but not to NO2-, combined with the lack of S-nitrosation during anoxia alone or by NO2- during normoxia places constraints on how S-nitrosation occurs in vivo and on its mechanisms of cardioprotection and modulation of energy metabolism. Quantifying S-nitrosated protein thiols now allows determination of modified cysteines across the proteome and identification of those most likely responsible for the functional consequences of NO2- exposure
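    Site occupancy in an ICAT-style differential labelling experiment is a ratio of two channel intensities per cysteine. A minimal sketch of that arithmetic under the usual convention that free thiols take the light label and originally modified (then reduced) thiols take the heavy label; the function name and convention are illustrative, not taken from the SNOxICAT paper:

```python
def site_occupancy(light: float, heavy: float) -> float:
    """Fraction of a cysteine carrying the reversible modification.

    light : signal for the light-labelled (initially free) thiol form
    heavy : signal for the heavy-labelled (initially modified) thiol form
    Occupancy = modified / (modified + unmodified).
    """
    total = light + heavy
    if total == 0:
        raise ValueError("no signal observed for this peptide")
    return heavy / total
```

For example, a peptide observed with a 3:1 light-to-heavy intensity ratio corresponds to 25% occupancy of the modification at that cysteine.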

    Ten Misconceptions from the History of Analysis and Their Debunking

    The widespread idea that infinitesimals were "eliminated" by the "great triumvirate" of Cantor, Dedekind, and Weierstrass is refuted by an uninterrupted chain of work on infinitesimal-enriched number systems. The elimination claim is an oversimplification created by triumvirate followers, who tend to view the history of analysis as a pre-ordained march toward the radiant future of Weierstrassian epsilontics. In the present text, we document distortions of the history of analysis stemming from the triumvirate ideology of ontological minimalism, which identified the continuum with a single number system. Such anachronistic distortions characterize the received interpretation of Stevin, Leibniz, d'Alembert, Cauchy, and others. Comment: 46 pages, 4 figures; Foundations of Science (2012). arXiv admin note: text overlap with arXiv:1108.2885 and arXiv:1110.545

    Oink: an Implementation and Evaluation of Modern Parity Game Solvers

    Parity games have important practical applications in formal verification and synthesis, especially for solving the model-checking problem of the modal mu-calculus. They are also interesting from the theory perspective, as they are widely believed to admit a polynomial solution, but so far no such algorithm is known. In recent years, a number of new algorithms and improvements to existing algorithms have been proposed. We implement Oink, a new and easy-to-extend tool that provides a high-performance implementation of modern parity game algorithms. We further present a comprehensive empirical evaluation of modern parity game algorithms and solvers, both on real-world benchmarks and randomly generated games. Our experiments show that our new tool Oink outperforms the current state of the art. Comment: Accepted at TACAS 201
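    Among the classical algorithms a solver like this implements is Zielonka's recursive algorithm, built on attractor computation. A compact, unoptimized Python sketch of that algorithm (the data layout is my own for illustration, not Oink's internal representation):

```python
def attractor(nodes, target, player, owner, succ):
    """Nodes in `nodes` from which `player` can force play into `target`."""
    attr = set(target)
    changed = True
    while changed:
        changed = False
        for v in nodes - attr:
            succs = [w for w in succ[v] if w in nodes]
            if owner[v] == player:
                # player picks the move: one successor in attr suffices
                ok = any(w in attr for w in succs)
            else:
                # opponent picks: every available move must land in attr
                ok = bool(succs) and all(w in attr for w in succs)
            if ok:
                attr.add(v)
                changed = True
    return attr

def zielonka(nodes, priority, owner, succ):
    """Return (W0, W1): the winning regions of players 0 and 1."""
    if not nodes:
        return set(), set()
    p = max(priority[v] for v in nodes)
    i = p % 2                                  # player favored by priority p
    top = {v for v in nodes if priority[v] == p}
    A = attractor(nodes, top, i, owner, succ)
    W = list(zielonka(nodes - A, priority, owner, succ))
    if not W[1 - i]:
        W[i], W[1 - i] = set(nodes), set()
    else:
        B = attractor(nodes, W[1 - i], 1 - i, owner, succ)
        W = list(zielonka(nodes - B, priority, owner, succ))
        W[1 - i] |= B
    return W[0], W[1]
```

On a toy game with two self-loop nodes of priorities 0 and 1, the even self-loop is won by player 0 and the odd one by player 1, as expected.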

    Semantic Labelling and Learning for Parity Game Solving in LTL Synthesis

    We propose "semantic labelling" as a novel ingredient for solving games in the context of LTL synthesis. It exploits recent advances in the automata-based approach, yielding more information for each state of the generated parity game than the game graph can capture. We utilize this extra information to improve standard approaches as follows. (i) Compared to strategy improvement (SI) with a random initial strategy, a more informed initialization often yields a winning strategy directly, without any computation. (ii) This initialization also makes SI yield smaller solutions. (iii) While Q-learning on the game graph turns out not to be very efficient, Q-learning with the semantic information becomes competitive with SI. Since even the simplest heuristics achieve significant improvements, the experimental results demonstrate the utility of semantic labelling. This extra information opens the door to more advanced learning approaches, both for initialization and improvement of strategies.
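    The Q-learning baseline referred to in (iii) is the standard tabular update Q(s,a) ← Q(s,a) + α·(r + γ·maxₐ' Q(s',a') − Q(s,a)). A minimal deterministic sketch on an abstract graph (the sweep-based training loop, state names, and rewards are illustrative choices, not the paper's setup):

```python
def q_learning(succ, reward, episodes=100, gamma=0.9, alpha=1.0):
    """Tabular Q-learning on a deterministic transition graph.

    succ[s]   : dict mapping action -> next state ({} for terminal states)
    reward[s] : reward received on entering state s
    With alpha=1 and deterministic transitions this reduces to
    asynchronous value iteration, so it converges exactly.
    """
    Q = {s: {a: 0.0 for a in succ[s]} for s in succ}
    for _ in range(episodes):
        for s in succ:                       # full sweep instead of sampling
            for a, s2 in succ[s].items():
                best_next = max(Q[s2].values(), default=0.0)
                target = reward[s2] + gamma * best_next
                Q[s][a] += alpha * (target - Q[s][a])
    return Q
```

On a two-step chain s0 → s1 → goal with reward 1 at the goal, the learned values are the discounted returns γ⁰·1 at s1 and γ¹·1 at s0.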

    LNCS

    Discrete-time Markov Chains (MCs) and Markov Decision Processes (MDPs) are two standard formalisms in system analysis. Their main associated quantitative objectives are hitting probabilities, discounted sum, and mean payoff. Although there are many techniques for computing these objectives in general MCs/MDPs, they have not been thoroughly studied in terms of parameterized algorithms, particularly when treewidth is used as the parameter. This is in sharp contrast to qualitative objectives for MCs, MDPs and graph games, for which treewidth-based algorithms yield significant complexity improvements. In this work, we show that treewidth can also be used to obtain faster algorithms for the quantitative problems. For an MC with n states and m transitions, we show that each of the classical quantitative objectives can be computed in O((n+m)⋅t²) time, given a tree decomposition of the MC with width t. Our results also imply a bound of O(κ⋅(n+m)⋅t²) for each objective on MDPs, where κ is the number of strategy-iteration refinements required for the given input and objective. Finally, we make an experimental evaluation of our new algorithms on low-treewidth MCs and MDPs obtained from the DaCapo benchmark suite. Our experiments show that on low-treewidth MCs and MDPs, our algorithms outperform existing well-established methods by one or more orders of magnitude.
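    Hitting probabilities, the first quantitative objective above, are characterized as the least fixpoint of x(s) = 1 for target states and x(s) = Σ_t P(s,t)·x(t) otherwise. A minimal fixpoint-iteration sketch for a general MC, with no treewidth exploitation (which is the paper's contribution) and an illustrative data layout of my own:

```python
def hitting_probabilities(P, target, iters=1000):
    """Probability of eventually reaching `target` from each state of an MC.

    P      : dict mapping state -> list of (next_state, probability) pairs
    target : set of goal states
    Plain fixpoint iteration; starting from 0 outside the target makes it
    converge to the least fixpoint, so states that cannot reach the
    target correctly stay at probability 0.
    """
    x = {s: (1.0 if s in target else 0.0) for s in P}
    for _ in range(iters):
        for s in P:
            if s not in target:
                x[s] = sum(p * xt for t, p in P[s] for xt in [x[t]])
    return x
```

For a state that moves with probability 1/2 to an absorbing target and 1/2 to an absorbing non-target, the computed hitting probability is 1/2, as expected.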

    Search for CP Violation in the Decay Z -> b (b bar) g

    About three million hadronic decays of the Z collected by ALEPH in the years 1991-1994 are used to search for anomalous CP violation beyond the Standard Model in the decay Z -> b \bar{b} g. The study is performed by analyzing angular correlations between the two quarks and the gluon in three-jet events and by measuring the differential two-jet rate. No signal of CP violation is found. For the combinations of anomalous CP-violating couplings, $\hat{h}_b = \hat{h}_{Ab}g_{Vb} - \hat{h}_{Vb}g_{Ab}$ and $h^{\ast}_b = \sqrt{\hat{h}_{Vb}^{2} + \hat{h}_{Ab}^{2}}$, limits of $\hat{h}_b < 0.59$ and $h^{\ast}_{b} < 3.02$ are given at 95% CL. Comment: 8 pages, 1 postscript figure, uses here.sty, epsfig.st
