
    Autophagy generates citrullinated peptides in human synoviocytes: a possible trigger for anti-citrullinated peptide antibodies

    OBJECTIVES: Autophagy may represent a functional processing event that creates a substrate for autoreactivity. In particular, autophagy may play a role in the pathogenesis of rheumatoid arthritis (RA), since autophagy is a key cellular event involved in the generation of citrullinated peptides, with consequent breakage of tolerance. Thus, in RA, autophagy may be the common feature in several situations (including smoking, joint injury and infection) that may drive the adaptive responses to citrullinated self-proteins. The aim of this study was to analyse, in vitro, the role of autophagy in the generation of citrullinated peptides and, in vivo, the relationship between autophagy and the production of anti-CCP antibodies (Abs). METHODS: For autophagy induction, fibroblast-like synoviocytes, primary fibroblasts and monocytes were stimulated with tunicamycin or rapamycin. Peptidyl arginine deiminase activity was tested by enzyme-linked immunosorbent assay, and protein citrullination was evaluated by western blotting. The main citrullinated RA candidate antigens, vimentin, α-enolase and filaggrin, were identified by immunoprecipitation. The relationship between autophagy and anti-CCP Abs was analysed in 30 early-active RA patients. RESULTS: Our results demonstrated, in vitro, a role for autophagy in the citrullination process. Cells treated with tunicamycin or rapamycin showed peptidyl arginine deiminase 4 activation, with consequent protein citrullination. Immunoblotting and immunoprecipitation experiments, using specific Abs, identified the main citrullinated proteins: vimentin, α-enolase and filaggrin. In vivo, a significant association between levels of autophagy and anti-CCP Abs was observed in treatment-naïve early-active RA patients. CONCLUSION: These findings support the view that the processing of proteins in autophagy generates citrullinated peptides recognized by the immune system in RA.

    Population stability: regulating size in the presence of an adversary

    We introduce a new coordination problem in distributed computing that we call the population stability problem. A system of agents, each with limited memory and communication as well as the ability to replicate and self-destruct, is subjected to attacks by a worst-case adversary that can, at a bounded rate, (1) delete agents chosen arbitrarily and (2) insert additional agents with arbitrary initial state into the system. The goal is perpetually to maintain a population whose size is within a constant factor of the target size N. The problem is inspired by the ability of complex biological systems, composed of a multitude of memory-limited individual cells, to maintain a stable population size in an adverse environment. Such biological mechanisms allow organisms to heal after trauma or to recover from excessive cell proliferation caused by inflammation, disease, or normal development. We present a population stability protocol in a communication model that is a synchronous variant of the population model of Angluin et al.: in each round, randomly selected pairs of agents meet and exchange messages, with at least a constant fraction of agents matched per round. Our protocol uses three-bit messages and ω(log^2 N) states per agent. We emphasize that our protocol can handle an adversary that can both insert and delete agents, a setting in which existing approximate counting techniques do not seem to apply. The protocol relies on a novel coloring strategy in which the population size is encoded in the variance of the distribution of colors. Individual agents can locally obtain a weak estimate of the population size by sampling from the distribution, and make individual decisions that robustly maintain a stable global population size.
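    The closing idea, a weak size estimate obtained by sampling colors, can be made concrete with a toy sketch. The sketch below is not the paper's protocol: it assumes a hypothetical encoding in which the population size N sets the parameter, and hence the variance, of a Bernoulli color distribution, and it lets a single agent invert the empirical statistics of a small peer sample. The encoding p = 1/log2(N) is an illustrative assumption only.

```python
import math
import random

def make_population(n, encode):
    # Toy stand-in for the coloring strategy: each agent holds a binary
    # color drawn with P(color = 1) = encode(n), so the color distribution
    # (its mean and variance alike) carries information about n.
    p = encode(n)
    return [1 if random.random() < p else 0 for _ in range(n)]

def estimate_size(colors, m, decode):
    # A single agent samples m random peers and inverts the empirical mean.
    p_hat = sum(random.choice(colors) for _ in range(m)) / m
    return decode(p_hat)

# Hypothetical encoding p = 1/log2(N), inverted as N_hat = 2**(1/p_hat).
encode = lambda n: 1.0 / math.log2(n)
decode = lambda p: 2.0 ** (1.0 / max(p, 1e-9))

colors = make_population(4096, encode)
print(estimate_size(colors, m=2000, decode=decode))  # same order of magnitude as 4096
```

    Even in this idealized form the estimate is accurate only to within a small factor, which mirrors the "weak estimate" the abstract describes.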

    Full Counting Statistics of Non-Commuting Variables: the Case of Spin Counts

    We discuss the full counting statistics (FCS) of non-commuting variables, taking the measurement of successive spin counts in non-collinear directions as an example. We show that, owing to an irreducible detector back-action, the FCS in this case may be sensitive to the dynamics of the detectors, and may differ from the predictions obtained using a naive version of the projection postulate. We present a general model of detector dynamics and a path-integral approach to the evaluation of the FCS. We then concentrate on a simple "diffusive" model of the detector dynamics, where the FCS can be evaluated with a transfer-matrix method. The resulting probability distribution of spin counts is characterized by anomalously large higher cumulants and deviates substantially from Gaussian statistics.
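    For orientation, the textbook, detector-free definition of full counting statistics expresses the cumulants of the count distribution through a generating function (this is the naive picture that the detector back-action discussed above modifies):

```latex
\chi(\lambda) = \sum_{N} P(N)\, e^{i\lambda N},
\qquad
C_k = \left. \frac{\partial^{k} \ln \chi(\lambda)}{\partial (i\lambda)^{k}} \right|_{\lambda = 0}.
```

    For Gaussian statistics all cumulants C_k with k > 2 vanish, so anomalously large higher cumulants directly quantify the departure from a Gaussian distribution.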

    Freeze or flee?: negative stimuli elicit selective responding

    Humans preferentially attend to negative stimuli. A consequence of this automatic vigilance for negative valence is that negative words elicit slower responses than neutral or positive words on a host of cognitive tasks. Some researchers have speculated that negative stimuli elicit a general suppression of motor activity, akin to the freezing response exhibited by animals under threat. Alternatively, we suggest that negative stimuli only elicit slowed responding on tasks for which stimulus valence is irrelevant for responding. To discriminate between these motor suppression and response-relevance hypotheses, we elicited both lexical decisions and valence judgments of negative words and positive words. Relative to positive words (e.g., kitten), negative words (e.g., spider) elicited slower lexical decisions but faster valence judgments. Results therefore indicate that negative stimuli do not cause a generalized motor suppression. Rather, negative stimuli elicit selective responding, with faster responses on tasks for which stimulus valence is response-relevant.

    On EPR paradox, Bell's inequalities and experiments which prove nothing

    This article shows that there is no paradox. Violation of Bell's inequalities should not be identified with a proof of non-locality in quantum mechanics. A number of past experiments are reviewed, and it is concluded that the experimental results should be re-evaluated. The results of the experiments with atomic cascades are shown not to contradict local realism. The article points out flaws in the experiments with down-converted photons. The experiments with a neutron interferometer measuring "contextuality" and Bell-like inequalities are analyzed, and it is shown that the experimental results can be explained without such notions. An alternative experiment is proposed to prove the validity of local realism.
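    For reference, the CHSH form of Bell's inequality bounds the combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b') by |S| <= 2 in any local realistic model, while quantum mechanics allows up to 2*sqrt(2). The Monte Carlo sketch below uses a generic deterministic hidden-variable model (chosen for illustration, not a model proposed in the article) and saturates, but never exceeds, the local bound:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = rng.uniform(0.0, 2.0 * np.pi, 1_000_000)  # shared hidden variable

def E(a, b):
    # Deterministic local outcomes: each side depends only on its own
    # analyzer angle and the shared hidden variable lam.
    A = np.sign(np.cos(lam - a))
    B = -np.sign(np.cos(lam - b))
    return np.mean(A * B)

a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(abs(S))          # ~2.0: the local-realistic (CHSH) bound is saturated
print(2 * np.sqrt(2))  # ~2.83: the quantum-mechanical (Tsirelson) maximum
```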

    Investigating the impact of nicotine on executive functions using a novel virtual reality assessment

    Aims: Nicotine is known to enhance aspects of cognitive functioning in abstinent smokers, but its effects on specific areas of executive function, and in non-smokers, are inconclusive. This may be due in part to the poor sensitivity of tests used to assess executive functions. This study used a new virtual reality assessment of executive functions, JEF (the Jansari assessment of Executive Functions), to address this issue. Design: 2x2 design manipulating group (smokers and never-smokers) and drug (nicotine [4 mg for smokers; 2 mg for never-smokers] vs placebo gum). Setting: School of Psychology, University of East London. Participants: 72 participants (aged 18 to 54): 36 minimally deprived (2 hr) smokers and 36 never-smokers. Measurements: Components of executive function were measured using the virtual reality paradigm JEF, which assesses eight cognitive constructs simultaneously as well as providing an overall performance measure. Results: Univariate ANOVAs revealed that nicotine improved overall JEF performance, time-based prospective memory and event-based prospective memory in smokers (p < 0.01) but not in never-smokers. Action-based prospective memory was enhanced in both groups (p < 0.01), and never-smokers out-performed smokers on selective thinking and adaptive thinking (p < 0.01). Conclusions: Overall executive functioning and prospective memory can be enhanced by nicotine gum in abstinent smokers. That smokers were only minimally deprived suggests that JEF is a sensitive measure of executive functioning and that prospective memory is particularly susceptible to disruption by abstinence.

    Coherent States for Canonical Quantum General Relativity and the Infinite Tensor Product Extension

    We summarize a recently proposed concrete programme for investigating the (semi)classical limit of canonical, Lorentzian, continuum quantum general relativity in four spacetime dimensions. The analysis is based on a novel set of coherent states labelled by graphs. These fit neatly together with an Infinite Tensor Product (ITP) extension of the currently used Hilbert space. The ITP construction enables us to give rigorous meaning to the infinite volume (thermodynamic) limit of the theory, which has been out of reach so far.

    Preparation and Measurement of Three-Qubit Entanglement in a Superconducting Circuit

    Traditionally, quantum entanglement has played a central role in foundational discussions of quantum mechanics. The measurement of correlations between entangled particles can exhibit results at odds with classical behavior. These discrepancies increase exponentially with the number of entangled particles. When entanglement is extended from just two quantum bits (qubits) to three, the incompatibilities between classical and quantum correlation properties can change from a violation of inequalities involving statistical averages to sign differences in deterministic observations. With the ample confirmation of quantum mechanical predictions by experiments, entanglement has evolved from a philosophical conundrum to a key resource for quantum-based technologies, like quantum cryptography and computation. In particular, maximal entanglement of more than two qubits is crucial to the implementation of quantum error correction protocols. While entanglement of up to 3, 5, and 8 qubits has been demonstrated among spins, photons, and ions, respectively, entanglement in engineered solid-state systems has been limited to two qubits. Here, we demonstrate three-qubit entanglement in a superconducting circuit, creating Greenberger-Horne-Zeilinger (GHZ) states with fidelity of 88%, measured with quantum state tomography. Several entanglement witnesses show violation of bi-separable bounds by 830 ± 80%. Our entangling sequence realizes the first step of basic quantum error correction, namely the encoding of a logical qubit into a manifold of GHZ-like states using a repetition code. The integration of encoding, decoding and error-correcting steps in a feedback loop will be the next milestone for quantum computing with integrated circuits.
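    In the idealized circuit picture, a three-qubit GHZ state is prepared by a Hadamard followed by two CNOTs. The numpy sketch below reproduces only the target state (|000> + |111>)/sqrt(2); the experiment itself realizes the sequence with microwave pulses on superconducting qubits, not with abstract gate matrices:

```python
import numpy as np

I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],                  # CNOT, control on the
                 [0, 1, 0, 0],                  # first qubit of the pair
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(8)
state[0] = 1.0                                  # start in |000>
state = np.kron(np.kron(H, I), I) @ state       # H on qubit 0
state = np.kron(CNOT, I) @ state                # CNOT: control 0, target 1
state = np.kron(I, CNOT) @ state                # CNOT: control 1, target 2
print(np.round(state, 3))  # amplitude 0.707 on |000> and |111>, zero elsewhere
```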

    Monoidal computer III: A coalgebraic view of computability and complexity

    Monoidal computer is a categorical model of intensional computation, where many different programs correspond to the same input-output behavior. The upshot of yet another model of computation is that a categorical formalism should provide a much-needed high-level language for the theory of computation, flexible enough to allow abstracting away the low-level implementation details when they are irrelevant, or taking them into account when they are genuinely needed. A salient feature of the approach through monoidal categories is the formal graphical language of string diagrams, which supports visual reasoning about programs and computations. In the present paper, we provide a coalgebraic characterization of monoidal computer. It turns out that the availability of interpreters and specializers, which make a monoidal category into a monoidal computer, is equivalent to the existence of a *universal state space*, which carries a weakly final state machine for any pair of input and output types. Being able to program state machines in monoidal computers allows us to represent Turing machines, to capture their execution, to count their steps, and to track, e.g., the memory cells that they use. The coalgebraic view of monoidal computer thus provides a convenient diagrammatic language for studying computability and complexity.
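    A minimal operational sketch of that last point, with all the categorical structure stripped away: a runner that executes a Turing machine while counting its steps and the memory cells it touches. The bit-flipping transition table is an invented example, not a construction from the paper:

```python
from typing import Dict, Tuple

# A transition table maps (state, symbol) -> (new state, written symbol, move).
# Hypothetical example machine: flip every bit, halt on the first blank.
FLIP: Dict[Tuple[str, str], Tuple[str, str, int]] = {
    ("scan", "0"): ("scan", "1", +1),
    ("scan", "1"): ("scan", "0", +1),
    ("scan", "_"): ("halt", "_", 0),
}

def run(delta, tape, state="scan", blank="_", fuel=10_000):
    # Execute the machine, capturing the resources the text mentions:
    # the number of steps taken and the number of memory cells used.
    cells = {i: s for i, s in enumerate(tape)}
    head, steps, touched = 0, 0, set()
    while state != "halt" and steps < fuel:
        sym = cells.get(head, blank)
        state, cells[head], move = delta[(state, sym)]
        touched.add(head)
        head += move
        steps += 1
    output = "".join(cells[i] for i in sorted(cells))
    return output, steps, len(touched)

print(run(FLIP, "0110"))  # ('1001_', 5, 5)
```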