1,592 research outputs found

    Psychiatric Disorders and lncRNAs: A Synaptic Match

    Psychiatric disorders represent a heterogeneous class of multifactorial mental diseases whose origin entails a pathogenic integration of genetic and environmental influences. The incidence of these pathologies is alarmingly high: more than 20% of the Western population is affected. Despite the diverse origins of the specific molecular dysfunctions, these pathologies entail disruption of fine synaptic regulation, which is fundamental to behavioral adaptation to the environment. Synapses, as the functional units of cognition, represent major evolutionary targets. Consistently, fine synaptic tuning occurs at several levels, involving a novel class of molecular regulators known as long non-coding RNAs (lncRNAs). Non-coding RNAs operate in mammals mainly as epigenetic modifiers and enhancers of proteome diversity. The prominent evolutionary expansion of lncRNA genes in mammals, particularly in primates and humans, together with their preferential neuronal expression, represents a driving force that enhanced the layering of synaptic control mechanisms. In the last few years, remarkable alterations of lncRNA expression have been reported in psychiatric conditions such as schizophrenia, autism, and depression, suggesting unprecedented mechanistic insights into the disruption of fine synaptic tuning that underlies severe behavioral manifestations of psychosis. In this review, we integrate literature data from rodent pathological models and human evidence, and propose the biology of lncRNAs as a promising field of neuropsychiatric investigation.

    Reexamination of continuous fuzzy measurement on two-level systems

    Imposing restrictions on the Feynman paths of the monitored system has in the past been proposed as a universal, model-free approach to continuous quantum measurements. Here we revisit this proposition and demonstrate that a Gaussian restriction, resulting in a sequence of many highly inaccurate (weak) von Neumann measurements, is not sufficiently strong to ensure proximity between a readout and the Feynman paths along which the monitored system evolves. Rather, in the continuous limit, the variations of a typical readout become much larger than the separation between the eigenvalues of the measured quantity. Thus, a typical readout is not represented by a nearly constant curve correlating with one of the eigenvalues of the measured quantity $\hat{A}$, even when decoherence or the Zeno effect is achieved for the observed two-level system, and it does not point directly to the system's final state. We show that the decoherence of a ``free'' system can be seen as induced by a Gaussian random walk with a drift, eventually directing the system towards one of the eigenstates of $\hat{A}$. A similar mechanism appears to be responsible for the Zeno effect in a driven system, when its Rabi oscillations are quenched by monitoring. Alongside the Gaussian case, which can only be studied numerically, we also consider a fully tractable model with a ``hard wall'' restriction and show the results to be similar.
    Funding: MINECO, European Regional Development Fund (FEDER), Grant No. FIS2015-67161-P (MINECO/FEDER) (D.S.), MINECO Grant No. SVP-2014-068451 (S.R.), MINECO Grant No. MTM2013-46553-C3-1-P (E.A.), SGI/IZOSGIker UPV/EHU, i2BASQUE academic network.
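
The picture the abstract paints — readouts whose spread dwarfs the eigenvalue gap while the state nonetheless drifts towards an eigenstate — can be illustrated with a toy Bayesian random walk. This is a sketch under our own assumptions (a classical hidden-eigenvalue model with invented parameters), not the authors' restricted-path formalism:

```python
import numpy as np

rng = np.random.default_rng(0)

def weak_monitoring(n_steps=20000, sigma=50.0):
    """Sequence of highly inaccurate (weak) measurements of a quantity A
    with eigenvalues +1/-1. The hidden eigenvalue is fixed at +1; each
    readout is smeared by a Gaussian error sigma much larger than the
    eigenvalue separation, and the observer's occupation probability is
    updated by Bayes' rule, performing a random walk with a drift."""
    p = 0.5                                   # initial P(eigenvalue = +1)
    readouts = np.empty(n_steps)
    for i in range(n_steps):
        r = rng.normal(1.0, sigma)            # inaccurate readout around +1
        readouts[i] = r
        # Likelihoods of the readout under the two eigenvalues
        l_up = np.exp(-(r - 1.0) ** 2 / (2 * sigma ** 2))
        l_dn = np.exp(-(r + 1.0) ** 2 / (2 * sigma ** 2))
        p = p * l_up / (p * l_up + (1 - p) * l_dn)
    return p, readouts

p_final, r = weak_monitoring()
print(f"final P(+1)    = {p_final:.4f}")    # slowly drifts towards an eigenstate
print(f"readout spread = {r.std():.1f}")    # far exceeds the eigenvalue gap of 2
```

Each individual readout is useless on its own (spread ≈ 50 against a gap of 2); only the accumulated drift of the posterior singles out an eigenstate, mirroring the abstract's point that a typical readout curve does not track an eigenvalue.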

    An optimal scaling to computationally tractable dimensionless models: Study of latex particles morphology formation

    In the modelling of chemical, physical or biological systems it may occur that the coefficients multiplying various terms in the equation of interest differ greatly in magnitude if a particular system of units is used. Such is, for instance, the case of the Population Balance Equations (PBE) proposed to model Latex Particles Morphology formation. The obvious way out of this difficulty is the use of dimensionless scaled quantities, although the scaling procedure is often not unique. In this paper, we introduce a conceptually new general approach, called Optimal Scaling (OS). The method is tested on known examples from classical and quantum mechanics, and applied to the Latex Particles Morphology model, where it allows us to reduce the variation of the relevant coefficients from 49 to just 4 orders of magnitude. The PBE are then solved by a novel Generalised Method Of Characteristics, and the OS is shown to help reduce numerical error and avoid unphysical behaviour of the solution. Although inspired by a particular application, the proposed scaling algorithm is expected to find application in a wide range of chemical, physical and biological problems.
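
The core idea — choosing scales so that the dimensionless coefficients span far fewer orders of magnitude — can be sketched as a least-squares problem in log-space. The coefficients and unit exponents below are invented for illustration; the paper's Optimal Scaling targets the full PBE model, not this toy:

```python
import numpy as np

# Toy model: three coefficients whose dimensionless versions are
# k_i = c_i * T^a_i * L^b_i for time and length scales T, L
# (hypothetical exponents, nothing to do with the actual PBE terms).
c = np.array([1e-12, 3e2, 5e9])          # raw coefficients, ~21 orders apart
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 2.0]])               # unit exponents of (T, L) per term

# One natural "optimal" choice: least squares in log-space,
# minimising sum_i (log10 c_i + A_i . x)^2 over x = (log10 T, log10 L).
x, *_ = np.linalg.lstsq(A, -np.log10(np.abs(c)), rcond=None)
k = np.log10(np.abs(c)) + A @ x          # log10 of the scaled coefficients

print("scales (log10):", x)
print("spread before:", np.ptp(np.log10(np.abs(c))), "orders of magnitude")
print("spread after: ", np.ptp(k), "orders of magnitude")
```

Even this crude log-least-squares choice compresses the coefficient spread from about 21 to under 9 orders of magnitude, illustrating why a principled scaling can make an otherwise stiff model computationally tractable.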

    Relative frequencies of constrained events in stochastic processes: An analytical approach

    The stochastic simulation algorithm (SSA) and the corresponding Monte Carlo (MC) method are among the most common approaches for studying stochastic processes. They rely on knowledge of the interevent probability density functions (PDFs) and on information about dependencies between all possible events. In many real-life applications, analytical representations of a PDF are difficult to specify in advance. Knowing the shapes of the PDFs, and using experimental data, different optimization schemes can be applied to evaluate the probability density functions and, therefore, the properties of the studied system. Such methods, however, are computationally demanding and often not feasible. We show that, in the case where the experimentally accessible properties are directly related to the frequencies of the events involved, it may be possible to replace the heavy Monte Carlo core of the optimization schemes with an analytical solution. Such a replacement not only provides a more accurate estimation of the properties of the process, but also reduces the simulation time by a factor of the order of the sample size (at least ≈10^4). The proposed analytical approach is valid for any choice of PDF. The accuracy, computational efficiency, and advantages of the method over MC procedures are demonstrated in an exactly solvable case and in the evaluation of branching fractions in the controlled radical polymerization (CRP) of acrylic monomers. This polymerization can be modeled by a constrained stochastic process. Constrained systems are quite common, and this makes the method useful for various applications.
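
An exactly solvable case of the kind mentioned above can be made concrete: for two competing events with exponential interevent PDFs, the relative frequency of each event is available in closed form, and a Monte Carlo run only reproduces it with sampling noise. The rates below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two competing events with exponential interevent PDFs. The relative
# frequency of event 1 is known analytically: P1 = k1 / (k1 + k2).
k1, k2 = 2.0, 5.0                        # illustrative rates
p1_exact = k1 / (k1 + k2)

# Monte Carlo estimate: sample both waiting times, record the winner.
n = 10**4
t1 = rng.exponential(1.0 / k1, n)
t2 = rng.exponential(1.0 / k2, n)
p1_mc = np.mean(t1 < t2)

print(f"analytical P1 = {p1_exact:.4f}")
print(f"MC estimate   = {p1_mc:.4f}  (n = {n} samples)")
```

The analytical value is exact and costs one division, while the MC estimate carries a statistical error of order n^(-1/2) and requires the full sample — the speed and accuracy gap the abstract exploits.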

    Preclinical and clinical evidence on the approach-avoidance conflict evaluation as an integrative tool for psychopathology

    The approach-avoidance conflict (AAC), i.e. the competing tendencies to undertake goal-directed actions or to withdraw from everyday life challenges, stands at the basis of human existence, defining behavioural and personality domains. Gray's Reinforcement Sensitivity Theory posits that a stable bias toward approach or avoidance represents a psychopathological trait associated with excessive sensitivity to reward or punishment. Optogenetic studies in rodents and imaging studies in humans, combined with cross-species AAC paradigms, have granted new emphasis to the hippocampus as a hub of behavioural inhibition. For instance, recent functional neuroimaging studies show that functional brain activity in the human hippocampus correlates with threat perception and seems to underlie passive avoidance. Therefore, our commentary aims to (i) discuss the inhibitory role of the hippocampus in approach-related behaviours and (ii) promote the integration of functional neuroimaging with cross-species AAC paradigms as a means of refining diagnosis, therapy, follow-up and prognosis in psychiatric populations.

    Impact of competitive processes on controlled radical polymerization

    The kinetics of radical polymerization have been systematically studied for nearly a century and are in general well understood. However, in light of recent developments in controlled radical polymerization, many kinetic anomalies have arisen. These unexpected results have largely been considered in isolation, and various, as yet inconclusive, debates about the cause of these anomalies are ongoing. Herein we present a new theory on the cause of the changes in kinetics under controlled radical polymerization conditions. We show that where fast, intermittent deactivation of radical species takes place, the relative rates of the competitive reactions that exist in radical polymerization can change. To highlight the applicability of the model, we demonstrate that it explains well the reduction of branching in acrylic polymers under RAFT polymerization. We further show that such a theory may explain various phenomena in controlled radical polymerization and may be exploited to design precise macromolecular architectures.

    New experimental limits on the alpha decays of lead isotopes

    For the first time, a PbWO4 crystal was grown from ancient Roman lead and operated as a cryogenic detector. Thanks to the simultaneous and independent read-out of heat and scintillation light, the detector was able to discriminate beta/gamma interactions from alpha particles down to low energies. New, more stringent limits on the alpha decays of the lead isotopes are presented. In particular, a limit of T_{1/2} > 1.4×10^20 y at 90% C.L. was obtained for the alpha decay of 204Pb to 200Hg.
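
How a half-life lower limit of this kind follows from a counting experiment can be sketched with the standard zero-signal formula. The exposure numbers below are purely hypothetical placeholders, not the experiment's actual crystal mass, live time, or efficiency:

```python
import numpy as np

# Generic half-life lower limit when no signal events are observed:
#   T_1/2 > ln(2) * N * t * eff / S,
# where S is the upper limit on signal counts. For zero observed events
# and no expected background, S = 2.44 at 90% C.L. (Feldman-Cousins).
N_nuclei = 1.0e23        # hypothetical number of 204Pb nuclei in the crystal
t_years = 0.5            # hypothetical live time in years
efficiency = 0.9         # hypothetical detection efficiency
S_limit = 2.44           # 90% C.L. upper limit on counts (0 observed, 0 bkg)

T12_limit = np.log(2) * N_nuclei * t_years * efficiency / S_limit
print(f"T_1/2 > {T12_limit:.2e} y at 90% C.L.")
```

The limit scales linearly with the number of candidate nuclei and the live time, which is why isotopically favourable source material and long cryogenic runs directly translate into more stringent bounds.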

    Background suppression in massive TeO2 bolometers with Neganov-Luke amplified light detectors

    Bolometric detectors are excellent devices for the investigation of neutrinoless double-beta decay (0νββ). The observation of such a decay would demonstrate the violation of lepton number and would necessarily imply that neutrinos have a Majorana character. The sensitivity of cryogenic detectors based on TeO2 is strongly limited by the alpha background in the region of interest for the 0νββ decay of 130Te. It has been demonstrated that particle discrimination in TeO2 bolometers is possible by measuring the Cherenkov light produced by particle interactions. However, event-by-event discrimination with NTD-based light detectors had yet to be demonstrated. We discuss the performance of a highly sensitive light detector exploiting the Neganov-Luke effect for signal amplification. The detector, operated with an NTD thermistor and coupled to a 750 g TeO2 crystal, shows the ability to identify electron/gamma and alpha particles event by event. The extremely low detector baseline noise, 19 eV RMS, demonstrates the possibility to enhance the sensitivity of TeO2-based 0νββ experiments to an unprecedented level.
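
Why a 19 eV RMS baseline noise enables event-by-event discrimination can be seen with a toy light-yield cut. The ~100 eV Cherenkov signal assumed for beta/gamma events is an illustrative figure, not a measurement from this detector:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy event-by-event discrimination: beta/gamma events produce Cherenkov
# light (assumed ~100 eV here for illustration) while alphas produce
# essentially none; both populations are smeared by the light-detector
# baseline noise (19 eV RMS, as quoted in the abstract).
noise_rms = 19.0                                 # eV
light_bg = rng.normal(100.0, noise_rms, 5000)    # beta/gamma events
light_a = rng.normal(0.0, noise_rms, 5000)       # alpha events

threshold = 50.0                                 # eV, midpoint cut
eff_bg = np.mean(light_bg > threshold)           # beta/gamma acceptance
leak_a = np.mean(light_a > threshold)            # alpha leakage past the cut

print(f"beta/gamma acceptance: {eff_bg:.4f}")
print(f"alpha leakage:         {leak_a:.2e}")
```

With these assumed numbers the two populations sit more than five noise widths apart, so a simple amplitude cut keeps nearly all beta/gamma events while rejecting almost every alpha — the event-by-event identification the abstract claims.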