4,767 research outputs found

    ER Stress-Induced eIF2-alpha Phosphorylation Underlies Sensitivity of Striatal Neurons to Pathogenic Huntingtin

    A hallmark of Huntington's disease is the pronounced sensitivity of striatal neurons to polyglutamine-expanded huntingtin expression. Here we show that cultured striatal cells and murine brain striatum have remarkably low levels of phosphorylation of the translation initiation factor eIF2 alpha, a stress-induced process that interferes with general protein synthesis and also induces differential translation of pro-apoptotic factors. eIF2 alpha phosphorylation was elevated in a striatal cell line stably expressing pathogenic huntingtin, as well as in brain sections of Huntington's disease model mice. Pathogenic huntingtin caused endoplasmic reticulum (ER) stress and increased eIF2 alpha phosphorylation by increasing the activity of the PKR-like ER-localized eIF2 alpha kinase (PERK). Importantly, striatal neurons exhibited special sensitivity to ER stress-inducing agents, which was potentiated by pathogenic huntingtin. We could strongly reduce huntingtin toxicity by inhibiting PERK. Therefore, alteration of protein homeostasis and of eIF2 alpha phosphorylation status by pathogenic huntingtin appears to be an important cause of striatal cell death. A dephosphorylated state of eIF2 alpha has been linked to cognition, suggesting that this effect of pathogenic huntingtin might also be a source of the early cognitive impairment seen in patients.

    Templates for stellar mass black holes falling into supermassive black holes

    The spin-modulated gravitational wave signals, which we shall call smirches, emitted by stellar mass black holes tumbling and inspiralling into massive black holes have extremely complicated shapes. Tracking these signals with the aid of pattern matching techniques, such as Wiener filtering, is likely to be a computationally impossible exercise. In this article we propose using a mixture of optimal and non-optimal methods to create a search hierarchy to ease the computational burden. Furthermore, by employing the method of principal components (also known as singular value decomposition) we explicitly demonstrate that the effective dimensionality of the search parameter space of smirches is likely to be just three or four, much smaller than the nine or ten that has hitherto been assumed. This result, based on a limited study of the parameter space, should be confirmed by a more exhaustive study over the parameter space as well as Monte Carlo simulations to test the predictions made in this paper. Comment: 12 pages, 4 tables, 4th LISA symposium, submitted to CQG
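    As a rough, hedged illustration of the principal-components step described above (not the authors' code or waveform model), the effective dimensionality of a template bank can be read off from the singular values of a matrix whose rows are templates; the toy chirp-like waveform and all parameter names below are hypothetical stand-ins for smirches.

    ```python
    # Hedged sketch: estimate the effective dimensionality of a set of waveforms
    # via singular value decomposition (principal components). The waveforms here
    # are a toy chirp-like stand-in, NOT the spin-modulated signals of the paper.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 2048)          # time samples (arbitrary units)

    def toy_waveform(f0, fdot, phi0, amp_mod):
        """Hypothetical stand-in for a spin-modulated inspiral signal."""
        phase = 2 * np.pi * (f0 * t + 0.5 * fdot * t**2) + phi0
        return (1.0 + amp_mod * np.sin(2 * np.pi * 3.0 * t)) * np.sin(phase)

    # Build a template matrix: one row per randomly drawn parameter set.
    templates = np.array([
        toy_waveform(f0=rng.uniform(20, 40),
                     fdot=rng.uniform(0, 30),
                     phi0=rng.uniform(0, 2 * np.pi),
                     amp_mod=rng.uniform(0, 0.5))
        for _ in range(500)
    ])
    templates /= np.linalg.norm(templates, axis=1, keepdims=True)

    # Singular values measure how much of the bank each principal component captures.
    s = np.linalg.svd(templates, compute_uv=False)
    explained = np.cumsum(s**2) / np.sum(s**2)
    eff_dim = int(np.searchsorted(explained, 0.99)) + 1
    print("effective dimensionality (99% of power):", eff_dim)
    ```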

    Quality of internal representation shapes learning performance in feedback neural networks

    A fundamental feature of complex biological systems is the ability to form feedback interactions with their environment. A prominent model for studying such interactions is reservoir computing, where learning acts on low-dimensional bottlenecks. Despite the simplicity of this learning scheme, the factors contributing to or hindering the success of training in reservoir networks are in general not well understood. In this work, we study non-linear feedback networks trained to generate a sinusoidal signal, and analyze how learning performance is shaped by the interplay between internal network dynamics and target properties. By performing an exact mathematical analysis of linearized networks, we predict that learning performance is maximized when the target is characterized by an optimal, intermediate frequency which monotonically decreases with the strength of the internal reservoir connectivity. At the optimal frequency, the reservoir representation of the target signal is high-dimensional, de-synchronized, and thus maximally robust to noise. We show that our predictions successfully capture the qualitative behaviour of performance in non-linear networks. Moreover, we find that the relationship between internal representations and performance can be further exploited in trained non-linear networks to explain behaviours which do not have a linear counterpart. Our results indicate that a major determinant of learning success is the quality of the internal representation of the target, which in turn is shaped by an interplay between parameters controlling the internal network and those defining the task.
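    The following is a minimal sketch, under assumed parameters, of the kind of feedback (reservoir) network described above: a random recurrent network driven through feedback weights, with a linear readout trained by ridge regression on teacher-forced states to generate a sinusoid. It is illustrative only, not the authors' training scheme; the reservoir size, gain, time constant and target frequency are arbitrary choices.

    ```python
    # Hedged sketch of a feedback reservoir trained to generate a sinusoid:
    # teacher-forced state collection, batch ridge regression, then closed loop.
    import numpy as np

    rng = np.random.default_rng(1)
    N, g, dt, tau = 300, 1.2, 0.01, 0.1      # reservoir size, gain, time step, time constant
    freq = 1.0                               # target frequency (assumed, tunable)

    J = g * rng.normal(0, 1.0 / np.sqrt(N), (N, N))   # internal reservoir connectivity
    w_fb = rng.uniform(-1, 1, N)                      # feedback weights

    steps = int(20.0 / dt)
    time = np.arange(steps) * dt
    target = np.sin(2 * np.pi * freq * time)

    # Teacher forcing: drive the network with the target through the feedback loop.
    x = rng.normal(0, 0.1, N)
    states = np.empty((steps, N))
    for k in range(steps):
        r = np.tanh(x)
        states[k] = r
        x += dt / tau * (-x + J @ r + w_fb * target[k])

    # Ridge regression for the linear readout.
    lam = 1e-4
    w_out = np.linalg.solve(states.T @ states + lam * np.eye(N), states.T @ target)

    # Closed loop: feed the readout back instead of the target.
    x = rng.normal(0, 0.1, N)
    z_hist = np.empty(steps)
    for k in range(steps):
        r = np.tanh(x)
        z_hist[k] = r @ w_out
        x += dt / tau * (-x + J @ r + w_fb * z_hist[k])

    print("closed-loop RMS error:", np.sqrt(np.mean((z_hist - target) ** 2)))
    ```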

    Solving two-dimensional large-N QCD with a nonzero density of baryons and arbitrary quark mass

    We solve two-dimensional large-N QCD in the presence of a nonzero baryon number B, and for arbitrary quark mass m and volume L. We fully treat the dynamics of the gluonic zero modes and check how this affects results from previous studies of the B=0 and B=1 systems. For a finite density of baryons, and for any m>0, we find that the ground state contains a baryon crystal with expectation values for psi-bar gamma_mu psi that have a helix-like spatial structure. We study how these evolve with B and see that the volume integral of psi-bar psi changes strongly with the baryon density. We compare this emerging crystal structure with the sine-Gordon crystal, which is expected to be a good approximation for light quarks, and find that it remains a very good approximation for surprisingly heavy quarks. We also calculate the way the ground state energy E changes as a function of the baryon number B, and find that for sufficiently large densities the function E(B) is well described by the equation of state for free massless quarks, thus suggesting a quark-hadron continuity. From dE(B)/dB we calculate the quark chemical potential mu as a function of B and see that the baryons repel each other. The way mu depends on B also allows us to translate our findings to the grand-canonical ensemble. The resulting phase structure along the mu-axis contains a phase transition that occurs at a value of mu equal to the baryon mass divided by N, and that separates a mu-independent phase with intact translation symmetry from a mu-dependent phase with spontaneously broken translation symmetry. Finally, our calculations confirm the presence of a partial large-N Eguchi-Kawai volume independence, as described in Phys.Rev.D79:105021, that arises only if one treats the gluonic zero modes correctly. Comment: Added discussion of the emerging quark-hadron continuity, fixed a factor of 2 in the UV regularization, regularized the gamma_5 condensate, and fixed the plot of the gamma_1 condensate. Plots are now prettier. Added discussion of the physical meaning of the results, and corrected typos. 41 pages, 25 figures
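    As a small hedged illustration of the step from the ground-state energy E(B) to the chemical potential, a finite-difference derivative can be taken over tabulated energies; the E(B) values below are hypothetical placeholders, not the paper's results, and in the abstract's convention the quark chemical potential follows from dE/dB (up to a factor of 1/N).

    ```python
    # Hedged sketch: extract dE/dB from a tabulated ground-state energy E(B)
    # by finite differences. The E values are hypothetical placeholders.
    import numpy as np

    B = np.arange(1, 11)                 # baryon number
    E = 0.9 * B + 0.05 * B**2            # hypothetical E(B); the quadratic term mimics repulsion
    dEdB = np.gradient(E, B)             # central differences, forward/backward at the ends

    for b, m in zip(B, dEdB):
        print(f"B = {b:2d}   dE/dB ~ {m:.3f}")
    # A dE/dB that grows with B means each added baryon costs more energy,
    # i.e. the baryons effectively repel each other.
    ```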

    Volume dependence of two-dimensional large-N QCD with a nonzero density of baryons

    We take a first step towards the solution of QCD in 1+1 dimensions at nonzero density. We regularize the theory in the UV by using a lattice and in the IR by putting the theory in a box of spatial size L. After fixing to axial gauge we use the coherent states approach to obtain the large-N classical Hamiltonian H that describes color neutral quark-antiquark pairs interacting with spatial Polyakov loops in the background of baryons. Minimizing H we get a regularized form of the 't Hooft equation that depends on the expectation values of the Polyakov loops. Analyzing the L-dependence of this equation we show how volume independence, a la Eguchi and Kawai, emerges in the large-N limit, and how it depends on the expectation values of the Polyakov loops. We describe how this independence relies on the realization of translation symmetry, in particular when the ground state contains a baryon crystal. Finally, we remark on the implications of our results for studying baryon density in large-N QCD within single-site lattice theories, and on some general lessons concerning the way four-dimensional large-N QCD behaves in the presence of baryons. Comment: 32 pages, 3 figures. New version is much more reader friendly and also emphasizes the exact nature of the approach
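    For orientation, the unregularized, infinite-volume 't Hooft equation to which such constructions reduce is shown below in one common convention (coupling normalizations vary between papers); the regularized, L- and Polyakov-loop-dependent form derived in the paper is not reproduced here.

    ```latex
    % Infinite-volume 't Hooft equation in a common convention, with \lambda = g^2 N,
    % x the quark momentum fraction and P a principal-value integral.
    \mu^2\,\phi(x) \;=\; \frac{m^2 - \lambda/\pi}{x(1-x)}\,\phi(x)
    \;-\; \frac{\lambda}{\pi}\,\mathrm{P}\!\int_0^1 dy\,\frac{\phi(y)}{(x-y)^2}
    ```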

    Considering lactate dehydrogenase (LDH) concentration in nasal-wash (NW) as a marker in evaluating the outcome of patients with bronchiolitis

    Background: Estimation of bronchiolitis severity in infants is still an important issue, and there are no standard methods to help physicians better evaluate and manage the clinical status of these patients. The aim of this study was to investigate the role of LDH concentration in NW as a biomarker for evaluating the outcome of patients with bronchiolitis in Bu Ali Hospital, Ardabil. Methods: One hundred children with bronchiolitis, aged below 2 years, entered the study. A nasal-wash sample was obtained from each patient using 2 ml of normal saline, and samples were sent to the laboratory to measure the LDH level. Data were analyzed in SPSS 16. Results: The mean age of the patients was 6.9±3.7 months and 57% of them were male; 42% of patients had mild bronchiolitis and 58% suffered from severe bronchiolitis. The LDH level of nasal-wash fluid was related neither to gender nor to age, but it was significantly lower in patients who required oxygen therapy and had fever than in those who did not require oxygen therapy and were without fever. Moreover, the LDH level showed a significant negative association with hospital stay (r= -0.570, p<0.001) and bronchiolitis severity (r= -0.440, p<0.001): its concentration was significantly lower in patients with a hospital stay longer than 24 hours than in those with a stay shorter than 24 hours, and in patients with severe bronchiolitis than in those with mild bronchiolitis. Conclusions: According to the results of this study, LDH measurement in nasal-wash fluid can be used as a biochemical marker to evaluate clinical outcomes of bronchiolitis in children younger than 24 months.
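    A minimal sketch of the kind of correlation analysis reported above, using synthetic placeholder numbers rather than the study's data (the study itself used SPSS 16); the variable names and values are hypothetical.

    ```python
    # Hedged sketch: rank correlation between nasal-wash LDH and hospital stay,
    # on synthetic placeholder data chosen to show a negative trend.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    n = 100
    stay_hours = rng.uniform(6, 96, n)                    # hypothetical hospital stay (hours)
    ldh = 400 - 2.0 * stay_hours + rng.normal(0, 40, n)   # hypothetical LDH with a negative trend

    r, p = stats.spearmanr(ldh, stay_hours)
    print(f"Spearman r = {r:.3f}, p = {p:.3g}")           # a negative r mirrors the reported association
    ```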

    Role of nitric oxide in Salmonella typhimurium-mediated cancer cell killing

    Background: Bacterial targeting of tumours is an important anti-cancer strategy. We previously showed that strain SL7838 of Salmonella typhimurium targets and kills cancer cells. Here we explore whether NO generation by the bacteria plays a role in the lethality of SL7838 to cancer cells; this bacterium has the machinery for generating NO, but also for decomposing it. Methods: The mechanism underlying Salmonella typhimurium tumour therapy was investigated through in vitro and in vivo studies. NO measurements were conducted either by chemical assays (in vitro) or using biosensors (in vivo). Cancer cell cytotoxicity assays were performed using MTS. Bacterial cell survival and tumour burden were determined using molecular imaging techniques. Results: SL7838 generated nitric oxide (NO) in anaerobic cell suspensions, inside infected cancer cells in vitro, and in implanted 4T1 tumours in live mice, the latter measured using microsensors. Thus, under these conditions, the NO-generating pathway is more active than the decomposition pathway. The latter was eliminated, in strain SL7842, by the deletion of the hmp and norV genes, making SL7842 more proficient at generating NO than SL7838. SL7842 killed cancer cells more effectively than SL7838 in vitro, and this was dependent on nitrate availability. This strain was also ca. 100% more effective in treating implanted 4T1 mouse tumours than SL7838.

    A 12-Channel, real-time near-infrared spectroscopy instrument for brain-computer interface applications

    A continuous wave near-infrared spectroscopy (NIRS) instrument for brain-computer interface (BCI) applications is presented. In the literature, experiments carried out on subjects with motor-degenerative diseases such as amyotrophic lateral sclerosis have demonstrated the suitability of NIRS for accessing intentional functional activity, which could be used in a BCI as a communication aid. Specifically, a real-time, multiple-channel NIRS tool is needed to access even a few different mental states at reasonable baud rates. The 12-channel instrument described here has a spatial resolution of 30 mm and employs a flexible software demodulation scheme. A temporal resolution of ~100 ms is maintained because typical topographic imaging is not needed: we are only interested in exploiting the vascular response for BCI control. A simple experiment demonstrates the ability of the system to report on haemodynamics during single-trial mental arithmetic tasks. Multiple-trial averaging is not required.
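    A hedged sketch of software lock-in (synchronous) demodulation, one common way to implement the flexible software demodulation mentioned above for a frequency-multiplexed CW-NIRS channel; the sampling rate, modulation frequency and filter cutoff below are assumptions, not the instrument's specifications.

    ```python
    # Hedged sketch: software lock-in demodulation of one modulated NIRS channel.
    # Mix the detector signal with quadrature references at the source's
    # modulation frequency, then low-pass filter to recover the slow envelope.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 1000.0                                  # detector sampling rate (Hz, assumed)
    t = np.arange(0, 10, 1 / fs)
    f_mod = 110.0                                # source modulation frequency (Hz, assumed)

    # Synthetic detector signal: slow haemodynamic envelope on a modulated carrier.
    envelope = 1.0 + 0.05 * np.sin(2 * np.pi * 0.1 * t)
    detector = envelope * np.sin(2 * np.pi * f_mod * t) + 0.01 * np.random.randn(t.size)

    # Lock-in demodulation.
    i_mix = detector * np.sin(2 * np.pi * f_mod * t)
    q_mix = detector * np.cos(2 * np.pi * f_mod * t)
    b, a = butter(4, 5.0 / (fs / 2))             # 5 Hz low-pass keeps the vascular response
    amplitude = 2.0 * np.sqrt(filtfilt(b, a, i_mix) ** 2 + filtfilt(b, a, q_mix) ** 2)
    # 'amplitude' tracks the slow envelope; the effective temporal resolution
    # (of order 100 ms) is set by the low-pass filter bandwidth.
    ```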

    Phase Transition in Liquid Drop Fragmentation

    A liquid droplet is fragmented by a sudden pressurized-gas blow, and the resulting droplets, adhered to the window of a flatbed scanner, are counted and sized by computerized means. The use of a scanner plus image recognition software enables us to automatically count and size up to tens of thousands of tiny droplets with a smallest detectable volume of approximately 0.02 nl. Upon varying the gas pressure, a critical value is found where the size distribution becomes a pure power law, a fact that is indicative of a phase transition. Away from this transition, the resulting size distributions are well described by Fisher's model at coexistence. It is found that the surface correction term changes sign, and the apparent power-law exponent tau has a steep minimum, at criticality, as previously reported in nuclear multifragmentation studies [1,2]. We argue that the observed transition is not percolative, and introduce the concept of dominance in order to characterize it. The dominance probability is found to go to zero sharply at the transition. Simple arguments suggest that the correlation length exponent is nu=1/2. The sizes of the largest and average fragments, on the other hand, do not go to zero but behave in a way that appears to be consistent with recent predictions of Ashurst and Holian [3,4]. Comment: 10 pages, 11 figures. LaTeX (revtex4) with psfig/epsfig
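    As a generic, hedged illustration of extracting an apparent power-law exponent tau from a fragment-size distribution (synthetic sizes, not the scanned-droplet data), a continuous maximum-likelihood estimator can be used:

    ```python
    # Hedged sketch: continuous maximum-likelihood estimate of a power-law
    # exponent tau for fragment volumes above a cutoff v_min. The volumes are
    # synthetic; only the 0.02 nl cutoff echoes the stated detection limit.
    import numpy as np

    rng = np.random.default_rng(7)
    tau_true, v_min = 2.2, 0.02                               # nl
    u = rng.uniform(size=20000)
    sizes = v_min * (1 - u) ** (-1.0 / (tau_true - 1.0))      # power-law distributed volumes

    # Continuous MLE: tau_hat = 1 + n / sum(ln(v_i / v_min)), with standard
    # error (tau_hat - 1) / sqrt(n).
    tau_hat = 1.0 + sizes.size / np.sum(np.log(sizes / v_min))
    err = (tau_hat - 1.0) / np.sqrt(sizes.size)
    print(f"tau ~ {tau_hat:.3f} +/- {err:.3f}")
    ```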