
    Target-derived neurotrophic factors regulate the death of developing forebrain neurons after a change in their trophic requirements

    Many neurons die as the normal brain develops. How this is regulated, and whether the mechanism involves neurotrophic molecules from target cells, is unknown. We found that cultured neurons from a key forebrain structure, the dorsal thalamus, develop a need for survival factors, including brain-derived neurotrophic factor (BDNF), from their major target, the cerebral cortex, at the age at which they innervate it. Experiments in vivo have shown that rates of dorsal thalamic cell death are reduced by increasing cortical levels of BDNF, and are increased in mutant mice lacking functional BDNF receptors or thalamocortical projections as well as by blocking BDNF in the cortex. We suggest that the onset of a requirement for cortex-derived neurotrophic factors initiates a competitive mechanism regulating programmed cell death among dorsal thalamic neurons.

    Nilpotent normal form for divergence-free vector fields and volume-preserving maps

    We study the normal forms for incompressible flows and maps in the neighborhood of an equilibrium or fixed point with a triple eigenvalue. We prove that when a divergence-free vector field in $\mathbb{R}^3$ has a nilpotent linearization with maximal Jordan block then, to arbitrary degree, coordinates can be chosen so that the nonlinear terms occur as a single function of two variables in the third component. The analogue for volume-preserving diffeomorphisms gives an optimal normal form in which the truncation of the normal form at any degree gives an exactly volume-preserving map whose inverse is also polynomial of the same degree. Comment: LaTeX, 20 pages, 1 figure
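    As a hedged illustration only (not taken from the paper), one reading of "a single function of two variables in the third component" when the linear part is the maximal nilpotent Jordan block is
    $$\dot{x} = y, \qquad \dot{y} = z, \qquad \dot{z} = f(x,y),$$
    where $f$ is a hypothetical placeholder for the normal-form function; the divergence-free condition $\nabla\cdot v = \partial_x y + \partial_y z + \partial_z f(x,y) = 0$ then holds automatically because $f$ does not depend on $z$.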

    Properdin and factor H: Opposing players on the alternative complement pathway "see-saw"

    Properdin and factor H are two key regulatory proteins with opposite functions in the alternative complement pathway. Properdin up-regulates the alternative pathway by stabilizing the C3bBb complex, whereas factor H down-regulates the pathway by promoting proteolytic degradation of C3b. While factor H is mainly produced in the liver, there are several extrahepatic sources: it is also synthesized in fetal tubuli, keratinocytes, skin fibroblasts, ocular tissue, adipose tissue, brain, lungs, heart, spleen, pancreas, kidney, muscle, and placenta. Neutrophils are the major source of properdin, which is also produced by monocytes, T cells, and bone marrow progenitor cell lines. Properdin is released by neutrophils from intracellular stores following stimulation by N-formyl-methionine-leucine-phenylalanine (fMLP) and tumor necrosis factor alpha (TNF-α). HepG2 cells, derived from human liver, have been found to produce functional properdin. Endothelial cells also produce properdin when induced by shear stress and are thus a physiological source of plasma properdin. The diverse range of extrahepatic sites of synthesis of these two complement regulators suggests the importance of, and the need for, local availability of the proteins. Here, we discuss the significance of the local synthesis of properdin and factor H. This assumes greater importance in view of recently identified unexpected and novel roles of properdin and factor H that are potentially independent of their involvement in complement regulation.

    The Effect of Service Failure Severity on Satisfaction, Trust, Commitment, and Negative Word of Mouth

    This research investigated the main and interactive effects of service failure severity, and specifically the main effect of service failure severity on satisfaction, trust, commitment, and negative word of mouth. Investigating the role of the severity construct will help researchers and managers better understand and manage the service recovery process under different conditions. The research extends previous work by examining the role of service failure severity within the existing framework of customers' post-recovery evaluations and their future relationship with a service provider. The design applies a survey whose unit of analysis is customers of Auto2000 Jakarta, with 142 respondents in the sample. The required data consist of five variables: service failure severity, satisfaction, trust, commitment, and negative word of mouth. The study concludes that service failure severity has a significant main effect on satisfaction with service recovery. Despite the positive influence of a strong recovery on satisfaction, there remained a negative influence on satisfaction as a result of a more severe service failure. In addition, the severity of a service failure also had a main effect on customer trust, commitment, and the likelihood of engaging in negative word of mouth after the service failure.

    Looking for Design in Materials Design

    Despite great advances in computation, materials design is still science fiction. The construction of structure-property relations on the quantum scale will turn computational empiricism into true design. Comment: 3 pages, 1 figure

    Shutters, Boxes, But No Paradoxes: Time Symmetry Puzzles in Quantum Theory

    The "N-Box Experiment" is a much-discussed thought experiment in quantum mechanics. It is claimed by some authors that a single particle prepared in a superposition of N+1 box locations, and subject to a final "post-selection" measurement corresponding to a different superposition, can be said to have occupied "with certainty" N boxes during the intervening time. However, others have argued that under closer inspection this surprising claim fails to hold. Aharonov and Vaidman have continued their advocacy of the claim in question by proposing a variation on the N-box experiment, in which the boxes are replaced by shutters and the pre- and post-selected particle is entangled with a photon. These authors argue that the resulting "N-shutter experiment" strengthens their original claim regarding the N-box experiment. It is argued in this paper that the apparently surprising features of this variation are no more robust than those of the N-box experiment, and that it is not accurate to say that the particle is "with certainty" in all N shutters at any given time. Comment: Presentation improved; to appear in International Studies in Philosophy of Science
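    For concreteness, the standard three-box (N = 2) instance commonly discussed in this literature (a sketch for orientation, not reproduced from this paper) prepares $|\psi\rangle = \frac{1}{\sqrt{3}}(|A\rangle + |B\rangle + |C\rangle)$ and post-selects $|\phi\rangle = \frac{1}{\sqrt{3}}(|A\rangle + |B\rangle - |C\rangle)$. The ABL rule then gives
    $$P(A) = \frac{|\langle\phi|P_A|\psi\rangle|^2}{|\langle\phi|P_A|\psi\rangle|^2 + |\langle\phi|(\mathbb{1}-P_A)|\psi\rangle|^2} = 1,$$
    since $\langle\phi|(\mathbb{1}-P_A)|\psi\rangle = 0$; by symmetry the same holds for box B, which is the sense in which the particle is said to be "with certainty" in each box if that box alone is opened at the intermediate time.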

    A Computational Study of the Weak Galerkin Method for Second-Order Elliptic Equations

    The weak Galerkin finite element method is a novel numerical method that was first proposed and analyzed by Wang and Ye for general second-order elliptic problems on triangular meshes. The goal of this paper is to conduct a computational investigation of the weak Galerkin method for various model problems with more general finite element partitions. The numerical results confirm the theory established by Wang and Ye. The results also indicate that the weak Galerkin method is efficient, robust, and reliable in scientific computing. Comment: 19 pages
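    As a hedged sketch (not reproduced from the paper): for the model problem $-\nabla\cdot(a\nabla u) = f$, a weak Galerkin scheme works with discrete weak functions $v = \{v_0, v_b\}$ (interior and edge/face components) and a locally defined discrete weak gradient $\nabla_w$, and seeks $u_h = \{u_0, u_b\}$ satisfying
    $$(a\,\nabla_w u_h, \nabla_w v) + s(u_h, v) = (f, v_0) \quad \text{for all test functions } v = \{v_0, v_b\} \text{ with } v_b = 0 \text{ on } \partial\Omega,$$
    where $s(\cdot,\cdot)$ denotes a stabilization term that some weak Galerkin variants include (notably on general polytopal partitions) and others omit.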

    Sub-Planckian black holes and the Generalized Uncertainty Principle

    The Black Hole Uncertainty Principle correspondence suggests that there could exist black holes with mass beneath the Planck scale but radius of order the Compton scale rather than the Schwarzschild scale. We present a modified, self-dual Schwarzschild-like metric that reproduces desirable aspects of a variety of disparate models in the sub-Planckian limit, while remaining Schwarzschild in the large-mass limit. The self-dual nature of this solution under $M \leftrightarrow M^{-1}$ naturally implies a Generalized Uncertainty Principle with the linear form $\Delta x \sim \frac{1}{\Delta p} + \Delta p$. We also demonstrate a natural dimensional-reduction feature, in that the gravitational radius and thermodynamics of sub-Planckian objects resemble those of $(1+1)$-D gravity. The temperature of sub-Planckian black holes scales as $M$ rather than $M^{-1}$, but the evaporation of those smaller than $10^{-36}$ g is suppressed by the cosmic background radiation. This suggests that relics of this mass could provide the dark matter. Comment: 12 pages, 9 figures, version published in J. High En. Phys.
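    In Planck units ($\hbar = c = G = 1$), and suppressing order-unity coefficients, the duality referred to here can be sketched as follows (an illustrative reading, not a formula quoted from the paper):
    $$\Delta x \sim \frac{1}{\Delta p} + \Delta p, \qquad r(M) \sim M + \frac{1}{M},$$
    so that $r(M) = r(M^{-1})$, with the Schwarzschild behaviour $r \sim M$ dominating for $M \gg 1$ and the Compton behaviour $r \sim 1/M$ dominating for $M \ll 1$.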

    Adding Isolated Vertices Makes some Online Algorithms Optimal

    An unexpected difference between online and offline algorithms is observed. The natural greedy algorithms are shown to be worst-case online optimal for Online Independent Set and Online Vertex Cover on graphs with 'enough' isolated vertices, called Freckle graphs. For Online Dominating Set, the greedy algorithm is shown to be worst-case online optimal on graphs with at least one isolated vertex. These algorithms are not online optimal in general. The online optimality results for these greedy algorithms imply optimality according to various worst-case performance measures, such as the competitive ratio. It is also shown that, despite this worst-case optimality, there are Freckle graphs on which the greedy independent set algorithm performs objectively worse than another algorithm. It is shown that it is NP-hard to determine any of the following for a given graph: the online independence number, the online vertex cover number, and the online domination number. Comment: A footnote in the .tex file didn't show up in the last version. This was fixed.
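    The "natural greedy" algorithm for Online Independent Set is commonly understood as accepting each arriving vertex unless it is adjacent to an already-accepted one, with irrevocable decisions; a minimal Python sketch under that assumption follows (class and method names are illustrative, not taken from the paper).

    # Minimal sketch of a natural greedy algorithm for Online Independent Set.
    # Assumption: vertices arrive one at a time together with their edges back
    # to previously revealed vertices, and accept/reject decisions are final.
    class GreedyOnlineIndependentSet:
        def __init__(self):
            self.accepted = set()  # vertices committed to the independent set

        def reveal(self, vertex, neighbors_so_far):
            """Process one arriving vertex; return True if it is accepted."""
            # Reject if it conflicts with any already-accepted vertex.
            if any(n in self.accepted for n in neighbors_so_far):
                return False
            self.accepted.add(vertex)
            return True

    # Usage on a small graph revealed online: path a-b-c plus an isolated vertex d.
    algo = GreedyOnlineIndependentSet()
    algo.reveal("a", [])     # accepted
    algo.reveal("b", ["a"])  # rejected: adjacent to a
    algo.reveal("c", ["b"])  # accepted
    algo.reveal("d", [])     # accepted (isolated vertex)
    print(sorted(algo.accepted))  # ['a', 'c', 'd']

    Isolated vertices such as d can never cause a conflict, which gives some intuition for why the results above concern graphs with 'enough' isolated vertices.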

    Strategic Shift to a Diagnostic Model of Care in a Multi-Site Group Dental Practice.

    Background: Documenting standardized dental diagnostic terms represents an emerging change in how dentistry is practiced. We focused on a mid-sized dental group practice as it shifted to a policy of documenting patients' diagnoses using standardized terms in the electronic health record. Methods: Kotter's change framework was translated into interview questions posed to the senior leadership of a mid-sized dental group practice. In addition, quantitative content analyses were conducted on the written policies and forms before and after the implementation of standardized diagnosis documentation, to assess the extent to which the forms and policies reflected the shift. Three reviewers analyzed the data individually and reached consensus where needed. Results: Kotter's guiding change framework explained the steps taken to reach a 97 percent utilization rate of the electronic health record and the Dental Diagnostic Code. Of the 96 documents included in the forms and policy analysis, 31 were officially updated but only two added a diagnostic element. Conclusion: Change strategies established in the business literature hold utility for dental practices seeking diagnosis-centered care. Practical implications: A practice that shifts to a diagnosis-driven care philosophy would be best served by ensuring that the change process follows a leadership framework calibrated to the organization's culture.