
    Is the Riemann zeta function in a short interval a 1-RSB spin glass?

    Fyodorov, Hiary & Keating established an intriguing connection between the maxima of log-correlated processes and those of the Riemann zeta function on a short interval of the critical line. In particular, they suggest that the analogue of the free energy of the Riemann zeta function is identical to that of the Random Energy Model in spin glasses. In this paper, the connection between spin glasses and the Riemann zeta function is explored further. We study a random model of the Riemann zeta function and show that its two-overlap distribution corresponds to that of a one-step replica symmetry breaking (1-RSB) spin glass. This provides evidence that the local maxima of the zeta function are strongly clustered.
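    The Random Energy Model mentioned above is simple enough to probe numerically. The following sketch (a hedged illustration, not from the paper; the system size and inverse temperature are arbitrary choices) estimates the REM free energy density from i.i.d. Gaussian energy levels and compares it with the high-temperature prediction log(2)/beta + beta/2, valid below the freezing transition at beta_c = sqrt(2 log 2):

```python
import math
import random

def rem_free_energy(n_spins: int, beta: float, seed: int = 0) -> float:
    """Estimate the REM free energy density
    f_N(beta) = (1/(beta*N)) * log sum_i exp(beta*sqrt(N)*X_i),
    where the sum runs over 2^N i.i.d. standard Gaussian levels X_i."""
    rng = random.Random(seed)
    sqrt_n = math.sqrt(n_spins)
    exponents = [beta * sqrt_n * rng.gauss(0.0, 1.0)
                 for _ in range(2 ** n_spins)]
    # Log-sum-exp for numerical stability.
    m = max(exponents)
    log_z = m + math.log(sum(math.exp(e - m) for e in exponents))
    return log_z / (beta * n_spins)

beta = 0.5  # well below beta_c = sqrt(2*log 2) ~ 1.177
f_est = rem_free_energy(16, beta)
f_annealed = math.log(2.0) / beta + beta / 2.0
print(f_est, f_annealed)  # the two agree closely at high temperature
```

    In the high-temperature phase the quenched and annealed free energies coincide in the limit, which is why the sampled value tracks the formula; above beta_c the free energy "freezes", the signature of the glassy regime discussed in the abstract.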

    Baryon magnetic moments and sigma terms in lattice-regularized chiral perturbation theory

    An SU(3) chiral Lagrangian for the lightest decuplet of baryons is constructed on a discrete lattice of spacetime points, and is added to an existing lattice Lagrangian for the lightest octets of mesons and baryons. A nonzero lattice spacing renders all loop integrations finite, and the continuum limit of any physical observable is identical to the result obtained from dimensional regularization. Chiral symmetry and gauge invariance are preserved even at nonzero lattice spacing. Specific calculations discussed here include the non-renormalization of a conserved vector current, the magnetic moments of octet baryons, and the pi N and KN sigma terms that relate to the nucleon's strangeness content. The quantitative difference between physics at a nonzero lattice spacing and physics in the continuum limit is easily computed, and it represents an expectation for the size of discretization errors in corresponding lattice QCD simulations. (To appear in Phys. Rev.)

    Building Information Modeling for Cultural Heritage: The Management of Generative Process for Complex Historical Buildings

    Building Information Modeling (BIM) enhances the sharing of information during the traditional process for new construction, but most of the time, it requires high levels of knowledge management for the historical digital model (H-BIM). Innovation in the Digital Cultural Heritage (DCH) domain is supported by the development of Information and Communications Technologies (ICT) and modern tools that are able to transmit the morphological characteristics of buildings in all their uniqueness. The latest research in the field of H-BIM shows a significant emergence of innovative methods and management initiatives for the generation of complex historical elements, confronting the paradigm of regularity (simple geometric shapes) with the new paradigm of complexity (historical building elements). This paper demonstrates the benefits of BIM for project management of the Centre Block of the Canadian Parliament in Ottawa, Ontario, Canada, and shows the results obtained by the introduction of Advanced Modeling Techniques (AMT) during the generative process, reducing time and cost for the creation of the complex architectural and structural elements. The uniqueness of the forms of historical buildings is a real value to be transmitted throughout the building's lifecycle with high Levels of Detail (LOD). Proper management of geometric primitives and Non-Uniform Rational Basis Spline (NURBS) models has ensured the conversion of spatial data (point clouds) from laser scanning and photogrammetry (geometric survey) into parametric applications. This paper explores the generative process of one of the most complex spaces within the Centre Block building of Parliament Hill: Confederation Hall.
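    The conversion from survey data to parametric geometry described above rests on rational basis functions. As a hedged, self-contained illustration (not the authors' pipeline; the control points and weights are invented), the sketch below evaluates a rational Bezier curve, the single-segment special case of a NURBS curve, and shows how weights let a parametric model reproduce non-polynomial shapes such as the circular arcs common in historical architecture:

```python
import math
from math import comb

def rational_bezier(ctrl, weights, t):
    """Evaluate a rational Bezier curve (single-segment NURBS) at t:
    C(t) = sum_i w_i B_{i,n}(t) P_i / sum_i w_i B_{i,n}(t)."""
    n = len(ctrl) - 1
    basis = [comb(n, i) * t**i * (1 - t)**(n - i) for i in range(n + 1)]
    denom = sum(w * b for w, b in zip(weights, basis))
    return tuple(
        sum(w * b * p[k] for w, b, p in zip(weights, basis, ctrl)) / denom
        for k in range(len(ctrl[0]))
    )

# A quadratic rational Bezier with middle weight cos(45 deg) traces an
# exact quarter circle -- something no polynomial spline can do.
ctrl = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
weights = [1.0, math.cos(math.pi / 4), 1.0]
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    x, y = rational_bezier(ctrl, weights, t)
    print(t, round(math.hypot(x, y), 6))  # radius stays 1.0 at every t
```

    The ability to represent conic sections exactly is the practical reason NURBS, rather than plain polynomial splines, are the standard target when converting point clouds of vaults and arches into parametric BIM objects.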

    THP-1 macrophage cholesterol efflux is impaired by palmitoleate through Akt activation.

    Lipoprotein lipase (LPL) is upregulated in atherosclerotic lesions and may promote the progression of atherosclerosis, but the mechanisms behind this process are not completely understood. We previously showed that the phosphorylation of Akt within THP-1 macrophages is increased in response to the lipid hydrolysis products generated by LPL from total lipoproteins. Notably, the free fatty acid (FFA) component was responsible for this effect. In the present study, we aimed to reveal in more detail how the FFA component may affect Akt signalling. We show that the phosphorylation of Akt within THP-1 macrophages increases with total FFA concentration and that phosphorylation remains elevated for up to 18 hours. We further show that specifically the palmitoleate component of the total FFA affects Akt phosphorylation. This is tied to changes in the levels of select molecular species of phosphoinositides. We further show that the total FFA component, and specifically palmitoleate, reduces apolipoprotein A-I-mediated cholesterol efflux, and that the reduction can be reversed in the presence of the Akt inhibitor MK-2206. Overall, our data support a negative role for the FFA component of lipoprotein hydrolysis products generated by LPL, which impairs macrophage cholesterol efflux via Akt activation.

    The landslide story

    The catastrophic Wenchuan earthquake induced an unprecedented number of geohazards. The risk of heightened landslide frequency after a quake, with potential secondary effects such as river damming and subsequent floods, needs more focused attention.

    Appropriate disclosure of a diagnosis of dementia: identifying the key behaviours of 'best practice'

    Background: Despite growing evidence that many people with dementia want to know their diagnosis, there is wide variation in the attitudes of professionals towards disclosure. The disclosure of a diagnosis of dementia is increasingly recognised as a process rather than a one-off behaviour. However, the different behaviours that contribute to this process have not been comprehensively defined. No intervention studies to improve diagnostic disclosure in dementia have been reported to date. As part of a larger study to develop an intervention to promote appropriate disclosure, we sought to identify important disclosure behaviours and explore whether supplementing a literature review with other methods would result in the identification of new behaviours. Methods: To identify a comprehensive list of behaviours in disclosure, we conducted a literature review, interviewed people with dementia and informal carers, and used a consensus process involving health and social care professionals. Content analysis of the full list of behaviours was carried out. Results: Interviews were conducted with four people with dementia and six informal carers. Eight health and social care professionals took part in the consensus panel. From the interviews, consensus panel and literature review, 220 behaviours were elicited, with 109 behaviours overlapping. The interviews and consensus panel elicited 27 behaviours supplementary to the review. Those from the interviews appeared self-evident but highlighted deficiencies in current practice, while those from the panel focused largely on balancing the needs of people with dementia and family members. Behaviours were grouped into eight categories: preparing for disclosure; integrating family members; exploring the patient's perspective; disclosing the diagnosis; responding to patient reactions; focusing on quality of life and well-being; planning for the future; and communicating effectively.
    Conclusion: This exercise has highlighted the complexity of the process of disclosing a diagnosis of dementia in an appropriate manner. It confirms that many of the behaviours identified in the literature (often based on professional opinion rather than empirical evidence) also resonate with people with dementia and informal carers. The presence of contradictory behaviours emphasises the need to tailor the process of disclosure to individual patients and carers. Our combined methods may be relevant to other efforts to identify and define complex clinical practices for further study. This project was funded by the UK Medical Research Council, grant reference G0300999.

    Measurement of the Bottom-Strange Meson Mixing Phase in the Full CDF Data Set

    We report a measurement of the bottom-strange meson mixing phase \beta_s using the time evolution of B0_s -> J/\psi (-> \mu+ \mu-) \phi (-> K+ K-) decays in which the quark-flavor content of the bottom-strange meson is identified at production. This measurement uses the full data set of proton-antiproton collisions at sqrt(s) = 1.96 TeV collected by the Collider Detector experiment at the Fermilab Tevatron, corresponding to 9.6 fb-1 of integrated luminosity. We report confidence regions in the two-dimensional space of \beta_s and the B0_s decay-width difference \Delta\Gamma_s, and measure \beta_s in [-\pi/2, -1.51] U [-0.06, 0.30] U [1.26, \pi/2] at the 68% confidence level, in agreement with the standard model expectation. Assuming the standard model value of \beta_s, we also determine \Delta\Gamma_s = 0.068 +- 0.026 (stat) +- 0.009 (syst) ps-1 and the mean B0_s lifetime, \tau_s = 1.528 +- 0.019 (stat) +- 0.009 (syst) ps, which are consistent and competitive with determinations by other experiments. (Phys. Rev. Lett. 109, 171802 (2012).)
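    The quoted results carry separate statistical and systematic uncertainties; when the two components are independent, they combine in quadrature. A minimal sketch (using only the numbers quoted in the abstract) of the total uncertainty on \Delta\Gamma_s and \tau_s:

```python
import math

def total_uncertainty(stat: float, syst: float) -> float:
    """Combine independent statistical and systematic uncertainties
    in quadrature: sigma_tot = sqrt(stat^2 + syst^2)."""
    return math.sqrt(stat**2 + syst**2)

# Delta Gamma_s = 0.068 +- 0.026 (stat) +- 0.009 (syst) ps^-1
print(round(total_uncertainty(0.026, 0.009), 4))  # -> 0.0275
# tau_s = 1.528 +- 0.019 (stat) +- 0.009 (syst) ps
print(round(total_uncertainty(0.019, 0.009), 4))  # -> 0.021
```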

    Modeling Time in Computing: A Taxonomy and a Comparative Survey

    The increasing relevance of areas such as real-time and embedded systems, pervasive computing, hybrid systems control, and biological and social systems modeling is bringing growing attention to the temporal aspects of computing, not only in the computer science domain but also in more traditional fields of engineering. This article surveys various approaches to the formal modeling and analysis of the temporal features of computer-based systems, at a level of detail suitable also for non-specialists. In doing so, it provides a unifying framework rather than just a comprehensive list of formalisms. The paper first lays out some key dimensions along which the various formalisms can be evaluated and compared. Then, a significant sample of formalisms for time modeling in computing is presented and discussed according to these dimensions. The adopted perspective is, to some extent, historical, going from "traditional" models and formalisms to more modern ones.