151 research outputs found

    Neurocalcin-delta: a potential memory-related factor in hippocampus of obese rats induced by high-fat diet.

    Introduction: Aberrant protein expression within the hippocampus has recently been implicated in the pathogenesis of obesity-induced memory impairment. Objectives: The objective of the current study was to search for specific memory-related factors in the hippocampus of obese rats. Methods: Sprague-Dawley (SD) rats were fed either a high-fat (HF) diet or a normal-fat (NF) diet for 10 weeks to obtain control (CON), diet-induced obese (DIO), and diet-resistant (DR) rats. D-galactose was injected subcutaneously for 10 weeks to establish model (MOD) rats with learning and memory impairment. After the rat hippocampi were sampled, proteome analysis was conducted using two-dimensional gel electrophoresis (2-DE) combined with peptide mass fingerprinting (PMF). Results: From the 2-DE map, we found 15 differentially expressed proteins in the hippocampus of rats induced by the HF diet. In addition, Neurocalcin-delta (NCALD) was down-regulated in the DR rats compared with the CON and MOD rats, which was further confirmed by Western blot, real-time PCR, and ELISA. Conclusion: Our data demonstrate that the differential memory-related proteins were a reflection of the HF diet, not potential factors in obesity proneness or obesity resistance. Furthermore, NCALD proves to be a potential hippocampal memory-related factor connected to obesity. Keywords: Diet-induced obesity; diet-resistant; high fat diet; neurocalcin-delta; proteomics

    Exceeding Conservative Limits: A Consolidated Analysis on Modern Hardware Margins

    Modern large-scale computing systems (data centers, supercomputers, cloud and edge deployments, and high-end cyber-physical systems) employ heterogeneous architectures that consist of multicore CPUs, general-purpose many-core GPUs, and programmable FPGAs. The effective utilization of these architectures poses several challenges, among which a primary one is power consumption. Voltage reduction is one of the most efficient methods to reduce the power consumption of a chip. With the rapid adoption of hardware accelerators (i.e., GPUs and FPGAs) in large data centers and other large-scale computing infrastructures, a comprehensive evaluation of the safe voltage reduction levels for each different chip can be employed for efficient reduction of the total power. We present a survey of recent studies in voltage margin reduction at the system level for modern CPUs, GPUs, and FPGAs. The pessimistic voltage guardbands inserted by the silicon vendors can be exploited in all devices for significant power savings. On average, voltage reduction can reach 12% in multicore CPUs, 20% in many-core GPUs, and 39% in FPGAs. Comment: Accepted for publication in IEEE Transactions on Device and Materials Reliability.
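Because dynamic CMOS power scales roughly with the square of supply voltage at a fixed frequency, the voltage margins quoted above translate into larger relative power savings. A minimal sketch of that arithmetic, assuming pure V² scaling and ignoring static power and any frequency change (a simplification, not the survey's methodology):

```python
def dynamic_power_saving(voltage_reduction):
    # Dynamic CMOS power ~ C * V^2 * f; at fixed frequency a relative
    # voltage drop dV gives a relative power saving of 1 - (1 - dV)^2.
    return 1.0 - (1.0 - voltage_reduction) ** 2

# Average undervolting headroom reported in the survey.
for device, dv in [("CPU", 0.12), ("GPU", 0.20), ("FPGA", 0.39)]:
    print(f"{device}: {dynamic_power_saving(dv):.0%} dynamic power saving")
```

Under this idealized model, the 12%, 20%, and 39% voltage margins correspond to roughly 23%, 36%, and 63% dynamic power reductions, which is why even modest guardband exploitation is attractive.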

    Toward sustainable data centers: a comprehensive energy management strategy

    Data centers are major contributors to the emission of carbon dioxide into the atmosphere, and this contribution is expected to increase in the coming years. This has encouraged the development of techniques to reduce the energy consumption and the environmental footprint of data centers. Whereas some of these techniques have succeeded in reducing the energy consumption of the hardware equipment of data centers (including IT, cooling, and power supply systems), we claim that sustainable data centers will only be possible if the problem is faced by means of a holistic approach that includes not only the aforementioned techniques but also intelligent and unifying solutions that enable a synergistic and energy-aware management of data centers. In this paper, we propose a comprehensive strategy to reduce the carbon footprint of data centers that uses energy as the driver of their management procedures. In addition, we present a holistic management architecture for sustainable data centers that implements the aforementioned strategy, and we propose design guidelines to accomplish each step of the proposed strategy, referring to related achievements and enumerating the main challenges that must still be solved. Peer Reviewed. Postprint (author's final draft).

    Huber Principal Component Analysis for Large-dimensional Factor Models

    Factor models have been widely used in economics and finance. However, the heavy-tailed nature of macroeconomic and financial data is often neglected in the existing literature. To address this issue and achieve robustness, we propose an approach to estimate factor loadings and scores by minimizing the Huber loss function, which is motivated by the equivalence of conventional Principal Component Analysis (PCA) and the constrained least squares method in the factor model. We provide two algorithms that use different penalty forms. The first algorithm, which we refer to as Huber PCA, minimizes the ℓ2-norm-type Huber loss and performs PCA on the weighted sample covariance matrix. The second algorithm involves an element-wise-type Huber loss minimization, which can be solved by an iterative Huber regression algorithm. Our study examines the theoretical minimizer of the element-wise Huber loss function and demonstrates that it has the same convergence rate as conventional PCA when the idiosyncratic errors have bounded second moments. We also derive their asymptotic distributions under mild conditions. Moreover, we suggest a consistent model selection criterion that relies on rank minimization to estimate the number of factors robustly. We showcase the benefits of Huber PCA through extensive numerical experiments and a real financial portfolio selection example. An R package named "HDRFA" has been developed to implement the proposed robust factor analysis.
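The ℓ2-norm-type variant described above (PCA on a Huber-weighted sample covariance) can be sketched in a few lines. This is an illustrative reimplementation in Python/NumPy, not the authors' HDRFA code: the weighting scheme (per-observation Huber weights on residual norms, robust threshold from the median) and the fixed-point iteration are assumptions chosen for clarity.

```python
import numpy as np

def huber_weight(r, tau):
    # Huber weight: 1 inside the threshold, tau/|r| beyond it,
    # so outlying observations are down-weighted rather than dropped.
    a = np.abs(r)
    return np.where(a <= tau, 1.0, tau / np.maximum(a, 1e-12))

def huber_pca(X, k, tau=None, n_iter=50, tol=1e-6):
    """Illustrative l2-type Huber PCA: iterate between PCA on a
    weighted sample covariance and Huber reweighting of each row."""
    n, p = X.shape
    X = X - X.mean(axis=0)
    if tau is None:
        # Heuristic robust scale for the threshold (an assumption).
        tau = 1.345 * np.median(np.linalg.norm(X, axis=1))
    w = np.ones(n)
    prev = None
    for _ in range(n_iter):
        S = (X * w[:, None]).T @ X / w.sum()   # weighted covariance
        vals, vecs = np.linalg.eigh(S)         # eigenvalues ascending
        V = vecs[:, -k:][:, ::-1]              # top-k loadings
        resid = np.linalg.norm(X - (X @ V) @ V.T, axis=1)
        w = huber_weight(resid, tau)           # reweight observations
        if prev is not None and np.abs(w - prev).max() < tol:
            break
        prev = w.copy()
    return V, X @ V                            # loadings, factor scores
```

With heavy-tailed idiosyncratic noise, the reweighting keeps a handful of extreme rows from dominating the covariance, which is the intuition behind the robustness claims in the abstract.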

    Comparison of Timber Consumption in U.S. and China
