
    The circadian rhythm: an influential soundtrack in the diabetes story

    Type 2 Diabetes Mellitus (T2DM) has become one of the most prevalent metabolic diseases in recent years due to changes in lifestyle and environmental conditions such as diet and physical activity. The circadian rhythm, in turn, is one of the most significant biological pathways in humans and other mammals; it is influenced by light, sleep, and activity, and is controlled by complex cellular pathways with feedback loops. It is widely known that changes in the circadian rhythm can alter metabolic pathways in body cells and affect the treatment process, particularly for metabolic diseases such as T2DM. The aim of this study is to explore the importance of the circadian rhythm in the occurrence of T2DM by reviewing the metabolic pathways involved, their relationship with the circadian rhythm from two perspectives (lifestyle and molecular pathways), and their effect on T2DM pathophysiology. These impacts have been demonstrated in a variety of studies and have led to the development of approaches such as time-restricted feeding, chronotherapy (time-specific therapies), and circadian molecule stabilizers.

    SWARM Parallelism: Training Large Models Can Be Surprisingly Communication-Efficient

    Many deep learning applications benefit from using large models with billions of parameters. Training these models is notoriously expensive due to the need for specialized HPC clusters. In this work, we consider alternative setups for training large models: using cheap "preemptible" instances or pooling existing resources from multiple regions. We analyze the performance of existing model-parallel algorithms in these conditions and find configurations where training larger models becomes less communication-intensive. Based on these findings, we propose SWARM parallelism, a model-parallel training algorithm designed for poorly connected, heterogeneous, and unreliable devices. SWARM creates temporary randomized pipelines between nodes that are rebalanced in case of failure. We empirically validate our findings and compare SWARM parallelism with existing large-scale training approaches. Finally, we combine our insights with compression strategies to train a large Transformer language model with 1B shared parameters (approximately 13B before sharing) on preemptible T4 GPUs with less than 200 Mb/s of network bandwidth. Comment: Accepted to International Conference on Machine Learning (ICML) 2023. 25 pages, 8 figures.
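The core scheduling idea, temporary randomized pipelines that are rebalanced when a worker fails, can be sketched in a few lines. This is a hypothetical illustration: the function names and the simple rebalancing policy below are my own, not taken from the paper.

```python
import random


def build_pipelines(nodes, num_stages, seed=0):
    """Randomly assign worker nodes to pipeline stages (toy SWARM-style setup)."""
    rng = random.Random(seed)
    shuffled = list(nodes)
    rng.shuffle(shuffled)
    stages = [[] for _ in range(num_stages)]
    for i, node in enumerate(shuffled):
        stages[i % num_stages].append(node)
    return stages


def rebalance(stages, failed_node):
    """Drop a failed worker, then move one worker from the most populated
    stage to the least populated one if the imbalance exceeds one node."""
    stages = [[n for n in stage if n != failed_node] for stage in stages]
    largest = max(stages, key=len)
    smallest = min(stages, key=len)
    if len(largest) - len(smallest) > 1:
        smallest.append(largest.pop())
    return stages
```

In this toy policy, a failure only triggers a migration when it creates a noticeable imbalance, mirroring the paper's theme of keeping stages roughly equally provisioned without global coordination.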

    Adaptive non-singular fast terminal sliding mode control and synchronization of a chaotic system via interval type-2 fuzzy inference system with proportionate controller

    This paper introduces a novel adaptive nonsingular fast terminal sliding mode approach that benefits from an interval type-2 fuzzy logic estimator and a proportionate gain for control and synchronization of chaotic systems in the presence of uncertainty. The nonsingular fast terminal sliding mode controller is developed to increase the convergence rate and remove the singularity problem of the system. Using the proposed method, finite-time convergence is ensured. To eliminate the chattering phenomenon of the conventional sliding mode controller, the discontinuous sign function is estimated using an interval type-2 fuzzy inference system (FIS) based on center-of-sets type reduction followed by defuzzification. By adding the proportionate gain to the interval type-2 FIS, the robustness and speed of the control system are enhanced. An appropriate Lyapunov function is utilized to ensure the closed-loop stability of the control system. The performance of the controller is evaluated on a nonlinear time-varying second-order magnetic spacecraft chaotic system with different initial conditions in the presence of uncertainty. The simulation results show the efficacy of the proposed approach for tracking control problems. Time- and frequency-domain analysis of the control signal demonstrates that the chattering phenomenon is successfully diminished.
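For orientation, the class of controller discussed here can be summarized by the generic nonsingular fast terminal sliding surface found in the literature; this is an illustrative form, not necessarily the exact surface designed in the paper:

```latex
% Generic nonsingular fast terminal sliding surface (illustrative):
s = e + k_1 \lvert e \rvert^{\alpha} \operatorname{sgn}(e)
      + k_2 \lvert \dot{e} \rvert^{\beta} \operatorname{sgn}(\dot{e}),
\qquad k_1, k_2 > 0,\quad 1 < \beta < 2,\quad \alpha > \beta,
```

where $e$ is the tracking error. The fractional powers yield finite-time convergence once $s = 0$ is reached, and the constraint $1 < \beta < 2$ avoids the singularity of classical terminal sliding mode. The switching term $\operatorname{sgn}(s)$ in the reaching law is the source of chattering, which is what the interval type-2 FIS is used to replace with a smooth estimate.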

    An investigation of entorhinal spatial representations in self-localisation behaviours

    Spatially modulated cells of the medial entorhinal cortex (MEC) and neighbouring cortices are thought to provide the neural substrate for self-localisation behaviours. These cells include grid cells of the MEC, which are thought to perform path integration operations to update self-location estimates. In order to read this grid code, downstream cells are thought to reconstruct a positional estimate as a simple rate-coded representation of space. Here, I examine the coding schemes of grid cells and putative readout cells recorded from mice performing a virtual reality (VR) linear location task that engaged the animals in both beaconing and path integration behaviours. I found that grid cells can express two distinct coding schemes on the linear track: a position code, which reflects periodic grid fields anchored to salient features of the track, and a distance code, which reflects periodic grid fields without this anchoring. Grid cells were found to switch between these coding schemes within sessions. When grid cells were encoding position, mice performed better on trials that required path integration but not on trials that required beaconing. This result provides the first mechanistic evidence linking grid cell activity to path integration-dependent behaviour. Putative readout cells were found in the form of ramp cells, which fire proportionally as a function of location in defined regions of the linear track. This ramping activity was primarily explained by track position rather than by other kinematic variables such as speed and acceleration. These representations were maintained across both trial types and outcomes, indicating that they likely result from recall of the track structure. Together, these results support the functional importance of grid and ramp cells for self-localisation behaviours. Future investigations will look into the coherence between these two neural populations, which may together form a complete neural system for coding and decoding self-location in the brain.
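The distinction between the two grid coding schemes can be made concrete with a toy firing-rate model; the cosine field shape and all names below are my own illustrative choices, not from the thesis:

```python
import math


def grid_rate(pos, spacing, phase):
    """Cosine-shaped periodic firing rate along a linear track (toy grid field)."""
    return 0.5 * (1.0 + math.cos(2.0 * math.pi * (pos - phase) / spacing))


def field_phase(coding, track_phase, trial_start):
    """Position code: fields are anchored to the track, so the phase is the
    same on every trial.  Distance code: fields are anchored to where the
    trial began, so the phase shifts with each trial's start position."""
    if coding == "position":
        return track_phase
    if coding == "distance":
        return track_phase + trial_start
    raise ValueError(coding)
```

Under this sketch, a position-coding cell fires at the same track locations on every trial, which is the property that makes it useful for correcting path-integration error against track landmarks.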

    Approximate Computing Survey, Part I: Terminology and Software & Hardware Approximation Techniques

    The rapid growth of demanding applications in domains applying multimedia processing and machine learning has marked a new era for edge and cloud computing. These applications involve massive data and compute-intensive tasks, and thus typical computing paradigms in embedded systems and data centers are stressed to meet the worldwide demand for high performance. Concurrently, the landscape of the semiconductor field over the last 15 years has made power a first-class design concern. As a result, the computing-systems community has been forced to find alternative design approaches that facilitate high-performance and/or power-efficient computing. Among the examined solutions, Approximate Computing has attracted ever-increasing interest, with research works applying approximations across the entire traditional computing stack, i.e., at the software, hardware, and architectural levels. Over the last decade, a plethora of approximation techniques has emerged in software (programs, frameworks, compilers, runtimes, languages), hardware (circuits, accelerators), and architectures (processors, memories). The current article is Part I of our comprehensive survey on Approximate Computing: it reviews its motivation, terminology, and principles, and classifies and presents the technical details of state-of-the-art software and hardware approximation techniques. Comment: Under review at ACM Computing Surveys.
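One of the simplest software approximation techniques that surveys of this area typically cover, loop perforation, fits in a few lines. This is an illustrative sketch of the general technique, not code from the article:

```python
def mean_exact(xs):
    """Baseline: visit every element."""
    return sum(xs) / len(xs)


def mean_perforated(xs, stride=2):
    """Loop perforation: visit only every `stride`-th element, trading a
    bounded accuracy loss for roughly a `stride`-fold reduction in work."""
    sampled = xs[::stride]
    return sum(sampled) / len(sampled)
```

On smooth or well-mixed data the perforated result stays close to the exact one, which is why perforation is attractive for error-tolerant kernels such as media processing and machine learning inference.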

    Beam scanning by liquid-crystal biasing in a modified SIW structure

    A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for application in 2D scanning with lateral alignment. The 2D array environment imposes full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing; the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW) modified to work as a Groove Gap Waveguide, with radiating slots etched on the upper broad wall so that it radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs. At the same time, the RF field remains laterally confined, making it possible to place several antennas side by side and achieve 2D beam scanning. The design is validated by simulation employing the measured properties of a commercial LC medium.
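For context, fixed-frequency scanning with LCs can be understood through the standard leaky-wave beam-pointing relation; this is textbook background, not a formula quoted from the paper:

```latex
% Standard leaky-wave antenna pointing relation (illustrative):
\sin\theta \approx \frac{\beta(V_{\mathrm{bias}})}{k_0},
```

where $k_0$ is the free-space wavenumber and $\beta$ the phase constant of the guided mode. Applying the DC bias reorients the LC molecules and changes the effective permittivity seen by the RF field, which changes $\beta$ and therefore steers the beam angle $\theta$ without changing the operating frequency.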

    Study of neural circuits using multielectrode arrays in movement disorders

    Bachelor's thesis in Biomedical Engineering (Treballs Finals de Grau d'Enginyeria Biomèdica). Faculty of Medicine and Health Sciences, Universitat de Barcelona. Academic year: 2022-2023. Tutor/Director: Rodríguez Allué, Manuel José.
    Neurodegenerative movement-related disorders are characterized by a progressive degeneration and loss of neurons, which lead to motor control impairment. Although the precise mechanisms underlying these conditions are still unknown, an increasing number of studies point towards the analysis of neural networks and functional connectivity to unravel novel insights. The main objective of this work is to understand cellular mechanisms related to dysregulated motor control symptoms in movement disorders, such as Chorea-Acanthocytosis (ChAc), by employing multielectrode arrays (MEAs) to analyze the electrical activity of neuronal networks in mouse models. We found no notable differences in cell viability between neurons with and without knockdown of VPS13A, the only gene known to be implicated in the disease, suggesting that the absence of VPS13A in neurons may be partially compensated by other proteins. The MEA setup used to capture the electrical activity of primary neuron cultures is described in detail, highlighting its specific characteristics. Finally, we present the alternative backup approach implemented to overcome the challenges faced during the research process and to explore advanced algorithms for signal processing and analysis. In this report, we give a thorough account of the conception and implementation of our research, outlining the limitations encountered over the course of the project, and we provide a detailed analysis of the project's economic and technical feasibility, as well as a comprehensive overview of the ethical and legal aspects considered during its execution.

    Emergence of Adaptive Circadian Rhythms in Deep Reinforcement Learning

    Full text link
    Adapting to regularities of the environment is critical for biological organisms to anticipate events and plan. A prominent example is the circadian rhythm, corresponding to the internalization by organisms of the 24-hour period of the Earth's rotation. In this work, we study the emergence of circadian-like rhythms in deep reinforcement learning agents. In particular, we deployed agents in an environment with a reliable periodic variation while solving a foraging task. We systematically characterize the agent's behavior during learning and demonstrate the emergence of a rhythm that is endogenous and entrainable. Interestingly, the internal rhythm adapts to shifts in the phase of the environmental signal without any re-training. Furthermore, we show via bifurcation and phase response curve analyses how artificial neurons develop dynamics to support the internalization of the environmental rhythm. From a dynamical systems view, we demonstrate that the adaptation proceeds by the emergence of a stable periodic orbit in the neuron dynamics with a phase response that allows an optimal phase synchronisation between the agent's dynamics and the environmental rhythm. Comment: ICML 202
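The entrainment behaviour described here (an endogenous rhythm that re-locks to a phase-shifted environmental signal without re-training) can be mimicked by a minimal phase-oscillator model. This is my own sketch of the general mechanism, not the agents' actual learned dynamics:

```python
import math


def entrain(phase, env_phase, omega=2.0 * math.pi / 24.0, coupling=0.3, dt=1.0):
    """One Euler step of a phase oscillator nudged toward an external rhythm:
    d(phase)/dt = omega + coupling * sin(env_phase - phase)."""
    return phase + dt * (omega + coupling * math.sin(env_phase - phase))


def simulate(phase0, shift=0.0, hours=480):
    """Run the oscillator against a 24-h environmental signal whose phase is
    shifted, and return the residual phase difference wrapped to (-pi, pi]."""
    omega = 2.0 * math.pi / 24.0
    phase = phase0
    for t in range(hours):
        env_phase = omega * t + shift
        phase = entrain(phase, env_phase)
    diff = (phase - (omega * hours + shift) + math.pi) % (2.0 * math.pi) - math.pi
    return diff
```

Regardless of the environmental phase shift, the oscillator settles onto the entrained orbit, echoing the paper's observation that the learned rhythm re-synchronises after a phase shift with no further training.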

    A Design Science Research Approach to Smart and Collaborative Urban Supply Networks

    Urban supply networks are facing increasing demands and challenges and thus constitute a relevant field for research and practical development. Supply chain management holds enormous potential and relevance for society and everyday life, as the flows of goods and information are important economic functions. As a heterogeneous field, supply chain management has a research literature that is difficult to manage and navigate. Disruptive digital technologies and the implementation of cross-network information analysis and sharing drive the need for new organisational and technological approaches. Practical issues are manifold and include megatrends such as digital transformation, urbanisation, and environmental awareness. A promising approach to solving these problems is the realisation of smart and collaborative supply networks. The growth of artificial intelligence in recent years has led to a wide range of applications in a variety of domains; however, the potential of artificial intelligence in supply chain management has not yet been fully exploited. Similarly, value creation increasingly takes place in networked value creation cycles that have become continuously more collaborative, complex, and dynamic as interactions in business processes involving information technologies have intensified. Following a design science research approach, this cumulative thesis comprises the development and discussion of four artefacts for the analysis and advancement of smart and collaborative urban supply networks. The thesis aims to highlight the potential of artificial intelligence-based supply networks, to advance data-driven inter-organisational collaboration, and to improve last-mile supply network sustainability. Based on thorough machine learning and systematic literature reviews, reference and system dynamics modelling, simulation, and qualitative empirical research, the artefacts provide a valuable contribution to research and practice.