83 research outputs found

    On the Thermodynamic Consistency of a Two Micro-Structured Thixotropic Constitutive Model

    The time-dependent rheological behavior of thixotropic fluids appears in various industrial fields (cosmetics, food, oil, etc.). Usually, a pair of equations defines the constitutive model for thixotropic substances: a constitutive equation based on linear viscoelastic models and a rate equation (an equation describing the micro-structural evolution of the substance). Many constitutive models do not account for the micro-structural dependence of the shear modulus and viscosity in the dynamic principles from which they are developed. The modified Jeffreys model (which considers only a single micro-structure type) does not show this incoherence in its formulation. In this chapter, a constitutive model for thixotropic fluids based on the modified Jeffreys model is presented with the addition of a second micro-structure type, along with comments on some possible generalizations. The rheological coherence and thermodynamic consistency of this constitutive model are also analyzed. The model considers simple isothermal laminar shear flows, and the micro-structure dynamics are related to Brownian motion and the de Gennes reptation model via Smoluchowski's coagulation theory.
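
    The pairing the abstract describes, a viscoelastic constitutive equation coupled to rate equations for the micro-structure, can be illustrated with a small start-up-of-shear simulation. The sketch below is only an assumption-laden analogue, not the chapter's model: two structure parameters with their own build-up/breakdown kinetics feed a Jeffreys-type (Maxwell plus solvent) stress whose viscosity and modulus depend on the micro-structure. All function names and parameter values are invented for illustration.

    import numpy as np

    def simulate(gdot=1.0, dt=1e-3, t_end=20.0,
                 k_build=(0.2, 0.05), k_break=(1.0, 0.3),
                 eta_inf=0.1, d_eta=(5.0, 2.0), G0=10.0, eta_s=0.05):
        n = int(t_end / dt)
        lam = np.array([1.0, 1.0])       # two structure parameters, fully built at t = 0
        tau = 0.0                        # stress carried by the Maxwell (elastic) branch
        out = np.empty(n)
        for i in range(n):
            # rate equations: Brownian build-up versus shear-induced breakdown
            dlam = np.array(k_build) * (1.0 - lam) - np.array(k_break) * lam * abs(gdot)
            lam = lam + dt * dlam
            # structure-dependent viscosity and modulus (the dependence the abstract
            # says many models leave out of the dynamic principles they start from)
            eta = eta_inf + d_eta[0] * lam[0] + d_eta[1] * lam[1]
            G = G0 * 0.5 * (lam[0] + lam[1]) + 1e-6
            # the Maxwell branch relaxes towards the viscous stress eta * gdot
            tau = tau + dt * (G / eta) * (eta * gdot - tau)
            out[i] = tau + eta_s * gdot  # parallel solvent branch gives the Jeffreys form
        return out

    stress = simulate()
    print("shear stress at start / end of start-up flow:", stress[0], stress[-1])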

    Paternal Diet Defines Offspring Chromatin State and Intergenerational Obesity

    The global rise in obesity has revitalized a search for genetic and epigenetic factors underlying the disease. We present a Drosophila model of paternal-diet-induced intergenerational metabolic reprogramming (IGMR) and identify genes required for its encoding in offspring. Intriguingly, we find that as little as 2 days of dietary intervention in fathers elicits obesity in offspring. Paternal sugar acts as a physiological suppressor of variegation, desilencing chromatin-state-defined domains in both mature sperm and offspring embryos. We identify requirements for H3K9/K27me3-dependent reprogramming of metabolic genes in two distinct germline and zygotic windows. Critically, we find evidence that a similar system may regulate obesity susceptibility and phenotype variation in mice and humans. The findings provide insight into the mechanisms underlying intergenerational metabolic reprogramming and carry profound implications for our understanding of phenotypic variation and evolution.

    How Memory Conforms to Brain Development

    Nature exhibits countless examples of adaptive networks, whose topology constantly evolves in coupling with the activity arising from their function. The brain is an illustrative example of a system in which a dynamic complex network develops through the generation and pruning of synaptic contacts between neurons while memories are acquired and consolidated. Here, we consider a recently proposed brain-development model to study how the mechanisms responsible for the evolution of brain structure affect, and are affected by, memory storage processes. Following recent experimental observations, we assume that the basic rules for adding and removing synapses depend on local synaptic currents at the respective neurons, in addition to global mechanisms depending on the mean connectivity. In this way a feedback loop between “form” and “function” spontaneously emerges that influences the ability of the system to optimally store and retrieve sensory information in patterns of brain activity, or memories. In particular, we report that, as a consequence of such a feedback loop, oscillations in the activity of the system among the memorized patterns can occur, depending on parameters, reminiscent of dynamical processes of the mind. Such oscillations originate in the destabilization of memory attractors due to the pruning dynamics, which induces a kind of structural disorder or noise in the system on a long-term scale. This constantly modifies the synaptic disorder induced by the interference among the many patterns of activity memorized in the system. This intriguing oscillatory behavior is associated only with long-term synaptic mechanisms during the network evolution dynamics; it does not depend on short-term synaptic processes, assumed in other studies, that are not present in our model. Financial support is acknowledged from the Spanish Ministry of Science and Technology and the Agencia Española de Investigación (AEI) under grant FIS2017-84256-P (FEDER funds), and from the Obra Social La Caixa (ID 100010434, code LCF/BQ/ES15/10360004). This study has also been partially financed by the Consejería de Conocimiento, Investigación y Universidad, Junta de Andalucía, and the European Regional Development Fund (ERDF), reference SOMM17/6105/UGR.
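
    A crude way to see the feedback loop between wiring and retrieval described above is a Hopfield-style toy in which synapses carrying weak local currents are occasionally pruned while new ones grow at random. The sketch below is an assumption for illustration, not the authors' model; the network size, thresholds and rates are arbitrary, and the printout simply tracks how the overlaps with the stored patterns drift as the structure keeps changing.

    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 200, 5
    patterns = rng.choice([-1, 1], size=(P, N))
    W = (patterns.T @ patterns) / N            # Hebbian couplings for the stored memories
    np.fill_diagonal(W, 0.0)
    mask = rng.random((N, N)) < 0.3            # sparse structural connectivity
    mask &= mask.T                             # symmetric wiring at the start
    np.fill_diagonal(mask, False)

    s = patterns[0].copy()                     # start the dynamics in pattern 0
    for step in range(200):
        h = (W * mask) @ s                     # local synaptic currents on the current wiring
        s = np.where(h >= 0, 1, -1)            # neural (retrieval) dynamics
        # structural plasticity: prune synapses between neurons with weak local currents,
        # grow a few random synapses so the mean connectivity does not collapse
        weak = np.abs(h) < np.quantile(np.abs(h), 0.1)
        mask[np.ix_(weak, weak)] &= rng.random((weak.sum(), weak.sum())) > 0.05
        grow = rng.random((N, N)) < 0.001
        mask |= grow | grow.T
        if step % 50 == 0:
            print(step, np.round(patterns @ s / N, 2))   # overlaps with the stored memories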

    An energy–momentum time integration scheme based on a convex multi-variable framework for non-linear electro-elastodynamics

    This paper introduces a new one-step, second-order accurate energy–momentum (EM) preserving time integrator for reversible electro-elastodynamics. The new scheme is shown to be extremely useful for the long-term simulation of electroactive polymers (EAPs) undergoing massive strains and/or electric fields. The paper presents the following main novelties. (1) The formulation of a new energy–momentum time integration scheme in the context of nonlinear electro-elastodynamics. (2) The consideration of well-posed ab initio convex multi-variable constitutive models. (3) Based on the use of alternative mixed variational principles, the introduction of two different EM time integration strategies (one based on the Helmholtz free energy and the other on the internal energy). (4) A new time integrator that relies on the definition of four discrete derivatives of the internal/Helmholtz energies, representing the algorithmic counterparts of the work conjugates of the right Cauchy–Green deformation tensor, its co-factor, its determinant, and the Lagrangian electric displacement field. (5) Proof of thermodynamic consistency and of second-order accuracy with respect to time of the resulting algorithm. Finally, a series of numerical examples is included to demonstrate the robustness and conservation properties of the proposed scheme, specifically in the case of long-term simulations.
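
    The device behind item (4), a "discrete derivative" built so that the algorithmic work conjugates reproduce the exact energy increment over a step, can be illustrated on a single degree of freedom. The sketch below is only a scalar analogue with an assumed toy potential, not the paper's tensor-valued scheme for the right Cauchy–Green tensor, its co-factor, its determinant and the electric displacement; it shows why the defining property of the discrete derivative makes the stepper conserve energy exactly.

    import numpy as np

    def U(q):                 # toy stored energy (Duffing-type potential), an assumption
        return 0.5 * q**2 + 0.25 * q**4

    def dU(q):                # exact derivative, used only when the step in q is tiny
        return q + q**3

    def discrete_derivative(q_old, q_new, eps=1e-10):
        # defining property: DU * (q_new - q_old) == U(q_new) - U(q_old)
        dq = q_new - q_old
        if abs(dq) < eps:
            return dU(0.5 * (q_old + q_new))
        return (U(q_new) - U(q_old)) / dq

    def em_step(q, p, dt, m=1.0, iters=50):
        # one implicit energy-momentum step, solved by fixed-point iteration
        q_new, p_new = q, p
        for _ in range(iters):
            DU = discrete_derivative(q, q_new)
            p_new = p - dt * DU
            q_new = q + dt * (p + p_new) / (2.0 * m)
        return q_new, p_new

    q, p, dt = 1.0, 0.0, 0.05
    E0 = p**2 / 2 + U(q)
    for _ in range(2000):
        q, p = em_step(q, p, dt)
    print("energy drift after 2000 steps:", p**2 / 2 + U(q) - E0)   # ~0 up to solver tolerance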

    Energy efficient sparse connectivity from imbalanced synaptic plasticity rules

    It is believed that energy efficiency is an important constraint in brain evolution. As synaptic transmission dominates energy consumption, energy can be saved by ensuring that only a few synapses are active. It is therefore likely that the formation of sparse codes and sparse connectivity are fundamental objectives of synaptic plasticity. In this work we study how sparse connectivity can result from a synaptic learning rule of excitatory synapses. Information is maximised when potentiation and depression are balanced according to the mean presynaptic activity level and the resulting fraction of zero-weight synapses is around 50%. However, an imbalance towards depression increases the fraction of zero-weight synapses without significantly affecting performance. We show that imbalanced plasticity corresponds to imposing a regularising constraint on the L1-norm of the synaptic weight vector, a procedure that is well known to induce sparseness. Imbalanced plasticity is biophysically plausible and leads to more efficient synaptic configurations than a previously suggested approach that prunes synapses after learning. Our framework gives a novel interpretation to the high fraction of silent synapses found in brain regions like the cerebellum.
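
    The correspondence between an imbalance towards depression and an L1 penalty on a non-negative weight vector can be made concrete with a toy learning problem. The sketch below is an illustrative assumption, not the paper's model: a least-mean-squares rule plays the role of balanced potentiation/depression, the extra constant depression per update is the (sub)gradient of an L1 term, and clipping at zero enforces that excitatory weights stay non-negative. The task, the rates and the imbalance values are invented.

    import numpy as np

    rng = np.random.default_rng(1)
    n_syn, n_samples = 500, 2000
    x = (rng.random((n_samples, n_syn)) < 0.1).astype(float)   # sparse presynaptic activity
    teacher = rng.random(n_syn) * (rng.random(n_syn) < 0.2)    # only 20% of inputs matter
    y = x @ teacher + 0.1 * rng.standard_normal(n_samples)     # target postsynaptic drive

    def train(imbalance, lr=0.01, epochs=10):
        w = np.zeros(n_syn)
        for _ in range(epochs):
            for xi, yi in zip(x, y):
                err = yi - xi @ w
                w += lr * err * xi          # balanced potentiation/depression (LMS) part
                w -= lr * imbalance         # constant extra depression = L1 (sub)gradient
                np.maximum(w, 0.0, out=w)   # excitatory weights cannot go negative
        return w

    for imb in (0.0, 0.02, 0.05):
        w = train(imb)
        print(f"imbalance={imb:.2f}  zero-weight fraction={np.mean(w == 0.0):.2f}  "
              f"mse={np.mean((x @ w - y) ** 2):.3f}")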

    Spike-Based Bayesian-Hebbian Learning of Temporal Sequences

    Many cognitive and motor functions are enabled by the temporal representation and processing of stimuli, but it remains an open issue how neocortical microcircuits can reliably encode and replay such sequences of information. To better understand this, a modular attractor memory network is proposed in which meta-stable sequential attractor transitions are learned through changes to synaptic weights and intrinsic excitabilities via the spike-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule. We find that the formation of distributed memories, embodied by increased periods of firing in pools of excitatory neurons, together with asymmetrical associations between these distinct network states, can be acquired through plasticity. The model's feasibility is demonstrated using simulations of adaptive exponential integrate-and-fire model neurons (AdEx). We show that the learning and speed of sequence replay depend on a confluence of biophysically relevant parameters including stimulus duration, level of background noise, ratio of synaptic currents, and strengths of short-term depression and adaptation. Moreover, sequence elements are shown to flexibly participate multiple times in the sequence, suggesting that spiking attractor networks of this type can support an efficient combinatorial code. The model provides a principled approach towards understanding how multiple interacting plasticity mechanisms can coordinate hetero-associative learning in unison.
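
    The BCPNN rule referenced above can be caricatured in rate form: exponentially filtered estimates of pre-, post- and co-activation probabilities are combined into a log-ratio weight, with the log of the postsynaptic probability acting as an intrinsic excitability. The sketch below is a simplified, assumption-laden rate version (the time constants, the epsilon regulariser and the toy input are not taken from the paper), not the spike-based implementation used with the AdEx neurons.

    import numpy as np

    def bcpnn_update(pi, pj, pij, xi, xj, dt=1.0, tau_p=1000.0, eps=1e-4):
        # one Euler step of the probability traces, then the Bayesian weight/bias read-out
        pi += dt / tau_p * (xi - pi)
        pj += dt / tau_p * (xj - pj)
        pij += dt / tau_p * (xi * xj - pij)
        w = np.log((pij + eps**2) / ((pi + eps) * (pj + eps)))   # weight ~ log p_ij / (p_i p_j)
        bias = np.log(pj + eps)                                  # intrinsic excitability ~ log p_j
        return pi, pj, pij, w, bias

    # toy usage: two units that are repeatedly active together develop a positive weight
    pi = pj = pij = 0.01
    for t in range(5000):
        co_active = 1.0 if (t // 100) % 2 == 0 else 0.0   # on/off epochs shared by both units
        pi, pj, pij, w, bias = bcpnn_update(pi, pj, pij, co_active, co_active)
    print("learned weight:", round(w, 3), "intrinsic bias:", round(bias, 3))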

    Hippocampal Mechanisms for the Segmentation of Space by Goals and Boundaries
