764 research outputs found

    How to investigate the expansion dynamics of the fireball? (Kako ispitati dinamiku širenja vatrene lopte?)

    Model predictions for the yield distributions of different reaction products and for the collective energy of the expanding fireball as a function of break-up time, corroborated by the experimental results on small-angle two-particle longitudinal correlation functions, the dependence of the transition energy on transverse momentum, and the azimuthal distributions of the collective expansion, clearly demonstrate the possibility of accessing different time windows in the dynamics of the fireball
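    The small-angle two-particle correlation analysis invoked above is conventionally constructed as the ratio of the coincidence yield to an uncorrelated reference obtained by event mixing; the following is the standard textbook form, shown for orientation and not taken from the paper itself:

        \[
          1 + R(q_{\parallel}) \;=\; \mathcal{N}\,\frac{Y_{12}(q_{\parallel})}{Y_{\mathrm{mix}}(q_{\parallel})},
          \qquad q_{\parallel} = \bigl|(\vec{p}_{1} - \vec{p}_{2})_{\parallel}\bigr|,
        \]

    where \(Y_{12}\) is the yield of true coincident pairs, \(Y_{\mathrm{mix}}\) the yield of pairs mixed from different events, and the normalization \(\mathcal{N}\) fixes \(R\) to zero at large relative momenta; the space-time extent and emission-time ordering of the source are then read off from the shape of \(R\).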

    The coming decade of digital brain research - A vision for neuroscience at the intersection of technology and computing

    Brain research has in recent years indisputably entered a new epoch, driven by substantial methodological advances and digitally enabled data integration and modeling at multiple scales, from molecules to the whole system. Major advances are emerging at the intersection of neuroscience with technology and computing. This new science of the brain integrates high-quality basic research, systematic data integration across multiple scales, a new culture of large-scale collaboration and translation into applications. A systematic approach, as pioneered in Europe’s Human Brain Project (HBP), will be essential in meeting the pressing medical and technological challenges of the coming decade. The aims of this paper are: to develop a concept for the coming decade of digital brain research; to discuss it with the research community at large, with the aim of identifying points of convergence and common goals; to provide a scientific framework for current and future development of EBRAINS; to inform and engage stakeholders, funding organizations and research institutions regarding future digital brain research; and to identify and address key ethical and societal issues. While we do not claim that there is a ‘one size fits all’ approach to addressing these aspects, we are convinced that discussions around the theme of digital brain research will help drive progress in the broader field of neuroscience

    Data mining electronic health records of type 2 diabetes uncontrolled patients towards clustering LDL-cholesterol patterns

    Cardiovascular Diseases (CVD) remain the leading cause of death worldwide, constituting a risk factor for patients with diabetes and simultaneously a consequence of dyslipidemia. Effective lipid management of patients with diabetes is still largely unattained, requiring better awareness from both patients and healthcare professionals. Aiming at a better understanding of the influence of clinical parameters on Low-Density Lipoprotein (LDL)-cholesterol patterns of uncontrolled type 2 diabetes patients, the Electronic Health Records (EHR) provided by APDP (Associação Protetora de Diabetes Portugal) have been subjected to data mining techniques. The database content was first analyzed to understand data integrity and to avoid the use of corrupted EHR values or misleading information. The statistical distribution of each clinical parameter reported in the database was examined to identify its individual behavior and to enable statistically coherent identification of the cohort to be used when modeling LDL. As a first approach, LDL linear modeling was considered, using both ordinary least-squares and stepwise approaches. Then, LDL non-linear modeling was tested, using the same populations employed in linear modeling, to assess the most accurate and practical LDL model. The provided EHR included 32577 medical appointments held by 1767 patients between January 2008 and February 2018. More than 10 clinical features were studied, leading to the decision to limit the case-study population to those patients who had at least 5 Medical Appointments (MA) during the decade. Of all MAs, 32% and 63% reported LDL and Glycated Hemoglobin (HbA1c) measurements, respectively, but some MAs did not report both simultaneously. Six linear models, relating different sets of 6 clinical parameters, were tested. Linear model 3, involving LDL, Total Cholesterol, HDL, Triglycerides, HbA1c and Platelets, is the selected linear model, with a Root Mean Square Error (RMSE) of 0.07. The model where Platelets are replaced by Proteinuria presents an RMSE of just 0.054 but employed only 38 case studies. Neural-network-based modeling strategies were tested as an alternative to linear models. To this end, the Multi-Objective Genetic Algorithm (MOGA) was used. After data preprocessing, MOGA was run twice using different threshold values. Six models were developed considering different combinations of clinical parameters. For each model, the population was divided into 3 groups: 60% of the population was used to train the network, 20% to test the model and the remaining 20% to validate the model. Using the populations employed by each MOGA run, the stepwise algorithm was used to identify the relevance of each clinical parameter in the model and create another linear model using this parameter set. The MOGA model with the best training performance was model 4, while model 2 performed best in validation, with an RMSE of 0.057. However, linear model 5, created using the parameter selection identified by MOGA, presented an RMSE of 0.054 during validation when total cholesterol, HDL, triglycerides, HbA1c, microalbuminuria, creatinine, MDRD, sex and age are used in the composition of the LDL linear model. Therefore, we can conclude that LDL can be modeled by a linear model using 6 or 10 clinical variables with very low mean square error
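    As a minimal sketch of the linear-modeling step described above (ordinary least squares, a 60/20/20 train/test/validation split, RMSE as the score), the following assumes a pandas DataFrame with hypothetical column names standing in for the model-3 parameters; it is an illustration, not the pipeline used in the study:

        # Minimal OLS sketch for the LDL model described above; column names are
        # hypothetical placeholders, not the APDP database schema.
        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import train_test_split

        def fit_ldl_model(ehr: pd.DataFrame) -> float:
            """Fit LDL ~ total cholesterol + HDL + triglycerides + HbA1c + platelets
            and return the validation RMSE (60% train / 20% test / 20% validation)."""
            features = ["total_cholesterol", "hdl", "triglycerides", "hba1c", "platelets"]
            data = ehr.dropna(subset=features + ["ldl"])
            X, y = data[features].to_numpy(), data["ldl"].to_numpy()

            X_train, X_rest, y_train, y_rest = train_test_split(X, y, train_size=0.6, random_state=0)
            X_test, X_val, y_test, y_val = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

            model = LinearRegression().fit(X_train, y_train)
            return float(np.sqrt(np.mean((model.predict(X_val) - y_val) ** 2)))

    A stepwise variant would simply repeat the fit while adding or dropping one feature at a time, keeping a change only if the test-set RMSE improves.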

    Form vs. Function: Theory and Models for Neuronal Substrates

    The quest for endowing form with function represents the fundamental motivation behind all neural network modeling. In this thesis, we discuss various functional neuronal architectures and their implementation in silico, both on conventional computer systems and on neuromorphic devices. Necessarily, such casting to a particular substrate will constrain their form, either by requiring a simplified description of neuronal dynamics and interactions or by imposing physical limitations on important characteristics such as network connectivity or parameter precision. While our main focus lies on the computational properties of the studied models, we augment our discussion with rigorous mathematical formalism. We start by investigating the behavior of point neurons under synaptic bombardment and provide analytical predictions of single-unit and ensemble statistics. These considerations later become useful when moving to the functional network level, where we study the effects of an imperfect physical substrate on the computational properties of several cortical networks. Finally, we return to the single neuron level to discuss a novel interpretation of spiking activity in the context of probabilistic inference through sampling. We provide analytical derivations for the translation of this "neural sampling" framework to networks of biologically plausible and hardware-compatible neurons and later take this concept beyond the realm of brain science when we discuss applications in machine learning and analogies to solid-state systems
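    To make the "point neurons under synaptic bombardment" setting concrete, here is a small simulation sketch of a leaky integrate-and-fire neuron driven by Poisson input; the parameter values and the delta-synapse model are illustrative assumptions, not those used in the thesis:

        # Illustrative LIF neuron under Poisson synaptic bombardment (placeholder parameters).
        import numpy as np

        def simulate_lif(rate_hz=5000.0, w=0.5, tau_m=20.0, v_th=15.0, v_reset=0.0,
                         dt=0.1, t_sim=10_000.0, seed=0):
            """Return the membrane trace (mV) and spike times (ms)."""
            rng = np.random.default_rng(seed)
            steps = int(t_sim / dt)
            v, trace, spikes = 0.0, np.empty(steps), []
            p_in = rate_hz * dt * 1e-3                # expected input spikes per time step
            for i in range(steps):
                n_in = rng.poisson(p_in)              # Poisson bombardment in this bin
                v += (-v / tau_m) * dt + w * n_in     # leak plus instantaneous synaptic kicks
                if v >= v_th:                         # threshold crossing: spike and reset
                    spikes.append(i * dt)
                    v = v_reset
                trace[i] = v
            return trace, np.array(spikes)

        trace, spikes = simulate_lif()
        print(f"mean membrane potential: {trace.mean():.2f} mV, "
              f"output rate: {len(spikes) / (10_000.0 * 1e-3):.1f} Hz")

    The free membrane statistics (mean, variance, autocorrelation) of such a model can be predicted analytically in the high-input-rate limit, which is the kind of single-unit prediction the abstract refers to.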

    The Yin-Yang dataset

    The Yin-Yang dataset was developed for research on biologically plausible error backpropagation and deep learning in spiking neural networks. It serves as an alternative to classic deep learning datasets, especially in early-stage prototyping scenarios for both network models and hardware platforms, for which it provides several advantages. First, it is smaller and therefore faster to learn, which makes it better suited for small-scale exploratory studies in both software simulations and hardware prototypes. Second, it exhibits a very clear gap between the accuracies achievable with shallow as compared to deep neural networks. Third, it is easily transferable between spatial and temporal input domains, making it interesting for different types of classification scenarios
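    As an illustration of the claimed transferability between spatial and temporal input domains, the sketch below presents a 2D sample either as analog values or as time-to-first-spike latencies; the 4-dimensional mirrored input and the latency convention are assumptions made for this example, not a specification of the dataset:

        # Illustrative encodings of one Yin-Yang sample (x, y); conventions are assumed.
        import numpy as np

        def spatial_encoding(x: float, y: float) -> np.ndarray:
            """Analog/rate-style input; the mirrored pair (1-x, 1-y) is a common
            convention for bias-free networks, assumed here for illustration."""
            return np.array([x, y, 1.0 - x, 1.0 - y])

        def temporal_encoding(x: float, y: float, t_early=0.0, t_late=2.0) -> np.ndarray:
            """Time-to-first-spike latencies (ms): larger input values spike earlier."""
            return t_late - spatial_encoding(x, y) * (t_late - t_early)

        print(spatial_encoding(0.8, 0.3))   # [0.8 0.3 0.2 0.7]
        print(temporal_encoding(0.8, 0.3))  # [0.4 1.4 1.6 0.6]  (ms)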

    Natural-gradient learning for spiking neurons.

    In many normative theories of synaptic plasticity, weight updates implicitly depend on the chosen parametrization of the weights. This problem relates, for example, to neuronal morphology: synapses which are functionally equivalent in terms of their impact on somatic firing can differ substantially in spine size due to their different positions along the dendritic tree. Classical theories based on Euclidean-gradient descent can easily lead to inconsistencies due to such parametrization dependence. The issues are solved in the framework of Riemannian geometry, in which we propose that plasticity instead follows natural-gradient descent. Under this hypothesis, we derive a synaptic learning rule for spiking neurons that couples functional efficiency with the explanation of several well-documented biological phenomena such as dendritic democracy, multiplicative scaling, and heterosynaptic plasticity. We therefore suggest that in its search for functional synaptic plasticity, evolution might have come up with its own version of natural-gradient descent
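    For reference, the natural-gradient update referred to above replaces the Euclidean gradient of the cost C by its metric-preconditioned counterpart, with the Fisher information of the neuron's output distribution as the metric; this is the standard general form, not the paper's specific derivation for spiking neurons:

        \[
          \Delta \mathbf{w} \;\propto\; -\,G(\mathbf{w})^{-1}\,\nabla_{\mathbf{w}} C(\mathbf{w}),
          \qquad
          G_{ij}(\mathbf{w}) \;=\; \mathbb{E}\!\left[
            \partial_{w_i} \log p(\text{output} \mid \mathbf{w})\;
            \partial_{w_j} \log p(\text{output} \mid \mathbf{w})
          \right],
        \]

    Because G transforms covariantly under smooth reparametrizations of the weights, the resulting weight change, unlike the plain Euclidean one, does not depend on how the synapses happen to be parametrized.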

    Natural-gradient learning for spiking neurons

    In many normative theories of synaptic plasticity, weight updates implicitly depend on the chosen parametrization of the weights. This problem relates, for example, to neuronal morphology: synapses which are functionally equivalent in terms of their impact on somatic firing can differ substantially in spine size due to their different positions along the dendritic tree. Classical theories based on Euclidean gradient descent can easily lead to inconsistencies due to such parametrization dependence. The issues are solved in the framework of Riemannian geometry, in which we propose that plasticity instead follows natural gradient descent. Under this hypothesis, we derive a synaptic learning rule for spiking neurons that couples functional efficiency with the explanation of several well-documented biological phenomena such as dendritic democracy, multiplicative scaling and heterosynaptic plasticity. We therefore suggest that in its search for functional synaptic plasticity, evolution might have come up with its own version of natural gradient descent.
    Comment: Joint senior authorship: Walter M. Senn and Mihai A. Petrovici
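    A minimal numerical illustration of the parametrization dependence mentioned above, using a generic toy loss rather than the paper's learning rule: plain gradient descent on the same objective behaves differently when the weight is reparametrized, while preconditioning with the pulled-back metric (a one-dimensional natural gradient) restores agreement to first order in the learning rate.

        # Toy demo: L(w) = 0.5*(w - 1)**2 minimized in w and in theta = log(w).
        import numpy as np

        eta, w0, steps = 0.1, 3.0, 5

        def grad_L(w):                      # dL/dw
            return w - 1.0

        w = w0                              # (1) Euclidean descent directly on w
        for _ in range(steps):
            w -= eta * grad_L(w)

        theta = np.log(w0)                  # (2) Euclidean descent on theta = log(w)
        for _ in range(steps):
            theta -= eta * grad_L(np.exp(theta)) * np.exp(theta)      # chain rule: dL/dtheta

        theta_nat = np.log(w0)              # (3) natural gradient on theta, G = (dw/dtheta)**2
        for _ in range(steps):
            w_cur = np.exp(theta_nat)
            theta_nat -= eta * grad_L(w_cur) * w_cur / w_cur**2       # G**-1 * dL/dtheta

        print(f"descent on w:          w = {w:.3f}")
        print(f"Euclidean on theta:    w = {np.exp(theta):.3f}")      # deviates markedly
        print(f"natural grad on theta: w = {np.exp(theta_nat):.3f}")  # tracks the run on w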