    Online shopping behaviour of Arab students at College of Business Universiti Utara Malaysia

    This study attempts to elicit the internet purchasing process among Arab students at the College of Business (COB) of Universiti Utara Malaysia (UUM) and to examine the influence of perceived usefulness, perceived ease of use and perceived credibility on their online shopping behaviour. The participants were Arab students at the College of Business, Universiti Utara Malaysia, as they were expected to come from a variety of personal backgrounds. A descriptive research design was used, and a questionnaire with a seven-point scale was employed to collect data for the constructs of the research model. Items from previous studies were modified for adaptation to the internet behaviour context. The analysis of the results confirmed the influence of perceived ease of use, perceived usefulness and perceived credibility on online shopping behaviour. A major conclusion of the study was that perceived ease of use, perceived usefulness and credibility with the internet have a direct positive influence on user adoption. Lecturers, staff and students are well placed to drive improvements to the system; the academic and management systems are the main determinants of its performance, and the level of user acceptance will influence its success.
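
    The abstract does not state which statistical technique was used, so the sketch below is only a generic illustration of how seven-point-scale constructs are commonly related, here by an ordinary least squares regression in Python with statsmodels; the column names and data are invented for the example.

    # Illustrative only: regress a hypothetical shopping-behaviour score on the
    # three constructs measured on seven-point scales. Data are synthetic.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 120  # hypothetical number of respondents
    df = pd.DataFrame({
        "ease_of_use": rng.integers(1, 8, n),    # seven-point scale items
        "usefulness": rng.integers(1, 8, n),
        "credibility": rng.integers(1, 8, n),
    })
    # synthetic outcome so the example runs end to end
    df["shopping_behaviour"] = (0.4 * df["ease_of_use"] + 0.3 * df["usefulness"]
                                + 0.2 * df["credibility"] + rng.normal(0, 1, n))

    X = sm.add_constant(df[["ease_of_use", "usefulness", "credibility"]])
    model = sm.OLS(df["shopping_behaviour"], X).fit()
    print(model.summary())   # coefficients indicate each construct's influence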

    Hands-On Parameter Search for Neural Simulations by a MIDI-Controller

    Computational neuroscientists frequently encounter the challenge of parameter fitting: exploring a usually high-dimensional variable space to find a parameter set that reproduces an experimental data set. One common approach is to use automated search algorithms such as gradient descent or genetic algorithms. However, these approaches suffer from several shortcomings related to their lack of understanding of the underlying question, such as defining a suitable error function or getting stuck in local minima. Another widespread approach is manual parameter fitting using a keyboard or a mouse, evaluating different parameter sets following the user's intuition. However, this process is often cumbersome and time-intensive. Here, we present a new method for manual parameter fitting. A MIDI controller provides input to the simulation software, where model parameters are then tuned according to the knob and slider positions on the device. The model is immediately updated on every parameter change, continuously plotting the latest results. Given reasonably short simulation times of less than one second, we find this method to be highly efficient in quickly determining good parameter sets. Our approach bears a close resemblance to tuning the sound of an analog synthesizer, giving the user a very good intuition of the problem at hand, such as immediate feedback on whether and how results are affected by specific parameter changes. In addition to being used in research, our approach should be an ideal teaching tool, allowing students to interactively explore complex models such as Hodgkin-Huxley or dynamical systems.
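
    A minimal sketch of the kind of MIDI-to-parameter loop described above, assuming the Python 'mido' library and a connected controller; the controller-number mapping, the parameter ranges and the run_simulation() placeholder are illustrative, not the authors' implementation.

    # Minimal sketch of MIDI-based interactive parameter tuning (not the authors' code).
    import mido

    # Hypothetical mapping: controller number -> (parameter name, min, max)
    PARAM_MAP = {
        1: ("g_na", 50.0, 200.0),   # sodium conductance (mS/cm^2)
        2: ("g_k", 10.0, 80.0),     # potassium conductance (mS/cm^2)
        3: ("g_leak", 0.1, 1.0),    # leak conductance (mS/cm^2)
    }

    params = {name: (lo + hi) / 2 for name, lo, hi in PARAM_MAP.values()}

    def run_simulation(params):
        """Placeholder: run a short (<1 s) model simulation and plot the result."""
        print("simulating with", params)

    with mido.open_input() as port:          # default MIDI input port
        for msg in port:
            if msg.type == "control_change" and msg.control in PARAM_MAP:
                name, lo, hi = PARAM_MAP[msg.control]
                # MIDI CC values range from 0 to 127; rescale to the parameter range
                params[name] = lo + (hi - lo) * msg.value / 127.0
                run_simulation(params)        # update on every knob movement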

    Observers for canonic models of neural oscillators

    We consider the problem of state and parameter estimation for a wide class of nonlinear oscillators. Observable variables are limited to a few components of the state vector and an input signal. The problem of state and parameter reconstruction is viewed within the classical framework of observer design. This framework offers computationally efficient solutions to the problem of state and parameter reconstruction of a system of nonlinear differential equations, provided that these equations are in the so-called adaptive observer canonical form. We show that, despite typical neural oscillators being locally observable, they are not in the adaptive observer canonical form. Furthermore, we show that no parameter-independent diffeomorphism exists such that the original equations of these models can be transformed into the adaptive observer canonical form. We demonstrate, however, that for the class of Hindmarsh-Rose and FitzHugh-Nagumo models, parameter-dependent coordinate transformations can be used to render these systems into the adaptive observer canonical form. This allows reconstruction, at least partially and up to a (bi)linear transformation, of unknown state and parameter values with an exponential rate of convergence. In order to avoid the problem of only partial reconstruction and to deal with more general nonlinear models in which the unknown parameters enter the system nonlinearly, we present a new method for state and parameter reconstruction for these systems. The method combines the advantages of standard Lyapunov-based design with more flexible design and analysis techniques based on non-uniform small-gain theorems. The effectiveness of the method is illustrated with simple numerical examples.
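
    For reference, one commonly used statement of the adaptive observer canonical form, together with the FitzHugh-Nagumo equations, is sketched below in LaTeX; the notation is chosen here for illustration and may differ from the paper's exact formulation.

    \[
    \dot{x} = A\,x + \varphi(y,u) + \Phi(y,u)\,\theta, \qquad y = C\,x,
    \]
    where $x$ is the state, $y$ the measured output, $u$ the input, $\theta$ the vector of unknown parameters entering linearly through functions of the measured signals only, and $(A, C)$ a fixed observable pair. By contrast, in the FitzHugh-Nagumo model
    \[
    \dot{v} = v - \tfrac{v^{3}}{3} - w + I, \qquad
    \dot{w} = \varepsilon\,(v + a - b\,w),
    \]
    only $v$ is typically measured, and the parameters $\varepsilon$ and $b$ multiply the unmeasured recovery variable $w$; this is the kind of obstruction that motivates the parameter-dependent coordinate transformations discussed in the abstract.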

    A calcium-based plasticity model for predicting long-term potentiation and depression in the neocortex

    Pyramidal cells (PCs) form the backbone of the layered structure of the neocortex, and plasticity of their synapses is thought to underlie learning in the brain. However, such long-term synaptic changes have been experimentally characterized between only a few types of PCs, posing a significant barrier for studying neocortical learning mechanisms. Here we introduce a model of synaptic plasticity based on data-constrained postsynaptic calcium dynamics, and show in a neocortical microcircuit model that a single parameter set is sufficient to unify the available experimental findings on long-term potentiation (LTP) and long-term depression (LTD) of PC connections. In particular, we find that the diverse plasticity outcomes across the different PC types can be explained by cell-type-specific synaptic physiology, cell morphology and innervation patterns, without requiring type-specific plasticity. Generalizing the model to in vivo extracellular calcium concentrations, we predict qualitatively different plasticity dynamics from those observed in vitro. This work provides a first comprehensive null model for LTP/LTD between neocortical PC types in vivo, and an open framework for further developing models of cortical synaptic plasticity.

    We thank Michael Hines for helping with synapse model implementation in NEURON; Mariana Vargas-Caballero for sharing NMDAR data; Veronica Egger for sharing in vitro data and for clarifications on the analysis methods; Jesper Sjöström for sharing in vitro data, helpful discussions, and feedback on the manuscript; Ralf Schneggenburger for helpful discussions and clarifications on the NMDAR calcium current model; Fabien Delalondre for helpful discussions; Francesco Casalegno and Taylor Newton for helpful discussion on model fitting; Daniel Keller for helpful discussions on the biophysics of synaptic plasticity; Natali Barros-Zulaica for helpful discussions on MVR modeling and generalization; Srikanth Ramaswamy, Michael Reimann and Max Nolte for feedback on the manuscript; Wulfram Gerstner and Guillaume Bellec for helpful discussions on synaptic plasticity modeling. This study was supported by funding to the Blue Brain Project, a research center of the École polytechnique fédérale de Lausanne, from the Swiss government’s ETH Board of the Swiss Federal Institutes of Technology. E.B.M. received additional support from the CHU Sainte-Justine Research Center (CHUSJRC), the Institute for Data Valorization (IVADO), Fonds de Recherche du Québec–Santé (FRQS), the Canada CIFAR AI Chairs Program, the Quebec Institute for Artificial Intelligence (Mila), and Google. R.B.P. and J.DF. received support from the Spanish “Ministerio de Ciencia e Innovación” (grant PGC2018-094307-B-I00). M.D. and I.S. were supported by a grant from the ETH domain for the Blue Brain Project, the Gatsby Charitable Foundation, and the Drahi Family Foundation.
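
    A minimal sketch of a generic calcium-threshold plasticity rule in the style of Graupner and Brunel (2012): a postsynaptic calcium variable jumps on pre- and postsynaptic spikes and decays exponentially, and the synaptic efficacy drifts up or down while calcium exceeds a potentiation or depression threshold. The parameter values, spike protocol and simplified efficacy dynamics are illustrative, not the paper's data-constrained model.

    # Illustrative calcium-based plasticity rule; values are not the paper's.
    import numpy as np

    dt = 0.1e-3            # time step (s)
    tau_ca = 20e-3         # calcium decay time constant (s)
    c_pre, c_post = 1.0, 2.0         # calcium jumps from pre-/postsynaptic spikes
    theta_d, theta_p = 1.0, 1.3      # depression and potentiation thresholds
    gamma_d, gamma_p = 200.0, 320.0  # drift rates toward depressed/potentiated states
    tau_rho = 150.0        # efficacy time constant (s)

    def simulate(pre_spikes, post_spikes, t_end, rho0=0.5):
        """Evolve postsynaptic calcium c(t) and synaptic efficacy rho(t)."""
        n = int(t_end / dt)
        c, rho = 0.0, rho0
        pre = set(np.round(np.asarray(pre_spikes) / dt).astype(int))
        post = set(np.round(np.asarray(post_spikes) / dt).astype(int))
        for i in range(n):
            c -= dt * c / tau_ca                 # exponential calcium decay
            if i in pre:
                c += c_pre                       # presynaptic (NMDAR-like) transient
            if i in post:
                c += c_post                      # postsynaptic (VDCC-like) transient
            # efficacy drifts up above the potentiation threshold, down above the
            # depression threshold (bistable double-well term omitted for brevity)
            drho = gamma_p * (1 - rho) * (c > theta_p) - gamma_d * rho * (c > theta_d)
            rho += dt * drho / tau_rho
        return rho

    # Example: pre-before-post pairing at 10 ms lag, repeated 60 times at 1 Hz
    pre = np.arange(60) * 1.0
    post = pre + 10e-3
    print("final efficacy:", simulate(pre, post, t_end=61.0))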

    The physiological variability of channel density in hippocampal CA1 pyramidal cells and interneurons explored using a unified data-driven modeling workflow

    Every neuron is part of a network, exerting its function by transforming multiple spatiotemporal synaptic input patterns into a single spiking output. This function is specified by the particular shape and passive electrical properties of the neuronal membrane, and by the composition and spatial distribution of ion channels across its processes. For a variety of physiological or pathological reasons, the intrinsic input/output function may change during a neuron’s lifetime, resulting in high variability in the peak specific conductance of ion channels in individual neurons. The mechanisms responsible for this variability are not well understood, although there are clear indications from experiments and modeling that degeneracy and correlation among multiple channels may be involved. Here, we studied this issue in biophysical models of hippocampal CA1 pyramidal neurons and interneurons. Using a unified data-driven simulation workflow and starting from a set of experimental recordings and morphological reconstructions obtained from rats, we built and analyzed several ensembles of morphologically and biophysically accurate single-cell models with intrinsic electrophysiological properties consistent with experimental findings. The results suggest that the set of conductances expressed in any given hippocampal neuron may be considered as belonging to two groups: one subset is responsible for the major characteristics of the firing behavior in each population, and the other for a robust degeneracy. Analysis of the model neurons suggests several experimentally testable predictions related to the combination and relative proportion of the different conductances that should be expressed on the membrane of different types of neurons for them to fulfill their role in the hippocampal circuitry.
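
    A minimal sketch of one way to probe conductance degeneracy, in the spirit of the ensemble analysis described above: many conductance sets are sampled, only those reproducing target electrophysiological features within experimental bounds are kept, and the spread of each conductance across the accepted models is inspected. The channel names, the surrogate feature function and the bounds are illustrative placeholders, not the paper's workflow.

    # Illustrative degeneracy analysis on channel conductances (toy surrogate model).
    import numpy as np

    rng = np.random.default_rng(0)
    channels = ["gNa", "gKdr", "gKA", "gCaL", "gH"]
    bounds = {name: (0.0, 1.0) for name in channels}   # normalized conductance ranges

    def simulate_features(g):
        """Placeholder for a full simulation returning e.g. spike rate and AP height."""
        rate = 40 * g["gNa"] - 25 * g["gKdr"] + 5 * g["gKA"]   # toy surrogate
        ap_height = 80 * g["gNa"] - 10 * g["gH"]
        return np.array([rate, ap_height])

    target = np.array([15.0, 70.0])      # experimental feature means (toy values)
    tolerance = np.array([5.0, 15.0])    # accepted deviation per feature

    accepted = []
    for _ in range(20000):
        g = {name: rng.uniform(*bounds[name]) for name in channels}
        if np.all(np.abs(simulate_features(g) - target) < tolerance):
            accepted.append([g[name] for name in channels])

    accepted = np.array(accepted)
    print("accepted models:", len(accepted))
    # Broad per-channel spread among accepted models indicates degenerate conductances;
    # narrow spread indicates conductances that tightly constrain the firing behaviour.
    print("conductance std across accepted models:", accepted.std(axis=0).round(3))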

    Experimentally-constrained biophysical models of tonic and burst firing modes in thalamocortical neurons

    Somatosensory thalamocortical (TC) neurons from the ventrobasal (VB) thalamus are central components in the flow of sensory information between the periphery and the cerebral cortex, and participate in the dynamic regulation of thalamocortical states, including wakefulness and sleep. This property is reflected at the cellular level by the ability to generate action potentials in two distinct firing modes, called tonic firing and low-threshold bursting. Although the general properties of TC neurons are known, we still lack a detailed characterization of their morphological and electrical properties in the VB thalamus. The aim of this study was to build biophysically-detailed models of VB TC neurons explicitly constrained with experimental data from rats. We recorded the electrical activity of VB neurons (N = 49) and reconstructed morphologies in 3D (N = 50) by applying standardized protocols. After identifying distinct electrical types, we used multi-objective optimization to fit single-neuron electrical models (e-models), which yielded multiple solutions consistent with the experimental data. The models were tested for generalization using electrical stimuli and neuron morphologies not used during fitting. A local sensitivity analysis revealed that the e-models are robust to small parameter changes and that all the parameters were constrained by one or more features. The e-models, when tested in combination with different morphologies, showed that the electrical behavior is substantially preserved when changing the dendritic structure and that the e-models were not overfit to a specific morphology. The models and their analysis show that automatic parameter search can be applied to capture complex firing behavior, such as the co-existence of tonic firing and low-threshold bursting, over a wide range of parameter sets and in combination with different neuron morphologies.
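
    A minimal sketch of a local sensitivity analysis like the one mentioned above: each parameter of a fitted e-model is perturbed by a few percent and the resulting change in electrical features is recorded. The parameter names, the surrogate feature function and the perturbation size are illustrative placeholders, not the study's protocol.

    # Illustrative local sensitivity analysis around a fitted parameter set.
    import numpy as np

    best_params = {"gNaT": 0.05, "gKT": 0.8, "gCaT": 0.002, "gH": 1e-4}

    def extract_features(params):
        """Placeholder: simulate the neuron and return features such as
        tonic firing rate and number of spikes per low-threshold burst."""
        rate = 100 * params["gNaT"] - 5 * params["gKT"]            # toy surrogate
        burst_spikes = 2 + 1000 * params["gCaT"] - 500 * params["gH"]
        return np.array([rate, burst_spikes])

    baseline = extract_features(best_params)
    for name, value in best_params.items():
        for factor in (0.95, 1.05):                # +/- 5% perturbation
            perturbed = dict(best_params, **{name: value * factor})
            delta = (extract_features(perturbed) - baseline) / np.abs(baseline)
            print(f"{name} x{factor:.2f}: relative feature change {delta.round(3)}")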

    Using Evolutionary Algorithms for Fitting High-Dimensional Models to Neuronal Data

    In the study of neuroscience, and of complex biological systems in general, there is frequently a need to fit mathematical models with large numbers of parameters to highly complex datasets. Here we consider algorithms of two different classes, gradient-following (GF) methods and evolutionary algorithms (EA), and examine their performance in fitting a 9-parameter model of a filter-based visual neuron to real data recorded from a sample of 107 neurons in macaque primary visual cortex (V1). Although the GF method converged very rapidly on a solution, it was highly susceptible to the effects of local minima in the error surface and produced relatively poor fits unless the initial estimates of the parameters were already very good. Conversely, although the EA required many more iterations of evaluating the model neuron’s response to a series of stimuli, it ultimately found better solutions in nearly all cases, and its performance was independent of the starting parameters of the model. Thus, although the fitting process was lengthy in terms of processing time, the relative lack of human intervention in the evolutionary algorithm, and its ability ultimately to generate model fits that could be trusted as being close to optimal, made it far superior to the gradient-following methods in this particular application. This is likely to be the case in many other complex systems, as are often found in neuroscience.
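
    A minimal sketch contrasting the two algorithm classes on a toy problem: a simple (mu + lambda) evolutionary strategy and finite-difference gradient descent are run on the same multimodal error surface. The 9-parameter toy model and error function stand in for, and are not, the filter-based V1 model and data used in the study.

    # Illustrative EA vs. gradient-following comparison on a toy 9-parameter problem.
    import numpy as np

    rng = np.random.default_rng(1)
    true_params = rng.uniform(-1, 1, size=9)

    def error(p):
        """Toy multimodal error surface standing in for the model/data mismatch."""
        return np.sum((p - true_params) ** 2) + 0.3 * np.sum(1 - np.cos(6 * (p - true_params)))

    def evolve(n_gen=300, mu=20, lam=80, sigma=0.2):
        pop = rng.uniform(-2, 2, size=(mu, 9))
        for _ in range(n_gen):
            parents = pop[rng.integers(0, mu, size=lam)]
            offspring = parents + sigma * rng.normal(size=parents.shape)   # mutation
            everyone = np.vstack([pop, offspring])
            pop = everyone[np.argsort([error(p) for p in everyone])[:mu]]  # (mu+lambda) selection
        return pop[0]

    def gradient_descent(n_steps=3000, lr=0.01, eps=1e-4):
        p = rng.uniform(-2, 2, size=9)
        for _ in range(n_steps):
            grad = np.array([(error(p + eps * e) - error(p - eps * e)) / (2 * eps)
                             for e in np.eye(9)])          # finite-difference gradient
            p -= lr * grad
        return p

    print("EA error:", round(error(evolve()), 4))
    print("GF error:", round(error(gradient_descent()), 4))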

    Spatially distributed dendritic resonance selectively filters synaptic input

    An important task performed by a neuron is the selection of relevant inputs from among the thousands of synapses impinging on its dendritic tree. Synaptic plasticity enables this by strengthening a subset of synapses that are, presumably, functionally relevant to the neuron. A different selection mechanism exploits the resonance of the dendritic membranes to preferentially filter synaptic inputs based on their temporal rates. A widely held view is that a neuron has one resonant frequency and can thus pass through one rate. Here we demonstrate, through mathematical analyses and numerical simulations, that dendritic resonance is inevitably a spatially distributed property: the resonance frequency varies along the dendrites, endowing neurons with a powerful spatiotemporal selection mechanism that is sensitive to both the dendritic location and the temporal structure of incoming synaptic inputs.
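
    A minimal sketch of membrane resonance and its location dependence, assuming a standard quasi-active membrane approximation: the impedance of a parallel RC circuit with an additional inductive-like branch (as produced by slow restorative currents such as Ih) peaks at a frequency that shifts when the local membrane properties change. The component values are illustrative, not taken from the paper.

    # Illustrative quasi-active membrane impedance with a location-dependent resonance.
    import numpy as np

    def impedance(freq_hz, R=100e6, C=100e-12, R_L=300e6, L=50e6):
        """|Z(f)| of R in parallel with C and with the branch (R_L + j*2*pi*f*L)."""
        w = 2 * np.pi * freq_hz
        y = 1 / R + 1j * w * C + 1 / (R_L + 1j * w * L)   # total admittance
        return np.abs(1 / y)

    freqs = np.linspace(0.5, 40, 400)
    for label, C in [("proximal (C = 100 pF)", 100e-12), ("distal (C = 30 pF)", 30e-12)]:
        z = impedance(freqs, C=C)
        print(label, "-> resonance near", round(freqs[np.argmax(z)], 1), "Hz")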