
    The role of type 4 phosphodiesterases in generating microdomains of cAMP: Large scale stochastic simulations

    Cyclic AMP (cAMP) and its main effector Protein Kinase A (PKA) are critical for several aspects of neuronal function, including synaptic plasticity. Specificity of synaptic plasticity requires that cAMP activates PKA in a highly localized manner despite the speed with which cAMP diffuses. Two mechanisms have been proposed to produce localized elevations in cAMP, known as microdomains: impeded diffusion and high phosphodiesterase (PDE) activity. This paper investigates the mechanism of localized cAMP signaling using a computational model of the biochemical network in the HEK293 cell, which is a subset of the pathways involved in PKA-dependent synaptic plasticity. This biochemical network includes cAMP production, PKA activation, and cAMP degradation by PDE activity. The model is implemented in NeuroRD, a novel, computationally efficient stochastic reaction-diffusion software package, and is constrained by intracellular cAMP dynamics determined experimentally by real-time imaging with an Epac-based FRET sensor (H30). The model reproduces the high-concentration cAMP microdomain in the submembrane region, distinct from the lower concentration of cAMP in the cytosol. Simulations further demonstrate that generation of the cAMP microdomain requires a pool of PDE4D anchored in the cytosol and also requires PKA-mediated phosphorylation of PDE4D, which increases its activity. The microdomain does not require impeded diffusion of cAMP, confirming that barriers are not required for microdomains. The simulations reported here further demonstrate the utility of the new stochastic reaction-diffusion algorithm for exploring signaling pathways in spatially complex structures such as neurons.
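    As an aside for readers unfamiliar with stochastic kinetic modelling, the sketch below shows a minimal, well-mixed Gillespie-style simulation of cAMP production and PDE-mediated degradation in Python. It is not the authors' NeuroRD model (which is spatial and far more detailed), and every rate constant and molecule count in it is a hypothetical placeholder.

        import random

        # Minimal well-mixed Gillespie simulation of cAMP turnover.
        # Two reactions: (1) production of cAMP at a constant rate,
        # (2) degradation of cAMP by PDE. Parameters are illustrative only.
        k_prod = 50.0   # cAMP production rate, molecules per second (hypothetical)
        k_deg = 1e-3    # degradation rate per PDE per cAMP molecule, 1/s (hypothetical)
        pde = 100       # active PDE molecules (raised by PKA phosphorylation of PDE4D)
        camp = 0        # current cAMP copy number
        t, t_end = 0.0, 60.0

        while t < t_end:
            a_prod = k_prod               # propensity of the production reaction
            a_deg = k_deg * pde * camp    # propensity of PDE-mediated degradation
            a_total = a_prod + a_deg
            t += random.expovariate(a_total)          # waiting time to the next event
            if random.random() * a_total < a_prod:
                camp += 1                 # production event
            else:
                camp -= 1                 # degradation event

        print(f"cAMP molecules after {t_end:.0f} s: {camp}")

    In this toy model the steady-state copy number is k_prod / (k_deg * pde), which illustrates the abstract's point that raising PDE activity (for example through PKA-mediated phosphorylation of PDE4D) lowers the local cAMP level.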

    Workshop Sensing a Changing World: proceedings of the workshop, November 19-21, 2008

    Optical oxygen sensing with artificial intelligence

    Luminescence-based sensors for measuring oxygen concentration are widely used in both industry and research due to the practical advantages and sensitivity of this type of sensing. The measuring principle is luminescence quenching by oxygen molecules, which results in a change of the luminescence decay time and intensity. In the classical approach, this change is related to an oxygen concentration using the Stern-Volmer equation. This equation, which in most cases is non-linear, is parameterized through device-specific constants. Therefore, to determine these parameters, every sensor needs to be precisely calibrated at one or more known concentrations. This study explored an entirely new artificial intelligence approach and demonstrated the feasibility of oxygen sensing through machine learning. The specifically developed neural network learns very efficiently to relate the input quantities to the oxygen concentration. The results show a mean deviation between predicted and measured concentration of 0.5% air, comparable to many commercial and low-cost sensors. Since the network was trained using synthetically generated data, the accuracy of the model predictions is limited by the ability of the generated data to describe the measured data, opening up future possibilities for significant improvement by using a large number of experimental measurements for training. The approach described in this work demonstrates the applicability of artificial intelligence to sensing technology and paves the way for the next generation of sensors.
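    For reference, the single-site Stern-Volmer relation referred to above can be written as tau0/tau = I0/I = 1 + K_SV * [O2], where tau0 and I0 are the decay time and intensity in the absence of oxygen and K_SV is the device-specific Stern-Volmer constant. The sketch below shows the conventional calibration-based inversion that the neural-network approach is meant to replace; the constants used are hypothetical, not values from this study.

        def oxygen_from_decay_time(tau_us, tau0_us=60.0, k_sv=0.08):
            """Invert the single-site Stern-Volmer equation.

            tau0/tau = 1 + K_SV * [O2]  =>  [O2] = (tau0/tau - 1) / K_SV

            tau_us  : measured luminescence decay time (microseconds)
            tau0_us : decay time in the absence of oxygen (hypothetical calibration value)
            k_sv    : Stern-Volmer constant in 1/(% air) (hypothetical calibration value)
            Returns the oxygen concentration in % air.
            """
            return (tau0_us / tau_us - 1.0) / k_sv

        # Example: a decay time of 30 us with these constants gives
        # (60/30 - 1) / 0.08 = 12.5 % air.
        print(oxygen_from_decay_time(30.0))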

    The future of Earth observation in hydrology

    In just the past 5 years, the field of Earth observation has progressed beyond the offerings of conventional space-agency-based platforms to include a plethora of sensing opportunities afforded by CubeSats, unmanned aerial vehicles (UAVs), and smartphone technologies that are being embraced by both for-profit companies and individual researchers. Over the previous decades, space agency efforts have brought forth well-known and immensely useful satellites such as the Landsat series and the Gravity Recovery and Climate Experiment (GRACE) system, with costs typically of the order of 1 billion dollars per satellite and with concept-to-launch timelines of the order of 2 decades (for new missions). More recently, the proliferation of smartphones has helped to miniaturize sensors and energy requirements, facilitating advances in the use of CubeSats that can be launched by the dozens, while providing ultra-high (3-5 m) resolution sensing of the Earth on a daily basis. Start-up companies that did not exist a decade ago now operate more satellites in orbit than any space agency, and at costs that are a mere fraction of traditional satellite missions. With these advances come new space-borne measurements, such as real-time high-definition video for tracking air pollution, storm-cell development, flood propagation, and precipitation, or even for constructing digital surfaces using structure-from-motion techniques. Closer to the surface, measurements from small unmanned drones and tethered balloons have mapped snow depths and floods and estimated evaporation at sub-metre resolutions, pushing back on spatio-temporal constraints and delivering new process insights. At ground level, precipitation has been measured using signal attenuation between antennae mounted on cell phone towers, while the proliferation of mobile devices has enabled citizen scientists to catalogue photos of environmental conditions, estimate daily average temperatures from battery state, and sense other hydrologically important variables such as channel depths using commercially available wireless devices. Global internet access is being pursued via high-altitude balloons, solar planes, and hundreds of planned satellite launches, providing a means to exploit the "internet of things" as an entirely new measurement domain. Such global access will enable real-time collection of data from billions of smartphones or from remote research platforms. This future will produce petabytes of data that can only be accessed via cloud storage and will require new analytical approaches to interpret. The extent to which today's hydrologic models can usefully ingest such massive data volumes is unclear. Nor is it clear whether this deluge of data will be usefully exploited, whether because the measurements are superfluous, inconsistent, or not accurate enough, or simply because we lack the capacity to process and analyse them. What is apparent is that the tools and techniques afforded by this array of novel and game-changing sensing platforms present our community with a unique opportunity to develop new insights that advance fundamental aspects of the hydrological sciences. To accomplish this will require more than just an application of the technology: in some cases, it will demand a radical rethink of how we utilize and exploit these new observing systems.
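    As a concrete illustration of the opportunistic sensing mentioned above, rain rate can be estimated from the rain-induced attenuation of a commercial microwave (cell-tower) link through the standard power law k = a * R**b, where k is the specific attenuation in dB/km and R the rain rate in mm/h. The sketch below assumes that relation; the coefficients a and b depend on link frequency and polarization, and the values used here are purely illustrative placeholders.

        def rain_rate_from_attenuation(attenuation_db, path_km, a=0.12, b=1.05):
            """Estimate the path-averaged rain rate from rain-induced attenuation.

            Uses the power law k = a * R**b relating specific attenuation
            k (dB/km) to rain rate R (mm/h). The coefficients a and b are
            frequency- and polarization-dependent; the defaults here are
            illustrative placeholders, not calibrated values.
            """
            k = attenuation_db / path_km       # specific attenuation, dB/km
            return (k / a) ** (1.0 / b)        # rain rate, mm/h

        # Example: 6 dB of rain-induced loss over a 5 km link
        print(rain_rate_from_attenuation(6.0, 5.0))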

    Analysis of Multivariate Sensor Data for Monitoring of Cultivations
