    Towards a Mathematical Theory of Cortical Micro-circuits

    The theoretical setting of hierarchical Bayesian inference is gaining acceptance as a framework for understanding cortical computation. In this paper, we describe how Bayesian belief propagation in a spatio-temporal hierarchical model, called Hierarchical Temporal Memory (HTM), can lead to a mathematical model for cortical circuits. An HTM node is abstracted using a coincidence detector and a mixture of Markov chains. Bayesian belief propagation equations for such an HTM node define a set of functional constraints for a neuronal implementation. Anatomical data provide a contrasting set of organizational constraints. The combination of these two constraints suggests a theoretically derived interpretation for many anatomical and physiological features and predicts several others. We describe the pattern recognition capabilities of HTM networks and demonstrate the application of the derived circuits for modeling the subjective contour effect. We also discuss how the theory and the circuit can be extended to explain cortical features that are not explained by the current model, and describe testable predictions that can be derived from the model.
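The feed-forward (bottom-up) step described above can be sketched in a much-simplified form: likelihoods over coincidences are pooled into a belief over the node's Markov chains (temporal groups). This is an illustrative sketch only; the function name, the hard group membership, and the flat pooling are assumptions, not the paper's exact belief propagation equations.

```python
def group_belief(likelihoods, groups):
    """Pool coincidence likelihoods into a normalized belief over
    temporal groups (each group standing in for one Markov chain)."""
    beliefs = [sum(likelihoods[i] for i in members) for members in groups]
    total = sum(beliefs)
    return [b / total for b in beliefs]

# P(evidence | coincidence c_i) for four coincidence patterns
likelihoods = [0.1, 0.6, 0.2, 0.1]
# two temporal groups, each owning two coincidences
groups = [[0, 1], [2, 3]]

belief = group_belief(likelihoods, groups)  # normalized over groups
```

In a full HTM node the pooling would additionally weight each coincidence by the Markov-chain transition probabilities from the previous time step; the flat sum here is the degenerate case of uniform temporal priors.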

    Mixed hidden Markov quantile regression models for longitudinal data with possibly incomplete sequences

    Quantile regression provides a detailed and robust picture of the distribution of a response variable, conditional on a set of observed covariates. Recently, it has been extended to the analysis of longitudinal continuous outcomes using either time-constant or time-varying random parameters. However, in real-life data, we frequently observe both temporal shocks in the overall trend and individual-specific heterogeneity in model parameters. A benchmark dataset on HIV progression gives a clear example. Here, the evolution of the CD4 log counts exhibits both sudden temporal changes in the overall trend and heterogeneity in the effect of the time since seroconversion on the response dynamics. To accommodate such situations, we propose a quantile regression model where time-varying and time-constant random coefficients are jointly considered. Since observed data may be incomplete due to early drop-out, we also extend the proposed model in a pattern mixture perspective. We assess the performance of the proposals via a large-scale simulation study and the analysis of the CD4 count data.
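The quantile regression machinery underlying this abstract rests on the check (pinball) loss: minimizing its expectation yields the conditional tau-quantile rather than the mean. A minimal sketch (names and toy data are illustrative, not from the paper):

```python
def pinball_loss(y, q, tau):
    """Check (pinball) loss: asymmetric absolute error whose minimizer
    is the tau-th quantile of y."""
    u = y - q
    return tau * u if u >= 0 else (tau - 1) * u

data = [1, 2, 3, 4, 100]  # toy sample with one large outlier

def avg_loss(q, tau):
    return sum(pinball_loss(y, q, tau) for y in data) / len(data)

# For tau = 0.5 the loss reduces to 0.5*|y - q|, so the minimizer over
# the sample points is the median -- unaffected by the outlier 100.
best = min(data, key=lambda q: avg_loss(q, 0.5))
```

This robustness to outliers is what the abstract means by a "robust picture of the distribution": the extreme CD4 trajectories pull the mean but not the median fit.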

    Estimating position & velocity in 3D space from monocular video sequences using a deep neural network

    This work describes a regression model based on Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) networks for tracking objects from monocular video sequences. The target application being pursued is Vision-Based Sensor Substitution (VBSS). In particular, the tool-tip position and velocity in 3D space of a pair of surgical robotic instruments (SRI) are estimated for three surgical tasks, namely suturing, needle-passing and knot-tying. The CNN extracts features from individual video frames and the LSTM network processes these features over time and continuously outputs a 12-dimensional vector with the estimated position and velocity values. A series of analyses and experiments are carried out in the regression model to reveal the benefits and drawbacks of different design choices. First, the impact of the loss function is investigated by adequately weighting the Root Mean Squared Error (RMSE) and Gradient Difference Loss (GDL), using the VGG16 neural network for feature extraction. Second, this analysis is extended to a Residual Neural Network designed for feature extraction, which has fewer parameters than the VGG16 model, resulting in a reduction of ~96.44% in the neural network size. Third, the impact of the number of time steps used to model the temporal information processed by the LSTM network is investigated. Finally, the capability of the regression model to generalize to data related to "unseen" surgical tasks (unavailable in the training set) is evaluated. The aforesaid analyses are experimentally validated on the public dataset JIGSAWS. These analyses provide some guidelines for the design of a regression model in the context of VBSS, specifically when the objective is to estimate a set of 1D time series signals from video sequences.
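The weighted RMSE/GDL objective mentioned in the first analysis can be sketched on 1D signals: RMSE penalizes per-sample value errors, while GDL penalizes errors in the first temporal differences, encouraging correct velocity-like dynamics. The weighting parameter `lam` and all names here are assumptions for illustration, not the paper's exact formulation.

```python
import math

def rmse(pred, target):
    """Root Mean Squared Error over a 1D signal."""
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred))

def gdl(pred, target):
    """Gradient Difference Loss: error between first temporal
    differences of prediction and target."""
    dp = [b - a for a, b in zip(pred, pred[1:])]
    dt = [b - a for a, b in zip(target, target[1:])]
    return sum(abs(x - y) for x, y in zip(dp, dt)) / len(dp)

def combined_loss(pred, target, lam=0.5):
    """Convex combination of value error (RMSE) and dynamics error (GDL)."""
    return (1 - lam) * rmse(pred, target) + lam * gdl(pred, target)
```

A prediction with the right values but wrong slope is punished by the GDL term even where its pointwise error is small, which is why such a term helps when the 12-dimensional output must track velocities, not just positions.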

    Spatiotemporal modeling of hydrological return levels: A quantile regression approach

    Extreme river flows can lead to inundation of floodplains, with consequent impacts for society, the environment and the economy. Extreme flows are inherently difficult to model, being infrequent, irregularly spaced and affected by non-stationary climatic controls. To identify patterns in extreme flows, a quantile regression approach can be used. This paper introduces a new framework for spatio-temporal quantile regression modelling, where the regression model is built as an additive model that includes smooth functions of time and space, as well as space-time interaction effects. The model exploits the flexibility that P-splines offer and can be easily extended to incorporate potential covariates. We propose to estimate model parameters using a penalized least squares regression approach as an alternative to linear programming methods, classically used in quantile parameter estimation. The model is illustrated on a data set of flows in rivers across Scotland.
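The penalized least squares idea behind P-splines can be sketched directly: the objective adds a roughness penalty on squared second differences of the fitted values to the usual sum of squared residuals, with a smoothing weight `lam`. Names and the toy data are illustrative assumptions, not the paper's estimator.

```python
def roughness(f):
    """Squared second-difference penalty, as used with P-splines.
    Zero for any linear trend, positive for wiggly fits."""
    return sum((f[i - 1] - 2 * f[i] + f[i + 1]) ** 2 for i in range(1, len(f) - 1))

def penalized_ls(f, y, lam):
    """Penalized least squares objective: fit term + lam * roughness."""
    fit = sum((a - b) ** 2 for a, b in zip(f, y))
    return fit + lam * roughness(f)

# A straight line pays no roughness penalty; a kinked fit does.
line = [1.0, 2.0, 3.0, 4.0]
kink = [1.0, 3.0, 1.0, 4.0]
```

Large `lam` shrinks the fit toward a polynomial trend, small `lam` toward interpolation; the space-time interaction terms in the paper carry analogous penalties in each dimension.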

    KALMAN FILTER BASED TECHNIQUES FOR ASSIMILATION OF RADAR DATA

    The assimilation of radar data in storm-scale numerical weather prediction models is essential for improved forecasts of thunderstorm events. The huge computational cost of assimilating the high temporal and spatial resolution radar observations poses a challenge to data assimilation techniques. The objective of this study is to examine Kalman filter based techniques for assimilating high density radar observations. The first set of experiments evaluates the impact of assimilating high temporal frequency radar observations over a shorter assimilation period using the Ensemble Square Root Filter (EnSRF) data assimilation technique. The impact of model error and the value of using a range of intercept and density parameters for hydrometeor categories across the ensemble members within the same microphysics scheme are examined in the second set of experiments using the EnSRF technique. While the EnSRF technique shows promise in radar data assimilation, one limitation of EnSRF is the high computational expense when the number of observations is very large. Thus, in an effort to explore efficient data assimilation methods, the feasibility of the information filter as a data assimilation technique for assimilating large numbers of observations is examined. The extended information filter (EIF) is implemented using the Lorenz 96 model, and the performance of EIF in assimilating high density observations is compared with the benchmark extended Kalman filter (EKF) data assimilation technique.
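All the filters named above (EnSRF, EIF, EKF) share the same analysis step at their core: blend a forecast with an observation, weighted by their uncertainties. A scalar Kalman filter update makes that step concrete; this is a textbook sketch, not the study's implementation, and the variable names are assumptions.

```python
def kf_update(x, P, z, H=1.0, R=1.0):
    """Scalar Kalman filter analysis step.
    x, P: forecast mean and variance; z: observation;
    H: observation operator; R: observation error variance."""
    K = P * H / (H * P * H + R)      # Kalman gain
    x_new = x + K * (z - H * x)      # pull forecast toward observation
    P_new = (1 - K * H) * P          # analysis variance shrinks
    return x_new, P_new

# Forecast of 0.0 and observation of 2.0 with equal variances:
# the analysis lands halfway between them.
x_a, P_a = kf_update(0.0, 1.0, 2.0)
```

The computational bottleneck the abstract describes comes from the matrix analogue of `K`: its cost grows with the number of observations, which is what motivates trying the information (inverse-covariance) form of the filter.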

    Query processing in temporal object-oriented databases

    This PhD thesis is concerned with historical data management in the context of object-oriented databases. An extensible approach has been explored to processing temporal object queries within a uniform query framework. By the uniform framework, we mean that temporal queries can be processed within the existing object-oriented framework that is extended from the relational framework, by extending the existing query processing techniques and strategies developed for OODBs and RDBs. The unified model of OODBs and RDBs in UniSQL/X has been adopted as a basis for this purpose. A temporal object data model is thereby defined by incorporating a time dimension into this unified model of OODBs and RDBs to form temporal relational-like cubes, but with the addition of aggregation and inheritance hierarchies. A query algebra, which accesses objects through these associations of aggregation, inheritance and time-reference, is then defined as a general query model/language. Due to the extensive features of our data model and the reducibility of the algebra, a layered structure of query processor is presented that provides a uniform framework for processing temporal object queries. Within the uniform framework, query transformation is carried out based on a set of identified transformation rules that includes the known relational and object rules plus those pertaining to the time dimension. To evaluate a temporal query involving a path with time-reference, a strategy of decomposition is proposed. That is, evaluation of an enhanced path, which is defined to extend a path with time-reference, is decomposed by initially dividing the path into two sub-paths: one containing the time-stamped class, which can be optimized by making use of the ordering information of temporal data, and another an ordinary sub-path (without time-stamped classes), which can be further decomposed and evaluated using different algorithms. The intermediate results of traversing the two sub-paths are then joined together to create the query output. Algorithms for processing the decomposed query components, i.e., time-related operation algorithms, four join algorithms (nested-loop forward join, sort-merge forward join, nested-loop reverse join and sort-merge reverse join) and their modifications, have been presented with cost analysis and implemented with stream processing techniques using C++. Simulation results are also provided. Both cost analysis and simulation show the effects of time on the query processing algorithms: the join time cost increases linearly with the expansion in the number of time-epochs (the time dimension in the case of a regular TS). It is also shown that using heuristics that make use of time information can lead to a significant time cost saving. Query processing with incomplete temporal data has also been discussed.
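The sort-merge join family mentioned above exploits the ordering of time-stamped data: when both intermediate results are sorted on the join attribute, a single synchronized scan suffices. The thesis's algorithms are in C++; this is a minimal Python sketch of the sort-merge step on key-sorted `(key, value)` pairs, with names and the duplicate-free assumption chosen for illustration.

```python
def sort_merge_join(left, right):
    """Join two lists of (key, value) pairs, each sorted by key and
    assumed duplicate-free, in a single forward pass over both."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        lk, rk = left[i][0], right[j][0]
        if lk < rk:
            i += 1                   # advance the stream that is behind
        elif lk > rk:
            j += 1
        else:
            out.append((lk, left[i][1], right[j][1]))
            i += 1
            j += 1
    return out

joined = sort_merge_join([(1, "a"), (2, "b"), (4, "c")],
                         [(2, "x"), (3, "y"), (4, "z")])
```

The linear cost of this pass, repeated once per time-epoch, is consistent with the abstract's observation that join cost grows linearly with the number of time-epochs.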

    On Motion Parameterizations in Image Sequences from Fixed Viewpoints

    This dissertation addresses the problem of parameterizing object motion within a set of images taken with a stationary camera. We develop data-driven methods across all image scales: characterizing motion observed at the scale of individual pixels, along extended structures such as roads, and whole image deformations such as lungs deforming over time. The primary contributions include: a) fundamental studies of the relationship between spatio-temporal image derivatives accumulated at a pixel and the object motions at that pixel; b) data-driven approaches to parameterize breath motion and reconstruct lung CT data volumes; and c) defining and offering initial results for a new class of Partially Unsupervised Manifold Learning (PUML) problems, which often arise in medical imagery. Specifically, we create energy functions for measuring how consistent a given velocity vector is with observed spatio-temporal image derivatives. These energy functions are used to fit parametric snake models to roads using velocity constraints. We create an automatic data-driven technique for finding the breath phase of lung CT scans which is able to replace external belt measurements currently in use clinically. This approach is extended to automatically create a full deformation model of a CT lung volume during breathing, or of heart MRI during breathing and heartbeat. Additionally, motivated by real use cases, we address a scenario in which a dataset is collected along with meta-data which describes some, but not all, aspects of the dataset. We create an embedding which displays the remaining variability in a dataset after accounting for variability related to the meta-data.
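The energy functions described above follow the classical optical-flow constraint: for a velocity (u, v) consistent with the image data, the residual Ix*u + Iy*v + It vanishes at each pixel. A minimal sketch of such an energy over accumulated derivatives (the dissertation's actual energies may differ in form; names here are illustrative):

```python
def flow_energy(Ix, Iy, It, u, v):
    """Sum of squared optical-flow constraint residuals over a set of
    spatio-temporal derivative samples (Ix, Iy, It): low energy means
    the candidate velocity (u, v) is consistent with the observations."""
    return sum((ix * u + iy * v + it) ** 2
               for ix, iy, it in zip(Ix, Iy, It))

# Derivatives generated by a pure horizontal motion u=1, v=0
# (so It = -Ix for every sample):
Ix, Iy, It = [1.0, 2.0], [0.0, 0.0], [-1.0, -2.0]
```

Minimizing this energy over (u, v), or using it as a data term when fitting a snake along a road, is exactly the "velocity constraint" role the abstract describes.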

    Dynamical Approach for Real-Time Monitoring of Agricultural Crops

    In this paper, a novel approach for exploiting multitemporal remote sensing data, focused on real-time monitoring of agricultural crops, is presented. The methodology is defined in a dynamical system context using state-space techniques, which enables the possibility of merging past temporal information with an update for each new acquisition. The dynamical system context allows us to exploit classical tools in this domain to perform the estimation of relevant variables. A general methodology is proposed, and a particular instance is defined in this study based on polarimetric radar data to track the phenological stages of a set of crops. A model generation from empirical data through principal component analysis is presented, and an extended Kalman filter is adapted to perform phenological stage estimation. Results employing quad-pol Radarsat-2 data over three different cereals are analyzed. The potential of this methodology to retrieve vegetation variables in real time is shown. This work was supported in part by the Spanish Ministry of Economy and Competitiveness (MINECO) and EU FEDER under Project TEC2011-28201-C02-02, and in part by the Generalitat Valenciana under Project ACOMP/2014/136.
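The extended Kalman filter adaptation above handles the nonlinear link between the hidden state (phenological stage) and the radar observables by linearizing the observation model around the forecast. A scalar EKF update sketch, with a purely illustrative quadratic observation model standing in for the paper's PCA-derived model (all names and numbers are assumptions):

```python
def ekf_update(x, P, z, h, h_prime, R):
    """Scalar extended Kalman filter update: linearize the nonlinear
    observation model h around the forecast x via its derivative h'."""
    H = h_prime(x)                   # local linearization of h
    K = P * H / (H * P * H + R)      # Kalman gain
    x_new = x + K * (z - h(x))       # innovation uses the nonlinear h
    P_new = (1 - K * H) * P
    return x_new, P_new

# Hypothetical observation model: radar observable = stage squared.
h = lambda s: s * s
h_prime = lambda s: 2 * s

x_a, P_a = ekf_update(1.0, 1.0, 1.5, h, h_prime, 1.0)
```

Run once per new acquisition, this predict/update cycle is what lets the method merge the accumulated temporal information with each incoming image in real time.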