969 research outputs found

    Sampled data systems and generating functions

    Get PDF
    Application of Z-transforms to sampled-data systems

    When Decision Meets Estimation: Theory and Applications

    Get PDF
    In many practical problems, both decision and estimation are involved. This dissertation studies the relationship between decision and estimation in such problems so that more accurate inference methods can be developed. Hybrid estimation is an important formulation that deals with state estimation and model structure identification simultaneously, and multiple-model (MM) methods are the most widely used tool for hybrid estimation. A novel approach to predicting Internet end-to-end delay using MM methods is proposed. Based on preliminary analysis of the collected end-to-end delay data, we propose an off-line model set design procedure using vector quantization (VQ) and short-term time series analysis so that MM methods can be applied to predict on-line measurement data. Experimental results show that the proposed MM predictor outperforms two widely used adaptive filters in prediction accuracy and robustness. Although hybrid estimation can identify model structure, it focuses mainly on the estimation part. When decision and estimation are of (nearly) equal importance, a joint solution is preferred. Noting the resemblance between the two problems, a new Bayes risk is generalized from those of decision and estimation, and based on this generalized Bayes risk, a novel, integrated solution to decision and estimation is introduced. Our study aims to give a more systematic view of the joint decision and estimation (JDE) problem, from which we believe work in various fields, such as target tracking, communications, and time series modeling, will benefit greatly. We apply this integrated Bayes solution to joint target tracking and classification, a very important topic in target inference, with simplified measurement models, and compare the results of this new approach with two conventional strategies. Finally, a surveillance testbed is being built for purposes such as algorithm development and performance evaluation, with the aim of bridging the gap between theory and practice. The dissertation gives an overview and the architecture of the testbed and presents one case study. The testbed can serve tasks with decision and/or estimation aspects and is helpful for the development of JDE algorithms.
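
    A minimal sketch of the multiple-model prediction idea described above: a small bank of candidate models is run in parallel, their predictions are fused with recursively updated model probabilities, and the probabilities are updated from prediction likelihoods. The AR(1) candidates, Gaussian likelihoods, probability floor, and synthetic delay series below are assumptions for illustration only; the dissertation's actual model set is designed off-line via vector quantization and short-term time series analysis.

```python
# Illustrative multiple-model (MM) one-step predictor (details assumed,
# not taken from the dissertation).
import numpy as np

class MMPredictor:
    def __init__(self, ar_coeffs, noise_vars):
        self.a = np.asarray(ar_coeffs, dtype=float)     # one AR(1) coefficient per model
        self.var = np.asarray(noise_vars, dtype=float)  # assumed prediction-error variances
        self.w = np.full(self.a.size, 1.0 / self.a.size)  # model probabilities

    def predict(self, y_prev):
        # Per-model one-step predictions and their probability-weighted fusion.
        y_hat_i = self.a * y_prev
        return float(self.w @ y_hat_i), y_hat_i

    def update(self, y_new, y_hat_i):
        # Bayesian weight update from Gaussian prediction likelihoods,
        # with a small floor so no model is permanently switched off.
        resid = y_new - y_hat_i
        lik = np.exp(-0.5 * resid**2 / self.var) / np.sqrt(2.0 * np.pi * self.var)
        self.w = np.maximum(self.w * lik, 1e-12)
        self.w /= self.w.sum()

# Usage on a synthetic "end-to-end delay" series (random walk starting near 50 ms).
rng = np.random.default_rng(0)
delays = 50.0 + np.cumsum(rng.normal(0.0, 1.0, 200))
mm = MMPredictor(ar_coeffs=[0.95, 0.99, 1.0], noise_vars=[0.5, 1.0, 2.0])
for t in range(1, delays.size):
    y_hat, y_hat_i = mm.predict(delays[t - 1])
    mm.update(delays[t], y_hat_i)
print("final model probabilities:", np.round(mm.w, 3))
```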
    HERMES: Towards an Integrated Toolbox to Characterize Functional and Effective Brain Connectivity

    Get PDF
    The analysis of interdependence between time series has become an important field of research in recent years, mainly as a result of advances in the characterization of dynamical systems from the signals they produce, the introduction of concepts such as generalized and phase synchronization, and the application of information theory to time series analysis. In neurophysiology, analytical tools stemming from these concepts have been added to the ‘traditional’ set of linear methods, which includes cross-correlation and the coherency function in the time and frequency domains, respectively, as well as more elaborate tools such as Granger causality. This growing number of approaches for assessing functional connectivity (FC) or effective connectivity (EC) between two (or among many) neural networks, along with the mathematical complexity of the corresponding time series analysis tools, makes it desirable to gather them into a unified, easy-to-use software package. The goal is to allow neuroscientists, neurophysiologists, and researchers from related fields to easily access and make use of these analysis methods from a single integrated toolbox. Here we present HERMES (http://hermes.ctb.upm.es), a toolbox for the Matlab® environment (The MathWorks, Inc.), designed to study functional and effective brain connectivity from neurophysiological data such as multivariate EEG and/or MEG records. It also includes visualization tools and statistical methods to address the problem of multiple comparisons. We believe this toolbox will be very helpful to researchers working in the emerging field of brain connectivity analysis.
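
    For orientation, here is a generic NumPy/SciPy sketch of two of the ‘traditional’ linear measures named above: cross-correlation in the time domain and magnitude-squared coherence in the frequency domain. It is not the HERMES (MATLAB) API; the sampling rate, synthetic signals, and parameters are assumptions for illustration.

```python
# Two classical linear functional-connectivity measures on synthetic data.
import numpy as np
from scipy.signal import coherence

fs = 250.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)

# Two synthetic "channels": a shared 10 Hz rhythm plus independent noise.
shared = np.sin(2 * np.pi * 10 * t)
x = shared + 0.5 * rng.standard_normal(t.size)
y = np.roll(shared, 10) + 0.5 * rng.standard_normal(t.size)   # small lag

# Normalized cross-correlation as a function of lag (time domain).
xc = np.correlate(x - x.mean(), y - y.mean(), mode="full")
xc /= (x.std() * y.std() * x.size)
lags = np.arange(-x.size + 1, x.size)
print("peak cross-correlation:", xc.max(), "at lag", lags[xc.argmax()], "samples")

# Magnitude-squared coherence via Welch's method (frequency domain).
f, Cxy = coherence(x, y, fs=fs, nperseg=512)
print("coherence near 10 Hz:", Cxy[np.argmin(np.abs(f - 10))])
```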

    Development and analysis of the Software Implemented Fault-Tolerance (SIFT) computer

    Get PDF
    SIFT (Software Implemented Fault Tolerance) is an experimental, fault-tolerant computer system designed to meet the extreme reliability requirements for safety-critical functions in advanced aircraft. Errors are masked by performing a majority voting operation over the results of identical computations, and faulty processors are removed from service by reassigning their computations to the nonfaulty processors. This scheme has been implemented in a special architecture using a set of standard Bendix BDX930 processors, augmented by a special asynchronous-broadcast communication interface that provides direct, processor-to-processor communication among all processors. Fault isolation is accomplished in hardware; all other fault-tolerance functions, together with scheduling and synchronization, are implemented exclusively by executive system software. The system reliability is predicted by a Markov model. Mathematical consistency of the system software with respect to the reliability model has been partially verified, using recently developed tools for machine-aided proof of program correctness.
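
    A toy sketch of the fault-masking scheme described above: replicated processors run the same computation, a majority vote masks a wrong result, and processors that disagree with the majority are dropped from the active set. The processor names and injected fault below are illustrative only; the real system implements this in executive software on replicated BDX930 hardware.

```python
# Toy SIFT-style majority voting and reconfiguration (illustrative names only).
from collections import Counter

def vote(results):
    """Return (majority_value, set of processor ids that disagreed)."""
    counts = Counter(results.values())
    majority, _ = counts.most_common(1)[0]
    dissenters = {pid for pid, val in results.items() if val != majority}
    return majority, dissenters

active = {"P1", "P2", "P3"}                 # replicated processors

def run_task(pid, x):
    # Processor P2 is simulated as faulty and returns a corrupted value.
    return x * x + (99 if pid == "P2" else 0)

results = {pid: run_task(pid, 7) for pid in active}
value, faulty = vote(results)
active -= faulty                            # reconfigure: drop faulty processors
print("voted result:", value, "remaining processors:", sorted(active))
```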

    Fronts in randomly advected and heterogeneous media and nonuniversality of Burgers turbulence: Theory and numerics

    Full text link
    A recently established mathematical equivalence--between weakly perturbed Huygens fronts (e.g., flames in weak turbulence or geometrical-optics wave fronts in slightly nonuniform media) and the inviscid limit of white-noise-driven Burgers turbulence--motivates theoretical and numerical estimates of Burgers-turbulence properties for specific types of white-in-time forcing. Existing mathematical relations between Burgers turbulence and the statistical mechanics of directed polymers, which allow use of the replica method, are exploited to obtain systematic upper bounds on the Burgers energy density, corresponding to the ground-state binding energy of the directed polymer and the speedup of the Huygens front. The results are complementary to previous studies of both Burgers turbulence and directed polymers, which have focused on universal scaling properties rather than forcing-dependent parameters. The upper-bound formula can be understood heuristically in terms of a renormalization different in kind from that previously used in combustion models, and it also shows that the burning velocity of an idealized turbulent flame does not diverge with increasing Reynolds number at fixed turbulence intensity, a conclusion that applies even to strong turbulence. Numerical simulations of the one-dimensional inviscid Burgers equation using a Lagrangian finite-element method confirm that the theoretical upper bounds are sharp to within about 15% for various forcing spectra (corresponding to various two-dimensional random media). These computations provide a new quantitative test of the replica method. The inferred nonuniversality (spectrum dependence) of the front speedup is of direct importance for combustion modeling.
    Comment: 20 pages, 2 figures, REVTeX 4. Moved some details to appendices, added figure on numerical method
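
    For orientation, a minimal statement of the setup in generic notation (the symbols and constants below are assumptions, not the paper's own formulation): the white-in-time forced inviscid Burgers equation, and the weak-perturbation relation that ties the Huygens front speedup to the Burgers energy density that the upper bounds target.

```latex
% Generic-notation sketch of the setup, not the paper's exact formulation.
% White-in-time forced Burgers turbulence (inviscid limit):
\partial_t u + u\,\partial_x u = f(x,t),
\qquad
\langle f(x,t)\, f(x',t') \rangle = D(x - x')\,\delta(t - t').
% Weakly perturbed Huygens front h(x,t) advancing at base speed c:
\partial_t h = c\sqrt{1 + (\partial_x h)^2}
             \;\approx\; c + \tfrac{c}{2}\,(\partial_x h)^2 ,
% so, with u \propto \partial_x h under the stated equivalence, the mean
% front speedup is controlled by the Burgers energy density:
\Delta c \;\approx\; \tfrac{c}{2}\,\big\langle (\partial_x h)^2 \big\rangle
         \;\propto\; \langle u^2 \rangle .
```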