    A study of data coding technology developments in the 1980-1985 time frame, volume 2

    The source parameters of digitized analog data are discussed. Different data compression schemes are outlined, and analyses of their implementation are presented. Finally, bandwidth compression techniques for video signals are given.
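    The abstract above only names compression schemes without detailing any of them. As a minimal illustrative sketch (not a scheme from the study itself), delta encoding is one classic technique for digitized analog data: successive samples of a slowly varying signal differ little, so storing differences concentrates values near zero and makes them cheaper to encode.

    ```python
    import numpy as np

    def delta_encode(samples):
        """Delta-encode digitized samples: keep the first value plus successive differences."""
        samples = np.asarray(samples, dtype=np.int64)
        return samples[0], np.diff(samples)

    def delta_decode(first, diffs):
        """Reconstruct the original samples from the first value and the differences."""
        return np.concatenate(([first], first + np.cumsum(diffs)))

    # Small-magnitude differences dominate for a slowly varying digitized signal
    signal = np.array([100, 102, 103, 103, 101, 100])
    first, diffs = delta_encode(signal)
    restored = delta_decode(first, diffs)
    assert np.array_equal(restored, signal)
    ```

    The differences (here 2, 1, 0, -2, -1) span a much smaller range than the raw samples, which is what a subsequent entropy coder exploits.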

    Investigations in adaptive processing of multispectral data

    Adaptive data processing procedures are applied to the problem of classifying objects in a scene scanned by a multispectral sensor. These procedures show a performance improvement over standard nonadaptive techniques. Some sources of classification error are identified, and those correctable by adaptive processing are discussed. Experiments in adapting signature means by decision-directed methods are described. Some of these methods assume correlation between the trajectories of different signature means; for others this assumption is not made.
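    The core idea named here, decision-directed adaptation of signature means, can be sketched in a few lines. This is a hypothetical minimal version (not the paper's exact procedure): the classifier assigns each sample to the nearest class mean, then nudges that mean toward the sample, so class signatures track slow drift without ground-truth labels.

    ```python
    import numpy as np

    def decision_directed_update(x, means, alpha=0.05):
        """Assign x to the nearest class mean, then move that mean toward x.
        The classifier's own decision drives the update, so no true labels
        are needed; this is the essence of decision-directed adaptation."""
        k = int(np.argmin([np.linalg.norm(x - m) for m in means]))
        means[k] = (1 - alpha) * means[k] + alpha * x
        return k

    # Two class signature means; each incoming sample updates the class it is assigned to
    means = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
    k = decision_directed_update(np.array([0.2, 0.1]), means, alpha=0.5)
    ```

    The step size `alpha` trades tracking speed against noise sensitivity; the abstract's correlated-trajectory variants would additionally couple the updates of different class means, which this sketch omits.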

    Estimation and tracking of rapidly time-varying broadband acoustic communication channels

    Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy at the Massachusetts Institute of Technology and the Woods Hole Oceanographic Institution, February 2006. This thesis develops methods for estimating wideband shallow-water acoustic communication channels. The very shallow water wideband channel has three distinct features: large dimension caused by extensive delay spread; a limited number of degrees of freedom (DOF) due to resolvable paths and inter-path correlations; and rapid fluctuations induced by scattering from the moving sea surface. Traditional least-squares (LS) estimation techniques often fail to reconcile the rapid fluctuations with the large dimensionality, while subspace-based approaches with DOF reduction are confronted with an unstable subspace structure subject to significant change over short periods of time. Based on state-space channel modeling, the first part of this thesis develops algorithms that jointly estimate the channel and its dynamics. Algorithms based on the Extended Kalman Filter (EKF) and on the Expectation-Maximization (EM) approach are developed. Analysis shows conceptual parallels between the two, including an identical second-order innovation form shared by the EKF modification and the suboptimal EM, and a shared parameter-identifiability issue arising from channel structure, which appears as parameter unobservability in the EKF and as insufficient excitation in EM. Modifications of both algorithms, including a two-model-based EKF and a subspace EM algorithm that selectively tracks dominant taps and reduces prediction error, are proposed to overcome the identifiability issue. The second part of the thesis develops algorithms that explicitly find a sparse estimate of the delay-Doppler spread function. The study contributes to a better understanding of the channel's physical constraints on algorithm design and potential performance improvement, and may generalize to other applications where dimensionality and variability collide. Financial support for this thesis research was provided by the Office of Naval Research and the WHOI Academic Program Office.
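    The state-space tracking idea underlying the abstract can be illustrated with a deliberately simplified case. The sketch below is an assumption-laden toy, not the thesis's EKF or EM algorithms: a single real channel tap follows a first-order Gauss-Markov model, known BPSK pilot symbols excite it, and a scalar Kalman filter jointly predicts and corrects the tap estimate from the innovation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed first-order Gauss-Markov model for one channel tap:
    #   h[n] = a * h[n-1] + w[n],   y[n] = h[n] * x[n] + v[n]
    a, q, r = 0.99, 1e-3, 1e-2          # transition, process noise, measurement noise
    N = 500
    h = np.zeros(N)
    x = np.sign(rng.standard_normal(N))  # known BPSK pilot symbols
    for n in range(1, N):
        h[n] = a * h[n - 1] + np.sqrt(q) * rng.standard_normal()
    y = h * x + np.sqrt(r) * rng.standard_normal(N)

    # Scalar Kalman filter: predict under the state model, correct with the innovation
    h_hat, P = 0.0, 1.0
    est = np.zeros(N)
    for n in range(N):
        h_hat, P = a * h_hat, a * a * P + q            # predict
        K = P * x[n] / (x[n] * P * x[n] + r)           # Kalman gain
        h_hat = h_hat + K * (y[n] - x[n] * h_hat)      # correct via innovation
        P = (1 - K * x[n]) * P
        est[n] = h_hat

    mse = np.mean((est[100:] - h[100:]) ** 2)          # after initial convergence
    ```

    The real problem is harder in exactly the ways the abstract lists: hundreds of correlated taps instead of one, and unknown dynamics (`a`, `q`) that must themselves be estimated, which is what motivates the joint EKF and EM formulations.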

    Data Transmission in the Presence of Limited Channel State Information Feedback

    Role of homeostasis in learning sparse representations

    Neurons in the input layer of the primary visual cortex in primates develop edge-like receptive fields. One approach to understanding the emergence of this response is to posit that neural activity must efficiently represent sensory data with respect to the statistics of natural scenes. Furthermore, such efficient coding is believed to be achieved through competition across neurons that generates a sparse representation, that is, one in which a relatively small number of neurons are simultaneously active. Indeed, different models of sparse coding, coupled with Hebbian learning and homeostasis, have been proposed that successfully match the observed emergent response. However, the specific role of homeostasis in learning such sparse representations is still largely unknown. By quantitatively assessing the efficiency of the neural representation during learning, we derive a cooperative homeostasis mechanism that optimally tunes the competition between neurons within the sparse coding algorithm. We apply this homeostasis while learning small patches taken from natural images and compare its efficiency with state-of-the-art algorithms. Results show that while different sparse coding algorithms give similar coding results, the homeostasis provides an optimal balance for the representation of natural images within the population of neurons: competition in sparse coding is optimized when it is fair. By contributing to optimizing statistical competition across neurons, homeostasis is crucial in providing a more efficient solution to the emergence of independent components.
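    The interplay the abstract describes, sparse coding plus a homeostatic term that keeps competition fair, can be sketched with a toy example. This is a hypothetical illustration, not the paper's cooperative mechanism: matching pursuit selects a few dictionary atoms per input, and a homeostatic gain penalizes atoms that have been firing too often so that selection frequencies equalize across the population.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy dictionary of unit-norm atoms (a stand-in for learned receptive fields)
    D = rng.standard_normal((16, 8))
    D /= np.linalg.norm(D, axis=0)

    usage = np.ones(D.shape[1]) / D.shape[1]   # running selection frequency per atom
    eta = 0.05                                  # homeostasis learning rate

    def sparse_code(x, D, usage, n_active=3):
        """Matching pursuit with a homeostatic gain: atoms that fire too often
        are down-weighted during selection, keeping competition fair."""
        residual = x.copy()
        coeffs = np.zeros(D.shape[1])
        gain = 1.0 / (usage + 1e-6)             # simple homeostatic modulation (an assumption)
        gain /= gain.max()
        for _ in range(n_active):
            scores = gain * (D.T @ residual)    # gain biases WHO is selected...
            k = int(np.argmax(np.abs(scores)))
            c = D[:, k] @ residual              # ...but the coefficient stays unbiased
            coeffs[k] += c
            residual -= c * D[:, k]
        return coeffs, residual

    x = rng.standard_normal(16)
    coeffs, residual = sparse_code(x, D, usage)
    # Update the running usage statistics toward this sample's active set
    active = (coeffs != 0).astype(float)
    usage = (1 - eta) * usage + eta * active / max(active.sum(), 1)
    ```

    Applying the gain only to selection, not to the stored coefficient, keeps the reconstruction faithful while still steering long-run firing rates toward equality, which is the sense in which competition becomes "fair".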