51 research outputs found

    Video data compression using artificial neural network differential vector quantization

    Get PDF
    An artificial neural network vector quantizer is developed for use in data compression applications such as digital video. Differential vector quantization is used to preserve edge features, and a new adaptive algorithm, known as Frequency-Sensitive Competitive Learning, is used to develop the vector quantizer codebook. To achieve real-time performance, a custom Very Large Scale Integration Application Specific Integrated Circuit (VLSI ASIC) is being developed to realize the associative memory functions needed in the vector quantization algorithm. By using vector quantization, the need for Huffman coding can be eliminated, resulting in better resilience to channel bit errors than methods that use variable-length codes.
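    The pipeline this abstract describes — an FSCL-trained codebook followed by differential quantization of prediction residuals — can be sketched as follows. This is a minimal software illustration, not the VLSI implementation; the `beta` frequency-bias exponent, learning rate, and all parameter values are assumptions for the sketch.

    ```python
    import numpy as np

    def fscl_codebook(vectors, k, epochs=10, lr=0.1, beta=1.0, seed=0):
        """Train a VQ codebook with Frequency-Sensitive Competitive Learning.

        Each codeword's distance is scaled by how often it has won, so
        rarely used codewords stay competitive and the codebook spreads
        over the data. The exact biasing schedule here is illustrative.
        """
        rng = np.random.default_rng(seed)
        codebook = vectors[rng.choice(len(vectors), k, replace=False)].astype(float)
        counts = np.ones(k)
        for _ in range(epochs):
            for x in vectors:
                d = np.sum((codebook - x) ** 2, axis=1) * counts ** beta
                j = int(np.argmin(d))              # frequency-biased winner
                codebook[j] += lr * (x - codebook[j])
                counts[j] += 1
        return codebook

    def dvq_encode(blocks, predictions, codebook):
        """Differential VQ: quantize the prediction residual, not the block.

        Every residual maps to a fixed-length codebook index, which is why
        no variable-length (Huffman) coding is needed downstream.
        """
        residuals = blocks - predictions
        d = np.sum((residuals[:, None, :] - codebook[None, :, :]) ** 2, axis=2)
        return np.argmin(d, axis=1)
    ```

    Because every transmitted symbol is a fixed-length index, a single channel bit error corrupts at most one codeword rather than desynchronizing a variable-length bitstream.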

    Nonlinear multiclass discriminant analysis

    Get PDF
    An alternative nonlinear multiclass discriminant algorithm is presented. This algorithm is based on the use of kernel functions and is designed to optimize a general linear discriminant analysis criterion based on scatter matrices. By reformulating these matrices in a specific form, a straightforward derivation allows the kernel function to be introduced in a simple and direct way. Moreover, we propose a method, based on this derivation, to determine the value of the regularization parameter. This work was supported in part by the HPCMO PET program and by the Spanish Ministry of Science and Technology.
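    A kernelized scatter-matrix discriminant of the kind the abstract describes can be sketched for the two-class case as follows. This is the classical kernel Fisher discriminant, shown as one concrete instance of the approach, not the paper's multiclass formulation; the RBF kernel, `gamma`, and `reg` values are assumptions.

    ```python
    import numpy as np

    def rbf(X, Y, gamma=1.0):
        """RBF kernel matrix between row-vector sets X and Y."""
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def kernel_fisher(X, y, gamma=1.0, reg=1e-3):
        """Two-class kernel Fisher discriminant.

        Builds the between-class mean vectors M_c and the within-class
        scatter N in kernel space; `reg` regularizes N so the linear
        system stays well conditioned (the role of the regularization
        parameter discussed in the abstract).
        """
        K = rbf(X, X, gamma)
        n = len(y)
        M, N = [], np.zeros((n, n))
        for c in (0, 1):
            idx = np.where(y == c)[0]
            Kc = K[:, idx]                         # n x l_c
            l = len(idx)
            M.append(Kc.mean(axis=1))
            N += Kc @ (np.eye(l) - np.ones((l, l)) / l) @ Kc.T
        return np.linalg.solve(N + reg * np.eye(n), M[1] - M[0])

    def project(alpha, X_train, X_new, gamma=1.0):
        """Project new points onto the learned discriminant direction."""
        return rbf(X_new, X_train, gamma) @ alpha
    ```

    The expansion coefficients `alpha` play the role of the discriminant direction: data never appear except through kernel evaluations, which is what lets a linear scatter-matrix criterion become nonlinear in the input space.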

    Final report on accomplishments of a Task Force on Campus Bridging sponsored workshop: Campus Leadership Engagement in Building a Coherent Campus Cyberinfrastructure

    Get PDF
    In 2010, the National Science Foundation (NSF) awarded a grant of $49,840 to the University of North Carolina Chapel Hill to organize a workshop on the topic of campus cyberinfrastructure with the title “Campus Bridging Taskforce Sponsored Workshop: Campus Leadership Engagement in Building a Coherent Campus Cyberinfrastructure.” This report discusses the contents of the full workshop report to the NSF as well as the accomplishments and outcomes reported via the NSF’s online reporting system. This material is based upon work supported by the National Science Foundation (NSF) under Grant No. 1059812 to the University of North Carolina at Chapel Hill, with Patrick Dreher as principal investigator and Craig Stewart, James Pepin, Guy Almes, and Michael Mundrane as co-principal investigators. Stewart’s involvement was supported by the Indiana University Pervasive Technology Institute, which is supported in part by the Lilly Endowment, Inc. (a private charitable trust). Any opinions, findings and conclusions, or recommendations expressed in this material are those of the author(s), and do not necessarily reflect the views of the NSF or the Lilly Endowment.

    The Secure Medical Research Workspace: An IT Infrastructure to Enable Secure Research on Clinical Data (Shoffner et al.)

    Get PDF
    Clinical data has tremendous value for translational research, but only if security and privacy concerns can be addressed satisfactorily. A collaboration of clinical and informatics teams, including RENCI, NC TraCS, UNC’s School of Information and Library Science, Information Technology Service’s Research Computing and other partners at the University of North Carolina at Chapel Hill have developed a system called the Secure Medical Research Workspace (SMRW) that enables researchers to use clinical data securely for research. SMRW significantly minimizes the risk presented when using identified clinical data, thereby protecting patients, researchers, and institutions associated with the data. The SMRW is built on a novel combination of virtualization and data leakage protection and can be combined with other protection methodologies and scaled to production levels.

    Leveraging Open Electronic Health Record Data and Environmental Exposures Data to Derive Insights Into Rare Pulmonary Disease

    Get PDF
    Research on rare diseases has received increasing attention, in part due to the realized profitability of orphan drugs. Biomedical informatics holds promise in accelerating translational research on rare disease, yet challenges remain, including the lack of diagnostic codes for rare diseases and privacy concerns that prevent research access to electronic health records when few patients exist. The Integrated Clinical and Environmental Exposures Service (ICEES) provides regulatory-compliant open access to electronic health record data that have been integrated with environmental exposures data, as well as analytic tools to explore the integrated data. We describe a proof-of-concept application of ICEES to examine demographics, clinical characteristics, environmental exposures, and health outcomes among a cohort of patients enriched for phenotypes associated with cystic fibrosis (CF), idiopathic bronchiectasis (IB), and primary ciliary dyskinesia (PCD). We then focus on a subset of patients with CF, leveraging the availability of a diagnostic code for CF and serving as a benchmark for our development work. We use ICEES to examine select demographics, co-diagnoses, and environmental exposures that may contribute to poor health outcomes among patients with CF, defined as emergency department or inpatient visits for respiratory issues. We replicate current understanding of the pathogenesis and clinical manifestations of CF by identifying co-diagnoses of asthma, chronic nasal congestion, cough, middle ear disease, and pneumonia as factors that differentiate patients with poor health outcomes from those with better health outcomes. We conclude by discussing our preliminary findings in relation to other published work, the strengths and limitations of our approach, and our future directions.

    Clinical Data: Sources and Types, Regulatory Constraints, Applications.

    Get PDF
    Access to clinical data is critical for the advancement of translational research. However, the numerous regulations and policies that surround the use of clinical data, although critical to ensure patient privacy and protect against misuse, often present challenges to data access and sharing. In this article, we provide an overview of clinical data types and associated regulatory constraints and inferential limitations. We highlight several novel approaches that our team has developed for openly exposing clinical data.

    Temporal Generalization of Simple Recurrent Networks

    No full text
    Simple recurrent networks have been widely used in temporal processing applications. In this study we investigate temporal generalization of simple recurrent networks, drawing comparisons between network capabilities and human characteristics. Elman networks were trained to regenerate temporal trajectories sampled at different rates, and then tested with trajectories at both the trained sampling rates and at alternative sampling rates. The networks were also tested with trajectories representing mixtures of different sampling rates. It was found that for simple trajectories, the networks show interval invariance, but not rate invariance. However, for complex trajectories which contain greater contextual information, these networks do not seem to show any temporal generalization. Further, similar results were also obtained employing measured speech data. Thus, these results suggest that this class of networks fails to properly generalize in time. Keywords: Neural Networks, Temporal Generalization
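    The experimental setup above — an Elman network fed trajectories sampled at different rates — can be sketched as follows. The weights here are random and no training is shown; network size, the `resample` helper, and all parameter values are assumptions for illustration.

    ```python
    import numpy as np

    def elman_step(x, h, Wxh, Whh, b, Why):
        """One Elman update: the context units are simply a copy of the
        previous hidden state, fed back through recurrent weights Whh."""
        h = np.tanh(x @ Wxh + h @ Whh + b)
        return h @ Why, h

    def run_trajectory(traj, params):
        """Feed a sampled trajectory through the network one point at a time."""
        Wxh, Whh, b, Why = params
        h = np.zeros(Whh.shape[0])
        out = []
        for x in traj:
            y, h = elman_step(x, h, Wxh, Whh, b, Why)
            out.append(y)
        return np.array(out)

    def resample(f, rate, n):
        """Sample a scalar trajectory f(t) at a given rate, as in the
        interval- vs. rate-invariance tests described in the abstract."""
        t = np.arange(n) / rate
        return f(t)[:, None]    # shape (n, 1): one input unit
    ```

    Rate invariance would mean the trained network regenerates `resample(f, r2, n)` correctly after training only on `resample(f, r1, n)`; the abstract's finding is that this generally fails.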

    A Gradient Descent VQ Classification Algorithm

    No full text
    Vector Quantization (VQ) has its origins in signal processing where it is used for compact, accurate representation of input signals. However, since VQ induces a partitioning of the input space, it can also be used for statistical pattern recognition. In this paper we present a novel gradient descent VQ classification algorithm which minimizes the Bayes Risk, which we refer to as the Generalized Bayes Risk VQ (GBRVQ), and compare its performance to other VQ classification algorithms. I. Introduction Vector quantization (VQ), a generalization of the quantization of scalars to vectors [1], is the approximation of points (vectors) in a continuous input space by a finite number of representative vectors in the same input space. The basic definition of a vector quantizer is as follows: Definition I.1 (Vector Quantizer) A K-level, n-dimensional vector quantizer Q is a mapping Q : ℝ^n → C from the n-dimensional Euclidean space ℝ^n into a finite set C = {w_1, w_2, …, w_K}.
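    The nearest-codeword decision rule, and a gradient-style codeword update, can be sketched as follows. The update shown is the classical LVQ1 rule used as a stand-in, not the paper's GBRVQ rule (which minimizes Bayes risk and is not reproduced here); the learning rate and labeling scheme are assumptions.

    ```python
    import numpy as np

    def vq_classify(x, codebook, labels):
        """Classify by the label of the nearest codeword, using the
        partition of input space that the quantizer Q induces."""
        j = int(np.argmin(np.sum((codebook - x) ** 2, axis=1)))
        return labels[j]

    def lvq1_update(x, y, codebook, labels, lr=0.05):
        """One LVQ1-style step: pull the winning codeword toward the
        sample if its label matches, push it away otherwise."""
        j = int(np.argmin(np.sum((codebook - x) ** 2, axis=1)))
        sign = 1.0 if labels[j] == y else -1.0
        codebook[j] += sign * lr * (x - codebook[j])
        return codebook
    ```

    The classifier is piecewise constant over the Voronoi cells of the codebook, which is what makes gradient-based placement of the codewords, rather than the decision rule itself, the object of optimization.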

    Real-time video compression using entropy-biased ANN codebooks

    No full text
    We describe hardware that has been built to compress video in real time using full-search vector quantization (VQ). This architecture implements a differential-vector-quantization (DVQ) algorithm which features entropy-biased codebooks designed using an artificial neural network (ANN). A special-purpose digital associative memory, the VAMPIRE chip, performs the VQ processing. We describe the DVQ algorithm, its adaptations for sampled NTSC composite-color video, and details of its hardware implementation. We conclude by presenting results drawn from real-time operation of the DVQ hardware. 1. INTRODUCTION Vector quantization has become well-known and widely studied since Shannon first established the merits of quantizing vectors rather than scalars [1]. Since that time, vector quantization (VQ) has been shown to be useful in the realm of data compression, particularly attracting attention for its efficient compression of digitized speech and image data. Meanwhile, as digital data has b..