11 research outputs found

    Development and evaluation of packet video schemes

    Results reflecting the two tasks proposed for the current year, namely a feasibility study of simulating the NASA network and a study of progressive transmission schemes, are presented. The view of the NASA network, gleaned from the various technical reports made available to us, is provided, along with a brief overview of how the current simulator could be modified to accomplish the goal of simulating the NASA network. As the material in this section would be the basis for the actual simulation, it is important to make sure that it is an accurate reflection of the requirements on the simulator. Brief descriptions of the set of progressive transmission algorithms selected for the study are also included. The results available in the literature were obtained under a variety of different assumptions, not all of which are stated; as such, the only way to compare the efficiency and the implementational complexity of the various algorithms is to simulate them.

    Study and simulation of low rate video coding schemes

    The semiannual report is included. Topics covered include communication, information science, data compression, remote sensing, color mapped images, a robust coding scheme for packet video, recursively indexed differential pulse code modulation, an image compression technique for use on token ring networks, and joint source/channel coder design.

    Studies and simulations of the DigiCipher system

    During this period the development of simulators for the various high definition television (HDTV) systems proposed to the FCC was continued. The FCC has indicated that it wants the various proposers to collaborate on a single system. Based on all available information, this system will look very much like the advanced digital television (ADTV) system, with major contributions only from the DigiCipher system. The results of our simulations of the DigiCipher system are described. This simulator was tested using test sequences from the MPEG committee, and the results are extrapolated to HDTV video sequences. Once again, some caveats are in order. The sequences used for testing the simulator and generating the results are those used for testing the MPEG algorithm. These sequences are of much lower resolution than HDTV sequences would be, and therefore the extrapolations are not totally accurate; one would expect significantly higher compression, in terms of bits per pixel, with sequences of higher resolution. However, the simulator itself is a valid one, and should HDTV sequences become available, they could be used directly with the simulator. A brief overview of the DigiCipher system is given, and some coding results obtained using the simulator are examined. These results are compared to those obtained using the ADTV system, evaluated in the context of the CCSDS specifications, and used to make some suggestions as to how the DigiCipher system could be implemented in the NASA network. Simulations such as the ones reported can be biased depending on the particular source sequence used. In order to get more complete information about the system, one needs a reasonable set of models which mirror the various kinds of sources encountered during video coding. A set of models which can be used to effectively model the various possible scenarios is provided. As this is somewhat tangential to the other work reported, those results are included as an appendix.

    Early termination algorithms for correlation coefficient based block matching

    Block based motion compensation techniques make frequent use of Early Termination Algorithms (ETAs) to reduce the computational cost of the block matching process. ETAs have been well studied in the context of the Sum of Absolute Differences (SAD) match measure and are effective in eliminating a large percentage of computations. Compared to SAD, the correlation coefficient (ρ) is a more robust measure but has a high computational cost, because no ETAs for ρ have been reported in the literature. In this paper, we propose two types of ETAs for the correlation coefficient: growth based and bound based. In the growth based ETA, ρ is computed as a monotonically decreasing measure; at a given search location, when the partial value of ρ falls below the best maximum found so far, the remaining calculations are discarded. In the bound based ETA, a new upper bound on ρ is derived which is tighter than the commonly used Cauchy-Schwarz inequality; search locations where the proposed bound falls below the best maximum found so far are eliminated from the search space. Both types of algorithms are implemented in a cascade and tested on a commercial video dataset. In our experiments, up to 88% of computations are eliminated. In terms of execution time, our algorithm is up to 13.7 times faster than the FFTW based implementation and up to 4.6 times faster than the best known spatial domain technique.
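    The growth based termination idea is well established for the SAD case the abstract mentions, because the partial distortion grows monotonically as pixels are accumulated. The sketch below illustrates that SAD case only, under assumed function names and a square block; it is not the paper's ρ algorithm:

```python
import numpy as np

def sad_with_early_termination(cur, cand, best_so_far):
    """Accumulate SAD row by row; since the partial sum can only grow,
    abandon the candidate as soon as it can no longer beat the best
    (smallest) SAD found so far."""
    partial = 0.0
    for row_c, row_k in zip(cur, cand):
        partial += np.abs(row_c - row_k).sum()
        if partial >= best_so_far:   # early termination test
            return None
    return partial

def full_search(cur, frame):
    """Exhaustive block matching over all candidate positions,
    pruned by the early termination test above."""
    block = cur.shape[0]             # assumes a square block
    best, best_pos = np.inf, None
    h, w = frame.shape
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            sad = sad_with_early_termination(
                cur, frame[y:y + block, x:x + block], best)
            if sad is not None:
                best, best_pos = sad, (y, x)
    return best_pos, best
```

    The paper's contribution is extending this kind of pruning to ρ, whose natural partial sums are not monotonic; the growth based ETA reformulates ρ as a monotonically decreasing quantity so an analogous test applies.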

    Serious Play Approaches for Creating, Sharing, and Mobilizing Tacit Knowledge in Cross-disciplinary Settings

    Serious play, the notion of bringing the benefits of play to bear on work-related tasks, is receiving more attention as a remedy to many challenges of the modern knowledge economy. Exploring and defining the role of serious play approaches in facilitating collaborative problem-solving and value creation, this dissertation consists of four related research papers. The first research paper (RP1) reconciles three different conceptualizations of knowledge into a new theory of knowledge. This pluralistic definition allows knowledge to change character across the span of the value creation process. The paper further introduces a model called the Wheel of Knowledge (WoK) for mobilizing knowledge throughout the different knowledge conversions of the value creation process. The second research paper (RP2) advocates that serious play can scaffold and accelerate these knowledge conversion processes; it disaggregates existing serious play approaches and starts to operationalize the WoK by using it to match different types of serious play approaches to different types of knowledge conversion challenges. The third research paper (RP3) validates the WoK by sorting the serious play literature according to how it applies to the different knowledge conversion processes. The paper provides a framework for ascertaining the applicability of serious play methods to specific knowledge conversion challenges and identifies under-explored research areas of the serious play field. The fourth research paper (RP4) tests the recommendations of RP3 by applying the LEGO® Serious Play® (LSP) method to a knowledge conversion challenge focused on tacit knowledge sharing. It reports on a mixed-methods, multi-session case study in which LSP was used to facilitate cross-disciplinary dialogue and deliberation about a wicked problem. Results show that LSP is particularly useful in the beginning of a value creation process and that it facilitates socialization and tacit knowledge sharing.
    Taken together, the papers demonstrate the necessity, potential, and application of serious play as a catalyst for the knowledge conversion processes presented in the WoK. It is now clear that different serious play approaches are suitable, respectively, as an accelerator for trust-building and collective creativity, as a conduit for iterative innovation, and as a way of making rote tasks more engaging.

    Space Station Furnace Facility. Volume 2: Requirements Definition and Conceptual Design Study. Appendix 3: Environment Analysis

    A Preliminary Safety Analysis (PSA) is being accomplished as part of the Space Station Furnace Facility (SSFF) contract. This analysis is intended to support SSFF activities by analyzing concepts and designs as they mature, to develop essential safety requirements for inclusion in the appropriate specifications and designs as early as possible. In addition, the analysis identifies significant safety concerns that may warrant specific trade studies or design definition. The analysis activity to date has concentrated on hazard and hazard cause identification and requirements development, with the goal of developing a baseline set of detailed requirements to support trade study, specifications development, and preliminary design activities. The analysis activity will continue as the design and concepts mature. Section 2 defines what was analyzed, but it is likely that the SSFF definitions will undergo further changes; the safety analysis activity will reflect these changes as they occur. The analysis provides the foundation for later safety activities. The hazards identified will in most cases have Preliminary Design Review (PDR) applicability. The requirements and recommendations developed for each hazard will be tracked to ensure proper and early resolution of safety concerns.

    Accessing Space: A Catalogue of Process, Equipment and Resources for Commercial Users, 1990

    A catalogue is presented which is intended for commercial developers who are considering, or who have in progress, a project involving the microgravity environment of space or remote sensing of the Earth. An orientation is given to commercial space activities, along with a current inventory of equipment, apparatus, carriers, vehicles, resources, and services available from NASA, other government agencies, and U.S. industry. The information describes the array of resources that commercial users should consider when planning ground- or space-based developments. Many items listed have flown in space or been tested in labs and aboard aircraft, and can be reused, revitalized, or adapted to suit specific requirements. New commercial ventures are encouraged to exploit existing inventory and expertise to the greatest extent possible.

    General Catalog 1986-1988

    The General Catalog of 1986-1988 contains course descriptions, the University calendar, and college administration.

    Video coding with linear compensation (VCLC)

    Block based motion compensation techniques are commonly used in video encoding to reduce the temporal redundancy of the signal. In these techniques, each block in a video frame is matched with another block in a previous frame. The match criterion normally used is the minimization of the sum of absolute differences (SAD). Traditional encoders take the difference of the current block and its best matching block, and this differential signal is used for further processing. Rather than directly encoding the difference between the two blocks, we propose that the difference between the current block and its first order linear estimate from the best matching block should be used. This choice of using the linear compensated differential signal is motivated by observing frequent brightness and contrast changes in real videos. We show two important theoretical results: (1) the variance of the linear compensated differential signal is always less than or equal to the variance of the differential signal in traditional encoders; (2) the optimal criterion for finding the best matching block, in our proposed scheme, is the maximization of the magnitude of the correlation coefficient. The theoretical results are verified through experimentation on a large dataset taken from several commercial videos. For the same number of bits per pixel, our proposed scheme exhibits an improvement in peak signal to noise ratio (PSNR) of up to 5 dB when compared to the traditional encoding scheme.
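    The first order linear estimate described above can be sketched as a least squares fit of the current block against its matched reference block. The helper below is illustrative only, with an assumed name and interface, not the paper's implementation; because the plain difference corresponds to the particular choice of gain 1 and offset 0, the least squares residual can never have larger variance, consistent with theoretical result (1):

```python
import numpy as np

def linear_compensated_residual(cur, ref):
    """Fit cur ≈ a*ref + b in the least squares sense and return the
    residual cur - (a*ref + b), instead of the plain difference
    cur - ref used by traditional encoders."""
    c, r = cur.ravel().astype(float), ref.ravel().astype(float)
    var_r = r.var()
    # Least squares gain and offset (gain falls back to 0 for a flat block)
    a = 0.0 if var_r == 0 else ((c - c.mean()) * (r - r.mean())).mean() / var_r
    b = c.mean() - a * r.mean()
    return cur - (a * ref + b)
```

    The residual variance works out to var(cur)·(1 − ρ²), where ρ is the correlation coefficient between the blocks, which is why the matching criterion becomes the maximization of |ρ|: the stronger the correlation in either direction, the less energy is left to encode.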

    General Catalog 1972-1974

    The General Catalog of 1972-1974 contains course descriptions, the University calendar, and college administration.