
    A comparison of time-domain time-scale modification algorithms

    Time-domain approaches to time-scale modification are popular due to their ability to produce high-quality results at a relatively low computational cost. Within the category of time-domain implementations, quite a number of alternatives exist, each with its own computational requirements and associated output quality. This paper provides a computational and objective output-quality assessment of a number of popular time-domain time-scaling implementations, thus providing a means for developers to identify a suitable algorithm for their application of interest. In addition, the issues that should be considered in developing time-domain algorithms are outlined, purely in the context of a waveform editing procedure.
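    The family of algorithms compared here shares a common core: frames are read from the input at one hop size and overlap-added at another. A minimal sketch of that core, plain overlap-add time stretching, is shown below; the function name and parameters are illustrative, and real time-domain algorithms such as SOLA or WSOLA additionally search for a waveform-similar frame offset to avoid phase discontinuities.

```python
import numpy as np

def ola_time_stretch(x, alpha, frame_len=1024, synthesis_hop=256):
    """Stretch signal x by factor alpha (>1 slows playback) using plain
    overlap-add -- the simplest time-domain TSM scheme. Frames are read
    at the analysis hop and written at the synthesis hop."""
    analysis_hop = int(synthesis_hop / alpha)   # read faster or slower
    window = np.hanning(frame_len)
    n_frames = max(1, (len(x) - frame_len) // analysis_hop + 1)
    out_len = (n_frames - 1) * synthesis_hop + frame_len
    out = np.zeros(out_len)
    norm = np.zeros(out_len)                    # window-overlap normalisation
    for i in range(n_frames):
        a = i * analysis_hop                    # where we read
        s = i * synthesis_hop                   # where we write
        out[s:s + frame_len] += window * x[a:a + frame_len]
        norm[s:s + frame_len] += window
    norm[norm < 1e-8] = 1.0                     # avoid divide-by-zero at edges
    return out / norm
```

    Plain OLA like this is cheap but can smear transients; the algorithms surveyed in the paper differ mainly in how they align frames before the overlap-add step.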

    Synchronizing Sequencing Software to a Live Drummer

    Copyright 2013 Massachusetts Institute of Technology. MIT allows authors to archive published versions of their articles after an embargo period. The article is available at

    Streamlining Sound Speed Profile Pre-Processing: Case Studies and Field Trials

    High-rate sound speed profiling systems have the potential to maximize the efficiency of multibeam echosounder systems (MBES) by increasing the accuracy at the outer edges of the swath, where refraction effects are at their worst. In some cases, high-rate sampling on the order of tens of casts per hour is required to capture the spatio-temporal oceanographic variability, and this increased sampling rate can challenge the data acquisition workflow if refraction corrections are to be applied in real time. Common bottlenecks result from sound speed profile (SSP) pre-processing requirements, e.g. file format conversion, cast extension, reduction of the number of points in the cast, filtering, etc. Without the ability to quickly pre-process SSP data, the MBES operator can quickly become overwhelmed with SSP-related tasks, potentially to the detriment of their other duties. A series of algorithms are proposed in which SSPs are automatically pre-processed to meet the input criteria of MBES acquisition systems; specifically, the problems of cast extrapolation and thinning are addressed. The algorithmic performance will be assessed in terms of sounding uncertainty through a series of case studies in a variety of oceanographic conditions and water depths. Results from a field trial in the French Mediterranean will be used to assess the improvement in real-time MBES acquisition workflow and survey accuracy, and will also highlight where further improvements can be made in the pre-processing pipeline.
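    One of the pre-processing steps named above, thinning a cast to fewer points, is commonly done by discarding samples whose value can be recovered by linear interpolation within a tolerance. The sketch below uses a Douglas-Peucker-style recursion for that purpose; the function name, tolerance value, and acceptance criterion are assumptions for illustration, not the paper's exact method.

```python
def thin_profile(depths, speeds, tol=0.2):
    """Reduce a sound speed profile (depth in m, speed in m/s) to fewer
    points while keeping the linear-interpolation error below `tol` m/s.
    Douglas-Peucker-style thinning: keep the worst-fitting point of each
    segment and recurse until every segment fits within tolerance."""
    def recurse(i, j, keep):
        worst, idx = 0.0, None
        for k in range(i + 1, j):
            # deviation of point k from the chord between points i and j
            t = (depths[k] - depths[i]) / (depths[j] - depths[i])
            interp = speeds[i] + t * (speeds[j] - speeds[i])
            err = abs(speeds[k] - interp)
            if err > worst:
                worst, idx = err, k
        if idx is not None and worst > tol:
            keep.add(idx)
            recurse(i, idx, keep)
            recurse(idx, j, keep)
    keep = {0, len(depths) - 1}       # always retain surface and deepest point
    recurse(0, len(depths) - 1, keep)
    idxs = sorted(keep)
    return [depths[k] for k in idxs], [speeds[k] for k in idxs]
```

    A thinning criterion stated in sound-speed error is convenient because it maps directly onto the refraction-induced sounding uncertainty the case studies evaluate.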

    Interactive Space-Time Reconstruction in Computer Graphics

    High-quality dense spatial and/or temporal reconstructions and correspondence maps from camera images, be it optical flow, stereo or scene flow, are an essential prerequisite for a multitude of computer vision and graphics tasks, e.g. scene editing or view interpolation in visual media production. Due to the ill-posed nature of the estimation problem in typical setups (i.e. a limited number of cameras and a limited frame rate), automated estimation approaches are prone to erroneous correspondences and subsequent quality degradation in many non-trivial cases such as occlusions, ambiguous movements, long displacements, or low texture. While improving estimation algorithms is one obvious possible direction, this thesis complementarily concerns itself with creating intuitive, high-level user interactions that lead to improved correspondence maps and scene reconstructions. Where visually convincing results are essential, rendering artifacts resulting from estimation errors are usually repaired by hand with image editing tools, which is time-consuming and therefore costly. My new user interactions, which integrate human scene recognition capabilities to guide a semi-automatic correspondence or scene reconstruction algorithm, save considerable effort and enable faster and more efficient production of visually convincing rendered images.

    Performance and policy dimensions in internet routing

    The Internet Routing Project, referred to in this report as the 'Highball Project', has been investigating architectures suitable for networks spanning large geographic areas and capable of very high data rates. The Highball network architecture is based on high-speed crossbar switches, one attached to each node, and an adaptive, distributed TDMA scheduling algorithm. The scheduling algorithm controls the instantaneous configuration and dwell time of each switch. In order to send a single burst or a multi-burst packet, a reservation request is sent to all nodes. The scheduling algorithm then configures the switches immediately prior to the arrival of each burst, so it can be relayed immediately without requiring local storage. Reservations and housekeeping information are sent using a special broadcast-spanning-tree schedule. Progress to date in the Highball Project includes the design and testing of a suite of scheduling algorithms, construction of software reservation/scheduling simulators, and construction of a strawman hardware and software implementation. A prototype switch controller and timestamp generator have been completed and are in test. Detailed documentation on the algorithms, protocols and experiments conducted is given in various published reports and papers. Abstracts of this literature are included in the bibliography at the end of this report, which serves as an extended executive summary.
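    The key property of the scheme described above is that reservation requests are broadcast to all nodes, so every node can run the same deterministic scheduler and agree on when each switch must be configured. The toy sketch below captures that idea with a greedy slot assignment; the function name and the tuple format are illustrative assumptions, and the real Highball algorithms also account for propagation delays and switch reconfiguration time.

```python
def schedule_bursts(requests):
    """Greedy toy scheduler in the spirit of the Highball design: every
    node sees the identical broadcast request list (src, dst, duration)
    and deterministically assigns each burst the earliest interval in
    which both the sender's output port and the receiver's input port
    are idle, so bursts are relayed without local storage."""
    tx_free = {}   # node -> time its output port becomes free
    rx_free = {}   # node -> time its input port becomes free
    plan = []
    for src, dst, dur in requests:
        start = max(tx_free.get(src, 0), rx_free.get(dst, 0))
        plan.append((src, dst, start, start + dur))
        tx_free[src] = rx_free[dst] = start + dur
    return plan
```

    Because the input list and the tie-breaking rule are identical everywhere, all nodes derive the same switch configuration times without any further coordination traffic.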

    NASA Ocean Altimeter Pathfinder Project

    The NOAA/NASA Pathfinder program was created by the Earth Observing System (EOS) Program Office to determine how satellite-based data sets can be processed and used to study global change. The data sets are designed to be long time-series data processed with stable calibration and community-consensus algorithms to better assist the research community. The Ocean Altimeter Pathfinder Project involves the reprocessing of all altimeter observations with a consistent set of improved algorithms, based on the results from TOPEX/POSEIDON (T/P), into easy-to-use data sets for the oceanographic community for climate research. This report describes the processing schemes used to produce a consistent data set and two of the products derived from these data. Other reports have been produced that: a) describe the validation of these data sets against tide gauge measurements and b) evaluate the statistical properties of the data that are relevant to climate change. The use of satellite altimetry for earth observations was proposed in the early 1960s. The first successful space-based radar altimeter experiment was flown on SkyLab in 1974. The first successful satellite radar altimeter was flown aboard the Geos-3 spacecraft between 1975 and 1978. While a useful data set was collected from this mission for geophysical studies, the noise in the radar measurements and incomplete global coverage precluded it from inclusion in the Ocean Altimeter Pathfinder program. This program initiated its analysis with the Seasat mission, which was the first satellite radar altimeter flown for oceanography.
