16 research outputs found

    Computing Methodologies Supporting the Preservation of Electroacoustic Music from Analog Magnetic Tape

    Electroacoustic music on analog magnetic tape is characterized by several carrier-related specificities that have to be considered when creating a digital preservation copy of a document. The tape recorder needs to be set up with the correct speed and equalization; moreover, the magnetic tape may present intentional or unintentional alterations. During both the creation and the musicological analysis of a digital preservation copy, the quality of the work can be affected by lapses in human attention. This paper presents a methodology based on neural networks that recognizes and classifies the alterations of a magnetic tape from video of the tape itself flowing over the head of the tape recorder. Furthermore, several machine learning techniques have been tested to recognize the equalization of a tape from its background noise. The encouraging results open the way to innovative tools that unburden audio technicians and musicologists from repetitive tasks and improve the quality of their work.
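The second task the abstract mentions (recognizing a tape's equalization from its background noise) can be illustrated with a minimal sketch. This is not the paper's method: it assumes equalization standards leave different spectral tilts in the noise floor, and uses a nearest-centroid classifier on synthetic spectra with invented class names.

```python
import numpy as np

# Hypothetical sketch: telling two equalization standards (CCIR vs. NAB)
# apart from the spectral shape of the background noise, via a
# nearest-centroid classifier on coarse band energies. All spectra below
# are synthetic, not measured data.

def band_energies(spectrum, n_bands=4):
    """Average the magnitude spectrum into coarse frequency bands."""
    bands = np.array_split(spectrum, n_bands)
    return np.array([b.mean() for b in bands])

def train_centroids(spectra, labels):
    """Compute one mean feature vector (centroid) per class label."""
    feats = np.array([band_energies(s) for s in spectra])
    return {lab: feats[labels == lab].mean(axis=0) for lab in set(labels)}

def classify(spectrum, centroids):
    """Assign the label of the nearest centroid in feature space."""
    f = band_energies(spectrum)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))

# Synthetic noise spectra: one class tilts up, the other down.
rng = np.random.default_rng(0)
freqs = np.linspace(0, 1, 64)
ccir = np.array([1.0 + 0.8 * freqs + 0.05 * rng.standard_normal(64) for _ in range(10)])
nab = np.array([1.0 - 0.4 * freqs + 0.05 * rng.standard_normal(64) for _ in range(10)])
spectra = np.vstack([ccir, nab])
labels = np.array(["CCIR"] * 10 + ["NAB"] * 10)

centroids = train_centroids(spectra, labels)
print(classify(ccir[0], centroids))
```

A real system would replace the hand-made band energies with features learned from recordings, but the decision structure (compare a noise signature against per-equalization references) is the same.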

    Production and analysis of a Southern Ocean state estimate

    Submitted in partial fulfillment of the requirements for the degree of Master of Science in Physical Oceanography, Massachusetts Institute of Technology and the Woods Hole Oceanographic Institution, September 2006. A modern general circulation model of the Southern Ocean with one-sixth-degree resolution is optimized to the observed ocean in a weighted least squares sense. Convergence toward the state estimate solution is carried out by systematically adjusting the control variables (prescribed atmospheric state, initial conditions, and open northern boundary at 24.7°S) using the adjoint method. A cost function compares the model state to data from CTD synoptic sections, hydrographic climatology, satellite altimetry, and XBTs. Costs attributed to control variable perturbations ensure a physically realistic solution. An optimized solution is determined by the weights placed on the cost function terms. The state estimation procedure, along with the weights used, is described. A significant result is that the adjoint method is shown to work at eddy-permitting resolution in the highly energetic Southern Ocean. At the time of the writing of this thesis, the state estimate was not fully consistent with the observations. An analysis of the remaining misfit, as well as of the mass transport in the preliminary state, is presented.
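The core loop described here (adjust control variables to descend a weighted least-squares cost) can be sketched in a toy form. This is only an illustration: the real system uses the adjoint of a full GCM to obtain the gradient, whereas the stand-in "model" below is a one-parameter linear response whose gradient is analytic, and all numbers are invented.

```python
import numpy as np

# Toy sketch of the state-estimation idea: tune a single control
# variable (a "forcing" parameter) to minimize a weighted least-squares
# misfit between a trivial model state and observations.

def model(forcing, n=5):
    """Stand-in 'model state': linear response to the control variable."""
    return forcing * np.arange(1, n + 1, dtype=float)

def cost(forcing, obs, weights):
    """Weighted least-squares misfit between model state and data."""
    misfit = model(forcing) - obs
    return float(np.sum(weights * misfit**2))

def grad(forcing, obs, weights):
    """Analytic gradient of the cost w.r.t. the control variable
    (the role played by the adjoint model in the real system)."""
    x = np.arange(1, len(obs) + 1, dtype=float)
    return float(np.sum(2 * weights * (forcing * x - obs) * x))

obs = 2.0 * np.arange(1, 6, dtype=float)  # data consistent with forcing = 2
weights = np.ones(5)                      # cost-function weights
forcing = 0.0                             # first guess for the control
for _ in range(200):                      # steepest-descent iterations
    forcing -= 1e-3 * grad(forcing, obs, weights)

print(round(forcing, 3))  # converges toward 2.0
```

The weights are where the "optimized solution is determined by the weights placed on the cost function terms" enters: rescaling `weights` per observation type changes which data the descent fits most closely.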

    A systemic approach to the preservation of audio documents: methodology and software tools

    This paper presents a methodology for the preservation of audio documents, the operational protocol that puts the methodology into practice, and an original open source software system that supports and automates several tasks along the process. The methodology is presented in the light of the ethical debate that has challenged the international archival community for the last thirty years. The operational protocol reflects the methodological principles adopted by the authors, and its effectiveness is based on the results obtained in recent research projects involving some of the finest audio archives in Europe. Some recommendations are given for the re-recording process, aimed at minimizing information loss and at quantifying the unintentional alterations introduced by the technical equipment. Finally, the paper introduces an original software system that guides and supports the preservation staff along the process, reducing processing time, automating tasks, minimizing errors, and using information-hiding strategies to ease the cognitive load. The software system is currently in use in several international archives.

    Bayesian Learning of Coupled Biogeochemical-Physical Models

    Predictive dynamical models for marine ecosystems are used for a variety of needs. Due to sparse measurements and limited understanding of the myriad ocean processes, there is however significant uncertainty. There is model uncertainty in the parameter values, functional forms with diverse parameterizations, level of complexity needed, and thus in the state fields. We develop a Bayesian model learning methodology that allows interpolation in the space of candidate models and discovery of new models from noisy, sparse, and indirect observations, all while estimating state fields and parameter values, as well as the joint PDFs of all learned quantities. We address the challenges of high-dimensional and multidisciplinary dynamics governed by PDEs by using state augmentation and the computationally efficient GMM-DO filter. Our innovations include stochastic formulation and complexity parameters to unify candidate models into a single general model, as well as stochastic expansion parameters within piecewise function approximations to generate dense candidate model spaces. These innovations allow handling many compatible and embedded candidate models, possibly none of which are accurate, and learning elusive unknown functional forms. Our new methodology is generalizable, interpretable, and extrapolates out of the space of models to discover new ones. We perform a series of twin experiments based on flows past a ridge coupled with three-to-five-component ecosystem models, including flows with chaotic advection. The probabilities of known, uncertain, and unknown model formulations, and of state fields and parameters, are updated jointly using Bayes' law. Non-Gaussian statistics, ambiguity, and biases are captured. The parameter values and model formulations that best explain the data are identified. When observations are sufficiently informative, model complexity and functions are discovered.
    Comment: 45 pages; 18 figures; 2 tables
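One ingredient of this methodology, updating the probabilities of candidate model formulations with Bayes' law, can be sketched in miniature. This is a deliberately simplified illustration, not the paper's machinery: candidate models are reduced to scalar predictions with Gaussian observation noise, and all numbers are invented.

```python
import numpy as np

# Minimal sketch: posterior probabilities over a few candidate models,
# updated sequentially as observations arrive. posterior ∝ prior × likelihood.

def gaussian_likelihood(obs, pred, sigma=1.0):
    """Likelihood of an observation given a model's prediction."""
    return np.exp(-0.5 * ((obs - pred) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def bayes_update(prior, preds, obs, sigma=1.0):
    """Bayes' law over the discrete set of candidate models."""
    like = np.array([gaussian_likelihood(obs, p, sigma) for p in preds])
    post = prior * like
    return post / post.sum()

prior = np.array([1 / 3, 1 / 3, 1 / 3])  # three equally plausible candidates
preds = np.array([0.0, 1.0, 5.0])        # each model's predicted observable
p = prior
for obs in [1.2, 0.8, 1.1]:              # assimilate observations one by one
    p = bayes_update(p, preds, obs)

print(p.argmax())  # the model predicting 1.0 best explains the data
```

The paper's setting is far richer (PDE dynamics, state augmentation, the GMM-DO filter, continuous complexity parameters), but the joint update of model probabilities follows this same Bayesian pattern.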

    Design of a system for the long-term, legally compliant preservation of electronic documents

    The article presents the work carried out by the Institute of Clinical Physiology of the CNR to design a system for the electronic management of the data produced during clinical activity. To guarantee long-term preservation and compliance with the requirements of legal validity, it was necessary to use tools and methods that counter the aging of storage media, the obsolescence of software formats, and the expiry of digital signatures. The system was designed in compliance with CNIPA regulations and in conformity with the ISO OAIS standard. The final result was a system that is architecturally very simple, modular, and flexible with a view to its adoption by other institutions. Particular attention was paid to indexing tools and to the use of open source software.

    Sampling in human cognition

    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Brain and Cognitive Sciences, 2010. Cataloged from PDF version of thesis. Includes bibliographical references (p. 117-126).
    Bayesian Decision Theory describes optimal methods for combining sparse, noisy data with prior knowledge to build models of an uncertain world and to use those models to plan actions and make novel decisions. Bayesian computational models correctly predict aspects of human behavior in cognitive domains ranging from perception to motor control and language. However, the predictive success of Bayesian models of cognition has highlighted long-standing challenges in bridging the computational and process levels of cognition. First, the computations required for exact Bayesian inference are incommensurate with the limited resources available to cognition (e.g., computational speed and memory). Second, Bayesian models describe computations but not the processes that carry out these computations, and they fail to accurately predict human behavior under conditions of cognitive load or deficits. I suggest a resolution to both challenges: the mind approximates Bayesian inference by sampling. Experiments across a wide range of cognition demonstrate Monte-Carlo-like behavior by human observers; moreover, models of cognition based on specific Monte Carlo algorithms can describe previously elusive cognitive phenomena such as perceptual bistability and probability matching. When sampling algorithms are treated as process models of human cognition, the computational and process levels can be modeled jointly to shed light on new and old cognitive phenomena.
    by Edward Vul, Ph.D.
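The central claim (that sample-based approximation of a posterior reproduces effects like probability matching) can be sketched numerically. This is an invented two-hypothesis illustration under stated assumptions, not an experiment from the thesis: with one posterior sample, choice frequencies track the posterior itself; with many samples, choices concentrate on the more probable hypothesis.

```python
import numpy as np

# Sketch: decisions made from k posterior samples rather than the exact
# posterior. k = 1 yields probability matching; larger k approaches
# posterior maximization.

rng = np.random.default_rng(1)
posterior = np.array([0.7, 0.3])  # exact posterior over two hypotheses

def sample_based_choice(k):
    """Draw k posterior samples and report the most frequent hypothesis."""
    draws = rng.choice(2, size=k, p=posterior)
    return np.bincount(draws, minlength=2).argmax()

def choice_frequencies(k, trials=10_000):
    """How often each hypothesis is chosen across many simulated decisions."""
    picks = [sample_based_choice(k) for _ in range(trials)]
    return np.bincount(picks, minlength=2) / trials

f1 = choice_frequencies(k=1)    # ≈ the posterior itself (probability matching)
f25 = choice_frequencies(k=25)  # concentrates on the more probable hypothesis
print(f1[0], f25[0])
```

The odd sample count (25) avoids ties in the majority vote; the qualitative effect, decisions sharpening as the number of samples grows, is the point of the illustration.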

    Shape segmentation and retrieval based on the skeleton cut space

    3D shape collections are growing rapidly in many application areas. To use them effectively in modeling, simulation, or 3D content creation, one must process 3D shapes. Examples include cutting a shape into its natural parts (known as segmentation) and finding shapes similar to a given model in a large shape collection (known as retrieval). This thesis presents new methods for 3D shape segmentation and retrieval based on the so-called surface skeleton of a 3D shape. Although such skeletons have long been known, only recently have they become computable quickly, robustly, and almost automatically. These developments enable us to use surface skeletons to characterize and analyze shapes, so that operations such as segmentation and retrieval can be performed quickly and automatically. We compare our new methods with state-of-the-art methods for the same purposes and show that our approach can produce qualitatively better results. Finally, we present a new method for extracting surface skeletons that is much simpler than, and comparable in speed to, the best techniques in its class. In summary, this thesis shows how a complete workflow for segmenting and retrieving 3D shapes can be implemented using surface skeletons alone.

    Assimilation of airborne cloud radar data during the HyMeX field campaign

    Cloud radars are undeniable assets for Numerical Weather Prediction (NWP). Thanks to their short wavelength, they are highly sensitive to cloud particles and are easily deployed aboard mobile platforms. This thesis evaluated the contribution of cloud radar observations to the validation and initialization of kilometer-scale NWP models. In the first part, an observation operator for W-band reflectivity was designed, consistent with the one-moment microphysics scheme of Arome, Météo-France's kilometer-scale NWP model, but general enough to be adapted to other kilometer-scale NWP models. It is suitable for vertically pointing radars, whether airborne or ground-based. To separate errors in the positioning of clouds forecast by Arome from those present in the observation operator itself, a new validation method, called the "most resembling column" (CPR) method, was developed. This method was used to validate and calibrate the observation operator using the reflectivity profiles collected by the airborne cloud radar Rasta under varied conditions during the first Special Observation Period (SOP1) of the international HyMeX program, which aims to improve our understanding of the water cycle in the Mediterranean. The second part examined the respective contributions of assimilating vertical profiles of reflectivity and of horizontal wind measured by the Rasta cloud radar in Arome's three-dimensional variational (3DVar) assimilation system. The benefit of having consistent thermodynamic conditions (via the assimilation of W-band reflectivity) and dynamic conditions (via the assimilation of horizontal wind profiles) in the initial state was also studied.
To assimilate W-band reflectivity, the "1D+3DVar" assimilation method, which is operational in Arome for assimilating reflectivities from ground-based precipitation radars, was employed. The 1D Bayesian retrieval of humidity profiles was validated against independent in situ humidity measurements. Assimilation experiments were then conducted on a strongly convective event, as well as over a longer 45-day period. The results notably suggest that jointly assimilating W-band reflectivity profiles and vertical profiles of horizontal wind improves the humidity analyses, and they also suggest a slight improvement in the forecasts of accumulated precipitation.
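The 1D Bayesian retrieval step can be sketched in a toy form: weight a database of paired (reflectivity, humidity) columns by how well each simulated reflectivity profile matches the observed one, then take the weighted-mean humidity profile. This is a schematic of the general 1D Bayesian retrieval idea, not Arome's operational code, and the database values below are synthetic.

```python
import numpy as np

# Hedged sketch of a 1D Bayesian retrieval in the spirit of "1D+3DVar":
# posterior-mean humidity profile from a database of candidate columns,
# weighted by a Gaussian misfit in reflectivity space.

def bayesian_retrieval(obs_refl, db_refl, db_humid, sigma=1.0):
    """Weight database humidity profiles by reflectivity misfit."""
    misfit = np.sum((db_refl - obs_refl) ** 2, axis=1)
    w = np.exp(-0.5 * misfit / sigma**2)
    w /= w.sum()
    return w @ db_humid  # posterior-mean humidity profile

# Tiny synthetic database: 3 candidate columns, 4 vertical levels each.
db_refl = np.array([[10.0, 12.0, 8.0, 5.0],
                    [20.0, 22.0, 18.0, 15.0],
                    [30.0, 32.0, 28.0, 25.0]])
db_humid = np.array([[0.2, 0.3, 0.4, 0.5],
                     [0.5, 0.6, 0.7, 0.8],
                     [0.8, 0.9, 1.0, 1.1]])

obs = np.array([19.0, 23.0, 17.0, 16.0])  # closest to the second column
retrieved = bayesian_retrieval(obs, db_refl, db_humid)
print(retrieved)
```

In the "1D+3DVar" chain, a profile retrieved this way (e.g., relative humidity) is then assimilated as a pseudo-observation by the 3DVar system rather than used directly.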