117 research outputs found

    Independent component analysis in parallel Monte Carlo

    Undergraduate thesis (Trabalho de Conclusão de Curso), Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2018. This study investigates whether applying independent component analysis (ICA) techniques to pseudo-random samples can improve the convergence of parallel Monte Carlo. One reason for the slow convergence of parallel Monte Carlo is correlation among the samples that feed the method. Two distinct ICA techniques were used for this verification: FastICA and self-organizing maps. A simple Monte Carlo method was proposed to make the verification possible, and the results were also compared across two models of pseudo-random number generator. Under the conditions proposed in this study, neither FastICA nor self-organizing maps were able to produce less correlated samples, and they are therefore not viable alternatives for optimizing the parallel Monte Carlo method.
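
    As a rough illustration of the kind of experiment this abstract describes (not the thesis's actual code), the following Python sketch feeds a pseudo-random stream through scikit-learn's FastICA before using it in a toy Monte Carlo estimate of pi. The integrand, sample count, and rank-based re-uniformization are all assumptions made here for the example.

```python
import numpy as np
from sklearn.decomposition import FastICA

def in_quarter_circle(x):
    # Indicator of the quarter unit circle, so 4 * E[f] approximates pi.
    return (x[:, 0] ** 2 + x[:, 1] ** 2) <= 1.0

rng = np.random.default_rng(0)
samples = rng.random((10_000, 2))                      # pseudo-random input stream
plain_estimate = 4.0 * in_quarter_circle(samples).mean()

# Try to reduce inter-sample correlation with FastICA, then map the
# components back to (0, 1) via their empirical ranks so the integrand
# still sees approximately uniform marginals (an assumption of this sketch).
ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(samples)
ranks = components.argsort(axis=0).argsort(axis=0)
uniformized = (ranks + 0.5) / len(components)
ica_estimate = 4.0 * in_quarter_circle(uniformized).mean()

print(f"plain MC: {plain_estimate:.4f}")
print(f"ICA-post: {ica_estimate:.4f}")  # the thesis reports no gain from this kind of step
```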

    Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain

    This paper explores the technical efficiency of four hotels from the Teixeira Duarte Group, a renowned Portuguese hotel chain. An efficiency ranking of these four hotel units located in Portugal is established using Stochastic Frontier Analysis. This methodology makes it possible to discriminate between measurement error and systematic inefficiency in the estimation process, enabling investigation of the main causes of inefficiency. Several suggestions for efficiency improvement are offered for each hotel studied.
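
    The paper's exact frontier specification is not given in the abstract. As a hedged sketch only, the Python snippet below implements the textbook half-normal stochastic frontier model, y = Xβ + v − u with v ~ N(0, σv²) and u ~ |N(0, σu²)|, via a hand-rolled log-likelihood maximized with SciPy; the synthetic data stand in for the hotel inputs and outputs.

```python
import numpy as np
from scipy import optimize, stats

def neg_loglik(params, y, X):
    """Negative log-likelihood of the half-normal SFA production model
    y = X @ beta + v - u, v ~ N(0, sigma_v^2), u ~ |N(0, sigma_u^2)|."""
    k = X.shape[1]
    beta = params[:k]
    # Log-parameterize the scales so the optimizer keeps them positive.
    sigma_v, sigma_u = np.exp(params[k]), np.exp(params[k + 1])
    sigma = np.hypot(sigma_v, sigma_u)      # composed error scale
    lam = sigma_u / sigma_v                 # inefficiency-to-noise ratio
    eps = y - X @ beta
    ll = (np.log(2) - np.log(sigma)
          + stats.norm.logpdf(eps / sigma)
          + stats.norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

# Purely illustrative synthetic data (not the paper's hotel dataset).
rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
u = np.abs(rng.normal(0.0, 0.4, n))         # one-sided inefficiency
v = rng.normal(0.0, 0.2, n)                 # symmetric measurement noise
y = X @ np.array([1.0, 0.8]) + v - u

res = optimize.minimize(neg_loglik, x0=np.zeros(4), args=(y, X), method="BFGS")
print("beta:", res.x[:2], "sigma_v, sigma_u:", np.exp(res.x[2:]))
```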

    Extraction and Detection of Fetal Electrocardiograms from Abdominal Recordings

    The non-invasive fetal ECG (NIFECG), derived from abdominal surface electrodes, offers novel diagnostic possibilities for prenatal medicine. Despite its straightforward applicability, NIFECG signals are usually corrupted by many interfering sources, most significantly the maternal ECG (MECG), whose amplitude usually exceeds that of the fetal ECG (FECG) several times over. Additional noise sources (e.g. muscular/uterine noise, electrode motion) further degrade the signal-to-noise ratio (SNR) of the FECG. These interfering sources, which typically show strongly non-stationary behavior, make FECG extraction and fetal QRS (FQRS) detection demanding signal processing tasks. In this thesis, several of the challenges of NIFECG signal analysis were addressed. To improve NIFECG extraction, the dynamic model of a Kalman filter approach was extended, providing a more adequate representation of the mixture of FECG, MECG, and noise. In addition, novel metrics for FECG signal quality assessment were proposed and evaluated; these quality metrics were then applied to improve FQRS detection and fetal heart rate estimation, based on an innovative evolutionary algorithm and Kalman filter signal fusion, respectively. The elaborated methods were characterized in depth using both simulated and clinical data produced throughout this thesis. To stress-test extraction algorithms under ideal circumstances, a comprehensive benchmark protocol was created and contributed to an extensively improved NIFECG simulation toolbox. The toolbox and a large simulated dataset were released under an open-source license, allowing researchers to compare results in a reproducible manner. Furthermore, to validate the developed approaches under more realistic and challenging conditions, a clinical trial was performed in collaboration with the University Hospital of Leipzig. Aside from serving as a test set for the developed algorithms, the clinical trial enabled exploratory research into the pathophysiological variables and measurement setup configurations that change the abdominal signal's SNR.
    With such broad scope, this dissertation addresses many current aspects of NIFECG analysis and provides suggestions for establishing NIFECG in clinical settings.
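
    The thesis's extended dynamic model is not reproduced in the abstract. As a hedged illustration only, the sketch below shows the generic linear predict/update cycle that such extended-Kalman-filter extraction methods build on; the matrices and the sinusoidal test signal standing in for one ECG channel are purely hypothetical.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    In the EKF-based FECG setting described above, F and H would come
    from linearizing a dynamic ECG model around the current state; here
    they are generic matrices for illustration.
    """
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy usage: track a noisy sinusoid standing in for one abdominal channel.
rng = np.random.default_rng(0)
F = np.array([[1.0, 1.0], [0.0, 1.0]])     # constant-velocity state model
H = np.array([[1.0, 0.0]])                 # we observe position only
Q, R = 1e-4 * np.eye(2), np.array([[0.05]])
x, P = np.zeros(2), np.eye(2)
for t in range(200):
    z = np.array([np.sin(0.1 * t) + 0.2 * rng.standard_normal()])
    x, P = kalman_step(x, P, z, F, H, Q, R)
```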

    Blind source separation methods for hyperspectral imaging: application to urban remote sensing and astrophysics

    In this thesis we developed Blind Source Separation (BSS) methods for hyperspectral images in two application areas: urban remote sensing and astrophysics. The first part of the work concerns spectral unmixing of urban images, with the aim of recovering, in an unsupervised way, the materials present in the scene by extracting their spectra and their proportions. Most existing methods rely on a linear mixing model, which is not valid in urban environments because of 3D structures. The first step was therefore to derive a mixing model adapted to urban environments, starting from physical equations based on radiative transfer theory. The resulting linear-quadratic, spectrally invariant model, and the possible hypotheses on the mixing coefficients, are justified by results obtained on realistic simulated images. For the unmixing, we then proposed BSS methods based on Non-negative Matrix Factorization (NMF), whose gradient computations take the quadratic terms into account. The first method uses a fixed-step gradient descent algorithm, from which a Newton version was also derived; the last is a multiplicative NMF algorithm. The proposed methods outperform a linear method from the literature. In astrophysics, we developed BSS methods for dense stellar field images from the MUSE spectro-imager. Because of the PSF (Point Spread Function), the information contained in a pixel can result from the contributions of several stars; hence the interest of BSS: extracting, from these mixed signals, the spectra of the stars, which are our "sources". The mixing model is linear but spectrally non-invariant. We proposed a BSS method based on the positivity of the data, exploiting the parametric model of the MUSE FSF (Field Spread Function). The method is iterative and alternates spectrum estimation by least squares (with positivity constraints) and FSF parameter estimation by a projected gradient algorithm. The proposed method gives good performance on simulated MUSE images.
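
    The thesis's multiplicative algorithm extends NMF updates with quadratic terms, which the abstract does not spell out. As a baseline sketch only, here are the standard Lee-Seung multiplicative updates for the classical linear model V ≈ WH, with toy data standing in for a pixels-by-bands hyperspectral matrix.

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=500, eps=1e-9, seed=0):
    """Lee-Seung multiplicative NMF updates for V ~= W @ H (Frobenius loss).

    The thesis's algorithms add quadratic (product-of-endmember) terms
    for urban scenes; this sketch shows only the classical linear case.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank))
    H = rng.random((rank, m))
    for _ in range(n_iter):
        # Element-wise multiplicative updates keep W and H non-negative.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy usage: 100 pixels x 50 bands built from 3 hypothetical endmembers.
rng = np.random.default_rng(1)
V = rng.random((100, 3)) @ rng.random((3, 50))   # non-negative mixture
W, H = nmf_multiplicative(V, rank=3)
print("reconstruction error:", np.linalg.norm(V - W @ H))
```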

    Recent Advances in Signal Processing

    Signal processing is a critical task in the majority of new technological inventions and challenges, across a variety of applications in both science and engineering. Classical signal processing techniques have largely worked with mathematical models that are linear, local, stationary, and Gaussian, favoring closed-form tractability over real-world accuracy; these constraints were imposed by the lack of powerful computing tools. During the last few decades, signal processing theories, developments, and applications have matured rapidly and now draw on tools from many areas of mathematics, computer science, physics, and engineering. This book is targeted primarily at students and researchers who want to be exposed to a wide variety of signal processing techniques and algorithms. It includes 27 chapters that fall into five categories depending on the application at hand, ordered to address image processing, speech processing, communication systems, time-series analysis, and educational packages, respectively. The book has the advantage of providing a collection of applications that are completely independent and self-contained, so the interested reader can choose any chapter and skip to another without losing continuity.

    Aeronautical engineering: A continuing bibliography with indexes (supplement 189)

    This bibliography lists 579 reports, articles, and other documents introduced into the NASA scientific and technical information system in June 1985.

    Advanced Testing and Characterization of Bituminous Materials, Two Volume Set

    Bituminous materials are used to build durable roads that sustain diverse environmental conditions. However, due to their complexity and a global shortage of these materials, their design and technical development present several challenges. Advanced Testing and Characterization of Bituminous Materials focuses on fundamental and performance testing.

    Exploring variabilities through factor analysis in automatic acoustic language recognition

    Language Recognition (LR) is the problem of discovering the language of a spoken utterance. This thesis addresses it using short-term acoustic features within a GMM-UBM (Gaussian mixture model / universal background model) adaptation approach. The main problem of many pattern recognition applications is the variability of the observed data. In the context of LR, this troublesome variability has diverse causes, notably speaker characteristics, speech and voice evolution, and acquisition and transmission channels. In Speaker Recognition, the impact of variability can be markedly reduced by the Joint Factor Analysis (JFA) technique; in this work, we introduce this paradigm to Language Recognition. The success of JFA relies on several assumptions. The first is that the observed information can be decomposed into a universal part, a language-dependent part, and a language-independent variability part. The second, more technical, assumption is that the unwanted variability lives in a low-dimensional, globally defined subspace. We analyze how JFA behaves in the context of a GMM-UBM LR system, and we also introduce and analyze its combination with Support Vector Machines (SVMs). The first JFA publications grouped all information unwanted for the task (the variability) into a single component, assumed to follow a Gaussian distribution. This handles the different kinds of variability in a unified manner, but in practice we observe that the hypothesis is not always verified. There are, for example, cases where the data divide logically into two clearly distinct subsets, namely data from telephone sources and from broadcast sources. Here, our detailed investigations show a benefit in handling the two kinds of data with two source-specific systems and selecting as output the score of the system that matches the source category of the test utterance. Selecting between the systems' scores requires a channel-source detector, and we propose several novel designs for such automatic detectors. In this framework, we show that JFA's (subspace) variability factors can be used successfully to detect the source. This opens the interesting perspective of partitioning the data into automatically determined channel-source categories, which, besides adapting to new source conditions, avoids the need for source-labeled training data. The JFA approach yields up to 72% relative cost reduction compared with the GMM-UBM baseline system; using source-specific systems followed by a score selector, we obtain an 81% relative improvement.
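
    A minimal numpy sketch of the JFA decomposition the abstract relies on, M = m + Vy + Ux: an utterance's supervector split into a universal part, a language-dependent part, and a low-rank nuisance part. All dimensions here are hypothetical, and the EM machinery that real JFA systems use to estimate V, U, y, and x is omitted.

```python
import numpy as np

# Hypothetical sizes: C Gaussian components x F features per component.
C, F = 16, 8                    # real systems use e.g. 2048 x 39
sv_dim = C * F                  # supervector dimension
R_lang, R_chan = 5, 3           # ranks of the language and nuisance subspaces

rng = np.random.default_rng(0)
m = rng.normal(size=sv_dim)               # universal (UBM) mean supervector
V = rng.normal(size=(sv_dim, R_lang))     # language subspace
U = rng.normal(size=(sv_dim, R_chan))     # low-rank variability subspace

y = rng.normal(size=R_lang)    # language factors for one utterance
x = rng.normal(size=R_chan)    # nuisance (channel) factors for one utterance

# JFA decomposition: universal part + language-dependent part
# + language-independent variability.
M = m + V @ y + U @ x

# Compensation removes the estimated nuisance term before scoring.
M_compensated = M - U @ x
print(np.allclose(M_compensated, m + V @ y))  # True
```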
