4,213 research outputs found

    Recovering facial shape using a statistical model of surface normal direction

    In this paper, we show how a statistical model of facial shape can be embedded within a shape-from-shading algorithm. We describe how facial shape can be captured using a statistical model of variations in surface normal direction. To construct this model, we make use of the azimuthal equidistant projection to map the distribution of surface normals from the polar representation on a unit sphere to Cartesian points on a local tangent plane. The distribution of surface normal directions is captured using the covariance matrix for the projected point positions. The eigenvectors of the covariance matrix define the modes of shape variation in the fields of transformed surface normals. We show how this model can be trained using surface normal data acquired from range images and how to fit the model to intensity images of faces using constraints on the surface normal direction provided by Lambert's law. We demonstrate that the combination of a global statistical constraint and a local irradiance constraint yields an efficient and accurate approach to facial shape recovery that is capable of recovering fine local surface details. We assess the accuracy of the technique on a variety of images with ground truth as well as on real-world images.
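    The projection step described above can be illustrated with a short sketch. The code below maps unit surface normals to 2D points on the tangent plane at a reference direction using the azimuthal equidistant projection, and notes how the covariance model would be built from the projected coordinates; the function names, the choice of tangent basis and the single reference direction (rather than a per-pixel mean normal) are illustrative assumptions, not the paper's implementation.

        import numpy as np

        def tangent_basis(n0):
            # Orthonormal basis (e1, e2) of the tangent plane at the unit vector n0.
            a = np.array([1.0, 0.0, 0.0]) if abs(n0[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
            e1 = np.cross(n0, a)
            e1 /= np.linalg.norm(e1)
            e2 = np.cross(n0, e1)
            return e1, e2

        def azimuthal_equidistant(normals, n0):
            # Map unit normals (N, 3) to 2D tangent-plane points, preserving the
            # angular distance of each normal from the reference direction n0.
            e1, e2 = tangent_basis(n0)
            cos_t = np.clip(normals @ n0, -1.0, 1.0)
            theta = np.arccos(cos_t)                      # angular distance from n0
            tang = normals - np.outer(cos_t, n0)          # component orthogonal to n0
            norm = np.linalg.norm(tang, axis=1, keepdims=True)
            norm[norm == 0.0] = 1.0                       # normals equal to n0 map to the origin
            d = tang / norm
            return np.stack([theta * (d @ e1), theta * (d @ e2)], axis=1)

        # Statistical model: stack the projected coordinates of each training face into one
        # long vector, compute the covariance matrix over the training set, and keep its
        # leading eigenvectors as the modes of variation in the surface normal fields.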

    Global-scale regionalization of hydrologic model parameters

    Current state-of-the-art models typically applied at continental to global scales (hereafter called macroscale) tend to use a priori parameters, resulting in suboptimal streamflow (Q) simulation. For the first time, a scheme for regionalization of model parameters at the global scale was developed. We used data from a diverse set of 1787 small-to-medium sized catchments (10–10,000 km²) and the simple conceptual HBV model to set up and test the scheme. Each catchment was calibrated against observed daily Q, after which 674 catchments with high calibration and validation scores, and thus presumably good-quality observed Q and forcing data, were selected to serve as donor catchments. The calibrated parameter sets for the donors were subsequently transferred to 0.5° grid cells with similar climatic and physiographic characteristics, resulting in parameter maps for HBV with global coverage. For each grid cell, we used the 10 most similar donor catchments, rather than the single most similar donor, and averaged the resulting simulated Q, which enhanced model performance. The 1113 catchments not used as donors were used to independently evaluate the scheme. The regionalized parameters outperformed spatially uniform (i.e., averaged calibrated) parameters for 79% of the evaluation catchments. Substantial improvements were evident for all major Köppen-Geiger climate types and even for evaluation catchments > 5000 km distant from the donors. The median improvement was about half of the performance increase achieved through calibration. HBV with regionalized parameters outperformed nine state-of-the-art macroscale models, suggesting these might also benefit from the new regionalization scheme. The produced HBV parameter maps including ancillary data are available via
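    The donor-transfer idea can be sketched briefly. The snippet below selects, for each grid cell, the 10 most similar donor catchments using Euclidean distance in a standardized climatic/physiographic attribute space and returns their calibrated parameter sets; the attribute set, the distance measure and the array layout are assumptions for illustration, and in the study it is the simulated Q of the donor parameter sets, not the parameters themselves, that is averaged.

        import numpy as np

        def select_donor_params(donor_attrs, donor_params, cell_attrs, k=10):
            # donor_attrs: (n_donors, n_attrs) climatic/physiographic descriptors of donors
            # donor_params: (n_donors, n_params) calibrated HBV parameter sets
            # cell_attrs: (n_cells, n_attrs) the same descriptors for the 0.5 deg grid cells
            mu, sd = donor_attrs.mean(axis=0), donor_attrs.std(axis=0)
            z_donor = (donor_attrs - mu) / sd
            z_cell = (cell_attrs - mu) / sd
            # pairwise distances in standardized attribute space: (n_cells, n_donors)
            dist = np.linalg.norm(z_cell[:, None, :] - z_donor[None, :, :], axis=2)
            nearest = np.argsort(dist, axis=1)[:, :k]
            return donor_params[nearest]                  # (n_cells, k, n_params)

        # For each cell, HBV would be run once per selected parameter set with the cell's
        # forcing, and the k simulated discharge series averaged to give the final Q.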

    Patterns in Motion - From the Detection of Primitives to Steering Animations

    In recent decades, the world of technology has developed rapidly. Illustrative of this trend is the growing number of affordable methods for recording new and bigger data sets. The resulting masses of multivariate and high-dimensional data represent a new challenge for research and industry. This thesis is dedicated to the development of novel methods for processing multivariate time series data, thus meeting this Data Science related challenge. This is done by introducing a range of different methods designed to deal with time series data. The variety of methods reflects the different requirements and the typical stages of data processing, ranging from pre-processing to post-processing and data recycling. Many of the techniques introduced work in a general setting. However, various types of motion recordings of human and animal subjects were chosen as representatives of multivariate time series. The different data modalities include Motion Capture data, accelerations, gyroscopes, electromyography, depth data (Kinect) and animated 3D meshes. It is the goal of this thesis to provide a deeper understanding of working with multivariate time series by taking the example of multivariate motion data. However, in order to maintain an overview of the matter, the thesis follows a basic general pipeline. This pipeline was developed as a guideline for time series processing and is the first contribution of this work. Each part of the thesis represents one important stage of this pipeline, which can be summarized under the topics segmentation, analysis and synthesis. Specific examples of different data modalities, processing requirements and methods to meet those are discussed in the chapters of the respective parts. One important contribution of this thesis is a novel method for temporal segmentation of motion data. It is based on the idea of self-similarities within motion data and is capable of unsupervised segmentation of a range of motion data into distinct activities and motion primitives. The examples concerned with the analysis of multivariate time series reflect the role of data analysis in different interdisciplinary contexts and also the variety of requirements that comes with collaboration with other sciences. These requirements are directly connected to current challenges in data science. Finally, the problem of synthesis of multivariate time series is discussed using a graph-based example and examples related to rigging or steering of meshes. Synthesis is an important stage in data processing because it creates new data from existing data in a controlled way. This makes exploiting existing data sets and accessing more condensed data possible, thus providing feasible alternatives to otherwise time-consuming manual processing.
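    The self-similarity idea behind the segmentation method can be illustrated with a standard novelty-score computation (a checkerboard kernel slid along the diagonal of a self-similarity matrix, in the spirit of Foote's audio novelty). This is a generic stand-in, not the thesis's actual unsupervised algorithm, and the cosine similarity and kernel size are assumptions.

        import numpy as np

        def novelty_curve(X, half=32):
            # X: (T, d) multivariate motion time series (e.g. joint angles per frame).
            Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-9)
            S = Xn @ Xn.T                                  # cosine self-similarity matrix (T, T)
            s = np.concatenate([np.ones(half), -np.ones(half)])
            C = np.outer(s, s)                             # checkerboard kernel (2*half, 2*half)
            T = len(S)
            nov = np.zeros(T)
            for t in range(half, T - half):
                nov[t] = np.sum(C * S[t - half:t + half, t - half:t + half])
            return nov                                     # peaks suggest boundaries between motion primitives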

    The Cosmic History of X-ray Binary Evolution

    The Chandra Deep Fields provide an extraordinary window into the high-energy history of the cosmos. Observations of non-active galaxies within the deep fields can be leveraged to extract information about the formation and evolution of X-ray binaries (XRBs). Previous studies have suggested that the evolution of XRB luminosity can be expressed as a function of physical parameters such as star formation rate, stellar mass, stellar age, and metallicity. The goal of this work is to develop and implement a complete physical parameterization for the luminosity of XRB populations, which can be utilized for a variety of further studies. Chapter 1 provides the necessary scientific background for the remainder of the work. This specifically covers the formation of XRBs and the observed general trends associated with populations of XRBs. The motivating work for the later chapters is detailed as well. Chapter 2 outlines the groundwork necessary to determine the star formation history of a galaxy, which is an essential step in developing an age-dependent model. The components of the Lightning spectral energy distribution fitting procedure are explained, and the supporting evidence for the use of Lightning is presented. Chapter 3 establishes a procedure for creating and fitting a non-parametric age-based model for the evolution of XRB luminosity. A sample selection procedure is detailed, producing a sample of 344 deep field galaxies to be fit. Two models are fit, and one is found to provide a statistically robust fit. The results of the model are presented and interpreted for various applications. Chapter 4 continues the pursuit of a complete physical parameterization. It begins by expanding the sample size by loosening the restrictions required for Lightning and incorporating metallicity measurements from the fundamental metallicity relation. Three different functional forms and the motivations behind each one are established. These models are fit to the data, and although one is found to produce an acceptable fit, the model is not widely applicable. Chapter 5 summarizes the findings of this work and discusses its accomplishments and shortcomings. Additionally, possible routes for future studies are discussed.
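    What a "non-parametric age-based model" for XRB emission might look like can be sketched in a few lines: the total X-ray luminosity is written as a sum over stellar-age bins of the mass formed in each bin times a fitted luminosity-per-unit-mass coefficient for that bin. The bin structure and coefficient values here are placeholders, not the quantities derived in the dissertation.

        import numpy as np

        def xrb_luminosity(mass_formed, gamma):
            # mass_formed: stellar mass formed in each age bin [M_sun], from the SED-derived SFH
            # gamma: XRB luminosity per unit stellar mass formed in that age bin [erg/s/M_sun]
            # L_X = sum_i gamma_i * M_i  (a metallicity dependence could enter through gamma)
            return float(np.dot(mass_formed, gamma))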

    High precision fundamental constants at the TeV scale

    This report summarizes the proceedings of the 2014 Mainz Institute for Theoretical Physics (MITP) scientific program on "High precision fundamental constants at the TeV scale". The two outstanding parameters in the Standard Model dealt with during the MITP scientific program are the strong coupling constant α_s and the top-quark mass m_t. Lack of knowledge of the values of these fundamental constants is often the limiting factor in the accuracy of theoretical predictions. The current status of α_s and m_t has been reviewed and directions for future research have been identified. Comment: 57 pages, 24 figures, pdflatex

    The Skyrme Interaction in finite nuclei and nuclear matter

    Self-consistent mean-field models are a powerful tool in the investigation of nuclear structure and low-energy dynamics. They are based on effective energy-density functionals, often formulated in terms of effective density-dependent nucleon-nucleon interactions. The free parameters of the functional are adjusted to empirical data. A proper choice of these parameters requires a comprehensive set of constraints covering experimental data on finite nuclei, concerning static as well as dynamical properties, empirical characteristics of nuclear matter, and observational information on nucleosynthesis, neutron stars and supernovae. This work aims at a comprehensive survey of the performance of one of the most successful non-relativistic self-consistent methods, the Skyrme-Hartree-Fock model (SHF), with respect to these constraints. A full description of the Skyrme functional is given and its relation to other effective interactions is discussed. The validity of applying SHF far from stability and in dense environments beyond the nuclear saturation density is critically assessed. The use of SHF in models extended beyond the mean-field approximation by including some correlations is discussed. Finally, future prospects for further development of SHF towards a more consistent application of existing and newly developing constraints are outlined. Comment: 71 pages, 22 figures. Accepted for publication in Prog. Part. Nucl. Phys.
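    For orientation, a commonly quoted standard form of the Skyrme effective two-body interaction underlying such energy-density functionals is sketched below in LaTeX; the precise parameterization and any extended terms discussed in the review may differ from this textbook form.

        \begin{aligned}
        v_{12} ={}& t_0\,(1 + x_0 P_\sigma)\,\delta(\mathbf{r}_1-\mathbf{r}_2)\\
        &+ \tfrac{t_1}{2}\,(1 + x_1 P_\sigma)\left[\mathbf{k}'^2\,\delta(\mathbf{r}_1-\mathbf{r}_2)
           + \delta(\mathbf{r}_1-\mathbf{r}_2)\,\mathbf{k}^2\right]\\
        &+ t_2\,(1 + x_2 P_\sigma)\,\mathbf{k}'\cdot\delta(\mathbf{r}_1-\mathbf{r}_2)\,\mathbf{k}\\
        &+ \tfrac{t_3}{6}\,(1 + x_3 P_\sigma)\,
           \rho^{\alpha}\!\Big(\tfrac{\mathbf{r}_1+\mathbf{r}_2}{2}\Big)\,\delta(\mathbf{r}_1-\mathbf{r}_2)\\
        &+ i W_0\,(\boldsymbol{\sigma}_1+\boldsymbol{\sigma}_2)\cdot
           \left[\mathbf{k}'\times\delta(\mathbf{r}_1-\mathbf{r}_2)\,\mathbf{k}\right],
        \end{aligned}

    where $\mathbf{k} = (\nabla_1-\nabla_2)/2i$ acts to the right, $\mathbf{k}'$ acts to the left, $P_\sigma$ is the spin-exchange operator, and $(t_i, x_i, \alpha, W_0)$ are the free parameters adjusted to empirical data.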

    Programmed design of ship forms

    This paper describes a new category of CAD applications devoted to the definition and parameterization of hull forms, called programmed design. Programmed design relies on two prerequisites. The first one is a product model with a variety of types large enough to handle the modeling of any type of ship. The second one is a design language dedicated to creating the product model. The main purpose of the language is to publish the modeling algorithms of the application in the designer's knowledge domain so that the designer can create parametric model scripts. Programmed design is an evolution of parametric design, but it is not just parametric design. It is a tool to create parametric design tools. It provides a methodology to extract design knowledge by abstracting a design experience in order to store and reuse it. Programmed design is related to the organizational and architectural aspects of CAD applications, not to the development of modeling algorithms. It is built on top of, and relies on, existing algorithms provided by a comprehensive product model. Programmed design can be useful to develop new applications, to support the evolution of existing applications, or even to integrate different types of application in a single one. A three-level software architecture is proposed to make the implementation of programmed design easier. These levels are the conceptual level, based on the design language; the mathematical level, based on the geometric formulation of the product model; and the visual level, based on the polyhedral representation of the model required by the graphics card. Finally, some scenarios for the use of programmed design are discussed, for instance the development of specialized parametric hull form generators for a ship type or a family of ships, or the creation of palettes of hull form components to be used as parametric design patterns. Two new reverse-engineering processes that can considerably improve the application have also been identified: the creation of the mathematical level from the visual level and the creation of the conceptual level from the mathematical level.
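    The notion of a "parametric model script" can be illustrated with a toy example that turns a handful of hull parameters into a 2D midship-section outline; a real programmed-design script would instead invoke the product-model operations and design language of the CAD application, and every name and parameter below is purely illustrative.

        import numpy as np

        def midship_section(beam, depth, bilge_radius, n=50):
            # Flat bottom, quarter-circle bilge and vertical side of one half of a midship section.
            half_beam = beam / 2.0
            bottom_x = np.linspace(0.0, half_beam - bilge_radius, n)
            bottom = np.stack([bottom_x, np.zeros(n)], axis=1)
            ang = np.linspace(-np.pi / 2.0, 0.0, n)
            bilge = np.stack([half_beam - bilge_radius + bilge_radius * np.cos(ang),
                              bilge_radius + bilge_radius * np.sin(ang)], axis=1)
            side_y = np.linspace(bilge_radius, depth, n)
            side = np.stack([np.full(n, half_beam), side_y], axis=1)
            return np.vstack([bottom, bilge, side])        # (3*n, 2) polyline of the section

        # Example: a 32 m beam, 20 m depth section with a 2.5 m bilge radius
        # outline = midship_section(beam=32.0, depth=20.0, bilge_radius=2.5)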