12 research outputs found

    Efficient Analysis in Multimedia Databases

    The rapid progress of digital technology has made computers ubiquitous tools; we now find them in almost every environment, be it industrial or private. With ever-increasing performance, computers have assumed more and more vital tasks in engineering, climate and environmental research, medicine and the content industry, tasks that previously could only be accomplished by spending enormous amounts of time and money. Digital sensor devices, like earth observation satellites, genome sequencers or video cameras, have caused the amount and complexity of data with a spatial or temporal relation to grow enormously. This has led to new challenges for data analysis and requires the use of modern multimedia databases. This thesis aims at developing efficient techniques for the analysis of complex multimedia objects such as CAD data, time series and videos. It is assumed that the data is modeled by commonly used representations: CAD data, for example, is represented as a set of voxels, while audio and video data are represented as multi-represented, multi-dimensional time series. The main part of this thesis focuses on finding efficient methods for collision queries on complex spatial objects. One way to speed up those queries is to employ a cost-based decomposition, which uses interval groups to approximate a spatial object. For example, this technique can be used for the Digital Mock-Up (DMU) process, which helps engineers to ensure short product cycles. This thesis also defines and discusses a new similarity measure for time series called threshold similarity: two time series are considered similar if they show similar behavior regarding the transgression of a given threshold value. Another part of the thesis is concerned with the efficient calculation of reverse k-nearest neighbor (RkNN) queries in general metric spaces using conservative and progressive approximations.
The aim of such RkNN queries is to determine the impact of single objects on the whole database. Finally, the thesis deals with video retrieval and hierarchical genre classification of music using multiple representations. The practical relevance of the discussed genre classification approach is highlighted by a prototype tool that helps the user to organize large music collections. Both the efficiency and the effectiveness of the presented techniques are thoroughly analyzed, and their benefits over traditional approaches are demonstrated by evaluating the new methods on real-world test datasets.
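    The RkNN problem described above can be made concrete with a small sketch. The following Python snippet is a naive quadratic-time baseline, not the thesis's approximation-based method; the point set and the Euclidean metric are illustrative assumptions. It finds all objects that count a given query object among their own k nearest neighbors in an arbitrary metric space:

```python
import math

def rknn(points, q_idx, k, dist):
    """Naive reverse k-NN: all points that have points[q_idx]
    among their own k nearest neighbors."""
    result = []
    for i in range(len(points)):
        if i == q_idx:
            continue
        # rank all other points by distance to points[i]
        others = [j for j in range(len(points)) if j != i]
        others.sort(key=lambda j: dist(points[j], points[i]))
        if q_idx in others[:k]:
            result.append(i)
    return result

euclid = lambda a, b: math.dist(a, b)
pts = [(0, 0), (1, 0), (5, 0), (6, 0)]
print(rknn(pts, 0, 1, euclid))  # [1]: only (1,0) has (0,0) as its 1-NN
```

    Because every object's neighborhood must be inspected, the naive method costs O(n^2) distance computations per query, which is exactly what the conservative and progressive approximations in the thesis aim to avoid.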

    Spatial Database Support for Virtual Engineering

    The development, design, manufacturing and maintenance of modern engineering products is a very expensive and complex task. Shorter product cycles and a greater diversity of models are becoming decisive competitive factors in the hard-fought automobile and aircraft markets. To support engineers in creating complex products under time pressure, systems are required that answer collision and similarity queries effectively and efficiently. To achieve industrial strength, the required specialized functionality has to be integrated into fully-fledged database systems, so that fundamental services of these systems, including transactions, concurrency control and recovery, can be fully reused. This thesis aims at the development of theoretically sound and practically realizable algorithms which effectively and efficiently detect colliding and similar complex spatial objects. After a short introductory Part I, we look in Part II at different spatial index structures and discuss their integrability into object-relational database systems. Based on this discussion, we present two generic approaches for accelerating collision queries. The first approach exploits available statistical information in order to accelerate the query process. The second approach is based on a cost-based decomposition of complex spatial objects. In a broad experimental evaluation based on real-world test data sets, we demonstrate the usefulness of the presented techniques, which allow interactive query response times even for large data sets of complex objects. In Part III of the thesis, we discuss several similarity models for spatial objects. We show by means of a new evaluation method that data-partitioning similarity models yield more meaningful results than space-partitioning similarity models. We introduce a very effective similarity model which is based on a new paradigm in similarity search, namely the use of vector-set-represented objects.
In order to guarantee efficient query processing, suitable filters are introduced for accelerating similarity queries on complex spatial objects. Based on clustering and the introduced similarity models, we present an industrial prototype which helps the user to navigate through massive data sets.
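    The interval-based collision filtering behind the acceleration approaches can be illustrated with a brief sketch. The snippet below is a simplified stand-in, not the thesis's decomposition technique: it assumes each spatial object has already been voxelized and linearized into a sorted, disjoint sequence of (lo, hi) interval runs, so that a single merge-style sweep decides whether two objects can collide:

```python
def intervals_intersect(a, b):
    """True iff any interval in sequence a overlaps any interval in b.
    Both inputs are sorted, disjoint lists of (lo, hi) pairs, e.g.
    voxel runs of two objects along a space-filling curve."""
    i = j = 0
    while i < len(a) and j < len(b):
        lo = max(a[i][0], b[j][0])
        hi = min(a[i][1], b[j][1])
        if lo <= hi:          # overlapping cells -> potential collision
            return True
        # advance the sequence whose current interval ends first
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return False

print(intervals_intersect([(0, 4), (10, 12)], [(5, 9), (13, 20)]))  # False
print(intervals_intersect([(0, 4), (10, 12)], [(3, 9)]))            # True
```

    The sweep runs in linear time over both sequences; a cost-based decomposition then controls how coarsely each object is approximated so that most non-colliding pairs are rejected without inspecting exact geometry.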

    A Coupled Stochastic-Deterministic Method for the Numerical Solution of Population Balance Systems

    In this thesis, a new algorithm for the numerical solution of population balance systems is proposed and applied within two simulation projects. The systems considered stem from chemical engineering; in particular, crystallization processes in a fluid environment are studied. The descriptive population balance equations are extensions of the classical Smoluchowski coagulation equation, from which they inherit the numerical difficulties introduced by the coagulation integral, especially with regard to higher-dimensional particle models. The new algorithm brings together two different fields of numerical mathematics and scientific computing, namely a stochastic particle simulation based on a Markov-process Monte Carlo method, and (deterministic) finite element schemes from computational fluid dynamics. Stochastic particle simulations are established methods for the solution of population balance equations. Their major advantages are the inclusion of microscopic information into the model while offering convergence to solutions of the macroscopic equation, as well as numerical efficiency and robustness. Embedding a stochastic method into a deterministic flow simulation offers new possibilities for the solution of coupled population balance systems, especially with regard to the microscopic details of particle interactions. In the thesis, the new simulation method is first applied to a population balance system that models an experimental tube crystallizer used for the production of crystalline aspirin. The device is modeled in an axisymmetric two-dimensional fashion. Experimental data is reproduced in moderate computing time. Thereafter, the method is extended to three spatial dimensions and used for the simulation of an experimental, continuously operated fluidized bed crystallizer. This system is fully transient; the turbulent flow is computed on the fly.
All the methods used, from the simulation of the Navier-Stokes equations and of convection-diffusion equations to the stochastic particle simulation, are introduced, motivated and discussed extensively. Coupling phenomena in the considered population balance systems and the coupling algorithm itself are discussed in great detail. Furthermore, original results on the efficient numerical solution of the Navier-Stokes equations are presented, namely an assessment of fast solvers for discrete saddle point problems and a new interpretation of the classical domain decomposition method for the parallelization of the finite element method.
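    The stochastic side of such a coupled scheme can be sketched in a few lines. The following Python program is a minimal direct-simulation (Marcus-Lushnikov style) Monte Carlo for binary coagulation with a constant kernel; it is an illustrative toy under assumed initial sizes and kernel, not the thesis's coupled solver:

```python
import random

def coagulate(sizes, t_end, kernel=lambda x, y: 1.0, seed=0):
    """Direct stochastic simulation of binary coagulation:
    repeatedly draw an exponential waiting time from the total rate,
    then merge one pair chosen proportionally to its kernel value."""
    rng = random.Random(seed)
    t = 0.0
    sizes = list(sizes)
    while len(sizes) > 1:
        n = len(sizes)
        # total coagulation rate over all unordered pairs
        total = sum(kernel(sizes[i], sizes[j])
                    for i in range(n) for j in range(i + 1, n))
        t += rng.expovariate(total)      # exponential waiting time
        if t > t_end:
            break
        # pick a pair with probability proportional to its kernel value
        r = rng.uniform(0, total)
        acc = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                acc += kernel(sizes[i], sizes[j])
                if acc >= r:
                    merged = sizes[i] + sizes[j]
                    sizes = [s for k, s in enumerate(sizes) if k not in (i, j)]
                    sizes.append(merged)
                    break
            else:
                continue
            break
    return t, sizes

_, final = coagulate([1.0] * 20, t_end=5.0)
print(sum(final))  # total particle mass is conserved: 20.0
```

    Mass conservation under every merge event is the discrete analogue of the conservation property of the Smoluchowski equation, and it is one reason such particle schemes remain robust when embedded in a deterministic flow simulation.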

    A Conceptual Model of Exploration Wayfinding: An Integrated Theoretical Framework and Computational Methodology

    This thesis is an attempt to integrate contending cognitive approaches to modeling wayfinding behavior. The primary goal is to create a plausible model for exploration tasks within indoor environments. This conceptual model can be extended for practical applications in design, planning, and the social sciences. Using empirical evidence, a cognitive schema is designed that accounts for perceptual and behavioral preferences in pedestrian navigation. With this schema as a guiding framework, network analysis and space syntax serve as computational methods to simulate human exploration wayfinding in unfamiliar indoor environments. The conceptual model is then implemented in two ways: first, by directly updating an existing agent-based modeling software package; second, by using a spatial interaction model that distributes visual attraction and movement permeability across a graph representation of building floor plans.
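    The space-syntax component can be made concrete with a toy computation. The sketch below, over a hypothetical floor-plan graph, computes mean topological depth by breadth-first search, one of the basic integration measures used in space syntax analysis (lower mean depth = more integrated space):

```python
from collections import deque

def mean_depth(adjacency, start):
    """Mean shortest-path (topological) depth from one space to all
    others in a floor-plan connectivity graph."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nbr in adjacency[node]:
            if nbr not in depth:
                depth[nbr] = depth[node] + 1
                queue.append(nbr)
    others = [d for n, d in depth.items() if n != start]
    return sum(others) / len(others)

# Toy floor plan: a corridor 'C' linking rooms 'A', 'B', 'D'
plan = {'C': ['A', 'B', 'D'], 'A': ['C'], 'B': ['C'], 'D': ['C']}
print(mean_depth(plan, 'C'))  # 1.0 - the corridor is the most integrated space
print(mean_depth(plan, 'A'))  # rooms lie deeper: (1 + 2 + 2) / 3
```

    In an exploration-wayfinding simulation, measures like this can weight which adjoining space an agent is likely to move into next.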

    Enhanced Query Processing on Complex Spatial and Temporal Data

    Innovative technologies in the areas of multimedia and mechanical engineering, as well as novel methods for data acquisition in different scientific fields, including geoscience, environmental science, medicine, biology and astronomy, enable a more exact representation of the data and thus a more precise data analysis. The resulting quantitative and qualitative growth of spatial and temporal data in particular leads to new challenges for the management and processing of complex structured objects and requires efficient and effective methods for data analysis. Spatial data describe objects in space by a well-defined extension, a specific location and their relationships to other objects. Classical representatives of complex structured spatial objects are three-dimensional CAD data from mechanical engineering and two-dimensional bounded regions from geography. For industrial applications, efficient collision and intersection queries are of great importance. Temporal data describe time-dependent processes, for instance the duration of specific events or time-varying attributes of objects. Time series are among the most popular and complex types of temporal data and are the most important form of description for time-varying processes. An elementary type of query in time series databases is the similarity query, which serves as a basic query for data mining applications. The main goal of this thesis is to develop effective and efficient algorithms supporting collision queries on spatial data as well as similarity queries on temporal data, in particular time series. The presented concepts are based on the efficient management of interval sequences, which are suitable for both spatial and temporal data. The effective analysis of the underlying objects is efficiently supported by adequate access methods.
First, this thesis deals with collision queries on complex spatial objects, which can be reduced to intersection queries on interval sequences. We introduce statistical methods for the grouping of subsequences; combined with the concept of multi-step query processing, these methods enable the user to accelerate the query process drastically. Furthermore, we develop a cost model for the multi-step query processing of interval sequences in distributed systems; the proposed approach successfully supports a cost-based query strategy. Second, we introduce a novel similarity measure for time series which allows the user to focus on specific time series amplitudes for the similarity measurement. The new similarity model defines two time series to be similar iff they show similar temporal behavior w.r.t. being below or above a specific threshold. This type of query is primarily required in natural-science applications. The main goals of this new query method are the detection of anomalies and the adaptation to new demands in the area of data mining in time series databases. In addition, a semi-supervised cluster analysis method is presented which is based on the introduced similarity model for time series. The efficiency and effectiveness of the proposed techniques are extensively discussed, and their advantages over existing methods are experimentally proven by means of datasets derived from real-world applications.
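    The threshold-based similarity model can be sketched directly. The snippet below is an illustrative simplification rather than the thesis's measure: it extracts the index intervals during which a series exceeds a threshold tau, and compares two series by the symmetric difference of their above-threshold index sets (distance 0 means identical threshold behavior, regardless of amplitude):

```python
def threshold_intervals(series, tau):
    """Index intervals [start, end) during which the series exceeds tau."""
    intervals, start = [], None
    for i, v in enumerate(series):
        if v > tau and start is None:
            start = i
        elif v <= tau and start is not None:
            intervals.append((start, i))
            start = None
    if start is not None:
        intervals.append((start, len(series)))
    return intervals

def threshold_distance(s1, s2, tau):
    """Size of the symmetric difference of the above-threshold index sets:
    0 iff both series transgress tau over exactly the same index ranges."""
    above = lambda s: {i for a, b in threshold_intervals(s, tau)
                       for i in range(a, b)}
    return len(above(s1) ^ above(s2))

a = [0, 2, 3, 1, 0, 4]
b = [0, 5, 9, 1, 0, 6]   # different amplitudes, same threshold behavior
print(threshold_intervals(a, 1))   # [(1, 3), (5, 6)]
print(threshold_distance(a, b, 1)) # 0
```

    Reducing each series to its threshold-crossing intervals is also what links this similarity model back to the interval-sequence machinery used for the spatial collision queries.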

    Symmetry in Electromagnetism

    Electromagnetism plays a crucial role in basic and applied physics research. The discovery of electromagnetism as the unifying theory of electricity and magnetism represents a cornerstone of modern physics. Symmetry was crucial to the concept of unification: electromagnetism was soon formulated as a gauge theory in which local phase symmetry explained its mathematical formulation. This early connection between symmetry and electromagnetism shows that a symmetry-based approach to many electromagnetic phenomena is recurrent, even today. Moreover, many recent technological advances are based on the control of electromagnetic radiation in nearly all its spectra and scales, the manipulation of matter-radiation interactions with unprecedented levels of sophistication, and new generations of electromagnetic materials. This is a fertile field for applications and for basic understanding, in which symmetry, as in the past, bridges apparently unrelated phenomena, from condensed matter to high-energy physics. In this book, we present modern contributions in which symmetry proves its value as a key tool. From dual-symmetry electrodynamics to applications in sustainable smart buildings or magnetocardiography, we find a plentiful crop, full of exciting examples of modern approaches to electromagnetism. In all cases, symmetry sheds light on the theoretical and applied works presented in this book.

    Learned infinite elements for helioseismology

    This thesis presents efficient techniques for integrating the information contained in the Dirichlet-to-Neumann (DtN) map of time-harmonic waves propagating in a stratified medium into finite element discretizations. This task arises in the context of domain decomposition methods, e.g. when reducing a problem posed on an unbounded domain to a bounded computational domain on which the problem can then be discretized. Our focus is on stratified media such as the Sun, which allow for strong reflection of waves and for which suitable methods are lacking. We present learned infinite elements as a possible approach to deal with such media, utilizing the assumption of a separable geometry. In this case the DtN map is separable; however, it remains a non-local operator with a dense matrix representation, which renders its direct use computationally inefficient. Therefore, we approximate the DtN map only indirectly, by adding additional degrees of freedom to the linear system in such a way that the Schur complement w.r.t. the latter provides an optimal approximation of the DtN map while the sparsity of the linear system is preserved. This optimality is ensured via the solution of a small minimization problem, which incorporates solutions of one-dimensional time-harmonic wave equations and allows for great flexibility w.r.t. properties of the medium. In the first half of the thesis we provide an error analysis of the proposed method in a generic framework, which demonstrates that exponentially fast convergence rates can be expected. Numerical experiments for the Helmholtz equation and an in-depth study on modelling the solar atmosphere with learned infinite elements demonstrate the high accuracy and flexibility of the proposed method in practical applications. In the second half of the thesis, the potential of learned infinite elements in the context of sweeping preconditioners for the efficient iterative solution of large linear systems is investigated.
Even though learned infinite elements are very suitable for separable media, they can only be used for tiny perturbations thereof, since the corresponding DtN maps turn out to be extremely sensitive to perturbations in the presence of strong reflections.
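    The Schur-complement construction at the heart of this approach can be illustrated on a toy block system. In the NumPy sketch below, all matrices are made-up stand-ins, not an actual DtN discretization; the point is only that eliminating auxiliary unknowns from a sparse block system reproduces, on the retained unknowns, the solution of the dense Schur-complement system:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3                                  # boundary DOFs, auxiliary DOFs
A = rng.standard_normal((n, n)) + 5 * np.eye(n)  # shifted for conditioning
B = rng.standard_normal((n, m))              # coupling to auxiliary DOFs
D = np.diag(rng.uniform(1, 2, m))            # auxiliary block (diagonal -> cheap)

# Eliminating the auxiliary unknowns v from [[A, B], [B^T, D]] [u; v] = [f; 0]
# leaves the Schur complement acting on the boundary unknowns u:
S = A - B @ np.linalg.solve(D, B.T)

# Solving the full block system and the reduced Schur system agree:
K = np.block([[A, B], [B.T, D]])
f = rng.standard_normal(n)
full = np.linalg.solve(K, np.concatenate([f, np.zeros(m)]))[:n]
reduced = np.linalg.solve(S, f)
print(np.allclose(full, reduced))  # True
```

    In learned infinite elements the roles are reversed: the dense target operator (the DtN map) is given, and the small blocks coupling to the auxiliary unknowns are fitted by a minimization problem so that the resulting Schur complement approximates it while the assembled system stays sparse.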

    Quod Erat Demonstrandum: From Herodotus’ ethnographic journeys to cross-cultural research

    A peer-reviewed book based on presentations at the XVIII Congress of the International Association for Cross-Cultural Psychology, 2006, Isle of Spetses, Greece. (c) 2009, International Association for Cross-Cultural Psychology.

    A systematic approach for integrated product, materials, and design-process design

    Designers are challenged to manage customer, technology, and socio-economic uncertainty causing dynamic, unquenchable demands on limited resources. In this context, increased concept flexibility, referring to a designer's ability to generate concepts, is crucial. Concept flexibility can be significantly increased through the integrated design of product and material concepts. Hence, the challenge is to leverage knowledge of material structure-property relations that significantly affect system concepts for function-based, systematic design of product and materials concepts in an integrated fashion. However, having selected an integrated product and material system concept, managing complexity in embodiment design-processes is important. Facing a complex network of decisions and evolving analysis models, a designer needs the flexibility to systematically generate and evaluate embodiment design-process alternatives. In order to address these challenges and respond to the primary research question of how to increase a designer's concept and design-process flexibility to enhance product creation in the conceptual and early embodiment design phases, the primary hypothesis in this dissertation is embodied as a systematic approach for integrated product, materials and design-process design. The systematic approach consists of two components: i) a function-based, systematic approach to the integrated design of product and material concepts from a systems perspective, and ii) a systematic strategy for design-process generation and selection based on a decision-centric perspective and a value-of-information-based Process Performance Indicator. The systematic approach is validated using the validation-square approach, which consists of theoretical and empirical validation.
Empirical validation of the framework is carried out using various examples, including: i) design of a reactive material containment system, and ii) design of an optoelectronic communication system. Ph.D. Committee Chair: Allen, Janet K.; Committee Member: Aidun, Cyrus K.; Committee Member: Klein, Benjamin; Committee Member: McDowell, David L.; Committee Member: Mistree, Farrokh; Committee Member: Yoder, Douglas P.