
    On the algebraic construction of sparse multilevel approximations of elliptic tensor product problems

    We consider the solution of elliptic problems on the tensor product of two physical domains, as arise, for example, in the approximation of the solution covariance of elliptic partial differential equations with random input. Previous sparse approximation approaches used a geometrically constructed multilevel hierarchy. Instead, we construct this hierarchy for a given discretized problem by means of the algebraic multigrid method (AMG). Thereby, we are able to apply the sparse grid combination technique to problems posed on complex geometries and to discretizations arising from unstructured grids, which was not feasible before. Numerical results show that our algebraic construction exhibits the same convergence behaviour as the geometric construction, while being applicable even in black-box-type PDE solvers.
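
    The sketch below illustrates the classical sparse grid combination technique that the abstract builds on, under one common indexing convention with levels starting at zero. The callable `solve_anisotropic` is a placeholder for a full tensor-product solve on the anisotropic level pair (l1, l2), assumed to return its result already prolongated onto a common reference grid (e.g. a NumPy array); it does not reproduce the thesis's AMG-based construction of the hierarchy.

```python
# Minimal sketch of the 2D sparse grid combination technique:
#   u_sparse = sum_{l1+l2 = L} u_{l1,l2}  -  sum_{l1+l2 = L-1} u_{l1,l2}
# `solve_anisotropic(l1, l2)` is a placeholder solver, assumed to return the
# anisotropic full-grid solution interpolated onto a common reference grid
# (e.g. a NumPy array), so that the weighted sum below is well defined.

def combination_technique(solve_anisotropic, L):
    u_sparse = None
    for sign, level_sum in ((+1, L), (-1, L - 1)):
        for l1 in range(level_sum + 1):
            l2 = level_sum - l1
            u = solve_anisotropic(l1, l2)  # solve on the anisotropic grid pair (l1, l2)
            contribution = sign * u
            u_sparse = contribution if u_sparse is None else u_sparse + contribution
    return u_sparse
```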

    Sequential Monte Carlo Instant Radiosity

    The focus of this thesis is to accelerate the synthesis of physically accurate images using computers. Such images are generated by simulating how light flows in the scene using unbiased Monte Carlo algorithms. To date, the efficiency of these algorithms has been too low for real-time rendering of error-free images. This limits the applicability of physically accurate image synthesis in interactive contexts, such as pre-visualization or video games. We focus on the well-known Instant Radiosity algorithm by Keller [1997], which approximates the indirect light field using virtual point lights (VPLs). This approximation is unbiased and has the characteristic that the error is spread out over large areas in the image. This low-frequency noise manifests as an unwanted 'flickering' effect in image sequences if not kept temporally coherent. Currently, the limited VPL budget imposed by running the algorithm at interactive rates results in images which may noticeably differ from the ground truth. We introduce two new algorithms that alleviate these issues. The first, clustered hierarchical importance sampling, reduces the overall error by increasing the VPL budget without incurring a significant performance cost. It uses an unbiased Monte Carlo estimator to estimate the sensor response caused by all VPLs. We reduce the variance of this estimator with an efficient hierarchical importance sampling method. The second, sequential Monte Carlo Instant Radiosity, generates the VPLs using heuristic sampling and employs non-parametric density estimation to resolve their probability densities. As a result the algorithm is able to reduce the number of VPLs that move between frames, while also placing them in regions where they bring light to the image. This increases the quality of the individual frames while keeping the noise temporally coherent, and thus less noticeable, between frames. When combined, the two algorithms form a rendering system that performs favourably against traditional path tracing methods, both in terms of performance and quality. Unlike prior VPL-based methods, our system does not suffer from the objectionable lack of temporal coherence in highly occluded scenes.
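
    As background, here is a hedged sketch of the basic Instant Radiosity idea the thesis starts from: virtual point lights are deposited by tracing paths from the light sources, and each shading point then gathers their contributions. The helpers `trace_light_path`, `visible`, `brdf` and `geometry_term` are placeholders for a real renderer, radiance is treated as monochrome, and the clamping and hierarchical sampling strategies discussed in the thesis are omitted.

```python
# Sketch of the basic Instant Radiosity estimator (after Keller [1997]):
# deposit virtual point lights (VPLs) along light paths, then gather their
# contributions at every shading point. All scene-query helpers are
# placeholders; flux and radiance are monochrome scalars for brevity.
import random

def generate_vpls(lights, n_paths, trace_light_path):
    vpls = []
    for _ in range(n_paths):
        light = random.choice(lights)
        # Each bounce of the traced light path deposits one VPL; the returned
        # VPLs are assumed to carry flux already normalised by n_paths.
        vpls.extend(trace_light_path(light))
    return vpls

def shade(point, normal, wo, vpls, visible, brdf, geometry_term):
    radiance = 0.0
    for vpl in vpls:
        if not visible(point, vpl.position):
            continue  # occluded VPLs contribute nothing
        g = geometry_term(point, normal, vpl)      # cosine terms over squared distance
        radiance += vpl.flux * brdf(point, normal, wo, vpl) * g
    return radiance
```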

    IT Supported Construction Project Management Methodology Based on Process and Product Model and Quality Management

    Computer Integrated Construction Project Management (CPM) supported by product and process models can be seen as a future type of integration structure facilitating the solution of various management problems in the fragmented Construction Industry. The key to success is directly correlated with the comprehensive integration of currently isolated IT applications. However, although a number of initiatives have been developed, no fully generic models have yet been formally standardized, even though the topic has been the subject of intensive research during the last decades. In this thesis a Computer Integrated CPM approach, supported by IFC (Industry Foundation Classes) and the ISO 9001:2000 Quality Management System, is proposed. The main aim is to integrate product, process and organizational information in order to achieve interoperability of the involved actors and tools in a concurrent environment. Following the requirements presented in the 'state of the art' section, the fundamental concepts are presented in two parts: (1) realization of CPM in an IT concept and (2) formalization of IFC Views for software interoperability, exemplified by the Bidding Preparation Phase. In order to realize a generic framework using a high-level process core model named the Organizational Management Process (OMP) model, different aspects have been brought together into a consistent life cycle structure: (1) a set of layered processes based on ISO procedural definitions, (2) software integration requirements based on Construction Management Phases, (3) application methods of the Procurement System and (4) organizational data. This provides for synchronizing technical products, processes, documents and actors in their inter-relationships. The framework is hierarchically structured in three layers: Phases – Processes – Product data. The developed IT Management Processes (ITMP), which serve as the baseline for the IFC View implementation, are derived from the OMP. Moreover, to ensure completeness, a mapping structure between processes and scenarios based on the Procurement Systems was established. The OMP and ITMP are represented using the ARIS eEPC (extended event-driven process chain) modeling method. On the basis of a generalized representation of product data, a system-wide integration model for heterogeneous client applications supporting different CPM areas can be achieved. The IFC Product Data Model integrates different domains, thereby enabling the coordination of bidding preparations. However, individual model subsets, i.e. views of the product model, need to be realized. In this context, adaptable views were developed based on the ITMP. The relevance of the defined resources to IFC Objects is examined by introducing central information elements, which provide a mapping structure between process resources and IFC Classes. On that basis the integration of process and product models can be accomplished. In order to realize the IFC Views, IFC Concepts and IFC Instance Diagrams were developed based on the IFC View Definition Format. The grouping of IFC Concepts enables the implementation of the adaptable IFC Views required for standardized system integration. This is achieved with the help of a formal specification using the Generalized Subset Definition Schema (GMSD). The validation was carried out as an alphanumerical comparison, in which a selected 3D full model and the IFC View developed for the Product Catalog model are compared. Two consequences were observed. In the first case, which also covers Unit Price Procurement Systems, the desired results were obtained by filtering the required data. However, when the results were compared for Design & Build and Lump-sum Procurement Systems (contracts), a need to extend the IFC Model became apparent. The solution is the formalization of cost data and material analysis information through an extension of the IFC concept 'IfcConstructionResource' with new classes and new relations. Thereby a common information model based on the data schema of the IFC standard is constituted.
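
    A toy illustration of the view-extraction idea, under strong simplifying assumptions: an IFC model is flattened to a list of entities and a view is just the subset of entity classes a given process needs. The class names and dictionary-based model are illustrative only; the thesis defines views formally via the IFC View Definition Format and the Generalized Subset Definition Schema (GMSD).

```python
# Toy illustration (not the thesis's formal GMSD-based mechanism): extract a
# model view by keeping only the IFC classes that a given process requires.
# Entity records and class names below are illustrative placeholders.

PRODUCT_CATALOG_VIEW = {
    "IfcWall", "IfcSlab", "IfcDoor", "IfcWindow",
    "IfcConstructionResource",  # extended in the thesis with cost/material classes
}

def extract_view(model_entities, view_classes=frozenset(PRODUCT_CATALOG_VIEW)):
    """Return the subset of entities whose IFC class belongs to the view."""
    return [e for e in model_entities if e["ifc_class"] in view_classes]

# Usage with a flattened toy model:
model = [
    {"ifc_class": "IfcWall", "id": "w1"},
    {"ifc_class": "IfcFurnishingElement", "id": "f1"},  # not part of this view
]
print(extract_view(model))  # -> [{'ifc_class': 'IfcWall', 'id': 'w1'}]
```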

    An ontological analysis of vague motion verbs, with an application to event recognition

    This research presents a methodology for the ontological formalisation of vague spatial concepts from natural language, with an application to the automatic recognition of event occurrences in video data. The main issue faced when defining concepts sourced from language is vagueness, related to the presence of ambiguities and borderline cases even in simple concepts such as 'near', 'fast', 'big', etc. Other issues specific to this semantic domain are saliency, granularity and uncertainty. In this work, the issue of vagueness in formal semantics is discussed and a methodology based on supervaluation semantics is proposed. This constitutes the basis for the formalisation of an ontology of vague spatial concepts based on classical logic, Event Calculus and supervaluation semantics. The ontology is structured in layers where high-level concepts, corresponding to complex actions and events, are inferred through mid-level concepts, corresponding to simple processes and properties of objects, and low-level primitive concepts, representing the most essential spatio-temporal characteristics of the real world. The development of ProVision, an event recognition system based on a logic-programming implementation of the ontology, demonstrates a practical application of the methodology. ProVision grounds the ontology on data representing the content of simple video scenes, leading to the inference of event occurrences and other high-level concepts. The contribution of this research is a methodology for the semantic characterisation of vague and qualitative concepts. This methodology addresses the issue of vagueness in ontologies and demonstrates the applicability of a supervaluationist approach to the formalisation of vague concepts. It is also shown to be effective in solving a practical reasoning task, namely the event recognition on which this work focuses.
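
    The following is a hedged sketch of the supervaluationist idea applied to a vague predicate such as 'near': each admissible precisification fixes a sharp threshold, and a statement counts as definitely true only if it holds under every precisification, definitely false if it holds under none, and borderline otherwise. The threshold range below is illustrative and not taken from the thesis.

```python
# Supervaluationist evaluation of the vague predicate 'near': a claim is
# super-true if it holds under every admissible precisification (sharp
# threshold), super-false if it holds under none, and borderline otherwise.
# The admissible thresholds below are illustrative placeholders.

ADMISSIBLE_THRESHOLDS = [t / 10.0 for t in range(5, 21)]  # 0.5 .. 2.0 metres

def near(distance, threshold):
    return distance <= threshold

def supervaluate_near(distance):
    verdicts = [near(distance, t) for t in ADMISSIBLE_THRESHOLDS]
    if all(verdicts):
        return "definitely near"        # true in every precisification
    if not any(verdicts):
        return "definitely not near"    # false in every precisification
    return "borderline"                 # truth-value gap

print(supervaluate_near(0.3), supervaluate_near(1.2), supervaluate_near(5.0))
# -> definitely near borderline definitely not near
```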

    New variants of variable neighbourhood search for 0-1 mixed integer programming and clustering

    Many real-world optimisation problems are discrete in nature. Although recent rapid developments in computer technologies are steadily increasing the speed of computations, the size of an instance of a hard discrete optimisation problem solvable in prescribed time does not increase linearly with the computer speed. This calls for the development of new solution methodologies for solving larger instances in shorter time. Furthermore, large instances of discrete optimisation problems are normally impossible to solve to optimality within a reasonable computational time/space and can only be tackled with a heuristic approach. In this thesis the development of so-called matheuristics, heuristics based on the mathematical formulation of the problem, is studied and employed within the variable neighbourhood search framework. Some new variants of the variable neighbourhood search metaheuristic itself are suggested, which naturally emerge from exploiting the information from the mathematical programming formulation of the problem. However, those variants may also be applied to problems described by a combinatorial formulation. A unifying perspective on modern advances in local search-based metaheuristics, a so-called hyper-reactive approach, is also proposed. Two NP-hard discrete optimisation problems are considered: 0-1 mixed integer programming and clustering with application to colour image quantisation. Several new heuristics for the 0-1 mixed integer programming problem are developed, based on the principle of variable neighbourhood search. One set of proposed heuristics consists of improvement heuristics, which attempt to find high-quality near-optimal solutions starting from a given feasible solution. Another set consists of constructive heuristics, which attempt to find initial feasible solutions for 0-1 mixed integer programs. Finally, some variable neighbourhood search based clustering techniques are applied to the colour image quantisation problem. All new methods presented are compared to other algorithms recommended in the literature and a comprehensive performance analysis is provided. Computational results show that the methods proposed either outperform the existing state-of-the-art methods for the problems observed, or provide comparable results. The theory and algorithms presented in this thesis indicate that hybridisation of the CPLEX MIP solver and the VNS metaheuristic can be very effective for solving large instances of the 0-1 mixed integer programming problem. More generally, the results presented in this thesis suggest that hybridisation of exact (commercial) integer programming solvers and metaheuristic methods is of high interest and such combinations deserve further practical and theoretical investigation. Results also show that VNS can be successfully applied to solving a colour image quantisation problem.
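
    For orientation, here is a hedged sketch of the basic variable neighbourhood search loop (shake, local search, neighbourhood change) that the proposed variants build on. The callables `shake`, `local_search` and `cost` are problem-specific placeholders; in the matheuristic variants they would wrap calls to a MIP solver such as CPLEX.

```python
# Basic Variable Neighbourhood Search (VNS) skeleton: shake the incumbent in
# the k-th neighbourhood, improve it by local search, and recentre or enlarge
# the neighbourhood depending on whether the cost improved. `shake`,
# `local_search` and `cost` are problem-specific placeholder callables.
import time

def basic_vns(x0, k_max, time_limit, shake, local_search, cost):
    best = x0
    deadline = time.time() + time_limit
    while time.time() < deadline:
        k = 1
        while k <= k_max and time.time() < deadline:
            x_shaken = shake(best, k)       # random solution in the k-th neighbourhood
            x_local = local_search(x_shaken)
            if cost(x_local) < cost(best):
                best, k = x_local, 1        # improvement: recentre, restart neighbourhoods
            else:
                k += 1                      # no improvement: move to a larger neighbourhood
    return best
```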

    Proceedings of the Workshop on Change of Representation and Problem Reformulation

    The proceedings of the third Workshop on Change of Representation and Problem Reformulation are presented. In contrast to the first two workshops, this workshop focused on analytic or knowledge-based approaches, as opposed to the statistical or empirical approaches known as 'constructive induction'. The organizing committee believes that there is potential for combining analytic and inductive approaches at a future date. However, it became apparent at the previous two workshops that the communities pursuing these different approaches are currently interested in largely non-overlapping issues. The constructive induction community has been holding its own workshops, principally in conjunction with the machine learning conference. While this workshop is more focused on analytic approaches, the organizing committee has made an effort to include more application domains. We have greatly expanded beyond our origins in the machine learning community. Participants in this workshop come from the full spectrum of AI application domains, including planning, qualitative physics, software engineering, knowledge representation, and machine learning.

    Advances in Robotics, Automation and Control

    The book presents an excellent overview of recent developments in the different areas of Robotics, Automation and Control. Through its 24 chapters, the book presents topics related to control and robot design; it also introduces new mathematical tools and techniques devoted to improving system modeling and control. An important point is the use of rational agents and heuristic techniques to cope with the computational complexity required for controlling complex systems. The book also covers navigation and vision algorithms, as well as automatic handwriting comprehension and speech recognition systems that will be included in the next generation of production systems.

    Design time detection of architectural mismatches in service oriented architectures

    Service Oriented Architecture (SOA) is a software component paradigm that has the potential to allow for flexible systems that are loosely coupled to each other. Services are discoverable entities that may be bound to at run time by a client, who is able to use the service correctly by referring to the service's description documents. Assumptions often have to be made in any design process if the problem domain is not fully specified. If those decisions concern the software architecture of a component and it is inserted into a system with differing and incompatible assumptions, then we say that an architectural mismatch exists. Architectural styles are a form of software reuse. They can simply be used by referring to a name such as 'client-server' or 'pipe and filter', where these names may conjure up topologies and expected properties in the architect's mind. They can also, however, be more rigorously defined given the right software environment. This can lead to a vocabulary of elements in the system and defined properties of those elements, along with rules and analysis to either show correctness of an implementation or reveal some emergent property of the whole. SOA includes a requirement that the service components make available descriptions of themselves, indicating how they are to be used. With this in mind, and assuming we have a suitable description of the client application, it should be the case that we can detect architectural mismatches when designing a new system. Here designing can range from organising a set of existing components into a novel configuration through to devising an entirely new set of components for an SOA. This work investigates the above statement using Web Services as the SOA implementation and finds that, to a degree, the statement is true. The only element of description required for a web service is the Web Service Description Language (WSDL) document, and this does indeed allow the detection of a small number of mismatches when represented using our minimal web service architectural style. However, from the literature we find that these mismatches are only a subset of those that we argue should be detectable. In response to this we produce an enhanced web service architectural style containing properties and analysis supporting the detection of this more complete set of mismatches and demonstrate its effectiveness against a number of case studies.
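
    As a rough illustration of the kind of design-time check the WSDL document enables, the hedged sketch below compares the operation names a client expects with those declared in a service's WSDL, using only the standard-library XML parser. This name-level comparison is far weaker than the thesis's architectural-style-based analysis; the file name and operation names in the usage line are hypothetical.

```python
# Toy design-time check: do the operations a client expects appear in the
# service's WSDL (portType operations)? This catches only naming-level
# mismatches, a small subset of the architectural mismatches the thesis
# targets with its enhanced web service architectural style.
import xml.etree.ElementTree as ET

WSDL_NS = {"wsdl": "http://schemas.xmlsoap.org/wsdl/"}  # WSDL 1.1 namespace

def declared_operations(wsdl_path):
    root = ET.parse(wsdl_path).getroot()
    return {op.get("name")
            for op in root.findall(".//wsdl:portType/wsdl:operation", WSDL_NS)}

def check_client_expectations(expected_ops, wsdl_path):
    missing = set(expected_ops) - declared_operations(wsdl_path)
    if missing:
        print("Potential mismatch; operations not offered:", sorted(missing))
    else:
        print("All expected operations are declared by the service.")

# Hypothetical usage:
# check_client_expectations({"getQuote", "placeOrder"}, "stock_service.wsdl")
```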