319 research outputs found

    Distributed Random Set Theoretic Soft/Hard Data Fusion

    Research on multisensor data fusion aims at providing the enabling technology to combine information from several sources in order to form a unified picture. The literature on fusion of conventional data provided by non-human (hard) sensors is vast and well-established. In comparison to conventional fusion systems, where input data are generated by calibrated electronic sensor systems with well-defined characteristics, research on soft data fusion considers combining human-based data expressed preferably in unconstrained natural language form. Fusion of soft and hard data is even more challenging, yet necessary in some applications, and has received little attention in the past. Being a rather new area of research, soft/hard data fusion is still in a fledgling stage, with even its challenging problems yet to be adequately defined and explored. This dissertation develops a framework to enable fusion of both soft and hard data with Random Set (RS) theory as the underlying mathematical foundation. Random set theory is an emerging theory within the data fusion community that, due to its powerful representational and computational capabilities, is gaining increasing attention among data fusion researchers. Motivated by the unique characteristics of random set theory and the main challenge of soft/hard data fusion systems, i.e. the need for a unifying framework capable of processing both unconventional soft data and conventional hard data, this dissertation argues in favor of a random set theoretic approach as the first step towards realizing a soft/hard data fusion framework. Several challenging problems related to soft/hard fusion systems are addressed in the proposed framework. First, an extension of the well-known Kalman filter within random set theory, called the Kalman evidential filter (KEF), is adopted as a common data processing framework for both soft and hard data. Second, a novel ontology (syntax + semantics) is developed to allow for modeling soft (human-generated) data, assuming target tracking as the application. Third, as soft/hard data fusion is mostly aimed at large networks of information processing, a new approach is proposed to enable distributed estimation of soft as well as hard data, addressing the scalability requirement of such fusion systems. Fourth, a method for modeling trust in human agents is developed, which enables the fusion system to protect itself from erroneous/misleading soft data by discounting such data on-the-fly. Fifth, leveraging recent developments in the RS theoretic data fusion literature, a novel soft data association algorithm is developed and deployed to extend the proposed target tracking framework to the multi-target tracking case. Finally, the multi-target tracking framework is complemented by a distributed classification approach applicable to target classes described with soft, human-generated data. In addition, this dissertation presents a novel data-centric taxonomy of data fusion methodologies. In particular, several categories of fusion algorithms are identified and discussed based on the data-related challenging aspect(s) they address. It is intended to provide the reader with a generic and comprehensive view of the contemporary data fusion literature, and could also serve as a reference for data fusion practitioners by providing design guidelines, in terms of algorithm choice, for the specific data-related challenges expected in a given application.
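    The abstract does not reproduce the filter equations, so the sketch below only illustrates the trust-discounting idea within a conventional Kalman update (the function name, dimensions and the way trust inflates the measurement noise are assumptions for illustration, not the dissertation's random set theoretic KEF).

```python
import numpy as np

def kalman_update(x, P, z, H, R, trust=1.0):
    """One Kalman measurement update where a soft (human) report z is
    discounted by a trust weight in (0, 1]: lower trust inflates the
    effective measurement noise, so unreliable reports move the estimate
    less. Illustrative only; not the dissertation's KEF formulation."""
    R_eff = R / max(trust, 1e-6)          # discount: low trust -> large noise
    S = H @ P @ H.T + R_eff               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x + K @ (z - H @ x)           # state update
    P_new = (np.eye(len(x)) - K @ H) @ P  # covariance update
    return x_new, P_new

# Example: a 1-D position estimate updated with a low-trust human report.
x = np.array([0.0]); P = np.array([[4.0]])
H = np.array([[1.0]]); R = np.array([[1.0]])
x, P = kalman_update(x, P, z=np.array([3.0]), H=H, R=R, trust=0.3)
print(x, P)   # the low-trust report pulls the estimate only part-way
```

    The dissertation's KEF operates on random set representations rather than the plain Gaussian update shown here; the sketch is meant only to convey how on-the-fly discounting can temper the influence of a doubtful soft report.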

    Evidential Reasoning & Analytical Techniques In Criminal Pre-Trial Fact Investigation

    This thesis is the work of the author and is concerned with the development of a neo-Wigmorean approach to evidential reasoning in police investigation. The thesis evolved out of dissatisfaction with cardinal aspects of traditional approaches to police investigation, practice and training. Five main weaknesses were identified: firstly, a lack of a theoretical foundation for police training and practice in the investigation of crime and evidence management; secondly, evidence was treated on the basis of its source rather than its inherent capacity for generating questions; thirdly, the role of inductive elimination was underused and misunderstood; fourthly, concentration on single, isolated cases rather than on the investigation of multiple cases; and, fifthly, the credentials of evidence were often assumed rather than considered, assessed and reasoned within the context of argumentation. Inspiration from three sources was used to develop the work: firstly, John Henry Wigmore provided new insights into the nature of evidential reasoning and formal methods for the construction of arguments; secondly, developments in biochemistry provided new insights into natural methods of storing and using information; thirdly, the science of complexity provided new insights into the complex nature of collections of data that could be developed into complex systems of information and evidence. This thesis is an application of a general methodology supported by new diagnostic and analytical techniques. The methodology was embodied in a software system called the Forensic Led Intelligence System: FLINTS. My standpoint is that of a forensic investigator with an interest in how evidential reasoning can improve the operation we call investigation. New areas of evidential reasoning are in progress and are discussed, including a new software application designed by the author: MAVERICK. There are three main themes: firstly, how a broadened conception of evidential reasoning supported by new diagnostic and analytical techniques can improve the investigation and discovery process; secondly, an explanation of how a greater understanding of the roles and effects of different styles of reasoning can assist the user; and thirdly, a range of concepts and tools for the combination, comparison, construction and presentation of evidence in imaginative ways. Taken together, these are intended to provide examples of a new approach to the science of evidential reasoning. Originality lies in four key areas: 1. Extending and developing Wigmorean techniques to police investigation and evidence management. 2. Developing existing approaches to single-case analysis and introducing an intellectual model for multi-case analysis. 3. Introducing a new model for police training in investigative evidential reasoning. 4. Introducing a new software system, FLINTS, to manage evidence in multi-case approaches using forensic scientific evidence.
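    As a hedged illustration of the Wigmorean idea of charting evidence into chains of inference (the class names, labels and example propositions below are invented for the sketch and are not taken from FLINTS or MAVERICK), a minimal chart structure could be represented as follows.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A proposition in a Wigmore-style chart: either an item of evidence
    or an inferred (intermediate or ultimate) probandum."""
    label: str
    kind: str                                      # "evidence" or "inference"
    supports: list = field(default_factory=list)   # nodes this one tends to prove

def chain(*labels):
    """Build a simple evidence -> inference chain and return its nodes."""
    nodes = [Node(lbl, "evidence" if i == 0 else "inference")
             for i, lbl in enumerate(labels)]
    for lower, upper in zip(nodes, nodes[1:]):
        lower.supports.append(upper)
    return nodes

# Example: an item of evidence supporting an intermediate and an ultimate probandum.
e1, p1, p2 = chain("Fingerprint found at scene",
                   "Suspect was present at the scene",
                   "Suspect committed the burglary")
print(e1.supports[0].label)   # -> "Suspect was present at the scene"
```

    The point of such a structure is simply to make the steps between evidence and conclusion explicit, so that the credentials of each step can be questioned rather than assumed.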

    The Use of Simulation to Learn Project Business in a Complex Context

    This paper presents how simulation can be used to learn project business in today's complex world. Over the last decade, rapid technological advances and rising globalization have made markets increasingly competitive, driving companies to increase their interest in planning and implementing specific projects as they strive to achieve corporate goals. This has resulted in various standards emerging around project management and an increase in the degree of professionalism. However, projects continue to fail, either because of their increasing complexity or because that complexity is underestimated. The aim of this work is to present, after a literature review, what project business is, what complexity means in this context, and how simulation can help close the current gaps. With this information, a simulation model is designed using the software AnyLogic. The model is based on a project process network and focuses mainly on buffer allocation, resources, quality, budget and time management. Once completed, the model was validated in a workshop through a questionnaire that participants filled in after interacting with the software and the models. With that feedback it was possible to evaluate how students feel about using simulation as a teaching tool and to show that the findings reported in the literature were closely connected with the results obtained.
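    The paper's AnyLogic model is not reproduced in the abstract; purely as an illustrative sketch of the buffer-and-uncertainty idea (the task names, durations, buffer size and run count below are assumptions, not the paper's model), a Monte Carlo view of a serial project chain could look like this.

```python
import random

# Illustrative serial project network: (name, optimistic, most_likely, pessimistic)
# durations in days. All figures are assumptions made for the example.
TASKS = [("design", 5, 8, 14), ("procure", 6, 9, 20),
         ("build", 10, 15, 25), ("test", 4, 6, 12)]
PROJECT_BUFFER = 8   # days of buffer placed at the end of the task chain

def one_run():
    """Sample one realisation of the task chain with triangular durations."""
    return sum(random.triangular(low, high, mode) for _, low, mode, high in TASKS)

planned = sum(mode for _, _, mode, _ in TASKS) + PROJECT_BUFFER
runs = 20_000
hits = sum(one_run() <= planned for _ in range(runs))
print(f"P(finish within plan of {planned} days) ~ {hits / runs:.2f}")
```

    A workshop exercise can then vary the buffer size or task uncertainty and let participants observe how the on-time probability responds, which is the kind of interaction the paper's simulation-based teaching relies on.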

    A framework for managing global risk factors affecting construction cost performance

    Poor cost performance of construction projects has been a major concern for both contractors and clients. The effective management of risk is thus critical to the success of any construction project, and the importance of risk management has grown as projects have become more complex and competition has increased. Contractors have traditionally used financial mark-ups to cover the risk associated with construction projects, but as competition increases and margins become tighter they can no longer rely on this strategy and must improve their ability to manage risk. Furthermore, the construction industry has witnessed significant changes, particularly in procurement methods, with clients allocating greater risks to contractors. Evidence shows that there is a gap between existing risk management techniques and tools, mainly built on normative statistical decision theory, and their practical application by construction contractors. The main reason behind the lack of use is that risk decision making within construction organisations is heavily based upon experience, intuition and judgement rather than on mathematical models. This thesis presents a model for managing global risk factors affecting the cost performance of construction projects. The model has been developed using a behavioural decision approach, fuzzy logic technology and Artificial Intelligence technology. The methodology adopted to conduct the research involved a thorough literature survey on risk management, informal and formal discussions with construction practitioners to assess the extent of the problem, a questionnaire survey to evaluate the importance of global risk factors and, finally, repertory grid interviews aimed at eliciting relevant knowledge. There are several approaches to categorising the risks permeating construction projects. This research groups risks into three main categories, namely organisation-specific, global and Acts of God. It focuses on global risk factors because they are ill-defined, less understood by contractors and difficult to model, assess and manage, although they have a huge impact on cost performance. Generally, contractors, especially in developing countries, have insufficient experience and knowledge to manage them effectively. The research identified the following groups of global risk factors as having significant impact on cost performance: estimator-related, project-related, fraudulent-practices-related, competition-related, construction-related, economy-related and politics-related factors. The model was tested for validity through a panel of validators (experts) and cross-sectional case studies, and the general conclusion was that it could provide valuable assistance in the management of global risk factors since it is effective, efficient, flexible and user-friendly. The findings stress the need to depart from traditional approaches and to explore new directions in order to equip contractors with effective risk management tools.
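    As an illustration of how fuzzy logic can turn imprecise judgements about a risk factor into a cost-risk score (the membership shapes, rule base and 0-10 scales below are assumptions, not the thesis's calibrated model), a two-rule Mamdani-style sketch might look like this.

```python
def tri(x, a, b, c):
    """Triangular membership: 0 outside (a, c), peak of 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def cost_risk(impact, likelihood):
    """Two-rule fuzzy sketch on 0-10 scales (illustrative assumptions only):
      IF impact is HIGH and likelihood is HIGH -> cost risk is HIGH
      IF impact is LOW  and likelihood is LOW  -> cost risk is LOW"""
    high = min(tri(impact, 4, 7, 11), tri(likelihood, 4, 7, 11))   # rule 1 firing strength
    low  = min(tri(impact, -1, 3, 6), tri(likelihood, -1, 3, 6))   # rule 2 firing strength
    if high + low == 0:
        return 5.0                                  # no rule fired: neutral score
    return (9 * high + 2 * low) / (high + low)      # weighted-average defuzzification

print(cost_risk(impact=8, likelihood=7))            # leans toward the HIGH end of the scale
```

    The appeal of this style of model for the thesis's setting is that contractors can express judgements such as "likelihood is fairly high" directly, without having to supply the precise probabilities that normative statistical decision theory would require.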

    Deterministic ethernet in a safety critical environment

    This thesis explores the concept of creating safety-critical networks with low congestion and latency (known as critical networking) for real-time critical communication (a safety-critical environment). Critical networking refers to the dynamic management of all the application demands in a network within the available network bandwidth, in order to avoid congestion. Critical networking removes traffic congestion and delay to provide quicker response times. A Deterministic Ethernet communication system in a safety-critical environment addresses the disorderly traffic conditions inherent in all Ethernet networks. A safety-critical environment means both time critical (delay sensitive) and content critical (error free). Ethernet networks, however, do not operate in a deterministic fashion, giving rise to congestion. To discover the common traffic patterns that cause congestion, a detailed analysis was carried out using neural network techniques. This analysis investigated the issues associated with delay and congestion and identified their root cause, namely unknown transmission conditions. The congestion delay, and its removal, were explored in a simulated control environment in a small star network using the Air-field communication standard. A Deterministic Ethernet was created and implemented using a Network Traffic Oscillator (NTO). The NTO uses critical networking principles to transform random, bursty application transmission impulses into deterministic sinusoidal transmissions. It is shown that the NTO has the potential to remove congestion and minimise latency. Based on this potential, it is concluded that the proposed Deterministic Ethernet can be used to improve network security as well as control long-haul communication.
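    The NTO itself is not specified in enough detail here to reproduce; as a hedged sketch of the general idea of re-timing bursty traffic onto a predictable schedule (the slot-based shaper below is an assumption for illustration, not the thesis's sinusoidal scheme), consider the following.

```python
def pace_bursts(arrivals_ms, slot_ms=10):
    """Re-time bursty packet arrivals onto a fixed deterministic schedule:
    each packet is released at the next free multiple of `slot_ms`.
    A simple time-slot shaper used only to illustrate turning bursts into a
    predictable transmission pattern; it is not the thesis's NTO."""
    next_slot = 0
    schedule = []
    for t in sorted(arrivals_ms):
        release = max(next_slot, ((t + slot_ms - 1) // slot_ms) * slot_ms)
        schedule.append((t, release))
        next_slot = release + slot_ms
    return schedule

# A burst of five packets arriving within 3 ms gets spread 10 ms apart.
for arrived, sent in pace_bursts([0, 1, 1, 2, 3]):
    print(f"arrived {arrived:>2} ms -> sent {sent:>3} ms")
```

    Whatever the exact waveform used, the benefit is the same: once transmission instants are known in advance, worst-case delay becomes bounded and congestion from colliding bursts is avoided.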

    Artificial Intelligence for Small Satellites Mission Autonomy

    Space mission engineering has always been recognized as a very challenging and innovative branch of engineering: since the beginning of the space race, numerous milestones, key successes and failures, improvements, and connections with other engineering domains have been reached. Despite its relatively young age, the space engineering discipline has not evolved uniformly: alternation of leading nations, shifts in public and private interests, and allocations of resources to different domains and goals are all examples of an intrinsic dynamism that has characterized the discipline. This dynamism has been even more striking in the last two decades, during which several factors contributed to the fervour of the period. Two of the most important were certainly the increased presence and push of the commercial and private sector and the overall intent of reducing the size of spacecraft while maintaining a comparable level of performance. A key example of the second driver is the introduction, in 1999, of a new category of space systems called CubeSats. Envisioned and designed to ease access to space for universities, by standardizing spacecraft development and by ensuring high probabilities of acceptance as piggyback customers in launches, the standard was quickly adopted not only by universities, but also by agencies and private companies. CubeSats turned out to be a disruptive innovation, and the space mission ecosystem was deeply changed by them. New mission concepts and architectures are being developed: CubeSats are now considered as secondary payloads of bigger missions, and constellations are being deployed in Low Earth Orbit to perform observation missions at a performance level previously considered achievable only by traditional, full-sized spacecraft. CubeSats, and more generally small satellite technology, have had to overcome important challenges in the last few years that were constraining and reducing the diffusion and adoption potential of smaller spacecraft for scientific and technology demonstration missions. Among these challenges were: the miniaturization of propulsion technologies, to enable concepts such as rendezvous and docking, or interplanetary missions; the improvement of the telecommunication state of the art for small satellites, to enable the downlink to Earth of all the data acquired during the mission; and the miniaturization of scientific instruments, to be able to exploit CubeSats in more meaningful, scientific ways. With the size reduction and the consolidation of the technology, many aspects of a space mission shrink in consequence: among these, costs and development and launch times can be cited. An important aspect that has not been demonstrated to scale accordingly is operations: even for small satellite missions, human operators and performant ground control centres are needed. In addition, with the possibility of having constellations or interplanetary distributed missions, a redesign of how operations are managed is required, to cope with the innovation in space mission architectures. The present work addresses the issue of operations for small satellite missions. The thesis presents research, carried out in several institutions (Politecnico di Torino, MIT, NASA JPL), aimed at improving the autonomy level of space missions, and in particular of small satellites. The key technology exploited in the research is Artificial Intelligence, a branch of computer science that has gained extreme interest in research disciplines such as medicine, security, image recognition and language processing, and is currently making its way into space engineering as well. The thesis focuses on three topics, and three related applications have been developed and are presented here: autonomous operations by means of event detection algorithms, intelligent failure detection on small satellite actuator systems, and decision-making support through intelligent tradespace exploration during the preliminary design of space missions. The Artificial Intelligence technologies explored are: Machine Learning, and in particular Neural Networks; Knowledge-based Systems, and in particular Fuzzy Logics; and Evolutionary Algorithms, and in particular Genetic Algorithms. The thesis covers the domain (small satellites), the technology (Artificial Intelligence) and the focus (mission autonomy), and presents three case studies that demonstrate the feasibility of employing Artificial Intelligence to enhance how missions are currently operated and designed.
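    As one hedged illustration of the evolutionary-algorithm side of the work (the design variables, bounds and fitness function below are invented for the example and are not the thesis's tradespace model), a minimal genetic search over a toy small-satellite tradespace might look like this.

```python
import random

# Toy design vector for a hypothetical CubeSat mission; variables, bounds and
# fitness are illustrative assumptions only.
BOUNDS = {"n_sats": (1, 24), "antenna_m": (0.1, 1.0), "panel_w": (5, 60)}

def fitness(d):
    """Reward coverage (more satellites, bigger antenna) and penalise cost/power."""
    coverage = d["n_sats"] * (0.5 + d["antenna_m"])
    cost = 0.8 * d["n_sats"] + 2.0 * d["antenna_m"] + 0.05 * d["panel_w"]
    return coverage - cost

def random_design():
    # n_sats is treated as continuous here for simplicity; a real model
    # would keep it as an integer design variable.
    return {k: random.uniform(*b) for k, b in BOUNDS.items()}

def mutate(d, rate=0.3):
    """Gaussian mutation clamped to the variable bounds."""
    child = dict(d)
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            child[k] = min(hi, max(lo, child[k] + random.gauss(0, 0.1 * (hi - lo))))
    return child

def evolve(pop_size=40, generations=50):
    """Mutation-only evolutionary loop with truncation selection (no crossover, for brevity)."""
    pop = [random_design() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 4]
        pop = parents + [mutate(random.choice(parents)) for _ in range(pop_size - len(parents))]
    return max(pop, key=fitness)

best = evolve()
print({k: round(v, 2) for k, v in best.items()}, round(fitness(best), 2))
```

    In a preliminary-design setting, the value of such a search is less the single "best" point than the shape of the high-fitness region it reveals, which is what supports the decision-making role described in the thesis.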