
    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was training in geospatial data acquisition and processing for students attending Architecture and Engineering courses, in order to start up a team of "volunteer mappers". The project aims to document environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997; the area was affected by a flood on 25 October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial LiDAR, close-range and aerial photogrammetry, topographic and GNSS instruments, etc., or by non-conventional systems and instruments such as UAVs, mobile mapping, etc. The ultimate goal is to implement a WebGIS platform to share all the collected data with local authorities and the Civil Protection.

    Ontological approach to development of computing with words based systems

    Computing with words, introduced by Zadeh, has become a very important concept in the processing of knowledge represented in the form of propositions. Two aspects of this concept – approximation and personalization – are essential to the process of building intelligent systems for human-centric computing. For the last several years, the Artificial Intelligence community has used ontology as a means for representing knowledge. Recently, the development of a new Internet paradigm – the Semantic Web – has led to the introduction of another form of ontology. It allows for defining concepts, identifying relationships among these concepts, and representing concrete information. In other words, an ontology has become a very powerful way of representing not only information but also its semantics. The paper proposes an application of ontology, in the sense of the Semantic Web, to the development of computing-with-words-based systems capable of performing operations on propositions, including their semantics. The ontology-based approach is very flexible and provides a rich environment for expressing different types of information, including perceptions. It also provides a simple way of personalizing propositions. An architecture for a computing-with-words-based system is proposed, and a prototype of such a system is described.
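    The combination described above – ontology-style triples carrying both a linguistic term and its fuzzy semantics – can be sketched in a few lines. This is a minimal illustration, not the paper's architecture; the class names, relation names, and the trapezoid annotation are assumptions for the example.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Triple:
        subject: str
        predicate: str
        obj: str

    class Ontology:
        """Tiny triple store standing in for a Semantic Web ontology."""
        def __init__(self):
            self.triples = set()

        def add(self, s, p, o):
            self.triples.add(Triple(s, p, o))

        def query(self, s=None, p=None, o=None):
            """Return triples matching the given (partial) pattern."""
            return [t for t in self.triples
                    if (s is None or t.subject == s)
                    and (p is None or t.predicate == p)
                    and (o is None or t.obj == o)]

    onto = Ontology()
    # Concept-level statements: a linguistic term and its fuzzy semantics.
    onto.add("Warm", "isA", "LinguisticTerm")
    onto.add("Warm", "hasMembershipFunction", "trapezoid(18,22,26,30)")
    # A proposition: "the water is warm", expressed as a triple.
    onto.add("water", "hasTemperatureTerm", "Warm")

    # A CWW system can follow the links from a proposition to its semantics.
    term = onto.query(s="water", p="hasTemperatureTerm")[0].obj  # -> "Warm"
    semantics = onto.query(s=term, p="hasMembershipFunction")[0].obj
    ```

    Personalization, in this sketch, would amount to attaching a different membership function to the same term for a different user.
    
    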

    High Performance Reconfigurable Fuzzy Logic Device for Medical Risk Evaluation

    To date, cardiovascular diseases (CVD) account for approximately 35% of all deaths worldwide. Many of these deaths are preventable if the risk of developing them can be accurately assessed early. Medical devices in use today cannot determine a patient's risk of developing a CVD condition. If accurate risk assessment were readily available to doctors, they could track rising trends in risk levels and recommend preventative measures for their patients. If patients had this risk assessment information before symptoms developed or life-threatening conditions occurred, they could contact their doctors for recommendations or seek help in emergency situations. This thesis research proposes the idea of using evolutionarily programmed and tuned fuzzy logic controllers to assess a patient's risk of developing a CVD condition. The specific aim of this research is to advance the flexibility and functionality of fuzzy logic systems without sacrificing high speed and low resource utilization. The proposed system can be broken down into two layers. The bottom layer contains the controller that implements the fuzzy logic model and calculates the patient's risk of developing a CVD. The controller is designed in a context-switchable hardware architecture that can be reconfigured to assess the risk of different CVD conditions. The top layer implements the evolutionary genetic algorithm in software, which configures the fuzzy parameters that optimize the behavior of the controller. The current implementation takes as input a patient's personal data, such as electrocardiogram (ECG) wave features, age and body mass index (BMI), and outputs a risk percentage for Sinus Bradycardia (SB), a common cardiac arrhythmia. We validated this system via Matlab and ModelSim simulations and built the first prototype on a Xilinx Virtex-5 FPGA platform.
    Experimental results show that this 3-input-1-output fuzzy controller with 5 fuzzy sets per variable and 125 rule propositions produces results within approximately 1 µs while reducing hardware resource utilization by at least 25% compared with existing designs.
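    The core computation such a controller performs – fuzzification, rule firing, and defuzzification – can be sketched in software. The following is a minimal one-input Mamdani-style example, not the thesis design: the membership functions, the toy rule base, and the heart-rate input are all assumptions for illustration (the actual controller uses 3 inputs, 5 sets per variable, and 125 GA-tuned rules in hardware).

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with feet at a, c and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    # Illustrative fuzzy sets for one input (heart rate, bpm) and the output (risk %).
    HR_SETS = {"low": (30, 45, 60), "normal": (50, 70, 90), "high": (80, 110, 140)}
    RISK_SETS = {"low": (0, 10, 40), "medium": (30, 50, 70), "high": (60, 90, 100)}

    # Toy rule base: antecedent heart-rate set -> consequent risk set.
    RULES = [("low", "high"), ("normal", "low"), ("high", "medium")]

    def evaluate_risk(hr, resolution=101):
        """Mamdani inference with centroid defuzzification over [0, 100]."""
        # Firing strength per output set (max over the rules that conclude it).
        strengths = {out: 0.0 for out in RISK_SETS}
        for hr_set, risk_set in RULES:
            strengths[risk_set] = max(strengths[risk_set], tri(hr, *HR_SETS[hr_set]))
        # Aggregate the clipped output sets and take the centroid.
        num = den = 0.0
        for i in range(resolution):
            x = 100.0 * i / (resolution - 1)
            mu = max(min(s, tri(x, *RISK_SETS[out])) for out, s in strengths.items())
            num += x * mu
            den += mu
        return num / den if den else 0.0
    ```

    The hardware version replaces the centroid loop with parallel arithmetic, and the genetic algorithm in the top layer would tune the `(a, b, c)` triples of the membership functions.
    
    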

    Emerging technologies for learning report (volume 3)


    Designing as playing games of make-believe

    Designing complex products involves working with uncertainties as the product, the requirements and the environment in which it is used co-evolve, and designers and external stakeholders make decisions that affect the evolving design. Rather than being held back by uncertainty, designers work, cooperate and communicate with each other notwithstanding these uncertainties by making assumptions to carry out their own tasks. To explain this, the paper proposes an adaptation of Kendall Walton's make-believe theory to conceptualize designing as playing games of make-believe: inferring what is required and imagining what is possible given the current set of assumptions and decisions, while knowing these are subject to change. What one is allowed and encouraged to imagine, conclude or propose is governed by socially agreed rules and constraints. The paper uses jet engine component design as an example to illustrate how different design teams make assumptions at the beginning of design activities and negotiate what can and cannot be done with the design. This often involves iteration – repeating activities under revised sets of assumptions. As assumptions are collectively revised, they become part of a new game of make-believe in the sense that there is social agreement that the decisions constitute part of the constraints that govern what can legitimately be inferred about the design or added to it.

    A Digital Game Maturity Model

    Game development is an interdisciplinary endeavour that embraces artistic, software engineering, management, and business disciplines. It is considered one of the most complex tasks in software engineering. Hence, for the successful development of good-quality games, game developers must consider and explore all related dimensions and discuss them with the stakeholders involved. This research facilitates a better understanding of the important dimensions of digital game development methodology. The increased popularity of digital games, the challenges faced by game development organizations in developing quality games, and severe competition in the digital game industry demand a game development process maturity assessment. Consequently, this study presents a Digital Game Maturity Model to evaluate the current development methodology in an organization. The objective is first to identify key factors in the game development process, then to classify these factors into target groups, and eventually to use this grouping as a theoretical basis for proposing a maturity model for digital game development. In doing so, the research focuses on three major stakeholders in game development: developers, consumers, and business management. The framework of the proposed model consists of assessment questionnaires made up of key factors identified in three empirical studies, a performance scale, and a rating method. The main goal of the questionnaires is to collect information about current processes and practices. This research contributes towards formulating a comprehensive and unified strategy for game development process maturity assessment. The proposed model was evaluated with two case studies from the digital game industry.
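    The questionnaire-plus-rating mechanics described above can be sketched as a small scoring routine. This is only an illustration of the general idea: the stakeholder group names, the 1–5 answer scale, and the level thresholds are assumptions for the example, not the model's actual performance scale or rating method.

    ```python
    # Maturity levels and the minimum overall score each one requires (assumed).
    LEVELS = [(1.0, "Initial"), (2.0, "Managed"), (3.0, "Defined"),
              (4.0, "Measured"), (5.0, "Optimized")]

    def group_score(answers):
        """Mean of the 1-5 questionnaire answers for one stakeholder group."""
        return sum(answers) / len(answers)

    def maturity(groups):
        """groups: dict mapping stakeholder group -> list of questionnaire answers."""
        overall = sum(group_score(a) for a in groups.values()) / len(groups)
        # Pick the highest level whose threshold the overall score reaches.
        level = LEVELS[0][1]
        for threshold, name in LEVELS:
            if overall >= threshold:
                level = name
        return overall, level

    # Hypothetical answers from the three stakeholder perspectives.
    score, level = maturity({
        "developers": [4, 3, 4, 5],
        "consumers": [3, 3, 4],
        "business":  [2, 3, 3, 4],
    })
    ```

    Averaging per group before averaging overall keeps a large developer questionnaire from drowning out the consumer and business perspectives.
    
    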

    Review of Health Prognostics and Condition Monitoring of Electronic Components

    To meet the specifications of low-cost, highly reliable electronic devices, fault diagnosis techniques play an essential role. It is vital to find flaws in design, components, material, or manufacturing at an early stage. This review paper summarizes past developments and recent advances in the areas of green manufacturing, maintenance, remaining useful life (RUL) prediction, and the like. The current state of the art in reliability research for electronic components – mainly failure mechanisms, condition monitoring, and residual lifetime evaluation – is explored. A critical analysis of reliability studies is presented to identify their relative merits and the usefulness of their outcomes vis-à-vis green manufacturing. The wide array of statistical, empirical, and intelligent tools and techniques used in the literature is then identified and mapped. Finally, the findings are summarized and the central research gap is highlighted.