158 research outputs found

    Uncertainty Management of Intelligent Feature Selection in Wireless Sensor Networks

    Wireless sensor networks (WSN) are envisioned to revolutionize the paradigm of monitoring complex real-world systems at very high resolution. However, the deployment of large numbers of unattended sensor nodes in hostile environments, frequent changes in environment dynamics, and severe resource constraints introduce uncertainties and limit the potential use of WSN in complex real-world applications. Although uncertainty management in Artificial Intelligence (AI) is well developed and well investigated, its implications in wireless sensor environments are inadequately addressed. This dissertation addresses uncertainty management issues for spatio-temporal patterns generated from sensor data. It provides a framework for characterizing spatio-temporal patterns in WSN. Using rough set theory and temporal reasoning, a novel formalism has been developed to characterize and quantify the uncertainties in predicting spatio-temporal patterns from sensor data. This research also uncovers the trade-offs among the uncertainty measures, which can be used to develop a multi-objective optimization model for real-time decision making in sensor data aggregation and sampling.
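The rough-set part of such a formalism can be illustrated with a minimal sketch: an attribute set induces an equivalence relation over the sensor nodes, a target set of nodes gets a lower approximation (certain members) and an upper approximation (possible members), and the ratio of their sizes quantifies the uncertainty. The attribute names and toy readings below are invented for illustration and are not taken from the dissertation.

```python
from collections import defaultdict

def equivalence_classes(universe, attrs):
    """Group objects that are indiscernible on the given attributes."""
    classes = defaultdict(set)
    for obj, features in universe.items():
        key = tuple(features[a] for a in attrs)
        classes[key].add(obj)
    return list(classes.values())

def rough_approximation(universe, attrs, target):
    """Lower/upper rough-set approximations of a target set of objects."""
    lower, upper = set(), set()
    for eq in equivalence_classes(universe, attrs):
        if eq <= target:
            lower |= eq   # whole class certainly inside the target
        if eq & target:
            upper |= eq   # class overlaps the target, so possibly inside
    return lower, upper

# Hypothetical sensor table: node -> discretized (temperature, humidity)
readings = {
    "n1": {"temp": "high", "hum": "low"},
    "n2": {"temp": "high", "hum": "low"},
    "n3": {"temp": "low",  "hum": "high"},
    "n4": {"temp": "low",  "hum": "high"},
}
event = {"n1", "n2", "n3"}   # nodes where an event was observed

lo, up = rough_approximation(readings, ["temp", "hum"], event)
accuracy = len(lo) / len(up)  # rough-set accuracy in [0, 1]; 1 means no uncertainty
```

Here nodes n3 and n4 are indiscernible on the chosen attributes but only n3 saw the event, so the event set is rough: the lower approximation is {n1, n2}, the upper is all four nodes, and the accuracy 0.5 quantifies the ambiguity.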

    Final Report of the ModSysC2020 Working Group - Data, Models and Theories for Complex Systems: new challenges and opportunities

    At University Montpellier 2, the modeling and simulation of complex systems has been identified as a major scientific challenge and one of the priority axes in interdisciplinary research, with major potential impact on training, the economy and society. Many research groups and laboratories in Montpellier are already working in that direction, but typically in isolation within their own scientific discipline. Several local actions have been initiated to structure the scientific community around interdisciplinary projects, but with little coordination among them. The goal of the ModSysC2020 (modeling and simulation of complex systems in 2020) working group was to analyze the local situation (strengths and weaknesses, current projects), identify the critical research directions and propose concrete actions in terms of research projects, equipment facilities, human resources and training to be encouraged. To guide this perspective, we decomposed the scientific challenge into four main themes for which there is a strong background in Montpellier: (1) modeling and simulation of complex systems; (2) algorithms and computing; (3) scientific data management; (4) production, storage and archiving of data from the observation of natural and biological media. In this report, for each theme, we introduce the context and motivations, analyze the situation in Montpellier, identify research directions and propose specific actions in terms of interdisciplinary research projects and training. We also provide an analysis of the socio-economic aspects of modeling and simulation through use cases in various domains such as life science and healthcare, environmental science and energy. Finally, we discuss the importance of revisiting student training in fundamental domains such as modeling, computer programming and databases, which are typically taught too late, in specialized master's programs.

    Fusion of Information and Analytics: A Discussion on Potential Methods to Cope with Uncertainty in Complex Environments (Big Data and IoT)

    Information overload and complexity are core problems for most organizations today. Advances in networking capabilities have created the conditions for complexity by enabling richer, real-time interactions between and among individuals, objects, systems and organizations. Fusion of Information and Analytics Technologies (FIAT) are key enablers for the design of current and future decision support systems supporting prognostic, diagnostic and prescriptive tasks in such complex environments. Hundreds of methods and technologies exist, and several books have been dedicated to either analytics or information fusion. However, very few have discussed the methodological aspects and the need for integrating frameworks for these techniques coming from multiple disciplines. This paper presents a discussion of potential integrating frameworks as well as the development of a computational model to evolve FIAT-based systems capable of meeting the challenges of complex environments such as Big Data and the Internet of Things (IoT).
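One classic building block of information fusion, inverse-variance weighting of independent estimates of the same quantity, can be sketched as follows. This is a generic textbook technique, not the computational model the paper develops, and the sensor figures are invented.

```python
def fuse(estimates):
    """Fuse independent (mean, variance) estimates of one quantity by
    inverse-variance weighting; the fused variance is never larger than
    the smallest input variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    mean = sum(w * m for w, (m, _) in zip(weights, estimates)) / total
    return mean, 1.0 / total

# Two hypothetical sensors reporting the same quantity with
# different confidence: (mean, variance) pairs
fused_mean, fused_var = fuse([(10.0, 4.0), (12.0, 1.0)])
# The fused estimate (11.6, 0.8) sits closer to the more confident
# sensor and is more certain than either input alone.
```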

    Mining climate data for shire level wheat yield predictions in Western Australia

    Climate change and the reduction of available agricultural land are two of the most important factors affecting global food production, especially in terms of wheat stores. An ever-increasing world population places a huge demand on these resources. Consequently, there is a dire need to optimise food production. Estimations of crop yield for the South West agricultural region of Western Australia have usually been based on statistical analyses by the Department of Agriculture and Food in Western Australia. Their estimations involve a system of crop planting recommendations and yield prediction tools based on crop variety trials. However, many crop failures have arisen when farmers adhered to crop recommendations that were contrary to the reported estimations. Consequently, the Department has sought to investigate new avenues of analysis that improve their estimations and recommendations. This thesis explores a new approach to the way analyses are carried out, through the introduction into the strategy of new methods of analysis such as data mining and online analytical processing. Additionally, this research attempts to provide a better understanding of the effects on wheat yields of both gradual variation parameters, such as soil type, and continuous variation parameters, such as rainfall and temperature. The ultimate aim of the research is to enhance the prediction efficiency of wheat yields. The task was formidable due to the complex and dichotomous mixture of gradual and continuous variability data that required successive information transformations. It necessitated the progressive moulding of the data into useful information, practical knowledge and effective industry practices. Ultimately, this new direction aims to improve crop predictions and thereby reduce crop failures.
    The research journey involved data exploration, grappling with the complexity of Geographic Information Systems (GIS), discovering and learning data-compatible software tools, and forging an effective processing method through an iterative cycle of action-research experimentation. A series of trials was conducted to determine the combined effects of rainfall and temperature variations on wheat crop yields. These experiments related specifically to the South Western agricultural region of Western Australia, and the study focused on wheat-producing shires within the study area. The investigations combined macro- and micro-analysis techniques for visual data mining and data mining classification, respectively. The research activities revealed that wheat yield was most dependent upon rainfall and temperature. In addition, they showed that rainfall cyclically affected the temperature and soil type through the moisture retention of crop growing locations. Results from the regression analyses showed that the statistical prediction of wheat yields from historical data may be enhanced by data mining techniques, including classification. The main contribution to knowledge of this research was the provision of an alternate and supplementary method of wheat crop prediction within the study area. Another contribution was the division of the study area into a GIS surface grid of 100-hectare cells onto which the interpolated data were projected. Furthermore, the framework proposed within this thesis offers other researchers with similarly structured complex data the benefits of a general processing pathway to navigate their own investigations through variegated analytical exploration spaces, together with insights and suggestions for future directions in other contextual research explorations.
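The regression side of such an analysis, predicting yield from rainfall and temperature, can be sketched with ordinary least squares. The shire records below are synthetic, generated from a known linear relation purely so the fit can be checked; they are not the thesis's data.

```python
import numpy as np

# Synthetic shire records: (rainfall mm, mean temp deg C, yield t/ha),
# generated from yield = 1.0 + 0.005*rain - 0.1*temp for illustration
records = np.array([
    [300.0, 16.0, 0.90],
    [300.0, 19.0, 0.60],
    [450.0, 16.0, 1.65],
    [450.0, 19.0, 1.35],
    [600.0, 16.0, 2.40],
    [600.0, 19.0, 2.10],
])

# Design matrix with an intercept column, then the two predictors
X = np.column_stack([np.ones(len(records)), records[:, 0], records[:, 1]])
y = records[:, 2]

# Least-squares fit: coef = [intercept, b_rain, b_temp]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(rain_mm, temp_c):
    """Predicted yield (t/ha) for a given rainfall and mean temperature."""
    return float(coef @ np.array([1.0, rain_mm, temp_c]))
```

Because the toy data are exactly linear, the fit recovers the generating coefficients, and for example `predict(500.0, 17.0)` returns 1.8 t/ha. On real, noisy shire data the same pipeline would be a baseline that the classification-based data mining described above supplements.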

    Análisis orientado a objetos de imágenes de teledetección para cartografía forestal: bases conceptuales y un método de segmentación para obtener una partición inicial para la clasificación = Object-oriented analysis of remote sensing images for land cover mapping: conceptual foundations and a segmentation method to derive a baseline partition for classification

    The approach commonly used to analyze satellite images for mapping purposes yields unsatisfactory results, mainly because it relies solely on the spectral patterns of individual pixels, almost completely ignoring the spatial structure of the image. Moreover, equating land-cover classes with homogeneous material types implies that any arbitrarily delimited part within a map polygon remains a referent of the concept defined by its label. This is inconsistent with the hierarchical landscape model increasingly accepted in Landscape Ecology, which assumes that homogeneity depends on the scale of observation and is in any case more semantic than biophysical, and that landscapes are therefore intrinsically heterogeneous and composed of units (patches) that function simultaneously as wholes distinct from their surroundings and as parts of a larger whole. A new (object-oriented) approach is therefore needed that is compatible with this model and in which the basic units of analysis are delimited according to the spatial variation of the phenomenon under study. This thesis aims to contribute to this paradigm shift in remote sensing; its specific objectives are: 1. To highlight the shortcomings of the approach traditionally used in satellite image classification. 2. To lay the conceptual foundations of an alternative approach based on basic zones classifiable as objects. 3. To develop and implement a demonstration version of an automatic method that converts a multispectral image into a vector layer formed by those zones. The proposed strategy is to produce, based on the spatial structure of the images, a partition in which each region can be considered relatively homogeneous and distinct from its neighbours, while also exceeding (though not by much) the size of the minimum mapping unit.
Each region is assumed to correspond to a stand that, after classification, will be merged with neighbouring stands into a larger region that can, as a whole, be seen as an instance of a certain type of object, later represented on the map as polygons of a particular class.
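The core idea, partitioning the image into connected regions that are internally homogeneous and distinct from their neighbours, can be sketched with a much-simplified region-growing pass over a single-band grid. The tolerance-against-seed criterion and the toy image below are illustrative only; the thesis's actual method works on multispectral data and also enforces the minimum mapping unit, both of which are omitted here.

```python
from collections import deque

def segment(image, tol):
    """Partition a 2D grid into 4-connected regions whose pixel values
    stay within `tol` of the region's seed value (simple region growing)."""
    rows, cols = len(image), len(image[0])
    labels = [[-1] * cols for _ in range(rows)]
    current = 0
    for r0 in range(rows):
        for c0 in range(cols):
            if labels[r0][c0] != -1:
                continue                      # pixel already assigned
            seed = image[r0][c0]
            labels[r0][c0] = current
            queue = deque([(r0, c0)])
            while queue:                      # breadth-first growth
                r, c = queue.popleft()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and labels[nr][nc] == -1
                            and abs(image[nr][nc] - seed) <= tol):
                        labels[nr][nc] = current
                        queue.append((nr, nc))
            current += 1
    return labels

# Toy single-band image with three visually distinct patches
img = [
    [10, 11, 50, 52],
    [10, 12, 51, 53],
    [30, 30, 31, 31],
]
labels = segment(img, tol=5)
# labels -> [[0, 0, 1, 1], [0, 0, 1, 1], [2, 2, 2, 2]]
```

Each labelled region would then play the role of a stand: a candidate unit that later classification and merging steps aggregate into map polygons.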

    Methodology of Algorithm Engineering

    Research on algorithms has increased drastically in recent years. Various sub-disciplines of computer science investigate algorithms according to different objectives and standards. This plurality of the field has led to various methodological advances that have not yet been transferred to neighboring sub-disciplines. The central roadblock to better knowledge exchange is the lack of a common methodological framework integrating the perspectives of these sub-disciplines. The objective of this paper is to develop a research framework for algorithm engineering. Our framework builds on three areas discussed in the philosophy of science: ontology, epistemology and methodology. In essence, ontology describes algorithm engineering as being concerned with algorithmic problems, algorithmic tasks, algorithm designs and algorithm implementations. Epistemology describes the body of knowledge of algorithm engineering as a collection of prescriptive and descriptive knowledge, residing in World 3 of Popper's Three Worlds model. Methodology refers to the steps by which we can systematically enhance our knowledge of specific algorithms. The framework helps us to identify and discuss various validity concerns relevant to any algorithm engineering contribution. In this way, our framework has important implications for researching algorithms in various areas of computer science.