Hybrid intelligent decision support system for distributed detection based on ad hoc integrated WSN & RFID
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.
The real-time monitoring of context-aware activities in the environment, based on distributed detection, is becoming a standard in public safety and service delivery across a wide range of domains (child and elderly care and supervision, logistics, circulation, and others). The safety of people, goods and premises depends on a prompt reaction to potential hazards identified in real time, at an early stage, so that appropriate control actions can be engaged. Effective emergency response can be supported only by available and acquired expertise, or by elaborated collaborative knowledge in the domain of distributed detection, which includes indoor sensing, tracking and localizing. This research proposes a hybrid conceptual multi-agent framework for the acquisition of collaborative knowledge in dynamic, complex, context-aware environments for distributed detection. The framework has been applied to the design and development of a hybrid intelligent multi-agent decision support system (HIDSS) that supports a decentralized active sensing, tracking and localizing strategy, along with the deployment and configuration of smart detection devices associated with active sensor nodes wirelessly connected in a network topology, in order to configure, deploy and control ad hoc wireless sensor networks (WSNs).
This system, which is based on the interactive use of data, models and a knowledge base, has been implemented to support fire-detection and access-control fusion functions aimed at elaborating:
- an integrated data model grouping the building information data and the WSN-RFID database, which comprises the network configuration and the captured data;
- a virtual layout configuration of the controlled premises, based on a building information model;
- knowledge-based support for the design of generic detection devices;
- a multi-criteria decision-making model for the distribution of generic detection devices and for ad hoc WSN configuration, clustering and deployment; and
- predictive data models for evacuation planning, and for fire and evacuation simulation.
An evaluation of the system prototype has been carried out to enrich the information and knowledge fusion requirements and to show the scope of the concepts used in data and process modelling. It has shown the practicability of hybrid solutions that group generic homogeneous smart detection devices, enhanced in their deployment by heterogeneous support devices, forming ad hoc networks that integrate WSNs and radio frequency identification (RFID) technology. The novelty of this work is the web-based support system architecture proposed in the framework, which is based on intelligent agent modelling and multi-agent systems, and on the decoupling of the processes supporting multi-sensor data fusion from those supporting the different context applications. Although this decoupling is essential to distribute the different fusion functions appropriately, integrating several dimensions of policy settings for the modelling of knowledge processes and of intelligent, proactive decision-making activities requires the organisation of interactive fusion functions deployed upstream of the safety and emergency response.
Funded by the Saudi government, represented by the Ministry of Interior and the General Directorate of Civil Defence.
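The abstract does not specify the form of the multi-criteria decision-making model for device distribution. A minimal weighted-sum sketch, with purely illustrative criteria, site names, and weights (none taken from the thesis), might look like:

```python
# Hypothetical sketch: rank candidate detector placements by a weighted sum
# of normalised criteria. Criteria and sites below are invented examples.

def rank_placements(candidates, weights):
    """Score each candidate site by a weighted sum of criteria in [0, 1]
    (higher is better) and return the sites sorted best-first."""
    total = sum(weights.values())
    scores = {
        site: sum(weights[c] * vals.get(c, 0.0) for c in weights) / total
        for site, vals in candidates.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

candidates = {
    "corridor-A": {"coverage": 0.9, "rfid_reach": 0.6, "battery_cost": 0.7},
    "atrium":     {"coverage": 0.8, "rfid_reach": 0.9, "battery_cost": 0.4},
}
weights = {"coverage": 0.5, "rfid_reach": 0.3, "battery_cost": 0.2}
ranking = rank_placements(candidates, weights)
```

A real deployment model would add constraints (coverage overlap, radio range, cluster-head placement); the weighted sum only illustrates the scoring step.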
Framework of Six Sigma implementation analysis on SMEs in Malaysia for information technology services, products and processes
For the past two decades, the majority of Malaysia's IT companies have widely adopted a Quality Assurance (QA) approach as a basis for self-improvement and internal assessment in IT project management. Quality Control (QC) is a comprehensive top-down observation approach used to fulfil requirements for quality outputs, focusing on the evaluation of process outputs. However, in the Malaysian context, QC, and the combination of QA and QC, have not received significant attention as quality improvement approaches. This research study explores the possibility of integrating the QC and QA+QC approaches through the Six Sigma quality management standard to provide tangible and measurable business results through continuous process improvement and to boost customer satisfaction.
The research project adopted an exploratory case study approach on three Malaysian IT companies in the business areas of IT processes, IT services and IT products. Semi-structured interviews, online surveys, self-administered questionnaires, job observations, document analysis and on-the-job training are among the methods employed in these case studies. The collected data and viewpoints, along with findings from an extensive literature review, were used to benchmark quality improvement initiatives and best practices, and to develop a Six Sigma framework for SMEs in the Malaysian IT industry.
This research project contributed to both the theory and the practice of implementing and integrating Six Sigma in IT products, services and processes. The newly developed framework proved capable of supporting a general, fundamental start-up decision by demonstrating how companies with or without a formal quality improvement methodology (QIM) can integrate and implement Six Sigma practices to close the variation gap between QA and QC.
The framework also accommodates companies with an existing QIM that want a fresh migration without having to drop that QIM: a new QIM is integrated which addresses most weaknesses of the current one while retaining most of the current business routine's strengths. The framework further explores how Six Sigma can be expanded and extended to include secondary external factors that are critical to successful QIM implementation. A vital segment emphasizes Six Sigma as a QA+QC approach in IT processes; the ability to manage IT processes properly results in overall performance improvements to IT products and IT services. The developed Six Sigma implementation framework can serve as a baseline for SMEs to better manage, control and track business performance and product quality, while also creating clearer insights and an unbiased view of Six Sigma implementation in the IT industry to drive towards operational excellence.
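As a concrete anchor for the metrics such a framework tracks, the standard Six Sigma calculation converts a defect count into defects per million opportunities (DPMO) and a short-term sigma level. The sketch below uses the conventional 1.5σ shift; the ticket numbers are invented examples, not data from the study.

```python
from statistics import NormalDist

def sigma_level(defects: int, units: int, opportunities: int) -> tuple[float, float]:
    """Return (DPMO, short-term sigma level) for a process.

    Applies the conventional 1.5-sigma shift between observed long-term
    yield and the reported short-term sigma level.
    """
    dpmo = defects / (units * opportunities) * 1_000_000
    yield_rate = 1 - dpmo / 1_000_000
    sigma = NormalDist().inv_cdf(yield_rate) + 1.5
    return dpmo, sigma

# Invented example: 350 defects across 10,000 service tickets,
# with 5 defect opportunities per ticket.
dpmo, sigma = sigma_level(350, 10_000, 5)
```

Tracking DPMO per process is what lets a QA+QC framework quantify "the variation gap" rather than assess it qualitatively.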
Data Spaces
This open access book aims to educate data space designers about what is required to create a successful data space. It explores cutting-edge theory, technologies, methodologies, and best practices for data spaces for both industrial and personal data, and provides the reader with a basis for understanding the design, deployment, and future directions of data spaces. The book captures the early lessons and experience in creating data spaces and arranges these contributions into three parts covering design, deployment, and future directions, respectively. The first part explores the design space of data spaces; its chapters detail the organisational design of data spaces, data platforms, data governance, federated learning, personal data sharing, data marketplaces, and hybrid artificial intelligence for data spaces. The second part describes the use of data spaces within real-world deployments; its chapters are co-authored with industry experts and include case studies of data spaces in sectors including Industry 4.0, food safety, FinTech, health care, and energy. The third and final part details future directions for data spaces, including challenges and opportunities for common European data spaces and privacy-preserving techniques for trustworthy data sharing. The book is of interest to two primary audiences: first, researchers interested in data management and data sharing, and second, practitioners and industry experts engaged in data-driven systems where the sharing and exchange of data within an ecosystem are critical.
Data-stream driven Fuzzy-granular approaches for system maintenance
Intelligent systems are now inherent to society, supporting a synergistic human-machine collaboration. Beyond economic and climate factors, energy consumption is strongly affected by the performance of computing systems, and poor software functioning can invalidate any improvement attempt. In addition, data-driven machine learning algorithms are the basis of human-centered applications, and their interpretability is one of the most important features of computational systems. Software maintenance is a critical discipline for supporting automatic, life-long system operation. As most software registers its inner events by means of logs, log analysis is one approach to keeping systems operational. Logs are Big Data assembled in large-flow streams: unstructured, heterogeneous, imprecise, and uncertain. This thesis addresses fuzzy and neuro-granular methods that provide maintenance solutions for anomaly detection (AD) and log parsing (LP), dealing with data uncertainty and identifying ideal time periods for detailed software analyses. LP provides a deeper semantic interpretation of the anomalous occurrences. The solutions evolve over time and are general-purpose, being highly applicable, scalable, and maintainable. Granular classification models, namely the Fuzzy set-Based evolving Model (FBeM), the evolving Granular Neural Network (eGNN), and the evolving Gaussian Fuzzy Classifier (eGFC), are compared on the AD problem. The evolving Log Parsing (eLP) method is proposed to approach automatic parsing of system logs. All the methods use recursive mechanisms to create, update, merge, and delete information granules according to the data behavior. For the first time in the evolving intelligent systems literature, the proposed method, eLP, is able to process streams of words and sentences.
Regarding AD accuracy, FBeM achieved (85.64 ± 3.69)%, eGNN reached (96.17 ± 0.78)%, eGFC obtained (92.48 ± 1.21)%, and eLP reached (96.05 ± 1.04)%. Besides being competitive, eLP additionally generates a log grammar and presents a higher level of model interpretability.
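The core idea shared by these evolving granular models, creating, updating, and discarding information granules online as the stream drifts, can be illustrated with a deliberately simplified sketch. This is not FBeM, eGNN, eGFC, or eLP; it is a one-feature toy with Gaussian granules and an assumed 3σ novelty threshold.

```python
# Toy illustration (not the thesis's algorithms): granules are running
# Gaussians; a sample that fits no granule spawns a new one and, once a
# baseline exists, is flagged as anomalous.
import math

class Granule:
    def __init__(self, x):
        self.n, self.mean, self.m2 = 1, x, 0.0

    def update(self, x):
        # Welford's online update of mean and sum of squared deviations.
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)

    def std(self):
        return math.sqrt(self.m2 / self.n) if self.n > 1 else 1.0

class EvolvingDetector:
    def __init__(self, k=3.0):
        self.granules, self.k = [], k  # k: novelty threshold in std units

    def process(self, x):
        """Return True if x is anomalous, then evolve the granule set."""
        fits = [g for g in self.granules
                if abs(x - g.mean) <= self.k * g.std()]
        if fits:
            min(fits, key=lambda g: abs(x - g.mean)).update(x)
            return False
        self.granules.append(Granule(x))   # granule for novel behaviour
        return len(self.granules) > 1      # the very first granule is baseline
```

Feeding a stream that hovers near 10 and then jumps to 100 leaves the early samples unflagged while the jump spawns a second granule and is reported as an anomaly. The real models add granule merging, deletion, and class labels on top of this create/update loop.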
Master's Thesis and Field Study Abstracts, July 1998-June 2000
This publication, the fifteenth in a series which began in 1957, contains the abstracts of Master's Theses and Field Studies completed by graduate students of St. Cloud State University. The bulletin contains those theses and field studies completed during the period from July of 1998 through June of 2000.
A bound copy of each thesis or field study is on file in the James W. Miller Learning Resources Center, which houses the library on this campus. The library copy of each thesis and field study is available for use on an interlibrary loan basis.
Copies of this bulletin may be obtained from the Office of Graduate Studies, 121 Administrative Services, St. Cloud State University, 720 S. Fourth Avenue, St. Cloud, Minnesota, 56301-4498
Citizen Science and Geospatial Capacity Building
This book is a collection of the articles published in the Special Issue of the ISPRS International Journal of Geo-Information on “Citizen Science and Geospatial Capacity Building”. The articles cover a wide range of topics on the applications of citizen science from a geospatial technology perspective. Several applications show the importance of Citizen Science (CitSci) and volunteered geographic information (VGI) in various stages of geodata collection, processing, analysis and visualization, and demonstrate their capabilities. Particular emphasis is given to problems encountered in CitSci and VGI projects with a geospatial aspect, such as platform, tool and interface design, ontology development, spatial analysis and data quality assessment. The book also points out the needs and future research directions in these subjects, such as: (a) data quality issues, especially in the light of big data; (b) ontology studies for geospatial data suited to diverse user backgrounds, data integration, and sharing; (c) the development of machine learning and artificial intelligence based online tools for pattern recognition and object identification using existing repositories of CitSci and VGI projects; and (d) open science and open data practices for increasing efficiency, decreasing redundancy, and acknowledging all stakeholders.