
    Interoperability and information sharing

    Communication and information sharing are two of the most pressing issues facing the public safety community today. In previous chapters of this volume, authors have noted the changing public safety landscape as it relates to the need for enhanced information and intelligence sharing among a broad cross-section of organizations. Public safety organizations, particularly law enforcement agencies, have been quick to adopt emerging technologies that allow for greater communication and information sharing capacity. While substantial improvements have been made over the decades, many challenges remain on the path to seamlessly integrated communication capabilities. The key challenge in the coming decades relates to the technical and cultural changes necessary to achieve integrated communication systems. There is no shortage of resources devoted to increasing the communications capacity of the public safety community, yet serious challenges remain in the degree of interoperability within and across public safety domains. Interoperability has in many ways become the defining issue in the arenas of communications and information sharing. This chapter provides an overview of the critical historical events that placed questions of interoperability and information sharing on the national agenda, as well as an overview of national models for information sharing.

    Ontology-based data semantic management and application in IoT- and cloud-enabled smart homes

    The application of emerging Internet of Things (IoT) and cloud computing technologies has increased the popularity of smart homes, and with it, large volumes of heterogeneous data are being generated by home entities. The representation, management and application of the continuously increasing amounts of heterogeneous data in the smart home data space have become critical challenges to the further development of the smart home industry. To this end, a scheme for ontology-based data semantic management and application is proposed in this paper. Based on a smart home system model abstracted from the perspective of implementing users’ household operations, a general domain ontology model is designed by defining the correlative concepts, and a logical data semantic fusion model is designed accordingly. Subsequently, to achieve high-efficiency ontology data query and update in the implementation of the data semantic fusion model, a relational-database-based ontology data decomposition storage method is developed by thoroughly investigating existing storage modes, and its performance is demonstrated using a group of elaborated ontology data query and update operations. Building on these results, ontology-based semantic reasoning with a specially designed semantic matching rule is also studied in an attempt to provide accurate and personalized home services, and its efficiency is demonstrated through experiments conducted on the developed testing system for user behavior reasoning.
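
    The relational decomposition storage idea can be pictured with a small sketch. The Python fragment below is only an illustration under assumed names: it vertically partitions ontology triples into one two-column table per property using SQLite, which is one common relational storage strategy for ontology data; it is not the paper's actual schema or smart-home ontology.

```python
# Illustrative sketch only: assumes a simple vertical partitioning of ontology
# triples, one table per property, as a stand-in for the paper's decomposition method.
import sqlite3

conn = sqlite3.connect(":memory:")

def table_for(prop):
    # One two-column table per ontology property (subject, object).
    name = "prop_" + prop.replace(":", "_")
    conn.execute(f"CREATE TABLE IF NOT EXISTS {name} (subject TEXT, object TEXT)")
    return name

def insert_triple(subject, prop, obj):
    conn.execute(f"INSERT INTO {table_for(prop)} VALUES (?, ?)", (subject, obj))

def query(prop, subject=None):
    rows = conn.execute(f"SELECT subject, object FROM {table_for(prop)}").fetchall()
    return [r for r in rows if subject is None or r[0] == subject]

# Hypothetical smart-home instance data.
insert_triple("LivingRoomLamp", "rdf:type", "sh:LightingDevice")
insert_triple("LivingRoomLamp", "sh:locatedIn", "LivingRoom")
insert_triple("Alice", "sh:prefersBrightness", "60%")

print(query("sh:locatedIn", "LivingRoomLamp"))  # [('LivingRoomLamp', 'LivingRoom')]
```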

    Iterative criteria-based approach to engineering the requirements of software development methodologies

    Software engineering endeavours are typically based on and governed by the requirements of the target software; requirements identification is therefore an integral part of software development methodologies. Similarly, engineering a software development methodology (SDM) involves the identification of the requirements of the target methodology. Methodology engineering approaches pay special attention to this issue; however, they make little use of existing methodologies as sources of insight into methodology requirements. The authors propose an iterative method for eliciting and specifying the requirements of an SDM using existing methodologies as supplementary resources. The method is performed as the analysis phase of a methodology engineering process aimed at the ultimate design and implementation of a target methodology. An initial set of requirements is first identified through analysing the characteristics of the development situation at hand and/or delineating the general features desirable in the target methodology. These initial requirements are used as evaluation criteria and refined through iterative application to a select set of relevant methodologies. The finalised criteria highlight the qualities that the target methodology is expected to possess, and are therefore used as a basis for defining the final set of requirements. In an example, the authors demonstrate how the proposed elicitation process can be used for identifying the requirements of a general object-oriented SDM. Owing to its basis in knowledge gained from existing methodologies and practices, the proposed method can help methodology engineers produce a set of requirements that is not only more complete in span, but also more concrete and rigorous.
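
    The elicitation process described above is essentially a loop: start from initial requirements, treat them as evaluation criteria, refine them by applying them to existing methodologies, and let the finalised criteria define the requirements. The sketch below paraphrases that loop in Python with entirely hypothetical criteria and methodologies; it is a rough outline of the process shape, not the authors' procedure.

```python
# Hypothetical sketch of the iterative criteria-refinement loop; criteria,
# methodologies and the refinement rule are placeholders, not the paper's artefacts.

def refine(criteria, methodology):
    """One refinement pass: record, per criterion, whether an existing methodology
    meets it; recorded gaps drive how the criterion is sharpened into a requirement."""
    refined = {}
    for criterion, notes in criteria.items():
        met = criterion in methodology["satisfies"]
        refined[criterion] = notes + [f"{methodology['name']}: {'meets' if met else 'gap'}"]
    return refined

existing_methodologies = [                      # hypothetical supplementary resources
    {"name": "RUP", "satisfies": {"traceability", "iterative lifecycle"}},
    {"name": "XP",  "satisfies": {"iterative lifecycle"}},
]

# Initial criteria derived from the development situation and desired general features.
criteria = {c: [] for c in ("traceability", "seamlessness", "iterative lifecycle")}

for methodology in existing_methodologies:      # iterative application of the criteria
    criteria = refine(criteria, methodology)

# Finalised criteria (with their gap analysis) become the basis for the requirements.
for criterion, notes in criteria.items():
    print(criterion, "->", notes)
```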

    From Sensor to Observation Web with Environmental Enablers in the Future Internet

    This paper outlines the grand challenges in global sustainability research and the objectives of the FP7 Future Internet PPP program within the Digital Agenda for Europe. Large user communities are generating significant amounts of valuable environmental observations at local and regional scales using the devices and services of the Future Internet. These communities’ environmental observations represent a wealth of information which is currently hardly used, or used only in isolation, and is therefore in need of integration with other information sources. Indeed, this very integration will lead to a paradigm shift from a mere Sensor Web to an Observation Web with semantically enriched content emanating from sensors, environmental simulations and citizens. The paper also describes the research challenges to realize the Observation Web and the associated environmental enablers for the Future Internet. Such an environmental enabler could for instance be an electronic sensing device, a web-service application, or even a social networking group affording or facilitating the capability of Future Internet applications to consume, produce, and use environmental observations in cross-domain applications. The term 'envirofied' Future Internet is coined to describe this overall target, which forms a cornerstone of work in the Environmental Usage Area within the Future Internet PPP program. Relevant trends described in the paper are the usage of ubiquitous sensors (anywhere), the provision and generation of information by citizens, and the convergence of real and virtual realities to convey understanding of environmental observations. The paper addresses the technical challenges in the Environmental Usage Area and the need to design a multi-style service-oriented architecture. Key topics are the mapping of requirements to capabilities, and providing scalability and robustness while implementing context-aware information retrieval. Another essential research topic is handling data fusion and model-based computation, and the related propagation of information uncertainty. Approaches to security, standardization and harmonization, all essential for sustainable solutions, are summarized from the perspective of the Environmental Usage Area. The paper concludes with an overview of emerging, high-impact applications in the environmental areas of land ecosystems (biodiversity), air quality (atmospheric conditions) and water ecosystems (marine asset management).
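
    One way to picture a semantically enriched observation and simple uncertainty propagation is the sketch below. The field names, the sources and the inverse-variance fusion rule are illustrative assumptions, not the actual enabler interfaces of the Future Internet PPP program.

```python
# Illustrative data model only: fields and sources are assumed for the example.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Observation:
    phenomenon: str                 # e.g. "air_quality:NO2"
    value: float
    unit: str
    location: tuple                 # (lat, lon)
    time: datetime
    source: str                     # "sensor", "simulation" or "citizen"
    uncertainty: float              # e.g. standard deviation in the same unit
    semantics: dict = field(default_factory=dict)  # links to ontology concepts

def fuse(observations):
    """Inverse-variance weighted fusion of observations of the same phenomenon,
    propagating the combined uncertainty (a standard, simple fusion rule)."""
    weights = [1.0 / (o.uncertainty ** 2) for o in observations]
    value = sum(w * o.value for w, o in zip(weights, observations)) / sum(weights)
    uncertainty = (1.0 / sum(weights)) ** 0.5
    return value, uncertainty

obs = [
    Observation("air_quality:NO2", 42.0, "ug/m3", (48.1, 11.6),
                datetime.now(timezone.utc), "sensor", 2.0),
    Observation("air_quality:NO2", 38.0, "ug/m3", (48.1, 11.6),
                datetime.now(timezone.utc), "citizen", 6.0),
]
print(fuse(obs))  # fused value dominated by the lower-uncertainty sensor reading
```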

    Precise vehicle location as a fundamental parameter for intelligent self-aware rail-track maintenance systems

    The rail industry in the UK is undergoing substantial changes in response to a modernisation vision for 2040. Development and implementation of this vision will lead to a highly automated and safe railway. Real-time regulation of traffic will optimise the performance of the network, with trains running in succession within an adjacent movable safety zone. Critically, maintenance will use intelligent trainborne and track-based systems. These will provide accurate and timely information for condition-based intervention at precise track locations, reducing possession downtime and minimising the presence of workers on operating railways. Clearly, precise knowledge of trains’ real-time location is of paramount importance. The positional accuracy demand of the future railway is less than 2 m. A critical consideration of this requirement is the capability to resolve train occupancy on adjacent tracks with the highest degree of confidence. A finer resolution is required for precisely locating faults such as damage or missing parts. Location of trains currently relies on track signalling technology; however, these systems mostly provide an indication of the presence of trains within discrete track sections. Standard Global Navigation Satellite Systems (GNSS) cannot precisely and reliably resolve location as required either. Within the context of the needs of the future railway, state-of-the-art location technologies and systems were reviewed and critiqued. It was found that no current technology is able to resolve location as required; uncertainty is a significant factor. A new integrated approach employing complementary technologies and a more efficient data fusion process can potentially offer a more accurate and robust solution. Data fusion architectures enabling intelligent self-aware rail-track maintenance systems are proposed.
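
    As a rough illustration of such an integrated approach, the sketch below fuses a GNSS fix with train odometry along the track using a generic one-dimensional Kalman update. It is not the proposed data fusion architecture; positions, variances and sensor choices are assumed purely for the example.

```python
# A minimal 1-D sketch of GNSS/odometry fusion along the track, purely illustrative:
# a generic Kalman predict/update cycle, not the authors' design.
# Positions are metres along the line; variances are in m^2.

def predict(x, p, odometry_delta, odometry_var):
    """Propagate the position estimate using train odometry (dead reckoning)."""
    return x + odometry_delta, p + odometry_var

def update(x, p, gnss_position, gnss_var):
    """Correct the prediction with a GNSS fix."""
    k = p / (p + gnss_var)            # Kalman gain
    return x + k * (gnss_position - x), (1.0 - k) * p

x, p = 1000.0, 25.0                   # initial position estimate and variance
x, p = predict(x, p, odometry_delta=50.2, odometry_var=1.0)
x, p = update(x, p, gnss_position=1052.0, gnss_var=100.0)
print(round(x, 1), round(p, 1))       # fused estimate; variance below GNSS-only
```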

    Conversational Sensing

    Recent developments in sensing technologies, mobile devices and context-aware user interfaces have made it possible to represent information fusion and situational awareness as a conversational process among actors - human and machine agents - at or near the tactical edges of a network. Motivated by use cases in the domain of security, policing and emergency response, this paper presents an approach to information collection, fusion and sense-making based on the use of natural language (NL) and controlled natural language (CNL) to support richer forms of human-machine interaction. The approach uses a conversational protocol to facilitate a flow of collaborative messages from NL to CNL and back again in support of interactions such as: turning eyewitness reports from human observers into actionable information (from both trained and untrained sources); fusing information from humans and physical sensors (with associated quality metadata); and assisting human analysts to make the best use of available sensing assets in an area of interest (governed by management and security policies). CNL is used as a common formal knowledge representation for both machine and human agents to support reasoning, semantic information fusion and generation of rationale for inferences, in ways that remain transparent to human users. Examples are provided of various alternative styles for user feedback, including NL, CNL and graphical feedback. A pilot experiment with human subjects shows that a prototype conversational agent is able to gather usable CNL information from untrained human subjects.
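
    A toy rendering of the NL-to-CNL step and the confirmation message back to the observer is sketched below. The pattern matching and the controlled-English phrasing are assumptions made for the example and do not reproduce the paper's conversational protocol or CNL syntax.

```python
# Rough illustration only: the regex and the CNL-like phrasing are stand-ins for the
# paper's NL-to-CNL conversion and confirmation flow, not its actual syntax.
import re

def nl_to_cnl(report, observer):
    """Turn a simple eyewitness sentence into a controlled-English-like statement."""
    m = re.search(r"(?:i )?(?:saw|spotted) (?:a |an )?(.+?) (?:at|near) (.+)", report.lower())
    if not m:
        return None
    entity, place = m.groups()
    return f"there is a {entity} that is located at the place {place} " \
           f"and was reported by the person {observer}."

def confirm(cnl):
    """Machine replies in NL so an untrained observer can verify the interpretation."""
    if cnl is None:
        return "Sorry, I could not interpret the report; please rephrase."
    return f"Just to confirm: {cnl} Is that correct?"

report = "I saw a white van near the north gate"
cnl = nl_to_cnl(report, observer="observer-17")
print(cnl)
print(confirm(cnl))
```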

    A unified view of data-intensive flows in business intelligence systems: a survey

    Data-intensive flows are central processes in today’s business intelligence (BI) systems, deploying different technologies to deliver data, from a multitude of data sources, in user-preferred and analysis-ready formats. To meet the complex requirements of next-generation BI systems, we often need an effective combination of the traditionally batched extract-transform-load (ETL) processes that populate a data warehouse (DW) from integrated data sources, and more real-time and operational data flows that integrate source data at runtime. Both academia and industry thus must have a clear understanding of the foundations of data-intensive flows and the challenges of moving towards next-generation BI environments. In this paper we present a survey of today’s research on data-intensive flows and the related fundamental fields of database theory. The study is based on a proposed set of dimensions describing the important challenges of data-intensive flows in the next-generation BI setting. As a result of this survey, we envision an architecture of a system for managing the lifecycle of data-intensive flows. The results further provide a comprehensive understanding of data-intensive flows, recognizing challenges that are still to be addressed, and how current solutions can be applied to address these challenges.
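
    The contrast between batched ETL flows and more operational, runtime flows can be sketched as follows. The sources, transformations and in-memory "warehouse" are hypothetical; the fragment only illustrates the two flow styles discussed in the survey, not a reference architecture.

```python
# Toy contrast between a batched ETL flow and a near-real-time flow; source names,
# transformations and the "warehouse" are hypothetical illustrations.

def extract(source):
    # Hypothetical sources returning raw sales records.
    return [{"source": source, "amount": a} for a in (10, 20, 30)]

def transform(record):
    return {**record, "amount_eur": record["amount"] * 0.9}   # e.g. currency conversion

warehouse = []

def batch_etl(sources):
    """Traditional ETL: extract everything, transform, then load into the DW."""
    for s in sources:
        warehouse.extend(transform(r) for r in extract(s))

def streaming_flow(record_stream):
    """Operational flow: transform and serve each record as it arrives, at runtime."""
    for record in record_stream:
        yield transform(record)

batch_etl(["crm", "erp"])
print(len(warehouse))                       # 6 records loaded in the nightly batch
for enriched in streaming_flow(iter(extract("web_clicks"))):
    print(enriched)                         # records integrated on the fly
```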

    Data mining and fusion

