7,988 research outputs found

    CHORUS Deliverable 2.2: Second report - identification of multi-disciplinary key issues for gap analysis toward EU multimedia search engines roadmap

    Get PDF
    After addressing the state of the art during the first year of Chorus and establishing the existing landscape of multimedia search engines, we identified and analyzed gaps in the European research effort during our second year. In this period we focused on three directions: technological issues, user-centred issues and use cases, and socio-economic and legal aspects. These were assessed through two central studies: first, a concerted vision of the functional breakdown of a generic multimedia search engine, and second, a set of representative use-case descriptions with a related discussion of requirements for technological challenges. Both studies were carried out in cooperation and consultation with the community at large through EC concertation meetings (multimedia search engines cluster), several meetings with our Think-Tank, presentations at international conferences, and surveys addressed to EU project coordinators as well as coordinators of national initiatives. Based on the feedback obtained, we identified two types of gaps: core technological gaps that involve research challenges, and "enablers", which are not necessarily technical research challenges but have an impact on innovation progress. New socio-economic trends are presented, as well as emerging legal challenges.

    10381 Summary and Abstracts Collection -- Robust Query Processing

    Get PDF
    Dagstuhl seminar 10381 on robust query processing (held 19.09.10 - 24.09.10) brought together a diverse set of researchers and practitioners with a broad range of expertise for the purpose of fostering discussion and collaboration regarding causes, opportunities, and solutions for achieving robust query processing. The seminar strove to build a unified view across the loosely-coupled system components responsible for the various stages of database query processing. Participants were chosen for their experience with database query processing and, where possible, their prior work in academic research or in product development towards robustness in database query processing. In order to pave the way to motivate, measure, and protect future advances in robust query processing, seminar 10381 focused on developing tests for measuring the robustness of query processing. In these proceedings, we first review the seminar topics, goals, and results, then present abstracts or notes of some of the seminar break-out sessions. We also include, as an appendix, the robust query processing reading list that was collected and distributed to participants before the seminar began, as well as summaries of a few of those papers that were contributed by some participants

    Wireless sensor data processing for on-site emergency response

    Get PDF
    This thesis is concerned with the problem of processing data from Wireless Sensor Networks (WSNs) to meet the requirements of emergency responders (e.g. Fire and Rescue Services). A WSN typically consists of spatially distributed sensor nodes that cooperatively monitor physical or environmental conditions. Sensor data about these conditions can then be used as part of the input to predict, detect, and monitor emergencies. Although WSNs have demonstrated great potential for facilitating Emergency Response, sensor data cannot be interpreted directly because of its large volume, noise, and redundancy. In addition, emergency responders are not interested in raw data; they are interested in the meaning it conveys. This thesis presents research on processing and combining data from multiple types of sensors, and on combining sensor data with other relevant data, to obtain data of greater quality and information of greater relevance to emergency responders. Current theory and practice in Emergency Response and existing technology aids were reviewed to identify requirements from both application and technology perspectives (Chapter 2). The process of information extraction from sensor data and sensor data fusion techniques were reviewed to identify what constitutes suitable sensor data fusion techniques and the challenges presented in sensor data processing (Chapter 3). A study of Incident Commanders' requirements used a goal-driven task analysis method to identify gaps in the current means of obtaining relevant information during response to fire emergencies, along with a list of opportunities for WSN technology to fill those gaps (Chapter 4). A high-level Emergency Information Management System architecture was proposed, including the main components needed, the interaction between components, and system function specifications at different incident stages (Chapter 5). A set of state-awareness rules was proposed and integrated with a Kalman filter to improve the performance of filtering; the proposed data pre-processing approach achieved both improved outlier removal and quick detection of real events (Chapter 6). A data storage mechanism was proposed to support timely responses to queries regardless of growth in the volume of data (Chapter 7). What can be considered "meaning" (e.g. events) for emergency responders was identified, and a generic emergency event detection model was proposed to identify patterns present in sensor data and associate those patterns with events (Chapter 8). In conclusion, the added benefits that the technical work can provide to current Emergency Response are discussed, and specific contributions and future work are highlighted (Chapter 9).
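    The Chapter 6 combination of state-awareness rules with a Kalman filter can be pictured with a small sketch. The snippet below is only an illustration, not the thesis's implementation: it assumes a one-dimensional constant-value model, an arbitrary residual threshold, and a persistence rule that separates an isolated outlier (discarded) from a sustained deviation (treated as a real event).

        # Illustrative sketch (not the thesis code): a 1-D Kalman filter whose
        # residual is gated by simple state-awareness rules. An isolated spike is
        # rejected as an outlier; a deviation that persists for several samples
        # is accepted as a real event (e.g. a rapid temperature rise).

        OUTLIER_SIGMA = 3.0      # assumed residual threshold, in standard deviations
        EVENT_PERSISTENCE = 3    # assumed number of consecutive flagged samples

        def kalman_with_rules(readings, q=0.05, r=1.0):
            x, p = readings[0], 1.0          # state estimate and its variance
            flagged = 0                      # consecutive out-of-range samples
            events, filtered = [], []
            for t, z in enumerate(readings):
                # Predict step: constant-value model with process noise q.
                p += q
                # Residual and its variance.
                residual = z - x
                s = p + r
                if abs(residual) > OUTLIER_SIGMA * s ** 0.5:
                    flagged += 1
                    if flagged >= EVENT_PERSISTENCE:
                        # Sustained deviation: treat as a real event, re-initialise.
                        events.append(t)
                        x, p, flagged = z, 1.0, 0
                    # Otherwise skip the update: the sample is treated as an outlier.
                else:
                    flagged = 0
                    # Update step: standard Kalman gain.
                    k = p / s
                    x += k * residual
                    p *= (1 - k)
                filtered.append(x)
            return filtered, events

        if __name__ == "__main__":
            stream = [20.1, 20.2, 35.0, 20.3, 20.2, 28.0, 29.5, 31.0, 32.5]
            smoothed, events = kalman_with_rules(stream)
            print("event sample indices:", events)   # the isolated 35.0 is ignored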

    Data semantic enrichment for complex event processing over IoT Data Streams

    Get PDF
    This thesis generalizes techniques for processing IoT data streams, semantically enriching data with contextual information, and performing complex event processing in IoT applications. A case study on ECG anomaly detection and signal classification was conducted to validate the knowledge foundation.
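    A minimal sketch of the semantic enrichment and complex event processing idea, with assumed field names, thresholds, and a simplified heart-rate rule standing in for the ECG case study (none of this is the thesis's actual pipeline):

        # Illustrative sketch (assumed names and thresholds): semantic enrichment
        # of raw IoT readings followed by a simple windowed complex-event rule.

        from collections import deque
        from dataclasses import dataclass

        @dataclass
        class EnrichedReading:
            value: float
            unit: str          # contextual metadata attached during enrichment
            sensor_type: str
            subject_id: str

        def enrich(raw_value, context):
            """Attach static context (unit, sensor type, subject) to a raw reading."""
            return EnrichedReading(raw_value, context["unit"],
                                   context["sensor_type"], context["subject_id"])

        def tachycardia_events(readings, threshold=100.0, window=5):
            """Complex-event rule (assumed): emit an event when every reading in a
            sliding window exceeds the heart-rate threshold."""
            recent = deque(maxlen=window)
            for i, r in enumerate(readings):
                recent.append(r.value)
                if len(recent) == window and min(recent) > threshold:
                    yield {"type": "sustained_high_heart_rate",
                           "subject": r.subject_id, "at_index": i}

        if __name__ == "__main__":
            ctx = {"unit": "bpm", "sensor_type": "ECG-derived heart rate",
                   "subject_id": "patient-42"}
            raw = [82, 97, 104, 108, 111, 115, 118, 96]
            enriched = [enrich(v, ctx) for v in raw]
            for event in tachycardia_events(enriched):
                print(event)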

    iTeleScope: Intelligent Video Telemetry and Classification in Real-Time using Software Defined Networking

    Full text link
    Video continues to dominate network traffic, yet operators today have poor visibility into the number, duration, and resolutions of the video streams traversing their domain. Current approaches are inaccurate, expensive, or unscalable, as they rely on statistical sampling, middle-box hardware, or packet inspection software. We present iTelescope, the first intelligent, inexpensive, and scalable SDN-based solution for identifying and classifying video flows in real time. Our solution is novel in combining dynamic flow rules with telemetry and machine learning, and is built on commodity OpenFlow switches and open-source software. We develop a fully functional system, train it in the lab using multiple machine learning algorithms, and validate its performance, showing over 95% accuracy in identifying and classifying video streams from many providers including YouTube and Netflix. Lastly, we conduct tests to demonstrate its scalability to tens of thousands of concurrent streams, and deploy it live on a campus network serving several hundred real users. Our system gives operators of enterprise and carrier networks unprecedented fine-grained real-time visibility of video streaming performance at very low cost.
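    The paper's combination of per-flow telemetry with machine learning can be sketched roughly as follows. The counters, features, labels, and classifier below are stand-ins chosen for illustration, not the iTelescope design; the idea is simply that rate features derived from periodically polled flow counters give a trained classifier enough signal to separate video from non-video flows.

        # Illustrative sketch (assumptions throughout): periodic per-flow byte and
        # packet counters, as an SDN controller might poll from OpenFlow switches,
        # are turned into simple rate features and fed to an off-the-shelf
        # classifier. Feature set, labels, and model are stand-ins.

        from sklearn.ensemble import RandomForestClassifier

        def flow_features(byte_counts, packet_counts, poll_interval_s=1.0):
            """Per-window throughput, packet rate, and mean packet size."""
            feats = []
            for i in range(1, len(byte_counts)):
                d_bytes = byte_counts[i] - byte_counts[i - 1]
                d_pkts = max(packet_counts[i] - packet_counts[i - 1], 1)
                feats.append([d_bytes / poll_interval_s,   # throughput (B/s)
                              d_pkts / poll_interval_s,    # packet rate (pkt/s)
                              d_bytes / d_pkts])           # mean packet size (B)
            # Aggregate the windows into one fixed-length feature vector.
            n = len(feats)
            return [sum(col) / n for col in zip(*feats)]

        # Tiny hand-made training set: (cumulative bytes, cumulative packets, label).
        flows = [
            ([0, 600_000, 1_250_000, 1_900_000], [0, 450, 930, 1400], "video"),
            ([0, 620_000, 1_200_000, 1_850_000], [0, 470, 900, 1380], "video"),
            ([0, 4_000, 9_000, 15_000],          [0, 30, 70, 110],    "non-video"),
            ([0, 6_000, 11_000, 17_000],         [0, 40, 85, 130],    "non-video"),
        ]
        X = [flow_features(b, p) for b, p, _ in flows]
        y = [label for _, _, label in flows]

        clf = RandomForestClassifier(n_estimators=20, random_state=0).fit(X, y)

        unseen = flow_features([0, 580_000, 1_150_000, 1_800_000], [0, 430, 880, 1350])
        print(clf.predict([unseen]))   # expected: ['video']

    In a real SDN deployment the counters would come from flow-statistics requests to the switches rather than hard-coded lists, and the model would be trained on labelled traces.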

    Develop Best Practices for Designing Internal Business Database-Driven Web Applications

    Get PDF
    When developing with newer technology, it is important for smaller information technology organizations to have a universally accepted set of best practices in order to complete that type of endeavor successfully. How can such a set of best practices be developed? Conducting research on accepted best practices builds the basis for your theories and assumptions. Next, in the context of your applications, develop an example application with the newer technology to test those theories and assumptions. Build the application like a construction project: the initial design is the blueprint, the database is the foundation, and the user interface is the actual building. When you get right down to it, the principles of simplicity, consistency, and user interaction are always best practices in developing applications.
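    The construction metaphor maps naturally onto a layered structure. The sketch below is a minimal illustration using an assumed employees table and Python's standard library; it is not drawn from the paper itself:

        # Minimal layered sketch (assumed schema and names, not from the paper):
        # the schema is the blueprint, the data-access layer the foundation, and
        # a thin presentation function stands in for the user interface.

        import sqlite3

        SCHEMA = """
        CREATE TABLE IF NOT EXISTS employees (
            id    INTEGER PRIMARY KEY,
            name  TEXT NOT NULL,
            dept  TEXT NOT NULL
        );
        """

        def get_connection(path=":memory:"):
            conn = sqlite3.connect(path)
            conn.executescript(SCHEMA)          # blueprint: create the schema once
            return conn

        # Foundation: all SQL lives in one small data-access layer.
        def add_employee(conn, name, dept):
            conn.execute("INSERT INTO employees (name, dept) VALUES (?, ?)",
                         (name, dept))
            conn.commit()

        def employees_by_dept(conn, dept):
            rows = conn.execute(
                "SELECT name FROM employees WHERE dept = ? ORDER BY name", (dept,))
            return [r[0] for r in rows]

        # Building: the user-facing layer never touches SQL directly.
        def render_department(conn, dept):
            names = employees_by_dept(conn, dept)
            return f"{dept}: " + (", ".join(names) if names else "no employees")

        if __name__ == "__main__":
            conn = get_connection()
            add_employee(conn, "Ada", "Engineering")
            add_employee(conn, "Grace", "Engineering")
            print(render_department(conn, "Engineering"))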

    ComplexWorld Position Paper

    Get PDF
    The Complex ATM Position Paper is the common research vehicle that defines the high-level, strategic scientific vision for the ComplexWorld Network. The purpose of this document is to provide an orderly and consistent scientific framework for the WP-E complexity theme. The specific objectives of the position paper are to:
    - analyse the state of the art within the different research areas relevant to the network, identifying the major accomplishments and providing a comprehensive set of references, including the main publications and research projects;
    - include a complete list of techniques, a list of application topics, and an analysis of which techniques are best suited to each of those applications;
    - identify and perform an in-depth analysis of the most promising research avenues and the major research challenges lying at the junction of the ATM and complex systems domains, with particular attention to their impact and potential benefits for the ATM community;
    - identify areas of common interest and synergies with other SESAR activities, with special attention to the research topics covered by other WP-E networks.
    An additional goal for future versions of this position paper is to develop an indicative roadmap for how these research challenges should be addressed, providing a guide on how to leverage different aspects of complexity research in Air Transport.

    Dynamics in Logistics

    Get PDF
    This open access book highlights the interdisciplinary aspects of logistics research. Featuring empirical, methodological, and practice-oriented articles, it addresses the modelling, planning, optimization and control of processes. Chiefly focusing on supply chains, logistics networks, production systems, and systems and facilities for material flows, the respective contributions combine research on classical supply chain management, digitalized business processes, production engineering, electrical engineering, computer science and mathematical optimization. To celebrate 25 years of interdisciplinary and collaborative research conducted at the Bremen Research Cluster for Dynamics in Logistics (LogDynamics), in this book hand-picked experts currently or formerly affiliated with the Cluster provide retrospectives, present cutting-edge research, and outline future research directions

    The Spatial Historian: Creating a Spatially Aware Historical Research System

    Get PDF
    The intent of this study is to design a geospatial information system capable of facilitating the extraction and analysis of the fragmentary snapshots of history contained in hand-written historical documents. This customized system necessarily bypasses off-the-shelf GIS in order to support these unstructured primary historical research materials and to bring long-dormant spatial stories previously hidden in archives, libraries, and other documentary storage locations to life. The software platform discussed here integrates the tasks of information extraction, data management, and analysis while giving primary emphasis to supporting the spatial and humanistic analysis and interpretation of the data contents. The premise of this research is that by integrating the collection of data, the extraction of content, and the analysis of information, tasks that have traditionally formed a post-data-collection analysis and research process, more efficient processing and more effective historical research can be achieved.
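    One small, self-contained way to picture the extraction step is to link place names in a transcribed document to coordinates from a gazetteer, so that each extracted snippet becomes a spatially aware record. The gazetteer, document identifier, and record layout below are assumptions for illustration, not features of the system described above:

        # Illustrative sketch (assumed gazetteer and record format, not the system
        # described above): link place names mentioned in a transcription to
        # coordinates so the extracted snippet becomes a spatially aware record.

        import re

        GAZETTEER = {          # tiny assumed gazetteer: place name -> (lat, lon)
            "Würzburg": (49.79, 9.95),
            "Nuremberg": (49.45, 11.08),
        }

        def extract_place_records(doc_id, transcription):
            """Return one record per gazetteer place mentioned in the transcription."""
            records = []
            for place, (lat, lon) in GAZETTEER.items():
                for match in re.finditer(re.escape(place), transcription):
                    records.append({
                        "doc_id": doc_id,
                        "place": place,
                        "lat": lat,
                        "lon": lon,
                        "context": transcription[max(match.start() - 30, 0):match.end() + 30],
                    })
            return records

        if __name__ == "__main__":
            text = ("The widow travelled from Würzburg to Nuremberg in the spring "
                    "of 1627 to settle her late husband's debts.")
            for rec in extract_place_records("parish-register-017", text):
                print(rec["place"], rec["lat"], rec["lon"])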

    Wireless sensor network as a distributed database

    Get PDF
    Wireless sensor networks (WSNs) have come to play a role in various fields. In-network data processing is one of the most important and challenging techniques because it affects key features of WSNs: energy consumption, node life cycles, and network performance. With in-network processing, an intermediate node or aggregator fuses or aggregates sensor data collected from a group of sensors before transferring it to the base station. The advantage of this approach is that it minimizes the amount of information transferred, which matters given the nodes' limited computational resources. This thesis introduces the development of a hybrid in-network data processing approach for WSNs that satisfies these constraints. An architecture for in-network data processing is proposed at three levels: clustering, data compression, and data mining. At the clustering level, Neighbour-aware Multipath Cluster Aggregation (NMCA) is designed, combining cluster-based and multipath approaches to cope with different packet loss rates. At the compression level, data compression schemes and an Optimal Dynamic Huffman (ODH) algorithm compress data at the cluster head. At the data mining level, a semantic data-mining model for fire detection is developed to extract information from raw data, improve data accuracy, and detect fire events in simulation. A demonstration indoor location system using the in-network data processing approach is built to evaluate the energy reduction achieved by the designed strategy. In conclusion, the added benefits that the technical work can provide for in-network data processing are discussed, and specific contributions and future work are highlighted.
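    The basic trade behind in-network processing, forwarding a compact cluster-level summary instead of raw samples, can be sketched as follows. The node identifiers, summary statistics, and wire format are assumptions for illustration and are not the NMCA or ODH designs described above:

        # Illustrative sketch (assumed names and payload sizes, not the NMCA/ODH
        # designs above): a cluster head fuses its members' raw readings into one
        # compact summary before forwarding towards the base station.

        import struct

        def aggregate_cluster(readings):
            """readings: {node_id: [samples]} collected during one reporting round."""
            samples = [s for node in readings.values() for s in node]
            return {
                "count": len(samples),
                "min": min(samples),
                "max": max(samples),
                "mean": sum(samples) / len(samples),
            }

        def payload_bytes(summary):
            # Assumed wire format: one unsigned int plus three 32-bit floats.
            return len(struct.pack("!Ifff", summary["count"],
                                   summary["min"], summary["max"], summary["mean"]))

        if __name__ == "__main__":
            round_data = {
                "node-1": [21.4, 21.5, 21.6],
                "node-2": [21.2, 21.3, 21.3],
                "node-3": [25.9, 26.4, 27.1],   # a locally warmer node
            }
            summary = aggregate_cluster(round_data)
            raw_size = sum(len(v) for v in round_data.values()) * 4   # 4 bytes/sample
            print(f"raw: {raw_size} B  ->  aggregated: {payload_bytes(summary)} B")
            print(summary)

    Even this crude summary cuts the forwarded payload by more than half; compression at the cluster head, as with ODH above, pushes the reduction further.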