2,185 research outputs found

    Warehousing and Analyzing Streaming Data Quality Information

    The development of integrative IS architectures typically focuses on solving problems related to the functionality of the system: designers attempt to build optimally flexible interfaces in pursuit of the most agile architecture. The quality of the data exchanged across these interfaces is often disregarded (implicitly or explicitly). The result is distributed applications that are functionally correct but cannot be deployed because of the low quality of the data involved. To avoid wrong business decisions caused by ‘dirty data’, quality characteristics have to be captured, processed, and provided to the respective business task. However, how to efficiently provide applications with information about data quality is still an open research problem. Our approach tackles data quality deficiencies by presenting a novel concept for streaming and warehousing data together with the data quality information that describes it.
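    The core idea of attaching quality metadata to streamed records can be sketched as follows. This is a minimal illustration, not the paper's actual design; the record layout, quality dimensions, and threshold are assumptions.

    ```python
    # Minimal sketch (names are illustrative): each stream record carries data
    # quality (DQ) metadata alongside its payload, so consumers can filter or
    # weight values by quality before acting on them.
    from dataclasses import dataclass

    @dataclass
    class DQTag:
        completeness: float  # fraction of expected fields present (0..1)
        timeliness: float    # freshness score (0..1)

    @dataclass
    class Record:
        value: float
        dq: DQTag

    def usable(records, min_completeness=0.9):
        """Keep only records whose quality meets the threshold."""
        return [r for r in records if r.dq.completeness >= min_completeness]

    stream = [
        Record(10.0, DQTag(completeness=1.0, timeliness=0.9)),
        Record(11.5, DQTag(completeness=0.5, timeliness=0.8)),  # partial record
    ]
    print(len(usable(stream)))  # 1
    ```

    A downstream business task would then consume only the records that pass its quality threshold, instead of silently processing dirty data.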

    Limits or Integration? – Manufacturing Execution Systems and Operational Business Intelligence

    Manufacturing Execution Systems (MES) and Operational Business Intelligence (OpBI) analyze and control operational activities in different organizational application fields. This raises the question of how far these concepts are interrelated in the context of company-wide process coordination and analysis. The goal of this paper is the evaluation and conceptual classification of MES and OpBI as a basis for subsequent research. A literature review is conducted to determine whether a relationship between the concepts is acknowledged in academia and to look for research gaps. To this end, a representative number of articles has been extracted from selected scientific databases. The review yields four publications illuminating only single aspects of the correlation. This leads to the conclusion that further research on MES and OpBI is needed.

    Effective Use Methods for Continuous Sensor Data Streams in Manufacturing Quality Control

    This work outlines an approach for managing sensor data streams of continuous numerical data in product manufacturing settings, emphasizing statistical process control, low computational and memory overhead, and saving the information necessary to reduce the impact of nonconformance to quality specifications. While there is extensive literature, knowledge, and documentation about standard data sources and databases, the high volume and velocity of sensor data streams often make traditional analysis infeasible. To that end, an overview of data stream fundamentals is essential. An analysis of commonly used stream preprocessing and load shedding methods follows, succeeded by a discussion of aggregation procedures. Stream storage and querying systems are the next topics. Further, existing machine learning techniques for data streams are presented, with a focus on regression. Finally, the work describes a novel methodology for managing sensor data streams in which data stream management systems save and record aggregate data from small time intervals, along with the individual measurements from the stream that are nonconforming. The aggregates shall be continually entered into control charts and regressed on. To conserve memory, old data shall be periodically reaggregated at higher levels to reduce memory consumption.
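    The described strategy, aggregate each small time window, retain individual out-of-spec readings, and later merge old window aggregates into coarser ones, can be sketched roughly as below. The spec limits, window contents, and aggregate fields are assumptions for illustration, not values from the work.

    ```python
    # Illustrative sketch: per-window aggregates plus retained nonconforming
    # individual measurements, with reaggregation of old windows to save memory.
    from statistics import mean

    LSL, USL = 9.0, 11.0  # assumed lower/upper specification limits

    def summarize(window):
        """Aggregate one time window; keep nonconforming readings individually."""
        agg = {"n": len(window), "mean": mean(window),
               "min": min(window), "max": max(window)}
        nonconforming = [x for x in window if not (LSL <= x <= USL)]
        return agg, nonconforming

    def reaggregate(aggs):
        """Merge several window aggregates into one coarser aggregate."""
        n = sum(a["n"] for a in aggs)
        m = sum(a["mean"] * a["n"] for a in aggs) / n  # weighted mean
        return {"n": n, "mean": m,
                "min": min(a["min"] for a in aggs),
                "max": max(a["max"] for a in aggs)}

    a1, bad1 = summarize([9.8, 10.1, 12.3])  # 12.3 is out of spec
    a2, bad2 = summarize([10.0, 10.2, 9.9])
    coarse = reaggregate([a1, a2])
    print(bad1, coarse["n"])  # [12.3] 6
    ```

    The window aggregates are what would feed the control charts and regressions, while only the nonconforming individual measurements are stored at full resolution.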

    A unified view of data-intensive flows in business intelligence systems : a survey

    Data-intensive flows are central processes in today’s business intelligence (BI) systems, deploying different technologies to deliver data, from a multitude of data sources, in user-preferred and analysis-ready formats. To meet complex requirements of next generation BI systems, we often need an effective combination of the traditionally batched extract-transform-load (ETL) processes that populate a data warehouse (DW) from integrated data sources, and more real-time and operational data flows that integrate source data at runtime. Both academia and industry thus must have a clear understanding of the foundations of data-intensive flows and the challenges of moving towards next generation BI environments. In this paper we present a survey of today’s research on data-intensive flows and the related fundamental fields of database theory. The study is based on a proposed set of dimensions describing the important challenges of data-intensive flows in the next generation BI setting. As a result of this survey, we envision an architecture of a system for managing the lifecycle of data-intensive flows. The results further provide a comprehensive understanding of data-intensive flows, recognizing challenges that still are to be addressed, and how the current solutions can be applied for addressing these challenges.

    Reducing the delivery lead time in a food distribution SME through the implementation of six sigma methodology

    Purpose – Six sigma is a systematic, data-driven approach to reducing defects and improving quality in any type of business. The purpose of this paper is to present the findings from the application of six sigma in a food service “small to medium sized enterprise” (SME) in a lean environment to reduce waste in this field. Design/methodology/approach – A simplified version of six sigma is adopted through the application of appropriate statistical tools in order to focus on customers' requirements, identify the defect and its cause, and improve the delivery process by implementing the optimum solution. Findings – The results suggest that modification in layout utilization reduced the number of causes of defect by 40 percent, raising the process from a 1.44 sigma level to a 2.09 sigma level, a substantial improvement for an SME. Research limitations/implications – The simplicity of six sigma is important in enabling any SME to identify the problem and minimize its cause through a systematic approach. Practical implications – Integrating supply chain objectives with quality initiatives such as lean and six sigma has a substantial effect on achieving targets. Originality/value – This paper represents a potential area in which six sigma methodology, alongside lean management, can promote supply chain management objectives for a food distribution SME.
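    Sigma levels like the 1.44 and 2.09 reported above are conventionally derived from the defect rate via the standard normal inverse CDF plus a 1.5-sigma long-term shift. A minimal sketch of that convention (the DPMO figure below is the textbook 2-sigma value, not data from the paper):

    ```python
    # Hedged sketch of the common sigma-level convention: convert defects per
    # million opportunities (DPMO) to a sigma level using the standard normal
    # inverse CDF plus the conventional 1.5-sigma long-term shift.
    from statistics import NormalDist

    def sigma_level(dpmo):
        """Sigma level under the conventional 1.5-sigma shift."""
        yield_rate = 1 - dpmo / 1_000_000
        return NormalDist().inv_cdf(yield_rate) + 1.5

    print(round(sigma_level(308_537), 2))  # 2.0 (the textbook 2-sigma DPMO)
    ```

    Under this convention, moving from 1.44 to 2.09 sigma corresponds to a large drop in the defect rate, which is why a change of 0.65 sigma is a substantial improvement.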

    Challenging Problems in Data Mining and Data Warehousing

    Data mining is a process used by companies to turn raw data into useful information. By using software to look for patterns in large batches of data, businesses can learn more about their customers, develop more effective marketing strategies, increase sales, and decrease costs. It depends on effective data collection and warehousing as well as computer processing. Data mining is used to analyze patterns and relationships in data based on what users request; for example, data mining software can be used to create classes of information. When companies centralize their data into one database or program, it is known as data warehousing. With a data warehouse, an organization may spin off segments of the data for particular users to utilize. In other cases, analysts may begin with the type of data they want and create a data warehouse based on those specifications. Regardless of how businesses and other entities organize their data, they use it to support management's decision-making processes.
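    "Creating classes of information" from raw data can be illustrated with a toy example. The transactions, customer names, and spend thresholds below are invented for illustration; they are not from the paper.

    ```python
    # Toy illustration: derive customer segments ("classes") from raw
    # transaction data, as a data mining step prior to warehousing the result.
    from collections import defaultdict

    transactions = [("alice", 120.0), ("bob", 15.0),
                    ("alice", 80.0), ("carol", 40.0)]

    # Aggregate raw records per customer.
    totals = defaultdict(float)
    for customer, amount in transactions:
        totals[customer] += amount

    def segment(total):
        """Assign a spend class (thresholds are assumptions)."""
        return "high" if total >= 100 else "mid" if total >= 30 else "low"

    classes = {c: segment(t) for c, t in totals.items()}
    print(classes)  # {'alice': 'high', 'bob': 'low', 'carol': 'mid'}
    ```

    A warehouse could then store these derived classes alongside the raw transactions so marketing analyses do not have to recompute them.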
