3,940 research outputs found

    A family of experiments to validate measures for UML activity diagrams of ETL processes in data warehouses

    In data warehousing, Extract, Transform, and Load (ETL) processes are in charge of extracting the data from the data sources that will be contained in the data warehouse. Their design and maintenance are thus a cornerstone of any data warehouse development project. Given their relevance, the quality of these processes should be formally assessed early in development in order to avoid populating the data warehouse with incorrect data. To this end, this paper presents a set of measures with which to evaluate the structural complexity of ETL process models at the conceptual level. The study is accompanied by the application of formal frameworks and by a family of experiments whose aims are, respectively, to theoretically and empirically validate the proposed measures. Our experiments show that the use of these measures can help designers predict the effort associated with the maintenance tasks of ETL processes and make ETL process models more usable. Our work is based on Unified Modeling Language (UML) activity diagrams for modeling ETL processes, and on the Framework for the Modeling and Evaluation of Software Processes (FMESP) for the definition and validation of the measures.
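
    The paper's concrete measures are not reproduced in the abstract, so the following is only an illustrative sketch of what counting-based structural complexity measures over an ETL activity diagram might look like; the diagram representation, metric names (NA, ND, NE, CC), and example ETL steps are all assumptions, not the paper's definitions.

```python
# Hypothetical in-memory representation of a UML activity diagram for an
# ETL process, plus generic counting-based structural complexity metrics.
from dataclasses import dataclass, field

@dataclass
class ActivityDiagram:
    activities: set[str] = field(default_factory=set)         # ETL steps
    decisions: set[str] = field(default_factory=set)          # branch nodes
    edges: set[tuple[str, str]] = field(default_factory=set)  # control flow

def structural_complexity(d: ActivityDiagram) -> dict[str, int]:
    nodes = len(d.activities) + len(d.decisions)
    edges = len(d.edges)
    return {
        "NA": len(d.activities),  # number of activities
        "ND": len(d.decisions),   # number of decision nodes
        "NE": edges,              # number of control-flow edges
        "CC": edges - nodes + 2,  # cyclomatic-style complexity
    }

etl = ActivityDiagram(
    activities={"extract_orders", "clean_dates", "load_fact_sales"},
    decisions={"valid_row?"},
    edges={("extract_orders", "clean_dates"),
           ("clean_dates", "valid_row?"),
           ("valid_row?", "load_fact_sales"),
           ("valid_row?", "extract_orders")},  # reject-and-retry loop
)
print(structural_complexity(etl))  # {'NA': 3, 'ND': 1, 'NE': 4, 'CC': 2}
```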

    Model-driven performance evaluation for service engineering

    Service engineering and service-oriented architecture, as integration and platform technologies, are a recent approach to software systems integration. Software quality aspects such as performance are of central importance for the integration of heterogeneous, distributed service-based systems. Empirical performance evaluation is the process of measuring and calculating performance metrics of the implemented software. We present an approach for the empirical, model-based performance evaluation of services and service compositions in the context of model-driven service engineering. Temporal database theory is utilised for the empirical performance evaluation of model-driven developed service systems.
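
    As a minimal sketch of the "measuring and calculating performance metrics" step described above: repeatedly invoke a service operation, record response times, and derive summary statistics. The service call here is a stand-in (a sleep), not an API from the paper, and the metric choices are assumptions.

```python
import time
import statistics

def call_service() -> None:
    time.sleep(0.01)  # placeholder for a real (composed) service invocation

def measure(n_runs: int = 100) -> dict[str, float]:
    samples = []
    for _ in range(n_runs):
        start = time.perf_counter()
        call_service()
        samples.append(time.perf_counter() - start)  # response time in seconds
    samples.sort()
    return {
        "mean_s": statistics.mean(samples),
        "p95_s": samples[int(0.95 * (len(samples) - 1))],  # 95th percentile
        "max_s": samples[-1],
    }

print(measure(50))
```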

    Assessing the environmental impact of logistics sites through CO2eq footprint computation

    The environmental sustainability of logistics facilities is widely acknowledged as an important issue, but a comprehensive standardised methodology for assessing their environmental impact is lacking. This study proposes a structured model for quantifying both consumption and the greenhouse gas (GHG) emissions generated, adopting a three-phase methodology that combines multiple methods. A literature-based conceptual framework was leveraged to design an analytical model, and in-depth interviews with 11 senior logistics managers were conducted. The study offers a replicable methodology that considers heterogeneous sources of consumption and related end-use types, further splitting consumption and emissions by the warehouse's functional areas. It offers a set of Environmental Performance Indicators (EPIs) that could foster a clearer understanding of warehouse environmental performance. A robust tool is offered to managers to support their decision-making processes, allowing for both internal assessments and benchmarking against competitors or other players along the supply chain, thus helping to shape a company's, or even a supply chain's, sustainability strategy.
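
    The abstract does not give the model's formulas, but the underlying CO2eq computation is of a standard form: total emissions are the sum, over energy sources and functional areas, of consumption multiplied by an emission factor. The sketch below illustrates this; the emission factors, functional areas, and quantities are assumed for illustration, not values from the study.

```python
EMISSION_FACTORS = {  # kg CO2eq per unit consumed (assumed, grid-dependent)
    "electricity_kwh": 0.23,
    "natural_gas_m3": 2.03,
    "diesel_l": 2.68,
}

# Consumption broken down by warehouse functional area, as the model suggests.
consumption = {
    "receiving": {"electricity_kwh": 12_000, "diesel_l": 800},
    "storage":   {"electricity_kwh": 45_000, "natural_gas_m3": 3_000},
    "picking":   {"electricity_kwh": 20_000},
    "shipping":  {"electricity_kwh": 8_000, "diesel_l": 1_500},
}

def co2eq_by_area(cons: dict) -> dict[str, float]:
    # For each area, sum quantity * emission factor over its energy sources.
    return {
        area: sum(qty * EMISSION_FACTORS[src] for src, qty in srcs.items())
        for area, srcs in cons.items()
    }

by_area = co2eq_by_area(consumption)
print(by_area)
print("total kg CO2eq:", round(sum(by_area.values()), 1))
```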

    Managing Warehouse Utilization: An Analysis of Key Warehouse Resources

    The warehousing industry is extremely important to businesses and the economy as a whole, and while there is a great deal of literature exploring individual operations within warehouses, such as warehouse layout and design, order picking, etc., there is very little literature exploring warehouse operations from a systems approach. This study uses the Theory of Constraints (TOC) to develop a focused resource management approach to increasing warehouse capacity and throughput, and thus overall warehouse performance, in an environment of limited warehouse resources. While TOC was originally developed for reducing operational bottlenecks in manufacturing, it has allowed companies in other industries, such as banking, health care, and the military, to save millions of dollars. However, the use of TOC has been limited to case studies and individual situations, which typically are not generalizable. Since the basic steps of TOC are iterative in nature and were not designed for survey research, modifications to the original theory are necessary in order to provide insight into industry-wide problems. This study further develops TOC's logistics paradigm and modifies it for use with survey data, which was collected from a sample of warehouse managers. Additionally, it provides a process for identifying potentially constrained key warehouse resources, which serves as a foundation of this study. The findings confirm that TOC's methods of focused resource capacity management and goods-flow scheduling coordination with supply chain partners can be an important approach for warehouse managers to use in overcoming resource capacity constraints to increase warehouse performance.
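
    TOC's first focusing step is to identify the constraint. A minimal sketch of that step applied to key warehouse resources follows: given demand and capacity per resource, flag the most utilised resource as the likely bottleneck. The resource names and figures are hypothetical, not drawn from the study's survey data.

```python
resources = {  # demand vs. capacity, e.g. in pallet-moves per day (assumed)
    "dock_doors":    {"demand": 540, "capacity": 600},
    "forklifts":     {"demand": 880, "capacity": 820},
    "pickers":       {"demand": 700, "capacity": 750},
    "storage_slots": {"demand": 910, "capacity": 1_000},
}

# Utilisation > 100% means demand exceeds capacity: a candidate constraint.
utilisation = {r: v["demand"] / v["capacity"] for r, v in resources.items()}
bottleneck = max(utilisation, key=utilisation.get)

for r, u in sorted(utilisation.items(), key=lambda kv: -kv[1]):
    print(f"{r:14s} {u:6.1%}" + ("  <- constraint" if r == bottleneck else ""))
```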

    A UML Profile for Variety and Variability Awareness in Multidimensional Design: An application to Agricultural Robots

    Variety and variability are an inherent source of information wealth in schemaless sources, and executing OLAP sessions on multidimensional data in their presence has recently become an object of research. However, all models devised so far propose a "rigid" view of the multidimensional content, without taking variety and variability into account. To fill this gap, in this paper we propose V-ICSOLAP, an extension of the ICSOLAP UML profile that supports extensibility and type/name variability for each multidimensional element, as well as complex data types for measures and levels. The real case study we use to motivate and illustrate our approach is trajectory analysis for agricultural robots. As a proof of concept for V-ICSOLAP, we propose an implementation that relies on the PostgreSQL multi-model DBMS and evaluate its performance. We also validate our UML profile by ranking it against other meta-models based on a set of quality metrics.
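
    The paper's actual V-ICSOLAP implementation is not shown in the abstract; the following only sketches the general idea of OLAP-style aggregation over variable, schemaless multidimensional data in a PostgreSQL multi-model setting, using a JSONB payload column. The table name, column names, measure names, and connection string are all hypothetical.

```python
import psycopg2  # standard PostgreSQL driver, assumed available

# Each trajectory point stores its variable attributes in a JSONB payload,
# so points from different robot models may carry different measures.
QUERY = """
SELECT payload ->> 'field_id'                  AS field,
       AVG((payload ->> 'speed_kmh')::numeric) AS avg_speed
FROM   robot_trajectory_points
WHERE  payload ? 'speed_kmh'  -- tolerate points that lack this measure
GROUP  BY payload ->> 'field_id';
"""

with psycopg2.connect("dbname=agribots") as conn:  # hypothetical DSN
    with conn.cursor() as cur:
        cur.execute(QUERY)
        for field, avg_speed in cur.fetchall():
            print(field, avg_speed)
```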

    Current Trends and Future Directions In The Practice Of High-Level Data Modeling: An Empirical Study

    Large-scale organizations are increasingly promoting more collaborative and collective work practices across organizational borders. A predominant way to achieve better collaboration in large-scale heterogeneous contexts is to establish an integrated and standardized technological infrastructure. Ethnographically inspired studies, on the other hand, have challenged such a perspective and illustrated that generic technology does not fit local contexts and needs to be worked around. Similarly, this paper empirically exemplifies local workarounds and illustrates the ongoing and persistently imperfect integration of a collaborative infrastructure in a global oil and gas company. More importantly, however, our analysis focuses on how integrated technology is used across contexts. We illustrate how local workarounds, as a result of tight technological integration, shape use patterns across contexts. Integrated systems establish interdependencies across contexts; thus, their use implies cross-contextual rather than local enactment. Since the trajectory of enactment is influenced by cross-contextual constraints, our study addresses the existing overemphasis on studying and analysing the use of technology in isolated local contexts. Practically, our study suggests treating workarounds as an intrinsic part of everyday work, to be factored in as an additional cost of making generic technology work in practice.

    Examining Quality Factors Influencing the Success of Data Warehouse

    Increased organizational dependence on data warehouse (DW) systems has driven management attention towards making data warehouse systems a success. However, the successful implementation rate of data warehouse systems is low, and many firms do not achieve their intended goals. A recent study shows that improving and evaluating data warehouse success is one of the top concerns facing IT/DW executives. Nevertheless, there is a lack of research addressing the success of data warehouse systems. In addition, it is important for organizations to learn which qualities need to be emphasized before the actual data warehouse is built, and to determine which aspects of data warehouse success are critical, so that IT/DW executives can devise effective improvement strategies. The purpose of this study is therefore to further the understanding of the factors that are critical to evaluating the success of data warehouse systems. The study develops a comprehensive model for the success of data warehouse systems by adapting the updated DeLone and McLean IS Success Model, relating the quality factors on the one side to the net benefits of the data warehouse on the other. A quantitative method was used to test the research hypotheses with data collected through a web-based survey. The sample consisted of 244 members of The Data Warehousing Institute (TDWI) working in a variety of industries around the world. The questionnaire measured six independent variables (system quality, information quality, service quality, relationship quality, user quality, and business quality) and one dependent variable, the net benefits of data warehouse systems. Descriptive analysis, factor analysis, correlation analysis, and regression analysis supported all hypotheses. The results indicate a statistically significant, positive causal relationship between each quality factor and the net benefits of data warehouse systems, implying that net benefits increase as the overall qualities increase. Yet little thought seems to have been given to what data warehouse success is, what is necessary to achieve it, and what benefits can realistically be expected; it therefore appears plausible that the way data warehouse success is pursued in the future will change.
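
    The study's core analytical step, regressing perceived net benefits on the six quality factors, follows a standard pattern. The sketch below uses synthetic data in place of the survey responses; only the modelling shape mirrors the abstract, and the coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 244  # sample size reported in the abstract

factors = ["system", "information", "service",
           "relationship", "user", "business"]
X = rng.normal(size=(n, len(factors)))             # standardised quality scores
true_beta = np.array([0.3, 0.4, 0.2, 0.1, 0.15, 0.25])  # assumed effects
y = X @ true_beta + rng.normal(scale=0.5, size=n)  # simulated net benefits

# Ordinary least squares with an intercept column.
X1 = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(X1, y, rcond=None)

for name, b in zip(["intercept"] + factors, beta_hat):
    print(f"{name:12s} {b:+.3f}")  # positive estimates support the hypotheses
```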

    Preliminary Results in a Multi-site Empirical Study on Cross-organizational ERP Size and Effort Estimation

    This paper reports on initial findings from an empirical study carried out with representatives of two ERP vendors, six ERP-adopting organizations, four ERP implementation consulting companies, and two ERP research and advisory services firms. Our study's goal was to gain an understanding of the state of the practice in size and effort estimation of cross-organizational ERP projects. Based on key size and effort estimation challenges identified in a previously published literature survey, we explored some of the difficulties, fallacies, and pitfalls these organizations face. We focused on collecting empirical evidence from the participating ERP market players to assess specific facts about current ERP size and effort estimation practices. Our study adopted a qualitative research method based on an asynchronous online focus group.