
    Integrated Models and Tools for Design and Management of Global Supply Chain

    In modern global supply chains, the increasing trends toward product variety, service level, short delivery lead times, and fast response to consumers highlight the importance of configuring smooth and efficient logistics processes and operations. To comply with these purposes, supply chain management (SCM) theory offers a wide set of models, algorithms, procedures, tools, and best practices for the design, management, and control of articulated supply chain networks and logistics nodes. The purpose of this Ph.D. dissertation is to examine in detail the principal aspects and concerns of supply chain networks and warehousing systems, by proposing and illustrating useful methods, procedures, and decision-support tools for the design and management of real instances, such as those currently faced by enterprises. In particular, after a comprehensive literature review of the principal warehousing issues and entities, the manuscript focuses on top-down design procedures for both less-than-unit-load OPS and unit-load storage systems. For both, decision-support software platforms are illustrated as useful tools for optimizing warehousing performance and efficiency metrics. The development of such interfaces enables testing the effectiveness of the proposed hierarchical top-down procedure on large real case studies taken from industry applications. While the largest part of the manuscript deals with micro concerns of warehousing nodes, macro issues related to the planning, design, and management of the whole supply chain are also investigated and discussed. The integration of macro criticalities, such as the design of the supply chain infrastructure and the placement of logistics nodes, with micro concerns, such as the design of warehousing nodes and the management of material handling, is addressed through the definition of integrated models and procedures involving the overall supply chain and the whole product life cycle.
A new integrated perspective should be applied to the study and planning of global supply chains. Each aspect of reality influences the others. Each product consumed by a customer tells a story made of activities, transformations, handling, processes, and travel around the world. Each step of this story accounts for costs, time, resource exploitation, labor, waste, and pollution. The economic and environmental sustainability of the modern global supply chain is the challenge to face.

    Progress in Material Handling Research: 2010

    Table of Contents

    Toward an Engineering Discipline of Warehouse Design

    Warehouses today are complex dynamic engineered systems, incorporating automation, mechanization, equipment, fixtures, computers, networks, products, and people, and they can support the flow of tens or hundreds of thousands of different items to enable fulfilling thousands or tens of thousands of orders daily. In that sense, they represent a design challenge that is not terribly different from the design of other complex dynamic engineered systems, such as a modern passenger airplane, an automobile, or a unique building. What is different is that the design of these other complex dynamic engineered systems typically follows some engineering design discipline. Here, we argue for the development of a corresponding engineering discipline of warehouse design.

    Extraction transformation load (ETL) solution for data integration: a case study of rubber import and export information

    Data integration is important in consolidating all the data inside or outside the organization to provide a unified view of the organization's information. The Extraction Transformation Load (ETL) solution is the back-end process of data integration, which involves collecting data from various data sources, preparing and transforming the data according to business requirements, and loading them into a Data Warehouse (DW). This paper explains the integration of rubber import and export data between the Malaysian Rubber Board (MRB) and the Royal Malaysian Customs Department (Customs) using the ETL solution. Microsoft SQL Server Integration Services (SSIS) and Microsoft SQL Server Agent jobs have been used as the ETL tool and for ETL scheduling, respectively.
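The extract–transform–load pattern the abstract describes can be sketched in a few lines of Python. This is a minimal illustration only (the study itself uses SSIS, not Python), and the record fields, table name, and cleansing rules below are invented for the example:

```python
import sqlite3

# Hypothetical rubber trade records, standing in for the MRB/Customs sources.
source_rows = [
    {"hs_code": "4001.21", "country": "MY", "qty_kg": "12500", "flow": "export"},
    {"hs_code": "4001.22", "country": "TH", "qty_kg": "bad",   "flow": "import"},
    {"hs_code": "4001.29", "country": "ID", "qty_kg": "8000",  "flow": "import"},
]

def extract(rows):
    # In SSIS this stage would read from the source systems or flat files.
    return list(rows)

def transform(rows):
    # Cleanse and conform: reject rows whose quantity is not numeric,
    # and convert kilograms to metric tonnes.
    out = []
    for r in rows:
        try:
            qty_t = int(r["qty_kg"]) / 1000.0
        except ValueError:
            continue  # malformed record, excluded from the warehouse
        out.append((r["hs_code"], r["country"], r["flow"], qty_t))
    return out

def load(rows, conn):
    # Load the conformed rows into a warehouse fact table.
    conn.execute("CREATE TABLE IF NOT EXISTS fact_rubber_trade "
                 "(hs_code TEXT, country TEXT, flow TEXT, qty_tonnes REAL)")
    conn.executemany("INSERT INTO fact_rubber_trade VALUES (?, ?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM fact_rubber_trade").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(source_rows)), conn)
print(loaded)  # 2 — one malformed record is rejected during transformation
```

An ETL scheduler (SQL Server Agent in the paper's setup) would simply run such a pipeline on a timetable.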

    Bosch's Industry 4.0 advanced data analytics: historical and predictive data integration for decision support

    Industry 4.0, characterized by the development of automation and data-exchange technologies, has contributed to an increase in the volume of data generated from various data sources, with great speed and variety. Organizations need to collect, store, process, and analyse this data in order to extract meaningful insights from these vast amounts of data. By overcoming these challenges imposed by what is currently known as Big Data, organizations take a step towards optimizing business processes. This paper proposes a Big Data Analytics architecture as an artefact for the integration of historical data - from the organizational business processes - and predictive data - obtained by the use of Machine Learning models - providing an advanced data analytics environment for decision support. To support data integration in a Big Data Warehouse, a data modelling method is also proposed. These proposals were implemented and validated with a demonstration case in a multinational organization, Bosch Car Multimedia in Braga. The obtained results highlight the ability to take advantage of large amounts of historical data enhanced with predictions that support complex decision support scenarios. This work has been supported by FCT - Fundação para a Ciência e a Tecnologia within the Project Scope: UIDB/00319/2020, the Doctoral scholarships PD/BDE/135100/2017 and PD/BDE/135105/2017, and European Structural and Investment Funds in the FEDER component, through the Operational Competitiveness and Internationalization Programme (COMPETE 2020) [Project n.º 039479; Funding Reference: POCI-01-0247-FEDER-039479]. The authors also wish to thank the automotive electronics company staff involved with this project for providing the data and valuable domain feedback. This paper uses icons made by Freepik, from www.flaticon.com
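The core idea of the architecture (storing historical facts and model-generated predictions side by side so one query can span past and future) can be sketched as follows. This is a toy illustration under assumed names: the schema, the "kind" flag, and the naive mean forecast standing in for a Machine Learning model are all invented for the example, not Bosch's actual data model:

```python
from datetime import date, timedelta

# Historical facts from the operational business processes (assumed schema).
historical = [
    {"day": date(2023, 1, 1), "units": 980,  "kind": "actual"},
    {"day": date(2023, 1, 2), "units": 1015, "kind": "actual"},
]

def predict_next(rows, horizon=2):
    # Stand-in for the Machine Learning stage: a naive mean forecast.
    mean = sum(r["units"] for r in rows) / len(rows)
    start = max(r["day"] for r in rows)
    return [{"day": start + timedelta(days=i + 1),
             "units": round(mean),
             "kind": "predicted"} for i in range(horizon)]

# The integrated table holds both record kinds, distinguished by a flag,
# so decision-support queries can scan past and future together.
integrated = historical + predict_next(historical)
print([(r["day"].isoformat(), r["kind"]) for r in integrated])
```

A downstream dashboard can then filter on the flag, or plot both kinds on one time axis.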

    Business intelligence-centered software as the main driver to migrate from spreadsheet-based analytics

    Internship Report presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Knowledge Management and Business Intelligence. Nowadays, companies are handling and managing data in a way that they weren't ten years ago. The data deluge is, as a mere consequence of that, their constant day-to-day challenge - having to create agile and scalable data solutions to tackle this reality. The main trigger of this project was to support the decision-making process of a customer-centered marketing team (called Customer Voice) in Company X by developing a complete, holistic Business Intelligence solution that goes all the way from ETL processes to data visualizations based on that team's business needs. With this context in mind, the focus of the internship was to use BI and ETL techniques to migrate the data stored in spreadsheets - where the team performed its data analysis - and shift the way they see the data into a more dynamic, sophisticated, and suitable form, in order to help them make data-driven strategic decisions. To ensure credibility throughout the development of this project and its subsequent solution, it was necessary to conduct an exhaustive literature review to frame the project in a more realistic and logical way. That being said, this report made use of scientific literature that explains the evolution of ETL workflows, tools, and limitations across different time periods and generations, how ETL was transformed from manual to real-time data tasks together with data warehouses, the importance of data quality and, finally, the relevance of ETL process optimization and new ways of approaching data integration using modern, cloud architectures.

    Augmenting data warehousing architectures with Hadoop

    Dissertation presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Information Systems and Technologies Management. As the volume of available data increases exponentially, traditional data warehouses struggle to transform this data into actionable knowledge. Data strategies that include the creation and maintenance of data warehouses have a lot to gain by incorporating technologies from the Big Data spectrum. Hadoop, as a transformation tool, can add a theoretically infinite dimension of data processing, feeding transformed information into traditional data warehouses that ultimately retain their value as central components in organizations' decision support systems. This study explores the potential of Hadoop as a data transformation tool in the setting of a traditional data warehouse environment. Hadoop's execution model, which is oriented toward distributed parallel processing, offers great capabilities when the amounts of data to be processed require the infrastructure to expand. Horizontal scalability, a key aspect of a Hadoop cluster, allows for proportional growth in processing power as the volume of data increases. Through the use of Hive on Tez in a Hadoop cluster, this study transforms television viewing events, extracted from Ericsson's Mediaroom Internet Protocol Television infrastructure, into pertinent audience metrics, such as Rating, Reach, and Share. These measurements are then made available in a traditional data warehouse, supported by a traditional Relational Database Management System, where they are presented through a set of reports. The main contribution of this research is a proposed augmented data warehouse architecture where the traditional ETL layer is replaced by a Hadoop cluster, running Hive on Tez, with the purpose of performing the heaviest transformations that convert raw data into actionable information.
By typifying the SQL statements responsible for the data transformation processes, we were able to show that Hadoop and its distributed processing model deliver outstanding performance in the analytical layer, namely in the aggregation of large data sets. Ultimately, we demonstrate, empirically, the performance gains that can be extracted from Hadoop, in comparison to an RDBMS, regarding speed, storage usage, and scalability potential, and suggest how this can be used to evolve data warehouses into the age of Big Data.
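The kind of aggregation the study runs on Hive on Tez can be illustrated in plain Python over a handful of viewing events. The event tuples, the panel universe size, and the metric definitions below are textbook-style simplifications invented for this sketch, not the study's actual Mediaroom data or Hive queries:

```python
# Hypothetical viewing events: (viewer_id, channel, minutes_watched).
events = [
    ("v1", "A", 30), ("v1", "B", 10),
    ("v2", "A", 20), ("v3", "B", 40),
]
UNIVERSE = 10  # assumed total panel size

def reach(evts, channel):
    # Reach: fraction of the universe that watched the channel at all.
    viewers = {v for v, c, _ in evts if c == channel}
    return len(viewers) / UNIVERSE

def share(evts, channel):
    # Share: the channel's viewed minutes as a fraction of all viewed minutes.
    total = sum(m for _, _, m in evts)
    ch = sum(m for _, c, m in evts if c == channel)
    return ch / total

print(reach(events, "A"), share(events, "A"))  # 0.2 0.5
```

In the study these aggregations are expressed as SQL (GROUP BY over large event tables), which is exactly the workload where Hive's distributed execution pays off.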

    A Framework for Applying Reinforcement Learning to Deadlock Handling in Intralogistics

    Intralogistics systems, while complex, are crucial for a range of industries. One of their challenges is deadlock situations, which can disrupt operations and decrease efficiency. This paper presents a four-stage framework for applying reinforcement learning algorithms to manage deadlocks in such systems. The stages include Problem Formulation, Model Selection, Algorithm Selection, and System Deployment. We carefully identify the problem, select an appropriate model to represent the system, choose a suitable reinforcement learning algorithm, and finally deploy the solution. Our approach provides a structured method for tackling deadlocks, improving system resilience and responsiveness. This comprehensive guide can serve researchers and practitioners alike, offering a new avenue for enhancing intralogistics performance. Future research can explore the framework's effectiveness and applicability across different systems.
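The Algorithm Selection and Deployment stages can be illustrated with tabular Q-learning on a toy deadlock scenario. Everything here is invented for the sketch (the states, the hand-crafted dynamics, and the choice of Q-learning itself are assumptions, not the paper's concrete instantiation): a vehicle at a contested crossing should learn to wait rather than drive into a deadlock.

```python
import random

random.seed(0)

STATES = ["clear", "contested", "deadlock", "passed"]
ACTIONS = ["go", "wait"]

def step(state, action):
    # Hand-crafted toy dynamics: going into a contested crossing causes a
    # deadlock (large penalty); waiting clears it at a small time cost.
    if state == "clear":
        return ("passed", 1.0) if action == "go" else ("clear", -0.1)
    if state == "contested":
        return ("deadlock", -10.0) if action == "go" else ("clear", -0.1)
    return (state, 0.0)  # deadlock/passed are terminal

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.2  # learning rate, discount, exploration

for _ in range(500):  # training episodes
    s = random.choice(["clear", "contested"])
    while s not in ("deadlock", "passed"):
        a = (random.choice(ACTIONS) if random.random() < eps
             else max(ACTIONS, key=lambda x: Q[(s, x)]))
        s2, r = step(s, a)
        best_next = max(Q[(s2, x)] for x in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# The learned policy should wait at a contested crossing and go when clear.
print(Q[("contested", "wait")] > Q[("contested", "go")])
print(Q[("clear", "go")] > Q[("clear", "wait")])
```

A real deployment would replace the toy dynamics with the intralogistics system model chosen in the Model Selection stage, and the table with a function approximator if the state space is large.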