
    Business intelligence-centered software as the main driver to migrate from spreadsheet-based analytics

    Internship Report presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Knowledge Management and Business Intelligence. Nowadays, companies are handling and managing data in a way that they weren't ten years ago. The data deluge is, as a consequence, their constant day-to-day challenge: they must create agile and scalable data solutions to tackle this reality. The main trigger of this project was to support the decision-making process of a customer-centered marketing team (called Customer Voice) at Company X by developing a complete, holistic Business Intelligence solution that goes all the way from ETL processes to data visualizations based on that team's business needs. With this context in mind, the focus of the internship was to use BI and ETL techniques to migrate their data out of the spreadsheets where they performed data analysis, and to shift the way they see the data into a more dynamic, sophisticated, and suitable form that helps them make data-driven strategic decisions. To ensure credibility throughout the development of this project and its resulting solution, it was necessary to conduct an exhaustive literature review to frame the project in a realistic and logical way. That being said, this report draws on scientific literature that explains the evolution of ETL workflows, tools, and limitations across different periods and generations; how ETL was transformed from manual to real-time data tasks together with data warehouses; the importance of data quality; and, finally, the relevance of ETL process optimization and new ways of approaching data integration using modern cloud architectures.
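The spreadsheet-to-warehouse migration the report describes follows the classic extract-transform-load pattern. A minimal sketch in Python, using the standard library only (the column names, sample data, and data-quality rule are illustrative, not taken from the report):

```python
import csv
import io
import sqlite3

# Hypothetical CSV export from the marketing team's spreadsheet.
SPREADSHEET = """customer_id,survey_date,nps_score
C001,2021-03-01,9
C002,2021-03-01,
C003,2021-03-02,7
"""

def extract(text):
    # Extract: parse the raw spreadsheet export into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: a simple data-quality step that drops rows with
    # missing scores and casts the score to an integer.
    clean = []
    for r in rows:
        if r["nps_score"]:
            clean.append((r["customer_id"], r["survey_date"], int(r["nps_score"])))
    return clean

def load(rows, conn):
    # Load: persist the cleaned rows into a warehouse-style table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS nps "
        "(customer_id TEXT, survey_date TEXT, score INTEGER)"
    )
    conn.executemany("INSERT INTO nps VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SPREADSHEET)), conn)
print(conn.execute("SELECT COUNT(*) FROM nps").fetchone()[0])  # → 2
```

In a production pipeline the in-memory SQLite database would be replaced by the actual warehouse, and the extract step would read the real workbook, but the three-stage shape stays the same.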

    A Building Information Modeling (BIM)-centric Digital Ecosystem for Smart Airport Life Cycle Management

    An increasing number of new airport infrastructure construction and improvement projects are being delivered in today's modern world. However, value creation is a recurring issue due to inefficiencies in managing capital expenditures (CapEx) and operating expenses (OpEx) while trying to optimize the project constraints of scope, time, cost, quality, and resources. In this new era of smart infrastructure, digitalization transforms the way projects are planned and delivered. Building Information Modeling (BIM) is a key digital process technique that has become an imperative for today's Architecture, Engineering, Construction and Operations (AECO) sector. This research proposes a BIM-centric digital ecosystem by detailing technical and strategic aspects of Airport BIM implementation and digital technology integration from a life cycle perspective. It provides a novel approach for consistent and continuous use of digital information between the business and functional levels of an airport by developing a digital platform solution that enables a seamless flow of information across functions. Accordingly, this study aims to achieve three objectives: (1) to provide scalable know-how of BIM-enabled digital transformation; (2) to guide airport owners and major stakeholders towards converging information siloes for airport life cycle data management through an Airport BIM Framework; and (3) to develop a BIM-based digital platform architecture towards the realization of an airport digital twin for airport infrastructure life cycle management. Airport infrastructures can be considered a System of Systems (SoS). As such, Model-Based Systems Engineering (MBSE) with the Systems Modeling Language (SysML) is selected as the key methodology for designing the digital ecosystem. Applying MBSE principles leads to an integrating framework for managing the digital ecosystem.
Furthermore, this research adopts convergent parallel mixed methods to collect and analyze multiple forms of data. Data collection tools include an extensive literature and industry review; an online questionnaire; semi-structured interviews with airport owner parties; focus group discussions; first-hand observations; and document reviews. The data analysis stage includes multiple explanatory case study analyses, thematic analysis, project mapping, and percent coverage analysis for coded themes to achieve Objective 1; thematic analysis, cluster analysis, framework analysis, and non-parametric statistical analysis for Objective 2; and qualitative content analysis and non-parametric statistical analysis for Objective 3. This research presents a novel roadmap toward the facilitation of smart airports through the alignment and integration of disruptive technologies with the business and operational aspects of airports. Multiple comprehensive case study analyses of international large-hub airports and triangulation of organization-level and project-level results systematically generate scalable technical and strategic guidelines for BIM implementation. The proposed platform architecture will incentivize major stakeholders toward value creation, data sharing, and control throughout a project life cycle. Introducing scalability and minimizing complexity for end users through a digital platform approach will lead to a more connected environment. Consequently, a digital ecosystem enables sophisticated interaction between people, places, and assets. The model-driven approach provides an effective strategy for enhanced decision-making that helps optimize project resources and allows fast adaptation to emerging business and operational demands. Accordingly, airport sustainability measures (economic vitality, operational efficiency, natural resources, and social responsibility) will improve due to higher levels of efficiency in CapEx and OpEx.
Changes in business models for large capital investments and the introduction of sustainability to supply chains are among the anticipated broader impacts of this study.

    Workflow Behavior Auditing for Mission Centric Collaboration

    Successful mission-centric collaboration depends on situational awareness in an increasingly complex mission environment. To support timely and reliable high-level mission decisions, auditing tools need real-time data for effective assessment and optimization of mission behaviors. In the context of a battle rhythm, mission health can be measured from workflow-generated activities. Though battle rhythm collaboration is dynamic and global, a potential enabling technology for workflow behavior auditing exists in process mining. However, process mining alone is not adequate to provide mission situational awareness in the battle rhythm environment, since event logs may contain dynamic mission states, noise, and timestamp inaccuracy. Therefore, we address a few key near-term issues. In sequences of activities parsed from network traffic streams, we identify mission state changes with the workflow shift detection algorithm. In segments of unstructured event logs that contain both noise and relevant workflow data, we extract and rank workflow instances for the process analyst. To confront timestamp inaccuracy in event logs from semi-automated, distributed workflows, we develop the flower chain network and discovery algorithm to improve behavioral conformance. For long-term adoption of process mining in mission-centric collaboration, we develop and demonstrate an experimental framework for logging uncertainty testing. We show that it is highly feasible to employ process mining techniques in environments with dynamic mission states and logging uncertainty. Future workflow behavior auditing technology will benefit from continued algorithmic development, new data sources, and system prototypes to propel next-generation mission situational awareness, giving commanders new tools to assess and optimize workflows, computer systems, and missions in the battlespace environment.
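The dissertation's workflow shift detection algorithm is not reproduced in the abstract; one generic way to flag a mission state change in an activity sequence is to compare activity-frequency distributions in adjacent sliding windows and report positions where they diverge sharply. A sketch under that assumption (window size, threshold, and the toy log are illustrative):

```python
from collections import Counter

def distribution_distance(a, b):
    # L1 distance between two normalized activity-frequency distributions.
    keys = set(a) | set(b)
    total_a, total_b = sum(a.values()), sum(b.values())
    return sum(abs(a[k] / total_a - b[k] / total_b) for k in keys)

def detect_shifts(events, window=4, threshold=0.8):
    # Slide a before/after window pair over the event sequence and flag
    # indices where the activity mix changes abruptly.
    shifts = []
    for i in range(window, len(events) - window + 1):
        before = Counter(events[i - window:i])
        after = Counter(events[i:i + window])
        if distribution_distance(before, after) >= threshold:
            shifts.append(i)
    return shifts

# Toy log: a planning phase followed by an execution phase.
log = ["plan", "brief", "plan", "brief", "strike", "assess", "strike", "assess"]
print(detect_shifts(log))  # → [4]
```

Real event logs would add noise filtering and timestamp handling on top of a detector like this, which is exactly the gap the research addresses.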

    EXPLOITING KASPAROV'S LAW: ENHANCED INFORMATION SYSTEMS INTEGRATION IN DOD SIMULATION-BASED TRAINING ENVIRONMENTS

    Despite recent advances in the representation of logistics considerations in DOD staff training and wargaming simulations, logistics information systems (IS) remain underrepresented. Unlike many command and control (C2) systems, which can be integrated with simulations through common protocols (e.g., OTH-Gold), many logistics ISs require manpower-intensive human-in-the-loop (HitL) processes for simulation-IS (sim-IS) integration. Where automated sim-IS integration has been achieved, it often does not simulate important sociotechnical system (STS) dynamics, such as information latency and human error, presenting decision-makers with an unrealistic representation of logistics C2 capabilities in context. This research seeks to overcome the limitations of conventional sim-IS interoperability approaches by developing and validating a new approach for sim-IS information exchange through robotic process automation (RPA). RPA software supports the automation of IS information exchange through ISs' existing graphical user interfaces. This "outside-in" approach to IS integration mitigates the need for engineering changes in ISs (or simulations) for automated information exchange. In addition to validating the potential for an RPA-based approach to sim-IS integration, this research presents recommendations for a Distributed Simulation Engineering and Execution Process (DSEEP) overlay to guide the engineering and execution of sim-IS environments.
    Major, United States Marine Corps
    Approved for public release. Distribution is unlimited.
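The "outside-in" pattern described above amounts to an adapter that replays simulation events into an information system through its user-facing interface, optionally injecting sociotechnical effects such as entry errors. A minimal sketch (all class and function names are hypothetical; the real bot would drive the IS's actual GUI with an RPA toolkit rather than a Python stub):

```python
import random

class LogisticsIS:
    """Stand-in for the real logistics IS; in practice an RPA bot would
    fill forms in the system's existing graphical user interface."""
    def __init__(self):
        self.records = []

    def submit_request(self, payload):
        self.records.append(payload)

def rpa_replay(events, system, error_rate=0.1, rng=None):
    # Replay simulation events into the IS, dropping a configurable
    # fraction to model human/entry error in the sociotechnical system.
    rng = rng or random.Random(0)
    dropped = 0
    for event in events:
        if rng.random() < error_rate:
            dropped += 1          # simulated failed or erroneous entry
            continue
        system.submit_request(event)
    return dropped

is_stub = LogisticsIS()
events = [{"unit": f"U{i}", "item": "fuel"} for i in range(20)]
dropped = rpa_replay(events, is_stub, error_rate=0.1)
print(len(is_stub.records), dropped)
```

Latency could be modeled the same way by delaying each `submit_request` call; the key design point is that neither the simulation nor the IS needs code changes.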

    Solid-state microwave processor for food treatment

    Uneven heating and hot spots, irregular matching conditions, and deterioration of organoleptic qualities are typical drawbacks of magnetron-based food processing with microwave radiation. The proposed "Kopernicook" modular architecture, based on multiple solid-state generators governed by a distributed software platform, allows highly accurate parametric control, full customization of radiation patterns, and dynamic self-regulating workflows. The first results, validated with industrial applications, show great flexibility of operation, optimal energy consumption, and different ideas for future developments in terms of radiation patterns and feedback-triggered algorithms aimed at maximally efficient processes.
    Fiore, M.; Di Modugno, N.; Pellegrini, F.; Roselli, M. (2019). Solid-state microwave processor for food treatment. In AMPERE 2019: 17th International Conference on Microwave and High Frequency Heating. Editorial Universitat Politècnica de València. 152-158. https://doi.org/10.4995/AMPERE2019.2019.9862
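The feedback-triggered control the paper alludes to can be illustrated with a per-generator loop: each solid-state source's power level is adjusted from zone temperature feedback so the load heats evenly. The simple proportional rule and all numbers below are illustrative, not the Kopernicook implementation:

```python
def adjust_power(levels, temps, target, gain=0.5, max_power=100.0):
    """Proportional control per generator: raise power where a zone is
    below the target temperature, lower it where the zone overshoots,
    clamped to the generator's output range."""
    new_levels = []
    for power, temp in zip(levels, temps):
        power = power + gain * (target - temp)
        new_levels.append(min(max_power, max(0.0, power)))
    return new_levels

# Three generators at 50% power; zones at 60, 75, and 90 degrees,
# target 80: the cold zone gains power, the hot zone loses it.
print(adjust_power([50.0, 50.0, 50.0], [60.0, 75.0, 90.0], target=80.0))
# → [60.0, 52.5, 45.0]
```

A production controller would run this loop continuously against real sensor feedback and likely use a richer control law, but the per-zone structure is what the multi-generator architecture enables over a single magnetron.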

    Conformance Checking and Simulation-based Evolutionary Optimization for Deployment and Reconfiguration of Software in the Cloud

    Many SaaS providers nowadays want to leverage the cloud's capabilities for their existing applications as well, for example to enable sound scalability and cost-effectiveness. This thesis provides the CloudMIG approach, which supports SaaS providers in migrating those applications to IaaS- and PaaS-based cloud environments. CloudMIG consists of a step-by-step process and focuses on two core components. (1) Restrictions imposed by specific cloud environments, so-called cloud environment constraints (CECs), such as limited file system access or forbidden method calls, can be validated by an automatic conformance checking approach. (2) A cloud deployment option (CDO) determines which cloud environment, cloud resource types, deployment architecture, and runtime reconfiguration rules for exploiting a cloud's elasticity should be used. The implied performance and costs can differ by orders of magnitude. CDOs can be automatically optimized with the help of our simulation-based genetic algorithm CDOXplorer. Extensive lab experiments and an experiment in an industrial context show CloudMIG's applicability and the excellent performance of its two core components.
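CDOXplorer itself is not reproduced here; the following toy sketch (instance sizes, prices, and the fitness model are entirely hypothetical) illustrates the underlying idea of searching cloud deployment options with a simulation-driven genetic algorithm, where fitness trades a simulated performance penalty against running cost:

```python
import random

# Hypothetical instance catalog: name -> (capacity units, price per hour).
SIZES = {"small": (1.0, 0.05), "medium": (2.0, 0.12), "large": (4.0, 0.25)}

def fitness(cdo, demand=6.0):
    # Stand-in for a performance/cost simulation: under-provisioning is
    # penalized heavily, and every instance-hour adds cost. Lower is better.
    size, count = cdo
    capacity, price = SIZES[size]
    latency_penalty = max(0.0, demand - capacity * count) * 10
    return latency_penalty + price * count

def evolve(generations=30, pop_size=12, rng=random.Random(1)):
    # A CDO chromosome is (instance size, instance count).
    pop = [(rng.choice(list(SIZES)), rng.randint(1, 8)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]       # selection
        children = []
        for size, count in survivors:
            if rng.random() < 0.5:
                size = rng.choice(list(SIZES))  # mutate instance type
            count = max(1, count + rng.choice([-1, 0, 1]))  # mutate count
            children.append((size, count))
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, round(fitness(best), 2))
```

The real CDOXplorer evaluates candidates against simulated workload traces and also evolves reconfiguration rules, but the select-mutate-resimulate loop has this shape.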

    Interoperability of Enterprise Software and Applications


    An Approach for Guiding Developers to Performance and Scalability Solutions

    This thesis proposes an approach that enables developers who are novices in software performance engineering to solve software performance and scalability problems without the assistance of a software performance expert. The contribution of this thesis is the explicit consideration of the implementation level when recommending solutions for software performance and scalability problems. This includes a set of description languages for data representation and human-computer interaction, as well as a workflow.

    SCADA and related technologies

    Presented at SCADA and related technologies for irrigation district modernization, II: a USCID water management conference held on June 6-9, 2007 in Denver, Colorado.
    SCADA systems in irrigation districts have focused on remote monitoring and remote control. In many districts, the remote control is manual, but in others the automation of structures is enabled through the use of distributed control for the automation of individual structures. This paper presents the concept of an expanded, "umbrella" SCADA system that will perform the standard functions of remote control and remote monitoring and will also incorporate information flow in the field for operators. The umbrella SCADA system will mesh equipment-to-equipment information into an equipment-program-personnel network.