264 research outputs found

    Internship Report - Roaming Data Science (RoaDS) BI Solution in a Vodafone Environment

    A telecom company, Vodafone, needed to implement a Business Intelligence solution for Roaming data drawn from a wide set of different data sources. Based on the solution's data visualizations, key users with decision-making power can analyse the business and assess infrastructure and software expansion needs. This document presents the scientific papers produced during the various stages of the solution's development (state of the art, architecture design, and implementation results). The Business Intelligence solution was designed and implemented with OLAP methodologies and technologies in a Data Warehouse composed of Data Marts arranged in a constellation schema, and the visualization layer was custom built in JavaScript (VueJS). To assess the results, a questionnaire was filled in by the solution's key users; based on it, user acceptance was found to be satisfactory. The proposed objectives for the BI solution were achieved with all requirements met, on infrastructure created from scratch in Kubernetes. The platform can be expanded using columnar databases designed specifically for OLAP workloads, removing the need for an OLAP cube layer, and, using Machine Learning algorithms, it will be able to produce the predictions needed to make decisions about Vodafone's Roaming infrastructure.
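
    As a rough illustration of the fact-constellation idea mentioned above, the following Python sketch models two fact tables sharing a conformed dimension and performs a simple roll-up; all table and column names are illustrative assumptions, not taken from the report.

```python
# Minimal sketch (not from the report): a fact-constellation layout for roaming
# data modelled with pandas -- two fact tables sharing a conformed dimension,
# plus a roll-up of traffic volume per partner operator and month.
# All table and column names here are illustrative assumptions.
import pandas as pd

dim_operator = pd.DataFrame({
    "operator_id": [1, 2],
    "operator_name": ["PartnerNet", "GlobalCell"],
    "country": ["DE", "ES"],
})

fact_traffic = pd.DataFrame({          # fact table 1: roaming traffic
    "operator_id": [1, 1, 2],
    "month": ["2023-01", "2023-02", "2023-01"],
    "data_mb": [1200.0, 950.5, 430.0],
})

fact_signalling = pd.DataFrame({       # fact table 2: signalling events, sharing dim_operator
    "operator_id": [1, 2, 2],
    "month": ["2023-01", "2023-01", "2023-02"],
    "events": [15000, 8200, 9100],
})

# Roll-up typical of the dashboard layer: data volume per operator and month.
rollup = (
    fact_traffic.merge(dim_operator, on="operator_id")
    .groupby(["operator_name", "month"], as_index=False)["data_mb"]
    .sum()
)
print(rollup)
```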

    Predicting your next OLAP query based on recent analytical sessions

    In Business Intelligence systems, users interact with data warehouses by formulating OLAP queries aimed at exploring multidimensional data cubes. Being able to predict the most likely next queries would provide a way to recommend interesting queries to users on the one hand, and could improve the efficiency of OLAP sessions on the other. In particular, query recommendation would proactively guide users in data exploration and improve the quality of their interactive experience. In this paper, we propose a framework to predict the most likely next query and recommend it to the user. Our framework relies on a probabilistic user behavior model built by analyzing previous OLAP sessions and exploiting a query similarity metric. To gain insight into the recommendation precision and the parameters it depends on, we evaluate our approach using different quality assessments.
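
    The paper's probabilistic user behaviour model is not specified here; as a hedged illustration only, the following Python sketch reduces it to a first-order Markov model over query identifiers learned from past sessions, omitting the query similarity metric the framework relies on.

```python
# Minimal sketch, assuming a much-simplified version of the paper's idea:
# a first-order Markov model over past OLAP sessions that recommends the
# most likely next query given the current one. The real framework also
# uses a query similarity metric; here queries are just opaque labels.
from collections import Counter, defaultdict

def train(sessions):
    """sessions: list of query-id sequences from past OLAP sessions."""
    transitions = defaultdict(Counter)
    for session in sessions:
        for current_q, next_q in zip(session, session[1:]):
            transitions[current_q][next_q] += 1
    return transitions

def recommend(transitions, current_q, k=3):
    """Return the k most frequent follow-up queries after current_q."""
    return [q for q, _ in transitions[current_q].most_common(k)]

past_sessions = [
    ["q_sales_by_region", "q_sales_by_month", "q_top_products"],
    ["q_sales_by_region", "q_sales_by_month", "q_sales_by_store"],
    ["q_sales_by_month", "q_top_products"],
]
model = train(past_sessions)
print(recommend(model, "q_sales_by_month"))   # ['q_top_products', 'q_sales_by_store']
```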

    Evaluation of SOVAT: An OLAP-GIS decision support system for community health assessment data analysis

    Background. Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable the management and analysis of spatial data, but their traditional database architecture limits analysis of numerical data. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse technology designed to facilitate querying of large numerical data sets. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. Methods. SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for CHA data analysis. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurements included task completion time, success in answering the tasks, and system satisfaction. Results. Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT differed significantly from SPSS-GIS in satisfaction and task time (α = .01, p < .002). Descriptive results indicated that participants had greater success in answering the tasks when using SOVAT than when using SPSS-GIS. Conclusion. Using SOVAT, tasks were completed more efficiently, with a higher rate of success, and with greater satisfaction than with the combined use of SPSS and GIS. The results from this study indicate the potential of OLAP-GIS decision support systems as a valuable tool for CHA data analysis. © 2008 Scotch et al; licensee BioMed Central Ltd.
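
    For readers unfamiliar with the analysis method named above, the sketch below shows how a linear mixed model of task completion time with a per-participant random intercept can be fit in Python with statsmodels; the data are fabricated placeholders, not the study's measurements.

```python
# Minimal sketch of the kind of linear mixed model analysis the study reports,
# comparing task completion time between systems with participant as a random
# effect. The data below are fabricated placeholders, not the study's results;
# only the modelling approach is illustrated. Requires pandas and statsmodels.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3, 4, 4],
    "system": ["SOVAT", "SPSS-GIS"] * 4,
    "task_time_sec": [95, 160, 110, 175, 88, 150, 102, 168],
})

# Fixed effect: system; random intercept per participant.
model = smf.mixedlm("task_time_sec ~ system", df, groups=df["participant"])
result = model.fit()
print(result.summary())
```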

    The Data Lakehouse: Data Warehousing and More

    Relational Database Management Systems designed for Online Analytical Processing (RDBMS-OLAP) have been foundational to democratizing data and enabling analytical use cases such as business intelligence and reporting for many years. However, RDBMS-OLAP systems present some well-known challenges. They are primarily optimized only for relational workloads, they lead to a proliferation of data copies that can become unmanageable, and, since the data is stored in proprietary formats, they can cause vendor lock-in, restricting access to engines, tools, and capabilities beyond what the vendor offers. As the demand for data-driven decision making surges, the need for a more robust data architecture to address these challenges becomes ever more critical. Cloud data lakes have addressed some of the shortcomings of RDBMS-OLAP systems, but they present their own set of challenges. More recently, organizations have often followed a two-tier architectural approach, leveraging both cloud data lakes and RDBMS-OLAP systems to take advantage of both platforms. However, this approach brings additional challenges, complexities, and overhead. This paper discusses how a data lakehouse, a new architectural approach, achieves the same benefits as an RDBMS-OLAP system and a cloud data lake combined, while also providing additional advantages. We take today's data warehousing and break it down into implementation-independent components, capabilities, and practices. We then take these aspects and show how a lakehouse architecture satisfies them. Finally, we go a step further and discuss the additional capabilities and benefits a lakehouse architecture provides over an RDBMS-OLAP system.
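
    To make the open-format argument concrete, the following Python sketch shows the basic pattern the lakehouse approach relies on: data written once in an open columnar format (Parquet) that any compatible engine can query directly; the file path and columns are illustrative assumptions.

```python
# Minimal sketch of the open-format idea behind the lakehouse argument:
# data lands once as Parquet files, and any engine that understands the
# format (pandas here, but equally Spark, Trino, DuckDB, ...) can query
# the same copy -- no proprietary storage layer, no extra data copies.
# Paths and columns are illustrative assumptions.
import pandas as pd

orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "region": ["EMEA", "APAC", "EMEA"],
    "amount": [120.0, 75.5, 210.0],
})

# Write once in an open columnar format (requires pyarrow or fastparquet).
orders.to_parquet("orders.parquet", index=False)

# Any engine can read the same file; here pandas plays the role of one consumer.
df = pd.read_parquet("orders.parquet")
print(df.groupby("region")["amount"].sum())
```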

    Utilizing RxNorm to Support Practical Computing Applications: Capturing Medication History in Live Electronic Health Records

    RxNorm was utilized as the basis for direct capture of medication history data in a live EHR system deployed by a large, multi-state outpatient behavioral healthcare provider in the United States serving over 75,000 distinct patients each year across 130 clinical locations. The tool incorporated auto-complete search functionality for medications and assistance with proper dosage identification. The overarching goal was to understand if and how standardized terminologies like RxNorm can be used to support practical computing applications in live EHR systems. We describe the stages of implementation, the approaches used to adapt RxNorm's data structure for the intended EHR application, and the challenges faced. We evaluate the implementation using a four-factor framework addressing flexibility, speed, data integrity, and medication coverage. RxNorm proved to be functional for the intended application, given appropriate adaptations to address the high-speed input/output (I/O) requirements of a live EHR and the flexibility required for data entry in multiple potential clinical scenarios. Future research on search optimization for medication entry, user profiling, and linking RxNorm to drug classification schemes holds great potential for improving the user experience and the utility of medication data in EHRs. Comment: Appendix (including SQL/DDL Code) available by author request. Keywords: RxNorm; Electronic Health Record; Medication History; Interoperability; Unified Medical Language System; Search Optimization.
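
    As a hedged illustration of the auto-complete functionality described above, the sketch below performs a case-insensitive prefix search over medication names; in the deployed system the candidate strings would come from RxNorm tables such as RXNCONSO, whereas here a small in-memory list stands in for them.

```python
# Minimal sketch of auto-complete search over medication names. In the real
# system the candidate strings would come from RxNorm (e.g. the RXNCONSO
# table); here a small in-memory list stands in for them, and the matching
# is a simple case-insensitive prefix scan rather than the production index.
def autocomplete(prefix, names, limit=5):
    p = prefix.lower()
    return [n for n in names if n.lower().startswith(p)][:limit]

rxnorm_names = [
    "Lisinopril 10 MG Oral Tablet",
    "Lisinopril 20 MG Oral Tablet",
    "Lithium Carbonate 300 MG Oral Capsule",
    "Sertraline 50 MG Oral Tablet",
]
print(autocomplete("lis", rxnorm_names))
# ['Lisinopril 10 MG Oral Tablet', 'Lisinopril 20 MG Oral Tablet']
```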

    Towards Prescriptive Analytics in Cyber-Physical Systems

    More and more of our physical world today is being monitored and controlled by so-called cyber-physical systems (CPSs). These are compositions of networked autonomous cyber and physical agents such as sensors, actuators, computational elements, and humans in the loop. Today, CPSs are still relatively small-scale and very limited compared to the CPSs to be witnessed in the future. Future CPSs are expected to be far more complex, large-scale, widespread, and mission-critical, and found in a variety of domains such as transportation, medicine, manufacturing, and energy, where they will bring many advantages such as increased efficiency, sustainability, reliability, and security. To unleash their full potential, CPSs need to be equipped with, among other features, support for automated planning and control, where computing agents collaboratively and continuously plan and control their actions in an intelligent and well-coordinated manner to secure and optimize a physical process, e.g., electricity flow in the power grid. In today's CPSs, the control is typically automated, but the planning is performed solely by humans. Unfortunately, it is intractable and infeasible for humans to plan every action in a future CPS due to the complexity, scale, and volatility of a physical process. Because of these properties, control and planning have to be continuous and automated in future CPSs. Humans may only analyse and tweak the system's operation using a set of tools supporting prescriptive analytics, which allow them (1) to make predictions, (2) to obtain suggestions for the most promising set of actions (decisions) to be taken, and (3) to analyse the implications as if such actions were taken. This thesis considers planning and control in the context of a large-scale multi-agent CPS. Based on a smart-grid use-case, it presents a so-called PrescriptiveCPS, which is (the conceptual model of) a multi-agent, multi-role, and multi-level CPS that automatically and continuously takes and realizes decisions in near real-time and provides (human) users with prescriptive analytics tools to analyse and manage the performance of the underlying physical system (or process). Acknowledging the complexity of CPSs, this thesis provides contributions at the following three levels of scale: (1) the level of a (full) PrescriptiveCPS, (2) the level of a single PrescriptiveCPS agent, and (3) the level of a component of a CPS agent software system. At the CPS level, the contributions include the definition of PrescriptiveCPS, according to which it is a system of interacting physical and cyber (sub-)systems. Here, the cyber system consists of hierarchically organized, inter-connected agents that collectively manage instances of so-called flexibility, decision, and prescription models, which are short-lived, focus on the future, and represent a capability, a (user's) intention, and actions to change the behaviour (state) of a physical system, respectively. At the agent level, the contributions include a three-layer architecture of an agent software system, integrating a number of components specially designed or enhanced to support the functionality of PrescriptiveCPS. Most of the thesis contributions are provided at the component level.
    The contributions include the description, design, and experimental evaluation of (1) a unified multi-dimensional schema for storing flexibility and prescription models (and related data), (2) techniques to incrementally aggregate flexibility model instances and disaggregate prescription model instances, (3) a database management system (DBMS) with built-in optimization problem solving capability that allows optimization problems to be formulated using SQL-like queries and solved "inside the database", (4) a real-time data management architecture for processing instances of flexibility and prescription models under (soft or hard) timing constraints, and (5) a graphical user interface (GUI) for visually analysing flexibility and prescription model instances. Additionally, the thesis discusses and exemplifies (but provides no evaluations of) (1) domain-specific and in-DBMS generic forecasting techniques for forecasting instances of flexibility models based on historical data, and (2) powerful ways to analyse the past, present, and future based on so-called hypothetical what-if scenarios and the flexibility and prescription model instances stored in a database. Most of the contributions at this level are based on the smart-grid use-case. In summary, the thesis provides (1) the model of a CPS with planning capabilities, (2) the design and experimental evaluation of prescriptive analytics techniques that effectively forecast, aggregate, disaggregate, visualize, and analyse complex models of the physical world, and (3) a use-case from the energy domain showing how the introduced concepts are applicable in the real world. We believe these contributions constitute a significant step towards developing planning-capable CPSs.
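
    As a very rough illustration of contribution (2), the following Python sketch reduces a flexibility model instance to per-time-slot (min, max) energy ranges, aggregates them across devices, and disaggregates an aggregate prescription proportionally; the thesis's actual models and incremental algorithms are considerably richer than this.

```python
# Minimal sketch, assuming a drastically simplified flexibility model: each
# instance is a per-time-slot (min_kwh, max_kwh) range, aggregation sums the
# ranges across devices, and a prescription for the aggregate is disaggregated
# proportionally to each device's available range. The thesis's actual models
# and incremental algorithms are far richer; this only conveys the idea.
def aggregate(flex_instances):
    """Sum per-slot (min, max) ranges across all flexibility instances."""
    slots = len(flex_instances[0])
    return [
        (sum(f[t][0] for f in flex_instances), sum(f[t][1] for f in flex_instances))
        for t in range(slots)
    ]

def disaggregate(prescription, flex_instances):
    """Split an aggregate per-slot prescription proportionally to each device's max."""
    result = []
    for f in flex_instances:
        device_plan = []
        for t, target in enumerate(prescription):
            total_max = sum(g[t][1] for g in flex_instances)
            share = f[t][1] / total_max if total_max else 0.0
            device_plan.append(target * share)
        result.append(device_plan)
    return result

device_a = [(0.0, 2.0), (0.0, 1.0)]   # e.g. an EV charger, two time slots
device_b = [(0.5, 1.5), (0.0, 3.0)]   # e.g. a heat pump
agg = aggregate([device_a, device_b])            # [(0.5, 3.5), (0.0, 4.0)]
plan = disaggregate([3.0, 2.0], [device_a, device_b])
print(agg, plan)
```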

    A Self-Service Supporting Business Intelligence and Big Data Analytics Architecture

    Self-service Business Intelligence (SSBI) is an emerging topic for many companies. Casual users should be enabled to independently build their own analyses and reports, which accelerates and simplifies decision-making processes. Although recent studies have begun to discuss parts of a self-service environment, none of them presents a comprehensive architecture. Following a design science research approach, this study proposes a new self-service oriented BI architecture to address this gap. Starting from an in-depth literature review, an initial model was developed and then improved through qualitative analysis of interviews with 18 BI and IT specialists from companies across different industries. The proposed architecture model demonstrates how the introduced self-service elements interact with each other and with traditional BI components. For example, we look at the integration of collaboration rooms and a self-learning knowledge database that serves as a source for a report recommender.
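
    The paper does not prescribe an algorithm for the report recommender; as one hypothetical reading, the Python sketch below suggests reports to a casual user based on the overlap between users' report usage sets.

```python
# Minimal sketch of a usage-based report recommender, one possible reading of
# the "self-learning knowledge database feeding a report recommender" element
# in the proposed architecture (the paper itself does not specify algorithms).
# Users who opened similar sets of reports get each other's reports suggested.
from collections import Counter

usage = {                       # report ids opened by each casual user (illustrative)
    "alice": {"sales_eu", "churn_q3", "sales_apac"},
    "bob":   {"sales_eu", "churn_q3"},
    "carol": {"inventory", "sales_eu"},
}

def recommend(user, usage, k=2):
    """Suggest reports opened by users with the largest overlap with `user`."""
    mine = usage[user]
    scores = Counter()
    for other, reports in usage.items():
        if other == user:
            continue
        overlap = len(mine & reports)
        for r in reports - mine:
            scores[r] += overlap
    return [r for r, _ in scores.most_common(k)]

print(recommend("bob", usage))   # ['sales_apac', 'inventory']
```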

    Web Mining for Web Personalization

    Web personalization is the process of customizing a Web site to the needs of specific users, taking advantage of knowledge acquired from the analysis of the user's navigational behavior (usage data) in correlation with other information collected in the Web context, namely structure, content, and user profile data. Due to the explosive growth of the Web, the domain of Web personalization has gained great momentum in both research and commercial areas. In this article we present a survey of the use of Web mining for Web personalization. More specifically, we introduce the modules that comprise a Web personalization system, emphasizing the Web usage mining module. We review the most common methods used and the technical issues that arise, along with a brief overview of the most popular tools and applications available from software vendors. Moreover, the most important research initiatives in the Web usage mining and personalization areas are presented.
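
    As an illustration of one standard Web usage mining preprocessing step covered by such surveys, the Python sketch below splits a click-stream into sessions using a 30-minute inactivity timeout; the log records are invented for the example.

```python
# Minimal sketch of a common Web usage mining preprocessing step: splitting a
# click-stream into sessions using an inactivity timeout (commonly 30 minutes).
# Log records are (user, timestamp_seconds, url) tuples; the data below are
# illustrative only.
def sessionize(records, timeout=1800):
    """Group per-user page views into sessions separated by > `timeout` seconds."""
    sessions = []
    records = sorted(records, key=lambda r: (r[0], r[1]))
    current_user, last_ts, current = None, None, []
    for user, ts, url in records:
        if user != current_user or (last_ts is not None and ts - last_ts > timeout):
            if current:
                sessions.append(current)
            current, current_user = [], user
        current.append(url)
        last_ts = ts
    if current:
        sessions.append(current)
    return sessions

log = [
    ("u1", 0, "/home"), ("u1", 60, "/products"), ("u1", 4000, "/home"),
    ("u2", 10, "/home"), ("u2", 50, "/support"),
]
print(sessionize(log))
# [['/home', '/products'], ['/home'], ['/home', '/support']]
```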