
    SODA: Generating SQL for Business Users

    The purpose of data warehouses is to enable business analysts to make better decisions. Over the years the technology has matured and data warehouses have become extremely successful. As a consequence, more and more data has been added to data warehouses and their schemas have become increasingly complex. These systems still work well for generating pre-canned reports. However, at their current complexity, they tend to be a poor match for non-tech-savvy business analysts who need answers to ad-hoc queries that were not anticipated. This paper describes the design, implementation, and experience of the SODA system (Search over DAta Warehouse). SODA bridges the gap between the business needs of analysts and the technical complexity of current data warehouses. SODA enables a Google-like search experience for data warehouses by taking keyword queries from business users and automatically generating executable SQL. The key idea is a graph pattern matching algorithm that exploits the metadata model of the data warehouse. Our results with real data from a global player in the financial services industry show that SODA produces queries with high precision and recall, and makes it much easier for business users to interactively explore highly complex data warehouses. Comment: VLDB201
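    The core mechanism the abstract describes, matching keywords against the warehouse's metadata model and emitting executable SQL over a join path, can be sketched roughly as follows. This is a toy illustration, not SODA's algorithm: the tables, columns, foreign-key edges, and substring-matching heuristic are all invented assumptions.

```python
# Toy metadata model: tables with columns, and foreign-key edges between
# tables. All names here are illustrative; a real warehouse metadata
# model (as in SODA) is far richer.
TABLES = {
    "customer": ["customer_id", "name", "region"],
    "account":  ["account_id", "customer_id", "balance"],
}
FK_EDGES = {("account", "customer"): ("customer_id", "customer_id")}

def match_keywords(keywords):
    """Map each keyword to the tables/columns whose names contain it."""
    hits = []
    for kw in keywords:
        for table, cols in TABLES.items():
            if kw in table:
                hits.append((table, None))
            for col in cols:
                if kw in col:
                    hits.append((table, col))
    return hits

def generate_sql(keywords):
    """Very rough keyword-to-SQL: match nodes, then join the matched tables."""
    hits = match_keywords(keywords)
    tables = sorted({t for t, _ in hits})
    cols = [f"{t}.{c}" for t, c in hits if c] or ["*"]
    sql = f"SELECT {', '.join(cols)} FROM {tables[0]}"
    for t in tables[1:]:
        # Look up the foreign-key edge in either direction.
        key = (t, tables[0]) if (t, tables[0]) in FK_EDGES else (tables[0], t)
        left_col, right_col = FK_EDGES[key]
        sql += f" JOIN {t} ON {key[0]}.{left_col} = {key[1]}.{right_col}"
    return sql

print(generate_sql(["balance", "region"]))
```

    A real system would rank candidate interpretations and search join paths through the metadata graph; the sketch hard-codes a single foreign-key hop to keep the idea visible.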

    The evolution of business analytics : based on case study research

    While business analytics is becoming more significant and widely used by companies across a growing range of industries, for many the concept remains elusive. The field of business analytics is highly generic and fragmented, leaving managers confused and ultimately inhibited from making valuable decisions. This paper presents an evolutionary depiction of business analytics, using real-world case studies to provide a distinct overview of where the phenomenon came from, where it currently stands, and where it is heading. The paper presents eight case studies representing three eras: yesterday (1950s to 1990s), today (2000s to 2020s), and tomorrow (2030s to 2050s). Through cross-case analysis we have identified concluding patterns that serve as the foundation for a discussion of future developments within business analytics. Based on our findings, we argue that the automation of business processes will most likely continue to increase. AI is expanding into numerous areas, each specializing in a complex task previously reserved for professionals. However, patterns show that new occupations linked to artificial intelligence will most probably be created. For the training of intelligent systems, data will most likely be in greater demand than ever. The growing volume of data will likely strain current data infrastructures, creating the need for stronger networks and systems. These systems will need to process, store, and manage large amounts of varied data types in real time while maintaining high security. Furthermore, data privacy concerns have become more significant in recent years, although the case study research indicates that this has not limited corporations' access to data. On the contrary, corporations, people, and devices will most likely become even more connected than ever before.

    Data Quality Management in Corporate Practice

    The 21st century is characterized by a rising quantity and importance of Data and Information. Companies utilize these in order to gain and maintain competitive advantages. Therefore, Data and Information are required in both high quantity and high quality. But while the amount of Data collected is steadily increasing, this does not necessarily mean the same is true for Data Quality. In order to assure high Data Quality, the concept of Data Quality Management (DQM) has been established, incorporating such elements as the assessment of Data Quality as well as its improvement. In order to discuss the issue of Data Quality Management, this paper pursues the following goals: (1) a systematic literature search for publications regarding Data Quality Management (scientific contributions, practice reports, etc.); (2) provision of a structured overview of the identified references and the research material; (3) analysis and evaluation of the scientific contributions with regard to methodology and theoretical foundation; (4) the current expression of DQM in practice, differentiated by organization type and industry (based upon the entire research material), as well as an assessment of the situation (how well the design recommendations are based upon research results); (5) a summary of unresolved issues and challenges, based upon the research material.
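    As a minimal illustration of the assessment side of DQM mentioned above, a completeness score (one commonly used data quality dimension) can be computed mechanically. The record layout, field names, and sample data below are invented for the sketch:

```python
# Minimal sketch of one DQM building block: assessing the completeness
# of a record set. Field names and sample records are illustrative.
def completeness(records, required_fields):
    """Fraction of required field values that are populated across all records."""
    total = len(records) * len(required_fields)
    filled = sum(
        1 for r in records for f in required_fields if r.get(f) not in (None, "")
    )
    return filled / total if total else 1.0

customers = [
    {"id": 1, "email": "a@example.com", "country": "CH"},
    {"id": 2, "email": "", "country": "DE"},
    {"id": 3, "email": "c@example.com", "country": None},
]
score = completeness(customers, ["email", "country"])
print(f"completeness: {score:.2f}")  # 4 of 6 required values filled -> 0.67
```

    A fuller DQM assessment would add further dimensions (accuracy, consistency, timeliness) and feed the scores into an improvement cycle, as the surveyed literature describes.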

    Virtual Decoupling for IT/Business Alignment - Conceptual Foundations, Architecture Design and Implementation Example

    IT/business alignment is one of the main topics of information systems research. If IT artifacts and business-related artifacts are coupled point-to-point, however, complex architectures become unmanageable over time. In computer science, concepts like the ANSI/SPARC three-level database architecture propose an architecture layer which decouples external views on data from the implementation view of data. In this paper, a similar approach for IT/business alignment is proposed. The proposed alignment architecture is populated by enterprise services as elementary artifacts. Enterprise services link software components and process activities. They are aggregated into applications and subsequently into domains for planning/design and communication purposes. Most design approaches for the construction of enterprise services, applications, and domains are top-down, i.e. they decompose complex artifacts on a stepwise basis. As an alternative that takes coupling semantics into account, we propose a bottom-up approach, which is demonstrated for the identification of domains. Our approach is evaluated using a telecommunications equipment case study.
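    The bottom-up idea, deriving domains from the coupling between enterprise services rather than decomposing top-down, can be sketched as grouping services whose pairwise coupling exceeds a threshold. The service names, weights, and threshold below are illustrative assumptions, not the paper's actual method:

```python
# Hedged sketch: derive domains bottom-up by merging enterprise services
# whose mutual coupling is strong (union-find over a coupling graph).
COUPLING = {  # symmetric call/data-dependency weights between services
    ("order", "payment"): 9,
    ("order", "shipping"): 7,
    ("crm", "marketing"): 8,
    ("payment", "crm"): 1,   # weak link, should stay a domain boundary
}

def find(parent, x):
    """Find a service's domain representative, with path compression."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def derive_domains(coupling, threshold):
    """Group services into domains by merging strongly coupled pairs."""
    services = {s for pair in coupling for s in pair}
    parent = {s: s for s in services}
    for (a, b), w in coupling.items():
        if w >= threshold:
            parent[find(parent, a)] = find(parent, b)
    domains = {}
    for s in services:
        domains.setdefault(find(parent, s), set()).add(s)
    return sorted(map(sorted, domains.values()))

print(derive_domains(COUPLING, threshold=5))
```

    With the weights above, the weak payment-crm link is left uncut, yielding two domains; raising the threshold splits every service into its own domain.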

    Process oriented data virtualization design model for business processes evaluation (PODVDM) research in progress

    During process enactment in the business process management (BPM) lifecycle, information collected on execution plans is stored in the form of log files and database tables by information systems (IS). In the past decade, a new approach based on applications of Business Intelligence (BI) in business process management has emerged. The approach implements process-oriented data warehouse and mining techniques. However, the main issue is providing the right information at the right time to facilitate process evaluation, which can be used for performance analysis and business process improvement. Existing techniques have limitations, including huge volumes of data in database log files, the performance of the Process Warehouse (PW), which is highly dependent on its specific design, the complexity of PW design, the lack of convergence between business processes and PW specifications, and the need for real data during the process evaluation stage. Objects such as processes, storage, and data repositories can be virtualized to address these limitations. The main aim of this study is to propose a process-oriented data virtualization design model for process evaluation in BPM. The model will be validated through expert reviews and prototype development as well as through a case study. In this paper, we describe the research motivation, questions, approach, and methodology related to addressing the described limitations by designing a model for business process evaluation using the data virtualization technique.

    Data virtualization design model for near real time decision making in business intelligence environment

    The main purpose of Business Intelligence (BI) is to support an organization's strategic, operational, and tactical decisions by providing comprehensive, accurate, and vivid data to decision makers. A data warehouse (DW), which is considered the input for decision-making activities, is created through a complex process known as Extract, Transform and Load (ETL). ETL operates at pre-defined times and requires time to process and transfer data. Providing near real-time information to facilitate data integration in support of the decision-making process is therefore a known issue. Inaccessibility to near real-time information can be overcome with Data Virtualization (DV), as it provides a unified, abstracted, near real-time, and encapsulated view of information for querying. Nevertheless, there is currently a lack of studies on BI models for developing and managing data in a virtual manner that can fulfil organizational needs. Therefore, the main aim of this study is to propose a DV model for near real-time decision making in a BI environment. Design science research methodology was adopted to accomplish the research objectives. As a result of this study, a model called the Data Virtualization Development Model (DVDeM) is proposed that addresses the phases and components which affect the BI environment. To validate the model, expert reviews and focus group discussions were conducted. A prototype based on the proposed model was also developed and then implemented in two case studies. An instrument was also developed to measure the usability of the prototype in providing near real-time data. In total, 60 participants were involved, and the findings indicated that 93% of the participants agreed that the DVDeM-based prototype was able to provide near real-time data for supporting the decision-making process. From the studies, the findings also showed that the majority of participants (more than 90%) in both the education and business sectors affirmed the workability of the DVDeM and, in particular, the usability of the prototype in delivering near real-time decision-making data. The findings also indicate theoretical and practical contributions for developers in building efficient BI applications using the DV technique. The mean values for each measurement item were greater than 4, indicating that the respondents agreed with the statement for each measurement item. Meanwhile, the mean scores for the overall usability attributes of the DVDeM design model fell under "High" or "Fairly High". Therefore, the results give sufficient indication that, by adopting the DVDeM model in developing a system, the usability of the produced system is perceived by the majority of respondents as high, and the system is able to support near real-time decision making.
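    The data virtualization idea at the heart of this line of work, a unified view that queries live sources on demand instead of copying them into a warehouse via scheduled ETL, can be illustrated minimally. The two in-memory SQLite databases and all table and field names below are invented stand-ins for real operational systems:

```python
import sqlite3

# Sketch of data virtualization: a "virtual view" federates two live
# sources at query time rather than materializing them via ETL.
# The in-memory databases stand in for independent operational systems.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, region TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "EU"), (2, "US")])

billing = sqlite3.connect(":memory:")
billing.execute("CREATE TABLE invoices (customer_id INTEGER, amount REAL)")
billing.executemany(
    "INSERT INTO invoices VALUES (?, ?)", [(1, 100.0), (1, 50.0), (2, 75.0)]
)

def revenue_by_region():
    """Virtual view: joins the live sources on demand, so results always
    reflect the current source state (near real time, no ETL lag)."""
    regions = dict(crm.execute("SELECT id, region FROM customers"))
    totals = {}
    for cust, amount in billing.execute("SELECT customer_id, amount FROM invoices"):
        region = regions[cust]
        totals[region] = totals.get(region, 0.0) + amount
    return totals

print(revenue_by_region())
```

    Because nothing is materialized, an insert into either source is visible the next time the view is queried, which is the property the ETL-based warehouse pipeline lacks.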

    Ambidexterity in Information Systems Research: Overview of Conceptualizations, Antecedents, and Outcomes

    Organizations that are not efficient and innovative today quickly become irrelevant tomorrow. Ambidexterity (i.e., simultaneously conducting two seemingly contradictory activities, such as exploitation and exploration) helps organizations to overcome this challenge and, hence, has become increasingly popular, with manifold applications in information systems (IS) research. However, we lack a systematic understanding of ambidexterity research, its research streams, and their future trajectory. Hence, we conduct a systematic literature review on ambidexterity in IS research and identify six distinct research streams that use an ambidexterity lens: IT-enabled organizational ambidexterity, ambidextrous IT capability, ambidexterity in IS development, ambidextrous IS strategy, ambidextrous inter-organizational relationships, and organizational ambidexterity in IS. We present the current state of research in each stream. Moreover, we provide a comprehensive overview of application areas, conceptualizations, antecedents of, and outcomes of ambidexterity. Hence, this study contributes to the emergent theme of ambidexterity in IS research.

    Neighbourhoods in Transition

    This open access book focuses on the intersection between urban brownfields and the sustainability transitions of metropolitan areas, cities, and neighbourhoods. It provides both a theoretical and a practical approach to the topic, offering a thorough introduction to urban brownfields and regeneration projects as well as an operational monitoring tool. Neighbourhoods in Transition begins with an overview of historic urban development and the strategic areas in the hearts of towns to be developed. It then defines several key issues related to the topic, including urban brownfields, regeneration projects, and sustainability issues related to neighbourhood development. The second part of the book focuses on support tools, explaining the challenges faced and the steps involved in a regeneration process, and presenting an operational monitoring tool. It applies this tool to case studies in three selected neighbourhoods; the outcomes of one case study are also presented and discussed, highlighting its benefits. The audience for this book is both professional and academic. It will support researchers as an up-to-date reference on urban brownfield regeneration projects, as well as the work of architects, urban designers, urban planners, and engineers involved in sustainability transitions of the built environment.