
    Big Data and Analytics as a New Frontier of Enterprise Data Management

    Big Data and Analytics (BDA) promises significant value-generation opportunities across industries. Yet even though companies keep increasing their investments, their BDA initiatives fall short of expectations and they struggle to secure a return on investment. To create business value from BDA, companies must build and extend their data-related capabilities. While the BDA literature has emphasized the capabilities needed to analyze increasing volumes of data from heterogeneous sources, enterprise data management (EDM) researchers have proposed organizational capabilities to improve data quality. However, to date, little is known about how companies actually orchestrate the allocated resources, especially regarding the quality and use of data, to create value from BDA. Considering these gaps, this thesis investigates, through five interrelated essays, how companies adapt their EDM capabilities to create additional business value from BDA. The first essay lays the foundation of the thesis by investigating how companies extend their Business Intelligence and Analytics (BI&A) capabilities to build more comprehensive enterprise analytics platforms. The second and third essays contribute fundamental reflections on how organizations are changing and designing data governance in the context of BDA. The fourth and fifth essays examine how companies provide high-quality data to an increasing number of users with innovative EDM tools, namely machine learning (ML) and enterprise data catalogs (EDCs). The thesis outcomes show that BDA has profound implications for EDM practices. In the past, operational data processing and analytical data processing were two “worlds” managed separately from each other. With BDA, these "worlds" are becoming increasingly interdependent, and organizations must manage the lifecycles of data and analytics products in close coordination. Moreover, with BDA, data have become the long-expected, strategically relevant resource. As such, data must now be viewed as a value driver distinct from IT, since it requires specific mechanisms to foster value creation from BDA. BDA thus extends data governance goals: in addition to data quality and regulatory compliance, governance should facilitate data use by broadening data availability and enabling data monetization. Accordingly, companies establish comprehensive data governance designs, including structural, procedural, and relational mechanisms, to enable a broad network of employees to work with data. Existing EDM practices therefore need to be rethought to meet emerging BDA requirements. While ML is a promising solution for improving data quality in a scalable and adaptable way, EDCs help companies democratize data to a broader range of employees.
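    For illustration only (this sketch is not taken from the thesis): one way ML can support data quality is to flag anomalous records with an unsupervised model. The column names, threshold, and the use of scikit-learn's IsolationForest below are assumptions made for the example.

```python
# Illustrative sketch (not from the thesis): an unsupervised ML model flags
# likely data-quality issues in a customer table. Column names and the
# contamination rate are hypothetical.
import pandas as pd
from sklearn.ensemble import IsolationForest

def flag_suspect_records(df: pd.DataFrame, numeric_cols: list[str]) -> pd.DataFrame:
    """Return df with an 'is_suspect' column marking probable anomalies."""
    model = IsolationForest(contamination=0.01, random_state=42)
    # fit_predict returns -1 for anomalies and 1 for normal records
    scores = model.fit_predict(df[numeric_cols].fillna(df[numeric_cols].median()))
    return df.assign(is_suspect=(scores == -1))

if __name__ == "__main__":
    customers = pd.DataFrame({
        "annual_revenue": [120_000, 95_000, 110_000, 9_900_000, 105_000],
        "order_count": [14, 9, 12, 1, 11],
    })
    print(flag_suspect_records(customers, ["annual_revenue", "order_count"]))
```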

    Taming Corporate Data

    Today, every company faces a large amount of data that reaches it through multiple communication channels. To tame these data, the company (1) needs to carefully assess the data it receives and decide what to do with it; (2) must evaluate which procedures and tools to use; and (3) must invest in human resources to manage data and information. New facilities must be implemented, new professional roles such as the Chief Data Officer and the Data Scientist are needed, and end-user knowledge must be increased by means of data literacy.

    Towards a Data Governance Framework for Third Generation Platforms

    The fourth industrial revolution treats data as a business asset and therefore places it at the centre of the software architecture (data as a service) that will support the horizontal and vertical digitalization of industrial processes. The large volume of data that this environment generates, its heterogeneity and complexity, as well as its reuse in later processes (e.g. analytics, AI), require the adoption of policies, directives and standards for its proper governance. Furthermore, issues related to the use of cloud computing resources must be taken into account in order to meet the performance and security requirements of the different processes. In the absence of frameworks adapted to this new architecture, this article proposes an initial schema for developing an effective data governance programme for third-generation platforms, that is, a conceptual tool that guides organizations in defining, designing, developing and deploying services aligned with their vision and business goals in the Industry 4.0 (I4.0) era. This work is partially funded by the Spanish Government through research project TIN2017-86520-C3-3-R.
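    As a purely hypothetical illustration of governing "data as a service" through explicit policies (the article's actual schema is not reproduced here), such a policy could be expressed in machine-readable form; all field names and values below are invented for the example.

```python
# Hypothetical sketch of a machine-readable governance policy for a data
# service; field names and values are illustrative, not the proposed framework.
from dataclasses import dataclass, field

@dataclass
class DataServicePolicy:
    dataset: str                      # logical name of the data product
    owner: str                        # accountable business role
    classification: str               # e.g. "public", "internal", "confidential"
    retention_days: int               # how long the data may be stored
    allowed_purposes: list[str] = field(default_factory=list)  # e.g. analytics, AI
    cloud_region: str = "eu-west"     # where the data may reside

sensor_policy = DataServicePolicy(
    dataset="shopfloor_sensor_stream",
    owner="production_data_steward",
    classification="internal",
    retention_days=365,
    allowed_purposes=["analytics", "predictive_maintenance"],
)
print(sensor_policy)
```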

    Towards better organizational analytics capability: a maturity model

    Data and analytics are changing markets. Significant improvements in competitiveness can be achieved by utilizing data and analytics, which can support decision making at all levels, from operational to strategic. However, studies suggest that organizations are failing to realize these benefits: many analytics initiatives fail, and only a small portion of organizations’ data is used in decision making. This happens mostly because utilizing data and analytics at a larger scale is a difficult and complex matter. Companies need to harness multiple resources and capabilities in a business context and use them synergistically to deliver value. Capabilities must be developed step by step and cannot be bought, and bottlenecks such as siloed data, lack of commitment and lack of understanding slow down the development. The focus of this thesis is to gain insight into how these resources and capabilities can be managed and understood better, so that modern applications of data and analytics can be utilized more fully. The study is conducted in two parts. In the first part, the terminology, disciplines, analytics capabilities, and success factors of data and analytics development are examined through the literature. A comprehensive tool for identifying and reviewing these analytics capabilities, the organizational analytics maturity model, is then built by analyzing and combining existing tools and earlier insights. In the second part, this model and the other findings are reviewed and complemented with empirical interviews. The main findings of this thesis are the mapped analytics capabilities, the success factors of analytics, and the organizational analytics maturity model. These results help practitioners and researchers to better understand the complexity of the subject and which dimensions must be taken into account when pursuing success with data and analytics.
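    As an illustration only (the thesis's maturity model is not reproduced here), a maturity assessment of this kind can be thought of as scoring a set of capability dimensions and aggregating them into an overall level; the dimension names and the 1-to-5 scale below are assumptions made for demonstration.

```python
# Illustrative sketch of a capability maturity assessment; dimensions and
# level labels are assumptions, not the thesis's actual model.
from statistics import mean

MATURITY_LEVELS = {1: "initial", 2: "repeatable", 3: "defined", 4: "managed", 5: "optimized"}

def assess_analytics_maturity(scores: dict[str, int]) -> str:
    """Aggregate per-dimension scores (1-5) into an overall maturity label."""
    overall = round(mean(scores.values()))
    return MATURITY_LEVELS[overall]

example_scores = {
    "data_management": 3,        # e.g. siloed data only partially integrated
    "technology": 4,
    "people_and_skills": 2,
    "governance": 2,
    "culture_and_commitment": 3,
}
print(assess_analytics_maturity(example_scores))  # -> "defined"
```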

    Data ownership revisited: clarifying data accountabilities in times of big data and analytics

    Today, vast amounts of data are generated via connected devices and digital applications. In order to benefit from these data, companies have to develop their capabilities related to big data and analytics (BDA). A critical factor that is often cited concerning the “soft” aspects of BDA is data ownership, i.e., clarifying the fundamental rights and responsibilities for data. IS research has investigated data ownership for operational systems and data warehouses, where the purpose of data processing is known. In the BDA context, defining accountabilities for data is more challenging because data are stored in data lakes and used for previously unknown purposes. Based on four case studies, we identify ownership principles and three distinct types: data, data platform, and data product ownership. Our research answers fundamental questions about how data management changes with BDA and lays the foundation for future research on data and analytics governance.
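    As a hypothetical illustration of the three ownership types identified in the paper, accountabilities could be recorded as explicit assignments of assets to roles; the asset and role names below are invented for the example.

```python
# Hypothetical illustration of data, data platform, and data product ownership;
# the assets and roles are invented for the example.
from dataclasses import dataclass
from enum import Enum

class OwnershipType(Enum):
    DATA = "data ownership"                     # accountability for source data and its quality
    DATA_PLATFORM = "data platform ownership"   # accountability for the shared data lake/platform
    DATA_PRODUCT = "data product ownership"     # accountability for a specific analytical product

@dataclass
class OwnershipAssignment:
    asset: str
    owner_role: str
    ownership_type: OwnershipType

assignments = [
    OwnershipAssignment("customer_master_data", "sales_operations_lead", OwnershipType.DATA),
    OwnershipAssignment("enterprise_data_lake", "platform_engineering_lead", OwnershipType.DATA_PLATFORM),
    OwnershipAssignment("churn_prediction_model", "marketing_analytics_lead", OwnershipType.DATA_PRODUCT),
]
for a in assignments:
    print(f"{a.asset}: {a.owner_role} ({a.ownership_type.value})")
```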

    Spatial Big Data Analytics: The New Boundaries of Retail Location Decision-Making

    This dissertation examines the current state and evolution of retail location decision-making (RLDM) in Canada. The major objectives are: (i) to explore the type and scale of location decisions that retail firms are currently undertaking; (ii) to identify the availability and use of technology and Spatial Big Data (SBD) within the decision-making process; (iii) to identify the awareness, availability, use, adoption and development of SBD; and (iv) to assess the implications of SBD for RLDM. These objectives were investigated using a three-stage, multi-method research process. First, an online survey of retail location decision makers across a range of firm sizes and sub-sectors was administered. Second, structured interviews were conducted with 24 retail location decision makers, and finally, three in-depth case studies were undertaken to highlight the changes to RLDM over the last decade and to develop a deeper understanding of RLDM. The dissertation found that within the last decade RLDM has changed in three main ways: (i) there has been an increase in the availability and use of technology and SBD within the decision-making process; (ii) the type and scale of location decisions that a firm undertakes remain relatively unchanged even with the growth of new data; and (iii) the range of location research methods employed within retail firms is only just beginning to change given the presence of new data sources and data analytics technology. Traditional practices still dominate the RLDM process. While the adoption of SBD applications is starting to appear within retail planning, it is not widespread. Traditional data sources, such as those highlighted in past studies by Hernandez and Emmons (2012) and Byrom et al. (2001), are still the most commonly used. It was evident that at the heart of SBD adoption is a data environment that promotes transparency and a clear corporate strategy. While most retailers are aware of the new SBD techniques that exist, these techniques are not often adopted and routinized.

    Challenges and opportunities to develop a smart city: A case study of Gold Coast, Australia

    With the rapid growth of information and communication technologies, there is growing interest in developing smart cities, with a focus on the knowledge economy and the use of sensors and mobile technologies to plan and manage cities. Proponents argue that these emerging technologies can help manage the environment and infrastructure efficiently, promote economic development and actively engage the public, thus contributing to building safe, healthy, sustainable and resilient cities. However, are there other important elements, in addition to technologies, that can contribute to the creation of smart cities? What are some of the challenges and opportunities for developing a smart city? This paper aims to answer these questions by developing a conceptual framework for smart cities. The framework is then applied to the city of Gold Coast to identify challenges and opportunities for developing the city into a ‘smart city’. Gold Coast is a popular tourist city with a population of about 600,000 in South East Queensland, Australia, at the southern end of the 240 km long coastal conurbation centred on Brisbane. IBM recently nominated Gold Coast as one of three cities in Australia for its Smarter Cities Challenge grant, which will provide the Gold Coast City Council with the opportunity to collaborate with a group of IBM experts to develop strategies for enhancing its ICT arrangements for disaster response. Gold Coast, meanwhile, has the potential to diversify its economy from one centred on tourism to a knowledge economy built on its educational institutions, investments in cultural precincts and high-quality lifestyle amenities. These provide a unique opportunity for building Gold Coast into an important smart city in the region. As part of the research methodology, the paper reviews relevant council policies. Finally, lessons are drawn from the case study for other cities that seek to establish themselves as smart cities.

    Extract, Transform, and Load data from Legacy Systems to Azure Cloud

    Internship report presented as a partial requirement for obtaining the Master’s degree in Information Management, with a specialization in Knowledge Management and Business Intelligence. In a world of continuously evolving technologies and hardening competitive markets, organisations need to remain alert and adopt cutting-edge technologies and tools that will help them surpass any competition that arises. Modern data platforms (MDPs) that incorporate cloud technologies help organisations get ahead of their competitors by providing solutions that capture and optimally use untapped data, scalable storage that adapts to ever-growing data quantities, and data processing and visualisation tools that improve the decision-making process. With many cloud providers in the market, from small players to major technology corporations, organisations have considerable flexibility to choose the cloud technology that best aligns with their use cases and their overall product and service strategy. This internship arose when one of Accenture’s significant clients in the financial industry decided to migrate from legacy systems to a cloud-based data infrastructure, namely Microsoft Azure. During the internship, the data lake, a core part of the MDP, was developed in order to better understand the type of challenges that can be faced when migrating data from on-premise legacy systems to a cloud-based infrastructure. This work also provides the main recommendations and guidelines for performing a large-scale data migration.
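    As a simplified, hypothetical sketch of such an extract-transform-load step (not the internship's actual implementation), data might be pulled from a legacy database, lightly cleansed, and landed in Azure Data Lake Storage Gen2. The connection details, table, and paths below are placeholders, and the pandas, pyodbc, and azure-storage-file-datalake packages are assumed.

```python
# Simplified, hypothetical ETL sketch: names, connection strings, and paths
# are placeholders; a real migration involves far more than this.
import pandas as pd
import pyodbc
from azure.storage.filedatalake import DataLakeServiceClient

# Extract: pull a table from the on-premise legacy database (placeholder DSN)
legacy_conn = pyodbc.connect("DSN=legacy_core_banking;UID=etl_user;PWD=***")
df = pd.read_sql("SELECT * FROM transactions WHERE booking_date >= '2020-01-01'", legacy_conn)

# Transform: light cleansing before landing the data in the lake
df["amount"] = df["amount"].astype(float)
df = df.dropna(subset=["transaction_id"])

# Load: write the result as a Parquet file into Azure Data Lake Storage Gen2
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential="<account-key-or-token>",
)
file_system = service.get_file_system_client(file_system="raw")
file_client = file_system.get_file_client("transactions/2020/transactions.parquet")
file_client.upload_data(df.to_parquet(), overwrite=True)
```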
    • 

    corecore