
    Leveraging big data analytics for competitive advantage in South African banking

    Big data is considered a form of capital and a source of competitive advantage. This proliferation of data promises to transform business processes, alter corporate ecosystems and unlock business value through the strategic and operational implications of better-informed decision making and enhanced organisational responsiveness. Furthermore, the big data era has new implications for understanding consumer behaviour and formulating marketing strategy. Despite the promise of the big data revolution as a source of competitive advantage and superior organisational performance, organisations lack the ability to create difficult-to-match capabilities to effectively leverage big data for competitive advantage. The research explores leveraging big data analytics for competitive advantage in South African banking. Data was collected through 11 semi-structured, in-depth interviews with experts from the South African banking industry. Thematic analysis of the qualitative interview data provided insights into leveraging big data analytics for competitive advantage in South African banking, particularly the industry's utilisation of big data analytics, the adequacy of the methodologies employed for the processing of big data, and the resource and capability requirements. The research establishes that, in strategising around leveraging big data capabilities, it is imperative that South African banking is cognisant of the key role that top management emphasis, inter-departmental dynamics and organisational design play in the development and leveraging of these capabilities. Furthermore, the research accentuates the importance of embodying a data-oriented culture to enable the organisation to sense, seize and execute on opportunities. Mini Dissertation (MBA), University of Pretoria, Gordon Institute of Business Science (GIBS), 2018.

    Essential Micro-foundations for Contemporary Business Operations: Top Management Tangible Competencies, Relationship-based Business Networks and Environmental Sustainability

    Although various studies have emphasized linkages between firm competencies, networks and sustainability at the organizational level, the links between top management tangible competencies (e.g., contemporary, quantitatively focused education such as big data analytics and data-driven applications linked with the internet of things, relevant experience, and analytical business applications), relationship-based business networks (RBNs) and environmental sustainability have not been well established at the micro-level, and there is a literature gap in investigating these relationships. This study examines these links based on unique data collected from 175 top management representatives (chief executive officers and managing directors) working in food import and export firms headquartered in the UK and New Zealand. Our results from structural equation modelling indicate that top management tangible competencies (TMTCs) are the key determinants for building RBNs, which in turn mediate the relationship between TMTCs and environmental sustainability. The competencies also play a direct and vital role in environmental practices. The findings further show that relationship-oriented firms perform better than those which focus less on such networks. Consequently, our findings provide a deeper understanding of the micro-foundations of environmental sustainability, based on TMTCs rooted in the resource-based view and RBNs entrenched in social network theory. We discuss the theoretical and practical implications of our findings and provide suggestions for future research.

    Creating business value from big data and business analytics : organizational, managerial and human resource implications

    This paper reports on a research project, funded by the EPSRC’s NEMODE (New Economic Models in the Digital Economy, Network+) programme, that explores how organizations create value from their increasingly big data and the challenges they face in doing so. Three case studies are reported of large organizations with a formal business analytics group and data volumes that can be considered ‘big’. The case organizations are MobCo, a mobile telecoms operator, MediaCo, a television broadcaster, and CityTrans, a provider of transport services to a major city. Analysis of the cases is structured around a framework in which data and value creation are mediated by the organization’s business analytics capability. This capability is then studied through a sociotechnical lens of organization/management, process, people, and technology. From the cases twenty key findings are identified. In the area of data and value creation these are: 1. Ensure data quality, 2. Build trust and permissions platforms, 3. Provide adequate anonymization, 4. Share value with data originators, 5. Create value through data partnerships, 6. Create public as well as private value, 7. Monitor and plan for changes in legislation and regulation. In organization and management: 8. Build a corporate analytics strategy, 9. Plan for organizational and cultural change, 10. Build deep domain knowledge, 11. Structure the analytics team carefully, 12. Partner with academic institutions, 13. Create an ethics approval process, 14. Make analytics projects agile, 15. Explore and exploit in analytics projects. In technology: 16. Use visualization as story-telling, 17. Be agnostic about technology while the landscape is uncertain (i.e., maintain a focus on value). In people and tools: 18. Data scientist personal attributes (curious, problem focused), 19. Data scientist as ‘bricoleur’, 20. Data scientist acquisition and retention through challenging work.
With regard to what organizations should do if they want to create value from their data, the paper further proposes: a model of the analytics eco-system that places the business analytics function in a broad organizational context; and a process model for analytics implementation, together with a six-stage maturity model.

    How can SMEs benefit from big data? Challenges and a path forward

    Big data is big news, and large companies in all sectors are making significant advances in their customer relations, product selection and development, and consequent profitability through using this valuable commodity. Small and medium enterprises (SMEs) have proved to be slow adopters of the new technology of big data analytics and are in danger of being left behind. In Europe, SMEs are a vital part of the economy, and the challenges they encounter need to be addressed as a matter of urgency. This paper identifies barriers to SME uptake of big data analytics and recognises the complex challenge they pose to all stakeholders, including national and international policy makers, IT, business management and data science communities. The paper proposes a big data maturity model for SMEs as a first step towards an SME roadmap to data analytics. It considers the ‘state-of-the-art’ of IT with respect to usability and usefulness for SMEs and discusses how SMEs can overcome the barriers preventing them from adopting existing solutions. The paper then considers management perspectives and the role of maturity models in enhancing and structuring the adoption of data analytics in an organisation. The history of total quality management is reviewed to inform the core aspects of implanting a new paradigm. The paper concludes with recommendations to help SMEs develop their big data capability and enable them to continue as the engines of European industrial and business success. Copyright © 2016 John Wiley & Sons, Ltd.

    From big data to big performance – exploring the potential of big data for enhancing public organizations’ performance : a systematic literature review

    This article examines the possibilities for increasing organizational performance in the public sector using Big Data by conducting a systematic literature review. It includes the results of 36 scientific articles published between January 2012 and July 2019. The results show a tendency to explain the relationship between Big Data and organizational performance through the Resource-Based View of the Firm or the Dynamic Capabilities View, arguing that performance improvement in an organization stems from unique capabilities. In addition, the results show that the performance improvement delivered by Big Data is driven by better organizational decision making. Finally, the review identifies three dimensions that play a role in this process: the human dimension, the organizational dimension, and the data dimension. From these findings, implications for both practice and theory are derived.

    IPO Ready? Illuminating the Dark Box of Private Equity

    The use of public equity data can help combat the challenges private equity funds currently face regarding data availability. The goal is to create a model that provides guidance to both investors and entrepreneurs in the decision-making process. The data gathered would provide insight into how close a private company is to a successful Initial Public Offering (IPO). The idea is that a model showing the average financial metrics of companies within certain industries at the time of an IPO can provide new insight into how the private company is performing.

    The Transformation of Accounting Information Systems Curriculum in the Last Decade

    Accounting information systems (AIS) are an extremely important component of accounting and accounting education. The purpose of the current study is to examine the transformation of AIS curriculum in the last decade. The motivation for this research comes from the vast advances made in the world of information technology (IT) and information systems (IS). The specific research questions addressed in the current study are: (1) How has AIS curriculum changed in the 18 years since SOX? (2) How has AIS curriculum adjusted in recent years with the emergence of the new hot-button topic of big data/data analytics? Overall, this study finds that the core of AIS curriculum has not significantly changed over the last decade. However, more emphasis is being placed on topics such as enterprise-wide systems/ERP, IT audits, computer fraud, and transaction processing. Relatedly, several new topical coverages have been introduced, such as business analytics and big data/data analytics. The key contribution of this paper is to provide accounting students and educators with useful information regarding the most significant shifts in AIS over the last decade and insight into the most valuable current AIS topics.

    Applied business analytics approach to IT projects – Methodological framework

    The design and implementation of a big data project differs from that of a typical business intelligence project that might be run concurrently within the same organization. A big data initiative typically triggers a large-scale IT project that is expected to deliver the desired outcomes. The industry has identified two major methodologies for running a data-centric project, in particular SEMMA (Sample, Explore, Modify, Model and Assess) and CRISP-DM (Cross-Industry Standard Process for Data Mining). More generally, the professional organizations PMI (Project Management Institute) and IIBA (International Institute of Business Analysis) have defined their methods for project management and business analysis based on current industry best practices. However, big data projects pose new challenges that are not considered by the existing methodologies. Building an end-to-end big data analytical solution for optimization of the supply chain, pricing and promotion, product launch, shop potential and customer value faces both business and technical challenges. The most common business challenges are unclear and/or poorly defined business cases; irrelevant data; poor data quality; overlooked data granularity; improper contextualization of data; unprepared or badly prepared data; non-meaningful results; and lack of skills. Some of the technical challenges relate to lack of resources and technology limitations; availability of data sources; storage difficulties; security issues; performance problems; limited flexibility; and ineffective DevOps. This paper discusses an applied business analytics approach to IT projects and addresses the above-described aspects. The authors present their work on the research and development of a new methodological framework and analytical instruments applicable in both business endeavors and educational initiatives targeting big data. The proposed framework is based on a proprietary methodology and advanced analytics tools.
It is focused on the development and implementation of practical solutions for project managers, business analysts, IT practitioners and business/data analytics students. Also under discussion are the necessary skills and knowledge for a successful big data business analyst, and some of the main organizational and operational aspects of big data projects, including continuous model deployment.
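    The two methodologies named in the abstract prescribe fixed phase sequences. As a minimal, hypothetical sketch (the phase names follow the published SEMMA and CRISP-DM standards; the `run_pipeline` runner and its step functions are illustrative placeholders, not part of the authors' framework):

```python
# Phase names taken from the published CRISP-DM 1.0 standard.
CRISP_DM_PHASES = [
    "Business Understanding",
    "Data Understanding",
    "Data Preparation",
    "Modeling",
    "Evaluation",
    "Deployment",
]

# Phase names taken from SAS's SEMMA data mining process.
SEMMA_PHASES = ["Sample", "Explore", "Modify", "Model", "Assess"]


def run_pipeline(phases, step):
    """Apply a per-phase step function in order, returning a simple log.

    `step` is a placeholder for real project work (e.g. data profiling
    during 'Data Understanding'); here it just records each phase.
    """
    log = []
    for phase in phases:
        log.append(step(phase))
    return log


if __name__ == "__main__":
    # Walk the six CRISP-DM phases with a trivial logging step.
    print(run_pipeline(CRISP_DM_PHASES, lambda p: f"completed: {p}"))
```

    Both sequences are nominally linear but iterative in practice (e.g. evaluation results often send a project back to data preparation), which is one reason the abstract argues they fall short for big data projects.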

    Privacy, Public Goods, and the Tragedy of the Trust Commons: A Response to Professors Fairfield and Engel

    User trust is an essential resource for the information economy. Without it, users would not provide their personal information and digital businesses could not operate. Digital companies do not protect this trust sufficiently. Instead, many take advantage of it for short-term gain. They act in ways that, over time, will undermine user trust. In so doing, they act against their own best interest. This Article shows that companies behave this way because they face a tragedy of the commons. When a company takes advantage of user trust for profit, it appropriates the full benefit of this action. However, it shares the cost with all other companies that rely on the wellspring of user trust. Each company, acting rationally, has an incentive to appropriate as much of the trust resource as it can. That is why such companies collect, analyze, and “monetize” our personal information in such an unrestrained way. This behavior poses a longer-term risk. User trust is like a fishery. It can withstand a certain level of exploitation and renew itself. But over-exploitation can cause it to collapse. Were digital companies collectively to undermine user trust, this would not only hurt the users; it would damage the companies themselves. This Article explores commons-management theory for potential solutions to this impending tragedy of the trust commons.

    To boardrooms and sustainability: the changing nature of segmentation

    Market segmentation is the process by which customers in markets with some heterogeneity are grouped into smaller, more homogeneous segments of ‘similar’ customers: a market segment is a group of individuals, groups or organisations sharing characteristics that cause them to have relatively similar needs and purchasing behaviour. Segmentation is not a new concept: for six decades marketers have, in various guises, sought to break down a market into sub-groups of users, each sharing common needs, buying behaviour and marketing requirements. However, this approach to target market strategy development has been rejuvenated in the past few years. Various reasons account for this upsurge in the usage of segmentation, examination of which forms the focus of this white paper. Ready access to data enables faster creation of a segmentation and the testing of propositions to take to market. ‘Big data’ has made the re-thinking of target market segments and value propositions inevitable, desirable, faster and more flexible. The resulting information has presented companies with more topical and consumer-generated insights than ever before. However, many marketers, analytics directors and leadership teams feel overwhelmed by the sheer quantity and immediacy of such data. Analytical prowess in consultants and inside client organisations has benefited from a step-change, using new heuristics and faster computing power, more topical data and stronger market insights. The approach to segmentation today is much smarter and has stretched well away from the days of limited data explored only with cluster analysis. The coverage and wealth of the solutions are unimaginable when compared to the practices of a few years ago. Then, typically only six to ten segments were forced into segmentation solutions, so that an organisation could cater for these macro segments operationally as well as understand them intellectually.
Now there is the advent of what is commonly recognised as micro-segmentation, where the complexity of business operations and customer management requires highly granular thinking. In support of this development, traditional agency/consultancy roles have transitioned into in-house business teams led by data, campaign and business change planners. The challenge has shifted from developing a granular segmentation solution that describes all customers and prospects into one of enabling an organisation to react to the granularity of the solution, deploying its resources to permit controlled and consistent one-to-one interaction within segments. So whilst the cost of delivering and maintaining the solution has reduced with technology advances, a new set of systems, costs and skills in channel and execution management is required to deliver on this promise. These new capabilities range from rich-feature creative and content management solutions, and tailored copy design and deployment tools, through to instant messaging middleware solutions that initiate multi-streams of activity in a variety of analytical engines and operational systems. Companies have recruited analytics and insight teams, often headed by senior personnel such as an Insight Manager or Analytics Director. Indeed, the situations-vacant adverts for such personnel outweigh posts for brand and marketing managers. Far more companies possess the in-house expertise necessary to help with segmentation analysis. Some organisations are also seeking to monetise one of the most regularly under-used latent business assets… data. Developing the capability and culture to bring data together from all corners of a business, the open market, commercial sources and business partners is a step-change, often requiring a Chief Data Officer. This emerging role has also driven the professionalisation of data exploration, using more varied and sophisticated statistical techniques.
CEOs, CFOs and COOs, rather than CMOs, are increasingly the sponsors of segmentation projects as well as the users of the resulting outputs: CEOs because recession has forced the re-engineering of value propositions and the need to look after core customers; CFOs because segmentation leads to better and more prudent allocation of resources – especially NPD and marketing – around the most important sub-sets of a market; COOs because they need to look after key customers better and improve their satisfaction with service delivery. It is increasingly recognised that a new segmentation brings organisational realignment and change, so most business functions now have an interest in a segmentation project, not only the marketers. Largely as a result of the digital era and the growth of analytics, directors and company leadership teams are becoming used to receiving more extensive market intelligence and quickly updated customer insight, leading to faster responses to market changes, customer issues, competitor moves and their own performance. This refreshing of insight, and a leadership team’s reaction to this intelligence, often results in more frequent modification of a target market strategy and segmentation decisions. Many projects set up to consider multi-channel strategy and offerings, digital marketing, customer relationship management, brand strategies, new product and service development, the re-thinking of value propositions, and so forth, now routinely commence with a segmentation piece in order to frame the ongoing work. Most organisations have deployed CRM systems and harnessed the associated customer data. CRM first requires clarity in segment priorities; the insights from a CRM system then help inform the segmentation agenda and steer how organisations engage with their important customers or prospects. The growth of CRM and its ensuing data have assisted the ongoing deployment of segmentation.
One of the biggest changes for segmentation is the extent to which it is now deployed by practitioners in the public and not-for-profit sectors, who are harnessing what is termed social marketing in order to develop and execute their targeting, campaigns and messaging more shrewdly. For marketing per se, the interest in the marketing toolkit from non-profit organisations has been big news in recent years, and at the very heart of the concept of social marketing is the market segmentation process. The sharp rise in the threat to security from global unrest, terrorism and crime has focused the minds of governments, security chiefs and their advisors. As a result, significant resources, intellectual capability, computing and data management have been brought to bear on the problem. The core of this work is the importance of identifying and profiling threats and so mitigating risk. In practice, much of this security and surveillance work harnesses the tools developed for market segmentation and the profiling of different consumer behaviours. This white paper presents the findings from interviews with leading exponents of segmentation, together with insights from a recent study of marketing practitioners relating to their current imperatives and foci. More extensive views of some of these ‘leading lights’ have been sought and are included here in order to showcase the latest developments and to help explain both the ongoing surge of segmentation and the issues underpinning its practice. The principal trends and developments are thereby presented and discussed in this paper.