
    Privacy Management and Optimal Pricing in People-Centric Sensing

    With emerging sensing technologies such as mobile crowdsensing and the Internet of Things (IoT), people-centric data can be efficiently collected and used for analytics and optimization. Such data is typically required to develop and render people-centric services. In this paper, we address the privacy implications, optimal pricing, and bundling of people-centric services. We first define the inverse correlation between service quality and privacy level from a data analytics perspective. We then present profit maximization models for selling standalone, complementary, and substitute services. Specifically, closed-form solutions for the optimal privacy level and subscription fee are derived to maximize the gross profit of service providers. For interrelated people-centric services, we show that cooperation through bundling of complementary services is more profitable than separate sales but detrimental for substitutes. We also show that the market value of a service bundle is correlated with the degree of contingency between the interrelated services. Finally, we incorporate profit-sharing models from game theory for dividing the bundling profit among the cooperating service providers. Comment: 16 pages
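
    To make the pricing idea concrete, here is a minimal, hypothetical sketch of the kind of profit-maximization problem the abstract describes: a toy demand function that falls with the subscription fee and rises (with diminishing returns) with the offered privacy level, while service quality declines as privacy grows, and gross profit is maximized by a simple grid search. The functional forms, parameter values, and the `demand`, `service_quality`, and `gross_profit` helpers are all invented for illustration and are not the paper's closed-form model.

```python
# Toy profit-maximization sketch (illustrative only; the paper derives
# closed-form optima for its own demand and quality models).
import numpy as np

def service_quality(privacy):
    # Assumed inverse relation between service quality and privacy level (0..1).
    return 1.0 - privacy

def demand(fee, privacy, a=20.0, b=6.0, c=40.0, d=25.0):
    # Hypothetical demand: falls with the fee, rises with privacy
    # (diminishing returns), and rises with the resulting service quality.
    return np.maximum(a - b * fee + c * np.sqrt(privacy)
                      + d * service_quality(privacy), 0.0)

def gross_profit(fee, privacy, unit_cost=1.0):
    # Revenue net of a constant per-subscriber serving cost.
    return demand(fee, privacy) * (fee - unit_cost)

# Brute-force search over a grid of subscription fees and privacy levels.
fees = np.linspace(0.0, 12.0, 121)
privacy_levels = np.linspace(0.0, 1.0, 101)
F, P = np.meshgrid(fees, privacy_levels)
profit = gross_profit(F, P)
i, j = np.unravel_index(np.argmax(profit), profit.shape)
print(f"optimal fee ~ {F[i, j]:.2f}, optimal privacy level ~ {P[i, j]:.2f}, "
      f"gross profit ~ {profit[i, j]:.1f}")
```

    Bundling and profit sharing between providers are not modelled here; the sketch only conveys how a fee/privacy trade-off can be searched numerically.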

    Creating business value from big data and business analytics : organizational, managerial and human resource implications

    This paper reports on a research project, funded by the EPSRC's NEMODE (New Economic Models in the Digital Economy, Network+) programme, that explores how organizations create value from their increasingly big data and the challenges they face in doing so. Three case studies are reported of large organizations with a formal business analytics group and data volumes that can be considered 'big'. The case organizations are MobCo, a mobile telecoms operator, MediaCo, a television broadcaster, and CityTrans, a provider of transport services to a major city. Analysis of the cases is structured around a framework in which data and value creation are mediated by the organization's business analytics capability. This capability is then studied through a sociotechnical lens of organization/management, process, people, and technology. From the cases, twenty key findings are identified. In the area of data and value creation: 1. Ensure data quality; 2. Build trust and permissions platforms; 3. Provide adequate anonymization; 4. Share value with data originators; 5. Create value through data partnerships; 6. Create public as well as private value; 7. Monitor and plan for changes in legislation and regulation. In organization and management: 8. Build a corporate analytics strategy; 9. Plan for organizational and cultural change; 10. Build deep domain knowledge; 11. Structure the analytics team carefully; 12. Partner with academic institutions; 13. Create an ethics approval process; 14. Make analytics projects agile; 15. Explore and exploit in analytics projects. In technology: 16. Use visualization as story-telling; 17. Be agnostic about technology while the landscape is uncertain (i.e., maintain a focus on value). In people and tools: 18. Data scientist personal attributes (curious, problem focused); 19. Data scientist as 'bricoleur'; 20. Data scientist acquisition and retention through challenging work. With regard to what organizations should do if they want to create value from their data, the paper further proposes a model of the analytics eco-system that places the business analytics function in a broad organizational context, and a process model for analytics implementation together with a six-stage maturity model.

    Exploring Data Analytics Capability Building: An IS Success and Resource-Based View

    Business organizations are increasing their investments in resources related to data analytics. However, these investments will not translate into business value unless organizations use these resources efficiently for value creation. Prior literature has suggested that data analytics capabilities (DAC) are critical in generating value from data and data analytics. However, prior research has primarily centered on the role of resources in data and data-related technologies and has ignored the importance of data services and the quality of data-related resources in building DAC and generating value. Thus, this study develops a data analytics capability building model based on the resource-based view and the information systems success model to explain how data quality, data system quality, and data service quality in business organizations may help build DAC, which in turn could enhance business performance.

    Learning Data Quality Analytics for Financial Services

    In recent years, financial institutions have put tremendous effort into the data analytics work associated with risk data. As of early 2019, their analytical reports had yet to be accepted by regulators in the financial services industry. In particular, the enhancements need to meet the regulatory requirements of APRA CPG 235. To improve data quality, we assist the data quality analytics by developing a machine learning model to identify current issues and predict future issues. This helps to remediate data as early as possible and mitigate the risk of recurrence. The analytical dimensions are customer-related risks (market, credit, operational, and liquidity risks) and business segments (private, wholesale, and retail banking). The model is implemented with multiple Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNNs) to find the best one for the quality and prediction analytics. The networks are evaluated with different algorithms and cross-validation techniques.
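
    Since the abstract does not disclose the model architecture, the following is only a minimal sketch of an LSTM-based sequence classifier of the general kind described, trained on synthetic data with Keras; the window length, the eight quality metrics, and the binary issue label are assumptions made for the example.

```python
# Minimal sketch of an LSTM sequence classifier for flagging data-quality
# issues; the data is synthetic and the architecture is illustrative only.
import numpy as np
from tensorflow.keras import layers, models

WINDOW, N_FEATURES = 30, 8        # assumed: 30 daily snapshots of 8 quality metrics
X = np.random.rand(500, WINDOW, N_FEATURES).astype("float32")
y = np.random.randint(0, 2, size=(500, 1))   # 1 = data-quality issue expected

model = models.Sequential([
    layers.Input(shape=(WINDOW, N_FEATURES)),
    layers.LSTM(32),                          # recurrent encoder of the metric history
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),    # probability of a future quality issue
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, validation_split=0.2, verbose=0)
print(model.evaluate(X, y, verbose=0))
```

    In practice the sequences would be rolling windows of data-quality metrics per risk type and business segment, several LSTM variants would be trained, and model selection would rely on k-fold cross-validation rather than a single validation split.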

    A design science framework for research in health analytics

    Data analytics provides the ability to systematically identify patterns and insights from a variety of data as organizations pursue improvements in their processes, products, and services. Analytics can be classified by its ability to explore, explain, predict, and prescribe. When applied to the field of healthcare, analytics presents a new frontier for business intelligence. In 2013 alone, the Centers for Medicare and Medicaid Services (CMS) reported that the national health expenditure was $2.9 trillion, representing 17.4% of the total United States GDP. The Patient Protection and Affordable Care Act of 2010 (ACA) requires all hospitals to implement electronic medical record (EMR) technologies by 2014 (Patient Protection and Affordable Care Act, 2010). Moreover, the ACA makes healthcare processes and outcomes more transparent by making related data readily available for research. Enterprising organizations are employing analytics and analytical techniques to find patterns in healthcare data (I. R. Bardhan & Thouin, 2013; Hansen, Miron-Shatz, Lau, & Paton, 2014). The goal is to assess the cost and quality of care and identify opportunities for improvement for organizations as well as for the healthcare system as a whole. Yet there remains a need for research to systematically understand, explain, and predict the sources and impacts of the widely observed variance in the cost and quality of care. This is a driving motivation for research in healthcare. This dissertation conducts a design-theoretic examination of the application of advanced data analytics in healthcare. Heart failure is the number one cause of death and the biggest contributor to healthcare costs in the United States. An exploratory examination of the application of predictive analytics is conducted in order to understand the cost and quality of care provided to heart failure patients. The specific research question addressed is: how can we improve and expand our understanding of the variance in the cost and quality of care for heart failure? Using state-level data from the State Health Plan of North Carolina, a standard readmission model was assessed as a baseline measure for prediction, and advanced analytics were compared to this baseline. This dissertation demonstrates that advanced analytics can improve readmission predictions as well as expand understanding of the profile of a patient readmitted for heart failure. Implications are assessed for academics and practitioners.
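
    As a hedged illustration of comparing a standard readmission model with more advanced analytics (the dissertation's actual variables and data are not reproduced here), the sketch below fits a logistic-regression baseline and a gradient-boosted classifier on synthetic patient features and compares their cross-validated AUCs.

```python
# Illustrative baseline-vs-advanced comparison for readmission prediction
# (synthetic data; real features would come from claims/EMR records).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 10))                  # assumed: age, comorbidities, prior visits, ...
logits = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.7 * X[:, 2] * X[:, 3]
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)   # 30-day readmission flag

baseline = LogisticRegression(max_iter=1000)
advanced = GradientBoostingClassifier()
for name, clf in [("baseline logit", baseline), ("gradient boosting", advanced)]:
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```

    Here the "standard" model is simply a logistic regression on the same synthetic features; in the dissertation the baseline is an established readmission model evaluated against advanced analytics on State Health Plan data.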

    The Spin-Off of Scientific Services of Novartis into a New, Independent Technology Company Offering Services to the Pharmaceutical, Chemical, and Nutrition Industry

    Starting on October 1, 1999, the three sections 'Central Analytics', 'Physics', and 'Catalysis Synthesis Services' of the Scientific Services of Novartis will operate as an independent company. The new company will have about 180 employees and will offer services to customers in the pharmaceutical, chemical, and nutrition industry as well as to authorities and service firms active in these fields. The focus of activities for the new company is the chemical and physical characterization (analytics), optimization of products and processes, and the development and application of special synthetic methods, in particular by utilizing catalysis. Support is offered via single services, comprehensive service packages, or by taking over assignments for entire areas. The combination of a high scientific and technical standard built up on an ISO 9001 quality-management system, including cGMP and GLP, with an attractive working environment will be the basis for an innovative center of chemical and physical expertise

    Time Aware Knowledge Extraction for Microblog Summarization on Twitter

    Microblogging services like Twitter and Facebook collect millions of pieces of user-generated content every moment about trending news, occurring events, and so on. Nevertheless, it is a real challenge to find information of interest within the huge amount of available posts, which are often noisy and redundant. In general, social media analytics services have attracted increasing attention from both research and industry. Specifically, the dynamic context of microblogging requires managing not only the meaning of information but also the evolution of knowledge over the timeline. This work defines the Time Aware Knowledge Extraction (TAKE) methodology, which relies on a temporal extension of Fuzzy Formal Concept Analysis. In particular, a microblog summarization algorithm has been defined that filters the concepts organized by TAKE into a time-dependent hierarchy. The algorithm addresses topic-based summarization on Twitter. Besides considering the timing of the concepts, another distinguishing feature of the proposed microblog summarization framework is the possibility of producing more or less detailed summaries according to the user's needs, with good levels of quality and completeness, as highlighted in the experimental results. Comment: 33 pages, 10 figures
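
    The sketch below is not the TAKE/fuzzy-FCA method itself, only a heavily simplified, time-windowed extractive summarizer that conveys the general idea of organizing posts by time slice and selecting the most representative post per slice; the toy posts, the four-hour window, and the term-salience ranking are all assumptions.

```python
# A much-simplified, time-windowed extractive summarizer for microblog posts
# (illustrative only; the paper's method builds a time-dependent concept
# hierarchy via a temporal extension of Fuzzy Formal Concept Analysis).
from collections import Counter
from datetime import datetime

posts = [  # (timestamp, text) -- toy data
    ("2024-05-01 09:05", "earthquake reported near the coast"),
    ("2024-05-01 09:20", "coast earthquake magnitude 5.8 confirmed"),
    ("2024-05-01 13:10", "rescue teams deployed after earthquake"),
    ("2024-05-01 13:45", "schools closed, rescue operations ongoing"),
]

def window_key(ts, hours=4):
    # Bucket a timestamp into a fixed-length time window.
    dt = datetime.strptime(ts, "%Y-%m-%d %H:%M")
    return dt.replace(minute=0, hour=dt.hour - dt.hour % hours)

def summarize(posts, per_window=1):
    windows = {}
    for ts, text in posts:
        windows.setdefault(window_key(ts), []).append(text)
    summary = []
    for key in sorted(windows):
        texts = windows[key]
        vocab = Counter(w for t in texts for w in t.lower().split())
        # Rank each post by the aggregate frequency of its terms in the window.
        ranked = sorted(texts,
                        key=lambda t: sum(vocab[w] for w in t.lower().split()),
                        reverse=True)
        summary.extend((key, t) for t in ranked[:per_window])
    return summary

for key, text in summarize(posts):
    print(key.strftime("%H:%M"), "->", text)
```

    The per_window parameter loosely mirrors the framework's ability to produce more or less detailed summaries on demand.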

    Translating Learning into Numbers: A Generic Framework for Learning Analytics

    With the increase in available educational data, it is expected that Learning Analytics will become a powerful means to inform and support learners, teachers, and their institutions in better understanding and predicting personal learning needs and performance. However, the processes and requirements behind the beneficial application of Learning and Knowledge Analytics, as well as the consequences for learning and teaching, are still far from being understood. In this paper, we explore the key dimensions of Learning Analytics (LA), the critical problem zones, and some potential dangers to the beneficial exploitation of educational data. We propose and discuss a generic design framework that can act as a useful guide for setting up Learning Analytics services in support of educational practice and learner guidance, in quality assurance, in curriculum development, and in improving teacher effectiveness and efficiency. Furthermore, the article discusses soft barriers and limitations of Learning Analytics. We identify the skills and competences required to make meaningful use of Learning Analytics data and to overcome gaps in interpretation literacy among educational stakeholders. We also discuss privacy and ethical issues and suggest ways in which these issues can be addressed through policy guidelines and best-practice examples.

    Quadri-dimensional approach for data analytics in mobile networks

    The telecommunication market is growing at a very fast pace with the evolution of new technologies that support high-speed throughput and the availability of a wide range of services and applications in mobile networks. This has led communication service providers (CSPs) to shift their focus from monitoring network elements towards monitoring services and subscribers' satisfaction by introducing service quality management (SQM) and customer experience management (CEM). Both require fast responses in order to reduce the time to find and solve network problems, ensure efficient and proactive maintenance, and improve the quality of service (QoS) and quality of experience (QoE) of subscribers. Because SQM and CEM demand information from multiple interfaces, managing multiple data sources adds an extra layer of complexity to data collection. While several studies have addressed data analytics in mobile networks, most of them did not consider analytics based on the four dimensions involved in the mobile network environment, namely the subscriber, the handset, the service, and the network element, with correlation across multiple interfaces. The main objective of this research was to develop mobile network analytics models applied to the 3G packet-switched domain by analysing data from the radio network on the Iub interface and from the core network on the Gn interface, in order to provide a fast root cause analysis (RCA) approach that considers the four dimensions involved in mobile networks. This was achieved by using recent advances in computer engineering, namely Big Data platforms and data mining techniques through machine learning algorithms. Electrical and Mining Engineering. M. Tech. (Electrical Engineering)
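
    As a purely illustrative sketch of correlating records across the four dimensions named in the thesis (subscriber, handset, service, and network element), the snippet below aggregates hypothetical per-session records, as if already correlated from the Iub and Gn interfaces, and ranks each dimension by the failure rate of its worst value as a crude root-cause indicator; the column names and data are invented.

```python
# Illustrative four-dimensional correlation of mobile-network session records
# (column names, values, and the ranking heuristic are invented for the example).
import pandas as pd

# Hypothetical per-session records, assumed already joined across interfaces.
sessions = pd.DataFrame({
    "subscriber":      ["A", "A", "B", "C", "C", "D"],
    "handset":         ["H1", "H1", "H2", "H1", "H3", "H2"],
    "service":         ["web", "video", "web", "video", "web", "video"],
    "network_element": ["RNC1", "RNC1", "RNC2", "RNC1", "RNC2", "RNC1"],
    "failed":          [1, 1, 0, 1, 0, 1],
})

# For each dimension, compute failure rates per value; the dimension whose
# worst value deviates most from the overall rate is a root-cause candidate.
overall = sessions["failed"].mean()
for dim in ["subscriber", "handset", "service", "network_element"]:
    rates = sessions.groupby(dim)["failed"].mean()
    worst = rates.idxmax()
    print(f"{dim:16s} worst={worst:>5s} rate={rates.max():.2f} (overall {overall:.2f})")
```

    A real deployment would first correlate Iub and Gn records per session on a Big Data platform and could replace the simple failure-rate ranking with a trained classifier over the four dimensions.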