
    Event series prediction via non-homogeneous Poisson process modelling

    Data streams whose events occur at random arrival times, rather than at the regular, tick-tock intervals of traditional time series, are increasingly prevalent. Event series are continuous, irregular and often highly sparse, differing greatly in nature from the regularly sampled time series that have traditionally been the concern of the hard sciences. As mass sets of such data have become more common, interest in predicting future events in them has grown. Yet repurposing traditional forecasting approaches has proven ineffective, in part because of issues such as sparsity, but often because of inapplicable underpinning assumptions such as stationarity and ergodicity. In this paper we derive a principled new approach to forecasting event series that avoids such assumptions, based upon: 1. processing event series datasets to produce a parameterized mixture model of non-homogeneous Poisson processes; and 2. applying a technique called parallel forecasting that uses these processes' rate functions to directly generate accurate temporal predictions for new query realizations. This approach uses forerunners of a stochastic process to shed light on the distribution of future events, not for themselves, but for the realizations that subsequently follow in their footsteps.
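
    The paper's parallel-forecasting step is only summarised above, but the rate-function machinery it rests on can be illustrated. Below is a minimal Python sketch of simulating a non-homogeneous Poisson process by Lewis-Shedler thinning; the sinusoidal rate function and all parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

def simulate_nhpp(rate_fn, rate_max, t_end, rng=None):
    """Simulate event times of a non-homogeneous Poisson process on
    [0, t_end] by Lewis-Shedler thinning: propose candidate times from a
    homogeneous process at rate_max, then accept each candidate t with
    probability rate_fn(t) / rate_max."""
    rng = np.random.default_rng() if rng is None else rng
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)       # next candidate time
        if t > t_end:
            break
        if rng.uniform() < rate_fn(t) / rate_max:  # thinning step
            events.append(t)
    return np.array(events)

# Hypothetical daily-cycle rate, bounded above by rate_max = 9
rate = lambda t: 5.0 + 4.0 * np.sin(2.0 * np.pi * t / 24.0)
events = simulate_nhpp(rate, rate_max=9.0, t_end=48.0)
print(f"{len(events)} events over two cycles")
```

    In the paper's setting the rate functions would come from the fitted mixture of processes rather than being specified by hand; the sketch only shows how a time-varying rate function shapes an event stream.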

    Performance modelling with adaptive hidden Markov models and discriminatory processor sharing queues

    In modern computer systems, workload varies across times and locations. It is important to model the performance of such systems via workload models that are both representative and efficient. For example, model-generated workloads represent realistic system behaviour, especially during peak times, when it is crucial to predict and address performance bottlenecks. In this thesis, we model performance, namely throughput and delay, using adaptive models and discrete queues. Hidden Markov models (HMMs) parsimoniously capture the correlation and burstiness of workloads with spatiotemporal characteristics. By adapting the batch training of standard HMMs to incremental learning, online HMMs act as benchmarks on workloads obtained from live systems (i.e. storage systems and financial markets) and reduce the time complexity of the Baum-Welch algorithm. Similarly, by extending HMMs to train on multiple traces simultaneously, workloads of different types are modelled in parallel by a multi-input HMM. Typically, the HMM-generated traces verify the throughput and burstiness of the real data. Applications of adaptive HMMs include predicting user behaviour in social networks and performance-energy measurements in smartphone applications. Equally important is measuring system delay through response times. For example, workloads such as Internet traffic arriving at routers are affected by queueing delays. To meet quality-of-service needs, queueing delays must be minimised and, hence, it is important to model and predict such queueing delays in an efficient and cost-effective manner. We therefore propose a class of discrete processor-sharing queues for approximating queueing delay as response-time distributions, which represent service level agreements at specific spatiotemporal levels. We adapt discrete queues to model job arrivals with distributions given by a Markov-modulated Poisson process (MMPP) and served under discriminatory processor-sharing scheduling. Further, we propose a dynamic strategy of service allocation to minimise delays in UDP traffic flows whilst maximising a utility function.
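
    The thesis's full queueing model is not reproduced here, but the arrival side can be sketched. The following is a minimal Python illustration, under assumed parameters, of a Markov-modulated Poisson process: a continuous-time Markov chain with a hypothetical generator Q switches the instantaneous Poisson arrival rate among the entries of rates.

```python
import numpy as np

def simulate_mmpp(Q, rates, t_end, rng=None):
    """Simulate arrival times of a Markov-modulated Poisson process on
    [0, t_end]: a continuous-time Markov chain with generator Q modulates
    the instantaneous Poisson arrival rate among the entries of `rates`."""
    rng = np.random.default_rng() if rng is None else rng
    state, t, arrivals = 0, 0.0, []
    while t < t_end:
        # Sojourn in the current modulating state ~ Exp(|Q[s, s]|)
        sojourn_end = min(t + rng.exponential(-1.0 / Q[state, state]), t_end)
        a = t
        while True:  # Poisson arrivals at rates[state] within this sojourn
            a += rng.exponential(1.0 / rates[state])
            if a > sojourn_end:
                break
            arrivals.append(a)
        t = sojourn_end
        if t < t_end:  # jump with the embedded-chain probabilities
            jump = Q[state].copy()
            jump[state] = 0.0
            state = rng.choice(len(rates), p=jump / jump.sum())
    return np.array(arrivals)

# Two-state example: a "quiet" state (2 jobs/unit) and a "bursty" state (10 jobs/unit)
Q = np.array([[-0.5, 0.5],
              [1.0, -1.0]])
arrivals = simulate_mmpp(Q, rates=np.array([2.0, 10.0]), t_end=100.0)
print(f"{len(arrivals)} arrivals in 100 time units")
```

    A discriminatory processor-sharing server would then divide its capacity among job classes in proportion to per-class weights; that service side, and the discrete-queue approximation itself, are beyond this sketch.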

    Going Digital First while Safeguarding the Physical Core: How an Automotive Incumbent Searches for Relevance in Disruptive Times

    Incumbent firms typically face a significant risk of losing the relevance of their physical core when facing industry disruption driven by digital technologies. Existing literature emphasizes a digital-first approach, whereby firm offerings are fundamentally redeveloped from a digital point of view, from the point of conception. While this prescription can help accelerate innovation, it does not tell us how incumbents might safeguard the relevance of their traditional physical core resources when going digital first. This is important, since major discontinuities in strategic repositioning, while often celebrated in the digital innovation and transformation literature, create significant risks to firm survival. To this end, we conduct a grounded analysis of a European automotive firm's innovation journey over an eight-year period. We contribute to the digital innovation and transformation literature by developing a process model explaining how a digital-first approach can be employed in a way that also safeguards the physical core.

    Organizational Ambidexterity and Not-for-profit Financial Performance

    The purpose of this dissertation is to extend the concept of organizational ambidexterity (OA) into the domain of not-for-profit (NFP) organizations. These organizations are subject to many of the same demands as their for-profit counterparts, yet research has not examined how NFPs manage the competing pressures of refining existing routines for efficiency with the need to grow and innovate. This dissertation includes two portions: a quantitative analysis of a large NFP-rating agency dataset, and qualitative interviews with executive directors and managers from within the food banking industry to identify the processes in use at a sample of ambidextrous organizations. The quantitative study uses a financial outcome, fiscal performance, to assess the degree to which financial outcomes are affected by exploration and exploitation, the two actions central to the ambidexterity paradigm. Results of this study indicate that although exploration and exploitation are related to fiscal performance within NFPs, the results vary greatly depending on the industry in question. The qualitative portion of the study indicates that three activities aid NFPs in engaging in exploration and exploitation: managing knowledge, retaining professional talent, and enabling leadership. The study concludes with implications for researchers and managers, as well as suggestions for future research extensions.

    Digital Preservation Services : State of the Art Analysis

    Research report funded by the DC-NET project. An overview of the state of the art in service provision for digital preservation and curation, focusing on the areas where gaps must be bridged between e-Infrastructures and efficient, forward-looking digital preservation services. Based on a desktop study and a rapid analysis of some 190 currently available tools and services for digital preservation, the deliverable provides a high-level view of the range of instruments currently on offer to support various functions within a preservation system. European Commission, FP7. Peer-reviewed.

    Causal mechanisms that enable institutionalisation of open government data in Kenya

    Open Government Data (OGD) has become a topic of prominence during the last decade. However, most governments have not realized the desired outcomes from OGD, which implies that the envisaged value streams have not materialized. This study defines three objectives that help address this shortcoming. First, it seeks to identify the causal mechanisms that lead to effective institutionalization and sustainability of OGD initiatives in a developing-country context. Second, it seeks to identify the social, economic, cultural and political structures and components that describe the OGD context. Third, it seeks to identify the underlying context-mechanism-outcome (CMO) configurations in the Kenya Open Data Initiative (KODI). The guiding philosophy for this qualitative study is critical realism, which is implemented using Pawson & Tilley's realist evaluation model. Data are obtained through observation of open data events, semi-structured interviews and documentary materials from websites and policy documents. Fereday & Muir-Cochrane's five-stage thematic analysis model is applied in conducting the data analysis. Three main contributions arise from this study. The first contribution is the open data institutionalization analysis guide: this study collates several institutionalization concepts from the literature with the aim of developing a lens for analyzing OGD initiatives. The second contribution is the identification of supporting mechanisms, including a description of the current CMO configurations. The resulting case study provides an in-depth account of KODI between 2011 and 2016, which will assist policy makers in understanding the current setup, identifying gaps, and establishing or supporting existing support structures and mechanisms. The third contribution relates to the scarcity of empirical work based on critical realism in the field of information systems: this research will act as a reference point for future IS research in determining how critical realism can be applied to conduct similar studies.