
    A Literature Review on Predictive Monitoring of Business Processes

    The goal of predictive monitoring is to help businesses achieve their goals, take the right business path, predict outcomes, estimate delivery times, and make business processes risk-aware. In this thesis, we have carefully collected and reviewed in detail the literature that falls within this process-mining category. Based on the findings and observations of the literature review, we have designed a Predictive Monitoring Framework and classified the different predictive monitoring techniques. The framework acts as a guide both for researchers investigating this field and for businesses that want to apply these techniques in their respective domains
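The outcome-prediction idea this abstract surveys can be sketched in a few lines. This is a toy illustration, not the thesis's method: the activity names, traces, and outcome labels below are invented, and the predictor simply estimates the probability of an undesired outcome from the last activity observed in a running case.

```python
from collections import defaultdict

# Hypothetical historical event log: each completed trace is a list of
# activities plus its known outcome ("ok" or "bad").
traces = [
    (["register", "check", "approve"], "ok"),
    (["register", "check", "reject"], "bad"),
    (["register", "escalate", "reject"], "bad"),
    (["register", "check", "approve"], "ok"),
]

# Count, for every activity, how often it appeared in a trace that ended badly.
counts = defaultdict(lambda: [0, 0])  # activity -> [bad occurrences, total]
for events, outcome in traces:
    for act in events:
        counts[act][1] += 1
        if outcome == "bad":
            counts[act][0] += 1

def risk(prefix):
    """Estimated probability that a running case ends in an undesired
    outcome, based only on the last activity seen so far."""
    bad, total = counts[prefix[-1]]
    return bad / total if total else 0.0

print(risk(["register", "escalate"]))  # → 1.0 (escalate only seen in bad traces)
```

Real predictive monitoring techniques replace this frequency table with sequence classifiers trained on richer prefix encodings, but the contract is the same: a case prefix in, an outcome probability out.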

    Surveillance, big data and democracy: lessons for Australia from the US and UK

    This article argues that current laws are ill-equipped to deal with the multifaceted threats to individual privacy by governments, corporations and our own need to participate in the information society. Introduction In the era of big data, where people find themselves surveilled in ever more finely granulated aspects of their lives, and where the data profiles built from an accumulation of data gathered about themselves and others are used to predict as well as shape their behaviours, the question of privacy protection arises constantly. In this article we interrogate whether the discourse of privacy is sufficient to address this new paradigm of information flow and control. What we confront in this area is a set of practices concerning the collection, aggregation, sharing, interrogation and uses of data on a scale that crosses private and public boundaries, jurisdictional boundaries, and importantly, the boundaries between reality and simulation. The consequences of these practices are emerging as sometimes useful and sometimes damaging to governments, citizens and commercial organisations. Understanding how to regulate this sphere of activity to address the harms, to create an infrastructure of accountability, and to bring more transparency to the practices mentioned, is a challenge of some complexity. Using privacy frameworks may not provide the solutions or protections that ultimately are being sought. This article is concerned with data gathering and surveillance practices, by business and government, and the implications for individual privacy in the face of widespread collection and use of big data. We will firstly outline the practices around data and the issues that arise from such practices. We then consider how courts in the United Kingdom (‘UK’) and the United States (‘US’) are attempting to frame these issues using current legal frameworks, and finish by considering the Australian context. 
Notably, the discourse around privacy protection differs significantly across these jurisdictions, encompassing elements of constitutional rights and freedoms, specific legislative schemes, data protection, anti-terrorist and criminal laws, tort and equity. This lack of a common understanding of what is or what should be encompassed within privacy makes it a very fragile creature indeed. On the basis of the exploration of these issues, we conclude that current laws are ill-equipped to deal with the multifaceted threats to individual privacy by governments, corporations and our own need to participate in the information society.

    Optimal Portfolio Management in Alaska: A Case Study on Risk Characteristics of Environmental Consulting Companies

    A Project Submitted in Partial Fulfillment of the Requirements for the Degree of MASTER OF SCIENCE in Project Management
    Sharp declines in global oil prices have led to a marked contraction in Alaska’s natural resource dependent economy. This, coupled with the State’s record budgetary shortfalls and a decrease in incoming federal dollars, has created a climate where environmental consulting companies must accept riskier projects to balance portfolio growth and security. As a result, companies must adopt a risk-based portfolio management approach as both a high-level strategy and a core management practice. It is important to identify the projects best suited to an organization’s risk tolerance, based on industry supply and demand under rapidly changing economic conditions. Therefore, the aims of this project report are to help environmental consulting companies identify risk characteristics and manage their portfolios, as well as to develop a tool that guides decision-making and the selection of projects best suited to a company’s portfolio strategy. The results of this research may provide Alaska-based environmental companies with a clear understanding of the types of projects that offer both development and financial security for an organization. This research paper will present the methodology, results, and an environmental consulting portfolio management tool.
    Title Page / Table of Contents / List of Exhibits / Abstract / Introduction / Background / Literature Review / Project Methodology / Research Methodology / Presentation and Analysis of Data from Survey / Data Validation From Survey / Conclusion / Recommendation / Project Conclusion / Recommendations for Further Research / References / Appendix
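The report's core idea of selecting projects against a risk tolerance can be illustrated with a minimal sketch. All project names, margins, and risk scores below are invented placeholders, not survey data, and the scoring rule is a deliberately simple risk-adjusted value.

```python
# Hypothetical project candidates: (name, expected margin in USD, risk score 0..1).
projects = [
    ("remediation_survey", 120_000, 0.2),
    ("spill_response",     300_000, 0.7),
    ("permit_support",      60_000, 0.1),
]

RISK_TOLERANCE = 0.5  # maximum acceptable risk score for this portfolio

def select(candidates, tolerance):
    """Keep projects within the risk tolerance, ranked by risk-adjusted value
    (margin discounted by the probability the project goes badly)."""
    eligible = [p for p in candidates if p[2] <= tolerance]
    return sorted(eligible, key=lambda p: p[1] * (1 - p[2]), reverse=True)

for name, margin, risk in select(projects, RISK_TOLERANCE):
    print(name, round(margin * (1 - risk)))
```

Under this rule the high-margin but high-risk project is excluded outright, which mirrors the report's point that tolerance must be set before value is compared.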

    Alarm-Based Prescriptive Process Monitoring

    Predictive process monitoring is concerned with the analysis of events produced during the execution of a process in order to predict the future state of ongoing cases. Existing techniques in this field are able to predict, at each step of a case, the likelihood that the case will end up in an undesired outcome. These techniques, however, do not take into account what process workers may do with the generated predictions in order to decrease the likelihood of undesired outcomes. This paper proposes a framework for prescriptive process monitoring, which extends predictive process monitoring approaches with the concepts of alarms, interventions, compensations, and mitigation effects. The framework incorporates a parameterized cost model to assess the cost-benefit tradeoffs of applying prescriptive process monitoring in a given setting. The paper also outlines an approach to optimize the generation of alarms given a dataset and a set of cost model parameters. The proposed approach is empirically evaluated using a range of real-life event logs.
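The cost-model idea — trading the cost of an intervention against the cost of an undesired outcome when deciding when an alarm should fire — can be sketched as follows. The costs, mitigation effectiveness, predicted risks, and grid search below are illustrative placeholders, not the paper's parameters or optimization method.

```python
# Illustrative cost-model parameters (not taken from the paper).
C_INTERVENE = 20.0    # cost of raising an alarm and intervening on a case
C_UNDESIRED = 50.0    # cost incurred when an unalarmed case ends badly
EFFECTIVENESS = 0.8   # fraction of the bad-outcome cost an intervention prevents

# Validation cases: (predicted risk of a bad outcome, did it actually end badly).
cases = [(0.9, True), (0.8, True), (0.6, False), (0.3, False), (0.2, True)]

def total_cost(threshold):
    """Total cost over the validation set if an alarm fires whenever
    the predicted risk reaches the threshold."""
    cost = 0.0
    for risk, bad in cases:
        if risk >= threshold:          # alarm fires -> pay for the intervention
            cost += C_INTERVENE
            if bad:                    # intervention mitigates, not eliminates
                cost += (1 - EFFECTIVENESS) * C_UNDESIRED
        elif bad:                      # missed a case that ended badly
            cost += C_UNDESIRED
    return cost

# Naive grid search for the alarm threshold with the lowest total cost.
best_t = min(range(11), key=lambda t: total_cost(t / 10)) / 10
print(best_t, total_cost(best_t))  # → 0.7 110.0
```

Even this toy version shows the tradeoff the framework formalizes: a low threshold wastes interventions on healthy cases, a high one lets bad cases through, and the optimum sits in between.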
