    Surveillance, big data and democracy: lessons for Australia from the US and UK

    This article argues that current laws are ill-equipped to deal with the multifaceted threats to individual privacy posed by governments, corporations and our own need to participate in the information society.

    In the era of big data, where people find themselves surveilled in ever more finely granulated aspects of their lives, and where the data profiles built from an accumulation of data gathered about themselves and others are used to predict as well as shape their behaviours, the question of privacy protection arises constantly. In this article we interrogate whether the discourse of privacy is sufficient to address this new paradigm of information flow and control. What we confront in this area is a set of practices concerning the collection, aggregation, sharing, interrogation and use of data on a scale that crosses private and public boundaries, jurisdictional boundaries, and, importantly, the boundaries between reality and simulation. The consequences of these practices are emerging as sometimes useful and sometimes damaging to governments, citizens and commercial organisations. Understanding how to regulate this sphere of activity in order to address the harms, create an infrastructure of accountability, and bring more transparency to the practices mentioned is a challenge of some complexity, and privacy frameworks may not provide the solutions or protections ultimately being sought. This article is concerned with data gathering and surveillance practices by business and government, and the implications for individual privacy in the face of widespread collection and use of big data. We first outline the practices around data and the issues that arise from them. We then consider how courts in the United Kingdom (‘UK’) and the United States (‘US’) are attempting to frame these issues using current legal frameworks, and finish by considering the Australian context.
Notably, the discourse around privacy protection differs significantly across these jurisdictions, encompassing elements of constitutional rights and freedoms, specific legislative schemes, data protection, anti-terrorist and criminal laws, tort and equity. This lack of a common understanding of what is or should be encompassed within privacy makes it a very fragile creature indeed. On the basis of the exploration of these issues, we conclude that current laws are ill-equipped to deal with the multifaceted threats to individual privacy posed by governments, corporations and our own need to participate in the information society.

    Improvement of the Model of Using Analytical Procedures at Internal Auditing of a Bank

    Between 2014 and 1 July 2019, the number of banks in Ukraine fell sharply, from 180 to 76, against a backdrop of sustained instability in the country. The liquidation of 104 banks showed that they could not manage their risks or make correct managerial decisions in time, and that their systems of internal control functioned poorly. That system includes the internal audit function, which failed to reveal inaccuracies in time, so sound recommendations for managerial decisions were not developed. To carry out internal auditing in a bank, the audit unit uses audit procedures, among them analytical procedures, which are central to achieving the objectives of an audit engagement. This article considers the most pressing questions of using analytical procedures in internal bank auditing. The definition of “analytical procedures” is examined and refined; analytical procedures are distinguished from audit procedures generally, and their theoretical aspects are analyzed. The classification of analytical-procedure methods for obtaining audit evidence is analyzed. Analytical procedures comprise methods for evaluating the internal control system and analyzing the bank’s financial condition, as well as analysis of its business processes. The advantages and shortcomings of the methodological components of analytical procedures are presented and analyzed. The stages of an audit engagement are studied: analytical procedures are considered at three stages of internal auditing (planning, performing the engagement, and reporting results), viewed through the prism of economic analysis. The International Standards for the Professional Practice of Internal Auditing that regulate these procedures are considered, as are questions of the working papers that summarize the results of applying analytical procedures.
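    One widely used analytical procedure of the kind the abstract describes is fluctuation (trend) analysis: comparing account balances across periods and flagging items whose change exceeds a tolerance, so the internal auditor can investigate them first. A minimal sketch follows; the account names, figures, and the 20% threshold are illustrative assumptions, not taken from the article.

```python
def flag_fluctuations(prior, current, tolerance=0.20):
    """Flag accounts whose period-over-period relative change exceeds `tolerance`.

    prior, current: dicts mapping account name -> balance.
    Returns a dict of flagged accounts with their relative change.
    """
    flags = {}
    for account, prev in prior.items():
        cur = current.get(account, 0.0)
        # A zero prior balance makes any movement infinitely large in relative terms.
        change = (cur - prev) / prev if prev else float("inf")
        if abs(change) > tolerance:
            flags[account] = round(change, 4)
    return flags

# Illustrative balances for two reporting periods (hypothetical).
prior = {"loans": 1000.0, "deposits": 800.0, "cash": 150.0}
current = {"loans": 1300.0, "deposits": 820.0, "cash": 90.0}

flags = flag_fluctuations(prior, current)
```

    Here only `loans` (+30%) and `cash` (-40%) exceed the tolerance and would be carried into the auditor's working papers; `deposits` (+2.5%) would not.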

    A Literature Review on Predictive Monitoring of Business Processes

    The goal of predictive monitoring is to help businesses achieve their goals: choose the right process path, predict outcomes, estimate delivery times, and make business processes risk-aware. In this thesis, we have carefully collected and reviewed in detail the literature that falls within this process-mining category. Based on the results and observations of the literature review, the objective of the thesis is to design a predictive monitoring framework and to classify the different predictive monitoring techniques. The framework acts as a guide for researchers investigating this field and for businesses that want to apply these techniques in their respective domains.

    Predictive Process Monitoring Methods: Which One Suits Me Best?

    Predictive process monitoring has recently gained traction in academia and is also maturing in industry. However, with the growing body of research, it can be daunting for companies to navigate this domain and to determine, given certain data, what can be predicted and which methods to use. The main objective of this paper is to develop a value-driven framework for classifying existing work on predictive process monitoring. This objective is achieved by systematically identifying, categorizing, and analyzing existing approaches to predictive process monitoring. The review is then used to develop a value-driven framework that can help organizations navigate the predictive process monitoring field, find value, and exploit the opportunities enabled by these analysis techniques.

    Implementing system-wide risk stratification approaches: a review of critical success and failure factors

    An Evidence Check rapid review brokered by the Sax Institute for the NSW Agency for Clinical Innovation.

    Machine Learning in Business Process Monitoring: A Comparison of Deep Learning and Classical Approaches Used for Outcome Prediction

    Predictive process monitoring aims at forecasting the behavior, performance, and outcomes of business processes at runtime. It helps identify problems before they occur and re-allocate resources before they are wasted. Although deep learning (DL) has yielded breakthroughs, most existing approaches build on classical machine learning (ML) techniques, particularly when it comes to outcome-oriented predictive process monitoring. This circumstance reflects a lack of understanding about which event log properties facilitate the use of DL techniques. To address this gap, the authors compared the performance of DL techniques (i.e., simple feedforward deep neural networks and long short-term memory networks) and ML techniques (i.e., random forests and support vector machines) on five publicly available event logs. They observed that DL generally outperforms classical ML techniques. Moreover, three specific propositions could be inferred from further observations: First, the outperformance of DL techniques is particularly strong for logs with a high variant-to-instance ratio (i.e., many non-standard cases). Second, DL techniques perform more stably in the case of imbalanced target variables, especially for logs with a high event-to-activity ratio (i.e., many loops in the control flow). Third, logs with a high activity-to-instance payload ratio (i.e., input data predominantly generated at runtime) call for the application of long short-term memory networks. Due to the purposive sampling of event logs and techniques, these findings also hold for logs outside this study.
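    Two of the event-log properties used above to characterise logs can be computed directly from the control flow. A minimal sketch, assuming a toy log represented as a list of traces (lists of activity labels); the activity names are hypothetical, and real event logs would typically carry timestamps and payload attributes as well:

```python
def log_properties(log):
    """Compute simple event-log characteristics from a list of traces.

    variant_to_instance: distinct control-flow paths per case (non-standard cases).
    event_to_activity:  events per distinct activity label (high when loops repeat activities).
    """
    variants = {tuple(trace) for trace in log}        # distinct control-flow paths
    activities = {a for trace in log for a in trace}  # distinct activity labels
    n_events = sum(len(trace) for trace in log)
    return {
        "variant_to_instance": len(variants) / len(log),
        "event_to_activity": n_events / len(activities),
    }

# Hypothetical toy log with four cases; the last trace contains a loop.
toy_log = [
    ["register", "check", "approve"],
    ["register", "check", "approve"],
    ["register", "check", "reject"],
    ["register", "check", "check", "approve"],
]

props = log_properties(toy_log)
```

    For this toy log there are 3 variants over 4 cases (ratio 0.75) and 13 events over 4 activities (ratio 3.25); per the propositions above, higher values of these ratios would favour DL techniques.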

    On the role of pre and post-processing in environmental data mining

    The quality of discovered knowledge is highly dependent on data quality. Unfortunately, real data usually contains noise, uncertainty, errors, redundancies or even irrelevant information. The more complex the reality to be analyzed, the higher the risk of obtaining low-quality data. Knowledge Discovery from Databases (KDD) offers a global framework for preparing data in the right form to perform correct analyses. On the other hand, the quality of decisions taken upon KDD results depends not only on the quality of the results themselves, but also on the capacity of the system to communicate those results in an understandable form. Environmental systems are particularly complex, and environmental users particularly require clarity in their results. This paper provides some details of how this can be achieved, and discusses the role of pre- and post-processing in the whole process of knowledge discovery in environmental systems.
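    Two pre-processing steps of the kind described above, filtering out-of-range noise and filling the resulting gaps, can be sketched in a few lines. This is a minimal illustration assuming a hypothetical environmental sensor series with a known physically valid range; real KDD pipelines would use richer imputation and outlier-detection methods:

```python
def preprocess(series, low, high):
    """Pre-process a numeric sensor series before mining.

    1. Replace readings outside [low, high] (or missing) with None.
    2. Impute each gap with the average of the nearest valid neighbours
       (or the single available neighbour at a series boundary).
    """
    cleaned = [x if (x is not None and low <= x <= high) else None for x in series]
    for i, x in enumerate(cleaned):
        if x is None:
            prev = next((cleaned[j] for j in range(i - 1, -1, -1) if cleaned[j] is not None), None)
            nxt = next((cleaned[j] for j in range(i + 1, len(cleaned)) if cleaned[j] is not None), None)
            if prev is None and nxt is None:
                continue  # nothing to impute from; leave the gap
            cleaned[i] = prev if nxt is None else nxt if prev is None else (prev + nxt) / 2
    return cleaned

# Hypothetical temperature readings: one missing value, one sensor glitch (999.0).
readings = [20.0, None, 22.0, 999.0, 24.0]
cleaned = preprocess(readings, low=0, high=50)
```

    The glitch value 999.0 is treated the same way as a missing reading: first discarded as out of range, then imputed from its valid neighbours.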
    • 

    corecore