
    Easier and faster is not always better: Grounded theory of a large-scale, system transformation on the clinical work of emergency nurses and physicians

    Lean Thinking was pioneered during the 1980s by the Toyota Motor Company as a method of process improvement for its production lines. Since the early 2000s, there have been published reports of using Lean to redesign healthcare systems. Its effectiveness as a quality improvement method for healthcare has been contested due, in part, to our limited contextual understanding of how Lean affects the working conditions and clinical workflow of healthcare professionals. The objective of this dissertation was to explore how a Lean intervention may impact clinical work, and within what contexts. A realist grounded theory approach was used to explore the clinical work of nurses and physicians practicing in two adult emergency medicine departments (EDs) of a single teaching hospital in Canada. The hospital has 1,000 beds, and its two EDs annually treat about 128,000 patients. In 2013, both sites began a large-scale, Lean-driven system transformation that was intended to make their ED work easier, faster, and better. Three grounded theories (GTs) were developed from interviews with 15 nurses and five physicians. The first GT describes ways in which the reconfigured EDs disrupted professionals’ established practice routines and resulted in the intensification of their clinical work. Professionals also identified indications of deskilling of nurses’ work and described how the new, push-forward model of patient care detrimentally impacted their physical, cognitive, and emotional well-being. A major element of the Lean intervention was the construction of a three-zone front cell at both sites. The second GT describes how the physical configuration of the front cell further intensified professionals’ clinical work by requiring them to actively search for spaces better affording privacy and confidentiality for patient encounters.
The third GT describes how professionals perceived that their hospital fell short of demonstrating effective leadership throughout the development and execution of its Lean-driven plans. Of particular salience to nurses and doctors was how their institution had failed to deliver on a set of procedural and structural changes they recalled were promised to occur as a result of the Lean intervention. Rather than support nurses and physicians in their management of the complexities that characterize emergency medicine, the physical and process-based changes introduced by the Lean intervention acted to further complicate the environment in which they delivered patient care. The GTs illuminate some unintended consequences of accelerating patient flow on the clinical workflow and perceived well-being of healthcare professionals. This dissertation identifies some areas for reconsideration by the ED departments, along with ideas for future research. Keywords: Emergency medicine; Lean Thinking; Hospital; Grounded Theory

    Anonymization of Event Logs for Network Security Monitoring

    A managed security service provider (MSSP) must collect security event logs from its customers’ networks for monitoring and cybersecurity protection. These logs need to be processed by the MSSP before being displayed to the security operation center (SOC) analysts. Employees generate event logs during their working hours at the customers’ sites. One challenge is that the collected event logs contain personally identifiable information (PII), visible in clear text to the SOC analysts or to any user with access to the SIEM platform. We explore how pseudonymization can be applied to security event logs to help protect individuals’ identities from the SOC analysts while preserving data utility when possible. We compare the impact of using different pseudonymization functions on sensitive information or PII. Non-deterministic methods provide a higher level of privacy but reduce the utility of the data. Our contribution in this thesis is threefold. First, we study available architectures with different threat models, including their strengths and weaknesses. Second, we study pseudonymization functions and their application to PII fields; we benchmark them individually, as well as in our experimental platform. Last, we obtain valuable feedback and lessons from SOC analysts based on their experience. Existing works [43, 44, 48, 39] are generally restricted to the anonymization of IP traces, which is only one part of the SOC analysts’ inspection of PCAP files. In one of the closest works [47], the authors provide useful, practical anonymization methods for IP addresses, ports, and raw logs.
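    The trade-off between deterministic and non-deterministic pseudonymization described above can be illustrated with a minimal sketch. This is not the thesis's actual platform; the function names and the per-tenant key are hypothetical, and a keyed HMAC stands in for whatever deterministic function an MSSP might deploy:

    ```python
    import hashlib
    import hmac
    import secrets

    def deterministic_pseudonym(value: str, key: bytes) -> str:
        # Keyed HMAC: the same input always maps to the same pseudonym,
        # so analysts can still correlate events about one user without
        # seeing the underlying PII.
        return hmac.new(key, value.encode(), hashlib.sha256).hexdigest()[:16]

    def random_pseudonym(value: str) -> str:
        # Fresh random token per event: stronger privacy, but cross-event
        # correlation (and hence some analytic utility) is lost.
        return secrets.token_hex(8)

    key = b"rotate-me-per-tenant"  # hypothetical per-customer secret
    a = deterministic_pseudonym("alice@example.com", key)
    b = deterministic_pseudonym("alice@example.com", key)
    print(a == b)  # True: events remain linkable under the pseudonym
    ```

    Because the deterministic variant is keyed, an analyst who sees only pseudonyms cannot reverse them without the key, yet repeated appearances of the same identity stay linkable, which matches the utility-versus-privacy trade-off the abstract describes.
    
    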

    Privacy in the Smart City - Applications, Technologies, Challenges and Solutions

    Many modern cities strive to integrate information technology into every aspect of city life to create so-called smart cities. Smart cities rely on a large number of application areas and technologies to realize complex interactions between citizens, third parties, and city departments. This overwhelming complexity is one reason why holistic privacy protection only rarely enters the picture. A lack of privacy can result in discrimination and social sorting, creating a fundamentally unequal society. To prevent this, we believe that a better understanding of smart cities and their privacy implications is needed. We therefore systematize the application areas, enabling technologies, privacy types, attackers, and data sources for the attacks, giving structure to the fuzzy term “smart city”. Based on our taxonomies, we describe existing privacy-enhancing technologies, review the state of the art in real cities around the world, and discuss promising future research directions. Our survey can serve as a reference guide, contributing to the development of privacy-friendly smart cities.

    PROCESS CONFORMANCE TESTING: A METHODOLOGY TO IDENTIFY AND UNDERSTAND PROCESS VIOLATIONS IN ENACTMENT OF SOFTWARE PROCESSES

    Today's software development is driven by software processes and practices that, when followed, increase the chances of building high-quality software products. Not following these guidelines results in an increased risk that the goals for the software's quality characteristics cannot be reached. Current process analysis approaches are limited in identifying and understanding process deviations, and ultimately fail to comprehend why a process does not work in a given environment and which steps of the process have to be changed and tailored. In this work I present a methodology for formulating, identifying, and investigating process violations in the execution of software processes. The methodology, which can be thought of as "Process Conformance Testing", consists of a four-step iterative model comprising templates and tools. A strong focus is set on identifying violations in a cost-efficient and unobtrusive manner by utilizing automatically collected data gathered through commonly used software development tools, such as version control systems. To evaluate the usefulness and correctness of the model, a series of four studies was conducted in both classroom and professional environments. A total of eight different software processes were investigated and tested. The results of the studies show that the steps and iterative character of the methodology are useful for formulating and tailoring violation detection strategies and for investigating violations in both classroom and professional environments. All the investigated processes were violated in some way, which emphasizes the importance of conformance measurement. This is especially important when running an empirical study to evaluate the effectiveness of a software process, as the experimenters want to make sure they are evaluating the specified process and not a variation of it.
Violation detection strategies were tailored based on analysis of the history of violations and feedback from the enactors and managers, yielding greater precision in identifying non-conformities. The overhead cost of the approach is shown to be feasible, at 3.4% in the professional environment and 12.1% in the classroom environment. One interesting side result is that process enactors did not always follow the process, often for good reason, e.g. the process was not tailored for the environment, it was not specified at the right level of granularity, or it was too difficult to follow. Two specific examples in this thesis are XP Pair Switching and Test-Driven Development. In XP Pair Switching, the practice was violated because the frequency of switching was too high. The definition of Test-Driven Development is simple and clear but requires a fair amount of discipline to follow, especially by novice programmers.
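The idea of detecting violations unobtrusively from version-control data can be sketched as follows. This is one simple, illustrative detection strategy, not the thesis's actual tooling; the commit log, file-naming conventions (`tests/` prefix, `.py` suffix), and function name are all hypothetical:

```python
def find_violations(commits, src_suffix=".py", test_prefix="tests/"):
    """Flag commits that change production code without touching any test,
    a crude proxy for a Test-Driven Development deviation. Real strategies
    would be tailored per process and refined iteratively."""
    violations = []
    for sha, files in commits:
        touches_src = any(f.endswith(src_suffix) and not f.startswith(test_prefix)
                          for f in files)
        touches_test = any(f.startswith(test_prefix) for f in files)
        if touches_src and not touches_test:
            violations.append(sha)
    return violations

# Hypothetical commit log: (commit id, files changed), e.g. from `git log`.
log = [
    ("a1", ["tests/test_cart.py", "cart.py"]),  # conforming: test changed too
    ("b2", ["cart.py"]),                        # violation: code change only
    ("c3", ["README.md"]),                      # not production code
]
print(find_violations(log))  # ['b2']
```

Because the input is data that developers already produce as a side effect of their normal work, a check like this imposes no extra burden on the process enactors, which is the cost-efficiency property the methodology emphasizes.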

    In the IP of the Beholder: Strategies for Active IPv6 Topology Discovery

    Existing methods for active topology discovery within the IPv6 Internet largely mirror those of IPv4. In light of the large and sparsely populated address space, in conjunction with aggressive ICMPv6 rate limiting by routers, this work develops a different approach to Internet-wide IPv6 topology mapping. First, we adopt randomized probing techniques in order to distribute probing load, minimize the effects of rate limiting, and probe at higher rates. Second, we extensively analyze the efficiency and efficacy of various IPv6 hitlists and target generation methods when used for topology discovery, and synthesize new target lists based on our empirical results to provide both breadth (coverage across networks) and depth (to find potential subnetting). Employing our probing strategy, we discover more than 1.3M IPv6 router interface addresses from a single vantage point. Finally, we share our prober implementation, synthesized target lists, and discovered IPv6 topology results.
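    The randomized-probing idea can be sketched in a few lines: draw pseudo-random addresses inside each target prefix, then shuffle the whole list so consecutive probes rarely hit the same router. This is a minimal illustration of load-spreading, not the authors' actual prober; the function name, prefix list, and per-prefix sample count are assumptions:

    ```python
    import ipaddress
    import random

    def randomized_targets(prefixes, per_prefix=3, seed=0):
        # Seeded RNG so a measurement run is reproducible.
        rng = random.Random(seed)
        targets = []
        for p in prefixes:
            net = ipaddress.ip_network(p)
            for _ in range(per_prefix):
                # Pick a random interface identifier within the prefix.
                targets.append(net[rng.randrange(net.num_addresses)])
        # Interleave prefixes globally so no single router (or rate
        # limiter) sees a concentrated burst of probes.
        rng.shuffle(targets)
        return targets

    # Documentation prefixes (RFC 3849) stand in for real announced /64s.
    probes = randomized_targets(["2001:db8:1::/64", "2001:db8:2::/64"])
    print(len(probes))  # 6
    ```

    Spreading probes this way is what lets a prober sustain higher aggregate rates without any one router's ICMPv6 rate limiter suppressing its responses.
    
    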

    The Politics of Exhaustion: Immigration Control in the British-French Border Zone

    Within a climate of growing anti-immigration and populist forces gaining traction across Europe, and in response to the increased number of prospective asylum seekers arriving in Europe, recent years have seen the continued hardening of borders and a disconcerting evolution of new forms of immigration control measures utilised by states. Based on extensive field research carried out amongst displaced people in Europe in 2016–2019, this article highlights the way in which individuals in northern France are finding themselves trapped in a violent border zone, unable to move forward whilst having no obvious alternative way out of their predicament. The article seeks to illustrate the violent dynamics inherent in the immigration control measures in this border zone, characterised by both direct physical violence as well as banalised and structural forms of violence, including state neglect through the denial of services and care. The author suggests that the raft of violent measures and micro-practices authorities resort to in the French-British border zone could be understood as constituting one of the latest tools for European border control and obstruction of access to asylum procedures; a Politics of Exhaustion.

    Searching Places Unknown: Law Enforcement Jurisdiction on the Dark Web

    The use of hacking tools by law enforcement to pursue criminal suspects who have anonymized their communications on the dark web presents a looming flashpoint between criminal procedure and international law. Criminal actors who use the dark web (for instance, to commit crimes or to evade authorities) obscure the digital footprints left behind with third parties, rendering existing surveillance methods obsolete. In response, law enforcement has implemented hacking techniques that deploy surveillance software over the Internet to directly access and control criminals’ devices. The practical reality of the underlying technologies makes it inevitable that foreign-located computers will be subject to remote “searches” and “seizures.” The result may well be the greatest extraterritorial expansion of enforcement jurisdiction in U.S. law enforcement history. This Article examines how the government’s use of hacking tools on the dark web profoundly disrupts the legal architecture on which cross-border criminal investigations rest. These overseas cyberoperations raise increasingly difficult questions regarding who may authorize these activities, where they may be deployed, and against whom they may lawfully be executed. The rules of criminal procedure fail to regulate law enforcement hacking because they allow these critical decisions to be made by rank-and-file officials despite potentially disruptive foreign relations implications. This Article outlines a regulatory framework that reallocates decisionmaking to the institutional actors who are best suited to determine U.S. foreign policy and avoids sacrificing law enforcement’s ability to identify and locate criminal suspects who have taken cover on the dark web.