
    Term testing: a case study

    Purpose and background: The litigation world has many examples of cases where the volume of Electronically Stored Information (ESI) demands that litigators use automated means to assist with document identification, classification, and filtering. This case study describes one such process for one case; it is not a comprehensive analysis of the entire case, only of the Term Testing portion. Term Testing is an analytical practice of refining match terms by running in-depth analysis on a sampling of documents. The goal of term testing is to reduce, as far as possible, both the number of false negatives (relevant or privileged documents with no match, also known as “misdetections”) and false positives (documents matched but not actually relevant or privileged). The case was an employment discrimination suit against a government agency. The collection effort turned up common sources of ESI: hard drives, network shares, CDs and DVDs, and routine e-mail storage and backups. Initial collection, interviews, and reviews had revealed that a few key documents, such as old versions of policies, had not been retained or collected. Then an unexpected source of information was unearthed: one network administrator had been running an unauthorized “just-in-case” tracer on the email system, outside the agency’s document retention policies, which had created dozens of tapes holding millions of encrypted, compressed emails covering more years than the agency’s routine email backups. The agency decided to process and review these tracer emails for the missing key documents, even though doing so would dramatically increase the volume of relevant documents. The agency therefore had clear motivation to reduce the volume of documents flowing into relevancy and privilege reviews, but had concerns about the defensibility of using an automated process to determine which documents would never be reviewed.
The case litigators and Subject Matter Experts (SMEs) decided to use a process of Term Testing to ensure that automated filtering was both defensible and as accurate as possible.

    Anomalous Thermoluminescent Kinetics of Irradiated Alkali Halides


    Rethinking the patient: using Burden of Treatment Theory to understand the changing dynamics of illness

    Background: In this article we outline Burden of Treatment Theory, a new model of the relationship between sick people, their social networks, and healthcare services. Health services face the challenge of growing populations with long-term and life-limiting conditions; they have responded by delegating to sick people and their networks routine work aimed at managing symptoms and at retarding, and sometimes preventing, disease progression. This is the new proactive work of patienthood for which patients are increasingly accountable: founded on ideas about self-care, self-empowerment, and self-actualization, and on new technologies and treatment modalities which can be shifted from the clinic into the community. These place new demands on sick people, which they may experience as burdens of treatment. Discussion: As the burdens accumulate, some patients are overwhelmed, and the likely consequences are poor healthcare outcomes for individual patients, increasing strain on caregivers, and rising demand for and costs of healthcare services. In the face of these challenges we need to better understand the resources that patients draw upon as they respond to the demands of both burdens of illness and burdens of treatment, and the ways those resources interact with healthcare utilization. Summary: Burden of Treatment Theory is oriented to understanding how capacity for action interacts with the work that stems from healthcare. It is a structural model that focuses on the work that patients and their networks do, and it thus helps us understand variations in healthcare utilization and adherence in different healthcare settings and clinical contexts.

    Lost in translation: a multi-level case study of the metamorphosis of meanings and action in public sector organisational innovation

    This paper explores the early implementation of an organisational innovation in the UK National Health Service (NHS) - Treatment Centres (TCs) - designed to dramatically reduce waiting lists for elective care. The paper draws on case studies of eight TCs (each at varying stages of their development) and aims to explore how meanings about TCs are created and evolve, and how these meanings impact upon the development of the organisational innovation. Research on organisational meanings needs to take greater account of the fact that modern organisations like the NHS are complex multi-level phenomena, comprising layers of interlacing networks. To understand the pace, direction, and impact of organisational innovation and change we need to study the interconnections between meanings across different organisational levels. The data presented in this paper show how the apparently simple, relatively unformed concept of a TC framed by central government is translated and transmuted by subsequent layers in the health service administration, and by players in local health economies and, ultimately, in the TCs themselves, picking up new rationales, meanings, and significance as it goes. The developmental histories of TCs reveal a range of significant re-workings of macro policy, with the result that there is considerable diversity and variation between local TC schemes. The picture is of important disconnections between meanings that in many ways mirror Weick’s (1976) ‘loosely coupled systems’. The emergent meanings and the direction of micro-level development of TCs appear more strongly determined by interactions within the local TC environment, notably between what we identify as groups of ‘idealists’, ‘pragmatists’, ‘opportunists’ and ‘sceptics’, than by the framing (Goffman 1974) provided by macro and meso organisational levels.
While this illustrates the limitations of top-down and policy-driven attempts at change, and highlights the crucial importance of the front-line local ‘micro-systems’ (Donaldson & Mohr, 2000) in the overall scheme of implementing organisational innovations, the space or headroom provided by frames at the macro and meso levels can enable local change, albeit at variable speed and with uncertain outcomes.

    WavePacket: A Matlab package for numerical quantum dynamics. III: Quantum-classical simulations and surface hopping trajectories

    WavePacket is an open-source program package for numerical simulations in quantum dynamics. Building on the previous Part I [Comp. Phys. Comm. 213, 223-234 (2017)] and Part II [Comp. Phys. Comm. 228, 229-244 (2018)], which dealt with quantum dynamics of closed and open systems, respectively, the present Part III adds fully classical and mixed quantum-classical propagations to WavePacket. In those simulations, classical phase-space densities are sampled by trajectories which follow (diabatic or adiabatic) potential energy surfaces. In the vicinity of (genuine or avoided) intersections of those surfaces, trajectories may switch between surfaces. To model these transitions, two classes of stochastic algorithms have been implemented: (1) J. C. Tully's fewest switches surface hopping and (2) Landau-Zener based single switch surface hopping. The latter offers the advantage of being based on adiabatic energy gaps only, thus no longer requiring non-adiabatic coupling information. The present work describes the MATLAB version of WavePacket 6.0.2, which is essentially an object-oriented rewrite of previous versions, allowing users to perform fully classical, quantum-classical, and quantum-mechanical simulations on an equal footing, i.e., for the same physical system described by the same WavePacket input. The software package is hosted and further developed on the SourceForge platform, where extensive Wiki documentation as well as numerous worked-out demonstration examples with animated graphics are also available.
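The appeal of the Landau-Zener single-switch scheme mentioned in the abstract is that the hop decision needs only the adiabatic energy gap along the trajectory. A common adiabatic-gap formulation (attributed to Belyaev and Lebedev) attempts a hop only at a local minimum of the gap Z(t), with probability P = exp(-(pi/(2*hbar)) * sqrt(Z_min^3 / Z''(t_min))). The sketch below is not WavePacket code; it is a minimal stand-alone illustration under that assumed formula, with atomic units and all names chosen for illustration.

```python
import math
import random

HBAR = 1.0  # atomic units, assumed for illustration

def lz_hop_probability(gap_prev, gap_min, gap_next, dt):
    """Landau-Zener hop probability at a local minimum of the adiabatic gap.

    The second time derivative of the gap Z(t) is estimated by a central
    finite difference over three consecutive time steps of size dt.
    """
    d2gap = (gap_prev - 2.0 * gap_min + gap_next) / dt**2
    if d2gap <= 0.0:  # not a genuine local minimum; no hop attempted
        return 0.0
    return math.exp(-(math.pi / (2.0 * HBAR)) * math.sqrt(gap_min**3 / d2gap))

def maybe_hop(gaps, t_index, dt, rng=random.random):
    """Stochastic single-switch decision at time index t_index."""
    p = lz_hop_probability(gaps[t_index - 1], gaps[t_index], gaps[t_index + 1], dt)
    return rng() < p, p

# A trajectory whose adiabatic gap passes through a narrow avoided crossing:
gaps = [0.50, 0.20, 0.05, 0.20, 0.50]
hopped, p = maybe_hop(gaps, 2, dt=1.0)
print(round(p, 3))
```

A small gap at the minimum gives a hop probability close to one (near-diabatic passage), while a wide avoided crossing suppresses hopping, which is the qualitative behavior the single-switch algorithm relies on; no non-adiabatic coupling vector appears anywhere in the decision.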

    What makes a 'good group'? Exploring the characteristics and performance of undergraduate student groups

    Group work forms the foundation for much of student learning within higher education, and has many educational, social and professional benefits. This study aimed to explore the determinants of success or failure for undergraduate student teams and to define a ‘good group’ through considering three aspects of group success: the task, the individuals, and the team. We employed a mixed methodology, combining demographic data with qualitative observations and task and peer evaluation scores. We determined associations between group dynamics and behaviour, demographic composition, member personalities and attitudes towards one another, and task success. We also employed a cluster analysis to create a model outlining the attributes of a good small-group learning team in veterinary education. This model highlights that student groups differ in measures of their effectiveness as teams, independent of their task performance. On this basis, we suggest that groups who achieve high marks in tasks cannot be assumed to have acquired team-working skills; if such skills are important as a learning outcome, they must therefore be assessed directly alongside the task output.

    Electron beam chemistry produces high purity metals

    The application of radiation chemistry to the deposition of metals by irradiating aqueous solutions with high-energy electrons is presented. The design of the reaction vessel used for irradiating the solution is illustrated, and the features of the radiochemical technique and the procedures followed are described.

    The effect of relative plasma plume delay on the properties of complex oxide films grown by multi-laser multi-target combinatorial pulsed laser deposition

    We report the effects of relative time delay of plasma plumes on thin garnet crystal films fabricated by dual-beam, combinatorial pulsed laser deposition. Relative plume delay was found to affect both the lattice constant and elemental composition of mixed Gd3Ga5O12 (GGG) and Gd3Sc2Ga5O12 (GSGG) films. Further analysis of the plasmas was undertaken using a Langmuir probe, which revealed that for relative plume delays shorter than ~200 ÎŒs, the second plume travels through a partial vacuum created by the first plume, leading to higher-energy ion bombardment of the growing film. The resulting in-plane stresses are consistent with the transition to a higher value of lattice constant normal to the film plane that was observed around this delay value. At delays shorter than ~10 ÎŒs, plume propagation was found to overlap, leading to scattering of lighter ions from the plume and a change in stoichiometry of the resultant films.
    • 

    corecore