Term testing: a case study
Purpose and background: The litigation world has many examples of cases where the volume of Electronically Stored Information (ESI) demands that litigators use automated means to assist with document identification, classification, and filtering. This case study describes one such process for one case; it is not a
comprehensive analysis of the entire case, only of the Term Testing portion.
Term Testing is an analytical practice of refining match terms by running in-depth analysis on a sampling
of documents. The goal of term testing is to reduce, as far as possible, both false negatives (relevant or
privileged documents with no match, also known as “misdetections”) and false positives (documents that
match but are not actually relevant or privileged).
The case was an employment discrimination suit against a government agency. The collection effort
turned up common sources of ESI: hard drives, network shares, CDs and DVDs, and routine e-mail
storage and backups. Initial collection, interviews, and reviews had revealed that a few key documents,
such as old versions of policies, had not been retained or collected.
Then an unexpected source of information was unearthed: one network administrator had been running
an unauthorized “just-in-case” tracer on the email system, outside the agency’s document retention
policies, which created dozens of tapes holding millions of encrypted, compressed emails covering more
years than the agency’s routine email backups. The agency decided to process and review these tracer
emails for the missing key documents, even though doing so would greatly increase the overall volume
of relevant documents.
The agency had clear motivation to reduce the volume of documents flowing into relevancy and privilege
reviews, but had concerns about the defensibility of using an automated process to determine which
documents would never be reviewed. The case litigators and Subject Matter Experts (SMEs) decided to
use a process of Term Testing to ensure that automated filtering was both defensible and as accurate as
possible.
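The false-negative / false-positive accounting described above can be sketched in a few lines of Python. This is an illustrative sketch only: the function name, the `(text, is_relevant)` sample format, and the use of regular expressions as match terms are assumptions for demonstration, not details from the case.

```python
import re

def term_test(term, sample):
    """Evaluate one candidate match term against a hand-reviewed sample.

    `sample` is a list of (text, is_relevant) pairs produced by manual
    review; `term` is a regular-expression match term.  Returns the
    false-negative and false-positive rates, the two quantities that
    term testing tries to drive down.
    """
    pattern = re.compile(term, re.IGNORECASE)
    fn = fp = 0
    for text, is_relevant in sample:
        matched = bool(pattern.search(text))
        if is_relevant and not matched:
            fn += 1   # misdetection: relevant/privileged but no match
        elif matched and not is_relevant:
            fp += 1   # matched but not actually relevant/privileged
    total = len(sample)
    return {"false_negative_rate": fn / total,
            "false_positive_rate": fp / total}
```

In practice a reviewer would compare these rates across candidate term variants on the same sample and keep the variant that minimizes both.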
Anomalous Thermoluminescent Kinetics of Irradiated Alkali Halides
WavePacket: A Matlab package for numerical quantum dynamics. III: Quantum-classical simulations and surface hopping trajectories
WavePacket is an open-source program package for numerical simulations in
quantum dynamics. Building on the previous Part I [Comp. Phys. Comm. 213,
223-234 (2017)] and Part II [Comp. Phys. Comm. 228, 229-244 (2018)] which dealt
with quantum dynamics of closed and open systems, respectively, the present
Part III adds fully classical and mixed quantum-classical propagations to
WavePacket. In those simulations classical phase-space densities are sampled by
trajectories which follow (diabatic or adiabatic) potential energy surfaces. In
the vicinity of (genuine or avoided) intersections of those surfaces
trajectories may switch between surfaces. To model these transitions, two
classes of stochastic algorithms have been implemented: (1) J. C. Tully's
fewest switches surface hopping and (2) Landau-Zener based single switch
surface hopping. The latter offers the advantage of being based on adiabatic
energy gaps only, thus no longer requiring non-adiabatic coupling information.
The present work describes the MATLAB version of WavePacket 6.0.2, which is
essentially an object-oriented rewrite of previous versions, allowing fully
classical, quantum-classical, and quantum-mechanical simulations to be
performed on an equal footing, i.e., for the same physical system described by
the same WavePacket input. The software package is hosted and further developed
on the SourceForge platform, where extensive Wiki documentation and numerous
worked-out demonstration examples with animated graphics are also available.
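The gap-based single-switch idea can be illustrated with a minimal Python sketch (WavePacket itself is written in MATLAB). It uses the Belyaev-Lebedev form of the Landau-Zener hopping probability, which needs only the adiabatic energy gap at its local minimum and the gap's second time derivative; the function names, the finite-difference scheme, and atomic units (hbar = 1) are assumptions for illustration, not WavePacket's actual implementation.

```python
import math
import random

HBAR = 1.0  # atomic units

def lz_hop_probability(gap, d2gap_dt2):
    """Landau-Zener hopping probability in the Belyaev-Lebedev form:
    P = exp(-pi/(2*hbar) * sqrt(gap^3 / (d^2 gap / dt^2))),
    evaluated at a local minimum of the adiabatic energy gap.
    No non-adiabatic coupling vectors are required."""
    return math.exp(-math.pi / (2.0 * HBAR)
                    * math.sqrt(gap ** 3 / d2gap_dt2))

def single_switch(gaps, dt, rng=random.random):
    """Scan a gap-vs-time series along one trajectory; at each local
    minimum of the gap, attempt a single stochastic surface switch.
    Returns a list of (step index, hop probability, hopped?) tuples."""
    hops = []
    for i in range(1, len(gaps) - 1):
        if gaps[i] < gaps[i - 1] and gaps[i] <= gaps[i + 1]:
            # finite-difference second time derivative at the minimum
            d2 = (gaps[i - 1] - 2 * gaps[i] + gaps[i + 1]) / dt ** 2
            p = lz_hop_probability(gaps[i], d2)
            hops.append((i, p, rng() < p))
    return hops
```

A trajectory passing close to an avoided crossing (small gap, sharp minimum) thus hops with high probability, while a broad, shallow gap minimum leaves it on its surface, mirroring the single-switch behaviour described above.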
What makes a 'good group'? Exploring the characteristics and performance of undergraduate student groups
Group work forms the foundation for much of student learning within higher education, and has many educational, social and professional benefits. This study aimed to explore the determinants of success or failure for undergraduate student teams and to define a ‘good group’ through considering three aspects of group success: the task, the individuals, and the team. We employed a mixed methodology, combining demographic data with qualitative observations and task and peer evaluation scores. We determined associations between group dynamics and behaviour, demographic composition, member personalities and attitudes towards one another, and task success. We also employed a cluster analysis to create a model outlining the attributes of a good small group learning team in veterinary education. This model highlights that student groups differ in measures of their effectiveness as teams, independent of their task performance. On the basis of this, we suggest that groups who achieve high marks in tasks cannot be assumed to have acquired team working skills, and therefore if these are important as a learning outcome, they must be assessed directly alongside the task output.
The effect of relative plasma plume delay on the properties of complex oxide films grown by multi-laser multi-target combinatorial pulsed laser deposition
We report the effects of relative time delay of plasma plumes on thin garnet crystal films fabricated by dual-beam, combinatorial pulsed laser deposition. Relative plume delay was found to affect both the lattice constant and elemental composition of mixed Gd3Ga5O12 (GGG) and Gd3Sc2Ga5O12 (GSGG) films. Further analysis of the plasmas was undertaken using a Langmuir probe, which revealed that for relative plume delays shorter than ~200 µs, the second plume travels through a partial vacuum created by the first plume, leading to higher-energy ion bombardment of the growing film. The resulting in-plane stresses are consistent with the transition to a higher value of lattice constant normal to the film plane that was observed around this delay value. At delays shorter than ~10 µs, plume propagation was found to overlap, leading to scattering of lighter ions from the plume and a change in stoichiometry of the resultant films.
From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument
<b>Background</b> Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field.
<b>Methods</b> A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals.
<b>Results</b> The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts.
<b>Conclusions</b> To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.
Unusual Phase Transitions and Magnetoelastic Coupling in TlFe1.6Se2 Single Crystals
Structural, magnetic, electrical transport, and heat capacity data are
reported for single crystals of TlFe1.6Se2. This compound crystallizes in a
tetragonal structure similar to the ThCr2Si2 structure, but with vacancies in
the Fe layer. The vacancies can be ordered or disordered depending on
temperature and thermal history. If the vacancies are ordered, the basal plane
lattice constant increases from a to \sqrt{5}a. Antiferromagnetic order with
the Fe spins along the c-axis occurs below T_N ~ 430\,K as shown by single
crystal neutron diffraction, and the magnetic structure is reported. In
addition, for the vacancy-ordered crystal, two other phase transitions are
found at T_1 ~ 140\,K and T_2 ~ 100\,K. The phase transitions at T_1 and T_2 are
evident in heat capacity, magnetic susceptibility, resistivity data, a and c
lattice parameters, and in the unusual temperature dependence of the magnetic
order parameter determined from neutron scattering. The phase transitions at
T_1 and T_2 result in significant changes in the magnetic moment per iron, with
1.72(6)\mu_B observed at 300\,K, 2.07(9)\mu_B at 140\,K, 1.90(9)\mu_B at
115\,K, and 1.31(8)\mu_B at 5\,K if the same "block checkerboard" magnetic
structure is used at all temperatures. The phase transitions appear to be
driven by small changes in the c lattice constant, large magnetoelastic
coupling, and the localization of carriers with decreasing temperature.
Comment: Accepted for publication in Physical Review