
    Discovery-led refinement in e-discovery investigations: sensemaking, cognitive ergonomics and system design.

    Given the very large numbers of documents involved in e-discovery investigations, lawyers face a considerable challenge of collaborative sensemaking. We report findings from three workplace studies that looked at different aspects of how this challenge was met. From a sociotechnical perspective, the studies aimed to understand how investigators collectively and individually worked with information to support sensemaking and decision making. Here, we focus on discovery-led refinement; specifically, how engaging with the materials of the investigations led to discoveries that supported refinement of the problems and new strategies for addressing them. These refinements were essential for tractability. We begin with observations which show how new lines of enquiry were recursively embedded. We then analyse the conceptual structure of a line of enquiry and consider how reflecting this in e-discovery support systems might support scalability and group collaboration. We then focus on the individual activity of manual document review, where refinement corresponded with the inductive identification of classes of irrelevant and relevant documents within a collection. Our observations point to the effects of priming on dealing with these efficiently and to issues of cognitive ergonomics at the human–computer interface. We use these observations to introduce visualisations that might enable reviewers to deal with such refinements more efficiently.

    Smoothed Analysis of the Minimum-Mean Cycle Canceling Algorithm and the Network Simplex Algorithm

    The minimum-cost flow (MCF) problem is a fundamental optimization problem with many applications and seems to be well understood. Over the last half century many algorithms have been developed to solve the MCF problem, and these algorithms have varying worst-case bounds on their running time. However, these worst-case bounds are not always a good indication of the algorithms' performance in practice. The Network Simplex (NS) algorithm needs an exponential number of iterations for some instances, but it is considered the best algorithm in practice and performs best in experimental studies. On the other hand, the Minimum-Mean Cycle Canceling (MMCC) algorithm is strongly polynomial, but performs badly in experimental studies. To explain these differences in performance in practice we apply the framework of smoothed analysis. We show an upper bound of O(mn² log(n) log(ϕ)) for the number of iterations of the MMCC algorithm. Here n is the number of nodes, m is the number of edges, and ϕ is a parameter limiting the degree to which the edge costs are perturbed. We also show a lower bound of Ω(m log(ϕ)) for the number of iterations of the MMCC algorithm, which can be strengthened to Ω(mn) when ϕ = Θ(n²). For the number of iterations of the NS algorithm we show a smoothed lower bound of Ω(m · min{n, ϕ} · ϕ). Comment: Extended abstract to appear in the proceedings of COCOON 201
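
    As a rough illustration of the cycle-canceling idea behind the MMCC algorithm, the Python sketch below repeatedly cancels negative-cost cycles in a residual network until none remain. It is a simplified stand-in, not the paper's algorithm: genuine MMCC selects the residual cycle of minimum mean cost (found, for example, with Karp's algorithm), which is what the iteration bounds above refer to, whereas this sketch detects any negative residual cycle with Bellman–Ford. The arc encoding (paired forward/reverse residual arcs at indices i and i ^ 1) is an assumption made for brevity.

```python
def find_negative_cycle(n, arcs, cap):
    """Bellman-Ford negative-cycle detection on the residual network.
    arcs[i] = (u, v, cost); cap[i] = residual capacity of arc i.
    Returns the indices of arcs forming a negative-cost cycle, or None."""
    dist, pred = [0.0] * n, [None] * n        # virtual source: all vertices start at 0
    last = None
    for _ in range(n):                        # n rounds; a relaxation in the final
        last = None                           # round implies a negative cycle
        for i, (u, v, c) in enumerate(arcs):
            if cap[i] > 0 and dist[u] + c < dist[v] - 1e-12:
                dist[v], pred[v], last = dist[u] + c, i, v
    if last is None:
        return None
    for _ in range(n):                        # step back n times to land on the cycle
        last = arcs[pred[last]][0]
    cycle, v = [], last
    while True:
        cycle.append(pred[v])
        v = arcs[pred[v]][0]
        if v == last:
            return cycle[::-1]

def cancel_cycles(n, arcs, cap):
    """Cancel negative residual cycles until the flow is min-cost.
    Convention (assumed): arc i ^ 1 is the reverse of arc i."""
    while True:
        cycle = find_negative_cycle(n, arcs, cap)
        if cycle is None:
            return                            # no negative residual cycle remains
        delta = min(cap[i] for i in cycle)    # bottleneck residual capacity
        for i in cycle:
            cap[i] -= delta                   # push delta around the cycle
            cap[i ^ 1] += delta
```
    Real MMCC instead cancels the residual cycle whose mean cost (total cost divided by number of arcs) is minimal; that choice is what yields the strongly polynomial iteration bounds analysed in the abstract.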

    Gravitational Collapse and Fragmentation in Molecular Clouds with Adaptive Mesh Refinement

    We describe a powerful methodology for numerical solution of 3-D self-gravitational hydrodynamics problems with extremely high resolution. Our method utilizes the technique of local adaptive mesh refinement (AMR), employing multiple grids at multiple levels of resolution. These grids are automatically and dynamically added and removed as necessary to maintain adequate resolution. This technology allows for the solution of problems in a manner that is both more efficient and more versatile than other fixed and variable resolution methods. The application of AMR to simulate the collapse and fragmentation of a molecular cloud, a key step in star formation, is discussed. Such simulations involve many orders of magnitude of variation in length scale as fragments form. In this paper we briefly describe the methodology and present an illustrative application for nonisothermal cloud collapse. We describe the numerical Jeans condition, a criterion for stability of self-gravitational hydrodynamics problems. We show the first well-resolved nonisothermal evolutionary sequence beginning with a perturbed dense molecular cloud core that leads to the formation of a binary system consisting of protostellar cores surrounded by distinct protostellar disks. The scale of the disks, of order 100 AU, is consistent with observations of gaseous disks surrounding single T-Tauri stars and debris disks surrounding systems such as β Pictoris. Comment: 10 pages, 6 figures (color postscript). To appear in the proceedings of Numerical Astrophysics 1998, Tokyo, March 10-13, 1998.
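
    The numerical Jeans condition mentioned above is commonly stated as the requirement that the local Jeans length, λ_J = sqrt(π c_s² / (G ρ)), be resolved by at least a few grid cells, with refinement triggered wherever it is not (four cells per Jeans length is the usual rule of thumb). The sketch below is a minimal illustration of such a refinement criterion under those assumptions; it is not taken from the paper's code, and the unit system (cgs) and threshold are placeholders.

```python
import numpy as np

G_CGS = 6.674e-8  # gravitational constant, cm^3 g^-1 s^-2 (assumed cgs units)

def jeans_length(cs, rho):
    """Local Jeans length lambda_J = sqrt(pi * cs**2 / (G * rho)),
    for sound speed cs [cm/s] and mass density rho [g/cm^3]."""
    return np.sqrt(np.pi * cs**2 / (G_CGS * rho))

def flag_for_refinement(dx, cs, rho, cells_per_jeans_length=4):
    """Jeans-condition refinement flag: refine a cell of width dx whenever the
    local Jeans length is resolved by fewer than the requested number of cells.
    Works element-wise on NumPy arrays of cs and rho."""
    return dx * cells_per_jeans_length > jeans_length(cs, rho)

# Example (hypothetical fields): refine = flag_for_refinement(dx, cs_field, rho_field)
```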

    The gradient of potential vorticity, quaternions and an orthonormal frame for fluid particles

    The gradient of potential vorticity (PV) is an important quantity because of the way PV (denoted as q) tends to accumulate locally in the oceans and atmospheres. Recent analysis by the authors has shown that the vector quantity B = ∇q × ∇θ for the three-dimensional incompressible rotating Euler equations evolves according to the same stretching equation as the vorticity ω and the magnetic field B in magnetohydrodynamics (MHD). The B-vector therefore acts like the vorticity ω in Euler's equations and the B-field in MHD. For example, it allows various analogies, such as stretching dynamics, helicity, superhelicity and cross helicity. In addition, using quaternionic analysis, the dynamics of the B-vector naturally allow the construction of an orthonormal frame attached to fluid particles; this is designated as a quaternion frame. The alignment dynamics of this frame are particularly relevant to the three-axis rotations that particles undergo as they traverse regions of a flow where the PV gradient ∇q is large. Comment: Dedicated to Raymond Hide on the occasion of his 80th birthday.
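
    Taking the abstract at face value, B = ∇q × ∇θ is claimed to satisfy the same stretching equation as the vorticity in the incompressible Euler equations and the magnetic field in ideal MHD. A hedged LaTeX reconstruction of that statement (the exact form and notation used in the paper may differ, and θ is assumed here to denote the potential temperature, which the abstract does not define) is:

```latex
% Hedged reconstruction from the abstract; q = potential vorticity,
% \theta = potential temperature (assumed), \mathbf{u} = fluid velocity.
\[
  \mathbf{B} \;=\; \nabla q \times \nabla\theta ,
  \qquad
  \frac{D\mathbf{B}}{Dt}
  \;\equiv\;
  \frac{\partial \mathbf{B}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\mathbf{B}
  \;=\; (\mathbf{B}\cdot\nabla)\,\mathbf{u},
\]
% the same transport-and-stretching form obeyed by the vorticity
% \boldsymbol{\omega} in the incompressible Euler equations and by the
% magnetic field in ideal MHD.
```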

    The effect of posterior subtenon methylprednisolone acetate in the refractory diabetic macular edema: a prospective nonrandomized interventional case series

    BACKGROUND: To investigate the efficacy of posterior subtenon methylprednisolone acetate injection in the treatment of refractory diffuse clinically significant diabetic macular edema (CSME). METHODS: In a prospective, nonrandomized, interventional case series, 52 eyes were diagnosed with CSME and treated with at least two sessions of laser photocoagulation according to Early Treatment Diabetic Retinopathy Study guidelines. At least 3 months after laser therapy, eyes with residual central macular thickening were offered posterior subtenon injection of 40 mg methylprednisolone acetate. Main outcome measures were visual acuity, macular thickness and intraocular pressure. Potential complications were monitored, including intraocular pressure response, cataract progression and scleral perforation. RESULTS: Mean baseline visual acuity (in logMAR) improved significantly (p = 0.003) from 0.8 ± 0.36 to 0.6 ± 0.41 at 3 months. Mean foveal thickness decreased from 388 ± 78 μm at baseline to 231 ± 40 μm after 3 months (p < 0.0001). Visual acuity improvement was significant in eyes with CSME with extrafoveal hard exudates (p = 0.0001), but not significant in eyes with CSME with subfoveal hard exudates (p = 0.32). Intraocular pressure increased from 14.7 ± 2.0 mmHg (range, 12–18 mmHg) to a maximum value of 15.9 ± 2.1 mmHg (range, 12–20 mmHg) during the follow-up period. Two eyes developed focal conjunctival necrosis at the site of injection. CONCLUSION: Posterior subtenon methylprednisolone acetate may improve early visual outcome in diffuse diabetic macular edema that fails to respond to conventional laser photocoagulation. The improvement in visual acuity was significant in eyes with extrafoveal hard exudates and appears to depend on the location of the hard exudates. Further study is needed to assess long-term efficacy, safety, and retreatment.

    Classical Effective Field Theory for Weak Ultra Relativistic Scattering

    Inspired by the problem of Planckian scattering, we describe a classical effective field theory for weak ultra-relativistic scattering in which field propagation is instantaneous and transverse and the particles' equations of motion localize to the instant of passing. An analogy with the non-relativistic (post-Newtonian) approximation is stressed. The small parameter is identified and power counting rules are established. The theory is applied to reproduce the leading scattering angle for a scalar, electromagnetic, or gravitational interaction field; to compute some subleading corrections, including the interaction duration; and to allow for non-zero masses. For the gravitational case we present an appropriate decomposition of the gravitational field onto the transverse plane together with its whole non-linear action. Along the way we touch upon the relation with the eikonal approximation, some evidence for censorship of quantum gravity, and an algebraic ring structure on 2d Minkowski spacetime. Comment: 29 pages, 2 figures. v4: Duration of interaction is determined in Sec 4 and detailed in App C. Version accepted for publication in JHEP.
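
    For orientation on the "leading scattering angle", the familiar one-body special case is the small-angle gravitational deflection of an ultra-relativistic (effectively massless) particle passing a mass M at impact parameter b. The textbook value is quoted below purely as context; it is not the paper's two-body result, which the EFT is said to reproduce and generalize.

```latex
% Classic leading-order light-bending angle (one-body, small-angle limit):
\[
  \theta \;\simeq\; \frac{4 G M}{c^{2}\, b}.
\]
```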

    Photonic Analogue of Two-dimensional Topological Insulators and Helical One-Way Edge Transport in Bi-Anisotropic Metamaterials

    Recent progress in understanding the topological properties of condensed matter has led to the discovery of time-reversal invariant topological insulators. Because of limitations imposed by nature, topologically non-trivial electronic order seems to be uncommon except in small-band-gap semiconductors with strong spin-orbit interactions. In this Article we show that artificial electromagnetic structures, known as metamaterials, provide an attractive platform for designing photonic analogues of topological insulators. We demonstrate that a judicious choice of the metamaterial parameters can create photonic phases that support a pair of helical edge states, and that these edge states enable one-way photonic transport that is robust against disorder. Comment: 13 pages, 3 figures.

    The sign problem across the QCD phase transition

    The average phase factor of the QCD fermion determinant signals the strength of the QCD sign problem. We compute the average phase factor as a function of temperature and baryon chemical potential using a two-flavor NJL model. This allows us to study the strength of the sign problem at and above the chiral transition. We discuss how the U_A(1) anomaly affects the sign problem. Finally, we study the interplay between the sign problem and the endpoint of the chiral transition. Comment: 9 pages and 9 figures.
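
    For reference, the average phase factor is conventionally defined relative to the phase-quenched ensemble. Writing det M(μ) = |det M(μ)| e^{iθ} for a single flavor, a standard two-flavor form is sketched below in LaTeX; the notation is generic and assumed rather than copied from the paper (in the NJL model the functional integral runs over the model's fields rather than gauge fields).

```latex
% Generic definition of the average phase factor (notation assumed):
\[
  \bigl\langle e^{2 i \theta} \bigr\rangle_{\mathrm{pq}}
  \;=\;
  \frac{\displaystyle \int \mathcal{D}\Phi \; e^{-S_{\mathrm{eff}}}\, \bigl(\det M(\mu)\bigr)^{2}}
       {\displaystyle \int \mathcal{D}\Phi \; e^{-S_{\mathrm{eff}}}\, \bigl|\det M(\mu)\bigr|^{2}}
  \;=\;
  \frac{Z(\mu)}{Z_{\mathrm{pq}}(\mu)},
\]
% which approaches zero where the sign problem is severe and stays near one
% where it is mild.
```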

    Absence of association between behavior problems in childhood and hypertension in midlife

    Background It is known that behavior in childhood is associated with certain physical and mental health problems in midlife. However, there is limited evidence on the role of childhood behavior problems in the development of hypertension in adulthood. The present study aimed to examine whether behavior problems in childhood influenced the risk of hypertension in midlife in the United Kingdom 1958 birth cohort. Methods The 1958 British birth cohort comprised 17,638 individuals born in the first week of March 1958 in the United Kingdom. Behavior problems were assessed at 7, 11, and 16 years of age by parents and teachers, using the Rutter Children's Behaviour Questionnaire (RCBQ) and the Bristol Social Adjustment Guide (BSAG). At age 45, blood pressure was measured and hypertension was recorded if blood pressure was ≥140/90 mm Hg or if the participants had been told by a health professional that they had high blood pressure. Odds ratios (ORs) and 95% confidence intervals (CIs) for the association between childhood behavior problems and hypertension at 45 years of age were estimated with logistic regression, adjusted for sex, social class in childhood and adulthood, childhood cognition, birth weight, gestational age at birth, body mass index (BMI), smoking, alcohol consumption, and physical activity. Results Behavior problems reported by parents at 7, 11, and 16 years were not associated with hypertension in midlife (OR 0.93, 95% CI 0.81–1.07; OR 0.95, 95% CI 0.81–1.11; OR 0.98, 95% CI 0.85–1.12, respectively). Similarly, teacher-reported behavior problems at 7, 11, and 16 years were not associated with hypertension in midlife (OR 0.92, 95% CI 0.72–1.18; OR 0.92, 95% CI 0.84–1.02; OR 1.03, 95% CI 0.92–1.15, respectively). Further separate analyses showed similar results for males and females. Conclusion There is no association between behavior problems in childhood and hypertension in midlife.
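
    A minimal Python sketch of the kind of adjusted odds-ratio estimate reported above, using logistic regression via statsmodels. All column names are hypothetical placeholders, not the cohort's actual variable coding, and the covariate list simply mirrors the adjustments named in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def adjusted_odds_ratio(df: pd.DataFrame, exposure: str, covariates: list):
    """Fit hypertension ~ exposure + covariates by logistic regression and
    return the odds ratio for the exposure with its 95% confidence interval."""
    X = sm.add_constant(df[[exposure] + covariates])
    fit = sm.Logit(df["hypertension"], X).fit(disp=0)   # outcome assumed binary 0/1
    odds_ratio = np.exp(fit.params[exposure])
    ci_lower, ci_upper = np.exp(fit.conf_int().loc[exposure])
    return odds_ratio, (ci_lower, ci_upper)

# Hypothetical usage (column names are placeholders):
# or_, ci = adjusted_odds_ratio(
#     cohort, "behavior_score_age7",
#     ["sex", "childhood_class", "adult_class", "cognition", "birth_weight",
#      "gestational_age", "bmi", "smoking", "alcohol", "physical_activity"])
```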

    Past Achievements and Future Challenges in 3D Photonic Metamaterials

    Photonic metamaterials are man-made structures composed of tailored micro- or nanostructured metallo-dielectric sub-wavelength building blocks that are densely packed into an effective material. This deceptively simple, yet powerful and truly revolutionary, concept allows for achieving novel, unusual, and sometimes even unheard-of optical properties, such as magnetism at optical frequencies, negative refractive indices, large positive refractive indices, zero reflection via impedance matching, perfect absorption, giant circular dichroism, and enhanced nonlinear optical properties. Possible applications of metamaterials comprise ultrahigh-resolution imaging systems, compact polarization optics, and cloaking devices. This review describes the experimental progress made recently in fabricating three-dimensional metamaterial structures and discusses some remaining future challenges.