
    On Deterministic Sketching and Streaming for Sparse Recovery and Norm Estimation

    We study classic streaming and sparse recovery problems using deterministic linear sketches, including l1/l1 and linf/l1 sparse recovery (the latter also known as l1-heavy hitters), norm estimation, and approximate inner product. We focus on devising a fixed matrix A in R^{m x n} and a deterministic recovery/estimation procedure that work for all possible input vectors simultaneously. Our results improve upon existing work, the following being our main contributions:
    * A proof that linf/l1 sparse recovery and inner product estimation are equivalent, and that incoherent matrices can be used to solve both problems. Our upper bound on the number of measurements is m = O(eps^{-2} * min{log n, (log n / log(1/eps))^2}). We also obtain fast sketching and recovery algorithms by making use of the Fast Johnson-Lindenstrauss transform. Both our running times and number of measurements improve upon previous work. We also obtain better error guarantees than previous work, in terms of a smaller tail of the input vector.
    * A new lower bound on the number of linear measurements required to solve l1/l1 sparse recovery. We show that Omega(k/eps^2 + k log(n/k)/eps) measurements are required to recover an x' with |x - x'|_1 <= (1+eps)|x_{tail(k)}|_1, where x_{tail(k)} is x projected onto all but its largest k coordinates in magnitude.
    * A tight bound of m = Theta(eps^{-2} log(eps^2 n)) on the number of measurements required to solve deterministic norm estimation, i.e., to recover |x|_2 +/- eps|x|_1.
    For all the problems we study, tight bounds on the randomized complexity are already known from previous work, except in the case of l1/l1 sparse recovery, where a nearly tight bound is known. Our work thus studies the deterministic complexities of these problems.
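    The key property of incoherent matrices used in the abstract above is that sketches approximately preserve inner products: <Ax, Ay> ~ <x, y>. The sketch below illustrates this with a scaled random-sign matrix standing in for a deterministic incoherent construction (the paper's actual matrices are deterministic; the dimensions and tolerance here are illustrative choices, not the paper's parameters).

```python
import numpy as np

# Illustration only: a scaled random-sign matrix has low coherence with
# high probability, so <Ax, Ay> approximates <x, y>. The paper instead
# builds a *fixed* incoherent matrix that works for all inputs at once.
rng = np.random.default_rng(0)
n, m = 2_000, 500
A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)

x = np.zeros(n); x[:5] = [3, -2, 1.5, 1, -1]   # sparse input vector
y = np.zeros(n); y[3:8] = [1, 2, -1, 0.5, 1]   # overlaps x at indices 3, 4

sx, sy = A @ x, A @ y    # m-dimensional sketches: all we keep of x and y
est = sx @ sy            # estimate of <x, y> recovered from sketches alone
print(est, x @ y)        # estimate vs. exact inner product (here -1.0)
```

The estimation error scales like |x|_2 * |y|_2 / sqrt(m), which is why the number of measurements m grows as eps^{-2} in the bounds quoted above.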

    Relativistic Two-Body Processes in Axial-Charge Transitions

    We study the contribution of two-body meson-exchange processes to axial-charge transitions for nuclei in the lead, tin, and oxygen regions. We conduct calculations in the Dirac-Hartree (the Walecka model) and the relativistic Hartree (where the full one-nucleon-loop effects are included) approximations. We present results indicating that one- and two-body processes enhance the matrix elements of the axial-charge operator by some (100 +- 20)% in all three regions studied. This agrees well with the fit of eighteen first-forbidden beta-decay transitions conducted by Warburton in the lead region. We also discuss some sensitivities present in the calculation. Comment: 23 pages, RevTeX format, 5 PostScript figures available on request

    Modeling quark-hadron duality for relativistic, confined fermions

    We discuss a model for the study of quark-hadron duality in inclusive electron scattering, based on solving the Dirac equation numerically for a scalar confining linear potential and a vector color Coulomb potential. We qualitatively reproduce the features of quark-hadron duality for all potentials considered, and discuss similarities and differences with previous models that simplified the situation by treating either the quarks or all particles as scalars. We discuss the scaling results for PWIA and FSI, and the approach to scaling using the analog of the Callan-Gross relation for y-scaling. Comment: 38 pages, 21 figures

    The Self Model and the Conception of Biological Identity in Immunology

    The self/non-self model, first proposed by F.M. Burnet, has dominated immunology for sixty years now. According to this model, any foreign element will trigger an immune reaction in an organism, whereas endogenous elements will not, in normal circumstances, induce an immune reaction. In this paper we show that the self/non-self model is no longer an adequate explanation of experimental data in immunology, and that this inadequacy may be rooted in an excessively strong metaphysical conception of biological identity. We suggest that another hypothesis, one based on the notion of continuity, gives a better account of immune phenomena. Finally, we underscore the mapping between this metaphysical deflation from self to continuity in immunology and the philosophical debate between substantialism and empiricism about identity.

    National Institutes of Health Consensus Development Project on Criteria for Clinical Trials in Chronic Graft-Versus-Host Disease: III. The 2014 Biomarker Working Group Report

    Biology-based markers to confirm or aid in the diagnosis or prognosis of chronic GVHD after allogeneic hematopoietic cell transplantation (HCT), or to monitor its progression, are critically needed to facilitate evaluation of new therapies. Biomarkers have been defined as any characteristic that is objectively measured and evaluated as an indicator of a normal biological or pathogenic process, or of a pharmacologic response to a therapeutic intervention. Applications of biomarkers in chronic GVHD clinical trials or patient management include: a) diagnosis and assessment of chronic GVHD disease activity, including distinguishing irreversible damage from continued disease activity, b) prognosis of the risk of developing chronic GVHD, and c) prediction of response to therapy. Sample collection for chronic GVHD biomarker studies should be well documented, following established quality control guidelines for sample acquisition, processing, preservation, and testing, at intervals that are both calendar- and event-driven. The consistent therapeutic treatment of subjects and the standardized documentation needed to support biomarker studies are most likely to be provided in prospective clinical trials. To date, no chronic GVHD biomarkers have been qualified for use in clinical applications. Since our previous chronic GVHD Biomarkers Working Group report in 2005, an increasing number of chronic GVHD candidate biomarkers have become available for further investigation. This paper provides a four-part framework for biomarker investigations: identification, verification, qualification, and application, with terminology based on Food and Drug Administration and European Medicines Agency guidelines.

    $\Lambda^+_c$- and $\Lambda_b$-hypernuclei

    $\Lambda^+_c$- and $\Lambda_b$-hypernuclei are studied in the quark-meson coupling (QMC) model. Comparisons are made with the results for $\Lambda$-hypernuclei studied previously in the same model. Although the scalar and vector potentials felt by the $\Lambda$, $\Lambda^+_c$ and $\Lambda_b$ in the corresponding hypernuclear multiplets with the same baryon number are quite similar, the wave functions obtained, e.g., for the $1s_{1/2}$ state, are very different. The $\Lambda^+_c$ baryon density distribution in $^{209}_{\Lambda^+_c}$Pb is pushed away from the center much more than that of the $\Lambda$ in $^{209}_{\Lambda}$Pb, due to the Coulomb force. On the contrary, the $\Lambda_b$ baryon density distributions in $\Lambda_b$-hypernuclei are much larger near the origin than those of the $\Lambda$ in the corresponding $\Lambda$-hypernuclei, due to its heavy mass. It is also found that the level spacing of the $\Lambda_b$ single-particle energies is much smaller than that for the $\Lambda$ and $\Lambda^+_c$. Comment: Latex, 14 pages, 4 figures, text was extended, version to appear in Phys. Rev.

    Machine Learning in Automated Text Categorization

    The automated categorization (or classification) of texts into predefined categories has witnessed a booming interest in the last ten years, due to the increased availability of documents in digital form and the ensuing need to organize them. In the research community the dominant approach to this problem is based on machine learning techniques: a general inductive process automatically builds a classifier by learning, from a set of preclassified documents, the characteristics of the categories. The advantages of this approach over the knowledge engineering approach (consisting of the manual definition of a classifier by domain experts) are very good effectiveness, considerable savings in terms of expert manpower, and straightforward portability to different domains. This survey discusses the main approaches to text categorization that fall within the machine learning paradigm. We discuss in detail issues pertaining to three different problems, namely document representation, classifier construction, and classifier evaluation. Comment: Accepted for publication in ACM Computing Surveys
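    The inductive process this survey describes can be sketched in a few lines: a classifier is learned from preclassified documents rather than hand-written by an expert. Below, a tiny multinomial Naive Bayes learner stands in for the general paradigm; the toy corpus and the two categories are invented for illustration and are not from the survey.

```python
from collections import Counter
import math

# Training set: preclassified documents (text, category) - the only
# expert input the inductive approach needs.
train = [
    ("cheap pills buy now", "spam"),
    ("buy cheap watches now", "spam"),
    ("meeting agenda attached", "ham"),
    ("project meeting notes", "ham"),
]

cats = {c for _, c in train}
word_counts = {c: Counter() for c in cats}   # per-category word frequencies
doc_counts = Counter(c for _, c in train)    # per-category document counts
for text, c in train:
    word_counts[c].update(text.split())
vocab = {w for wc in word_counts.values() for w in wc}

def classify(text):
    """Pick the category maximizing the (log) posterior of the document."""
    def score(c):
        total = sum(word_counts[c].values())
        prior = math.log(doc_counts[c] / len(train))
        # Laplace-smoothed log-likelihood of each word under category c
        return prior + sum(
            math.log((word_counts[c][w] + 1) / (total + len(vocab)))
            for w in text.split()
        )
    return max(cats, key=score)

print(classify("buy pills now"))       # -> spam
print(classify("agenda for meeting"))  # -> ham
```

Swapping in a different learner (e.g., a support vector machine over the same document representation) changes only the classifier-construction step, which is exactly the modularity the survey's three-problem decomposition captures.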

    Using the Wigner-Ibach Surmise to Analyze Terrace-Width Distributions: History, User's Guide, and Advances

    A history is given of the applications of the simple expression, generalized from the surmise by Wigner and also by Ibach, used to extract the strength of the interaction between steps on a vicinal surface via the terrace-width distribution (TWD). A concise guide for use with experiments and a summary of some recent extensions are provided. Comment: 11 pages, 4 figures, reformatted (with revtex) version of refereed paper for special issue of Applied Physics A entitled "From Surface Science to Device Physics", in honor of the retirements of Prof. H. Ibach and Prof. H. Lüth
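    The generalized Wigner surmise referred to above is usually written P_rho(s) = a_rho * s^rho * exp(-b_rho * s^2), where s is the terrace width in units of its mean; b_rho enforces <s> = 1 and a_rho normalizes the distribution. The sketch below evaluates it; the constants and the mapping from the exponent rho to the dimensionless step-step repulsion strength (via the Calogero-model correspondence) are quoted from memory of the standard TWD literature, not taken from this paper.

```python
import math

def wigner_params(rho):
    # b_rho fixes the mean of s at 1; a_rho normalizes P to unit area.
    b = (math.gamma((rho + 2) / 2) / math.gamma((rho + 1) / 2)) ** 2
    a = 2 * b ** ((rho + 1) / 2) / math.gamma((rho + 1) / 2)
    return a, b

def twd(s, rho):
    """Generalized Wigner surmise P_rho(s) = a * s^rho * exp(-b * s^2)."""
    a, b = wigner_params(rho)
    return a * s ** rho * math.exp(-b * s * s)

def interaction_strength(rho):
    # Assumed Calogero-model relation: A~ = rho*(rho - 2)/4, so that
    # rho = 2 (free fermions, no energetic repulsion) gives A~ = 0.
    return rho * (rho - 2) / 4.0

# Sanity check: <s> should be 1 (crude Riemann sum for rho = 2)
ds = 1e-4
mean = sum(s * twd(s, 2.0) * ds for s in (i * ds for i in range(1, 100_000)))
print(round(mean, 3), interaction_strength(2.0))
```

Fitting a measured TWD to this one-parameter family and reading off rho is the extraction procedure the paper's user's guide concerns.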

    CP Phases in Correlated Production and Decay of Neutralinos in the Minimal Supersymmetric Standard Model

    We investigate the associated production of neutralinos $e^+e^- \to \tilde{\chi}^0_1 \tilde{\chi}^0_2$ accompanied by the leptonic neutralino decay $\tilde{\chi}^0_2 \to \tilde{\chi}^0_1 \ell^+\ell^-$, taking into account initial beam polarization and production-decay spin correlations, in the minimal supersymmetric standard model with general CP phases but without generational mixing in the slepton sector. The stringent constraints from the electron EDM on the CP phases are also included in the discussion. Initial beam polarizations lead to three CP-even distributions and one CP-odd distribution, which can be studied independently of the details of the neutralino decays. We find that the production cross section and the branching fractions of the leptonic neutralino decays are very sensitive to the CP phases. In addition, the production-decay spin correlations lead to several CP-even observables, such as the lepton invariant-mass and lepton angular distributions, and to one interesting T-odd (CP-odd) triple product of the initial electron momentum and the two final lepton momenta, whose size might be large enough to be measured at a high-luminosity future electron-positron collider, or which can play a role complementary to the EDM constraints in constraining the CP phases. Comment: Revtex, 37 pages, 12 eps figures

    Multiple Interactions and the Structure of Beam Remnants

    Recent experimental data have established some of the basic features of multiple interactions in hadron-hadron collisions. The emphasis is therefore now shifting to exploring more detailed aspects. Starting from a brief review of the current situation, a next-generation model is developed, wherein a detailed account is given of correlated flavour, colour, longitudinal and transverse momentum distributions, encompassing both the partons initiating perturbative interactions and the partons left in the beam remnants. Some of the main features are illustrated for the Tevatron and the LHC. Comment: 69 pp, 33 figures