
    Accelerated Projected Gradient Method for Linear Inverse Problems with Sparsity Constraints

    Regularization of ill-posed linear inverse problems via $\ell_1$ penalization has been proposed for cases where the solution is known to be (almost) sparse. One way to obtain the minimizer of such an $\ell_1$-penalized functional is via an iterative soft-thresholding algorithm. We propose an alternative implementation of the $\ell_1$ constraint, using a gradient method with projection onto $\ell_1$-balls. The corresponding algorithm again uses iterative soft-thresholding, now with a variable thresholding parameter. We also propose accelerated versions of this iterative method, using ingredients of the (linear) steepest descent method. We prove convergence in norm for one of these projected gradient methods, with and without acceleration. Comment: 24 pages, 5 figures. v2: added reference, some amendments, 27 pages.
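    The abstract describes projected gradient iterations with projection onto an $\ell_1$-ball, which reduces to soft-thresholding with a data-dependent threshold. The following is a minimal sketch of that idea (not the paper's accelerated variants); the function names, step-size choice, and iteration count are illustrative assumptions.

```python
import numpy as np

def project_l1_ball(v, radius):
    """Euclidean projection of v onto the l1-ball {x : ||x||_1 <= radius}.
    This is soft-thresholding with a threshold chosen from the data."""
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]                  # magnitudes, descending
    cssv = np.cumsum(u)
    rho = np.nonzero(u - (cssv - radius) / np.arange(1, len(u) + 1) > 0)[0][-1]
    theta = (cssv[rho] - radius) / (rho + 1.0)    # variable threshold
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def projected_gradient(A, y, radius, n_iter=500):
    """Minimize 0.5*||A x - y||^2 subject to ||x||_1 <= radius."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # safe step size for the smooth term
    for _ in range(n_iter):
        x = project_l1_ball(x - step * A.T @ (A @ x - y), radius)
    return x
```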

    Efficient Resolution of Anisotropic Structures

    We highlight some recent developments concerning the sparse representation of possibly high-dimensional functions exhibiting strong anisotropic features and low regularity in isotropic Sobolev or Besov scales. Specifically, we focus on the solution of transport equations which exhibit propagation of singularities and where, additionally, high dimensionality enters when the convection field, and hence the solutions, depend on parameters varying over some compact set. Important constituents of our approach are directionally adaptive discretization concepts motivated by compactly supported shearlet systems, and well-conditioned stable variational formulations that support trial spaces with anisotropic refinements with arbitrary directionalities. We prove that they provide tight error-residual relations which are used to contrive rigorously founded adaptive refinement schemes which converge in $L_2$. Moreover, in the context of parameter dependent problems we discuss two approaches serving different purposes and working under different regularity assumptions. For frequent query problems, making essential use of the novel well-conditioned variational formulations, a new Reduced Basis Method is outlined which exhibits a certain rate-optimal performance for indefinite, unsymmetric or singularly perturbed problems. For the radiative transfer problem with scattering, a sparse tensor method is presented which mitigates or even overcomes the curse of dimensionality under suitable (so far still isotropic) regularity assumptions. Numerical examples for both methods illustrate the theoretical findings.

    Preceding rule induction with instance reduction methods

    A new prepruning technique for rule induction is presented which applies instance reduction before rule induction. An empirical evaluation records the predictive accuracy and size of rule-sets generated from 24 datasets from the UCI Machine Learning Repository. Three instance reduction algorithms, Edited Nearest Neighbour (ENN), AllKnn and DROP5, are compared. Each one is used to reduce the size of the training set prior to inducing a set of rules using Clark and Boswell's modification of CN2. A hybrid instance reduction algorithm (combining AllKnn and DROP5) is also tested. For most of the datasets, pruning the training set using ENN, AllKnn or the hybrid significantly reduces the number of rules generated by CN2 without adversely affecting the predictive performance. The hybrid achieves the highest average predictive accuracy.
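    The pipeline evaluated here, reducing the training set before inducing rules, can be sketched as below. CN2 is not available in common Python libraries, so a decision tree stands in for the rule inducer; the dataset, neighbourhood size, and other settings are illustrative assumptions rather than the study's configuration.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

def enn_reduce(X, y, k=3):
    """Edited Nearest Neighbour: drop every instance whose label disagrees
    with the majority vote of its k nearest neighbours (excluding itself)."""
    knn = KNeighborsClassifier(n_neighbors=k + 1).fit(X, y)
    # column 0 of the neighbour list is (normally) the point itself, so skip it
    neighbours = knn.kneighbors(X, return_distance=False)[:, 1:]
    keep = np.array([np.bincount(y[nbrs], minlength=y.max() + 1).argmax() == y[i]
                     for i, nbrs in enumerate(neighbours)])
    return X[keep], y[keep]

X, y = load_breast_cancer(return_X_y=True)
X_red, y_red = enn_reduce(X, y)
rule_model = DecisionTreeClassifier(min_samples_leaf=5).fit(X_red, y_red)  # CN2 stand-in
print(f"kept {len(y_red)}/{len(y)} instances; tree has {rule_model.get_n_leaves()} leaves")
```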

    Absence of lattice strain anomalies at the electronic topological transition in zinc at high pressure

    High pressure structural distortions of the hexagonal close packed (hcp) element zinc have been a subject of controversy. Earlier experimental results and theory showed a large anomaly in lattice strain with compression in zinc at about 10 GPa, which was explained theoretically by a change in Fermi surface topology. Later hydrostatic experiments showed no such anomaly, resulting in a discrepancy between theory and experiment. We have computed the compression and lattice strain of hcp zinc over a wide range of compressions using the linearized augmented plane wave (LAPW) method, paying special attention to k-point convergence. We find that the behavior of the lattice strain is strongly dependent on k-point sampling, and with large k-point sets the previously computed anomaly in lattice parameters under compression disappears, in agreement with recent experiments. Comment: 9 pages, 6 figures, Phys. Rev. B (in press).

    Diffusion Resonances in Action Space for an Atom Optics Kicked Rotor with Decoherence

    We numerically investigate momentum diffusion rates for the pulse kicked rotor across the quantum to classical transition, as the dynamics are made more macroscopic by increasing the total system action. For initial and late time rates we observe an enhanced diffusion peak which shifts and scales with changing kick strength, and we also observe distinctive peaks around quantum resonances. Our investigations take place in the context of a system of ultracold atoms which is coupled to its environment via spontaneous emission decoherence, and the effects should be realisable in ongoing experiments. Comment: 4 pages, RevTeX 4, 5 figures. Updated figures, minor changes to text, corrected reference.
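    The quantum atom-optics system with spontaneous emission is not reproduced here, but the quantity being measured, a momentum diffusion rate extracted from the growth of <p^2> with kick number, can be illustrated with the classical standard map; the kick strength, ensemble size, and number of kicks below are arbitrary choices.

```python
import numpy as np

def kicked_rotor_diffusion(K=5.0, n_kicks=200, n_traj=10_000, seed=0):
    """Classical kicked rotor (standard map): estimate the momentum
    diffusion rate D from the linear growth of <p^2> with kick number."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, n_traj)   # random initial phases
    p = np.zeros(n_traj)                            # zero initial momentum
    p2 = np.empty(n_kicks)
    for t in range(n_kicks):
        p += K * np.sin(theta)                      # kick
        theta = (theta + p) % (2.0 * np.pi)         # free evolution between kicks
        p2[t] = np.mean(p ** 2)
    return np.polyfit(np.arange(1, n_kicks + 1), p2, 1)[0]  # slope of <p^2> vs kick number

print(kicked_rotor_diffusion())   # of order K**2 / 2 for large kick strength
```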

    A mental number line in human newborns

    Humans represent numbers on a mental number line with smaller numbers on the left and larger numbers on the right side. A left-to-right oriented spatial-numerical association (SNA) has been demonstrated in animals and infants. However, the possibility that SNA is learnt through early exposure to caregivers' directional biases is still open. We conducted two experiments: in Experiment 1 we tested whether SNA is present at birth, and in Experiment 2 we studied whether it depends on the relative rather than the absolute magnitude of numerousness. Fifty-five-hour-old newborns, once habituated to a number (12), spontaneously associated a smaller number (4) with the left and a larger number (36) with the right side (Experiment 1). SNA in neonates is not absolute but relative: the same number (12) was associated with the left side rather than the right side whenever the previously experienced number was larger (36) rather than smaller (4) (Experiment 2). Controls for continuous physical variables showed that the effect is specific to discrete magnitudes. These results constitute strong evidence that in our species SNA originates from pre-linguistic and biological precursors in the brain.

    Gut microbiota‐dependent trimethylamine N‐oxide and cardiovascular outcomes in patients with prior myocardial infarction: A nested case control study from the PEGASUS‐TIMI 54 trial

    Background Trimethylamine N-oxide (TMAO) may have prothrombotic properties. We examined the association of TMAO quartiles with major adverse cardiovascular events (MACE) and the effect of TMAO on the efficacy of ticagrelor. Methods and Results PEGASUS-TIMI 54 (Prevention of Cardiovascular Events in Patients With Prior Heart Attack Using Ticagrelor Compared to Placebo on a Background of Aspirin - Thrombolysis in Myocardial Infarction 54) randomized patients with prior myocardial infarction to ticagrelor or placebo (median follow-up 33 months). Baseline plasma concentrations of TMAO were measured in a nested case-control study of 597 cases with cardiovascular death, myocardial infarction, or stroke (MACE) and 1206 controls matched for age, sex, and estimated glomerular filtration rate (eGFR). Odds ratios (ORs) were used to assess the association between TMAO quartiles and MACE, adjusting for baseline clinical characteristics (age, sex, eGFR, region, body mass index, hypertension, hypercholesterolemia, diabetes mellitus, smoking, peripheral artery disease, index event, aspirin dosage, and treatment arm) and cardiovascular biomarkers (hs-TnT [high-sensitivity troponin T], hs-CRP [high-sensitivity C-reactive protein], and NT-proBNP [N-terminal pro-B-type natriuretic peptide]). Higher TMAO quartiles were associated with risk of MACE (OR for quartile 4 versus quartile 1, 1.43; 95% CI, 1.06–1.93; P trend=0.015). The association was driven by cardiovascular death (OR 2.25; 95% CI, 1.28–3.96; P trend=0.003) and stroke (OR 2.68; 95% CI, 1.39–5.17; P trend<0.001). After adjustment for clinical factors, the association persisted for cardiovascular death (ORadj 1.89; 95% CI, 1.03–3.45; P trend=0.027) and stroke (ORadj 2.01; 95% CI, 1.01–4.01; P trend=0.022), but was slightly attenuated after adjustment for cardiovascular biomarkers (cardiovascular death: ORadj 1.74; 95% CI, 0.88–3.45; P trend=0.079; stroke: ORadj 1.82; 95% CI, 0.88–3.78; P trend=0.056). The reduction in MACE with ticagrelor was consistent across TMAO quartiles (P interaction=0.92). Conclusions Among patients with prior myocardial infarction, higher TMAO levels were associated with cardiovascular death and stroke but not with recurrent myocardial infarction. The efficacy of ticagrelor was consistent regardless of TMAO levels. Registration URL: https://www.clinicaltrials.gov; Unique identifiers: PEGASUS-TIMI 54, NCT01225562.

    On Deterministic Sketching and Streaming for Sparse Recovery and Norm Estimation

    We study classic streaming and sparse recovery problems using deterministic linear sketches, including l1/l1 and linf/l1 sparse recovery (the latter also known as l1-heavy hitters), norm estimation, and approximate inner product. We focus on devising a fixed matrix A in R^{m x n} and a deterministic recovery/estimation procedure which work for all possible input vectors simultaneously. Our results improve upon existing work; the following are our main contributions:
    * A proof that linf/l1 sparse recovery and inner product estimation are equivalent, and that incoherent matrices can be used to solve both problems. Our upper bound for the number of measurements is m = O(eps^{-2} * min{log n, (log n / log(1/eps))^2}). We can also obtain fast sketching and recovery algorithms by making use of the Fast Johnson-Lindenstrauss transform. Both our running times and number of measurements improve upon previous work, and we obtain better error guarantees in terms of a smaller tail of the input vector.
    * A new lower bound on the number of linear measurements required to solve l1/l1 sparse recovery. We show Omega(k/eps^2 + k log(n/k)/eps) measurements are required to recover an x' with |x - x'|_1 <= (1+eps)|x_{tail(k)}|_1, where x_{tail(k)} is x projected onto all but its largest k coordinates in magnitude.
    * A tight bound of m = Theta(eps^{-2} log(eps^2 n)) on the number of measurements required to solve deterministic norm estimation, i.e., to recover |x|_2 +/- eps|x|_1.
    For all the problems we study, tight bounds on the randomized complexity are already known from previous work, except for l1/l1 sparse recovery, where a nearly tight bound is known. Our work thus aims to study the deterministic complexities of these problems.
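    The deterministic constructions of the paper are not reproduced here, but the point-query estimate underlying linf/l1 sparse recovery with an incoherent matrix can be illustrated as below. A random sign matrix is used, which is incoherent only with high probability rather than deterministically, and the dimensions and values are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5_000, 1_000                      # m ~ eps^-2 * log n up to constants

# Sign matrix with unit-norm columns; with high probability all pairs of
# columns have inner product O(sqrt(log(n)/m)), i.e. the matrix is incoherent.
A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)

# A vector with a few heavy coordinates plus a small tail
x = np.zeros(n)
x[[3, 77, 4096]] = [5.0, -4.0, 3.0]
x += rng.normal(scale=1e-3, size=n)

y = A @ x                                # the linear sketch, taken once
est = A.T @ y                            # point queries: est_i = x_i +/- coherence * ||x||_1
heavy = np.sort(np.argsort(-np.abs(est))[:3])
print("recovered heavy coordinates:", heavy)        # expect [3, 77, 4096]
print("max point-query error:", np.abs(est - x).max())
```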

    A "superstorm": When moral panic and new risk discourses converge in the media

    This is an Author's Accepted Manuscript of an article published in Health, Risk and Society, 15(6), 681-698, 2013, copyright Taylor & Francis, available online at: http://www.tandfonline.com/10.1080/13698575.2013.851180. There has been a proliferation of risk discourses in recent decades, but studies of these have been polarised, drawing either on moral panic or new risk frameworks to analyse journalistic discourses. This article opens the theoretical possibility that the two may co-exist and converge in the same scare. I do this by bringing together more recent developments in the moral panic thesis with new risk theory and the concept of media logic. I then apply this theoretical approach to an empirical analysis of how, and with what consequences, moral panic and new risk type discourses converged in the editorials of four newspaper campaigns against GM food policy in Britain in the late 1990s. The article analyses 112 editorials published between January 1998 and December 2000, supplemented with news stories where these were needed for contextual clarity. The analysis shows that not only did this novel food generate intense media and public reactions; these reactions developed in the absence of the kind of concrete detail journalists usually look for in risk stories. Media logic is important in understanding how journalists were able to engage with the issue, and hence how a major scare could be constructed around convergent moral panic and new risk type discourses. The result was a media ‘superstorm’ of sustained coverage in which both types of discourse converged in highly emotive, mutually reinforcing ways that resonated in a highly sensitised context. The consequence was acute anxiety, social volatility and the potential for the disruption of policy and social change.

    Why social networks are different from other types of networks

    We argue that social networks differ from most other types of networks, including technological and biological networks, in two important ways. First, they have non-trivial clustering or network transitivity, and second, they show positive correlations, also called assortative mixing, between the degrees of adjacent vertices. Social networks are often divided into groups or communities, and it has recently been suggested that this division could account for the observed clustering. We demonstrate that group structure in networks can also account for degree correlations. We show using a simple model that we should expect assortative mixing in such networks whenever there is variation in the sizes of the groups, and that the predicted level of assortative mixing compares well with that observed in real-world networks. Comment: 9 pages, 2 figures.
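    A simplified version of the group-structure argument can be checked numerically: assign nodes to groups of widely varying size, link pairs only within groups, and measure the degree correlation. The model below (one group per node) is a toy stand-in for the paper's model, networkx's built-in assortativity coefficient is used, and the group sizes and link probability are arbitrary.

```python
import random
import networkx as nx

def group_model(group_sizes, p_in=0.3, seed=0):
    """Toy group-based network: nodes belong to one group each and every
    within-group pair is linked independently with probability p_in."""
    rng = random.Random(seed)
    G = nx.Graph()
    node = 0
    for size in group_sizes:
        members = list(range(node, node + size))
        node += size
        G.add_nodes_from(members)
        for i, u in enumerate(members):
            for v in members[i + 1:]:
                if rng.random() < p_in:
                    G.add_edge(u, v)
    return G

sizes = [5] * 40 + [20] * 10 + [60] * 3          # groups of widely varying size
G = group_model(sizes)
print("degree assortativity:", nx.degree_assortativity_coefficient(G))  # positive
```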