
    Expander $\ell_0$-Decoding

    We introduce two new algorithms, Serial-$\ell_0$ and Parallel-$\ell_0$, for solving a large underdetermined linear system of equations $y = Ax \in \mathbb{R}^m$ when it is known that $x \in \mathbb{R}^n$ has at most $k < m$ nonzero entries and that $A$ is the adjacency matrix of an unbalanced left $d$-regular expander graph. The matrices in this class are sparse and allow a highly efficient implementation. A number of algorithms have been designed to work exclusively under this setting, composing the branch of combinatorial compressed sensing (CCS). Serial-$\ell_0$ and Parallel-$\ell_0$ iteratively minimise $\|y - A\hat{x}\|_0$ by successfully combining two desirable features of previous CCS algorithms: the information-preserving strategy of ER and the parallel updating mechanism of SMP. We are able to link these elements and guarantee convergence in $\mathcal{O}(dn \log k)$ operations by assuming that the signal is dissociated, meaning that all of the $2^k$ subset sums of the support of $x$ are pairwise different. However, we observe empirically that the signal need not be exactly dissociated in practice. Moreover, we observe Serial-$\ell_0$ and Parallel-$\ell_0$ to be able to solve large-scale problems with a larger fraction of nonzeros than other algorithms when the number of measurements is substantially less than the signal length; in particular, they are able to reliably solve for a $k$-sparse vector $x \in \mathbb{R}^n$ from $m$ expander measurements with $n/m = 10^3$ and $k/m$ up to four times greater than what is achievable by $\ell_1$-regularization from dense Gaussian measurements. Additionally, Serial-$\ell_0$ and Parallel-$\ell_0$ are observed to solve large problem sizes in substantially less time than other algorithms for compressed sensing. In particular, Parallel-$\ell_0$ is structured to take advantage of massively parallel architectures. Comment: 14 pages, 10 figures
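    To make the combinatorial compressed-sensing setting above concrete, the following minimal Python sketch builds a random left $d$-regular binary measurement matrix, forms the sketch $y = Ax$ of a $k$-sparse signal, and applies a single parallel median update per coordinate, i.e. an SMP-style estimator. It illustrates the setting only; it is not the authors' Serial-$\ell_0$ or Parallel-$\ell_0$ algorithm, and all parameter values are illustrative.

```python
import numpy as np

# Minimal sketch of the combinatorial compressed-sensing setting (illustrative
# parameters; this is an SMP-style median update, not Serial-l0/Parallel-l0).
rng = np.random.default_rng(0)
n, m, d, k = 2000, 400, 7, 20            # signal length, sketch size, left degree, sparsity

# Adjacency matrix of a random left d-regular bipartite graph: d ones per column.
A = np.zeros((m, n))
for j in range(n):
    A[rng.choice(m, size=d, replace=False), j] = 1.0

# k-sparse signal and its expander sketch.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)
y = A @ x

# One parallel update: estimate every coordinate by the median of the d
# measurements that touch it.
x_hat = np.array([np.median(y[A[:, j] > 0]) for j in range(n)])
print("support entries recovered exactly:",
      int(np.sum(np.isclose(x_hat[support], x[support]))))
```

    When $k/m$ is small, most of the $d$ measurements touching a given coordinate contain no other support element, so the median recovers that entry in one pass; the algorithms described above iterate such updates on residuals rather than stopping here.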

    A robust parallel algorithm for combinatorial compressed sensing

    In previous work two of the authors have shown that a vector $x \in \mathbb{R}^n$ with at most $k < n$ nonzeros can be recovered from an expander sketch $Ax$ in $\mathcal{O}(\mathrm{nnz}(A)\log k)$ operations via the Parallel-$\ell_0$ decoding algorithm, where $\mathrm{nnz}(A)$ denotes the number of nonzero entries in $A \in \mathbb{R}^{m \times n}$. In this paper we present the Robust-$\ell_0$ decoding algorithm, which robustifies Parallel-$\ell_0$ when the sketch $Ax$ is corrupted by additive noise. This robustness is achieved by approximating the asymptotic posterior distribution of values in the sketch given its corrupted measurements. We provide analytic expressions that approximate these posteriors under the assumptions that the nonzero entries in the signal and the noise are drawn from continuous distributions. Numerical experiments presented show that Robust-$\ell_0$ is superior to existing greedy and combinatorial compressed sensing algorithms in the presence of small to moderate signal-to-noise ratios in the setting of Gaussian signals and Gaussian additive noise.
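    As a rough illustration of why a plain median update needs a decision rule once the sketch is noisy, the hypothetical sketch below corrupts the measurements with Gaussian noise and then hard-thresholds the median estimates at a noise-scaled level. This is only a stand-in for deciding which sketch values carry signal; the Robust-$\ell_0$ algorithm instead works with approximate posterior distributions of the sketch values, which is not implemented here, and the $3\sigma$ threshold is an assumed heuristic rather than a value from the paper.

```python
import numpy as np

# Hypothetical noisy-sketch illustration (not the Robust-l0 posterior rule):
# median estimates from a left d-regular binary A, followed by a hard
# threshold at 3*sigma (an assumed heuristic, not taken from the paper).
rng = np.random.default_rng(1)
n, m, d, k, sigma = 2000, 400, 7, 20, 0.05

A = np.zeros((m, n))
for j in range(n):
    A[rng.choice(m, size=d, replace=False), j] = 1.0

x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.uniform(1.0, 2.0, size=k) * rng.choice([-1.0, 1.0], size=k)  # entries well above the noise floor

y = A @ x + sigma * rng.standard_normal(m)            # corrupted sketch

est = np.array([np.median(y[A[:, j] > 0]) for j in range(n)])
x_hat = np.where(np.abs(est) > 3 * sigma, est, 0.0)   # keep only estimates that clear the noise level
print("false positives off the support:",
      int(np.count_nonzero(x_hat) - np.count_nonzero(x_hat[support])))
```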

    Research priorities for maintaining biodiversity’s contributions to people in Latin America

    Maintaining biodiversity is crucial for ensuring human well-being. The authors participated in a workshop held in Palenque, Mexico, in August 2018 that brought together 30 mostly early-career scientists working in different disciplines (natural, social and economic sciences) with the aim of identifying research priorities for studying the contributions of biodiversity to people and how these contributions might be affected by environmental change. Five main groups of questions emerged: (1) enhancing the quantity, quality and availability of biodiversity data; (2) integrating different knowledge systems; (3) improving methods for integrating diverse data; (4) fundamental questions in ecology and evolution; and (5) multi-level governance across boundaries. We discuss the need for increased capacity building and investment in research programmes to address these challenges.

    Search for the standard model Higgs boson decaying to a $b\bar{b}$ pair in events with no charged leptons and large missing transverse energy using the full CDF data set

    We report on a search for the standard model Higgs boson produced in association with a vector boson in the full data set of proton-antiproton collisions at $\sqrt{s} = 1.96$ TeV recorded by the CDF II detector at the Tevatron, corresponding to an integrated luminosity of 9.45 fb$^{-1}$. We consider events having no identified charged lepton, a transverse energy imbalance, and two or three jets, of which at least one is consistent with originating from the decay of a $b$ quark. We place 95% credibility level upper limits on the production cross section times standard model branching fraction for several mass hypotheses between 90 and 150 GeV/$c^2$. For a Higgs boson mass of 125 GeV/$c^2$, the observed (expected) limit is 6.7 (3.6) times the standard model prediction. Comment: Accepted by Phys. Rev. Lett.

    Search for the standard model Higgs boson decaying to a $b\bar{b}$ pair in events with one charged lepton and large missing transverse energy using the full CDF data set

    We present a search for the standard model Higgs boson produced in association with a $W$ boson in $\sqrt{s} = 1.96$ TeV $p\bar{p}$ collision data collected with the CDF II detector at the Tevatron, corresponding to an integrated luminosity of 9.45 fb$^{-1}$. In events consistent with the decay of the Higgs boson to a bottom-quark pair and the $W$ boson to an electron or muon and a neutrino, we set 95% credibility level upper limits on the $WH$ production cross section times the $H \to b\bar{b}$ branching ratio as a function of Higgs boson mass. At a Higgs boson mass of 125 GeV/$c^2$ we observe (expect) a limit of 4.9 (2.8) times the standard model value. Comment: Submitted to Phys. Rev. Lett. (v2 contains clarifications suggested by PRL)

    Search for the standard model Higgs boson decaying to a $b\bar{b}$ pair in events with two oppositely-charged leptons using the full CDF data set

    We present a search for the standard model Higgs boson produced in association with a $Z$ boson in data collected with the CDF II detector at the Tevatron, corresponding to an integrated luminosity of 9.45 fb$^{-1}$. In events consistent with the decay of the Higgs boson to a bottom-quark pair and the $Z$ boson to electron or muon pairs, we set 95% credibility level upper limits on the $ZH$ production cross section times the $H \to b\bar{b}$ branching ratio as a function of Higgs boson mass. At a Higgs boson mass of 125 GeV/$c^2$ we observe (expect) a limit of 7.1 (3.9) times the standard model value. Comment: To be submitted to Phys. Rev. Lett.

    Evolving trends in the management of acute appendicitis during COVID-19 waves. The ACIE Appy II study

    Background: In 2020, the ACIE Appy study showed that the COVID-19 pandemic heavily affected the management of patients with acute appendicitis (AA) worldwide, with an increased rate of non-operative management (NOM) strategies and a trend toward open surgery, driven by concern about virus transmission during laparoscopy and by controversial recommendations on this issue. The aim of this study was to survey the same group of surgeons again to assess whether management attitudes toward AA had changed in the later stages of the outbreak. Methods: From August 15 to September 30, 2021, an online questionnaire was sent to all 709 participants of the ACIE Appy study. The questionnaire included questions on personal protective equipment (PPE), local policies and screening for SARS-CoV-2 infection, NOM, surgical approach, and disease presentations in 2021. The results were compared with those of the previous study. Results: A total of 476 answers were collected (response rate 67.1%). Screening policies had improved significantly, with most patients screened regardless of symptoms (89.5% vs. 37.4%) and PCR and antigen tests as the preferred methods (74.1% vs. 26.3%). More patients tested positive before surgery, and commercial systems were preferred for filtering smoke plumes during laparoscopy. Laparoscopic appendicectomy was the first option in the treatment of AA, with a decline in the use of NOM. Conclusion: Management of AA improved in the later waves of the pandemic. Increased evidence regarding SARS-CoV-2 infection, along with a timely response by healthcare systems, has translated into tailored attitudes and better care for patients with AA worldwide.

    Photography-based taxonomy is inadequate, unnecessary, and potentially harmful for biological sciences

    The question of whether taxonomic descriptions naming new animal species without type specimen(s) deposited in collections should be accepted for publication by scientific journals and allowed by the Code has already been discussed in Zootaxa (Dubois & Nemésio 2007; Donegan 2008, 2009; Nemésio 2009a–b; Dubois 2009; Gentile & Snell 2009; Minelli 2009; Cianferoni & Bartolozzi 2016; Amorim et al. 2016). This question was raised again in a letter supported by 35 signatories published in the journal Nature (Pape et al. 2016) on 15 September 2016. On 25 September 2016, the following rebuttal (strictly limited to 300 words as per the editorial rules of Nature) was submitted to Nature, which refused to publish it on 18 October 2016. As we think this problem is a very important one for zoological taxonomy, this text is published here exactly as submitted to Nature, followed by the list of the 493 taxonomists and collection-based researchers who signed it in the short time span from 20 September to 6 October 2016.

    Mapping 123 million neonatal, infant and child deaths between 2000 and 2017

    Since 2000, many countries have achieved considerable success in improving child survival, but localized progress remains unclear. To inform efforts towards United Nations Sustainable Development Goal 3.2—to end preventable child deaths by 2030—we need consistently estimated data at the subnational level regarding child mortality rates and trends. Here we quantified, for the period 2000–2017, the subnational variation in mortality rates and number of deaths of neonates, infants and children under 5 years of age within 99 low- and middle-income countries using a geostatistical survival model. We estimated that 32% of children under 5 in these countries lived in districts that had attained rates of 25 or fewer child deaths per 1,000 live births by 2017, and that 58% of child deaths between 2000 and 2017 in these countries could have been averted in the absence of geographical inequality. This study enables the identification of high-mortality clusters, patterns of progress and geographical inequalities to inform appropriate investments and implementations that will help to improve the health of all populations.