90 research outputs found

    Adaptive Detection of Structured Signals in Low-Rank Interference

    In this paper, we consider the problem of detecting the presence (or absence) of an unknown but structured signal from the space-time outputs of an array under strong, non-white interference. Our motivation is the detection of a communication signal in jamming, where often the training portion is known but the data portion is not. We assume that the measurements are corrupted by additive white Gaussian noise of unknown variance and a few strong interferers, whose number, powers, and array responses are unknown. We also assume the desired signal's array response is unknown. To address the detection problem, we propose several GLRT-based detection schemes that employ a probabilistic signal model and use the EM algorithm for likelihood maximization. Numerical experiments are presented to assess the performance of the proposed schemes.
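    The core of such a GLRT can be illustrated with a much simpler toy case: a single known training waveform in white Gaussian noise of unknown variance (omitting the low-rank interference and EM iterations of the proposed schemes). In this reduced setting the GLRT becomes a normalized matched filter; the sketch below, with hypothetical sizes, is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256                           # number of snapshots (hypothetical)
s = rng.standard_normal(N)        # known training waveform (hypothetical)

def glrt_stat(x, s):
    # With the noise variance unknown, the GLRT for a known signal in white
    # Gaussian noise reduces to a normalized matched filter: the squared
    # cosine of the angle between the data and the signal template.
    return (s @ x) ** 2 / (np.dot(s, s) * np.dot(x, x))

x0 = rng.standard_normal(N)              # H0: noise only
x1 = 0.5 * s + rng.standard_normal(N)    # H1: signal present
print(glrt_stat(x1, s) > glrt_stat(x0, s))
```

    The statistic lies in [0, 1] by Cauchy-Schwarz and is compared against a threshold set for a desired false-alarm rate, which makes the test CFAR with respect to the unknown noise variance.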

    Hybrid approximate message passing

    Gaussian and quadratic approximations of message passing algorithms on graphs have attracted considerable recent attention due to their computational simplicity, analytic tractability, and wide applicability in optimization and statistical inference problems. This paper presents a systematic framework for incorporating such approximate message passing (AMP) methods in general graphical models. The key concept is a partition of dependencies of a general graphical model into strong and weak edges, with the weak edges representing interactions through aggregates of small, linearizable couplings of variables. AMP approximations based on the Central Limit Theorem can be readily applied to aggregates of many weak edges and integrated with standard message passing updates on the strong edges. The resulting algorithm, which we call hybrid generalized approximate message passing (HyGAMP), can yield significantly simpler implementations of sum-product and max-sum loopy belief propagation. By varying the partition of strong and weak edges, a performance-complexity trade-off can be achieved. Group sparsity and multinomial logistic regression problems are studied as examples of the proposed methodology. The work of S. Rangan was supported in part by the National Science Foundation under Grants 1116589, 1302336, and 1547332, and in part by the industrial affiliates of NYU WIRELESS. The work of A. K. Fletcher was supported in part by the National Science Foundation under Grants 1254204 and 1738286 and in part by the Office of Naval Research under Grant N00014-15-1-2677. The work of V. K. Goyal was supported in part by the National Science Foundation under Grant 1422034. The work of E. Byrne and P. Schniter was supported in part by the National Science Foundation under Grant CCF-1527162.
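    The flavor of update that HyGAMP generalizes can be seen in the classic scalar AMP iteration for sparse linear regression: a matched-filter step, a soft-threshold denoiser, and an Onsager correction to the residual. The sketch below uses hypothetical problem sizes and a simple threshold heuristic; it is the textbook AMP iteration, not the hybrid algorithm of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 200, 400, 10                       # measurements, unknowns, nonzeros
A = rng.standard_normal((n, p)) / np.sqrt(n) # unit-norm columns (on average)
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = 3.0 * rng.standard_normal(k)
y = A @ x_true + 0.01 * rng.standard_normal(n)

def soft(u, t):
    # soft-thresholding denoiser (prox of the l1 norm)
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

x, z = np.zeros(p), y.copy()
for _ in range(30):
    tau = np.linalg.norm(z) / np.sqrt(n)         # effective-noise estimate
    x_new = soft(x + A.T @ z, tau)               # denoise the pseudo-data
    onsager = z * (np.count_nonzero(x_new) / n)  # Onsager correction term
    z = y - A @ x_new + onsager                  # corrected residual
    x = x_new

print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

    The Onsager term is what distinguishes AMP from plain iterative thresholding: it decorrelates the residual from past iterates, which is what makes the Gaussian (CLT-based) analysis of the weak edges accurate.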

    Sketched Clustering via Hybrid Approximate Message Passing

    In sketched clustering, the dataset is first sketched down to a vector of modest size, from which the cluster centers are subsequently extracted. The goal is to perform clustering more efficiently than with methods that operate on the full training data, such as k-means++. For the sketching methodology recently proposed by Keriven, Gribonval, et al., which can be interpreted as a random sampling of the empirical characteristic function, we propose a cluster recovery algorithm based on simplified hybrid generalized approximate message passing (SHyGAMP). Numerical experiments suggest that our approach is more efficient than the state-of-the-art sketched clustering algorithms (in both computational and sample complexity) and more efficient than k-means++ in certain regimes.
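    The sketching step itself is simple to state: the dataset is compressed into a few random samples of its empirical characteristic function. A minimal illustration with hypothetical sizes follows; the actual frequency-sampling distribution and the SHyGAMP center-recovery step are considerably more involved:

```python
import numpy as np

rng = np.random.default_rng(2)
N, d, m = 1000, 2, 50                 # samples, dimension, sketch size
# toy data: two well-separated clusters at +5 and -5
X = np.vstack([rng.standard_normal((N // 2, d)) + 5.0,
               rng.standard_normal((N // 2, d)) - 5.0])
W = rng.standard_normal((m, d))       # random frequency vectors (hypothetical)

# the sketch: m random samples of the empirical characteristic function
sketch = np.exp(1j * X @ W.T).mean(axis=0)    # shape (m,)

def score(c):
    # a candidate center c is scored by how well its characteristic-function
    # "atom" exp(i W c) correlates with the sketch
    return np.real(np.vdot(np.exp(1j * W @ c), sketch))

print(score(np.full(d, 5.0)) > score(np.zeros(d)))
```

    Cluster recovery then amounts to finding a small mixture of such atoms that matches the sketch, which is where the message-passing machinery comes in; note that the m-dimensional sketch replaces the N x d dataset entirely.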

    3D Reconstructions Using Unstabilized Video Footage from an Unmanned Aerial Vehicle

    Structure from motion (SFM) is a methodology for automatically reconstructing three-dimensional (3D) models from a series of two-dimensional (2D) images when there is no a priori knowledge of the camera location and direction. Modern unmanned aerial vehicles (UAV) now provide a low-cost means of obtaining aerial video footage of a point of interest. Unfortunately, raw video lacks the required information for SFM software, as it does not record exchangeable image file (EXIF) information for the frames. In this work, a solution is presented to modify aerial video so that it can be used for photogrammetry. The paper then examines how the field of view affects the quality of the reconstruction. The input is unstabilized and distorted video footage obtained from a low-cost UAV, which is then combined with an open-source SFM system to reconstruct a 3D model. This approach creates a high-quality reconstruction by reducing the number of unknown variables, such as focal length and sensor size, while increasing the data density. The experiments conducted examine the optical field of view settings to provide sufficient overlap without sacrificing image quality or exacerbating distortion. The system costs less than €1000, and the results show the ability to reproduce 3D models that are of centimeter-level accuracy. For verification, the results were compared against millimeter-level accurate models derived from laser scanning. European Union Grant FP7-632227; IRC Grant GOIPD/2015/125; IRC Grant GOIPG/2015/3003; Geological Survey of Ireland Grant 2015-sc-Laefer; Science Foundation Ireland Grant 13/TIDA/I27.
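    The centimeter-level figure can be sanity-checked with the standard ground-sampling-distance formula, which is also why knowing focal length and sensor size matters for the reconstruction. The camera numbers below are illustrative assumptions, not values from the paper:

```python
def ground_sampling_distance(sensor_width_mm, focal_mm, altitude_m, image_width_px):
    """Ground sampling distance in meters/pixel: the ground footprint of one pixel
    for a nadir-pointing camera at the given altitude."""
    return (sensor_width_mm * altitude_m) / (focal_mm * image_width_px)

# illustrative small-UAV camera: 6.17 mm sensor width, 3.6 mm focal length,
# 30 m altitude, 4000-pixel-wide frames (assumptions)
gsd = ground_sampling_distance(6.17, 3.6, 30.0, 4000)
print(f"{gsd * 100:.2f} cm/pixel")
```

    At these settings each pixel covers on the order of a centimeter of ground, consistent with the accuracy the paper reports; a wider field of view (shorter focal length) increases overlap between frames but coarsens this figure and worsens lens distortion.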

    Horizontal Transmission and Recombination Maintain Forever Young Bacterial Symbiont Genomes

    Bacterial symbionts bring a wealth of functions to the associations they participate in, but by doing so, they endanger the genes and genomes underlying these abilities. When bacterial symbionts become obligately associated with their hosts, their genomes are thought to decay towards an organelle-like fate due to decreased homologous recombination and inefficient selection. However, numerous associations exist that counter these expectations, especially in marine environments, possibly due to ongoing horizontal gene flow. Despite extensive theoretical treatment, no empirical study thus far has connected these underlying population genetic processes with long-term evolutionary outcomes. By sampling marine chemosynthetic bacterial-bivalve endosymbioses that range from primarily vertical to strictly horizontal transmission, we tested this canonical theory. We found that transmission mode strongly predicts homologous recombination rates, and that exceedingly low recombination rates are associated with moderate genome degradation in the marine symbionts with nearly strict vertical transmission. Nonetheless, even the most degraded marine endosymbiont genomes are occasionally horizontally transmitted and are much larger than their terrestrial insect symbiont counterparts. Therefore, horizontal transmission and recombination enable efficient natural selection to maintain intermediate symbiont genome sizes and substantial functional genetic variation.

    Understanding how institutions may support the development of transdisciplinary approaches to sustainability research

    This article analyses the approaches of academics seeking to engage with private, public and community-based stakeholders through transdisciplinary research about pressing sustainability challenges and, in particular, climate change; it outlines aspects of the institutional factors which influence transdisciplinary research. A qualitative approach was employed in conducting 10 semi-structured interviews to analyse the challenges and motivations of academic researchers when working with a range of other stakeholders through transdisciplinary practice. Two key contributions are made through this work. First, this article adds to the existing literature on motivations and challenges for undertaking research with private, public and community stakeholders in a cross-disciplinary manner. Second, the current institutional circumstances influencing such research practices are outlined, alongside potential ways forward. The research presented here has been undertaken in light of the experiences of the two lead co-authors as early-career researchers coming from the disciplines of sociology and energy engineering, engaging in transdisciplinary research within a local community context in relation to a regional energy transition project.

    Innovative methods of community engagement: towards a low carbon climate resilient future

    The proceedings of the Innovative Methods of Community Engagement: Toward a Low Carbon, Climate Resilient Future workshop have been developed by the Imagining2050 team in UCC and the Secretariat to the National Dialogue on Climate Action (NDCA). The NDCA also funded the workshop running costs. The proceedings offer a set of recommendations and insights into leveraging different community engagement approaches and methodologies in the area of climate action. They draw from the interdisciplinary knowledge and experiences of researchers in identifying, mobilizing and mediating communities. The work presented below derives from a workshop held in the Environmental Research Institute in UCC on 17 January 2019. These proceedings are complementary to an earlier workshop also funded by the NDCA and run by MaREI in UCC, titled ‘How do we Engage Communities in Climate Action? – Practical Learnings from the Coal Face’. The earlier workshop looked more closely at community development groups and other non-statutory organizations doing work in the area of climate change.

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that cover a variety of research fields such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180,000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research.
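    Of the three baselines, TF-IDF is the simplest to sketch. The toy version below scores three hypothetical one-line abstracts by cosine similarity of their TF-IDF vectors; Okapi BM25 adds document-length normalization and term-frequency saturation on top of the same idea:

```python
import math
from collections import Counter

# three hypothetical one-line "abstracts"
docs = [
    "glrt detection of signals in interference",
    "message passing for sparse signal recovery",
    "detection of structured signals under interference",
]

tokens = [d.split() for d in docs]
df = Counter(w for t in tokens for w in set(t))   # document frequency

def tfidf(doc_tokens, df, n_docs):
    # weight = term frequency * inverse document frequency
    tf = Counter(doc_tokens)
    return {w: c * math.log(n_docs / df[w]) for w, c in tf.items()}

vecs = [tfidf(t, df, len(docs)) for t in tokens]

def cosine(u, v):
    num = sum(u[w] * v[w] for w in u if w in v)
    den = (math.sqrt(sum(x * x for x in u.values()))
           * math.sqrt(sum(x * x for x in v.values())))
    return num / den if den else 0.0

# docs 0 and 2 share topical terms, so they should score highest
print(cosine(vecs[0], vecs[2]) > cosine(vecs[0], vecs[1]))
```

    The finding that the baselines return distinct result sets is plausible under this view: lexical scorers like the above reward exact term overlap, whereas PubMed Related Articles draws on different signals, so a hybrid could cover relevant articles that any single scorer misses.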