
    Matched Filtering from Limited Frequency Samples

    In this paper, we study a simple correlation-based strategy for estimating the unknown delay and amplitude of a signal based on a small number of noisy, randomly chosen frequency-domain samples. We model the output of this "compressive matched filter" as a random process whose mean equals the scaled, shifted autocorrelation function of the template signal. Using tools from the theory of empirical processes, we prove that the expected maximum deviation of this process from its mean decreases sharply as the number of measurements increases, and we also derive a probabilistic tail bound on the maximum deviation. Putting all of this together, we bound the minimum number of measurements required to guarantee that the empirical maximum of this random process occurs sufficiently close to the true peak of its mean function. We conclude that for broad classes of signals, this compressive matched filter will successfully estimate the unknown delay (with high probability, and within a prescribed tolerance) using a number of random frequency-domain samples that scales inversely with the signal-to-noise ratio and only logarithmically in the observation bandwidth and the possible range of delays.
    Comment: Submitted to the IEEE Transactions on Information Theory on January 13, 201
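    As a rough numerical illustration of the estimator described above (a sketch under assumed parameters, not the authors' code), the compressive matched filter can be simulated by correlating a handful of noisy, randomly chosen frequency samples against delayed copies of the template spectrum and taking the peak of the correlation magnitude. The template shape, bandwidth, delay grid, and noise level below are all illustrative choices.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative template, described by its spectrum H(f) on B frequency bins.
        B = 256
        freqs_full = np.arange(B)
        h_spec = np.exp(-0.5 * ((freqs_full - B / 2) / 30.0) ** 2)  # assumed template spectrum

        # True (unknown) delay and amplitude.
        true_delay, true_amp = 37.25, 1.4

        # Observe only M randomly chosen frequency samples, corrupted by noise.
        M = 40
        idx = rng.choice(B, size=M, replace=False)
        f = freqs_full[idx]
        y = true_amp * h_spec[idx] * np.exp(-2j * np.pi * f * true_delay / B)
        y += 0.05 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))

        # Compressive matched filter: correlate the samples with delayed copies of the
        # template spectrum and pick the delay where the correlation magnitude peaks.
        candidate_delays = np.arange(0, B, 0.25)
        steering = np.exp(2j * np.pi * np.outer(candidate_delays, f) / B)
        corr = steering @ (y * np.conj(h_spec[idx]))
        est_delay = candidate_delays[np.argmax(np.abs(corr))]
        est_amp = np.abs(corr).max() / np.sum(np.abs(h_spec[idx]) ** 2)

        print(f"estimated delay {est_delay:.2f} (true {true_delay}), amplitude {est_amp:.2f}")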

    Empirical constraints on the origin of fast radio bursts: volumetric rates and host galaxy demographics as a test of millisecond magnetar connection

    The localization of the repeating FRB 121102 to a low-metallicity dwarf galaxy at $z=0.193$, and its association with a quiescent radio source, suggests the possibility that FRBs originate from magnetars, formed by the unusual supernovae in such galaxies. We investigate this via a comparison of magnetar birth rates, the FRB volumetric rate, and host galaxy demographics. We calculate average volumetric rates of possible millisecond magnetar production channels such as superluminous supernovae (SLSNe), long and short gamma-ray bursts (GRBs), and general magnetar production via core-collapse supernovae. For each channel we also explore the expected host galaxy demographics using their known properties. We determine for the first time the number density of FRB emitters (the product of their volumetric birth rate and lifetime), $R_{\rm FRB}\tau \approx 10^4$ Gpc$^{-3}$, assuming that FRBs are predominantly emitted from repetitive sources similar to FRB 121102 and adopting a beaming factor of 0.1. By comparing rates we find that production via rare channels (SLSNe, GRBs) implies a typical FRB lifetime of $\approx 30$-$300$ yr, in good agreement with other lines of argument. The total energy emitted over this time is consistent with the available energy stored in the magnetic field. On the other hand, any relation to magnetars produced via normal core-collapse supernovae leads to a very short lifetime of $\approx 0.5$ yr, in conflict with both theory and observation. We demonstrate that due to the diverse host galaxy distributions of the different progenitor channels, many possible sources of FRB birth can be ruled out with $\lesssim 10$ host galaxy identifications. Conversely, targeted searches of galaxies that have previously hosted decades-old SLSNe and GRBs may be a fruitful strategy for discovering new FRBs and related quiescent radio sources, and determining the nature of their progenitors.
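    The rate comparison above reduces to dividing the quoted emitter number density by a candidate channel's volumetric birth rate to obtain an implied source lifetime. A back-of-the-envelope sketch, using round illustrative channel rates rather than the paper's adopted values:

        # Implied FRB source lifetime: tau = n_FRB / R_channel, where n_FRB = R_FRB * tau
        # is the emitter number density quoted above (~1e4 Gpc^-3, beaming factor 0.1 included).
        n_frb = 1e4                      # FRB emitter number density [Gpc^-3]

        # Illustrative volumetric birth rates [Gpc^-3 yr^-1]; not the paper's exact values.
        channel_rates = {
            "SLSNe (rare channel)": 50.0,
            "long GRBs (rare channel)": 100.0,
            "magnetars from normal CCSNe": 2e4,
        }

        for channel, rate in channel_rates.items():
            tau = n_frb / rate           # implied lifetime in years
            print(f"{channel}: implied lifetime ~ {tau:.1f} yr")
        # Rare channels give lifetimes of order 10^2 yr; normal CCSN magnetars give ~0.5 yr,
        # mirroring the contrast drawn in the abstract.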

    Greed is Super: A Fast Algorithm for Super-Resolution

    We present a fast two-phase algorithm for super-resolution with strong theoretical guarantees. Given the low-frequency part of the spectrum of a sequence of impulses, Phase I consists of a greedy algorithm that roughly estimates the impulse positions. These estimates are then refined by local optimization in Phase II. In contrast to the convex relaxation proposed by Candès et al., our approach has a low computational complexity but requires the impulses to be separated by an additional logarithmic factor to succeed. The backbone of our work is the fundamental work of Slepian et al. involving discrete prolate spheroidal wave functions and their unique properties.
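    A rough sketch of the two-phase idea, not the authors' algorithm: Phase I greedily picks impulse locations by repeatedly correlating the residual low-frequency data against a fine grid of candidate positions (a matching-pursuit-style step), and Phase II refines those locations, with amplitudes re-fit by least squares, via local nonlinear optimization. The signal model, grid, and number of impulses are illustrative assumptions.

        import numpy as np
        from scipy.optimize import least_squares

        # Illustrative ground truth: K impulses on [0, 1) observed through frequencies -fc..fc.
        K, fc = 3, 20
        true_pos = np.array([0.12, 0.47, 0.81])
        true_amp = np.array([1.0, 0.7, 1.3])
        freqs = np.arange(-fc, fc + 1)
        y = np.exp(-2j * np.pi * np.outer(freqs, true_pos)) @ true_amp

        # Phase I: greedy selection on a fine grid (matching-pursuit-style).
        grid = np.linspace(0.0, 1.0, 2048, endpoint=False)
        atoms = np.exp(-2j * np.pi * np.outer(freqs, grid))      # candidate atoms
        residual, est_pos = y.copy(), []
        for _ in range(K):
            scores = np.abs(atoms.conj().T @ residual)
            est_pos.append(grid[np.argmax(scores)])
            A = np.exp(-2j * np.pi * np.outer(freqs, np.array(est_pos)))
            amp, *_ = np.linalg.lstsq(A, y, rcond=None)          # re-fit amplitudes
            residual = y - A @ amp

        # Phase II: local refinement of the positions by nonlinear least squares.
        def misfit(pos):
            A = np.exp(-2j * np.pi * np.outer(freqs, pos))
            amp, *_ = np.linalg.lstsq(A, y, rcond=None)
            r = y - A @ amp
            return np.concatenate([r.real, r.imag])

        refined = least_squares(misfit, np.sort(est_pos)).x
        print("refined positions:", np.round(np.sort(refined), 4))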

    Improved Artificial Neural Network with High Precision for Predicting Burnout among Managers and Employees of Start-Ups during COVID-19 Pandemic

    Beyond its impact on people's physical and psychological health, the COVID-19 pandemic has also altered the psychological condition of many employees, particularly in organizations and private businesses that faced numerous restrictions under the unusual conditions of the pandemic. Accordingly, the present study implemented an RBF neural network to analyze the relationship between demographic variables, resilience, COVID-19-related stress, and burnout in start-ups. The research method was quantitative. The statistical population consisted of managers and employees of start-ups; based on the sample-size formula for an unlimited population, 384 of them were surveyed. Data were collected using the standard MBI-GS and BRCS questionnaires and a researcher-made questionnaire on stress caused by COVID-19. The validity of the questionnaires was confirmed by a panel of experts, and their reliability was confirmed by Cronbach's alpha coefficient. The designed network had ten neurons in the input layer, forty neurons in the hidden layer, and one neuron in the output layer, with 70% of the data used for training and 30% for testing. The network output was compared with the collected results, and the designed network was able to classify all the data correctly. The method presented in this research can greatly help the sustainability of companies.
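    A minimal sketch of an RBF network with the architecture described (10 inputs, 40 Gaussian hidden units, 1 output) and a 70/30 split, using synthetic placeholder data in place of the study's survey responses; choosing centers by k-means and fitting output weights by ridge regression is one common way to train such a network, not necessarily the authors' procedure.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)

        # Placeholder data: 384 respondents, 10 input features (demographics, resilience,
        # COVID-19 stress scores), one burnout label. Real survey data would go here.
        X = rng.standard_normal((384, 10))
        y = (X[:, :3].sum(axis=1) + 0.1 * rng.standard_normal(384) > 0).astype(float)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

        # Hidden layer: 40 Gaussian RBF units with centers chosen by k-means.
        centers = KMeans(n_clusters=40, n_init=10, random_state=0).fit(X_train).cluster_centers_
        width = np.mean(np.linalg.norm(X_train[:, None, :] - centers[None, :, :], axis=2))

        def rbf_features(X):
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            return np.exp(-(d / width) ** 2)

        # Output layer: one linear neuron fit by ridge regression, thresholded at 0.5.
        out = Ridge(alpha=1.0).fit(rbf_features(X_train), y_train)
        pred = (out.predict(rbf_features(X_test)) > 0.5).astype(float)
        print("test accuracy:", (pred == y_test).mean())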

    Developing a smart and clean technology for bioremediation of antibiotic contamination in arable lands

    This study presents a smart technological framework to efficiently remove azithromycin from natural soil resources using bioremediation techniques. The framework consists of several modules, each with different models such as Penicillium simplicissimum (PS) bioactivity, soft computing models, statistical optimisation, Machine Learning (ML) algorithms, and a Decision Tree (DT) control system based on Removal Percentage (RP). The first module involves designing experiments using a literature review and the Taguchi Orthogonal design method for cultural conditions. The RP is predicted as a function of cultural parameters using Response Surface Methodology (RSM) and three ML algorithms: Instance-Based K (IBK), KStar, and Locally Weighted Learning (LWL). The sensitivity analysis shows that pH is the most important factor among all parameters, including pH, Aeration Intensity (AI), Temperature, Microbial/Food (M/F) ratio, and Retention Time (RT), with a p-value of <0.0001. AI is the next most significant parameter, also with a p-value of <0.0001. The optimal biological conditions for removing azithromycin from soil resources are a temperature of 32 °C, pH of 5.5, M/F ratio of 1.59 mg/g, and AI of 8.59 m3/h. During the 100-day bioremediation process, RP was found to be an insignificant factor for more than 25 days, which simplifies the conditions. Among the ML algorithms, the IBK model provided the most accurate prediction of RT, with a correlation coefficient of over 95%.
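    IBK is Weka's instance-based k-nearest-neighbours learner; a rough scikit-learn analogue of that module, predicting removal percentage from the five cultural parameters, is sketched below. The data are a synthetic placeholder, and KNeighborsRegressor stands in for IBK rather than reproducing the paper's exact configuration.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neighbors import KNeighborsRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)

        # Placeholder design matrix: pH, aeration intensity [m3/h], temperature [°C],
        # M/F ratio [mg/g], retention time [d]; real Taguchi-designed runs would go here.
        n = 120
        X = np.column_stack([
            rng.uniform(4.0, 8.0, n),     # pH
            rng.uniform(2.0, 12.0, n),    # AI
            rng.uniform(20.0, 40.0, n),   # temperature
            rng.uniform(0.5, 3.0, n),     # M/F ratio
            rng.uniform(5.0, 100.0, n),   # RT
        ])
        rp = 60 - 8 * np.abs(X[:, 0] - 5.5) + 2 * X[:, 1] + rng.normal(0, 3, n)  # synthetic RP (%)

        # IBK-style instance-based regression: scale the features, then k-nearest-neighbours.
        model = make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=5))
        X_train, X_test, y_train, y_test = train_test_split(X, rp, test_size=0.3, random_state=0)
        model.fit(X_train, y_train)
        print("R^2 on held-out experiments:", round(model.score(X_test, y_test), 3))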