
    Theoretical and Experimental Analysis of a Randomized Algorithm for Sparse Fourier Transform Analysis

    We analyze a sublinear RAlSFA (Randomized Algorithm for Sparse Fourier Analysis) that finds a near-optimal B-term sparse representation R for a given discrete signal S of length N, in time and space poly(B, log(N)), following the approach given in \cite{GGIMS}. Its poly(log(N)) time cost should be compared with the superlinear O(N log N) time requirement of the Fast Fourier Transform (FFT). A straightforward implementation of the RAlSFA, as presented in the theoretical paper \cite{GGIMS}, turns out to be very slow in practice. Our main result is a greatly improved and practical RAlSFA. We introduce several new ideas and techniques that speed up the algorithm; both rigorous and heuristic arguments for parameter choices are presented. Our RAlSFA constructs, with probability at least 1-delta, a near-optimal B-term representation R in time poly(B) log(N) log(1/delta) log(M) / epsilon^2 such that ||S-R||^2 <= (1+epsilon) ||S-R_opt||^2. Furthermore, this RAlSFA implementation already beats the FFTW for moderately large N: for noisy signals with small B, the crossover point lies at N = 70000 in one dimension, and at N = 900 for data on an N×N grid in two dimensions. We also extend the algorithm to higher-dimensional cases, both theoretically and numerically. Comment: 21 pages, 8 figures, submitted to Journal of Computational Physics
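    The benchmark R_opt that the error bound above refers to is the best B-term Fourier representation. As a minimal illustration (not the sublinear RAlSFA itself, which never touches all N samples), one can compute R_opt directly from the full FFT by keeping the B largest-magnitude coefficients; all signal choices below are hypothetical:

    ```python
    import numpy as np

    def best_b_term(signal, B):
        """Optimal B-term Fourier representation via the full FFT (O(N log N))."""
        N = len(signal)
        coeffs = np.fft.fft(signal) / N           # Fourier coefficients of S
        idx = np.argsort(np.abs(coeffs))[-B:]     # indices of the B largest in magnitude
        sparse = np.zeros(N, dtype=complex)
        sparse[idx] = coeffs[idx]
        return np.fft.ifft(sparse * N)            # reconstruct R_opt in the time domain

    rng = np.random.default_rng(0)
    N, B = 1024, 4
    t = np.arange(N)
    # A signal that is exactly B-sparse in frequency (two real tones = four bins),
    # plus a small amount of noise.
    s = (np.cos(2 * np.pi * 50 * t / N)
         + 0.5 * np.sin(2 * np.pi * 120 * t / N)
         + 0.01 * rng.standard_normal(N))
    r = best_b_term(s, B)
    err = np.linalg.norm(s - r) ** 2 / np.linalg.norm(s) ** 2
    print(f"relative squared error of {B}-term representation: {err:.2e}")
    ```

    RAlSFA's achievement is matching this representation up to a (1+epsilon) factor while sampling only poly(B, log N) signal values.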

    Optoelectronic fiber webs for imaging applications

    Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2005. Includes bibliographical references (p. 73). We demonstrate the use of novel visible and infrared light-sensitive optoelectronic fibers in the development of large-scale photodetector arrays. Unlike conventional point photodetectors, these one-dimensional linear photodetectors are capable of sensing light along the entire length of the fiber and through 360° radially. Multiple fibers can be arranged in an orthogonal grid to create a two-dimensional fiber web. The fiber web is capable of tracking a time- and space-varying beam and outputting it onto a computer screen. Other imaging applications for the fiber web include image recovery for 2D images based on Computed Axial Tomography concepts, and lensless imaging. Lensless imaging is accomplished using two fiber webs separated by a fixed distance, recovering the intensity distribution on each fiber web, and applying a phase retrieval algorithm to the two distributions. Furthermore, fiber webs consisting of six planar arrays forming a cube can be used to detect incident light in three dimensions. by Jerimy Reeves Arnold. M.Eng.
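    The phase retrieval step mentioned above can be sketched with a Gerchberg-Saxton-style alternating projection between the two measured intensity distributions. This is only an illustration of the idea: for simplicity it takes the two planes to be Fourier-transform pairs (far-field propagation), whereas the thesis uses two webs at a fixed finite separation, and the test data here are synthetic:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N = 64
    # Synthetic complex field; only its amplitudes on the two planes are "measured".
    true_field = rng.random((N, N)) * np.exp(2j * np.pi * rng.random((N, N)))
    amp1 = np.abs(true_field)                  # sqrt(intensity) on web 1
    amp2 = np.abs(np.fft.fft2(true_field))     # sqrt(intensity) on web 2 (far field)

    # Start from the web-1 amplitude with a random phase guess.
    field = amp1 * np.exp(2j * np.pi * rng.random((N, N)))
    resid0 = np.linalg.norm(np.abs(np.fft.fft2(field)) - amp2) / np.linalg.norm(amp2)

    for _ in range(200):
        far = np.fft.fft2(field)
        far = amp2 * np.exp(1j * np.angle(far))       # impose web-2 amplitude
        field = np.fft.ifft2(far)
        field = amp1 * np.exp(1j * np.angle(field))   # impose web-1 amplitude

    resid = np.linalg.norm(np.abs(np.fft.fft2(field)) - amp2) / np.linalg.norm(amp2)
    print(f"relative amplitude residual: start {resid0:.3f}, after GS {resid:.3f}")
    ```

    The error-reduction property of this iteration guarantees the residual never increases; with a real propagation kernel for the fixed web separation, the same alternating-projection structure applies.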

    Solar radiation distribution inside a monospan greenhouse with the roof entirely covered by photovoltaic panels

    In the present work the variation over space and time of the photosynthetic photon flux density inside a greenhouse entirely covered with photovoltaic panels was investigated experimentally and numerically. The greenhouse had a span width of 10.00 m, a length of 50.00 m, a gutter height of 3.00 m and a ridge height of 6.60 m. Data were acquired in the period 18th April to 8th June 2014 by one sensor outside and one inside the experimental greenhouse, built in Southern Italy. Numerical simulations were performed by means of the commercial software Autodesk® Ecotect®. For the investigated greenhouse model, the exposed percentage (the ratio of the calculated insolation at a particular point within an enclosure to the simultaneous unobstructed outdoor insolation under the same sky conditions) was calculated over a three-dimensional grid formed by 50×10×15 cells, each of 1.00×1.00×0.20 m size. The long-term analysis demonstrated a good capability of the numerical model to predict the shading effect inside a photovoltaic greenhouse by combining the daily calculated exposed percentage with measurements of solar radiation. The model was also able to predict the qualitative behaviour of the variation of photon flux during the day, even if the measured values showed higher fluctuations.
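    The exposed-percentage metric defined above is a straightforward per-cell ratio. A minimal sketch, using the 50×10×15 grid dimensions from the abstract but entirely made-up insolation values:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    outdoor = 850.0  # W/m^2, hypothetical unobstructed outdoor insolation
    # Hypothetical per-cell insolation inside the greenhouse (shaded by PV panels),
    # on the 50 x 10 x 15 grid of 1.00 x 1.00 x 0.20 m cells.
    indoor = rng.uniform(0.0, outdoor, size=(50, 10, 15))

    # Exposed percentage per cell: indoor insolation relative to the simultaneous
    # unobstructed outdoor value, under the same sky conditions.
    exposed_pct = 100.0 * indoor / outdoor
    print(f"mean exposed percentage: {exposed_pct.mean():.1f}%")
    ```

    Combining this per-cell percentage with measured outdoor radiation then yields the predicted indoor radiation map used in the long-term analysis.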

    Constructing grids for molecular quantum dynamics using an autoencoder

    A challenge for molecular quantum dynamics (QD) calculations is the curse of dimensionality with respect to the nuclear degrees of freedom. A common approach that works especially well for fast reactive processes is to reduce the dimensionality of the system to the few most relevant coordinates. Identifying these can be a very difficult task, since they are often highly unintuitive. We present a machine learning approach that utilizes an autoencoder trained to find a low-dimensional representation of a set of molecular configurations. These configurations are generated by trajectory calculations performed on the reactive molecular systems of interest. The resulting low-dimensional representation can be used to generate a potential energy surface grid in the desired subspace. Using the G-matrix formalism to calculate the kinetic energy operator, QD calculations can be carried out on this grid. In addition to step-by-step instructions for the grid construction, we present the application to a test system. Comment: 24 pages, 6 figures, article
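    The dimensionality-reduction idea can be sketched in its simplest form. For a *linear* autoencoder the optimal encoder and decoder span the leading principal components, so an SVD finds the low-dimensional representation directly; the actual method uses a nonlinear autoencoder, and the "configurations" and dimensions below are purely illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_conf, n_dof, n_latent = 500, 9, 2

    # Synthetic "molecular configurations": 2 latent reaction coordinates mixed
    # into 9 nuclear degrees of freedom, plus small noise (standing in for
    # configurations sampled from trajectory calculations).
    latent_true = rng.normal(size=(n_conf, n_latent))
    mixing = rng.normal(size=(n_latent, n_dof))
    X = latent_true @ mixing + 0.01 * rng.normal(size=(n_conf, n_dof))
    X -= X.mean(axis=0)

    # "Encoder": project onto the top n_latent right singular vectors.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    encode = Vt[:n_latent].T        # (n_dof, n_latent)
    Z = X @ encode                  # low-dimensional representation of each configuration
    X_hat = Z @ encode.T            # "decoder" maps back to full coordinates

    recon_err = np.mean((X - X_hat) ** 2) / np.mean(X ** 2)
    print(f"relative reconstruction error: {recon_err:.2e}")
    ```

    A grid laid out in the Z-coordinates, mapped back through the decoder, is then the subspace on which the potential energy surface would be evaluated.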

    Three dimensional extension of Bresenham’s algorithm with Voronoi diagram

    Bresenham’s algorithm for plotting a two-dimensional line segment is elegant and efficient in its deployment of mid-point comparison and integer arithmetic. It is natural to investigate its three-dimensional extensions. In so doing, this paper uncovers the reason for the scarcity of prior work. The concept of the mid-point in a unit interval generalizes to that of nearest neighbours involving a Voronoi diagram. Algorithmically, there are challenges: while a unit interval in two dimensions becomes a unit square in three dimensions, “squaring” the number of choices in Bresenham’s algorithm is shown to have difficulties. In this paper, the three-dimensional extension is based on the main idea of Bresenham’s algorithm: minimizing the distance between the line and the grid points. The structure of the Voronoi diagram is presented for the grid points to which the line may be approximated. The deployment of integer arithmetic and symmetry to raise the computational efficiency of the three-dimensional extension is also investigated.
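    The minimum-distance idea can be sketched directly (this is not the paper's integer-arithmetic algorithm): step one unit along the dominant axis and select the grid point nearest to the continuous line, which rounding the remaining coordinates achieves:

    ```python
    def line3d(p0, p1):
        """Rasterize a 3D line segment by choosing, at each step along the
        dominant axis, the grid point nearest to the true line (floating-point
        sketch of the nearest-neighbour criterion, not integer Bresenham)."""
        x0, y0, z0 = p0
        x1, y1, z1 = p1
        dx, dy, dz = x1 - x0, y1 - y0, z1 - z0
        steps = max(abs(dx), abs(dy), abs(dz))   # dominant-axis length
        points = []
        for i in range(steps + 1):
            t = i / steps if steps else 0.0
            # nearest grid point to the continuous line at parameter t
            points.append((round(x0 + t * dx),
                           round(y0 + t * dy),
                           round(z0 + t * dz)))
        return points

    pts = line3d((0, 0, 0), (6, 3, 2))
    print(pts)
    ```

    The integer-only formulation replaces the rounding with incremental error terms, exactly as in the 2D algorithm; the Voronoi analysis in the paper characterizes which neighbouring grid points can ever be the nearest one.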

    Efficient cosmological parameter sampling using sparse grids

    We present a novel method to significantly speed up cosmological parameter sampling. The method relies on constructing an interpolation of the CMB log-likelihood based on sparse grids, which is used as a shortcut for the likelihood evaluation. We obtain excellent results over a large region in parameter space, spanning about 25 log-likelihood units around the peak, and we reproduce the one-dimensional projections of the likelihood almost perfectly. In speed and accuracy, our technique is competitive with existing approaches that accelerate parameter estimation based on polynomial interpolation or neural networks, while having some advantages over them. In our method, there is no danger of creating unphysical wiggles, as can be the case for polynomial fits of high degree. Furthermore, we do not require a long training time as for neural networks; the construction of the interpolation is determined by the time it takes to evaluate the likelihood at the sampling points, which can be parallelised to an arbitrary degree. Our approach is completely general, and it can adaptively exploit the properties of the underlying function. We can thus apply it to any problem where an accurate interpolation of a function is needed. Comment: Submitted to MNRAS, 13 pages, 13 figures
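    The shortcut structure can be illustrated in one dimension (the paper itself uses adaptive sparse grids in many dimensions): evaluate the expensive likelihood once at a set of nodes, possibly in parallel, then replace all further calls during sampling with cheap interpolation. The Gaussian stand-in likelihood below is hypothetical:

    ```python
    import numpy as np

    def expensive_loglike(x):
        # Stand-in for a costly CMB log-likelihood evaluation (hypothetical shape).
        return -0.5 * (x - 0.3) ** 2 / 0.05 ** 2

    # One-off expensive evaluations at the sampling points; these are independent
    # and can be parallelised to an arbitrary degree.
    nodes = np.linspace(0.0, 1.0, 65)
    values = expensive_loglike(nodes)

    def cheap_loglike(x):
        # Shortcut used inside the sampler: piecewise-linear interpolation.
        return np.interp(x, nodes, values)

    x = 0.317
    err = abs(cheap_loglike(x) - expensive_loglike(x))
    print(f"interpolation error at x={x}: {err:.2e}")
    ```

    Sparse grids extend this to many parameters while keeping the number of nodes far below that of a full tensor grid, and the adaptive variant refines the grid only where the likelihood varies rapidly.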