
    Interactive computation of radiation view factors

    The development of a pair of computer programs to calculate radiation exchange view factors is described. The surface generation program is based upon current graphics capabilities and includes special provisions unique to the radiation problem. The calculation program uses a combination of contour and double-area integration to permit consideration of radiation exchange in the presence of obstructing surfaces. Examples of the surface generation and the calculation are given.
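    As an illustration of the double-area integration underlying such view-factor calculations (the paper's programs and their obstruction handling are not reproduced here), the following minimal sketch estimates the view factor between two coaxial, parallel unit squares by Monte Carlo integration; the function name, sample count, and geometry are illustrative assumptions.

```python
import numpy as np

def view_factor_parallel_squares(d, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the view factor F_{1->2} between two coaxial,
    parallel unit squares separated by distance d.

    F_{1->2} = (1/A1) * double integral over A1, A2 of
               cos(theta1) * cos(theta2) / (pi * r^2) dA1 dA2,
    estimated by averaging the integrand over random point pairs.
    Obstructing surfaces (treated in the paper) are not handled here.
    """
    rng = np.random.default_rng(seed)
    # Random points on each unit square (A1 = A2 = 1).
    p1 = np.column_stack([rng.random(n_samples), rng.random(n_samples),
                          np.zeros(n_samples)])
    p2 = np.column_stack([rng.random(n_samples), rng.random(n_samples),
                          np.full(n_samples, d)])
    diff = p2 - p1
    r2 = np.einsum("ij,ij->i", diff, diff)
    # Both surface normals are along the z axis, so cos(theta1) = cos(theta2) = d / r.
    cos_theta = d / np.sqrt(r2)
    integrand = cos_theta * cos_theta / (np.pi * r2)
    area2 = 1.0
    return area2 * integrand.mean()

# For unit squares at unit separation the analytic value is about 0.2.
print(view_factor_parallel_squares(1.0))
```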

    Lorentzian Iterative Hard Thresholding: Robust Compressed Sensing with Prior Information

    Commonly employed reconstruction algorithms in compressed sensing (CS) use the L_2 norm as the metric for the residual error. However, it is well known that least squares (LS) based estimators are highly sensitive to outliers in the measurement vector, leading to poor performance when the noise no longer follows the Gaussian assumption but is instead better characterized by heavier-than-Gaussian tailed distributions. In this paper, we propose a robust iterative hard thresholding (IHT) algorithm for reconstructing sparse signals in the presence of impulsive noise. To address this problem, we use a Lorentzian cost function instead of the L_2 cost function employed by the traditional IHT algorithm. We also modify the algorithm to incorporate prior signal information in the recovery process. Specifically, we study the case of CS with partially known support. The proposed algorithm is a fast method with computational load comparable to the LS-based IHT, whilst having the advantage of robustness against heavy-tailed impulsive noise. Sufficient conditions for stability are studied and a reconstruction error bound is derived. We also derive sufficient conditions for stable sparse signal recovery with partially known support. Theoretical analysis shows that including prior support information relaxes the conditions for successful reconstruction. Simulation results demonstrate that the Lorentzian-based IHT algorithm significantly outperforms commonly employed sparse reconstruction techniques in impulsive environments, while providing comparable performance in less demanding, light-tailed environments. Numerical results also demonstrate that including the partially known support improves the performance of the proposed algorithm, thereby requiring fewer samples to yield an approximate reconstruction. Comment: 28 pages, 9 figures, accepted in IEEE Transactions on Signal Processing.
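    To make the core iteration concrete, here is a minimal sketch of an IHT-style loop in which the least-squares gradient is replaced by the gradient of a Lorentzian cost, with a simple hook for a partially known support. The fixed step size, parameter names, and support handling are illustrative assumptions; the paper itself derives stability conditions and the precise update, which are not reproduced here.

```python
import numpy as np

def lorentzian_iht(y, A, s, gamma=1.0, mu=1.0, n_iter=100, known_support=None):
    """Sketch of IHT with a Lorentzian data-fidelity term.

    y : measurement vector, A : sensing matrix, s : sparsity level,
    known_support : optional array of indices assumed to lie in the support
                    (a hypothetical way to inject prior information).
    """
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iter):
        r = y - A @ x
        # Gradient of sum(log(1 + r_i^2 / gamma^2)) with respect to x is
        # proportional to -A^T diag(1 / (gamma^2 + r_i^2)) r, so large
        # (impulsive) residuals are automatically down-weighted.
        w = 1.0 / (gamma**2 + r**2)
        x = x + mu * (A.T @ (w * r))
        # Hard thresholding: keep the s largest-magnitude entries, always
        # retaining any indices in the known partial support.
        keep = np.argsort(np.abs(x))[-s:]
        if known_support is not None:
            keep = np.union1d(keep, known_support)
        mask = np.zeros(n, dtype=bool)
        mask[keep] = True
        x[~mask] = 0.0
    return x
```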

    Differentially Private Mixture of Generative Neural Networks

    Generative models are used in a wide range of applications building on large amounts of contextually rich information. Due to possible privacy violations of the individuals whose data is used to train these models, however, publishing or sharing generative models is not always viable. In this paper, we present a novel technique for privately releasing generative models and entire high-dimensional datasets produced by these models. We model the generator distribution of the training data with a mixture of k generative neural networks. These are trained together and collectively learn the generator distribution of a dataset. Data is divided into k clusters using a novel differentially private kernel k-means, and each cluster is then given to a separate generative neural network, such as a Restricted Boltzmann Machine or Variational Autoencoder, which is trained only on its own cluster using differentially private gradient descent. We evaluate our approach using the MNIST dataset, as well as call detail records and transit datasets, showing that it produces realistic synthetic samples, which can also be used to accurately answer an arbitrary number of counting queries. Comment: A shorter version of this paper appeared at the 17th IEEE International Conference on Data Mining (ICDM 2017). This is the full version, published in IEEE Transactions on Knowledge and Data Engineering (TKDE).
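    The abstract does not spell out the differentially private gradient descent used to train each cluster's network; the sketch below shows the standard clip-and-add-noise step (DP-SGD in the style of Abadi et al.) that such training typically relies on. Function and parameter names are illustrative assumptions, not the paper's API, and the privacy accounting is omitted.

```python
import numpy as np

def dp_gradient_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
                     noise_multiplier=1.0, rng=np.random.default_rng(0)):
    """One differentially private gradient-descent step (sketch).

    params            : flat parameter vector of shape (n_params,)
    per_example_grads : array of shape (batch_size, n_params), one gradient
                        per training example from the current cluster.
    """
    batch_size = per_example_grads.shape[0]
    # 1. Clip each example's gradient so a single record has bounded influence.
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = per_example_grads * scale
    # 2. Sum the clipped gradients, add Gaussian noise calibrated to the
    #    clipping norm, then average and take a descent step.
    noisy_sum = clipped.sum(axis=0) + rng.normal(
        0.0, noise_multiplier * clip_norm, size=params.shape)
    return params - lr * noisy_sum / batch_size
```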

    Simultaneous multi-band detection of Low Surface Brightness galaxies with Markovian modelling

    We present an algorithm for the detection of Low Surface Brightness (LSB) galaxies in images, called MARSIAA (MARkovian Software for Image Analysis in Astronomy), which is based on multi-scale Markovian modeling. MARSIAA can be applied simultaneously to different bands. It segments an image into a user-defined number of classes, according to their surface brightness and surroundings - typically, one or two classes contain the LSB structures. We have developed an algorithm, called DetectLSB, which allows the efficient identification of LSB galaxies from among the candidate sources selected by MARSIAA. To assess the robustness of our method, it was applied to a set of 18 B and I band images (covering 1.3 square degrees in total) of the Virgo cluster. To further assess the completeness of the results, MARSIAA, SExtractor, and DetectLSB were applied to search for (i) mock Virgo LSB galaxies inserted into a set of deep Next Generation Virgo Survey (NGVS) gri-band subimages and (ii) Virgo LSB galaxies identified by eye in a full set of NGVS square degree gri images. MARSIAA/DetectLSB recovered ~20% more mock LSB galaxies and ~40% more LSB galaxies identified by eye than SExtractor/DetectLSB. With a 90% fraction of false positives from an entirely unsupervised pipeline, a completeness of 90% is reached for sources with r_e > 3" at a mean surface brightness level of mu_g=27.7 mag/arcsec^2 and a central surface brightness of mu^0_g=26.7 mag/arcsec^2. About 10% of the false positives are artifacts, the rest being background galaxies. We have found our method to be complementary to the application of matched filters and an optimized use of SExtractor, and to have the following advantages: it is scale-free, can be applied simultaneously to several bands, and is well adapted for crowded regions on the sky. Comment: 39 pages, 18 figures, accepted for publication in A&A.
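    MARSIAA itself is a multi-scale, multi-band Markovian segmenter and is not detailed in the abstract; the toy example below only illustrates the general idea of Markov-random-field segmentation of an image into brightness classes, using a Potts smoothness prior solved by iterated conditional modes (ICM). The single-band, single-scale simplification and all names and parameters are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def potts_icm_segmentation(image, n_classes=3, beta=1.5, n_iter=10):
    """Segment a 2-D intensity image into n_classes labels with an MRF prior.

    Each pixel label balances closeness to its class mean (data term) against
    agreement with its 4-connected neighbours (Potts smoothness term).
    """
    # Initialise class means and labels from intensity quantiles.
    qs = np.quantile(image, np.linspace(0.0, 1.0, n_classes + 1))
    labels = np.clip(np.digitize(image, qs[1:-1]), 0, n_classes - 1)
    for _ in range(n_iter):
        means = np.array([image[labels == c].mean() if np.any(labels == c)
                          else image.mean() for c in range(n_classes)])
        sigma2 = image.var() + 1e-12
        # Data term: squared distance of each pixel to each class mean.
        data = (image[..., None] - means) ** 2 / (2.0 * sigma2)
        # Smoothness term: number of 4-connected neighbours that disagree.
        padded = np.pad(labels, 1, mode="edge")
        neigh = np.stack([padded[:-2, 1:-1], padded[2:, 1:-1],
                          padded[1:-1, :-2], padded[1:-1, 2:]])
        disagree = (neigh[..., None] != np.arange(n_classes)).sum(axis=0)
        labels = np.argmin(data + beta * disagree, axis=-1)
    return labels
```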

    Spectral Mapping Reconstruction of Extended Sources

    Three-dimensional spectroscopy of extended sources is typically performed with dedicated integral field spectrographs. We describe a method of reconstructing full spectral cubes, with two spatial and one spectral dimension, from rastered spectral mapping observations employing a single slit in a traditional slit spectrograph. When the background and image characteristics are stable, as is often achieved in space, the use of traditional long slits for integral field spectroscopy can substantially reduce instrument complexity over dedicated integral field designs, without loss of mapping efficiency -- particularly compelling when a long-slit mode for single unresolved source followup is separately required. We detail a custom flux-conserving cube reconstruction algorithm, discuss issues of extended source flux calibration, and describe CUBISM, a tool which implements these methods for spectral maps obtained with the Spitzer Space Telescope's Infrared Spectrograph. Comment: 11 pages, 8 figures, accepted by PASP.
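    The flux-conserving reconstruction is described in the paper itself; as a heavily simplified illustration of the underlying idea, the sketch below accumulates slit samples onto an output spatial grid using overlap-area weights and normalizes by the summed weights. The function name, the (ix, iy, weight, spectrum) sample format, and the omission of CUBISM's actual clipping of detector pixels against the sky grid are all assumptions for illustration.

```python
import numpy as np

def build_cube(samples, nx, ny, n_lambda):
    """Assemble rastered-slit spectra into a (lambda, y, x) cube (sketch).

    samples : iterable of (ix, iy, weight, spectrum) tuples, where weight is
              the fractional overlap area between a detector pixel and the
              output spatial pixel (ix, iy), and spectrum is a length-n_lambda
              flux array for that detector pixel.
    """
    cube = np.zeros((n_lambda, ny, nx))
    wsum = np.zeros((ny, nx))
    for ix, iy, w, spec in samples:
        cube[:, iy, ix] += w * np.asarray(spec)  # accumulate weighted flux
        wsum[iy, ix] += w
    good = wsum > 0
    # Normalise by the summed overlap weights so surface brightness is conserved.
    cube[:, good] /= wsum[good]
    return cube
```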