
    The generalized Lasso with non-linear observations

    We study the problem of signal estimation from non-linear observations when the signal belongs to a low-dimensional set buried in a high-dimensional space. A rough heuristic often used in practice postulates that non-linear observations may be treated as noisy linear observations, and thus the signal may be estimated using the generalized Lasso. This is appealing because of the abundance of efficient, specialized solvers for this program. Just as noise may be diminished by projecting onto the lower-dimensional space, the error from modeling non-linear observations as linear ones is greatly reduced when the signal structure is used in the reconstruction. We allow general signal structure, only assuming that the signal belongs to some set K in R^n. We consider the single-index model of non-linearity. Our theory allows the non-linearity to be discontinuous, not one-to-one, and even unknown. We assume a random Gaussian model for the measurement matrix, but allow the rows to have an unknown covariance matrix. As special cases of our results, we recover near-optimal theory for noisy linear observations, and also give the first theoretical accuracy guarantee for 1-bit compressed sensing with unknown covariance matrix of the measurement vectors. Comment: 21 pages.
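
    The heuristic described above is easy to demonstrate numerically. The following is a minimal Python sketch, not code from the paper: sign observations y = sign(Ax) of a sparse signal are handed to an l1-penalized least-squares program as if they were linear data, solved here with a plain ISTA loop rather than one of the specialized solvers mentioned above. The dimensions, sparsity level, and regularization weight are illustrative choices.

        import numpy as np

        def soft_threshold(z, t):
            # Entrywise soft-thresholding: the proximal map of t * ||.||_1.
            return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

        def lasso_ista(A, y, lam, n_iter=500):
            # Solve min_x 0.5 * ||A x - y||^2 + lam * ||x||_1 by ISTA.
            x = np.zeros(A.shape[1])
            step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant
            for _ in range(n_iter):
                x = soft_threshold(x - step * (A.T @ (A @ x - y)), step * lam)
            return x

        rng = np.random.default_rng(0)
        n, m, s = 500, 200, 10                   # ambient dim, measurements, sparsity
        x_true = np.zeros(n)
        x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
        x_true /= np.linalg.norm(x_true)

        A = rng.standard_normal((m, n))          # Gaussian measurement matrix
        y = np.sign(A @ x_true)                  # 1-bit (sign) observations

        lam = 0.2 * np.max(np.abs(A.T @ y))      # fraction of lam_max (x = 0 above it)
        x_hat = lasso_ista(A, y, lam)            # treat y as if it were linear data
        x_hat /= np.linalg.norm(x_hat)
        print("correlation with true direction:", abs(x_hat @ x_true))

    Since 1-bit observations carry no amplitude information, the estimate is only meaningful up to scale, which is why both vectors are normalized before comparison.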

    LIRA-SAPR PROGRAM FOR GENERATING DESIGN MODELS OF RECONSTRUCTED BUILDINGS

    The paper deals with a simulation technique for buildings at the maintenance stage that accounts for changes in the structural model during reconstruction. The authors suggest an algorithm for linear and nonlinear analysis of structures in the LIRA-SAPR program that accounts for the erection process. Generation of design models for reconstructed buildings is illustrated with real examples from design practice: reconstruction of a 3-storey office building with an added storey; reconstruction of a 5-storey hostel with built-in non-residential premises where the floor slabs were replaced; reconstruction of a building accounting for detected defects and the strengthening that was made; and reconstruction of a 9-storey residential building damaged by a gas explosion, accounting for detected defects and the strengthening that was made.

    EIT Reconstruction Algorithms: Pitfalls, Challenges and Recent Developments

    We review developments, issues and challenges in Electrical Impedance Tomography (EIT), for the 4th Workshop on Biomedical Applications of EIT, Manchester 2003. We focus on the necessity for three-dimensional data collection and reconstruction, efficient solution of the forward problem, and present and future reconstruction algorithms. We also suggest common pitfalls or "inverse crimes" to avoid. Comment: A review paper for the 4th Workshop on Biomedical Applications of EIT, Manchester, UK, 2003.
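
    For orientation, the simplest member of this algorithm family is a one-step linearized reconstruction with Tikhonov regularization around a reference conductivity. The Python sketch below is a generic illustration rather than an algorithm from the review, and it assumes the Jacobian J of the forward map has already been computed by an external forward solver (not shown).

        import numpy as np

        def linearized_eit_step(J, dv, lam):
            # One-step Tikhonov-regularized linearized EIT reconstruction.
            # J   : Jacobian of the forward map at a reference conductivity
            #       (n_measurements x n_pixels), supplied by a forward solver.
            # dv  : measured voltages minus voltages simulated at the reference.
            # lam : regularization weight trading data fit against stability.
            # Returns the conductivity update d_sigma minimizing
            #     ||J d_sigma - dv||^2 + lam * ||d_sigma||^2.
            n = J.shape[1]
            return np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ dv)

    One concrete "inverse crime" in the sense above would be to simulate dv with the very same discretized forward model used to build J; testing on data from a finer or independent model avoids it.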

    Imaging via Compressive Sampling [Introduction to compressive sampling and recovery via convex programming]

    There is an extensive body of literature on image compression, but the central concept is straightforward: we transform the image into an appropriate basis and then code only the important expansion coefficients. The crux is finding a good transform, a problem that has been studied extensively from both a theoretical [14] and practical [25] standpoint. The most notable product of this research is the wavelet transform [9], [16]; switching from sinusoid-based representations to wavelets marked a watershed in image compression and is the essential difference between the classical JPEG [18] and modern JPEG-2000 [22] standards. Image compression algorithms convert high-resolution images into relatively small bit streams (while keeping the essential features intact), in effect turning a large digital data set into a substantially smaller one. But is there a way to avoid the large digital data set to begin with? Is there a way we can build the data compression directly into the acquisition? The answer is yes, and that is what compressive sampling (CS) is all about.
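
    The acquisition-side compression described above is typically paired with recovery by convex programming. The following Python sketch uses the cvxpy modeling package and assumes, for simplicity, a signal that is sparse in the identity basis; in imaging, sparsity would instead be enforced in a wavelet basis. All dimensions are illustrative.

        import numpy as np
        import cvxpy as cp

        rng = np.random.default_rng(1)
        n, m, s = 256, 80, 8                     # signal length, measurements, sparsity

        x_true = np.zeros(n)
        x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)

        Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random sensing matrix
        y = Phi @ x_true                                # compressive measurements

        # Basis pursuit: among all signals consistent with the measurements,
        # pick the one of minimal l1 norm as a convex proxy for sparsity.
        x = cp.Variable(n)
        prob = cp.Problem(cp.Minimize(cp.norm1(x)), [Phi @ x == y])
        prob.solve()

        print("relative error:",
              np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true))

    With on the order of s log(n/s) random measurements, basis pursuit recovers an s-sparse signal exactly with high probability, which is what makes building the compression into the acquisition viable.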

    Solving ptychography with a convex relaxation

    Ptychography is a powerful computational imaging technique that transforms a collection of low-resolution images into a high-resolution sample reconstruction. Unfortunately, the algorithms currently used to solve this reconstruction problem lack stability, robustness, and theoretical guarantees. Recently, convex optimization algorithms have improved the accuracy and reliability of several related reconstruction efforts. This paper proposes a convex formulation of the ptychography problem. This formulation has no local minima, it can be solved using a wide range of algorithms, it can incorporate appropriate noise models, and it can include multiple a priori constraints. The paper considers a specific algorithm, based on low-rank factorization, whose runtime and memory usage are near-linear in the size of the output image. Experiments demonstrate that this approach offers a 25% lower background variance on average than alternating projections, the current standard algorithm for ptychographic reconstruction. Comment: 8 pages, 8 figures.
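
    The paper's own formulation is not reproduced here, but the flavor of such convex relaxations can be seen in a minimal PhaseLift-style sketch for generic phaseless measurements, written in Python with cvxpy. Real-valued signals and small, illustrative dimensions are used to keep the example short.

        import numpy as np
        import cvxpy as cp

        rng = np.random.default_rng(2)
        n, m = 10, 50                        # keep tiny: the lifted variable is n x n

        x_true = rng.standard_normal(n)
        A = rng.standard_normal((m, n))
        b = (A @ x_true) ** 2                # phaseless (intensity-only) measurements

        # Lift: (a_i^T x)^2 = trace(a_i a_i^T X) with X = x x^T. Relax rank(X) = 1
        # to X >= 0 and minimize trace(X), the standard convex surrogate for rank.
        X = cp.Variable((n, n), PSD=True)
        constraints = [cp.trace(np.outer(a, a) @ X) == bi for a, bi in zip(A, b)]
        prob = cp.Problem(cp.Minimize(cp.trace(X)), constraints)
        prob.solve()

        # Read off the signal (up to global sign) from the top eigenpair of X.
        w, V = np.linalg.eigh(X.value)
        x_hat = np.sqrt(max(w[-1], 0.0)) * V[:, -1]
        print("error up to sign:", min(np.linalg.norm(x_hat - x_true),
                                       np.linalg.norm(x_hat + x_true)))

    The full semidefinite program scales poorly, since the lifted variable is n x n; this is exactly why the paper works with a low-rank factorization instead, bringing runtime and memory down to near-linear in the output size.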