The generalized Lasso with non-linear observations
We study the problem of signal estimation from non-linear observations when
the signal belongs to a low-dimensional set buried in a high-dimensional space.
A rough heuristic often used in practice postulates that non-linear
observations may be treated as noisy linear observations, and thus the signal
may be estimated using the generalized Lasso. This is appealing because of the
abundance of efficient, specialized solvers for this program. Just as noise may
be diminished by projecting onto the lower dimensional space, the error from
modeling non-linear observations with linear observations will be greatly
reduced when using the signal structure in the reconstruction. We allow general
signal structure, only assuming that the signal belongs to some set K in R^n.
We consider the single-index model of non-linearity. Our theory allows the
non-linearity to be discontinuous, not one-to-one and even unknown. We assume a
random Gaussian model for the measurement matrix, but allow the rows to have an
unknown covariance matrix. As special cases of our results, we recover
near-optimal theory for noisy linear observations, and also give the first
theoretical accuracy guarantee for 1-bit compressed sensing with unknown
covariance matrix of the measurement vectors. (Comment: 21 pages)
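The heuristic described in this abstract can be illustrated numerically. The sketch below is not the paper's experiment; it is a minimal toy, with all parameters (dimensions, sparsity, regularization level) chosen by assumption. It takes 1-bit (sign) observations of a sparse signal, runs the ordinary Lasso via ISTA as if the observations were linear, and checks that the estimate aligns with the true signal direction:

```python
import numpy as np

def ista_lasso(A, y, lam, n_iter=300):
    """Solve min_x (1/2m)||y - Ax||^2 + lam*||x||_1 by iterative soft-thresholding."""
    m, d = A.shape
    L = np.linalg.norm(A, 2) ** 2 / m          # Lipschitz constant of the smooth part
    x = np.zeros(d)
    for _ in range(n_iter):
        grad = -A.T @ (y - A @ x) / m
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold step
    return x

rng = np.random.default_rng(0)
m, d, s = 2000, 50, 3
x_true = np.zeros(d)
x_true[rng.choice(d, s, replace=False)] = 1.0 / np.sqrt(s)   # unit-norm sparse signal

A = rng.standard_normal((m, d))      # Gaussian measurement matrix
y = np.sign(A @ x_true)              # 1-bit, non-linear observations

x_hat = ista_lasso(A, y, lam=0.05)
# The sign non-linearity destroys the scale, so only the direction is recoverable.
cos_sim = x_hat @ x_true / np.linalg.norm(x_hat)
print(round(cos_sim, 3))
```

The high cosine similarity reflects the abstract's point: treating non-linear observations as noisy linear ones, then exploiting the signal structure (here sparsity, via the `lam` penalty), recovers the signal up to an unknown scaling.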
LIRA-SAPR PROGRAM FOR GENERATING DESIGN MODELS OF RECONSTRUCTED BUILDINGS
The paper deals with a technique for simulating buildings at the maintenance stage, accounting for changes in the structural model during reconstruction. The authors suggest an algorithm for linear and nonlinear analysis of structures in the LIRA-SAPR program that accounts for the erection process. Generation of design models for reconstructed buildings is illustrated with real examples from design practice: reconstruction of a 3-storey office building with an added storey; reconstruction of a 5-storey hostel with built-in non-residential premises where floor slabs were replaced; reconstruction of a building accounting for detected defects and the strengthening that was carried out; and reconstruction of a 9-storey residential building damaged by a gas explosion, accounting for detected defects and the strengthening that was carried out.
EIT Reconstruction Algorithms: Pitfalls, Challenges and Recent Developments
We review developments, issues and challenges in Electrical Impedance
Tomography (EIT), for the 4th Workshop on Biomedical Applications of EIT,
Manchester 2003. We focus on the necessity for three dimensional data
collection and reconstruction, efficient solution of the forward problem and
present and future reconstruction algorithms. We also suggest common pitfalls
or ``inverse crimes'' to avoid. (Comment: A review paper for the 4th Workshop on
Biomedical Applications of EIT, Manchester, UK, 2003)
Imaging via Compressive Sampling [Introduction to compressive sampling and recovery via convex programming]
There is an extensive body of literature on image compression, but the central concept is straightforward: we transform the image into an appropriate basis and then code only the important expansion coefficients. The crux is finding a good transform, a problem that has been studied extensively from both a theoretical [14] and practical [25] standpoint. The most notable product of this research is the wavelet transform [9], [16]; switching from sinusoid-based representations to wavelets marked a watershed in image compression and is the essential difference between the classical JPEG [18] and modern JPEG-2000 [22] standards.
Image compression algorithms convert high-resolution images into relatively small bit streams (while keeping the essential features intact), in effect turning a large digital data set into a substantially smaller one. But is there a way to avoid the large digital data set to begin with? Is there a way we can build the data compression directly into the acquisition? The answer is yes, and that is what compressive sampling (CS) is all about.
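The recovery-via-convex-programming idea mentioned in this title can be sketched in a few lines. The toy below is an assumption-laden illustration, not taken from the article: it acquires far fewer random measurements than the signal dimension, then recovers the sparse signal exactly by basis pursuit (minimize the l1 norm subject to the measurements), posed as a linear program via the standard split x = u - v with u, v >= 0:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
d, m, s = 100, 40, 4                       # ambient dim, measurements, sparsity
x_true = np.zeros(d)
x_true[rng.choice(d, s, replace=False)] = rng.standard_normal(s)

A = rng.standard_normal((m, d)) / np.sqrt(m)   # random Gaussian sensing matrix
y = A @ x_true                                  # m << d compressive measurements

# Basis pursuit:  min ||x||_1  s.t.  Ax = y,  as an LP in (u, v) with x = u - v.
c = np.ones(2 * d)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * d), method="highs")
x_hat = res.x[:d] - res.x[d:]

err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(res.status, round(err, 6))
```

With 40 measurements of a 4-sparse signal in 100 dimensions, the l1 program typically recovers the signal exactly, which is the sense in which compression is "built into the acquisition."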
Variability in Exposure to Subspecialty Rotations During Orthopaedic Residency: A Website-based Review of Orthopaedic Residency Programs.
Introduction: The variability in exposure to various subspecialty rotations during orthopaedic residency across the United States has not been well studied. Methods: Data regarding program size, residents' sex, department leadership, university-based status of the program, outsourcing of subspecialty rotations, and geographic location were collected from the websites of 151 US allopathic orthopaedic residency programs. The relationship of these factors with the time allotted for various clinical rotations was analyzed. Results: The number of residents in a program correlated positively with time allocated for elective rotations (r = 0.57, P = 0.0003). Residents in programs where the program director was a general orthopaedic surgeon spent more time on general orthopaedic rotations (22 versus 9.9 months, P = 0.001). Programs where the program director or chairman was an orthopaedic oncologist spent more time on oncology rotations (3.8 versus 3 months, P = 0.01, and 3.5 versus 2.7 months, P = 0.01, respectively). Residents in community programs spent more time on adult reconstruction than those in university-based programs (6.6 versus 5.5 months, P = 0.014). Based on multiple linear regression analysis, time allotted for adult reconstruction (t = 2.29, P = 0.02) and elective rotations (t = 2.43, P = 0.017) was positively associated with the number of residents in the program. Conclusions: Substantial variability exists in the time allocated to various clinical rotations during orthopaedic residency. The effect of this variability on clinical competence, trainees' career choices, and quality of patient care needs further study.
Solving ptychography with a convex relaxation
Ptychography is a powerful computational imaging technique that transforms a
collection of low-resolution images into a high-resolution sample
reconstruction. Unfortunately, algorithms that are currently used to solve this
reconstruction problem lack stability, robustness, and theoretical guarantees.
Recently, convex optimization algorithms have improved the accuracy and
reliability of several related reconstruction efforts. This paper proposes a
convex formulation of the ptychography problem. This formulation has no local
minima, it can be solved using a wide range of algorithms, it can incorporate
appropriate noise models, and it can include multiple a priori constraints. The
paper considers a specific algorithm, based on low-rank factorization, whose
runtime and memory usage are near-linear in the size of the output image.
Experiments demonstrate that this approach offers a 25% lower background
variance on average than alternating projections, the current standard
algorithm for ptychographic reconstruction. (Comment: 8 pages, 8 figures)
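The paper's algorithm works on real ptychographic data; the sketch below is only a hedged, simplified illustration of the low-rank factorization idea on a generic phaseless (intensity-only) measurement model, with every parameter chosen by assumption. Instead of solving the lifted convex problem over a d x d matrix Z = z z^T, it optimizes the rank-1 factor z directly, using a spectral initialization followed by gradient descent:

```python
import numpy as np

rng = np.random.default_rng(2)
d, m = 16, 200
x_true = rng.standard_normal(d)
x_true /= np.linalg.norm(x_true)          # unit-norm ground truth

A = rng.standard_normal((m, d))           # measurement vectors a_i (rows of A)
b = (A @ x_true) ** 2                     # intensity-only data |<a_i, x>|^2

# Spectral initialization: leading eigenvector of (1/m) sum_i b_i a_i a_i^T,
# rescaled so that ||z||^2 matches the average measured intensity.
Y = (A.T * b) @ A / m
eigvals, eigvecs = np.linalg.eigh(Y)
z = eigvecs[:, -1] * np.sqrt(np.mean(b))

# Gradient descent on f(z) = (1/4m) sum_i ((a_i^T z)^2 - b_i)^2, the rank-1
# (z z^T) parametrization of the lifted problem: memory is O(d), not O(d^2).
eta = 0.05
for _ in range(2000):
    Az = A @ z
    grad = (A.T @ ((Az ** 2 - b) * Az)) / m
    z -= eta * grad

# The sign of z is unidentifiable from intensities, so compare up to sign.
err = min(np.linalg.norm(z - x_true), np.linalg.norm(z + x_true))
print(round(err, 6))
```

The factorized iteration keeps the near-linear memory footprint the abstract highlights, at the cost of a nonconvex landscape that the spectral initialization must place us on the right side of.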