
    Computing a Compact Spline Representation of the Medial Axis Transform of a 2D Shape

    We present a full pipeline for computing the medial axis transform of an arbitrary 2D shape. The instability of the medial axis transform is overcome by a pruning algorithm guided by a user-defined Hausdorff distance threshold. The stable medial axis transform is then approximated by spline curves in 3D to produce a smooth and compact representation. These spline curves are computed by minimizing the approximation error between the input shape and the shape represented by the medial axis transform. Our results on various 2D shapes suggest that our method is practical and effective, and yields faithful and compact representations of medial axis transforms of 2D shapes. Comment: GMP14 (Geometric Modeling and Processing)
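
    As a rough, hedged illustration of the spline-approximation step only (not the authors' pipeline), the sketch below fits a smoothing B-spline through hypothetical medial axis transform samples (x, y, r) using SciPy; the sample branch, smoothing factor, and knot-count printout are invented for the example.

        # Sketch: compact spline approximation of MAT samples (x, y, radius).
        # Hypothetical data; SciPy's smoothing splines stand in for the paper's
        # error-minimizing spline fit.
        import numpy as np
        from scipy.interpolate import splprep, splev

        # One hypothetical medial branch: centers (x, y) and radii r.
        x = np.linspace(0.0, 1.0, 50)
        y = 0.1 * np.sin(4 * np.pi * x)
        r = 0.2 + 0.05 * np.cos(2 * np.pi * x)

        # Fit a single smoothing cubic spline through the 3D points (x, y, r);
        # the smoothing factor s trades fidelity against the number of knots.
        tck, u = splprep([x, y, r], s=1e-3, k=3)

        # Evaluate the compact spline representation on a dense parameter grid.
        xs, ys, rs = splev(np.linspace(0.0, 1.0, 200), tck)
        print(len(tck[0]), "knots represent", len(x), "MAT samples")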

    A different view on the vector-valued empirical mode decomposition (VEMD)

    The empirical mode decomposition (EMD) has achieved its reputation by providing a multi-scale time-frequency representation of nonlinear and/or nonstationary signals. To extend this method to vector-valued signals (VvS) in multidimensional (multi-D) space, a multivariate EMD (MEMD) has been designed recently, which employs an ensemble projection to extract local extremum locations (LELs) of the given VvS with respect to different projection directions. This idea successfully overcomes the problem of locally defining extrema of a VvS. Different from the MEMD, where vector-valued envelopes (VvEs) are interpolated based on LELs extracted from the 1-D projected signal, the vector-valued EMD (VEMD) proposed in this paper employs a novel back projection method to interpolate the VvEs from 1-D envelopes in the projected space. Considering typical 4-D coordinates (3-D location and time), we show by numerical simulations that the VEMD outperforms state-of-the-art methods. Comment: 7th International Congress on Image and Signal Processing (CISP)
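
    As a generic, hedged sketch of the projection/back-projection idea (our simplification, not the paper's VEMD), the code below projects a 3-D vector-valued signal onto random unit directions, interpolates a 1-D upper envelope through each projection's maxima, and back-projects the envelopes along their directions before averaging; the test signal and the number of directions are arbitrary choices.

        # Sketch: ensemble projection and back projection of envelopes.
        import numpy as np
        from scipy.signal import argrelextrema
        from scipy.interpolate import CubicSpline

        def upper_envelope(t, s):
            # 1-D upper envelope: cubic spline through local maxima plus end points.
            idx = argrelextrema(s, np.greater)[0]
            idx = np.concatenate(([0], idx, [len(s) - 1]))
            return CubicSpline(t[idx], s[idx])(t)

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 500)
        v = np.stack([np.sin(10 * np.pi * t), np.cos(6 * np.pi * t), t]).T  # (N, 3) signal

        K = 16                                        # number of projection directions
        dirs = rng.normal(size=(K, 3))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

        # Back-projected vector-valued envelope, averaged over all directions.
        env = np.zeros_like(v)
        for d in dirs:
            p = v @ d                                 # 1-D projection onto direction d
            env += upper_envelope(t, p)[:, None] * d  # back projection along d
        env /= K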

    Perceptual techniques in audio quality assessment

    Bayesian Boundary Trend Filtering

    Estimating boundary curves has many applications in fields such as economics, climate science, and medicine. Bayesian trend filtering has been developed as a locally adaptive smoothing method for estimating the non-stationary trend of data. This paper develops a Bayesian trend filtering approach for estimating a boundary trend. To this end, a truncated multivariate normal working likelihood and global-local shrinkage priors based on scale mixtures of normal distributions are introduced. In particular, the well-known horseshoe prior on the differences leads to locally adaptive shrinkage estimation of the boundary trend. However, the full conditional distributions of the Gibbs sampler involve a high-dimensional truncated multivariate normal distribution. To overcome this sampling difficulty, an approximation of the truncated multivariate normal distribution is employed. Using this approximation, the proposed models lead to an efficient Gibbs sampling algorithm via Pólya-Gamma data augmentation. The proposed method is also extended by considering a nearly isotonic constraint. The performance of the proposed method is illustrated through numerical experiments and real data examples. Comment: 25 pages, 6 figures
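
    For orientation, the kind of hierarchy described above can be written schematically as follows; the notation, the truncation side (an upper boundary), the difference order k, and the half-Cauchy hyperpriors are standard horseshoe conventions assumed here rather than details taken from the paper.

        \begin{align*}
          y \mid \theta, \sigma^2 &\sim \mathrm{N}_{\{y \le \theta\}}\!\left(\theta, \sigma^2 I_n\right)
            && \text{truncated multivariate normal working likelihood} \\
          \bigl(D^{(k)} \theta\bigr)_j \mid \lambda_j, \tau &\sim \mathrm{N}\!\left(0, \lambda_j^2 \tau^2\right)
            && \text{global-local scale mixture of normals on the differences} \\
          \lambda_j \sim \mathrm{C}^{+}(0, 1), \quad \tau &\sim \mathrm{C}^{+}(0, 1)
            && \text{horseshoe: half-Cauchy local and global scales}
        \end{align*}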

    Detail Enhancing Denoising of Digitized 3D Models from a Mobile Scanning System

    The acquisition process of digitizing a large-scale environment produces an enormous amount of raw geometry data. This data is corrupted by system noise, which leads to 3D surfaces that are not smooth and details that are distorted. Any scanning system has noise associated with the scanning hardware, both digital quantization errors and measurement inaccuracies, but a mobile scanning system has additional system noise introduced by the pose estimation of the hardware during data acquisition. The combined system noise generates data that is not handled well by existing noise reduction and smoothing techniques. This research is focused on enhancing the 3D models acquired by mobile scanning systems used to digitize large-scale environments. These digitization systems combine a variety of sensors – including laser range scanners, video cameras, and pose estimation hardware – on a mobile platform for the quick acquisition of 3D models of real world environments. The data acquired by such systems are extremely noisy, often with significant details being on the same order of magnitude as the system noise. By utilizing a unique 3D signal analysis tool, a denoising algorithm was developed that identifies regions of detail and enhances their geometry, while removing the effects of noise on the overall model. The developed algorithm can be useful for a variety of digitized 3D models, not just those involving mobile scanning systems. The challenges faced in this study were the automatic processing needs of the enhancement algorithm, and the need to fill a gap in the area of 3D model analysis in order to reduce the effect of system noise on the 3D models. In this context, our main contributions are the automation and integration of a data enhancement method not well known to the computer vision community, and the development of a novel 3D signal decomposition and analysis tool. The new technologies featured in this document are intuitive extensions of existing methods to new dimensionality and applications. The totality of the research has been applied towards detail enhancing denoising of scanned data from a mobile range scanning system, and results from both synthetic and real models are presented.
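
    A minimal, generic sketch of the decompose-then-reweight idea for a scanned vertex set follows (our own illustration; the thesis relies on a different 3D signal decomposition tool, and the smoothing step, gating function, and parameter names below are assumptions).

        # Sketch: split vertices into a smooth base plus residual detail, then
        # attenuate residuals near the noise floor and keep larger ones.
        import numpy as np

        def laplacian_smooth(verts, neighbors, iters=10, lam=0.5):
            # verts: (N, 3) array; neighbors: list of index lists (1-ring per vertex).
            v = verts.copy()
            for _ in range(iters):
                avg = np.array([v[n].mean(axis=0) for n in neighbors])
                v += lam * (avg - v)                  # uniform-weight Laplacian step
            return v

        def detail_preserving_denoise(verts, neighbors, boost=1.0, noise_gate=1.0):
            base = laplacian_smooth(verts, neighbors)   # smooth base surface
            residual = verts - base                     # high-frequency component
            mag = np.linalg.norm(residual, axis=1)
            sigma = np.median(mag)                      # crude noise-scale estimate
            # Soft gate: suppress residuals near the noise floor, keep/boost detail.
            gain = boost * (1.0 - np.exp(-(mag / (noise_gate * sigma + 1e-12)) ** 2))
            return base + residual * gain[:, None]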

    Optimized normal and distance matching for heterogeneous object modeling

    This paper presents a new optimization methodology of material blending for heterogeneous object modeling by matching the material governing features for designing a heterogeneous object. The proposed method establishes a point-to-point correspondence, represented by a set of connecting lines, between two material directrices. To blend the material features between the directrices, a heuristic optimization method is developed with the objective of maximizing the sum of the inner products of the unit normals at the end points of the connecting lines and minimizing the sum of the lengths of the connecting lines. The geometric features with material information are matched to generate non-self-intersecting and non-twisted connecting surfaces. By subdividing the connecting lines into an equal number of segments, a series of intermediate piecewise curves are generated to represent the material metamorphosis between the governing material features. Alternatively, a dynamic programming approach developed in our earlier work is presented for comparison purposes. The results and computational efficiency of the proposed heuristic method are also compared with earlier techniques in the literature. Computer interface implementation and illustrative examples are also presented in this paper.
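
    The stated objective can be sketched as a simple scoring function (an illustrative simplification, not the paper's heuristic optimizer or its dynamic programming variant): reward aligned unit normals at the end points of the connecting lines, penalize their total length, and search over candidate pairings, here only over cyclic shifts of one sampled directrix.

        # Sketch: score a point-to-point correspondence between two directrices.
        import numpy as np

        def correspondence_score(p1, n1, p2, n2, w_len=1.0):
            # p1, p2: (N, 3) sampled points; n1, n2: (N, 3) unit normals, paired by index.
            normal_term = np.sum(np.einsum('ij,ij->i', n1, n2))    # sum of inner products
            length_term = np.sum(np.linalg.norm(p1 - p2, axis=1))  # sum of line lengths
            return normal_term - w_len * length_term

        def best_cyclic_shift(p1, n1, p2, n2, w_len=1.0):
            # Try every cyclic shift of the second directrix and keep the best pairing.
            scores = [correspondence_score(p1, n1,
                                           np.roll(p2, k, axis=0),
                                           np.roll(n2, k, axis=0), w_len)
                      for k in range(len(p1))]
            return int(np.argmax(scores)), max(scores)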

    Efficient DSP and Circuit Architectures for Massive MIMO: State-of-the-Art and Future Directions

    Massive MIMO is a compelling wireless access concept that relies on the use of an excess number of base-station antennas relative to the number of active terminals. This technology is a main component of 5G New Radio (NR) and addresses all important requirements of future wireless standards: a great capacity increase, the support of many simultaneous users, and improvement in energy efficiency. Massive MIMO requires the simultaneous processing of signals from many antenna chains and computational operations on large matrices. The complexity of the digital processing has in the past been viewed as a fundamental obstacle to the feasibility of Massive MIMO. Recent advances in system-algorithm-hardware co-design have led to extremely energy-efficient implementations. These exploit opportunities in deeply-scaled silicon technologies and perform partly distributed processing to cope with the bottlenecks encountered in the interconnection of many signals. For example, prototype ASIC implementations have demonstrated zero-forcing precoding in real time at 55 mW power consumption (20 MHz bandwidth, 128 antennas, multiplexing of 8 terminals). Coarse and even error-prone digital processing in the antenna paths permits a reduction of power consumption by a factor of 2 to 5. This article summarizes the fundamental technical contributions to efficient digital signal processing for Massive MIMO. The opportunities and constraints of operating with low-complexity RF and analog hardware chains are clarified. It illustrates how terminals can benefit from improved energy efficiency. The status of the technology and real-life prototypes is discussed. Open challenges and directions for future research are suggested. Comment: submitted to IEEE Transactions on Signal Processing
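
    As a toy illustration of the zero-forcing precoding mentioned above (128 antennas, 8 terminals), the sketch below builds a ZF precoder from a random i.i.d. Rayleigh channel; it is a numerical illustration only, with no connection to the cited ASIC implementation or its fixed-point arithmetic.

        # Sketch: downlink zero-forcing precoding, W = H^H (H H^H)^{-1}.
        import numpy as np

        rng = np.random.default_rng(0)
        M, K = 128, 8                                   # base-station antennas, terminals

        # i.i.d. Rayleigh channel H (K x M): one row per terminal.
        H = (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))) / np.sqrt(2)

        # ZF precoder: right pseudo-inverse of the channel, scaled to unit total power.
        W = H.conj().T @ np.linalg.inv(H @ H.conj().T)  # M x K
        scale = np.linalg.norm(W)
        W /= scale

        s = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)  # symbols
        x = W @ s                                       # transmitted vector (M antennas)
        y = H @ x                                       # received at the K terminals

        # Up to the known power-normalization scale, y recovers s without
        # inter-user interference.
        print(np.allclose(y * scale, s))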