Enhanced BC Algorithm Incorporating a Novel Sampling Step and a Fractional Box Count
The Box-Counting (BC) method is one of the most commonly used algorithms for calculating the fractal dimension of binary images in engineering, science, medicine, geology, and other fields, owing to its simplicity and reliability.
One of the issues in fractal dimension estimation is data sampling, the process of choosing the box sizes taken from a given image, which directly affects the precision of the estimate. The Geometric Step (GS) method, the arithmetic step method, and the divisor step method are the representative sampling methods; the GS method is mainly used because of its efficiency. However, the GS method has some inherent drawbacks. If the image size is large, it provides insufficient data for regression analysis, and it achieves 100 [%] pixel utilization only for images whose side length is an integer power of 2. Applying the GS method to an image of any other size may waste pixels in the calculation and degrade the estimation accuracy.
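For concreteness, the conventional BC computation with geometric steps can be sketched as follows (an illustrative Python sketch, not the thesis's exact procedure; boxes that do not fit inside the image are simply discarded, which is the pixel waste described above):

```python
import numpy as np

def box_count_gs(img):
    """Conventional box counting with geometric steps (box sizes 2, 4, 8, ...).

    `img` is a 2-D binary array (1 = fractal pixel).  Boxes that do not fit
    inside the image are discarded, so pixels outside whole boxes are wasted
    for non-power-of-two image sizes.
    """
    n = min(img.shape)
    sizes, counts = [], []
    s = 2
    while s <= n // 2:
        h = (img.shape[0] // s) * s          # usable extent: whole boxes only
        w = (img.shape[1] // s) * s
        # split the usable region into s-by-s boxes and count non-empty ones
        boxes = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(int((boxes.sum(axis=(1, 3)) > 0).sum()))
        sizes.append(s)
        s *= 2                               # geometric step
    # fractal dimension = slope of log N(s) versus log(1/s)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope, sizes, counts
```

A filled square, for instance, should yield a dimension close to 2, which is a quick sanity check on the regression step.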
In this thesis, a novel sampling method is proposed to resolve the shortcomings of the GS method, based on the intuitive observation that an estimate is likely to be more precise if more pixels are utilized at each step and a sufficiently large number of fitting data points is guaranteed. The proposed method, called the modified GS (MGS) method, is an improved version of the conventional GS method: it augments the GS step set with additional step sizes, selected for their high pixel utilization rate from among the middle values between consecutive integer powers of 2.
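One way such an MGS step set could be constructed is sketched below; the utilization threshold `min_util` and the helper names are assumptions for illustration, not the thesis's exact selection rule:

```python
def mgs_steps(n, min_util=0.95):
    """Sketch of a modified-GS step set for an n-by-n image: the geometric
    steps (powers of 2) plus those midpoints between consecutive powers of 2
    whose pixel utilization rate exceeds `min_util` (illustrative threshold).
    """
    def utilization(s):
        # fraction of the image covered by whole s-by-s boxes
        return ((n // s) * s) ** 2 / n ** 2

    steps = []
    s = 2
    while s <= n // 2:
        steps.append(s)                  # baseline geometric step
        mid = s + s // 2                 # midpoint between s and 2s
        if mid <= n // 2 and utilization(mid) >= min_util:
            steps.append(mid)            # extra step with high utilization
        s *= 2
    return sorted(steps)
```

On a 256-pixel image this keeps the small midpoints (3, 6, 12), where almost every pixel is still covered, and rejects the larger ones, where whole boxes leave large unused margins.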
No sampling method, the MGS method included, can guarantee 100 [%] pixel utilization when the BC method is applied to images of arbitrary size. This study therefore suggests a novel fractional counting method to resolve the problem of pixel waste. The proposed counting method counts the fractal pixels lying in each discarded box (one smaller than the full box size), normalizes this fractional count by the average pixel number of the full boxes at that step size, and adds the result to the integer count.
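Under one possible reading of this description, the fractional count could be sketched as follows (the normalization details are an assumption, not the thesis's exact formula):

```python
import numpy as np

def fractional_box_count(img, s):
    """Possible reading of the fractional count: fractal pixels in the
    discarded border strips are converted into a fractional number of boxes
    by dividing by the average fractal-pixel count of the whole boxes.
    """
    h = (img.shape[0] // s) * s
    w = (img.shape[1] // s) * s
    boxes = img[:h, :w].reshape(h // s, s, w // s, s).sum(axis=(1, 3))
    n_int = int((boxes > 0).sum())               # conventional integer count
    if n_int == 0:
        return 0.0
    avg = boxes[boxes > 0].mean()                # avg pixels per occupied box
    # fractal pixels in the discarded strips (right and bottom borders)
    waste = int(img.sum() - boxes.sum())
    return n_int + waste / avg                   # fractional box count
```

For a fully filled 10x10 image with s = 4, the whole boxes give an integer count of 4, and the 36 border pixels add 36/16 = 2.25 fractional boxes.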
The performance of the enhanced BC method incorporating the MGS method and the fractional counting method is verified on a set of deterministic fractal images whose theoretical dimensions are well known, and compared with that of the existing BC methods. The experimental results show that the proposed method outperforms both the conventional BC method and the triangle BC method.

Contents
List of Tables ii
List of Figures iii
Abstract v
Chapter 1. Introduction
1.1 Motivation 1
1.2 Research objectives 3
1.3 Organization of the thesis 3
Chapter 2. Overview of Fractal Theory
2.1 Definition of fractal 5
2.2 Fractal dimension 7
2.3 Fractal geometry 9
2.3.1 Mandelbrot set and Julia set 10
2.3.2 Koch snowflake (Opened) 11
2.3.3 Apollonian gasket 12
2.3.4 Vicsek fractal 13
2.3.5 Sierpinski triangle 14
2.3.6 Random Cantor 15
2.3.7 Koch curve 85° 16
2.3.8 Sierpinski carpet 17
2.3.9 Hilbert curve 18
Chapter 3. Existing Box-Counting Methods
3.1 Conventional BC method 20
3.2 Triangle BC method 25
Chapter 4. Enhanced BC method
4.1 Existing sampling methods and their drawbacks 27
4.1.1 Sampling methods 27
4.1.2 Pixel utilization 30
4.1.3 Drawbacks of existing sampling methods 30
4.2 New sampling method 32
4.3 Fractional box count 35
4.4 Procedure of the enhanced BC method 38
Chapter 5. Experiments and Review
5.1 Experiments on deterministic fractal image 41
5.1.1 Test image 41
5.1.2 Determination of 43
5.1.3 Experiment with images of pixels 44
5.1.4 Experiments on rotated image 45
5.1.5 Experiment with images of pixels 46
5.2 Experiments on non-deterministic fractal images 51
5.2.1 Converting color images to binary images 51
5.2.2 Coastline images 52
Chapter 6. Conclusion 56
References 58
Appendix 61
Stochastic Yield Analysis of Rare Failure Events in High-Dimensional Variation Space
As the semiconductor industry keeps shrinking feature sizes toward the nanometer scale, circuit reliability has become an area of growing concern due to the uncertainty introduced by process variations. For highly replicated standard cells, the failure event of each individual component must be extremely rare in order to maintain a sufficiently high yield rate. Existing yield analysis approaches work well in low dimensions, but become less effective either when there is a large number of circuit parameters or when the failure samples are distributed across multiple regions. In this thesis, four novel high-sigma analysis approaches are proposed.

First, we propose an adaptive importance sampling (AIS) algorithm. Whereas existing methods pre-decide a static sampling distribution, AIS adjusts its sampling region over several iterations. At each iteration, AIS generates samples from the current proposal distribution, then carefully assigns each sample a weight based on its tilted occurrence probability between the failure region and the current proposal distribution. We further design two adaptive frameworks, based on resampling and on population Metropolis-Hastings (MH), to iteratively search for failure regions.

Second, we develop an Adaptive Clustering and Sampling (ACS) method to estimate the failure rate of high-dimensional circuits with multiple failure regions. The basic idea is to cluster failure samples and build a global sampling distribution at each iteration. Specifically, in the clustering step we propose a multi-cone clustering method that partitions the parameter space and clusters the failure samples. A global sampling distribution is then constructed from a set of weighted Gaussian distributions. Next, we calculate an importance weight for each sample based on the discrepancy between the sampling distribution and the target distribution, and the failure probability estimate is updated at the end of each iteration.
This clustering and sampling procedure proceeds iteratively until all the failure regions are covered.

Moreover, two meta-model based approaches are proposed for high-sigma analysis. The Low-Rank Tensor Approximation (LRTA) formulates the meta-model in tensor space by representing a multi-way tensor as a finite sum of rank-one tensors. The polynomial degree of the LRTA model grows linearly with circuit dimension, which makes it especially promising for high-dimensional circuit problems. We solve the LRTA model efficiently with a robust greedy algorithm and calibrate it iteratively with an adaptive sampling method. The meta-model based importance sampling (MIS) method uses a Gaussian-process meta-model to construct a quasi-optimal importance sampling distribution, and performs Markov chain Monte Carlo (MCMC) simulation to generate new samples from the proposed distribution. By updating the global importance sampling estimator in an iterated framework, MIS achieves better efficiency and higher accuracy than traditional importance sampling methods. Experimental results validate that the proposed approaches are three orders of magnitude faster than Monte Carlo, and more accurate than both academic solutions, such as importance sampling and classification-based methods, and industrial solutions, such as the mixture IS used by Intel.
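As a generic illustration of the iterative sampling-region adjustment that AIS-style methods perform (a toy sketch, not the thesis's algorithm), consider estimating a rare Gaussian failure probability with a mean-shifted proposal that is re-centered on the failure samples each iteration:

```python
import numpy as np

def adaptive_is(limit_state, dim, sigma=1.5, n_iter=4, n_samples=2000, seed=0):
    """Toy adaptive importance sampling for the rare failure probability
    P(limit_state(x) <= 0) with x ~ standard normal.  Each iteration draws
    from a mean-shifted Gaussian proposal and re-centers it on the
    (target-weighted) failure samples found so far.
    """
    rng = np.random.default_rng(seed)
    mu = np.zeros(dim)                           # proposal mean, adapted
    est = 0.0
    for _ in range(n_iter):
        x = rng.normal(mu, sigma, size=(n_samples, dim))
        fail = limit_state(x) <= 0.0
        # weight = target density / proposal density (constants cancel)
        log_w = (-0.5 * (x ** 2).sum(1)
                 + 0.5 * (((x - mu) / sigma) ** 2).sum(1)
                 + dim * np.log(sigma))
        w = np.exp(log_w)
        est = (w * fail).mean()                  # latest-iteration estimate
        if fail.any():                           # adapt toward failure region
            wf = w[fail]
            mu = (wf[:, None] * x[fail]).sum(0) / wf.sum()
    return est
```

With a 1-D limit state 3 - x (failure when x >= 3, true probability about 1.35e-3), the adapted proposal concentrates near the failure boundary and the estimate stabilizes within a few iterations.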
Spatial-temporal fractions verification for high-resolution ensemble forecasts
Experiments with two ensemble systems, at resolutions of 10 km (MF10km) and 2 km (MF2km), were designed to examine the value of cloud-resolving ensemble forecasts in predicting precipitation at small spatiotemporal scales. Since the verification was performed on short-term precipitation at high resolution, uncertainties from small-scale processes made the traditional verification methods inconsistent with subjective evaluation. An extended verification method based on the Fractions Skill Score (FSS) was introduced to account for these uncertainties. The main idea is to extend the concept of spatial neighborhood in FSS to the temporal and ensemble dimensions. The extension was motivated by the recognition that, even when an ensemble forecast is used, small-scale variability still exists in the forecasts and influences verification results. In addition to FSS, the neighborhood concept was also incorporated into reliability diagrams and relative operating characteristics to verify the reliability and resolution of the two systems. The extension of FSS in the time dimension demonstrates the important role of temporal scales in verifying short-term precipitation at small spatial scales. The extension of FSS in ensemble space, called the ensemble FSS, represents FSS in ensemble forecasting better than the FSS of the ensemble mean does. The verification results show that MF2km outperforms MF10km in heavy-rain forecasts. In contrast, MF10km was slightly better than MF2km in predicting light rain, suggesting that a horizontal resolution of 2 km is not necessarily sufficient to fully resolve convective cells.
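For reference, the standard single-field spatial FSS underlying these extensions can be sketched as follows; the ensemble and temporal extensions described above average the forecast fractions over members and time windows before the score is formed (a sketch under that assumption):

```python
import numpy as np

def fractions(binary, n):
    """Fraction of event pixels in the (2n+1)x(2n+1) neighborhood of each
    grid point, computed with a padded cumulative sum (box filter)."""
    p = np.pad(binary.astype(float), n)
    c = p.cumsum(0).cumsum(1)
    c = np.pad(c, ((1, 0), (1, 0)))          # prepend zeros for window sums
    k = 2 * n + 1
    s = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return s / k ** 2

def fss(fcst, obs, thresh, n):
    """Fractions Skill Score of a forecast field against observations for a
    given threshold and neighborhood half-width n (standard definition)."""
    pf = fractions(fcst >= thresh, n)
    po = fractions(obs >= thresh, n)
    mse = ((pf - po) ** 2).mean()
    ref = (pf ** 2).mean() + (po ** 2).mean()
    return 1.0 - mse / ref if ref > 0 else np.nan
```

A perfect forecast scores 1, a forecast with no overlap of fractions scores 0, and widening the neighborhood n makes the score more forgiving of small displacement errors.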
High-Dimensional Stochastic Design Optimization by Adaptive-Sparse Polynomial Dimensional Decomposition
This paper presents a novel adaptive-sparse polynomial dimensional decomposition (PDD) method for stochastic design optimization of complex systems. The method entails an adaptive-sparse PDD approximation of a high-dimensional stochastic response for statistical moment and reliability analyses; a novel integration of the adaptive-sparse PDD approximation and score functions for estimating the first-order design sensitivities of the statistical moments and failure probability; and standard gradient-based optimization algorithms. New analytical formulae are presented for the design sensitivities that are simultaneously determined along with the moments or the failure probability. Numerical results stemming from mathematical functions indicate that the new method provides more computationally efficient design solutions than the existing methods. Finally, stochastic shape optimization of a jet engine bracket with 79 variables was performed, demonstrating the power of the new method to tackle practical engineering problems.

Comment: 18 pages, 2 figures, to appear in Sparse Grids and Applications--Stuttgart 2014, Lecture Notes in Computational Science and Engineering 109, edited by J. Garcke and D. Pflüger, Springer International Publishing, 201
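The score-function idea of obtaining design sensitivities from the same samples as the moments can be illustrated with the classic identity d/dmu E[h(X)] = E[h(X) (X - mu) / sigma^2] for X ~ N(mu, sigma^2) (a generic example, not the paper's PDD-based formulae):

```python
import numpy as np

def moment_and_sensitivity(h, mu, sigma, n=200_000, seed=0):
    """Score-function estimates of the mean E[h(X)] and its design
    sensitivity d/dmu E[h(X)] = E[h(X) * (X - mu) / sigma^2] for
    X ~ N(mu, sigma^2), both computed from the SAME sample set -- the
    'simultaneously determined' property noted in the abstract.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, n)
    hx = h(x)
    score = (x - mu) / sigma ** 2        # d/dmu of log N(x; mu, sigma^2)
    return hx.mean(), (hx * score).mean()
```

For h(x) = x^2 with mu = 1 and sigma = 1, the exact values are E[X^2] = mu^2 + sigma^2 = 2 and d/dmu E[X^2] = 2 mu = 2, so both estimates should land near 2.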