Manufacturing complexity analysis
The analysis of the complexity of a typical system is presented. Starting with the subsystems of an example system, a step-by-step procedure for analyzing the complexity of the overall system is given. Learning curves are determined for the various subsystems, along with the corresponding numbers of relevant design parameters. Trend curves are then plotted for the learning-curve slopes versus the various design-oriented parameters, e.g. number of parts versus learning-curve slope, or number of fasteners versus learning-curve slope. Representative cuts are taken from each trend curve, and a figure-of-merit analysis is made for each subsystem. From these values a characteristic curve is plotted that indicates the complexity of the particular subsystem. Each characteristic curve is based on a universe of trend-curve data taken from data points observed for the subsystem in question. Thus a characteristic curve is developed for each subsystem in the overall system.
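As a rough illustration of the first step, the learning-curve slope for a subsystem can be fitted from unit-cost data by least squares in log-log space. This is a minimal sketch with made-up numbers; the function name and the ~80% curve are illustrative assumptions, not data from the analysis above.

```python
import math

def learning_curve_slope(unit_costs):
    """Fit log(cost) = log(C1) + b*log(n) by least squares; return the
    exponent b (the learning-curve slope in log-log coordinates).

    The commonly quoted learning rate (e.g. an "80% curve") is 2**b,
    the cost multiplier each time cumulative output doubles.
    """
    xs = [math.log(n + 1) for n in range(len(unit_costs))]
    ys = [math.log(c) for c in unit_costs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Hypothetical subsystem data: cost of successive units following an
# exact power law with exponent -0.32 (roughly an 80% learning curve).
costs = [100 * (n + 1) ** -0.32 for n in range(20)]
b = learning_curve_slope(costs)
```

Repeating the fit for each subsystem and plotting b against a design parameter (number of parts, number of fasteners) would give the trend curves described above.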
Complexity analysis of the stock market
We study the complexity of the stock market by constructing ε-machines of the Standard & Poor's 500 index from February 1983 to April 2006 and by measuring their statistical complexities. We find that both the statistical complexity and the number of causal states of the constructed ε-machines have decreased over the last twenty years, and that the average memory length needed to predict the future optimally has become shorter. These results suggest that information was delivered to economic agents and incorporated into market prices more rapidly in 2006 than in 1983.
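The construction can be sketched crudely: group fixed-length histories by their empirical next-symbol distributions, treat each group as a causal state, and take the entropy of the state distribution as the statistical complexity. This is a simplified stand-in for the machinery actually used to build ε-machines, on a toy symbolic sequence rather than index data; the tolerance and history length are illustrative assumptions.

```python
import math
from collections import Counter, defaultdict

def causal_states(seq, k=2, tol=0.05):
    """Group length-k histories whose empirical next-symbol distributions
    agree within tol; each group is a crude causal-state estimate."""
    nxt = defaultdict(Counter)
    for i in range(len(seq) - k):
        nxt[seq[i:i + k]][seq[i + k]] += 1
    dists = {h: {s: c / sum(cnt.values()) for s, c in cnt.items()}
             for h, cnt in nxt.items()}
    states = []  # each state is a list of histories with matching behavior
    for h, d in dists.items():
        for st in states:
            d0 = dists[st[0]]
            if all(abs(d.get(s, 0) - d0.get(s, 0)) <= tol
                   for s in set(d) | set(d0)):
                st.append(h)
                break
        else:
            states.append([h])
    return states

def statistical_complexity(seq, states, k=2):
    """Entropy (bits) of the empirical distribution over causal states."""
    hist = Counter(seq[i:i + k] for i in range(len(seq) - k))
    total = sum(hist.values())
    probs = [sum(hist[h] for h in st) / total for st in states]
    return -sum(p * math.log2(p) for p in probs if p > 0)

seq = "01" * 500  # a perfectly periodic two-symbol process
sts = causal_states(seq)
sc = statistical_complexity(seq, sts)
```

For the periodic sequence the estimate finds two causal states ("just saw 0" vs. "just saw 1") and a statistical complexity of 1 bit; a memoryless coin flip would collapse to a single state with complexity 0.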
Modular Complexity Analysis for Term Rewriting
All current approaches to analyzing the derivational complexity of term
rewrite systems are based on a single termination method, possibly preceded by
transformations. The exclusive use of such direct criteria is problematic,
however, because of their restricted power. To overcome this limitation, the
article introduces a modular framework that allows (polynomial) upper bounds
on the complexity of term rewrite systems to be inferred by combining
different criteria. Since the fundamental idea is based on relative rewriting,
we study how matrix interpretations and match-bounds can be used and extended
to measure complexity for relative rewriting. The modular framework is proved
strictly more powerful than the conventional setting. Furthermore, the results
have been implemented, and experiments show significant gains in power.
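The flavor of a matrix interpretation can be shown on a toy one-rule string rewrite system (a hypothetical example, not one from the paper): interpret each letter as an affine map over the naturals (a 1-dimensional matrix interpretation) and check a strict decrease from left- to right-hand side, which certifies a linear bound on derivation lengths.

```python
def interp(word, maps, x=0):
    """Evaluate a word by composing the interpretations of its letters,
    applied right to left to the starting value x."""
    for ch in reversed(word):
        x = maps[ch](x)
    return x

# Hypothetical system {aa -> ab} with interpretation a(x) = x + 2,
# b(x) = x + 1 over the naturals. Strict decrease lhs > rhs for all x
# means every rewrite step lowers the interpreted value, so derivation
# length is at most linear in the interpreted size of the start term.
maps = {"a": lambda x: x + 2, "b": lambda x: x + 1}
for x in range(100):  # spot-check the strict decrease
    assert interp("aa", maps, x) > interp("ab", maps, x)
```

The modular framework sketched in the abstract combines several such certificates, each handling part of the system relative to the rest, instead of demanding one interpretation that orients everything.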
A complexity analysis of statistical learning algorithms
We apply information-based complexity analysis to support vector machine
(SVM) algorithms, with the goal of a comprehensive continuous algorithmic
analysis of such algorithms. This involves complexity measures in which some
higher-order operations (e.g., certain optimizations) are considered primitive
for the purposes of measuring complexity. We consider classes of information
operators and algorithms made up of scaled families, and investigate the
utility of scaling the complexities to minimize error. We examine the division
of statistical learning into information and algorithmic components, the
complexities of each, and applications to SVM and more general machine
learning algorithms. We give applications to SVM algorithms graded into linear
and higher-order components, and present an example from biomedical
informatics.
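One way to picture the information/algorithmic split is to count data accesses separately from weight updates in a simple subgradient SVM trainer. This is a Pegasos-style sketch on synthetic separable data; the counters are an illustrative device of this sketch, not the complexity measures of the paper.

```python
import random

def train_linear_svm(data, lam=0.01, epochs=50):
    """Subgradient descent on the regularized hinge loss for a 2-D linear
    SVM without bias. Returns the weights plus two counters: sample reads
    (information operations) and hinge-driven weight updates (algorithmic
    operations)."""
    w = [0.0, 0.0]
    reads = updates = 0
    t = 0
    for _ in range(epochs):
        for x, y in data:
            t += 1
            reads += 1                    # one information operation
            eta = 1.0 / (lam * t)
            margin = y * (w[0] * x[0] + w[1] * x[1])
            w = [wi * (1 - eta * lam) for wi in w]   # regularization decay
            if margin < 1:                # hinge subgradient is active
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                updates += 1              # one algorithmic operation
    return w, reads, updates

random.seed(0)
data = [((random.uniform(1, 2), random.uniform(1, 2)), 1) for _ in range(20)]
data += [((random.uniform(-2, -1), random.uniform(-2, -1)), -1) for _ in range(20)]
w, reads, updates = train_linear_svm(data)
```

Here the information cost grows with epochs times sample count regardless of the data, while the number of hinge updates depends on how quickly the margins exceed 1, which is the kind of separation between components the abstract alludes to.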
Time complexity analysis of generalized decomposition algorithm
The time complexity of the fast algorithm for generalized disjunctive decomposition of an r-valued function is studied. The algorithm, which finds the best decomposition, is based on the analysis of multiple-terminal multiple-valued decision diagrams. It is shown that the time complexity for random r-valued functions depends on restrictions such as the number n1 of inputs in the first-level circuit. When the best partition of the input variables is searched for under this restriction, the time complexity is reduced several-fold. The algorithm was simulated on a digital computer, and the experimental results agree with the theoretical predictions.
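The effect of the restriction can be illustrated with a simple count of candidate bound sets the search must examine. This is a hypothetical model of the search space only; the per-candidate cost of the decision-diagram analysis is not modeled here.

```python
from math import comb

def partitions_to_check(n, n1=None):
    """Number of bound-set choices a decomposition search examines over n
    inputs: every non-trivial subset when unrestricted, or only the
    C(n, n1) subsets of size n1 when the first-level input count is fixed
    (an illustrative model of the restriction in the abstract)."""
    if n1 is None:
        return 2 ** n - 2  # all subsets except the empty and full sets
    return comb(n, n1)

n = 10
unrestricted = partitions_to_check(n)        # 1022 candidates
restricted = partitions_to_check(n, n1=4)    # 210 candidates
```

For n = 10 inputs the restriction cuts the candidate count from 1022 to 210, roughly a five-fold reduction, consistent with the "several times" speed-up reported above.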
Complexity Analysis of Balloon Drawing for Rooted Trees
In a balloon drawing of a tree, all the children under the same parent are
placed on the circumference of the circle centered at their parent, and the
radius of the circle centered at each node along any path from the root
reflects the number of descendants associated with the node. Among various
styles of tree drawings reported in the literature, the balloon drawing enjoys
a desirable feature of displaying tree structures in a rather balanced fashion.
For each internal node in a balloon drawing, the ray from the node to each of
its children divides the wedge accommodating the subtree rooted at the child
into two sub-wedges. Depending on whether the two sub-wedge angles are required
to be identical or not, a balloon drawing can further be divided into two
types: even sub-wedge and uneven sub-wedge types. In the most general case, for
any internal node in the tree there are two dimensions of freedom that affect
the quality of a balloon drawing: (1) altering the order in which the children
of the node appear in the drawing, and (2) for the subtree rooted at each child
of the node, flipping the two sub-wedges of the subtree. In this paper, we give
a comprehensive complexity analysis for optimizing balloon drawings of rooted
trees with respect to angular resolution, aspect ratio and standard deviation
of angles under various drawing cases depending on whether the tree is of even
or uneven sub-wedge type and whether (1) and (2) above are allowed. It turns
out that some are NP-complete while others can be solved in polynomial time. We
also derive approximation algorithms for those that are intractable in general.
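A minimal sketch of the even sub-wedge case: allot each child of a node a wedge proportional to its subtree size, so the angular resolution at that node is the smallest wedge. The tree shape and names below are illustrative, not taken from the paper.

```python
import math

def descendants(tree, v):
    """Size of the subtree rooted at v (tree maps node -> list of children)."""
    return 1 + sum(descendants(tree, c) for c in tree.get(v, []))

def balloon_angles(tree, v):
    """Even sub-wedge placement at v: each child receives a wedge whose
    angle is proportional to its subtree size. Returns the wedge angles
    in radians; their minimum is the angular resolution at v."""
    kids = tree.get(v, [])
    sizes = [descendants(tree, c) for c in kids]
    total = sum(sizes)
    return [2 * math.pi * s / total for s in sizes]

# Hypothetical tree: root r with subtrees of sizes 1, 2 and 4.
tree = {"r": ["a", "b", "c"], "b": ["b1"], "c": ["c1", "c2", "c3"]}
angles = balloon_angles(tree, "r")
resolution = min(angles)
```

The optimization questions analyzed above then amount to choosing the child order (and, in the uneven case, the sub-wedge flips) so that quantities like this minimum wedge angle are as favorable as possible.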