
    Manufacturing complexity analysis

    The analysis of the complexity of a typical system is presented. Starting with the subsystems of an example system, a step-by-step procedure for analyzing the complexity of the overall system is given. Learning curves for the various subsystems are determined, along with the corresponding numbers of relevant design parameters. Trend curves are then plotted for the learning-curve slopes versus the various design-oriented parameters, e.g. number of parts versus learning-curve slope, or number of fasteners versus learning-curve slope. Representative cuts are taken from each trend curve, and a figure-of-merit analysis is made for each of the subsystems. Based on these values, a characteristic curve is plotted that is indicative of the complexity of the particular subsystem. Each such characteristic curve is based on a universe of trend-curve data taken from data points observed for the subsystem in question. A characteristic curve is thus developed for each subsystem in the overall system.
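    The procedure lends itself to a small computational sketch. The Python fragment below (all data, the slope-versus-parts relationship, and the figure-of-merit formula are invented for illustration; the abstract specifies none of these) fits a Wright's-law learning curve per subsystem and relates the fitted slope to one design-oriented parameter:

```python
# Minimal sketch with hypothetical data: fit learning curves for several
# subsystems and relate the fitted slopes to a design parameter (part count).
import numpy as np

def fit_learning_curve(units, unit_costs):
    """Fit cost(n) = a * n**b by least squares in log-log space.

    Returns (a, b); b is the learning-curve slope exponent
    (b < 0 means unit cost falls as cumulative output grows).
    """
    logn, logc = np.log(units), np.log(unit_costs)
    b, log_a = np.polyfit(logn, logc, 1)
    return np.exp(log_a), b

# Hypothetical observations per subsystem: cumulative unit numbers, unit
# costs, and a design-oriented parameter (number of parts).
subsystems = {
    "avionics":  {"parts": 1200, "units": np.arange(1, 21), "costs": None},
    "structure": {"parts": 300,  "units": np.arange(1, 21), "costs": None},
}
rng = np.random.default_rng(0)
for name, s in subsystems.items():
    true_b = -0.05 - 0.0001 * s["parts"]      # assumed: more parts, steeper slope
    s["costs"] = 100 * s["units"] ** true_b * rng.lognormal(0, 0.02, 20)

# Trend: learning-curve slope versus part count, then a toy figure of merit.
for name, s in subsystems.items():
    a, b = fit_learning_curve(s["units"], s["costs"])
    figure_of_merit = abs(b) / np.log10(s["parts"])   # illustrative ratio only
    print(f"{name}: slope b = {b:.4f}, parts = {s['parts']}, FoM = {figure_of_merit:.4f}")
```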

    An analysis of MRI-derived cortical complexity in premature-born adults: regional patterns, risk factors, and potential significance

    Premature birth carries an increased risk of aberrant brain development, in both structure and function. Cortical complexity (CC) expresses the fractal dimension of the brain surface and changes during neurodevelopment. We hypothesized that CC is altered after premature birth and associated with long-term cognitive development. One hundred and one very premature-born adults (gestational age <32 weeks and/or birth weight <1500 g) and 111 term-born adults were assessed by structural MRI and cognitive testing at 26 years of age. CC was measured from MRI by vertex-wise estimation of fractal dimension. Cognitive performance was measured with the Griffiths Mental Development Scales (at 20 months) and the Wechsler Adult Intelligence Scale (at 26 years). In premature-born adults, CC was decreased bilaterally in large lateral temporal and medial parietal clusters. Decreased CC was associated with lower gestational age and birth weight. Furthermore, decreased CC in the medial parietal cortices was linked with reduced full-scale IQ in premature-born adults and mediated the association between cognitive development at 20 months and IQ in adulthood. The results demonstrate that CC is reduced in very premature-born adults in temporoparietal cortices, mediating the impact of prematurity on impaired cognitive development. These data indicate functionally relevant long-term alterations in the brain's basic geometry of cortical organization after premature birth.
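    The mediation claim (CC carrying part of the association between early cognitive development and adult IQ) corresponds to a standard product-of-coefficients analysis. A minimal sketch on simulated data follows; variable names, effect sizes, and the OLS helper are invented, and the paper's actual statistical pipeline is not specified here:

```python
# Minimal sketch, simulated data: product-of-coefficients mediation, testing
# whether cortical complexity (cc) mediates the path from cognition at
# 20 months (dq20) to full-scale IQ at 26 years (iq26).
import numpy as np

rng = np.random.default_rng(1)
n = 212                                       # 101 preterm + 111 term participants
dq20 = rng.normal(100, 15, n)                 # early developmental quotient (simulated)
cc = 2.4 + 0.002 * dq20 + rng.normal(0, 0.05, n)        # assumed a-path
iq26 = 20 + 0.3 * dq20 + 25 * cc + rng.normal(0, 8, n)  # assumed b- and c'-paths

def ols_slope(y, *xs):
    """Slope coefficients from an OLS fit of y on xs (plus intercept)."""
    X = np.column_stack([np.ones_like(y)] + list(xs))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

(a,) = ols_slope(cc, dq20)               # a-path: dq20 -> cc
b, c_prime = ols_slope(iq26, cc, dq20)   # b-path and direct effect c'
(c,) = ols_slope(iq26, dq20)             # total effect
print(f"indirect a*b = {a*b:.3f}, total c = {c:.3f}, direct c' = {c_prime:.3f}")
# In practice the indirect effect a*b would be tested with a bootstrap CI.
```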

    Complexity analysis of the stock market

    We study the complexity of the stock market by constructing ε-machines of the Standard and Poor's 500 index from February 1983 to April 2006 and by measuring their statistical complexities. We find that both the statistical complexity and the number of causal states of the constructed ε-machines have decreased over the last twenty years, and that the average memory length needed to predict the future optimally has become shorter. These results suggest that information was delivered to economic agents and incorporated into market prices more rapidly in 2006 than in 1983.
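    As a rough illustration of the quantities involved (this is not the authors' reconstruction algorithm, which builds ε-machines properly from the data; the binning tolerance and data below are invented), the sketch binarizes returns, groups fixed-length histories with similar next-symbol statistics into approximate causal states, and reports the entropy of the state distribution as a statistical-complexity estimate:

```python
# Rough sketch, hypothetical data: approximate causal states of a binarized
# return series by grouping length-k histories with similar next-symbol
# statistics; statistical complexity = entropy of the state distribution.
from collections import defaultdict
import numpy as np

rng = np.random.default_rng(2)
returns = rng.normal(0, 1, 50_000)          # stand-in for S&P 500 daily returns
symbols = (returns > 0).astype(int)         # 1 = up day, 0 = down day

k = 3                                       # history length
counts = defaultdict(lambda: np.zeros(2))
for i in range(k, len(symbols)):
    hist = tuple(symbols[i - k:i])
    counts[hist][symbols[i]] += 1

# Merge histories whose P(next = 1 | history) fall in the same bin: each
# group of merged histories approximates one causal state.
states = defaultdict(float)
for hist, c in counts.items():
    p_up = c[1] / c.sum()
    state = round(p_up / 0.05) * 0.05       # crude binning in place of a real test
    states[state] += c.sum()

probs = np.array(list(states.values()))
probs /= probs.sum()
c_mu = -(probs * np.log2(probs)).sum()      # statistical complexity (bits)
print(f"approx. causal states: {len(probs)}, C_mu = {c_mu:.3f} bits")
```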

    GTA: Groupware task analysis - modeling complexity

    The task analysis methods discussed in this presentation stem from Human-Computer Interaction (HCI) and Ethnography (as applied to the design of Computer Supported Cooperative Work, CSCW), disciplines that are often considered conflicting approaches when applied to the same design problems. Both approaches have their strengths and weaknesses, and integrating them adds value to the early stages of the design of cooperation technology. In order to develop an integrated method for groupware task analysis (GTA), a conceptual framework is presented that allows a systematic perspective on complex work phenomena. The framework features a triple focus, considering (a) people, (b) work, and (c) the situation. Integrating various task-modeling approaches requires vehicles for making design information explicit, for which an object-oriented formalism is suggested. GTA consists of a method and framework that have been developed during practical design exercises. Examples from some of these cases illustrate our approach.
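    The abstract does not spell out the object-oriented formalism; a minimal sketch of what such a task-model skeleton might look like, with all class and attribute names invented, reflecting GTA's triple focus on people, work, and situation:

```python
# Minimal sketch (invented names): an object-oriented task-model skeleton
# covering GTA's triple focus: people, work, and situation.
from dataclasses import dataclass, field

@dataclass
class Agent:                      # "people": an actor or role
    name: str
    roles: list[str] = field(default_factory=list)

@dataclass
class Situation:                  # "situation": environment and artifacts
    location: str
    artifacts: list[str] = field(default_factory=list)

@dataclass
class Task:                       # "work": tasks decompose into subtasks
    goal: str
    performed_by: Agent | None = None
    situation: Situation | None = None
    subtasks: list["Task"] = field(default_factory=list)

    def flatten(self):
        """Yield this task and all of its subtasks, depth first."""
        yield self
        for t in self.subtasks:
            yield from t.flatten()

review = Task(
    goal="review shared document",
    performed_by=Agent("editor", roles=["reviewer"]),
    situation=Situation("office", artifacts=["shared editor"]),
    subtasks=[Task(goal="annotate draft"), Task(goal="merge comments")],
)
print([t.goal for t in review.flatten()])
```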

    Multiscale likelihood analysis and complexity penalized estimation

    We describe here a framework for a certain class of multiscale likelihood factorizations wherein, in analogy to a wavelet decomposition of an L^2 function, a given likelihood function has an alternative representation as a product of conditional densities reflecting information in both the data and the parameter vector, localized in position and scale. The framework is developed as a set of sufficient conditions for the existence of such factorizations, formulated in analogy to those underlying a standard multiresolution analysis for wavelets, and hence can be viewed as a multiresolution analysis for likelihoods. We then consider the use of these factorizations in the task of nonparametric, complexity penalized likelihood estimation. We study the risk properties of certain thresholding and partitioning estimators, and demonstrate their adaptivity and near-optimality, in a minimax sense over a broad range of function spaces, based on squared Hellinger distance as a loss function. In particular, our results provide an illustration of how properties of classical wavelet-based estimators can be obtained in a single, unified framework that includes models for continuous, count, and categorical data types.
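    For count data, the factorization described here takes a concrete, well-known form (the Poisson case; the notation below is one common convention and not necessarily the paper's): aggregating counts over a dyadic partition, the joint likelihood splits into a single Poisson term for the total and a cascade of binomial terms, one per split.

```latex
% Poisson multiscale likelihood factorization on a dyadic partition
% (one common form; notation may differ from the paper's).
% Let x_{j,k} be the total count on the k-th interval at scale j, so
% x_{j,k} = x_{j+1,2k} + x_{j+1,2k+1}, with intensities \lambda_{j,k} likewise.
\[
  p(x \mid \lambda)
  = \mathrm{Po}\!\left(x_{0,0} \mid \lambda_{0,0}\right)
    \prod_{j,k}
    \mathrm{Bin}\!\left(
      x_{j+1,2k} \;\middle|\; x_{j,k},\;
      \frac{\lambda_{j+1,2k}}{\lambda_{j,k}}
    \right).
\]
% Each binomial factor localizes information in position (k) and scale (j),
% in analogy to a wavelet coefficient; thresholding acts on these factors.
```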

    Modular Complexity Analysis for Term Rewriting

    All current approaches to analyzing the derivational complexity of term rewrite systems are based on a single termination method, possibly preceded by transformations. However, the exclusive use of direct criteria is problematic because of their restricted power. To overcome this limitation, the article introduces a modular framework which allows (polynomial) upper bounds on the complexity of term rewrite systems to be inferred by combining different criteria. Since the fundamental idea is based on relative rewriting, we study how matrix interpretations and match-bounds can be used and extended to measure complexity for relative rewriting. The modular framework is proved strictly more powerful than the conventional setting. Furthermore, the results have been implemented, and experiments show significant gains in power.
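    A standard instance of the kind of criterion such frameworks combine is a matrix interpretation. The example below (a common textbook-style system and interpretation, chosen for illustration and not taken from the paper) certifies a quadratic upper bound:

```latex
% Illustrative example: a matrix interpretation for the rewrite system
% R = { a(b(x)) -> b(a(x)) }, whose derivational complexity is quadratic.
\[
  [a](\vec{x}) = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \vec{x},
  \qquad
  [b](\vec{x}) = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \vec{x}
               + \begin{pmatrix} 0 \\ 1 \end{pmatrix}.
\]
% Compatibility: writing A for the matrix of [a], we get
% [a(b(x))] = A\vec{x} + (1,1)^T and [b(a(x))] = A\vec{x} + (0,1)^T,
% so the left-hand side is strictly greater in the first component.
% Both matrices are upper triangular with all diagonal entries 1, so
% interpretations of terms grow at most quadratically in term size
% (dimension 2), yielding a quadratic bound on derivational complexity.
```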