
    A Robust Aggregation Approach To Simplification Of Manufacturing Flow Line Models

    One of the more difficult tasks facing a modeler in developing a simulation model of a discrete part manufacturing system is deciding at what level of abstraction to represent the resources of the system. For example, questions about plant capacity can be modeled with a simple model, whereas questions regarding the efficiency of different part scheduling rules can only be answered with a more detailed model. In developing a simulation model, most of the actual features of the system under study must be ignored and an abstraction must be developed. If done correctly, this idealization provides a useful approximation of the real system. Unfortunately, many individuals claim that the process of building a simulation model is an “intuitive art.” The objective of this research is to introduce aspects of “science” to model development by defining quantitative techniques for developing an aggregate simulation model for estimating part cycle time of a manufacturing flow line. The methodology integrates aspects of queueing theory, a recursive algorithm, and simulation to develop the specifications necessary for combining resources of a flow line into a reduced set of aggregation resources. Experimentation shows that developing a simulation model with the aggregation resources results in accurate interval estimates of the average part cycle time.
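The queueing-theoretic side of such a cycle-time estimate can be illustrated with a minimal sketch. The code below is not the paper's methodology; it assumes the standard Kingman (VUT) approximation per station and a Marshall-style propagation of departure variability, and all function names are my own.

```python
def station_wait(rate_in, ca2, mean_svc, cs2):
    """Kingman (VUT) approximation of mean queueing delay at one G/G/1 station."""
    rho = rate_in * mean_svc
    assert rho < 1.0, "station must be stable"
    return ((ca2 + cs2) / 2.0) * (rho / (1.0 - rho)) * mean_svc

def flow_line_cycle_time(rate_in, ca2, stations):
    """Approximate part cycle time of a serial flow line as the sum of
    per-station queueing delay plus service time.
    stations: list of (mean service time, service-time SCV) tuples."""
    total = 0.0
    for mean_svc, cs2 in stations:
        total += station_wait(rate_in, ca2, mean_svc, cs2) + mean_svc
        # propagate departure variability to the next station (Marshall-style)
        rho = rate_in * mean_svc
        ca2 = rho**2 * cs2 + (1.0 - rho**2) * ca2
    return total
```

For a single M/M/1-like station (all SCVs equal to 1, service time 1, arrival rate 0.5) this reproduces the exact mean cycle time of 2.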

    Decomposition of discrete-time open tandem queues with Poisson arrivals and general service times

    In the rough planning phase of networked logistics and production systems, one is often interested in obtaining a satisfactory approximation of the system's performance measures with little computational effort. Here, modeling with discrete-time methods offers an advantage over continuous-time modeling: the complete probability distribution of the performance measures can be computed. Since production and logistics systems are usually designed to guarantee performance not on average but with a certain probability (e.g., 95%), discrete-time queueing models can provide more detailed information about system performance (such as the waiting time or cycle time). For the analysis of networks of discrete-time queueing systems, decomposition methods are often the only practicable and computationally efficient approach for computing stationary performance measures at the individual queueing systems. The network is decomposed into its individual nodes, which are then analyzed separately. The approach rests on the assumption that the point process of the departure stream of upstream stations can be approximated by a renewal process, so that an independent analysis of the queueing systems becomes possible. While the independence assumption enables efficient computation, it leads to partly severe approximation errors in the computed performance measures. The subject of this thesis is open discrete-time tandem networks with Poisson arrivals at the upstream queueing system and generally distributed service times. The network thus consists of an upstream M/G/1 queueing system and a downstream G/G/1 system.
This thesis pursues three goals: (1) to expose the deficits of the decomposition approach and quantify its approximation quality using statistical estimation methods, (2) to model the autocorrelation of the departure process of the M/G/1 system in order to explain the cause of the approximation error, and (3) to develop a decomposition approach that accounts for the dependence in the departure stream and thus allows arbitrarily accurate approximations of the performance measures. In the first part of the thesis, the approximation quality of the decomposition method at the downstream G/G/1 queueing system is determined by means of linear regression (point estimation) and quantile regression (interval estimation). Both estimation methods are applied to the relative errors of the mean and of the 95% quantile of the waiting time compared with simulated results. The utilization of the system and the variability of the arrival stream are identified as significant factors influencing the approximation quality. The second part of the thesis focuses on computing the autocorrelation in the departure stream of the M/G/1 queueing system. Successive interdeparture times are correlated because a customer's departure time depends on the system state left behind by the previous customer at departure. This autocorrelation is the cause of the decomposition error, since the arrival times at the downstream queueing system are not independent and identically distributed. In the third part of the thesis, a new decomposition approach is presented that models the dependence in the departure stream of the M/G/1 system by means of a semi-Markov process. To prevent an explosive growth of the state space, a procedure is introduced that bounds the state space of the embedded Markov chain.
Numerical evaluations show that even with a severely limited state space, the results provide a better approximation than the existing decomposition approach. As the state space grows, the performance measures converge to arbitrary accuracy.
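The renewal-based decomposition that the thesis critiques can be sketched in a few lines. This is a continuous-time illustration under standard textbook assumptions (Pollaczek-Khinchine for the upstream M/G/1, Marshall's formula for the departure SCV, Kingman's approximation downstream), not the discrete-time method of the thesis; the function names are my own.

```python
def mg1_wait(lam, mean_svc, scv_svc):
    """Pollaczek-Khinchine mean waiting time in an M/G/1 queue."""
    rho = lam * mean_svc
    es2 = (scv_svc + 1.0) * mean_svc**2   # E[S^2] from mean and SCV
    return lam * es2 / (2.0 * (1.0 - rho))

def tandem_decomposition(lam, svc1, scv1, svc2, scv2):
    """Renewal-approximation decomposition of an M/G/1 -> G/G/1 tandem:
    the upstream departure stream is treated as a renewal process whose
    SCV follows Marshall's formula; the downstream wait uses Kingman's
    approximation. The dependence (autocorrelation) in the departure
    stream is exactly what this approximation ignores."""
    w1 = mg1_wait(lam, svc1, scv1)
    rho1 = lam * svc1
    cd2 = rho1**2 * scv1 + (1.0 - rho1**2) * 1.0   # Poisson arrivals: ca2 = 1
    rho2 = lam * svc2
    w2 = ((cd2 + scv2) / 2.0) * (rho2 / (1.0 - rho2)) * svc2
    return w1, w2
```

For an M/M/1 tandem (both SCVs equal to 1) the departure SCV stays 1 and both stations report the exact M/M/1 waiting time, which is why the error only appears for general service times.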

    From Parallel Sequence Representations to Calligraphic Control: A Conspiracy of Neural Circuits

    Calligraphic writing presents a rich set of challenges to the human movement control system. These challenges include: initial learning, and recall from memory, of prescribed stroke sequences; critical timing of stroke onsets and durations; fine control of grip and contact forces; and letter-form invariance under voluntary size scaling, which entails fine control of stroke direction and amplitude during recruitment and derecruitment of musculoskeletal degrees of freedom. Experimental and computational studies in behavioral neuroscience have made rapid progress toward explaining the learning, planning and control exercised in tasks that share features with calligraphic writing and drawing. This article summarizes computational neuroscience models and related neurobiological data that reveal critical operations spanning from parallel sequence representations to fine force control. Part one addresses stroke sequencing. It treats competitive queuing (CQ) models of sequence representation, performance, learning, and recall. Part two addresses letter size scaling and motor equivalence. It treats cursive handwriting models together with models in which sensory-motor transformations are performed by circuits that learn inverse differential kinematic mappings. Part three addresses fine-grained control of timing and transient forces, by treating circuit models that learn to solve inverse dynamics problems. National Institutes of Health (R01 DC02852)
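The core of a competitive queuing (CQ) model is small enough to sketch: items are held in parallel with a primacy gradient of activation, the most active item wins and is performed, and self-inhibition removes it so the next item can win. This is a minimal illustration of that selection loop, not any specific published model; the function name and noise parameter are my own.

```python
import numpy as np

def cq_recall(activations, noise=0.0, rng=None):
    """Competitive-queuing recall: repeatedly pick the most active plan
    node (winner-take-all), emit its index, then suppress it so the
    next-most-active item can win on the following step."""
    rng = rng or np.random.default_rng(0)
    act = np.asarray(activations, dtype=float).copy()
    order = []
    for _ in range(len(act)):
        noisy = act + noise * rng.standard_normal(len(act))
        winner = int(np.argmax(noisy))
        order.append(winner)
        act[winner] = -np.inf        # self-inhibition after performance
    return order
```

With zero noise a monotone primacy gradient is recalled in order; adding noise produces the transposition errors that make CQ models attractive accounts of serial-order data.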

    Moving in time: simulating how neural circuits enable rhythmic enactment of planned sequences

    Many complex actions are mentally pre-composed as plans that specify orderings of simpler actions. To be executed accurately, planned orderings must become active in working memory, and then enacted one-by-one until the sequence is complete. Examples include writing, typing, and speaking. In cases where the planned complex action is musical in nature (e.g. a choreographed dance or a piano melody), it appears to be possible to deploy two learned sequences at the same time, one composed from actions and a second composed from the time intervals between actions. Despite this added complexity, humans readily learn and perform rhythm-based action sequences. Notably, people can learn action sequences and rhythmic sequences separately, and then combine them with little trouble (Ullén & Bengtsson 2003). Related functional MRI data suggest that there are distinct neural regions responsible for the two different sequence types (Bengtsson et al. 2004). Although research on musical rhythm is extensive, few computational models exist to extend and inform our understanding of its neural bases. To that end, this article introduces the TAMSIN (Timing And Motor System Integration Network) model, a systems-level neural network model capable of performing arbitrary item sequences in accord with any rhythmic pattern that can be represented as a sequence of integer multiples of a base interval. In TAMSIN, two Competitive Queuing (CQ) modules operate in parallel. One represents and controls item order (the ORD module) and the second represents and controls the sequence of inter-onset-intervals (IOIs) that define a rhythmic pattern (RHY module). Further circuitry helps these modules coordinate their signal processing to enable performative output consistent with a desired beat and tempo. Accepted manuscript.
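The input-output contract of such a model, combining an ordered item sequence with an IOI sequence expressed as integer multiples of a base interval, can be sketched as a simple scheduler. This is only an illustration of the representation the abstract describes, not the TAMSIN circuitry itself; the function name is hypothetical.

```python
def schedule_rhythmic_sequence(items, ioi_multiples, base_interval):
    """Pair an ordered action sequence with a rhythm given as integer
    multiples of a base interval, yielding (onset_time, item) events."""
    assert len(items) == len(ioi_multiples)
    t, events = 0.0, []
    for item, mult in zip(items, ioi_multiples):
        events.append((t, item))
        t += mult * base_interval    # next onset after this inter-onset interval
    return events
```

Because order and timing are separate arguments, the same item sequence can be replayed under any rhythm, and the same rhythm can drive any item sequence, mirroring the behavioral dissociation the cited studies report.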

    Experimental analysis of computer system dependability

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
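The importance-sampling idea the survey introduces can be shown in its simplest form: simulate with an inflated fault probability so failures actually occur, then reweight each observed failure by the likelihood ratio so the estimator remains unbiased. This toy single-fault example is my own illustration of the general technique, not a method from the paper.

```python
import random

def failure_prob_is(n, p_true=1e-4, p_sim=0.05, seed=1):
    """Importance-sampling estimate of a rare fault probability: inject
    faults with inflated probability p_sim, and weight each observed
    fault by the likelihood ratio p_true / p_sim to stay unbiased."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        if rng.random() < p_sim:          # fault injected under biased measure
            total += p_true / p_sim       # likelihood-ratio correction
    return total / n
```

A crude Monte Carlo run with p_true = 1e-4 would see almost no faults in a few thousand trials; under the biased measure roughly 5% of trials produce a (reweighted) fault, which is what makes the acceleration worthwhile.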

    Comparative Analysis of Tandem Repeats from Hundreds of Species Reveals Unique Insights into Centromere Evolution

    Centromeres are essential for chromosome segregation, yet their DNA sequences evolve rapidly. In most animals and plants that have been studied, centromeres contain megabase-scale arrays of tandem repeats. Despite their importance, very little is known about the degree to which centromere tandem repeats share common properties between different species across different phyla. We used bioinformatic methods to identify high-copy tandem repeats from 282 species using publicly available genomic sequence and our own data. The assumption that the most abundant tandem repeat is the centromere DNA was true for most species whose centromeres have been previously characterized, suggesting this is a general property of genomes. Our methods are compatible with all current sequencing technologies. Long Pacific Biosciences sequence reads allowed us to find tandem repeat monomers up to 1,419 bp. High-copy centromere tandem repeats were found in almost all animal and plant genomes, but repeat monomers were highly variable in sequence composition and in length. Furthermore, phylogenetic analysis of sequence homology showed little evidence of sequence conservation beyond ~50 million years of divergence. We find that despite an overall lack of sequence conservation, centromere tandem repeats from diverse species showed similar modes of evolution, including the appearance of higher order repeat structures in which several polymorphic monomers make up a larger repeating unit. While centromere position in most eukaryotes is epigenetically determined, our results indicate that tandem repeats are highly prevalent at centromeres of both animals and plants. This suggests a functional role for such repeats, perhaps in promoting concerted evolution of centromere DNA across chromosomes.
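The "most abundant tandem repeat" heuristic can be illustrated with a toy scan: count k-mers that are immediately followed by a copy of themselves, i.e. that sit inside a tandem array. This is a drastically simplified sketch of the general idea, not the bioinformatic pipeline the study used, and the function name is my own.

```python
from collections import Counter

def most_abundant_monomer(genome, k):
    """Toy scan for the most abundant candidate tandem-repeat monomer of
    length k: count each k-mer occurrence that is immediately followed
    by an identical copy (tandem adjacency), then return the top hit."""
    counts = Counter()
    for i in range(len(genome) - 2 * k + 1):
        kmer = genome[i:i + k]
        if genome[i + k:i + 2 * k] == kmer:   # inside a tandem array?
            counts[kmer] += 1
    return counts.most_common(1)[0] if counts else None
```

A real pipeline must additionally handle rotations of the same monomer, sequencing errors, and unknown monomer length, but the adjacency test above is the defining property that separates tandem repeats from dispersed ones.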