
    Statistics on Logic Simulation

    The high costs associated with logic simulation of large VLSI-based systems have led to the need for new computer architectures tailored to the simulation task. Such architectures have the potential for significant speedups over standard software-based logic simulators, and several commercial simulation engines have been produced to satisfy the need in this area. To properly explore the space of alternative simulation architectures, data is required on the simulation process itself. This paper presents a framework for such data-gathering activity by first examining possible sources of speedup in the logic simulation task, then examining the sort of data needed in the design of simulation engines, and finally presenting such data. The data contained in the paper include information on the subtask times found in standard discrete-event simulation algorithms, event intensities, queue-length distributions, and simultaneous-event distributions.
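The discrete-event algorithm whose subtask times the paper profiles can be sketched in a few lines. The code below is a hypothetical illustration, not any of the commercial engines mentioned: a time-ordered event queue, suppression of events that do not change a signal's value, and unit-delay re-evaluation of fanout gates.

```python
import heapq
import itertools

def simulate(gates, fanout, stimuli, horizon=100):
    """Minimal discrete-event logic simulator sketch.

    gates:   gate name -> (eval function, input signal names)
    fanout:  signal name -> gates it drives
    stimuli: iterable of (time, signal, value) input events
    """
    values = {}                            # current signal values (default 0)
    tie = itertools.count()                # order ties among simultaneous events
    queue = []
    for t, sig, val in stimuli:
        heapq.heappush(queue, (t, next(tie), sig, val))
    processed = 0
    while queue and queue[0][0] <= horizon:
        t, _, sig, val = heapq.heappop(queue)
        processed += 1
        if values.get(sig, 0) == val:
            continue                       # value unchanged: event suppressed
        values[sig] = val
        for g in fanout.get(sig, ()):      # re-evaluate driven gates
            fn, ins = gates[g]
            out = fn(*(values.get(i, 0) for i in ins))
            heapq.heappush(queue, (t + 1, next(tie), g, out))  # unit gate delay
    return values, processed
```

Counting `processed` against the queue length at each step yields exactly the kind of event-intensity and queue-length statistics the paper reports.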

    Technical Report: A Receding Horizon Algorithm for Informative Path Planning with Temporal Logic Constraints

    This technical report is an extended version of the paper 'A Receding Horizon Algorithm for Informative Path Planning with Temporal Logic Constraints', accepted to the 2013 IEEE International Conference on Robotics and Automation (ICRA). This paper considers the problem of finding the most informative path for a sensing robot under temporal logic constraints, a richer set of constraints than has previously been considered in information gathering. An algorithm for informative path planning is presented that leverages tools from information theory and formal control synthesis, and is proven to give a path that satisfies the given temporal logic constraints. The algorithm uses a receding horizon approach in order to provide a reactive, on-line solution while mitigating computational complexity. Statistics compiled from multiple simulation studies indicate that this algorithm performs better than a baseline exhaustive search approach.
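The receding-horizon structure described above can be sketched generically. In this hypothetical code, `allowed` stands in for the temporal-logic satisfaction check and `info_gain` for the information-theoretic objective; neither reflects the paper's actual formal-synthesis machinery.

```python
def candidate_paths(state, neighbors, horizon):
    """All move sequences of length `horizon` starting from `state`."""
    if horizon == 0:
        return [[]]
    paths = []
    for nxt in neighbors(state):
        for rest in candidate_paths(nxt, neighbors, horizon - 1):
            paths.append([nxt] + rest)
    return paths

def receding_horizon_plan(start, neighbors, info_gain, allowed, horizon=3, steps=5):
    """Receding-horizon planner sketch: plan `horizon` moves ahead,
    discard candidates rejected by `allowed` (the constraint check),
    commit only the first move of the best candidate, then replan."""
    state, trace = start, [start]
    for _ in range(steps):
        best, best_gain = None, float('-inf')
        for path in candidate_paths(state, neighbors, horizon):
            if not allowed(trace + path):       # constraint stand-in
                continue
            gain = sum(info_gain(s) for s in path)
            if gain > best_gain:
                best, best_gain = path, gain
        if best is None:
            break                               # no feasible candidate
        state = best[0]                         # commit first move only
        trace.append(state)
    return trace
```

Committing one move and replanning is what makes the scheme reactive: new sensor information can change the objective before the next commitment.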

    Variation aware analysis of bridging fault testing

    This paper investigates the impact of process variation on test quality with regard to resistive bridging faults. The input logic threshold voltage and gate drive strength parameters are analyzed for their process-variation-induced influence on test quality. The impact of process variation on test quality is studied in terms of test escapes and measured by a robustness metric. It is shown that some bridges are sensitive to process variation in terms of logic behavior, but such variation does not necessarily compromise test quality if the test has high robustness. Experimental results of Monte-Carlo simulation based on recent process variation statistics are presented for ISCAS-85 and -89 benchmark circuits, using a 45 nm gate library and realistic bridges. The results show that tests generated without consideration of process variation are inadequate in terms of test quality, particularly for small test sets. On the other hand, larger test sets detect more of the logic faults introduced by process variation and have higher test quality.
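The interaction between bridge resistance and a varying input logic threshold can be illustrated with a toy Monte-Carlo experiment. All numbers below (supply voltage, driver resistance, threshold distribution) are made-up assumptions for illustration, not values from the paper's 45 nm library.

```python
import random

def escape_rate(r_bridge, trials=20000, seed=1):
    """Fraction of simulated dies on which a test pattern misses the bridge."""
    rng = random.Random(seed)
    vdd = 1.1                               # assumed supply voltage
    r_drive = 1000.0                        # assumed driver pull resistance
    # Bridged-node voltage from a simple resistive-divider model (assumed):
    v_node = vdd * r_bridge / (r_bridge + r_drive)
    escapes = 0
    for _ in range(trials):
        # Downstream gate's input threshold under process variation (assumed):
        v_th = rng.gauss(0.55, 0.05)
        if v_node > v_th:                   # node still reads as logic 1,
            escapes += 1                    # so this die escapes the test
    return escapes / trials
```

High-resistance bridges leave the node voltage near the rail, so almost every die escapes; low-resistance bridges pull it well below the threshold spread and are reliably detected. The robustness question arises for resistances that land inside the threshold distribution.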

    Combating anti-statistical thinking using simulation-based methods throughout the undergraduate curriculum

    The use of simulation-based methods for introducing inference is growing in popularity for the Stat 101 course, due in part to increasing evidence of the methods' ability to improve students' statistical thinking. This impact comes from simulation-based methods (a) clearly presenting the overarching logic of inference, (b) strengthening ties between statistics and probability or mathematical concepts, (c) encouraging a focus on the entire research process, (d) facilitating student thinking about advanced statistical concepts, (e) allowing more time to explore, do, and talk about real research and messy data, and (f) acting as a firmer foundation on which to build statistical intuition. Thus, we argue that simulation-based inference should be an entry point to an undergraduate statistics program for all students, and that simulation-based inference should be used throughout all undergraduate statistics courses. In order to achieve this goal and fully realize the benefits of simulation-based inference for the undergraduate statistics program, we will need to break free of historical forces tying undergraduate statistics curricula to mathematics, consider radical and innovative new pedagogical approaches in our courses, fully implement assessment-driven content innovations, and embrace computation throughout the curriculum. Comment: to be published in The American Statistician.
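As a concrete instance of point (a), the overarching logic of inference is visible in a short randomization test: shuffle the pooled data repeatedly and ask how often chance alone produces a group difference as large as the observed one. This is a generic classroom example, not code from the paper.

```python
import random

def permutation_test(group_a, group_b, reps=10000, seed=0):
    """Two-sided simulation p-value for a difference in group means."""
    rng = random.Random(seed)
    observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(reps):
        rng.shuffle(pooled)                 # re-randomize the group labels
        diff = (sum(pooled[:n_a]) / n_a
                - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if abs(diff) >= abs(observed):      # chance did at least as well
            extreme += 1
    return extreme / reps
```

The p-value is simply the proportion of reshuffles that beat the observed difference, which makes the definition of "statistically significant" tangible without any distribution theory.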

    A microprocessor based digital logic simulator

    It is the intent of this thesis to acquaint the reader with a tool available for use in the digital circuit design field. Using DLS, the reader is able to fully simulate a digital logic design created on paper before it ever takes hardware form. The computer program accepts a detailed description of the schematic and creates timing diagrams, loading statistics, cross references, and various lists for future documentation. The user needs no programming knowledge and will find the requirements for running a simulation with DLS extremely user-oriented. The simulation descriptions and command language are tailored to logic design applications. The format is straightforward, utilizing standard English and logic design concepts. To code a design for simulation, the designer needs only a well-labeled circuit diagram in which all the inputs and outputs of each element have a label. With the addition of a few simulation parameters, DLS will take the network description and form a program in memory which recreates the operation of the digital circuit.

    Integrated and adaptive traffic signal control for diamond interchange : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Mechatronics Engineering at Massey University, Albany, New Zealand

    New dynamic signal control methods developed recently, such as fuzzy logic and artificial intelligence, have mainly focused on isolated intersections. Adaptive signal control based on fuzzy logic control (FLC) determines the duration and sequence for which a traffic signal should stay in a certain state before switching to the next state (Trabia et al. 1999, Pham 2013). The numbers of arriving and waiting vehicles are quantized into fuzzy variables, and fuzzy rules are used to determine whether the duration of the current state should be extended. The fuzzy logic controller was shown to be more flexible than fixed-time and vehicle-actuated controllers, allowing traffic to flow more smoothly. However, the FLC does not possess the ability to handle the various uncertainties that arise in real-world traffic control, so it is not well suited to stochastic problems such as traffic signal timing optimization. Probabilistic logic, by contrast, is the best choice for handling uncertainties containing both stochastic and fuzzy features (Pappis and Mamdani 1977). In this thesis, probabilistic fuzzy logic control is developed for the signalised control of a diamond interchange, where the signal phasing, green-time extension and ramp metering are decided in response to real-time traffic conditions, with the aim of improving traffic flows on surface streets and highways. The probabilistic fuzzy logic for diamond interchange (PFLDI) comprises three modules: probabilistic fuzzy phase timing (PFPT), which controls the green-time extension of the currently running phase; phase selection (PSL), which decides the next phase based on the phase logic pre-set by the local transport authority; and probabilistic fuzzy ramp metering (PFRM), which determines the on-ramp metering rate based on traffic conditions on the arterial streets and highways.
We used the Advanced Interactive Microscopic Simulator for Urban and Non-Urban Networks (AIMSUN) software to model the diamond interchange and measure the effectiveness of the PFLDI algorithm. PFLDI was compared with actuated diamond interchange (ADI) control based on the ALINEA algorithm and with a conventional fuzzy logic diamond interchange algorithm (FLDI). Simulation results show that PFLDI surpasses the traffic-actuated and conventional fuzzy models, with lower system total travel time and average delay and with improvements in downstream average speed and downstream average delay. A second strand of the thesis concerns cyclists: little attention has been given in recent years to the delays they experience in urban transport networks. When planning changes to traffic signals or making other network changes, the value of time for cycling trips is rarely considered. The traditional approach to road management has focused only on improving carrying capacity for vehicles, with an emphasis on maximising the speed and volume of motorised traffic moving around the network. The problem of cyclist delay has been compounded by the fact that the values of travel time used for cyclists have been lower than those for vehicles, which affects benefit–cost ratios and effectively provides a disincentive to invest in cycling compared with other modes. The issue has also been influenced by the way traffic signals are set up and operated. Because the primary stresses on an intersection tend to occur during the vehicle (commuter) peaks in the morning and afternoon, intersections tend to be set up and coordinated to allow maximum flow during these peaks. The result is that during off-peak periods there is often spare capacity that goes underutilised, and phasing and timings set up for peaks may not provide the optimum benefits at off-peak times. This particularly affects cyclists during lunch-time peaks, when vehicle volumes are low and cyclist volumes are high. Cyclists can end up waiting for long periods as a result of poor signal phasing rather than because of the demands other road users place on the network. The outcome of this study will not only reduce traffic congestion during peak hours but also improve cyclists' safety at a typical diamond interchange.
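The green-time extension step that PFPT performs can be illustrated with a toy Sugeno-style fuzzy rule base. The membership functions and rule consequents below are invented for illustration and are not the thesis's PFLDI rules.

```python
def ramp_up(x, a, b):
    """Membership that rises linearly from 0 (at a) to 1 (at b)."""
    return max(0.0, min(1.0, (x - a) / (b - a)))

def green_extension(arrivals, red_queue):
    """Crisp green-time extension (seconds) from two fuzzy inputs."""
    a_high = ramp_up(arrivals, 2, 10)       # many vehicles still arriving
    a_low = 1.0 - a_high
    q_high = ramp_up(red_queue, 4, 12)      # long queue waiting on red
    q_low = 1.0 - q_high
    # Sugeno-style rules: (firing strength, crisp consequent in seconds)
    rules = [
        (min(a_high, q_low), 8.0),   # demand high, little waiting: keep green
        (min(a_high, q_high), 3.0),  # both sides loaded: short extension
        (a_low, 0.0),                # no arriving traffic: end green
    ]
    total = sum(w for w, _ in rules)
    return sum(w * v for w, v in rules) / total if total else 0.0
```

Intermediate inputs fire several rules at once, and the weighted average blends their consequents, which is what lets a fuzzy controller respond more smoothly than a fixed-time plan.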

    Sequential Specification Tests to Choose a Model: A Change-Point Approach

    Researchers faced with a sequence of candidate model specifications must often choose the best specification that does not violate a testable identification assumption. One option in this scenario is sequential specification tests: hypothesis tests of the identification assumption over the sequence. Borrowing an idea from the change-point literature, this paper shows how to use the distribution of p-values from sequential specification tests to estimate the point in the sequence where the identification assumption ceases to hold. Unlike current approaches, this method is robust to individual errant p-values and does not require choosing a test level or tuning parameter. This paper demonstrates the method's properties with a simulation study and illustrates it through applications to choosing a bandwidth in a regression discontinuity design while maintaining covariate balance, and to choosing a lag order for a time-series model.
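The core idea, locating the point where the p-values stop looking uniform, can be sketched as a single mean-shift change-point fit on z_i = -log(p_i), which is exponential with mean 1 while the null holds and stochastically larger once it fails. This is a simplified stand-in for the paper's estimator, not its actual procedure.

```python
import math

def changepoint(pvalues):
    """Index k at which the identification assumption is estimated to fail:
    choose the split of z_i = -log(p_i) into a 'null' prefix and a
    'violated' suffix that minimizes the within-segment sum of squares."""
    z = [-math.log(p) for p in pvalues]
    n = len(z)
    best_k, best_sse = 0, float('inf')
    for k in range(1, n):                   # z[:k] null, z[k:] violated
        m1 = sum(z[:k]) / k
        m2 = sum(z[k:]) / (n - k)
        sse = (sum((v - m1) ** 2 for v in z[:k])
               + sum((v - m2) ** 2 for v in z[k:]))
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k                           # assumption holds for i < best_k
```

Because the fit pools all p-values on each side of the candidate split, one errant p-value in either segment barely moves the estimate, which is the robustness property the abstract highlights.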