
    Statistical Classification Based Modelling and Estimation of Analog Circuits Failure Probability

    At nanoscales, variations in transistor parameters cause variations and unpredictability in the circuit output and may ultimately violate the desired specifications, leading to circuit failure. These parametric variations arise from limitations of the manufacturing process and are commonly known as process variations. Circuit simulation is a Computer-Aided Design (CAD) technique for verifying the behavior of analog circuits, but it is incomplete under the effects of process variations. Hence, statistical circuit simulation is of increasing importance in circuit design for addressing this incompleteness. However, existing statistical circuit simulation approaches either fail to analyze rare failure events accurately and efficiently or are impractical to use. Moreover, none of the existing approaches can successfully analyze analog circuits in the presence of multiple performance specifications in a timely and accurate manner. Therefore, we propose a new methodology, based on statistical circuit simulation, for modelling and estimating the failure probability of analog circuits in the presence of multiple performance metrics. Our methodology estimates the failure probability iteratively, employing a statistical classifier to reduce the number of simulations while maintaining high estimation accuracy. Furthermore, a more practical classifier model is proposed for analog circuit failure probability estimation. Our methodology estimates the failure probability accurately even when failures on the individual performance metrics occur simultaneously. The proposed methodology can deliver many orders of magnitude of speedup compared to traditional Monte Carlo methods. Moreover, experimental results show that it generates accurate results for problems with multiple specifications where other approaches fail completely.
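    The classifier-assisted loop described above can be sketched roughly as follows. Everything here is invented for illustration: the "circuit" is a toy two-metric model, and the classifier is a hand-written conservative rule, whereas the paper's classifier is learned from simulation data. The idea is the same: a cheap classifier screens the Monte Carlo samples so that only samples near the failure region pay for an expensive simulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(params):
        # Hypothetical "expensive" circuit simulation with two performance
        # metrics; the circuit fails if either metric violates its spec.
        gain = 10.0 - 0.5 * params[0] ** 2      # fails when |params[0]| > 2
        bandwidth = 5.0 + params[1]             # fails when params[1] < -2
        return gain < 8.0 or bandwidth < 3.0

    def classify(params):
        # Cheap stand-in classifier: conservatively flags any sample near the
        # failure region, so no true failure is ever skipped.
        return abs(params[0]) > 1.5 or params[1] < -1.5

    n = 20000
    samples = rng.standard_normal((n, 2))       # process-variation samples
    failures = 0
    simulations = 0
    for s in samples:
        if classify(s):                         # only "suspicious" samples are simulated
            simulations += 1
            failures += simulate(s)

    p_fail = failures / n
    print(f"estimated P(fail) = {p_fail:.4f} using {simulations} simulations")
    ```

    Because the classifier is conservative (it never labels a true failure as safe), the estimate stays unbiased while the number of expensive simulations drops well below the raw sample count.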

    Applied Metaheuristic Computing

    For decades, Applied Metaheuristic Computing (AMC) has been a prevailing optimization technique for tackling perplexing engineering and business problems such as scheduling, routing, ordering, bin packing, assignment, and facility layout planning. This is partly because classic exact methods are constrained by prior assumptions, and partly because heuristics are problem-dependent and lack generalization. AMC, in contrast, guides the course of low-level heuristics to search beyond the local optima that limit traditional computation methods. This topic series has collected quality papers proposing cutting-edge methodologies and innovative applications that drive the advances of AMC.

    Advances in Robotics, Automation and Control

    The book presents an excellent overview of recent developments in the different areas of Robotics, Automation and Control. Through its 24 chapters, it presents topics related to control and robot design, and it also introduces new mathematical tools and techniques devoted to improving system modeling and control. An important point is the use of rational agents and heuristic techniques to cope with the computational complexity required for controlling complex systems. The book also covers navigation and vision algorithms, automatic handwriting comprehension, and speech recognition systems that will be included in the next generation of production systems.

    Eye quietness and quiet eye in expert and novice golf performance: an electrooculographic analysis

    Quiet eye (QE) is the final ocular fixation on the target of an action (e.g., the ball in golf putting). Camera-based eye-tracking studies have consistently found longer QE durations in experts than novices; however, the mechanisms underlying QE are not known. To offer a new perspective we examined the feasibility of measuring the QE using electrooculography (EOG) and developed an index to assess ocular activity across time: eye quietness (EQ). Ten expert and ten novice golfers putted 60 balls to a 2.4 m distant hole. Horizontal EOG (2 ms resolution) was recorded from two electrodes placed on the outer sides of the eyes. QE duration was measured using an EOG voltage threshold and comprised the sum of the pre-movement and post-movement initiation components. EQ was computed as the standard deviation of the EOG in 0.5 s bins from –4 to +2 s, relative to backswing initiation: lower values indicate less movement of the eyes, hence greater quietness. Finally, we measured club-ball address and swing durations. T-tests showed that total QE did not differ between groups (p = .31); however, experts had marginally shorter pre-movement QE (p = .08) and longer post-movement QE (p < .001) than novices. A group × time ANOVA revealed that experts had less EQ before backswing initiation and greater EQ after backswing initiation (p = .002). QE durations were inversely correlated with EQ from –1.5 to 1 s (rs = –.48 to –.90, ps = .03 to .001). Experts had longer swing durations than novices (p = .01) and, importantly, swing durations correlated positively with post-movement QE (r = .52, p = .02) and negatively with EQ from 0.5 to 1 s (r = –.63, p = .003). This study demonstrates the feasibility of measuring ocular activity using EOG and validates EQ as an index of ocular activity. Its findings challenge the dominant perspective on QE and provide new evidence that expert-novice differences in ocular activity may reflect differences in the kinematics of how experts and novices execute skills.
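    The EQ index described above is a simple binned standard deviation of the EOG trace. A minimal sketch, using synthetic Gaussian noise in place of a real recording (the sampling rate follows from the stated 2 ms resolution; the signal itself is invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    fs = 500                                  # 2 ms resolution -> 500 Hz sampling
    t = np.arange(-4.0, 2.0, 1 / fs)          # seconds relative to backswing initiation
    eog = rng.normal(0.0, 50.0, t.size)       # synthetic horizontal EOG (microvolts)

    bin_width = 0.5                           # 0.5 s bins from -4 to +2 s
    edges = np.arange(-4.0, 2.0 + bin_width, bin_width)
    eq = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (t >= lo) & (t < hi)
        eq.append(eog[mask].std())            # lower SD = less eye movement = quieter

    eq = np.array(eq)
    print(len(eq))                            # 12 bins spanning -4 to +2 s
    ```

    With real data, each bin's value would then be compared across groups and across time, as in the group × time ANOVA reported above.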

    An overview of population-based algorithms for multi-objective optimisation

    In this work we present an overview of the most prominent population-based algorithms and the methodologies used to extend them to multiple objective problems. Although not exact in the mathematical sense, population-based multi-objective optimisation techniques have long been recognised as immensely valuable and versatile for real-world applications. These techniques are usually employed when exact optimisation methods are not easily applicable, or when, due to sheer complexity, exact techniques would be prohibitively costly. Another advantage is that, since a population of decision vectors is considered in each generation, these algorithms are implicitly parallelisable and can generate an approximation of the entire Pareto front at each iteration. A critique of their capabilities is also provided.
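    The Pareto-front approximation mentioned above rests on the notion of non-dominance: a decision vector is kept if no other vector is at least as good in every objective and strictly better in one. A minimal sketch of that filter (the sample population is invented; objectives are minimised):

    ```python
    def pareto_front(points):
        """Return the non-dominated subset, minimising all objectives."""
        front = []
        for p in points:
            # p is dominated if some distinct q is <= p in every objective
            dominated = any(
                q != p and all(q[i] <= p[i] for i in range(len(p)))
                for q in points
            )
            if not dominated:
                front.append(p)
        return front

    # Toy two-objective population: each tuple is (objective 1, objective 2)
    pop = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
    print(pareto_front(pop))   # -> [(1, 5), (2, 3), (4, 1)]
    ```

    In a population-based algorithm this filter (or a faster equivalent) runs each generation, which is why an approximation of the whole front is available at every iteration.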

    Irish Machine Vision and Image Processing Conference Proceedings 2017


    Machine learning assisted optimization with applications to diesel engine optimization with the particle swarm optimization algorithm

    A novel approach to incorporating Machine Learning into optimization routines is presented. An approach which combines the benefits of ML, optimization, and meta-model searching is developed and tested on a multi-modal test problem: a modified Rastrigin's function. An enhanced Particle Swarm Optimization method was derived from the initial testing. Optimization of a diesel engine was carried out using the modified algorithm, demonstrating an improvement of 83% compared with the unmodified PSO algorithm. Additionally, an approach to enhancing the training of ML models by leveraging Virtual Sensing as an alternative to standard multi-layer neural networks is presented. Substantial gains were made in the prediction of particulate matter (PM), reducing the MMSE by 50% and improving the correlation R^2 from 0.84 to 0.98. Improvements were made in models of PM, NOx, HC, CO, and fuel consumption using the method, while training times and convergence reliability were simultaneously improved over the traditional approach.
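    For context, the baseline the paper modifies is standard (unenhanced) PSO, and Rastrigin's function is the usual multi-modal benchmark. A minimal textbook sketch of both, not the paper's enhanced variant, with typical inertia and acceleration coefficients chosen as illustrative defaults:

    ```python
    import math
    import random

    random.seed(0)

    def rastrigin(x):
        # Rastrigin's function: highly multi-modal, global minimum of 0 at the origin.
        return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

    def pso(f, dim=2, swarm=30, iters=200, w=0.7, c1=1.5, c2=1.5, bound=5.12):
        pos = [[random.uniform(-bound, bound) for _ in range(dim)] for _ in range(swarm)]
        vel = [[0.0] * dim for _ in range(swarm)]
        pbest = [p[:] for p in pos]              # each particle's best position
        pbest_val = [f(p) for p in pos]
        g = min(range(swarm), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
        for _ in range(iters):
            for i in range(swarm):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                val = f(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < gbest_val:
                        gbest, gbest_val = pos[i][:], val
        return gbest, gbest_val

    best, best_val = pso(rastrigin)
    print(f"best value found: {best_val:.4f}")
    ```

    The many local minima of Rastrigin's function are exactly what make plain PSO prone to premature convergence, which is the weakness ML-assisted enhancements like the one above target.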

    Assessing The Impact Of University Technology Incubator Practices On Client Performance

    This research is designed to distinguish and explain the incubator practices that affect the performance of clients of university technology incubator programs. The research focuses on understanding which practices significantly contribute to increasing job creation for firms located in university-based technology incubators. An increasing number of communities are embracing economic development strategies that target the high-tech sector, with its high-wage, high-value jobs, as a way to diversify and boost local and regional economies. New economic development strategies include the notion of a "creation" or "grow your own" strategy instead of relying on recruiting existing companies from other regions. In 1999-2000 (the most recent data available), small businesses created three-quarters of U.S. net new jobs (2.5 million of the 3.4 million total). The small business percentage varies from year to year and reflects economic trends; over the decade of the 1990s, small business net job creation fluctuated between 60 and 80 percent. Moreover, according to a Bureau of the Census working paper, start-ups in their first two years of operation accounted for virtually all of the net new jobs in the economy. The study is broken into three parts: (1) a review of the literature on incubation, focusing on its history, best practices, technology incubation, networking theory, and previous empirical studies; (2) a review of data collected in a recent national survey; and (3) case studies of the top-performing incubators in the country, based on employment growth of client firms, contrasted with case studies of non-top-ten programs. The literature suggests that incubation must be considered in the context of a larger enterprise development system, within which the incubator fills gaps; this notion is explored.
In general, there is a great need for more empirical research into best practices of incubation. It is a nontrivial task, however, as the nature of the industry limits the ability to obtain traditional, statistically defensible measures.