
    A software framework for automated behavioral modeling of electronic devices


    Surrogate modeling of RF circuit blocks

    Surrogate models are a cost-effective replacement for expensive computer simulations in design space exploration. The literature has already demonstrated the feasibility of accurate surrogate models for single radio frequency (RF) and microwave devices. Within the European Marie Curie project O-MOORE-NICE! (Operational Model Order Reduction for Nanoscale IC Electronics), we aim to investigate the feasibility of the surrogate modeling approach for entire RF circuit blocks. This paper presents an overview of the surrogate model type selection problem for low-noise amplifier modeling.
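    As a rough illustration of the model type selection problem (not the paper's actual method), the sketch below fits two candidate surrogate types to samples of a stand-in response function and ranks them by cross-validated accuracy. The `lna_gain` function, the sample count, and the two model choices are hypothetical assumptions for illustration.

```python
# Minimal sketch of surrogate model type selection. `lna_gain` is a
# hypothetical stand-in for an expensive RF circuit simulation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def lna_gain(x):
    # Placeholder for the expensive simulator: gain vs. two design variables.
    return np.sin(3 * x[:, 0]) + 0.5 * np.cos(5 * x[:, 1])

X = rng.uniform(0, 1, size=(40, 2))   # sampled design points
y = lna_gain(X)                       # "simulation" responses

# Compare candidate surrogate model types by cross-validated accuracy.
for name, model in [("GP", GaussianProcessRegressor()),
                    ("SVR", SVR(kernel="rbf"))]:
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {score:.3f}")
```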

    The SUMO toolbox: a tool for automatic regression modeling and active learning

    Many complex, real-world phenomena are difficult to study directly using controlled experiments. Instead, computer simulations have become a commonplace, feasible alternative. Because these high-fidelity simulations are computationally expensive, surrogate models are often employed as a drop-in replacement for the original simulator in order to reduce evaluation times. In this context, neural networks, kernel methods, and other modeling techniques have become indispensable. Surrogate models have proven very useful for tasks such as optimization, design space exploration, visualization, prototyping, and sensitivity analysis. We present a fully automated machine learning tool for generating accurate surrogate models, using active learning techniques to minimize the number of simulations and to maximize efficiency.
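    The core loop of such a tool can be sketched as: fit a surrogate to the simulations run so far, query the simulator where the model is most uncertain, and repeat until the budget is spent. The sketch below illustrates that loop; the toy simulator `f`, the Gaussian process surrogate, and the budget are illustrative assumptions, and the code is not the SUMO toolbox itself.

```python
# Minimal sketch of an active-learning surrogate loop: fit a surrogate,
# query the simulator where the model is least certain, and repeat.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)
f = lambda x: np.sin(8 * x) * x          # placeholder expensive simulation

X = rng.uniform(0, 1, (5, 1))            # small initial design
y = f(X).ravel()

for _ in range(15):                      # simulation budget
    gp = GaussianProcessRegressor().fit(X, y)
    cand = rng.uniform(0, 1, (200, 1))   # candidate pool
    _, std = gp.predict(cand, return_std=True)
    x_new = cand[np.argmax(std)]         # most uncertain candidate point
    X = np.vstack([X, x_new])
    y = np.append(y, f(x_new))

print(f"final model trained on {len(X)} simulator evaluations")
```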

    Generating sequential space-filling designs using genetic algorithms and Monte Carlo methods

    In this paper, the authors compare a Monte Carlo method and an optimization-based approach using genetic algorithms for sequentially generating space-filling experimental designs. It is shown that Monte Carlo methods perform better than genetic algorithms for this specific problem.
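    A common Monte Carlo formulation of sequential space-filling design draws a random candidate pool at each step and keeps the candidate farthest from its nearest neighbor in the current design (the maximin criterion). The sketch below assumes that formulation, with illustrative pool and design sizes; it is not necessarily the exact variant benchmarked in the paper.

```python
# Minimal sketch of a Monte Carlo approach to sequential space-filling
# design: draw random candidates and keep the one that maximizes the
# minimum distance to the points already in the design (maximin).
import numpy as np

rng = np.random.default_rng(2)
design = rng.uniform(0, 1, (1, 2))        # start with one random point

for _ in range(19):                       # grow the design to 20 points
    cand = rng.uniform(0, 1, (500, 2))    # Monte Carlo candidate pool
    # Distance from each candidate to its nearest design point.
    d = np.linalg.norm(cand[:, None, :] - design[None, :, :], axis=-1)
    best = cand[np.argmax(d.min(axis=1))]
    design = np.vstack([design, best])

print(design.shape)  # (20, 2): a sequentially built space-filling design
```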

    Nanophotonic reservoir computing with photonic crystal cavities to generate periodic patterns

    Reservoir computing (RC) is a machine learning technique inspired by neural systems. RC has been used successfully to solve complex problems such as signal classification and signal generation. These systems are mainly implemented in software, which limits their speed and power efficiency. Several optical and optoelectronic implementations have been demonstrated, in which the signals carry both amplitude and phase; this has been shown to enrich the dynamics of the system, which benefits performance. In this paper, we introduce a novel optical architecture based on nanophotonic crystal cavities. It allows many neurons to be integrated on one chip and, compared with other photonic solutions, most closely resembles a classical neural network. Furthermore, the components are passive, which simplifies the design and reduces power consumption. To assess the performance of this network, we train it to generate periodic patterns, using an alternative online learning rule, first-order reduced and controlled error (FORCE). We first train a classical hyperbolic tangent reservoir and then vary some of its properties to incorporate typical aspects of a photonic reservoir, such as continuous-time versus discrete-time signals and complex-valued versus real-valued signals. The nanophotonic reservoir is then simulated, and we explore the role of relevant parameters such as the topology, the phases between the resonators, the number of biased nodes, and the delay between the resonators; these parameters must be chosen so that no strong self-oscillations occur. Finally, our results show that on a signal generation task, a complex-valued, continuous-time nanophotonic reservoir outperforms a classical (i.e., discrete-time, real-valued) leaky hyperbolic tangent reservoir (normalized root-mean-square error (NRMSE) of 0.030 versus 0.127).
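    For orientation, the classical baseline described above, a leaky hyperbolic tangent reservoir trained online with FORCE (recursive least squares on the readout weights while the output is fed back into the reservoir), can be sketched generically. The reservoir size, leak rate, weight scaling, and target period below are illustrative assumptions, not the paper's settings, and the code is an ordinary software reservoir rather than the photonic simulation.

```python
# Minimal sketch of FORCE learning on a classical leaky tanh reservoir,
# trained online to generate a periodic (sine) pattern. All sizes and
# rates are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
N, leak, alpha = 200, 0.3, 1.0
W = rng.normal(0, 1.2 / np.sqrt(N), (N, N))   # recurrent reservoir weights
w_fb = rng.uniform(-1, 1, N)                  # feedback from the output
w_out = np.zeros(N)                           # trained readout weights
P = np.eye(N) / alpha                         # RLS inverse correlation matrix

x = rng.normal(0, 0.5, N)
r = np.tanh(x)
z = 0.0

for t in range(3000):
    target = np.sin(2 * np.pi * t / 50)       # periodic pattern to generate
    # Leaky discrete-time reservoir update with output feedback.
    x = (1 - leak) * x + leak * (W @ r + w_fb * z)
    r = np.tanh(x)
    z = w_out @ r                             # current network output
    # Recursive least squares (FORCE) update of the readout weights.
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    w_out += (target - z) * k

print(f"final error: {abs(target - z):.4f}")
```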