2,470 research outputs found

    Behavioral Modeling of a C-Band Ring Hybrid Coupler Using Artificial Neural Networks

    Artificial Neural Networks (ANNs) have gained importance in RF/microwave (MW) design and in the behavioral modeling of MW components over the past few decades. This paper presents a cost-effective neural network (NN) approach to the design, modeling, and optimization of a 180° ring hybrid coupler operating in C-band. The proposed NN model is trained on data sets obtained from electromagnetic (EM) simulators, and the neural test results are compared with the simulator findings to determine the network's accuracy. Moreover, the necessary trade-offs are applied to improve the network's performance. Finally, correlation factors, defined as the comparison criteria between the EM-simulator and the proposed neural models, are calculated for each trade-off case.
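    A minimal sketch of the kind of NN surrogate workflow described above, using hypothetical data in place of real EM-simulator exports (the geometric parameters, the |S21| target, and the scikit-learn model choice are illustrative assumptions, not the paper's actual setup):

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    # Hypothetical training data standing in for EM-simulator sweeps:
    # columns are ring radius (mm), line width (mm), frequency (Hz).
    rng = np.random.default_rng(0)
    X = rng.uniform([8.0, 0.5, 4e9], [12.0, 2.0, 8e9], size=(500, 3))
    # Fake |S21| response in dB; a real data set would come from the EM solver.
    y = -3.0 - 0.1 * (X[:, 2] / 1e9 - 6.0) ** 2 + 0.05 * rng.standard_normal(500)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Small MLP behavioral model trained on the "simulated" data.
    nn = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
    nn.fit(X_train, y_train)

    # Correlation factor between simulator outputs and NN predictions,
    # used as the comparison criterion mentioned in the abstract.
    corr = np.corrcoef(y_test, nn.predict(X_test))[0, 1]
    print(f"correlation factor: {corr:.4f}")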

    Machine Learning based Surrogate Modeling of Electronic Devices and Circuits

    The abstract is provided in the attachment.

    Modeling and Optimization of the Microwave PCB Interconnects Using Macromodel Techniques

    The abstract is provided in the attachment.

    State-of-the-art: AI-assisted surrogate modeling and optimization for microwave filters

    Microwave filters are indispensable passive devices in modern wireless communication systems. Nowadays, an electromagnetic (EM) simulation-based design process is the norm for filter design. Many EM-based design methodologies for microwave filters have emerged in recent years to achieve efficiency, automation, and customizability. The majority of EM-based design methods exploit low-cost models (i.e., surrogates) in various forms, with artificial intelligence techniques assisting the surrogate modeling and optimization processes. Focusing on surrogate-assisted microwave filter design, this article first analyzes the characteristics of filter design under different design objective functions. The state-of-the-art filter design methodologies are then reviewed, including surrogate modeling (machine learning) methods and advanced optimization algorithms. Three essential techniques in filter design are covered: 1) smart data sampling techniques; 2) advanced surrogate modeling techniques; and 3) advanced optimization methods and frameworks. To achieve success and stability, these techniques have to be tailored or combined to suit the specific characteristics of microwave filters. Finally, new emerging design applications and future trends in filter design are discussed.
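    The three techniques listed in the abstract typically come together in a surrogate-assisted loop: sample a few expensive EM simulations, fit a cheap surrogate, optimize the surrogate, and verify the candidate with the full simulation. A minimal sketch under stated assumptions (the cheap stand-in objective, the Gaussian-process surrogate, and the design bounds are illustrative choices, not any specific method from the article):

    import numpy as np
    from scipy.optimize import minimize
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def em_simulation(x):
        # Hypothetical stand-in for an expensive full-wave filter objective
        # (e.g., a cost built from return loss over the passband).
        return float(np.sum((x - 0.3) ** 2))

    rng = np.random.default_rng(1)
    X = rng.uniform(0.0, 1.0, size=(10, 2))              # 1) initial data sampling
    y = np.array([em_simulation(x) for x in X])

    for _ in range(15):
        # 2) surrogate modeling on the data gathered so far
        gp = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X, y)
        # 3) optimize the cheap surrogate instead of the EM model
        x0 = X[np.argmin(y)]
        res = minimize(lambda v: gp.predict(v.reshape(1, -1))[0],
                       x0, bounds=[(0.0, 1.0)] * 2)
        # Verify the proposed design with the "EM simulation" and update the data set.
        X = np.vstack([X, res.x])
        y = np.append(y, em_simulation(res.x))

    print("best design:", X[np.argmin(y)], "objective:", y.min())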

    Principles of Neuromorphic Photonics

    In an age overrun with information, the ability to process reams of data has become crucial. The demand for data will continue to grow as smart gadgets multiply and become increasingly integrated into our daily lives. Next-generation industries in artificial intelligence services and high-performance computing are so far supported by microelectronic platforms. These data-intensive enterprises rely on continual improvements in hardware. Their prospects are running up against a stark reality: conventional one-size-fits-all solutions offered by digital electronics can no longer satisfy this need, as Moore's law (exponential hardware scaling), interconnection density, and the von Neumann architecture reach their limits. With its superior speed and reconfigurability, analog photonics can provide some relief to these problems; however, complex applications of analog photonics have remained largely unexplored due to the absence of a robust photonic integration industry. Recently, the landscape for commercially manufacturable photonic chips has been changing rapidly and now promises to achieve economies of scale previously enjoyed solely by microelectronics. The scientific community has set out to build bridges between the domains of photonic device physics and neural networks, giving rise to the field of neuromorphic photonics. This article reviews recent progress in integrated neuromorphic photonics. We provide an overview of neuromorphic computing, discuss the associated technology (microelectronic and photonic) platforms, and compare their performance metrics. We discuss photonic neural network approaches and challenges for integrated neuromorphic photonic processors while providing an in-depth description of photonic neurons and a candidate interconnection architecture. We conclude with a future outlook on neuro-inspired photonic processing.

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions about the proper functioning of the networks. Among these mathematical tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches for performing network-data analysis and enabling automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth in network complexity faced by optical networks in recent years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) enabled by the use of coherent transmission/reception technologies, advanced digital signal processing, and the compensation of nonlinear effects in optical fiber propagation. In this paper, we provide an overview of the application of ML to optical communications and networking. We classify and survey the relevant literature on the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy: to stimulate further work in this area, we conclude the paper by proposing possible new research directions.
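    As a toy illustration of the kind of ML-based fault management mentioned above, the sketch below trains a classifier on synthetic signal-quality indicators to flag degraded links; the features, the degradation rule, and the random-forest choice are all hypothetical and not taken from the paper:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Hypothetical monitoring data: OSNR (dB), log10(pre-FEC BER), received power (dBm).
    rng = np.random.default_rng(2)
    X = np.column_stack([
        rng.normal(20.0, 3.0, 2000),    # OSNR
        rng.normal(-3.0, 0.7, 2000),    # log10(pre-FEC BER)
        rng.normal(-10.0, 2.0, 2000),   # received power
    ])
    # Toy labeling rule marking links as degraded (stand-in for operator alarms).
    y = ((X[:, 0] < 17.0) | (X[:, 1] > -2.5)).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))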