2,414 research outputs found
Techniques for Scalable and Effective Routability Evaluation
Routing congestion has become a central layout challenge in nanoscale circuits since it is a critical factor in determining the routability of a design. An unroutable design is not useful even if it closes on all other design metrics. Fast design closure can only be achieved by accurately evaluating whether a design is routable early in the design cycle. Lately, it has become common to use a “light mode” version of a global router to quickly evaluate the routability of a given placement. This approach suffers from three weaknesses: (i) it does not adequately model local routing resources, which can cause incorrect routability predictions that are only detected late, during detailed routing; (ii) the congestion maps it produces tend to have isolated hot spots surrounded by noncongested spots, called “noisy hot spots”, which further reduces the accuracy of routability evaluation; (iii) the metrics used to represent congestion may yield numbers that do not provide sufficient intuition to the designer, and they may often fail to predict routability accurately. This paper presents solutions to these issues. First, we propose three approaches to model local routing resources. Second, we propose a smoothing technique that reduces the number of noisy hot spots and yields a more accurate routability evaluation. Finally, we develop a new metric that represents congestion maps with higher fidelity. We apply the proposed techniques to several industrial circuits and demonstrate that one can better predict and evaluate design routability, and that congestion mitigation tools can perform much more effectively.
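For illustration, here is a minimal sketch of the kind of congestion-map smoothing the abstract describes. The paper's actual technique is not given here, so this stand-in simply averages each g-cell with its 3x3 neighborhood, which damps an isolated noisy hot spot while leaving a genuinely congested region of adjacent hot cells largely intact; the grid values and filter size are illustrative assumptions.

    # Illustrative sketch only: the paper's exact smoothing technique is not
    # given in this abstract, so this uses a plain 3x3 moving-average filter
    # over a hypothetical 2D grid of per-g-cell congestion values.
    import numpy as np

    def smooth_congestion(congestion: np.ndarray) -> np.ndarray:
        """Average each g-cell's congestion with its neighbors so that an
        isolated hot spot surrounded by uncongested cells is damped."""
        padded = np.pad(congestion, 1, mode="edge")
        smoothed = np.zeros_like(congestion, dtype=float)
        rows, cols = congestion.shape
        for i in range(rows):
            for j in range(cols):
                smoothed[i, j] = padded[i:i + 3, j:j + 3].mean()
        return smoothed

    # Example: a single noisy hot spot (0.9) amid low congestion is pulled
    # toward its neighborhood average, while a real congested region of
    # several adjacent hot cells would survive the filter.
    grid = np.full((5, 5), 0.2)
    grid[2, 2] = 0.9
    print(smooth_congestion(grid)[2, 2])  # ~0.28, well below the raw 0.9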
An Overview on Application of Machine Learning Techniques in Optical Networks
Today's telecommunication networks have become sources of enormous amounts of
widely heterogeneous data. This information can be retrieved from network
traffic traces, network alarms, signal quality indicators, users' behavioral
data, etc. Advanced mathematical tools are required to extract meaningful
information from these data and to make decisions on the proper
functioning of the networks. Among these
mathematical tools, Machine Learning (ML) is regarded as one of the most
promising methodological approaches to perform network-data analysis and enable
automated network self-configuration and fault management. The adoption of ML
techniques in the field of optical communication networks is motivated by the
unprecedented growth in complexity that these networks have faced in the
last few years. This increase is due to the introduction of a huge
number of adjustable and interdependent system parameters (e.g., routing
configurations, modulation format, symbol rate, coding schemes, etc.) that are
enabled by the usage of coherent transmission/reception technologies, advanced
digital signal processing and compensation of nonlinear effects in optical
fiber propagation. In this paper we provide an overview of the application of
ML to optical communications and networking. We classify and survey relevant
literature dealing with the topic, and we also provide an introductory tutorial
on ML for researchers and practitioners interested in this field. Although a
good number of research papers have recently appeared, the application of ML to
optical networks is still in its infancy: to stimulate further work in this
area, we conclude the paper by proposing new possible research directions.
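As a concrete, hypothetical instance of the ML-based fault management this survey covers, the sketch below trains a classifier to flag degraded lightpaths from monitored signal-quality indicators. The features (OSNR, pre-FEC BER, chromatic dispersion) and the synthetic data are assumptions for illustration, not taken from the paper.

    # Minimal sketch of ML-based fault detection from signal-quality
    # indicators. Feature choices and synthetic distributions are
    # illustrative assumptions, not from the survey.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical monitoring samples: [OSNR (dB), pre-FEC BER, CD (ps/nm)]
    healthy = np.column_stack([
        rng.normal(22, 1.5, 500),          # high OSNR
        rng.normal(1e-4, 3e-5, 500),       # low bit-error rate
        rng.normal(100, 20, 500),
    ])
    faulty = np.column_stack([
        rng.normal(14, 2.0, 500),          # degraded OSNR
        rng.normal(5e-3, 1e-3, 500),       # elevated bit-error rate
        rng.normal(100, 20, 500),
    ])
    X = np.vstack([healthy, faulty])
    y = np.array([0] * 500 + [1] * 500)    # 0 = healthy, 1 = fault

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    print(f"fault-detection accuracy: {clf.score(X_test, y_test):.2f}")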
Wireless Communications in the Era of Big Data
The rapidly growing wave of wireless data service is pushing against the
boundary of our communication network's processing power. The pervasive and
exponentially increasing data traffic presents imminent challenges to all
aspects of the wireless system design, such as spectrum efficiency, computing
capabilities and fronthaul/backhaul link capacity. In this article, we discuss
the challenges and opportunities in the design of scalable wireless systems to
embrace such a "bigdata" era. On one hand, we review the state-of-the-art
networking architectures and signal processing techniques adaptable for
managing the bigdata traffic in wireless networks. On the other hand, instead
of viewing mobile bigdata as an unwanted burden, we introduce methods to
capitalize on the vast data traffic for building a bigdata-aware wireless
network with better wireless service quality and new mobile applications. We
highlight several promising future research directions for wireless
communications in the mobile bigdata era.

Comment: This article is accepted and to appear in IEEE Communications Magazine.
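One hypothetical example of capitalizing on traffic data, in the spirit of the bigdata-aware network the article argues for: estimating content popularity from a request trace and caching the hottest items at the network edge to relieve the backhaul. The request log, cache size, and popularity model below are illustrative assumptions, not the article's method.

    # Illustrative sketch only: proactive edge caching driven by observed
    # traffic. Everything here (the request log, cache size, popularity
    # model) is a hypothetical example, not a method from the article.
    from collections import Counter

    def pick_cache_contents(request_log, cache_size):
        """Rank contents by observed request frequency and cache the top
        ones, so popular items are served from the edge instead of
        traversing the backhaul link."""
        popularity = Counter(request_log)
        return {item for item, _ in popularity.most_common(cache_size)}

    # Hypothetical per-cell request trace; caching the 2 hottest items
    # offloads most of the traffic from the backhaul.
    log = ["video_a"] * 50 + ["video_b"] * 30 + ["song_c"] * 5 + ["doc_d"] * 2
    cache = pick_cache_contents(log, cache_size=2)
    hit_rate = sum(1 for r in log if r in cache) / len(log)
    print(cache, f"hit rate: {hit_rate:.2f}")  # {'video_a', 'video_b'}, 0.92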
Digital design techniques for dependable High-Performance Computing
The abstract is in the attachment.