
    A power optimization method considering glitch reduction by gate sizing

    We propose a power optimization method that considers glitch reduction by gate sizing. Our method reduces not only capacitive and short-circuit power consumption but also the power dissipated by glitches, which previous methods have not exploited. In the optimization, we improve the accuracy of a statistical glitch estimation method and devise a gate sizing algorithm that uses perturbations to escape bad local solutions. The effect of our method is verified experimentally on 12 benchmark circuits with a 0.5 µm standard cell library. Gate sizing reduces the number of glitch transitions by 38.2% on average and by up to 63.4%, which reduces total transitions by 12.8% on average. When the circuits are optimized for power without delay constraints, power dissipation is reduced by a further 7.4% on average and up to 15.7% relative to the minimum-sized circuits.
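    The gate sizing step described above is essentially a discrete local search over drive strengths, with occasional perturbations to escape bad local solutions. The sketch below illustrates that idea only; the power model, the available drive strengths, and the perturbation schedule are hypothetical placeholders, not the paper's statistical glitch estimator or algorithm.

```python
# Minimal sketch of gate sizing with random perturbation to escape local minima.
# The cost model and gate library below are hypothetical placeholders.
import random

SIZES = [1, 2, 4]  # hypothetical available drive strengths per gate

def estimated_power(sizing, circuit):
    """Toy cost: capacitive + short-circuit + glitch terms per gate."""
    total = 0.0
    for gate, size in sizing.items():
        cap, sc, glitch = circuit[gate]          # per-unit coefficients (placeholders)
        total += cap * size + sc * size + glitch / size  # larger gates damp glitches here
    return total

def size_gates(circuit, iters=200, perturb_prob=0.2):
    sizing = {g: min(SIZES) for g in circuit}    # start from the minimum-sized circuit
    cur_cost = estimated_power(sizing, circuit)
    best, best_cost = dict(sizing), cur_cost
    for _ in range(iters):
        improved = False
        for gate in circuit:                     # greedy local resizing, one gate at a time
            for s in SIZES:
                trial = dict(sizing); trial[gate] = s
                cost = estimated_power(trial, circuit)
                if cost < cur_cost:
                    sizing, cur_cost = trial, cost
                    improved = True
        if cur_cost < best_cost:
            best, best_cost = dict(sizing), cur_cost
        if not improved and random.random() < perturb_prob:
            # random perturbation to escape a bad local solution
            gate = random.choice(list(circuit))
            sizing[gate] = random.choice(SIZES)
            cur_cost = estimated_power(sizing, circuit)
    return best, best_cost

# Example with three hypothetical gates (capacitive, short-circuit, glitch coefficients):
netlist = {"g1": (1.0, 0.2, 3.0), "g2": (0.8, 0.1, 0.5), "g3": (1.2, 0.3, 2.0)}
print(size_gates(netlist))
```

    In practice the toy cost function would be replaced by a delay-aware estimator that accounts for glitch generation and propagation, and delay constraints would prune the candidate sizes.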

    Artificial Intelligence and Machine Learning: A Perspective on Integrated Systems Opportunities and Challenges for Multi-Domain Operations

    This paper provides a perspective on the historical background, innovation, and applications of Artificial Intelligence (AI) and Machine Learning (ML), data successes and systems challenges, national security interests, and mission opportunities for systems problems. AI and ML are today used interchangeably, or together as AI/ML, and are ubiquitous across many industries and applications. The recent explosion, driven by a confluence of new ML algorithms, large data sets, and fast and cheap computing, has demonstrated impressive results in classification and regression used for prediction and decision-making. Yet AI/ML today lacks a precise definition, and as a technical discipline it has grown beyond its origins in computer science. Despite impressive feats, primarily of ML, much work is still needed to realize the systems benefits of AI, such as perception, reasoning, planning, acting, learning, communicating, and abstraction. Recent national security interest in AI/ML has focused on problems including multi-domain operations (MDO), which has renewed the focus on a systems view of AI/ML. This paper addresses systems solutions from an AI/ML perspective, arguing that these solutions will draw from methods in AI and ML, as well as computational methods in control, estimation, communication, and information theory, as in the early days of cybernetics. Along with the focus on developing technology, this paper also addresses the challenges of integrating these AI/ML systems for warfare.

    Virgo Detector Characterization and Data Quality: results from the O3 run

    The Advanced Virgo detector has contributed with its data to the rapid growth of the number of detected gravitational-wave (GW) signals in the past few years, alongside the two Advanced LIGO instruments: first during the last month of Observation Run 2 (O2) in August 2017 (with, most notably, the compact binary mergers GW170814 and GW170817), and then during the full Observation Run 3 (O3), an 11-month data-taking period between April 2019 and March 2020 that led to the addition of about 80 events to the catalog of transient GW sources maintained by LIGO, Virgo and now KAGRA. These discoveries and the manifold exploitation of the detected waveforms require an accurate characterization of the quality of the data, including continuous study and monitoring of the detector noise sources. These activities, collectively named {\em detector characterization and data quality} or {\em DetChar}, span the whole workflow of the Virgo data, from the instrument front-end hardware to the final analyses. They are described in detail in the following article, with a focus on the results achieved by the Virgo DetChar group during the O3 run. A companion article describes the tools that the Virgo DetChar group used to perform this work. Comment: 57 pages, 18 figures. To be submitted to Class. and Quantum Grav. This is the "Results" part of preprint arXiv:2205.01555 [gr-qc], which has been split into two companion articles: one about the tools and methods, the other about the analyses of the O3 Virgo data.
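    Among the data-quality products alluded to above are time-segment flags that mark periods of excess noise. The sketch below shows the general idea with a simple RMS threshold applied to simulated data; the statistic, segmentation, and threshold are illustrative assumptions and do not reproduce the Virgo DetChar pipelines.

```python
# Illustrative data-quality flagging: mark 1-second segments of a strain time
# series whose RMS exceeds a threshold. All numbers are placeholders.
import numpy as np

def rms_flags(strain, sample_rate, seg_seconds=1.0, threshold=1e-21):
    seg_len = int(seg_seconds * sample_rate)
    flags = []
    for i in range(len(strain) // seg_len):
        seg = strain[i * seg_len:(i + 1) * seg_len]
        rms = np.sqrt(np.mean(seg ** 2))
        if rms > threshold:
            flags.append((i * seg_seconds, (i + 1) * seg_seconds))  # (start, end) in seconds
    return flags

# Example with simulated Gaussian noise plus a loud disturbance in one segment:
rng = np.random.default_rng(0)
strain = rng.normal(0, 1e-22, 4096 * 10)
strain[4096 * 3:4096 * 3 + 200] += 5e-21      # injected "glitch"
print(rms_flags(strain, sample_rate=4096))    # -> [(3.0, 4.0)]
```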

    Present and Future of Gravitational Wave Astronomy

    The first detection on Earth of a gravitational wave signal from the coalescence of a binary black hole system in 2015 established a new era in astronomy, allowing the scientific community to observe the Universe with a new form of radiation for the first time. More than five years later, many more gravitational wave signals have been detected, including the first binary neutron star coalescence in coincidence with a gamma-ray burst and a kilonova observation. The field of gravitational wave astronomy is rapidly evolving, making it difficult to keep up with the pace of new detector designs, discoveries, and astrophysical results. This Special Issue is therefore intended as a review of the current status and future directions of the field from the perspective of detector technology, data analysis, and the astrophysical implications of these discoveries. Rather than presenting new results, the articles collected in this issue serve as a reference and an introduction to the field. The Special Issue includes reviews of the basic properties of gravitational wave signals; the detectors that are currently operating and the main sources of noise that limit their sensitivity; planned detector upgrades in the short and long term; spaceborne detectors; data analysis of the gravitational wave detector output, focusing on the main classes of detected and expected signals; and the implications of current and future discoveries for our understanding of astrophysics and cosmology.

    Research on performance enhancement for electromagnetic analysis and power analysis in cryptographic LSI

    Degree system: new; Report number: Kou 3785; Type of degree: Doctor of Engineering; Date conferred: 2012/11/19; Waseda University diploma number: Shin 6161. Waseda University.

    Virgo Detector Characterization and Data Quality during the O3 run

    The Advanced Virgo detector has contributed with its data to the rapid growth of the number of detected gravitational-wave signals in the past few years, alongside the two LIGO instruments: first during the last month of Observation Run 2 (O2) in August 2017 (with, most notably, the compact binary mergers GW170814 and GW170817), and then during the full Observation Run 3 (O3), an 11-month data-taking period between April 2019 and March 2020 that led to the addition of about 80 events to the catalog of transient gravitational-wave sources maintained by LIGO, Virgo and KAGRA. These discoveries and the manifold exploitation of the detected waveforms require an accurate characterization of the quality of the data, including continuous study and monitoring of the detector noise. These activities, collectively named {\em detector characterization} or {\em DetChar}, span the whole workflow of the Virgo data, from the instrument front-end to the final analysis. They are described in detail in the following article, with a focus on the associated tools, the results achieved by the Virgo DetChar group during the O3 run, and the main prospects for future data-taking periods with an improved detector. Comment: 86 pages, 33 figures. This paper has been divided into two articles which supersede it and were posted to arXiv in October 2022. Please use these new preprints as references: arXiv:2210.15634 (tools and methods) and arXiv:2210.15633 (results from the O3 run).

    Review of Path Selection Algorithms with Link Quality and Critical Switch Aware for Heterogeneous Traffic in SDN

    Software Defined Networking (SDN) introduced a network management flexibility that eludes traditional network architecture. Nevertheless, the pervasive demand for various cloud computing services with different levels of Quality of Service requirements in our contemporary world has made network service provisioning challenging. One of these challenges is path selection (PS) for routing heterogeneous traffic with end-to-end quality of service support specific to each traffic class. The challenge has attracted the research community's attention to the extent that many Path Selection Algorithms (PSAs) have been proposed. However, a gap still exists that calls for further study. This paper reviews the existing PSAs and the Baseline Shortest Path Algorithms (BSPAs) upon which many relevant PSAs are built, to help identify these gaps. The paper categorizes the PSAs into four groups based on their path selection criteria: (1) PSAs that use static or dynamic link quality to guide PSD, (2) PSAs that consider the criticality of a switch in terms of update operations, FlowTable limitations or port capacity to guide PSD, (3) PSAs that consider flow variabilities to guide PSD, and (4) PSAs that use machine-learning optimization in their PSD. We then review and compare the techniques' design in each category against the identified SDN PSA design objectives, solution approach, BSPA, and validation approach. Finally, the paper recommends directions for further research.
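    As a concrete illustration of the first category, the sketch below runs a baseline shortest-path algorithm (Dijkstra) over a composite edge weight that mixes hypothetical per-link delay and loss metrics; the cost function, topology, and weighting constants are assumptions for illustration and do not correspond to any specific reviewed PSA.

```python
# Minimal link-quality-aware path selection: Dijkstra over a composite weight
# combining per-link delay and loss rate. The weighting is a hypothetical
# illustration of the first PSA category, not a specific reviewed algorithm.
import heapq

def link_cost(delay_ms, loss_rate, alpha=1.0, beta=100.0):
    # Higher loss inflates the cost; alpha/beta are tunable placeholders.
    return alpha * delay_ms + beta * loss_rate

def select_path(graph, src, dst):
    """graph: {node: [(neighbor, delay_ms, loss_rate), ...]}"""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, delay, loss in graph.get(u, []):
            nd = d + link_cost(delay, loss)
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    if dst not in dist:
        return None
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path))

# Example: two candidate paths; the lossier direct link is avoided.
topo = {
    "s1": [("s2", 2.0, 0.05), ("s3", 3.0, 0.0)],
    "s3": [("s2", 1.0, 0.0)],
}
print(select_path(topo, "s1", "s2"))  # -> ['s1', 's3', 's2']
```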

    Deep learning methods for enabling real-time gravitational wave and multimessenger astrophysics

    A new era of gravitational wave (GW) astronomy has begun with the recent detections by LIGO. However, we need real-time observations of GW signals and their electromagnetic (EM) and astro-particle counterparts to unlock its full potential for scientific discovery. Extracting and classifying the wide range of modeled and unmodeled GWs, whose amplitudes are often much weaker than the background noise, and rapidly inferring accurate parameters of their sources are crucial to enabling this scenario of real-time multimessenger astrophysics. Identifying and automatically clustering the anomalous non-Gaussian transient noises (glitches) that frequently contaminate the data, and separating them from true GW signals, is yet another difficult challenge. Currently, the most sensitive data analysis pipelines are limited by the extreme computational cost of template-matching methods and thus cannot scale to all types of GW sources and their full parameter space. Accurate numerical models of GW signals covering the entire range of parameters, including eccentric and spin-precessing compact binaries, which are essential to infer the astrophysical parameters of an event, are not available. Searches for unmodeled and anomalous signals do not have sufficient sensitivity compared to targeted searches. Furthermore, existing search pipelines are not optimal for dealing with the non-stationary, non-Gaussian noise in the detectors. This means that many critical events will go unnoticed. The primary objective of this thesis is to resolve these issues via deep learning, a state-of-the-art machine learning method based on artificial neural networks. In this thesis we develop robust GW analysis algorithms for real LIGO/Virgo data based on deep learning with neural networks. These algorithms overcome many limitations of existing techniques, allowing real-time detection and parameter estimation of modeled GW sources and unmodeled GW bursts, as well as classification and unsupervised clustering of anomalies and glitches in the detectors. The pipeline is designed to be highly scalable: it can be trained with template banks of any size to cover the entire parameter space of eccentric and spin-precessing black hole binaries, as well as other sources, and optimized for the real-time characteristics of the complex noise in the GW detectors. The deep learning framework may also be extended for low-latency analysis of the raw big data collected across multiple observational instruments to further facilitate real-time multimessenger astrophysics, which promises groundbreaking scientific insights about the origin, evolution, and destiny of the universe. In addition, this work introduces a new paradigm to accelerate scientific discovery by using data derived from high-performance physics simulations on supercomputers to train artificial intelligence algorithms that exploit emerging hardware architectures.
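    To make the detection component concrete, the sketch below sets up a small 1-D convolutional classifier that labels whitened strain segments as signal-plus-noise or noise-only, in the spirit of the deep learning approach described above; the architecture, segment length, and stand-in training data are illustrative assumptions and do not reproduce the thesis pipeline.

```python
# Minimal 1-D convolutional classifier for "signal vs. noise" strain segments.
# Architecture, segment length, and training setup are illustrative only.
import torch
import torch.nn as nn

class GWClassifier(nn.Module):
    def __init__(self, n_samples=2048):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=16, stride=4), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=8, stride=2), nn.ReLU(),
            nn.MaxPool1d(4),
        )
        with torch.no_grad():                       # infer flattened feature size
            flat = self.features(torch.zeros(1, 1, n_samples)).numel()
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(flat, 2))

    def forward(self, x):                           # x: (batch, 1, n_samples) whitened strain
        return self.head(self.features(x))

# One illustrative training step on random stand-in data.
model = GWClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
x = torch.randn(8, 1, 2048)                         # stand-in for whitened strain segments
y = torch.randint(0, 2, (8,))                       # 1 = contains injected signal
loss = loss_fn(model(x), y)
opt.zero_grad(); loss.backward(); opt.step()
```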