21 research outputs found

    ML-based Secure Low-Power Communication in Adversarial Contexts

    Full text link
    As wireless network technology becomes increasingly widespread, mutual interference between signals has grown more severe and more common, and a transmitter's own signal is often disrupted by other users occupying the channel. In adversarial environments in particular, jamming poses a serious threat to the security of information transmission. We therefore propose ML-based secure ultra-low-power communication, an approach that uses machine learning to predict future wireless traffic from patterns in past traffic so that signals can be transmitted at ultra-low power via backscatter. To suit the adversarial setting, we use backscatter to achieve ultra-low-power transmission and frequency hopping to evade jamming. In the end, we achieved a prediction success rate of 96.19%.
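
    A minimal sketch of the traffic-prediction idea, under assumed details the abstract does not give (the channel count, and a first-order Markov predictor standing in for the paper's ML model): learn how the jammer moves between channels from past occupancy, then hop the backscatter link to the channel least likely to be busy next.

        # Illustrative sketch only, not the paper's model: predict the next busy
        # channel from past occupancy with a first-order Markov estimator, then
        # hop the backscatter link to a channel predicted to be free.
        import numpy as np

        N_CHANNELS = 8  # hypothetical channel count

        def fit_transitions(history):
            """Count channel-to-channel transitions in the observed busy-channel trace."""
            counts = np.ones((N_CHANNELS, N_CHANNELS))  # ones = Laplace smoothing
            for prev, nxt in zip(history[:-1], history[1:]):
                counts[prev, nxt] += 1
            return counts / counts.sum(axis=1, keepdims=True)

        def next_hop(transitions, current_busy):
            """Pick the channel least likely to be busy next (frequency-hop target)."""
            p_busy = transitions[current_busy]
            return int(np.argmin(p_busy))

        # Usage on a synthetic jammer that (noisily) sweeps channels cyclically:
        rng = np.random.default_rng(0)
        trace = [(t + rng.integers(0, 2)) % N_CHANNELS for t in range(500)]
        T = fit_transitions(trace)
        print("jammer on", trace[-1], "-> transmit on channel", next_hop(T, trace[-1]))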

    Network on chip architecture for multi-agent systems in FPGA

    Get PDF
    A system of interacting agents is, by definition, very demanding in terms of computational resources. Although multi-agent systems have been used to solve complex problems in many areas, it is usually very difficult to perform large-scale simulations on their targeted serial computing platforms. Reconfigurable hardware, in particular Field Programmable Gate Array (FPGA) devices, has been successfully used in High Performance Computing applications due to its inherent flexibility, data parallelism and algorithm acceleration capabilities. Indeed, reconfigurable hardware seems to be the next logical step for the agent paradigm, but only a few attempts to implement multi-agent systems on these platforms have been successful. This paper discusses the problem of inter-agent communication in Field Programmable Gate Arrays. It proposes a Network-on-Chip in a hierarchical star topology that enables agent transactions through message broadcasting, using the Open Core Protocol as the interface between hardware modules. A customizable router microarchitecture is described, and a multi-agent system is created to simulate and analyse message exchanges in a generic heavy-traffic-load agent-based application. Experiments have shown a throughput of 1.6 Gbps per port at 100 MHz without packet loss, along with seamless scalability characteristics.
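
    The reported 1.6 Gbps per port at a 100 MHz clock is consistent with one 16-bit flit crossing each port per cycle. A minimal sketch of the broadcast path through a two-level hierarchical star follows; it is illustrative only (router and agent names are hypothetical, and the real design is a hardware microarchitecture, not Python):

        # Illustrative sketch: message broadcast in a two-level hierarchical star.
        # Agents attach to leaf routers, leaf routers attach to one root router,
        # and a broadcast fans out through the root to every other leaf.
        from dataclasses import dataclass, field

        @dataclass
        class Router:
            name: str
            agents: list = field(default_factory=list)    # locally attached agents
            children: list = field(default_factory=list)  # leaf routers (root only)

            def deliver_local(self, msg, skip=None):
                for agent in self.agents:
                    if agent != skip:
                        print(f"{self.name}: deliver {msg!r} to {agent}")

        def broadcast(root, leaf, sender, msg):
            """Sender's leaf delivers locally, then the root fans out to other leaves."""
            leaf.deliver_local(msg, skip=sender)
            for other in root.children:
                if other is not leaf:
                    other.deliver_local(msg)

        r0 = Router("leaf0", agents=["a0", "a1"])
        r1 = Router("leaf1", agents=["a2", "a3"])
        root = Router("root", children=[r0, r1])
        broadcast(root, r0, "a0", "state-update")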

    Macroprudential oversight, risk communication and visualization

    Get PDF
    This paper discusses the role of risk communication in macroprudential oversight, and of visualization in risk communication. Beyond the surge in data availability and precision, the transition from firm-centric to system-wide supervision imposes vast data needs. Moreover, beyond the internal communication common to any organization, broad and effective external communication of timely information on systemic risk is a key mandate of macroprudential supervisors, further stressing the importance of simple representations of complex data. This paper focuses on the background and theory of information visualization and visual analytics, as well as techniques within these fields, as potential means for risk communication. We define the task of visualization in risk communication, discuss the structure of macroprudential data, and review visualization techniques applied to systemic risk. We conclude that two essential, yet rare, features for supporting the analysis of big data and the communication of risks are analytical visualizations and interactive interfaces. For visualizing the so-called macroprudential data cube, we provide the VisRisk platform with three modules: plots, maps and networks. While VisRisk is illustrated here with five web-based interactive visualizations of systemic risk indicators and models, the platform enables, and is open to, the visualization of any data from the macroprudential data cube.
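
    As an illustration of the underlying data structure, a sketch of the macroprudential data cube follows: a three-dimensional array indexed by entity, risk indicator and time, from which views such as the plots and maps modules would slice 2-D panels. All names and dimensions here are illustrative assumptions, not VisRisk's actual API.

        # Illustrative sketch of a macroprudential "data cube":
        # entity x indicator x time, sliced into 2-D views.
        import numpy as np

        entities   = ["bank_A", "bank_B", "bank_C"]        # hypothetical entities
        indicators = ["leverage", "liquidity", "interconnectedness"]
        quarters   = 12

        rng = np.random.default_rng(1)
        cube = rng.random((len(entities), len(indicators), quarters))

        # Slice 1: one indicator across all entities over time (a "plots"-style view).
        leverage_panel = cube[:, indicators.index("leverage"), :]  # shape (3, 12)

        # Slice 2: cross-section of all indicators at the latest quarter (a "maps"-style view).
        latest_cross_section = cube[:, :, -1]                      # shape (3, 3)

        print(leverage_panel.shape, latest_cross_section.shape)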

    Efficient Semantic Segmentation on Edge Devices

    Full text link
    Semantic segmentation is the computer vision task of assigning each pixel of an image to a class, and it must be performed with both accuracy and efficiency. Most existing deep fully convolutional networks (FCNs) require heavy computation and are very power hungry, making them unsuitable for real-time applications on portable devices. This project analyzes current semantic segmentation models to explore the feasibility of applying them to emergency response during catastrophic events. We compare the performance of real-time semantic segmentation models against their non-real-time counterparts on aerial imagery captured in adverse, disaster-response settings. Furthermore, we train several models on the FloodNet dataset, which contains UAV images captured after Hurricane Harvey, and benchmark them on distinguishing classes such as flooded versus non-flooded buildings and flooded versus non-flooded roads. In this project, we developed a real-time UNet-based model and deployed the network on an NVIDIA Jetson AGX Xavier module.
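
    A minimal sketch of a UNet-style encoder-decoder in PyTorch, of the kind that could be deployed on an embedded module such as the Jetson AGX Xavier. This is illustrative, not the project's actual network; the output is sized for FloodNet's 10 semantic classes.

        # Illustrative tiny UNet-style model: two encoder stages, one decoder
        # stage with a skip connection, and a per-pixel classification head.
        import torch
        import torch.nn as nn

        def conv_block(c_in, c_out):
            return nn.Sequential(
                nn.Conv2d(c_in, c_out, 3, padding=1), nn.BatchNorm2d(c_out), nn.ReLU(inplace=True),
                nn.Conv2d(c_out, c_out, 3, padding=1), nn.BatchNorm2d(c_out), nn.ReLU(inplace=True),
            )

        class TinyUNet(nn.Module):
            def __init__(self, n_classes=10):  # FloodNet label set
                super().__init__()
                self.enc1 = conv_block(3, 16)
                self.enc2 = conv_block(16, 32)
                self.pool = nn.MaxPool2d(2)
                self.up   = nn.ConvTranspose2d(32, 16, 2, stride=2)
                self.dec1 = conv_block(32, 16)           # 16 upsampled + 16 skip channels
                self.head = nn.Conv2d(16, n_classes, 1)  # per-pixel class logits

            def forward(self, x):
                s1 = self.enc1(x)                 # full-resolution features
                s2 = self.enc2(self.pool(s1))     # half resolution
                u  = self.up(s2)                  # back to full resolution
                d  = self.dec1(torch.cat([u, s1], dim=1))  # skip connection
                return self.head(d)

        logits = TinyUNet()(torch.randn(1, 3, 256, 256))  # -> (1, 10, 256, 256)
        print(logits.shape)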

    Disrupting Finance: FinTech and Strategy in the 21st Century

    Get PDF
    This open access Pivot demonstrates how a variety of technologies act as innovation catalysts within the banking and financial services sector. Traditional banks and financial services are under increasing competition from global IT companies such as Google, Apple, Amazon and PayPal, whilst facing pressure from investors to reduce costs, increase agility and improve customer retention. Technologies such as blockchain, cloud computing, mobile technologies, big data analytics and social media therefore have perhaps more potential in this industry and area of business than in any other. This book defines a fintech ecosystem for the 21st century, providing a state-of-the-art review of current literature, suggesting avenues for new research and offering perspectives from business, technology and industry.

    Volatility modeling and limit-order book analytics with high-frequency data

    Get PDF
    The vast amount of information in today's high-frequency financial datasets poses both opportunities and challenges. Among the opportunities, existing methods can be employed to provide new insights into, and a better understanding of, market complexity from different perspectives, while new methods can be devised that fully exploit the information embedded in high-frequency datasets and address new issues. The challenges are driven by data complexity: limit-order book datasets consist of hundreds of thousands of events that interact with each other and affect the event-flow dynamics. This dissertation aims to improve our understanding of the effective applicability of machine learning methods to mid-price movement prediction, of the nature of long-range autocorrelations in financial time series, and of the econometric modeling and forecasting of volatility dynamics in high-frequency settings. Our results show that simple machine learning methods can be successfully employed for mid-price forecasting; moreover, when methods relying on the natural tensor representation of financial time series are adopted, the inter-temporal connections captured by this convenient representation are shown to be relevant for predicting future mid-price movements. Furthermore, using ultra-high-frequency order book data over a considerably long period, a quantitative characterization of long-range autocorrelation is achieved by extracting the so-called scaling exponent. By jointly considering duration series of both inter- and cross-events, for different stocks and separately for the bid and ask sides, long-range autocorrelations are found to be ubiquitous and qualitatively homogeneous. With respect to the scaling exponent, evidence of three cross-overs is found, and complex, heterogeneous associations with a number of relevant economic variables are discussed. Lastly, the use of copulas as the main ingredient for modeling and forecasting realized measures of volatility is explored. The modeling framework resembles, but generalizes, the well-known Heterogeneous Autoregressive (HAR) model. In-sample and out-of-sample analyses, based on several performance measures, statistical tests, and robustness checks, show forecasting improvements of copula-based modeling over the HAR benchmark.
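
    For reference, the HAR benchmark that the copula-based models generalize is the standard Corsi (2009) specification, regressing next-day realized volatility on daily, weekly and monthly averages:

        RV^{(d)}_{t+1} = \beta_0 + \beta_d\, RV^{(d)}_{t} + \beta_w\, RV^{(w)}_{t} + \beta_m\, RV^{(m)}_{t} + \varepsilon_{t+1},

        \text{where}\quad RV^{(w)}_{t} = \tfrac{1}{5}\sum_{i=0}^{4} RV^{(d)}_{t-i}, \qquad RV^{(m)}_{t} = \tfrac{1}{22}\sum_{i=0}^{21} RV^{(d)}_{t-i}.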