ML-based Secure Low-Power Communication in Adversarial Contexts
As wireless networking becomes increasingly widespread, mutual interference
between signals has grown more severe and more common: a node's own
transmission is often disrupted because another signal occupies the channel.
In adversarial environments in particular, jamming poses a serious threat to
the security of information transmission. We therefore propose ML-based secure
ultra-low-power communication, an approach that uses machine learning to
predict future wireless traffic from patterns in past traffic, enabling
ultra-low-power signal transmission via backscatter. To suit the adversarial
environment, we use backscatter to achieve ultra-low-power transmission and
frequency-hopping technology to counter jamming. Our method achieves a
prediction success rate of 96.19%.
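The traffic-prediction idea above can be sketched very simply: learn, from a past occupancy trace, how often each recent slot pattern is followed by a busy slot, then transmit when the predicted busy probability is low. This is a minimal illustrative sketch under assumed binary slot occupancy; the paper's actual model and features are not specified here.

```python
def train_occupancy_predictor(history, k=4):
    """Estimate P(next slot busy | last k slots) from a 0/1 occupancy trace."""
    counts = {}
    for t in range(k, len(history)):
        key = tuple(history[t - k:t])
        busy, total = counts.get(key, (0, 0))
        counts[key] = (busy + history[t], total + 1)
    return counts

def predict_busy(counts, recent, k=4, default=0.5):
    """Probability that the next slot is busy, given the most recent k slots."""
    busy, total = counts.get(tuple(recent[-k:]), (0, 0))
    return busy / total if total else default

# Synthetic periodic traffic: three busy slots, one idle slot, repeated.
trace = [1, 1, 1, 0] * 50
model = train_occupancy_predictor(trace, k=4)
p_busy = predict_busy(model, [1, 1, 1, 0], k=4)  # pattern just before a busy slot
```

A real system would replace this frequency table with a learned model and gate the backscatter transmitter on the predicted-idle slots.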
Network on chip architecture for multi-agent systems in FPGA
A system of interacting agents is, by definition, very demanding in terms of computational resources. Although multi-agent systems have been used to solve complex problems in many areas, it is usually very difficult to perform large-scale simulations on the serial computing platforms they typically target. Reconfigurable hardware, in particular Field Programmable Gate Array (FPGA) devices, has been successfully used in High Performance Computing applications due to its inherent flexibility, data parallelism and algorithm acceleration capabilities. Indeed, reconfigurable hardware seems to be the next logical step in the agency paradigm, but only a few attempts have succeeded in implementing multi-agent systems on these platforms. This paper discusses the problem of inter-agent communication in Field Programmable Gate Arrays. It proposes a Network-on-Chip in a hierarchical star topology that enables agents' transactions through message broadcasting, using the Open Core Protocol as the interface between hardware modules. A customizable router microarchitecture is described, and a multi-agent system is created to simulate and analyse message exchanges in a generic heavy-traffic agent-based application. Experiments have shown a throughput of 1.6 Gbps per port at 100 MHz without packet loss, as well as seamless scalability characteristics.
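The reported figures imply a specific per-cycle word width. Assuming the 1.6 Gbps per-port throughput refers to sustained one-transfer-per-cycle delivery at the 100 MHz clock (an assumption; the abstract does not state the flit width), the arithmetic is:

```python
clock_hz = 100e6          # router clock reported in the paper: 100 MHz
throughput_bps = 1.6e9    # reported per-port throughput: 1.6 Gbps

# Implied data width delivered per clock cycle, per port.
bits_per_cycle = throughput_bps / clock_hz  # 16 bits/cycle
```

That is, the per-port numbers are consistent with a 16-bit datapath driven at one word per cycle.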
Macroprudential oversight, risk communication and visualization
This paper discusses the role of risk communication in macroprudential oversight and of visualization in risk communication. Beyond the soar in data availability and precision, the transition from firm-centric to system-wide supervision imposes vast data needs. Moreover, beyond the internal communication common to any organization, broad and effective external communication of timely information related to systemic risks is a key mandate of macroprudential supervisors, further stressing the importance of simple representations of complex data. This paper focuses on the background and theory of information visualization and visual analytics, as well as techniques within these fields, as potential means for risk communication. We define the task of visualization in risk communication, discuss the structure of macroprudential data, and review visualization techniques applied to systemic risk. We conclude that two essential, yet rare, features for supporting the analysis of big data and communication of risks are analytical visualizations and interactive interfaces. For visualizing the so-called macroprudential data cube, we provide the VisRisk platform with three modules: plots, maps and networks. While VisRisk is herein illustrated with five web-based interactive visualizations of systemic risk indicators and models, the platform enables and is open to the visualization of any data from the macroprudential data cube.
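The networks module described above presumably starts from some exposure network between institutions. As a purely illustrative sketch (hypothetical data and measure, not VisRisk code or its API), a systemic-risk network can be held as a weighted adjacency matrix and summarised per node before visualization:

```python
import numpy as np

# Hypothetical interbank exposure matrix (row lender -> column borrower,
# in EUR millions). Illustrative values only, not actual supervisory data.
banks = ["A", "B", "C", "D"]
exposures = np.array([
    [ 0, 30, 10,  0],
    [ 5,  0, 20, 15],
    [ 0, 10,  0,  5],
    [20,  0,  0,  0],
], dtype=float)

# A simple node-level summary for a network view: total exposure flowing
# through each bank (weighted out-degree plus weighted in-degree).
systemic_weight = exposures.sum(axis=1) + exposures.sum(axis=0)
ranking = [banks[i] for i in np.argsort(-systemic_weight)]
```

In an interactive interface, a quantity like `systemic_weight` would typically drive node size or colour in the network plot.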
Efficient Semantic Segmentation on Edge Devices
Semantic segmentation is a computer vision task that assigns each pixel of an
image to a class, and it should be performed with both accuracy and
efficiency. Most existing deep fully convolutional networks (FCNs) require
heavy computation and are very power hungry, making them unsuitable for
real-time applications on portable devices. This project analyzes current
semantic segmentation models to explore the feasibility of applying them to
emergency response during catastrophic events. We compare the performance of
real-time semantic segmentation models with their non-real-time counterparts
on aerial images captured under adverse conditions. Furthermore, we train
several models on the Flood-Net dataset, containing UAV images captured after
Hurricane Harvey, and benchmark their execution on distinguishing classes such
as flooded vs. non-flooded buildings and flooded vs. non-flooded roads. In
this project, we developed a real-time UNet-based model and deployed that
network on a Jetson AGX Xavier module.
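Benchmarks like the one above are typically scored with per-class intersection-over-union (IoU) between predicted and ground-truth masks. A minimal sketch of that metric (the class ids here are illustrative, not the actual Flood-Net label map):

```python
import numpy as np

def class_iou(pred, target, cls):
    """Intersection-over-union of one class between predicted and ground-truth masks."""
    p, t = (pred == cls), (target == cls)
    inter = np.logical_and(p, t).sum()
    union = np.logical_or(p, t).sum()
    return inter / union if union else float("nan")

# Tiny 3x3 example; suppose class 2 stands for "flooded building".
target = np.array([[1, 1, 2],
                   [2, 2, 0],
                   [0, 0, 0]])
pred   = np.array([[1, 2, 2],
                   [2, 2, 0],
                   [0, 0, 1]])
iou_flooded = class_iou(pred, target, cls=2)  # 3 shared pixels / 4 in the union
```

Averaging `class_iou` over the classes of interest gives the mean IoU usually reported for such comparisons.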
Computational intelligence for measuring macro-knowledge competitiveness
The aim of this research is to investigate the use of Computational Intelligence methods for constructing Synthetic Composite Indicators (SCI), in particular for delivering a Unified Macro-Knowledge Competitiveness Indicator (UKCI) that enables consistent and transparent assessment and forecasting of the progress and competitiveness of the Knowledge Based Economy (KBE). SCIs are assessment tools usually constructed to evaluate and contrast entities' performance by aggregating intangible measures in areas such as economy, education, technology and innovation. The key value of an SCI lies in its capacity to aggregate complex and multi-dimensional variables into a single meaningful value. As a result, SCIs have been considered one of the most important tools for macro-level and strategic decision making. Considering the shortcomings of existing SCIs, this study proposes an alternative approach to developing Intelligent Synthetic Composite Indicators (iSCI). The suggested approach uses a Fuzzy Proximity Knowledge Mining technique to build the initial qualitative taxonomy, and Fuzzy c-means clustering to form the new composite indicators.
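The Fuzzy c-means step named above is a standard algorithm: alternately update cluster centres as membership-weighted means and memberships from inverse distances. A minimal numpy sketch on synthetic indicator scores (illustrative data; this is not the study's code):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns centres and the n x c membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]   # weighted means
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)         # membership update
    return centres, U

# Two well-separated groups of (normalised) indicator scores.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 1.0], [1.0, 0.9]])
centres, U = fuzzy_c_means(X)
labels = U.argmax(axis=1)
```

Unlike hard k-means, the membership matrix `U` lets each economy contribute gradually to several indicator clusters, which is the property the iSCI approach exploits.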
Nature inspired computational intelligence for financial contagion modelling
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Financial contagion refers to a scenario in which small shocks, which initially affect only a few financial institutions or a particular region of the economy, spread to the rest of the financial sector and to other countries whose economies were previously healthy, resembling the 'transmission' of a medical disease. Financial contagion happens at both the domestic and the international level. At the domestic level, transmission is usually triggered by the failure of a domestic bank or financial intermediary defaulting on inter-bank liabilities, selling assets in a fire sale, and undermining confidence in similar banks; an example of this phenomenon is the failure of Lehman Brothers and the subsequent turmoil in the US financial markets. International financial contagion happens in both advanced and developing economies, and is the transmission of financial crises across financial markets. Within the current globalised financial system, with large volumes of cash flow and the cross-regional operations of large banks and hedge funds, financial contagion usually happens simultaneously among domestic institutions and across countries. There is no conclusive definition of financial contagion; most research papers study contagion by analysing the change in the variance-covariance matrix during periods of market turmoil. King and Wadhwani (1990) first tested the correlations between the US, UK and Japan during the US stock market crash of 1987. Boyer (1997) finds significant increases in correlation during financial crises, reinforcing a definition of financial contagion as a change in correlation during the crash period. Forbes and Rigobon (2002) give a definition of financial contagion in which the term interdependence is used as the alternative to contagion; they claim that for the period they study there is no contagion, only interdependence.
Interdependence leads to common price movements during periods of both stability and turmoil. In the past two decades, many studies (e.g. Kaminsky et al., 1998; Kaminsky, 1999) have developed early warning systems focused on the origins of financial crises rather than on financial contagion. Other authors (e.g. Forbes and Rigobon, 2002; Caporale et al., 2005) have instead focused on studying contagion or interdependence. In this thesis, an overall mechanism is proposed that simulates the characteristics of a crisis propagating through contagion. Within that scope, a new co-evolutionary market model is developed, in which some of the technical traders change their behaviour during a crisis and transform into herd traders, making their decisions based on market sentiment rather than on underlying strategies or factors. The thesis focuses on the transformation of market interdependence into contagion and on the contagion effects. The author first builds a multi-national platform that allows different types of players to trade, implementing their own rules and considering information from the domestic and a foreign market. Traders' strategies and the performance of the simulated domestic market are trained using historical prices on both markets, optimizing the artificial market's parameters through immune particle swarm optimization (I-PSO). The author also introduces a mechanism contributing to the transformation of technical traders into herd traders. A generalized autoregressive conditional heteroscedasticity copula (GARCH-copula) model is further applied to calculate the tail dependence between the affected market and the origin of the crisis; that parameter is used in the fitness function for selecting the best solutions within the evolving population of possible model parameters, and therefore in the optimization criteria for contagion simulation.
The overall model is also applied in predictive mode: the author optimizes in the pre-crisis period using data from the domestic market and the crisis-origin foreign market, and predicts in the crisis period using data from the foreign market, forecasting the affected domestic market.
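The tail-dependence quantity central to the thesis has a simple empirical counterpart: the probability that one market is in its lower tail given that the other is. The sketch below is a crude nonparametric stand-in for the GARCH-copula estimate (synthetic data, not the thesis's model):

```python
import numpy as np

def lower_tail_dependence(x, y, q=0.05):
    """Empirical lower-tail dependence: P(y below its q-quantile | x below its q-quantile)."""
    xq, yq = np.quantile(x, q), np.quantile(y, q)
    below_x = x <= xq
    joint = np.logical_and(below_x, y <= yq).sum()
    return joint / below_x.sum()

# Two strongly comoving synthetic "markets" sharing a common shock.
rng = np.random.default_rng(1)
z = rng.standard_normal(20000)
x = z + 0.1 * rng.standard_normal(20000)
y = z + 0.1 * rng.standard_normal(20000)
lam = lower_tail_dependence(x, y, q=0.05)   # close to 1 here; ~q if independent
```

In the thesis's setup, a parameter of this kind (estimated from the copula rather than empirically) enters the fitness function, so candidate market parameters that reproduce the observed crisis comovement are preferred.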
Disrupting Finance: FinTech and Strategy in the 21st Century
This open access Pivot demonstrates how a variety of technologies act as innovation catalysts within the banking and financial services sector. Traditional banks and financial services are under increasing competition from global IT companies such as Google, Apple, Amazon and PayPal whilst facing pressure from investors to reduce costs, increase agility and improve customer retention. Technologies such as blockchain, cloud computing, mobile technologies, big data analytics and social media therefore have perhaps more potential in this industry and area of business than any other. This book defines a fintech ecosystem for the 21st century, providing a state-of-the art review of current literature, suggesting avenues for new research and offering perspectives from business, technology and industry
Volatility modeling and limit-order book analytics with high-frequency data
The vast amount of information characterizing today's high-frequency financial datasets poses both opportunities and challenges. Among the opportunities, existing methods can be employed to provide new insights into, and a better understanding of, the market's complexity from different perspectives, while new methods can be devised that fully exploit all the information embedded in high-frequency datasets and address new issues. The challenges are driven by data complexity: limit-order book datasets consist of hundreds of thousands of events that interact with each other and affect the event-flow dynamics.
This dissertation aims at improving our understanding of the effective applicability of machine learning methods for mid-price movement prediction, of the nature of long-range autocorrelations in financial time series, and of the econometric modeling and forecasting of volatility dynamics in high-frequency settings. Our results show that simple machine learning methods can be successfully employed for mid-price forecasting; moreover, by adopting methods that rely on the natural tensor representation of financial time series, the inter-temporal connections captured by this convenient representation are shown to be relevant for predicting future mid-price movements. Furthermore, by using ultra-high-frequency order book data over a considerably long period, a quantitative characterization of the long-range autocorrelation is achieved by extracting the so-called scaling exponent. By jointly considering duration series of both inter- and cross-events, for different stocks, and separately for the bid and ask sides, long-range autocorrelations are found to be ubiquitous and qualitatively homogeneous. With respect to the scaling exponent, evidence of three cross-overs is found, and complex heterogeneous associations with a number of relevant economic variables are discussed. Lastly, the use of copulas as the main ingredient for modeling and forecasting realized measures of volatility is explored. The modeling background resembles, but generalizes, the well-known Heterogeneous Autoregressive (HAR) model. In-sample and out-of-sample analyses, based on several performance measures, statistical tests and robustness checks, show forecasting improvements of copula-based modeling over the HAR benchmark.
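The HAR benchmark mentioned above regresses next-period realized volatility on its daily, weekly (5-day average) and monthly (22-day average) lags. A minimal sketch on synthetic persistent data (illustrative only, not the dissertation's dataset or its copula extension):

```python
import numpy as np

def har_features(rv):
    """Build HAR regressors: intercept, daily, weekly (5-day) and monthly (22-day) lags."""
    rows, y = [], []
    for t in range(22, len(rv)):
        rows.append([1.0,
                     rv[t - 1],              # daily component
                     rv[t - 5:t].mean(),     # weekly component
                     rv[t - 22:t].mean()])   # monthly component
        y.append(rv[t])
    return np.array(rows), np.array(y)

# Synthetic persistent realized-volatility series.
rng = np.random.default_rng(0)
rv = np.empty(500)
rv[0] = 1.0
for t in range(1, 500):
    rv[t] = 0.1 + 0.9 * rv[t - 1] + 0.05 * rng.standard_normal()

X, y = har_features(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit of the HAR model
forecast = X[-1] @ beta                       # one-step-ahead HAR forecast
```

The copula-based models studied in the dissertation keep this cascade of daily, weekly and monthly components but replace the linear-Gaussian link with a copula-driven dependence structure.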