16 research outputs found
Size, efficiency, market power, and economies of scale in the African banking sector
Abstract
There is a growing body of evidence that interest rate spreads in Africa are higher for big banks than for small banks. One concern is that big banks might be using their market power to charge higher lending rates as they become larger, more efficient, and unchallenged. In contrast, several studies have found that when bank size increases beyond certain thresholds, diseconomies of scale set in and lead to inefficiency. In that case, too, we would expect to see widened interest margins. This study examines the connection between bank size and efficiency to understand whether that relationship is driven by the exploitation of market power or by economies of scale. Using a panel of 162 African banks for 2001–2011, we analyzed the empirical data with instrumental variables and fixed effects regressions, using overlapping and non-overlapping thresholds for bank size. We found two key results. First, bank size increases bank interest rate margins in an inverted U-shaped relationship. Second, neither market power nor economies of scale significantly increases or decreases interest rate margins. The main policy implication is that interest rate margins cannot be explained by either market power or economies of scale. Other implications are discussed.
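The inverted U-shaped size-margin relationship described above can be illustrated with a quadratic regression: a positive coefficient on size and a negative coefficient on its square imply margins that first rise and then fall as banks grow. Below is a minimal sketch on simulated data; the coefficients, noise level, and plain OLS (rather than the study's instrumental-variables fixed-effects panel) are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
size = rng.uniform(1, 10, n)          # hypothetical (log) bank size
# Simulated margins with a built-in inverted U: rise, peak, then fall.
margin = 2.0 + 1.5 * size - 0.12 * size**2 + rng.normal(0.0, 0.3, n)

# OLS of margin on size and size^2 (a fixed-effects panel would add
# bank dummies; omitted here for brevity).
X = np.column_stack([np.ones(n), size, size**2])
beta, *_ = np.linalg.lstsq(X, margin, rcond=None)
turning_point = -beta[1] / (2 * beta[2])   # size at which margins peak
print(beta, turning_point)
```

The sign pattern (positive linear term, negative quadratic term) and the implied turning point are what an inverted U-shaped nexus would look like empirically.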
Nonlinear feature extraction through manifold learning in an electronic tongue classification task
A nonlinear feature extraction approach based on manifold learning algorithms is developed to improve the classification accuracy of an electronic tongue sensor array. The developed signal processing methodology is composed of four stages: data unfolding, scaling, feature extraction, and classification. This study compares seven manifold learning algorithms: Isomap, Laplacian Eigenmaps, Locally Linear Embedding (LLE), modified LLE, Hessian LLE, Local Tangent Space Alignment (LTSA), and t-Distributed Stochastic Neighbor Embedding (t-SNE), to find the best classification accuracy in a multifrequency large-amplitude pulse voltammetry electronic tongue. A sensitivity study of the parameters of each manifold learning algorithm is also included. A data set of seven different aqueous matrices is used to validate the proposed data processing methodology, with leave-one-out cross-validation on 63 samples. The best accuracy (96.83%) was obtained when the methodology used Mean-Centered Group Scaling (MCGS) for data normalization, the t-SNE algorithm for feature extraction, and k-nearest neighbors (kNN) as the classifier.
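The scaling, feature extraction, and classification stages can be sketched with off-the-shelf tools. The sketch below uses standard scaling as a stand-in for MCGS and a generic dataset; the perplexity, k, and dataset are illustrative assumptions, not the study's settings. Note that t-SNE is transductive, so the whole set is embedded once before cross-validating the classifier:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler
from sklearn.manifold import TSNE
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

X, y = load_iris(return_X_y=True)          # generic stand-in dataset
X = StandardScaler().fit_transform(X)      # scaling stage (stand-in for MCGS)

# Feature extraction: embed the full set once (t-SNE cannot map new points).
emb = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)

# Classification: kNN scored with leave-one-out cross-validation.
acc = cross_val_score(KNeighborsClassifier(n_neighbors=3), emb, y,
                      cv=LeaveOneOut()).mean()
print(f"LOOCV accuracy: {acc:.3f}")
```

Swapping `TSNE` for `Isomap` or `LocallyLinearEmbedding` from `sklearn.manifold` reproduces the kind of algorithm comparison the study performs.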
Novel Modelling Strategies for High-frequency Stock Trading Data
Full electronic automation in stock exchanges has recently become popular, generating high-frequency intraday data and motivating the development of near real-time price forecasting methods. Machine learning algorithms are widely applied to mid-price stock predictions. How raw data are processed into inputs for prediction models (e.g., data thinning and feature engineering) can substantially affect the performance of the prediction methods, yet researchers rarely discuss this topic. This motivated us to propose three novel modelling strategies for processing raw data. We illustrate how our modelling strategies improve forecasting performance by analyzing high-frequency data on the Dow Jones 30 component stocks. In these experiments, our strategies often lead to statistically significant improvements in predictions. The three strategies improve the F1 scores of the SVM models by 0.056, 0.087, and 0.016, respectively.
Comment: 28 pages, 5 tables, 5 figures
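As a toy illustration of the workflow the abstract describes (engineer features from raw price data, then score an SVM direction classifier with F1), the sketch below builds lagged-return features from a synthetic price series with momentum. The series, lag count, and chronological split are assumptions for illustration, not the paper's three strategies:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import f1_score

rng = np.random.default_rng(1)
n = 1500
ret = np.zeros(n)
for t in range(1, n):                      # synthetic returns with momentum
    ret[t] = 0.6 * ret[t - 1] + rng.normal()

k = 5                                      # feature engineering: k lagged returns
X = np.column_stack([ret[i:n - k + i] for i in range(k)])
y = (ret[k:] > 0).astype(int)              # label: next-tick direction

split = 1000                               # chronological train/test split
clf = SVC(kernel="rbf").fit(X[:split], y[:split])
f1 = f1_score(y[split:], clf.predict(X[split:]))
print(f"held-out F1: {f1:.3f}")
```

Changing how `X` is constructed (thinning ticks, different lags, rolling summaries) while holding the model fixed is exactly the kind of raw-data-processing comparison the paper argues is neglected.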
Critical slowing down as an early warning signal for financial crises?
Financial crises have repeatedly been proposed as a potential application area in the recent literature on constructing early warning signals by identifying characteristics of critical slowing down in time series observations. To test this idea, we consider four historical financial crises (Black Monday 1987, the 1997 Asian Crisis, the 2000 Dot-com bubble burst, and the 2008 Financial Crisis) and investigate whether there is evidence for critical slowing down prior to these market collapses. We find statistical evidence for critical slowing down before Black Monday 1987, while the results are mixed or insignificant for the more recent financial crises.
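A standard critical-slowing-down indicator is lag-1 autocorrelation computed in a rolling window, which should drift upward as a system approaches a transition. The sketch below demonstrates this on a simulated AR(1) process whose coefficient rises toward 1; the process and window length are illustrative, not the study's market data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
phi = np.linspace(0.2, 0.95, n)        # AR coefficient drifting toward 1
x = np.zeros(n)
for t in range(1, n):                  # simulated series that slows down
    x[t] = phi[t] * x[t - 1] + rng.normal()

def lag1_autocorr(w):
    """Lag-1 autocorrelation of a window: the recovery-speed proxy."""
    return np.corrcoef(w[:-1], w[1:])[0, 1]

window = 300                           # rolling-window early warning indicator
ac = np.array([lag1_autocorr(x[t - window:t]) for t in range(window, n + 1)])
print(f"early: {ac[0]:.2f}, late: {ac[-1]:.2f}")
```

A sustained upward trend in `ac` before a collapse is the signature the study searches for in the pre-crisis data of each crash.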
Behavior monitoring methods for trade-based money laundering integrating macro and micro prudential regulation: a case from China
Trade-based money laundering, a form of money laundering that uses international trade as a cover, typically accompanies speculative capital movement, which is widely regarded as a key incentive behind financial market collapses. Unfortunately, preventing money laundering is difficult because it usually carries a plausible trade characterization. Supervision therefore aims to help regulators and financial institutions effectively monitor the behavior of micro entities in financial markets. The main purpose of this paper is to establish a monitoring method, combining accurate recognition and classified supervision of trade-based money laundering, by means of knowledge-driven multi-class classification algorithms integrated with macro and micro prudential regulation, so that the model can forecast the predicted class for the management areas of concern. Based on empirical data from China, we demonstrate the application and explain how the monitoring method can help improve management efficiency in the financial market.
First published online 8 May 201
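Classified supervision can be framed as a multi-class classification problem mapping trade-behavior features to supervision classes. The sketch below is a generic stand-in: synthetic features, three hypothetical classes (e.g. normal / watch-list / suspicious), and a random forest rather than the paper's knowledge-driven model:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for trade-behavior features and three hypothetical
# supervision classes; real inputs would be trade and payment attributes.
X, y = make_classification(n_samples=1000, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy: {acc:.3f}")
```

In the paper's setting, the predicted class would route an entity to the matching management area rather than simply score it.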
Dynamic Advisor-Based Ensemble (dynABE): Case study in stock trend prediction of critical metal companies
Stock trend prediction is a challenging task due to market noise, and machine learning techniques have recently been successful in coping with this challenge. In this research, we create a novel framework for stock prediction, Dynamic Advisor-Based Ensemble (dynABE). dynABE explores domain-specific areas based on the companies of interest, diversifies the feature set by creating different "advisors" that each handle a different area, follows an effective model ensemble procedure for each advisor, and combines the advisors in a second-level ensemble through an online update strategy we developed. dynABE is able to adapt robustly to price pattern changes of the market during the active trading period, without needing to retrain the entire model. We test dynABE on three cobalt-related companies, and it achieves a best-case misclassification error of 31.12% and an annualized absolute return of 359.55% with zero maximum drawdown. dynABE also consistently outperforms the baseline models of support vector machine, neural network, and random forest in all case studies.
Comment: This is the latest version published in PLOS ONE
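A second-level ensemble with online updates can be sketched as a multiplicative-weights scheme: each advisor's weight decays when it errs, so the combination adapts to regime changes without retraining. The simulated advisors, learning rate, and exponential update rule below are illustrative assumptions, not dynABE's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(3)
T, n_advisors = 500, 3
truth = rng.integers(0, 2, T)              # up/down outcomes
# Advisors with different fixed error rates; advisor 0 is the best.
err = np.array([0.2, 0.4, 0.45])
preds = np.array([np.where(rng.random(T) < e, 1 - truth, truth) for e in err])

w = np.ones(n_advisors)                    # second-level ensemble weights
eta = 0.3                                  # penalty rate (assumed)
correct = 0
for t in range(T):
    vote = int(np.round(w @ preds[:, t] / w.sum()))   # weighted vote
    correct += vote == truth[t]
    w *= np.exp(-eta * (preds[:, t] != truth[t]))     # online weight update
print(correct / T, np.round(w / w.sum(), 3))
```

After enough rounds the weight mass concentrates on the most reliable advisor, so the ensemble tracks whichever advisor currently suits the market regime.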