Entropy and Efficiency of the ETF Market
We investigate the relative information efficiency of financial markets by measuring the entropy of the time series of high-frequency data. Our tool to measure efficiency is the Shannon entropy, applied to 2-symbol and 3-symbol discretisations of the data. Analysing 1-min and 5-min price time series of 55 Exchange Traded Funds traded at the New York Stock Exchange, we develop a methodology to isolate residual inefficiencies from other sources of regularity, such as the intraday pattern, volatility clustering and microstructure effects. The first two are modelled as multiplicative factors, while the microstructure is modelled as an ARMA noise process. Following a combined analytical and empirical approach, we find a strong relationship between low entropy and high relative tick size, and that volatility is responsible for the largest amount of regularity, averaging 62% of the total against 18% for the intraday pattern and 20% for the microstructure.
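As a minimal sketch of the entropy measurement described above (the block length and the synthetic return series are illustrative assumptions, not the paper's exact settings), a 2-symbol discretisation maps each return to 0 or 1, and the Shannon entropy rate is then estimated from empirical frequencies of overlapping symbol blocks:

```python
import math
import random

def entropy_rate(symbols, block_len=3):
    """Plug-in estimate of the Shannon entropy per symbol (in bits)
    from empirical frequencies of overlapping blocks."""
    counts = {}
    n_blocks = len(symbols) - block_len + 1
    for i in range(n_blocks):
        block = tuple(symbols[i:i + block_len])
        counts[block] = counts.get(block, 0) + 1
    h_block = -sum((c / n_blocks) * math.log2(c / n_blocks)
                   for c in counts.values())
    return h_block / block_len

# 2-symbol discretisation: 1 for an up move, 0 otherwise
random.seed(0)
returns = [random.gauss(0.0, 1.0) for _ in range(50_000)]
symbols = [1 if r > 0 else 0 for r in returns]

print(entropy_rate(symbols))       # near 1 bit: no detectable regularity
print(entropy_rate([1] * 1_000))   # a perfectly regular series scores 0
```

The paper additionally filters out the intraday pattern, volatility clustering and microstructure effects before interpreting any entropy deficit as inefficiency; those corrections are omitted here.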
Market inefficiency identified by both single and multiple currency trends
Many studies have shown that there are good reasons to claim very low predictability of currency markets; nevertheless, deviations from true randomness exist which have potential predictive and prognostic power [J. James, Quantitative Finance 3 (2003) C75-C77]. We analyze the local trends which are the main focus of technical analysis. In this article we introduce various statistical quantities examining the role of a single temporally discretized trend, or of a multitude of trends corresponding to different time delays. Our specific analysis, based on Euro-Dollar currency pair data at one-minute frequency, suggests the importance of the cumulative non-random effect of trends on forecasting performance.
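One simple example of such a single-trend statistic (a hypothetical illustration, not one of the paper's exact quantities) is the empirical probability that the next one-minute move continues a run of k same-sign moves; a value near 0.5 indicates no predictive power:

```python
import random

def trend_continuation_prob(moves, k):
    """Empirical P(next move has the same sign as an immediately
    preceding run of k same-sign moves)."""
    cont = total = 0
    for i in range(k, len(moves)):
        window = moves[i - k:i]
        if all(m > 0 for m in window) or all(m < 0 for m in window):
            total += 1
            if moves[i] * window[-1] > 0:
                cont += 1
    return cont / total if total else float("nan")

# On i.i.d. signed moves the statistic should sit near 0.5
random.seed(1)
moves = [random.choice([-1, 1]) for _ in range(100_000)]
print(trend_continuation_prob(moves, k=2))
```

Comparing the same statistic across several trend lengths and time delays, as the abstract describes, would reveal whether the deviations accumulate into usable forecasting power.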
New procedures for testing whether stock price processes are martingales
We propose procedures for testing whether stock price processes are martingales based on limit-order-type betting strategies. We first show that the null hypothesis of the martingale property of a stock price process can be tested based on the capital process of a betting strategy. In particular, with high-frequency Markov-type strategies, we find that the martingale null hypothesis is rejected for many stock price processes.
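A minimal sketch of the capital-process idea (the fixed-fraction momentum strategy and the persistence parameter below are illustrative assumptions, not the paper's limit-order strategies): under the martingale null the strategy's capital is itself a martingale started at 1, so by Ville's inequality the probability that it ever reaches c is at most 1/c, and a very large final capital is evidence against the null:

```python
import random

def capital_process(moves, fraction=0.1):
    """Capital path of a strategy betting a fixed fraction of current
    capital that the next move repeats the sign of the previous one."""
    capital = 1.0
    path = [capital]
    for prev, nxt in zip(moves, moves[1:]):
        bet = fraction * capital * (1 if prev > 0 else -1)
        capital += bet * nxt  # gain the stake if signs match, lose it otherwise
        path.append(capital)
    return path

def markov_moves(n, persistence, rng):
    """+/-1 moves repeating the previous sign with the given probability."""
    m = [rng.choice([-1, 1])]
    for _ in range(n - 1):
        m.append(m[-1] if rng.random() < persistence else -m[-1])
    return m

rng = random.Random(42)
trending = capital_process(markov_moves(10_000, 0.7, rng))[-1]
fair = capital_process(markov_moves(10_000, 0.5, rng))[-1]
print(trending, fair)  # capital explodes on the persistent series only
```

The betting fraction is kept below 1 so the capital stays non-negative, mirroring the no-bankruptcy requirement of game-theoretic probability.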
Thermal error modelling of a gantry-type 5-axis machine tool using a Grey Neural Network Model
This paper presents a new modelling methodology for compensation of the thermal errors on a gantry-type 5-axis CNC machine tool. The method uses a “Grey Neural Network Model with Convolution Integral” (GNNMCI(1, N)), which makes full use of the similarities and complementarity between Grey system models and artificial neural networks (ANNs) to overcome the disadvantage of applying either model in isolation. A Particle Swarm Optimisation (PSO) algorithm is also employed to optimise the proposed Grey neural network. The size of the data pairs is crucial when the generation of data is a costly affair, since the machine downtime necessary to acquire the data is often considered prohibitive. Under such circumstances, optimisation of the number of data pairs used for training is of prime concern for calibrating a physical model or training a black-box model. A Grey Accumulated Generating Operation (AGO), which is a basis of Grey system theory, is used to transform the original data into a monotonic series that has less randomness than the original series. The choice of inputs to the thermal model is a non-trivial decision which is ultimately a compromise between the ability to obtain data that sufficiently correlates with the thermal distortion and the cost of implementing the necessary feedback sensors. In this study, temperature measurement at key locations was supplemented by direct distortion measurement at accessible locations. This form of data fusion simplifies the modelling process, enhances the accuracy of the system and reduces the overall number of inputs to the model, since otherwise a much larger number of thermal sensors would be required to cover the entire structure. The Z-axis heating test, the C-axis heating test, and the combined (helical) movement are considered in this work. The compensation values, calculated by the GNNMCI(1, N) model, were sent to the controller for live error compensation.
Test results show that an 85% reduction in thermal errors was achieved after compensation.
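The AGO step mentioned above is, at its core, a running cumulative sum; a minimal sketch (the full GNNMCI(1, N) model, PSO training and compensation loop are beyond this illustration), together with its inverse, first differencing, which recovers the original series:

```python
def ago(series):
    """Grey Accumulated Generating Operation: the running cumulative sum.
    For non-negative data the result is monotonic and smoother (less
    random) than the original series."""
    out, total = [], 0.0
    for x in series:
        total += x
        out.append(total)
    return out

def iago(series):
    """Inverse AGO: first differences recover the original series."""
    return [series[0]] + [b - a for a, b in zip(series, series[1:])]

noisy = [2.1, 1.8, 2.4, 1.9, 2.2]  # e.g. raw sensor readings (illustrative)
print(ago(noisy))        # monotonic accumulated series
print(iago(ago(noisy)))  # round-trips back to the input
```

Grey models such as GM(1, N) are then fitted to the smoothed accumulated series rather than to the raw data, which is what makes them usable with the small training sets the abstract describes.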
Double-degradable responsive self-assembled multivalent arrays: temporary nanoscale recognition between dendrons and DNA
This article reports self-assembling dendrons which bind DNA in a multivalent manner. The molecular design directly impacts on self-assembly, which subsequently controls the way these multivalent nanostructures bind DNA; this can be simulated by multiscale modelling. Incorporation of an S-S linkage between the multivalent hydrophilic dendron and the hydrophobic units responsible for self-assembly allows these structures to undergo triggered reductive cleavage, with dithiothreitol (DTT) inducing controlled breakdown, enabling the release of bound DNA. As such, the high-affinity self-assembled multivalent binding is temporary. Furthermore, because the multivalent dendrons are constructed from esters, a second slow degradation step causes further breakdown of these structures. This two-step double-degradation mechanism converts a large self-assembling unit with high affinity for DNA into small units with no measurable binding affinity, demonstrating the advantage of self-assembled multivalency (SAMul) in achieving highly responsive nanoscale binding of biological targets.