774 research outputs found

    The Effects of International F/X Markets on Domestic Currencies Using Wavelet Networks: Evidence from Emerging Markets

    This paper proposes wavelet networks as a powerful methodology for investigating the effects of international F/X markets on emerging-market currencies. We use the EUR/USD parity as the input indicator (international F/X markets) and three emerging-market currencies, the Brazilian Real, the Turkish Lira and the Russian Ruble, as output indicators (emerging-market currencies). We test whether the effects of international F/X markets change across timescales. Using wavelet networks, we show that the effects of international F/X markets increase at higher timescales. This evidence indicates that the causality of international F/X markets on emerging markets should be tested at the 64-128 day scale. We also find that the effect of the EUR/USD parity on the Turkish Lira is higher at the 17-32 day and 65-128 day scales; this indicates that the Turkish Lira is less stable than the other emerging-market currencies, since international F/X markets affect the Turkish Lira at shorter timescales.
    Keywords: F/X markets; Emerging markets; Wavelet networks; Wavelets; Neural networks
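    The abstract's scale-by-scale reading of F/X co-movement can be illustrated with an ordinary discrete wavelet decomposition. The sketch below (a simplification, not the authors' wavelet-network model) decomposes two synthetic return series with PyWavelets and reports their correlation at each timescale; the series names and data are illustrative assumptions.

```python
# A minimal sketch of scale-by-scale F/X analysis with a discrete wavelet
# transform (PyWavelets); NOT the paper's wavelet-network model, and the
# synthetic series stand in for real EUR/USD and emerging-market returns.
import numpy as np
import pywt

rng = np.random.default_rng(0)
n = 1024
eurusd = rng.normal(size=n)                              # stand-in for EUR/USD daily returns
usdtry = 0.6 * eurusd + rng.normal(scale=0.8, size=n)    # stand-in emerging-market pair

levels = 5
dec_eur = pywt.wavedec(eurusd, 'db4', level=levels)
dec_try = pywt.wavedec(usdtry, 'db4', level=levels)

# wavedec returns [approximation, coarsest detail, ..., finest detail];
# detail level j captures fluctuations of roughly 2**j to 2**(j+1) days.
for k, (d_eur, d_try) in enumerate(zip(dec_eur[1:], dec_try[1:]), start=1):
    j = levels - k + 1
    corr = np.corrcoef(d_eur, d_try)[0, 1]
    print(f"scale ~{2**j}-{2**(j+1)} days: correlation = {corr:+.3f}")
```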

    Recurrent backpropagation and the dynamical approach to adaptive neural computation

    Error backpropagation in feedforward neural network models is a popular learning algorithm that has its roots in nonlinear estimation and optimization. It is routinely used to calculate error gradients in nonlinear systems with hundreds of thousands of parameters. However, the classical architecture for backpropagation has severe restrictions. The extension of backpropagation to networks with recurrent connections is reviewed. It is now possible to efficiently compute the error gradients for networks that have temporal dynamics, which opens applications to a host of problems in system identification and control.
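    As a concrete illustration of computing error gradients through temporal dynamics, the sketch below runs backpropagation through time on a tiny vanilla recurrent network in NumPy. The network size, loss, and data are illustrative assumptions, not the architecture discussed in the review.

```python
# A minimal NumPy sketch of backpropagation through time (BPTT) for a small
# recurrent net; illustrative only -- the network, loss, and data are assumed.
import numpy as np

rng = np.random.default_rng(1)
T, n_in, n_h = 8, 3, 4
W_in = rng.normal(scale=0.3, size=(n_h, n_in))
W_rec = rng.normal(scale=0.3, size=(n_h, n_h))
w_out = rng.normal(scale=0.3, size=n_h)
x = rng.normal(size=(T, n_in))
target = 1.0

# Forward pass: h_t = tanh(W_in x_t + W_rec h_{t-1}), y = w_out . h_T
h = np.zeros((T + 1, n_h))
for t in range(T):
    h[t + 1] = np.tanh(W_in @ x[t] + W_rec @ h[t])
y = w_out @ h[T]
loss = 0.5 * (y - target) ** 2

# Backward pass: propagate dL/dh_t backwards through time, accumulating
# weight gradients at every step.
grad_W_in = np.zeros_like(W_in)
grad_W_rec = np.zeros_like(W_rec)
grad_w_out = (y - target) * h[T]
dh = (y - target) * w_out                  # dL/dh_T
for t in reversed(range(T)):
    da = dh * (1.0 - h[t + 1] ** 2)        # backprop through tanh
    grad_W_in += np.outer(da, x[t])
    grad_W_rec += np.outer(da, h[t])
    dh = W_rec.T @ da                      # gradient flowing to h_{t-1}

print("loss:", loss, "||grad W_rec||:", np.linalg.norm(grad_W_rec))
```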

    Non-Direct Encoding Method Based on Cellular Automata to Design Neural Network Architectures

    Architecture design is a fundamental step in the successful application of feedforward neural networks. In most cases a large number of neural network architectures suitable for solving a problem exist, and architecture design is, unfortunately, still a human expert's job. It depends heavily on the expert and on a tedious trial-and-error process. In recent years, many works have focused on the automatic design of neural network architectures. Most of the methods are based on evolutionary computation paradigms. Some of the proposed methods are based on direct representations of the parameters of the network. These representations do not scale well: representing large architectures requires very large structures. More interesting alternatives are indirect schemes, which codify a compact representation of the neural network. In this work, an indirect constructive encoding scheme is proposed. The scheme is based on cellular automata representations and is inspired by the idea that only a few seeds for the initial configuration of a cellular automaton can produce a wide variety of feedforward neural network architectures. The cellular approach is experimentally validated in different domains and compared with a direct codification scheme.
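    The core idea, that a handful of seed cells can expand into a full connectivity pattern, can be sketched with a toy cellular automaton. The example below grows an elementary 1D automaton from three seed positions and reads the final row as a mask over candidate hidden units; the rule, grid size, and mapping are assumptions for illustration, not the encoding proposed in the paper.

```python
# A toy indirect encoding: a few seed positions (the compact genotype) are
# expanded by a 1D cellular automaton into a binary mask selecting hidden
# units of a feedforward net. Illustrative only; not the paper's scheme.
import numpy as np

def run_ca(seeds, width=32, steps=16, rule=30):
    """Evolve an elementary CA for `steps` iterations from the seed columns."""
    rule_bits = [(rule >> i) & 1 for i in range(8)]
    row = np.zeros(width, dtype=int)
    row[list(seeds)] = 1
    for _ in range(steps):
        left, right = np.roll(row, 1), np.roll(row, -1)
        neighbourhood = 4 * left + 2 * row + right     # value 0..7 per cell
        row = np.array([rule_bits[v] for v in neighbourhood])
    return row

genotype = (3, 11, 20)                 # compact representation: three seeds
hidden_mask = run_ca(genotype)         # which of 32 candidate hidden units survive
print("hidden units kept:", int(hidden_mask.sum()), "of", hidden_mask.size)
```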

    Generative capacities of cellular automata codification for evolution of NN codification

    Proceedings of: International Conference on Artificial Neural Networks, ICANN 2002, Madrid, Spain, August 28-30, 2002.
    Automatic methods for designing artificial neural nets are desired to avoid the laborious and error-prone job of the human expert. Evolutionary computation has been used as a search technique to find appropriate NN architectures. Direct and indirect encoding methods are used to codify the net architecture into the chromosome. A reformulation of an indirect encoding method, based on two bi-dimensional cellular automata, and its generative capacity are presented.
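    A rough sketch of how two bi-dimensional automata could jointly define a connection matrix is given below: one automaton grows connectivity outward from seed cells, a second erases cells, and the surviving grid is read as an input-to-hidden connection matrix. The growth and decay rules and their combination are illustrative assumptions, not the rules used in the paper.

```python
# Two 2D cellular automata as an indirect encoding sketch: a growing automaton
# and a decreasing automaton, combined into one connection matrix. The rules
# and the AND-NOT combination are assumptions for illustration only.
import numpy as np

def grow(grid, steps):
    """Expand live cells into their 4-neighbourhood for a number of steps."""
    for _ in range(steps):
        grid = (grid | np.roll(grid, 1, 0) | np.roll(grid, -1, 0)
                     | np.roll(grid, 1, 1) | np.roll(grid, -1, 1))
    return grid

shape = (8, 6)                                   # 8 inputs x 6 hidden units
growing = np.zeros(shape, dtype=bool)
decreasing = np.zeros(shape, dtype=bool)
growing[2, 1] = growing[5, 4] = True             # growth seeds (genotype)
decreasing[0, 0] = True                          # decay seed (genotype)

connections = grow(growing, 3) & ~grow(decreasing, 1)
print(connections.astype(int))
```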