Saccade Velocity Driven Oscillatory Network Model of Grid Cells
Grid cells and place cells are believed to be the cellular substrates of the spatial navigation functions of the hippocampus, as established in experiments where animals physically navigate 2D and 3D spaces. However, a recent saccade study on head-fixed monkeys has also reported grid-like representations along saccadic trajectories while the animals scanned images on a computer screen. We present two computational models that explain the formation of grid patterns along saccadic trajectories over novel images. The first model, named Saccade Velocity Driven Oscillatory Network - Direct PCA (SVDON-DPCA), explains how grid patterns can be generated in saccadic space using a Principal Component Analysis (PCA)-like learning rule. The model adopts a hierarchical architecture. We extend this to a network model, the Saccade Velocity Driven Oscillatory Network - Network PCA (SVDON-NPCA), in which the direct PCA stage is replaced by a neural network that implements PCA using a neurally plausible algorithm. This gives the leverage to study the formation of grid cells at the network level. The saccade trajectories for both models are generated by an attention model that attends to salient locations by computing saliency maps of the images. Both models capture spatial characteristics of grid cells, such as the variation of grid scale along the dorso-ventral axis of the medial entorhinal cortex. Adding one more layer of LAHN on top of the SVDON-NPCA model predicts place cells in saccadic space, which are yet to be discovered experimentally. To the best of our knowledge, this is the first attempt to model grid cells and place cells from saccade trajectories.
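The neurally plausible PCA stage described above can be illustrated with the generalized Hebbian (Sanger's) rule, a standard online algorithm by which a network extracts principal components without an explicit eigendecomposition. This is a generic sketch under that assumption, not the paper's actual SVDON-NPCA implementation; all names and parameters are illustrative:

```python
import numpy as np

def sanger_pca(X, n_components, lr=0.01, epochs=50, seed=0):
    """Extract principal components online with Sanger's rule.

    Each output neuron converges to one principal component of the
    input stream -- a neurally plausible alternative to direct PCA.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_components, n_features))
    for _ in range(epochs):
        for x in X:
            y = W @ x
            # Hebbian term minus deflation by earlier (and own) components
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W
```

After training on a stream of velocity-like input vectors, the first row of `W` aligns with the leading principal component of the data.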
Bio-inspired relevant interaction modelling in cognitive crowd management
Cognitive algorithms integrated into intelligent systems represent an important innovation in the design of interactive smart environments. More specifically, cognitive systems have important applications in anomaly detection and management in advanced video surveillance. These algorithms mainly address the problem of modelling interactions and behaviours among the main entities in a scene. A bio-inspired structure is proposed here which is able to encode and synthesize signals, not only to describe the behaviour of single entities, but also to model cause-effect relationships between user actions and changes in the environment configuration. Such models are stored within a memory (the Autobiographical Memory) during a learning phase. Here the system performs an effective knowledge transfer from a human operator to an automatic system called the Cognitive Surveillance Node (CSN), which is part of a complex cognitive, JDL-based, bio-inspired architecture. After this knowledge-transfer phase, the learned representations can be used, at different levels, either to support human decisions, by detecting anomalous interaction models and thus compensating for human shortcomings, or, in an automatic decision scenario, to identify anomalous patterns and choose the best strategy to preserve the stability of the entire system. Results are presented in a video-surveillance scenario where the CSN observes two interacting entities: a simulated crowd and a human operator. These interact within a 3D visual simulator in which crowd behaviour is modelled by means of Social Forces. The way anomalies are detected and consequently handled is demonstrated, on synthetic as well as real video sequences, in both the user-support and automatic modes.
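The Social Force crowd model referenced above (in the Helbing tradition) can be sketched in a few lines: each agent is driven toward a goal and repelled exponentially by nearby agents. The function, parameter values and force terms below are simplified illustrations, not the simulator actually used in this work:

```python
import numpy as np

def social_force_step(pos, vel, goals, dt=0.1, tau=0.5, v0=1.3,
                      A=2.0, B=0.3):
    """One Euler step of a minimal social-force crowd model.

    pos, vel, goals: (N, 2) arrays of positions, velocities, targets.
    tau: relaxation time; v0: desired speed; A, B: repulsion scale/range.
    """
    # Driving force: relax each agent's velocity toward its desired velocity
    desired_dir = goals - pos
    desired_dir /= np.linalg.norm(desired_dir, axis=1, keepdims=True)
    force = (v0 * desired_dir - vel) / tau
    # Pairwise repulsive interaction, decaying exponentially with distance
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i == j:
                continue
            d = pos[i] - pos[j]
            dist = np.linalg.norm(d)
            force[i] += A * np.exp(-dist / B) * d / dist
    vel = vel + dt * force
    pos = pos + dt * vel
    return pos, vel
```

Iterating this step produces trajectories whose aggregate statistics (speeds, densities, interaction patterns) are the kind of signal a surveillance node can learn normal models from.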
Quantitative Analyses on Non-Linearities in Financial Markets
"The brief market plunge was just a small indicator of how complex and chaotic, in the formal sense, these systems have become. Our financial system is so complicated and so interactive [...]. What happened in the stock market is just a little example of how things can cascade or how technology can interact with market panic" (Ben Bernanke, IHT, May 17, 2010)
One of the most important issues in economics is modeling and forecasting the fluctuations that characterize both financial and real markets, such as interest rates, commodity and stock prices, output growth, unemployment, or exchange rates. There are mainly two opposite views concerning these economic fluctuations. According to the first one, which was the predominant thought in the 1930s, the economic system is mainly linear and stable, only randomly hit by exogenous shocks. Ragnar Frisch, Eugen Slutsky and Jan Tinbergen, to cite a few, are important exponents of this view, and they demonstrated that the fluctuations observed in the real business cycle may be produced in a stable linear system subject to an external sequence of random shocks. This view has been criticized from the 1940s and 1950s onwards, since it was not able to provide a strong economic explanation of the observed fluctuations. Richard Goodwin, John Hicks and Nicholas Kaldor introduced a nonlinear view of the economy, showing that fluctuations might arise even in the absence of external shocks. Economists then suggested an alternative within the exogenous approach, at first through stochastic real business cycle models (Finn E. Kydland and Edward C. Prescott, 1982) and, more recently, through the New Keynesian Dynamic Stochastic General Equilibrium (DSGE) models, widely adopted by the most important institutions and central banks. These models, however, have also been criticized for the assumption of rationality in agents' behaviour, since rational expectations have been found to be systematically wrong over the business cycle. Expectations are of fundamental importance in economics and finance, since agents' decisions about the future depend upon their expectations and beliefs. It is in fact very unlikely that agents have perfect foresight and rational expectations in a complex world characterized by an irregular pattern of prices and quantities dealt in financial markets, in which sophisticated financial instruments are widespread.
In the first chapter of this dissertation, I will discuss machine learning techniques, nonlinear tools used for better fitting, forecasting and clustering of different financial time series and of the information existing in financial markets. In particular, I will present a collection of three different applications of these techniques, adapted from three different joint works:
"Yield curve estimation under extreme conditions: do RBF networks perform better?", joint with Pier Giuseppe Giribone, Marco Neelli and Marina Resta, published in Anna Esposito, Marcos Faundez-Zanuy, Carlo Francesco Morabito, Eros Pasero (Eds.), Multidisciplinary Approaches to Neural Computing, Vol. 69, WIRN 2017, and as Chapter 22 of the book "Neural Advances in Processing Nonlinear Dynamic Signals", Springer;
"Interest rates term structure models and their impact on actuarial forecasting", joint with Pier Giuseppe Giribone and Marina Resta, presented at the XVIII Quantitative Finance Workshop, University of Roma 3, January 2018;
"Applications of Kohonen Maps in financial markets: design of an automatic system for the detection of pricing anomalies", joint with Pier Giuseppe Giribone and published in Risk Management Magazine, 3-2017.
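The first of these applications relies on radial basis function (RBF) networks for yield-curve fitting. A minimal sketch of a Gaussian RBF network fitted by ridge-regularized least squares is given below; it is illustrative only, and does not reproduce the cited paper's actual design, data or parameters:

```python
import numpy as np

def rbf_fit(maturities, yields, centers, width=2.0, ridge=1e-8):
    """Fit a Gaussian RBF network to observed yields; returns output weights."""
    Phi = np.exp(-((maturities[:, None] - centers[None, :]) / width) ** 2)
    # Ridge-regularized normal equations for the linear output layer
    w = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(len(centers)),
                        Phi.T @ yields)
    return w

def rbf_eval(t, centers, w, width=2.0):
    """Evaluate the fitted RBF curve at maturities t."""
    Phi = np.exp(-((np.atleast_1d(t)[:, None] - centers[None, :]) / width) ** 2)
    return Phi @ w
```

Placing one center at each observed maturity gives near-interpolation of the quoted yields, while the ridge term keeps the weights stable when neighbouring maturities are close.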
In the second chapter, I will present the study "A financial market model with confirmation bias", in which nonlinearity arises from the formation of heterogeneous expectations. This work is joint with Fabio Tramontana and has been presented at the X MDEF (Dynamic Models in Economics and Finance) Workshop at the University of Urbino Carlo Bo.
Finally, the third chapter is a reworking of another joint paper, "The effects of negative nominal risk rates on the pricing of American Calls: some theoretical and numerical insights", with Pier Giuseppe Giribone and Marina Resta, published in Modern Economy 8(7), July 2017, pp. 878-887. The problem of quantifying the value of early exercise in an option written on equity is a complex mathematical issue that deals with continuous optimal control. In order to solve the continuous dynamic optimization problem, which involves high nonlinearity in the state variables, we have adopted a discretization scheme based on a stochastic trinomial tree. This methodology reveals higher reliability and flexibility than the traditional approaches based on approximate quasi-closed formulas in a context where financial markets are characterized by strong anomalies such as negative interest rates - …
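A trinomial-tree discretization of this kind can be sketched as follows: backward induction over a log-price lattice, with an early-exercise check at every node, and no sign restriction on the rate. This is a generic Boyle-style lattice, not the exact scheme of the cited paper; all parameter names are illustrative:

```python
import numpy as np

def american_call_trinomial(S0, K, r, sigma, T, steps=200):
    """Price an American call on a trinomial tree.

    Also runs with negative rates r, where early exercise of a call
    can become optimal even without dividends.
    """
    dt = T / steps
    dx = sigma * np.sqrt(3.0 * dt)          # log-price step
    nu = r - 0.5 * sigma ** 2               # drift of the log price
    # Risk-neutral branch probabilities for up / middle / down moves
    pu = 0.5 * ((sigma ** 2 * dt + nu ** 2 * dt ** 2) / dx ** 2 + nu * dt / dx)
    pd = 0.5 * ((sigma ** 2 * dt + nu ** 2 * dt ** 2) / dx ** 2 - nu * dt / dx)
    pm = 1.0 - pu - pd
    disc = np.exp(-r * dt)
    # Terminal asset prices and payoffs on the widest slice of the tree
    j = np.arange(-steps, steps + 1)
    S = S0 * np.exp(j * dx)
    V = np.maximum(S - K, 0.0)
    # Backward induction with an early-exercise check at every node
    for _ in range(steps):
        V = disc * (pu * V[2:] + pm * V[1:-1] + pd * V[:-2])
        S = S[1:-1]
        V = np.maximum(V, S - K)            # American feature
    return V[0]
```

With a positive rate and no dividends the result collapses onto the European Black-Scholes value, which gives a convenient sanity check; the same code then prices the negative-rate case, where closed-form shortcuts are unreliable.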