Assessing systemic risk due to fire sales spillover through maximum entropy network reconstruction
Assessing systemic risk in financial markets is of great importance but it
often requires data that are unavailable or available at a very low frequency.
For this reason, systemic risk assessment with partial information is
potentially very useful for regulators and other stakeholders. In this paper we
consider systemic risk due to fire sales spillover and portfolio rebalancing by
using the risk metrics defined by Greenwood et al. (2015). By using the Maximum
Entropy principle we propose a method to assess aggregated and single bank's
systemicness and vulnerability and to statistically test for a change in these
variables when only the size of each bank and the
capitalization of the investment assets are available. We prove the
effectiveness of our method on 2001-2013 quarterly data of US banks for which
portfolio composition is available. Comment: 36 pages, 6 figures; accepted in the Journal of Economic Dynamics and Control.
Score-driven generalized fitness model for sparse and weighted temporal networks
Temporal network data have recently received increasing attention due to the rich information content and valuable insight that appropriate modeling of links’ dynamics can unveil. While most of the literature on temporal network models focuses on binary graphs, each link of a real network is often associated with a weight, a positive number describing the intensity of the relation between the nodes. Here we propose a novel dynamical model for sparse and weighted temporal networks as a combination of an extension of the fitness model and of the score-driven framework. We consider a zero-augmented generalized linear model to handle the weights and an observation-driven approach to describe time-varying parameters. We propose a flexible approach where the existence probability of a link is independent of its expected weight. This fact represents a crucial difference with alternative specifications proposed in the recent literature, with relevant implications both for the model's flexibility and for the forecasting capability. Our approach also accommodates the network dynamics’ dependence on external variables. We present a link forecasting analysis of data describing the overnight exposures in the Euro interbank market and investigate whether the influence of EONIA rates on the interbank network dynamics has changed over time during the sovereign debt crisis. © 2022 Elsevier Inc. All rights reserved.
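The separation between link existence and conditional weight described in this abstract can be illustrated with a minimal zero-augmented sampling sketch; the exponential weight distribution and all names below are illustrative assumptions, not the paper's exact specification:

```python
import numpy as np

def sample_weighted_network(p, mu, rng=None):
    """Sample a sparse weighted adjacency matrix.

    p  : (n, n) link-existence probabilities (Bernoulli part)
    mu : (n, n) expected weights conditional on existence (here: exponential)

    Existence and conditional weight are parameterized separately, so a
    link can be likely but light, or rare but heavy.
    """
    rng = np.random.default_rng(rng)
    A = rng.random(p.shape) < p        # binary adjacency: which links exist
    W = rng.exponential(mu) * A        # draw a weight only where a link exists
    np.fill_diagonal(W, 0.0)           # no self-exposures
    return W
```

In a zero-augmented specification of this kind, changing p leaves the conditional weight distribution untouched, which is the independence property the abstract emphasizes.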
Modelling time-varying interactions in complex systems: the Score Driven Kinetic Ising Model
A common issue when analyzing real-world complex systems is that the interactions between their elements often change over time. Here we propose a new modeling approach for time-varying interactions, generalising the well-known Kinetic Ising Model, a minimal model of constant pairwise interactions which has found applications in several scientific disciplines. Keeping arbitrary choices of dynamics to a minimum and seeking information-theoretical optimality, the Score-Driven methodology allows one to extract from data, and interpret, the presence of temporal patterns describing time-varying interactions. We identify a parameter whose value at a given time can be directly associated with the local predictability of the dynamics, and we introduce a method to dynamically learn its value from the data, without specifying the system's dynamics parametrically. We extend our framework to disentangle different sources (e.g. endogenous vs exogenous) of predictability in real time, and show how our methodology applies to a variety of complex systems such as financial markets, temporal (social) networks, and neuronal populations.
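For reference, the constant-interaction Kinetic Ising Model that the paper generalizes can be sketched as follows (a hypothetical minimal implementation, not the authors' score-driven code):

```python
import numpy as np

def kinetic_ising_step(s, J, h, rng=None):
    """One parallel update of the Kinetic Ising Model.

    s : (n,) current spins in {-1, +1}
    J : (n, n) pairwise couplings, J[i, j] = influence of spin j on spin i
    h : (n,) external fields

    Each spin becomes +1 with logistic probability of its local field,
    independently given the previous configuration.
    """
    rng = np.random.default_rng(rng)
    field = J @ s + h                             # local field on each spin
    p_up = 1.0 / (1.0 + np.exp(-2.0 * field))     # P(s_i(t+1) = +1)
    return np.where(rng.random(len(s)) < p_up, 1, -1)
```

The score-driven extension discussed in the abstract would let J and h themselves evolve over time, driven by the model's own prediction errors.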
Network Sensitivity of Systemic Risk
A growing body of studies on systemic risk in financial markets has
emphasized the key importance of taking into consideration the complex
interconnections among financial institutions. Much effort has been put into
modeling the contagion dynamics of financial shocks and into assessing the
resilience of specific financial markets - either using real network data,
reconstruction techniques or simple toy networks. Here we address the more
general problem of how shock propagation dynamics depends on the topological
details of the underlying network. To this end we consider different realistic
network topologies, all consistent with balance sheets information obtained
from real data on financial institutions. In particular, we consider networks
of varying density and with different block structures, and diversify as well
in the details of the shock propagation dynamics. We confirm that the systemic
risk properties of a financial network are extremely sensitive to its network
features. Our results can aid in the design of regulatory policies to improve
the robustness of financial markets.
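As a toy illustration of the kind of shock-propagation dynamics studied here, one round of a linear, DebtRank-style distress iteration might look like this (the parameter names and the capped linear rule are illustrative assumptions, not the paper's exact dynamics):

```python
import numpy as np

def propagate_shock(L, equity, shock, n_rounds=10):
    """Iterate a linear distress dynamics on an interbank network.

    L      : (n, n) exposure matrix, L[i, j] = amount bank i lent to bank j
    equity : (n,) equity buffers
    shock  : (n,) initial fractional equity losses in [0, 1]

    In each round, a bank's relative equity loss is passed to its
    creditors in proportion to their exposure, capped at full default.
    """
    h = shock.astype(float).copy()
    for _ in range(n_rounds):
        # each creditor absorbs its debtors' losses relative to its own equity
        h = np.minimum(1.0, shock + (L @ h) / equity)
    return h
```

Varying the topology of L (density, block structure) while keeping balance sheets fixed is the sensitivity experiment the abstract describes.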
Statistical Mechanics of Complex Networks for Systemic Risk Reconstruction
Systemic risk concerns the stability of systems composed of different parts, and specifically
the prediction and prevention of systemic events. A systemic event is typically defined as a
phenomenon that emerges from the complex interactions of the constituents and compromises
the normal functioning of the system. In finance, in particular, the elementary constituents are
financial institutions, such as banks, and systemic risk pertains to the description and prevention
of collapses that involve large portions of the financial system. After the recent troubled years
for the global economy, in which two severe crises (the 2007 financial markets crisis and
the 2010 sovereign debt crisis) put the whole economic system in dramatic distress,
the vulnerability of banks to systemic events is now the main focus of a growing number of
investigations in the academic community, across different disciplines.
When a bank undergoes some sort of malfunctioning or distress, and its troubled situation
negatively affects other institutions, we say that distress propagates, and that the interaction
between the two institutions serves as a channel of contagion.
Networks are one of the main tools for the modelling and description of systemic risk in
financial systems, since they allow for a straightforward description of the different channels of
contagion.
The indirect channel of contagion can be described as a bipartite network, i.e. a network
whose set of nodes is sharply divided into two subsets, one containing only banks, the other
only assets. Banks can only be connected with assets, not with other banks, and
a link exists if the bank holds the asset in its portfolio, i.e. invests in that particular asset.
In order to quantify losses from indirect contagion, full knowledge of how the banks divide
their investments is needed. As quantifiers of systemic risk we consider the systemicness and
vulnerability of a bank, defined respectively as the total percentage loss induced on the system by
the distress of the bank and the total percentage loss experienced by the bank when the whole
system is in distress.
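A single spillover round in the spirit of Greenwood et al. (2015) can be sketched as follows; the function and variable names are illustrative, and the metrics used in the thesis include further terms:

```python
import numpy as np

def systemicness(W, equity, shock, impact, i):
    """Fraction of total system equity lost through the fire sales of
    bank i after an initial asset-price shock (one spillover round).

    W      : (n_banks, n_assets) dollar holdings matrix
    equity : (n_banks,) equity of each bank
    shock  : (n_assets,) initial asset returns (e.g. -0.1 for -10%)
    impact : (n_assets,) linear price impact per dollar sold
    i      : index of the distressed bank
    """
    assets = W.sum(axis=1)                 # total assets per bank
    leverage = assets / equity             # b = a / e
    direct = W[i] @ shock                  # bank i's direct dollar loss
    # bank i sells assets proportionally to restore its target leverage
    sold = leverage[i] * (-direct) * W[i] / assets[i]
    r2 = -impact * sold                    # second-round price returns
    return -(W @ r2).sum() / equity.sum()  # system-wide equity loss fraction
```

Vulnerability reverses the roles: it measures the loss suffered by one bank when all the others are forced to sell.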
An important part of our thesis is dedicated to the empirical analysis of a dataset that
we originally developed, collected from publicly available databases of the Federal Reserve, that describes the quarterly networks of US commercial banks’ exposures in the
period 2001-2015. Specifically, we compute, for each quarter, the systemicness and vulnerability
of each bank and the aggregate vulnerability of the system. From our network description of
the system, a clear relation emerges between the vulnerability of the system
to external shocks and the way in which the largest banks manage their investments,
as we show in Figure 1.
The central topic of the thesis is the problem of reconstructing systemic risk measures
from partial information on the network, e.g. knowing only the size of each bank and
the capitalization of each asset. Our main purpose is to develop efficient methods to
estimate systemic risk from partial information, without full knowledge of the bipartite
network.
A prolific analogy between statistical mechanics and the statistical inference of networks
has been proposed and exploited by physicists. Network ensembles can be defined, akin
to grand canonical ensembles in statistical mechanics, and they can be used to efficiently
reconstruct networks from the sole partial information available. Ensembles are endowed with
a probability mass function on a set of graphs that depends on a set of parameters, which in
turn need to be estimated numerically from the partial information available. Bank statistics
that are related to the network structure are estimated through their expected values over the
statistical ensemble. We implemented the numerical procedures needed to practically employ
ensembles of bipartite networks for systemic risk reconstruction, and applied them to the reconstruction of systemicness and indirect vulnerability.
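A minimal sketch of such an ensemble for a bipartite graph: hidden fitness parameters x_i (banks) and y_j (assets) are tuned so that the link probabilities p_ij = x_i y_j / (1 + x_i y_j) match target expected degrees (an illustrative fixed-point solver, not the estimation code developed in the thesis):

```python
import numpy as np

def fit_bipartite_ensemble(k_banks, k_assets, n_iter=5000, tol=1e-12):
    """Fit hidden fitnesses x (banks) and y (assets) so that the
    maximum-entropy link probabilities p_ij = x_i*y_j / (1 + x_i*y_j)
    reproduce the target expected degrees k_banks and k_assets."""
    x = k_banks / np.sqrt(k_banks.sum())
    y = k_assets / np.sqrt(k_assets.sum())
    for _ in range(n_iter):
        # standard fixed-point update: x_i = k_i / sum_j y_j/(1 + x_i y_j)
        x_new = k_banks / (y / (1.0 + np.outer(x, y))).sum(axis=1)
        y_new = k_assets / (x_new[:, None] / (1.0 + np.outer(x_new, y))).sum(axis=0)
        if max(np.abs(x_new - x).max(), np.abs(y_new - y).max()) < tol:
            x, y = x_new, y_new
            break
        x, y = x_new, y_new
    P = np.outer(x, y)
    return P / (1.0 + P)   # (n_banks, n_assets) matrix of link probabilities
```

Any network statistic can then be estimated as its expected value under the probabilities P, which is the reconstruction step described above.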
In order to test the reconstruction capability of our ensembles, we assume that the balance
sheet compositions of the banks are not known, estimate the measures of systemic risk through
network ensembles, and finally compare the real statistics computed for our dataset with the
values inferred from partial information. This ex-post comparison allows us to assess whether
our methods are appropriate when the network structure is actually unknown.
While network ensembles based on Bose-Einstein statistics are now a standard tool in
network science, we found them to be unfit for the reconstruction of systemic risk. Ensembles
better suited to systemic risk reconstruction are instead defined through
a bold extension to networks of the correct Boltzmann counting of states. We refer to the
resulting ensembles as Maxwell-Boltzmann (MB) ensembles.
Inspired by the analogy with statistical mechanics, we propose an extension of the widespread
Max-Ent approach to the definition of network ensembles, which we refer to as Minx-Ent.
Via Minx-Ent we define a family of ensembles that includes Bose-Einstein (BE) and MB
ensembles as limiting cases, and propose it as a new tool for the reconstruction of network statistics.
At the end of this thesis we propose a previously unexplored application of grand-canonical
ensembles to a subject not directly related to systemic risk, i.e. “filtering of complex
networks”. Complex networks that describe real systems are often extremely dense. A large
network with a high edge density may be hard to interpret or visualize with traditional tools
of network analysis. In addition, a large portion of the links might not be informative, or
might be subject to measurement errors. Hence it is sometimes useful to extract sub-networks that
contain only a portion of the original nodes and links. The reduction of “noisy” networks,
so as to retain only relevant information, is known as information filtering in complex
networks. We propose an original technique for the filtering of complex networks based on
grand-canonical ensembles.
A portion of this thesis has been the object of the paper Di Gangi, Lillo, and Pirino
(“Assessing systemic risk due to fire sales spillover through maximum entropy network reconstruction”), currently under review.
Recurrent Deep Neural Networks for Nucleosome Classification
Nucleosomes are the fundamental repeating unit of chromatin. A nucleosome is a complex of eight histone proteins around which approximately 147–150 DNA base pairs wrap. Several biological studies have clearly stated that the regulation of cell type-specific gene activities is influenced by nucleosome positioning. Bioinformatic studies have strengthened those results, showing evidence of sequence specificity in nucleosomes’ DNA fragments. In this work, we present a recurrent neural network that uses a nucleosome sequence feature representation for their classification. In particular, we implement an architecture which stacks convolutional and long short-term memory layers, with the main purpose of avoiding the feature extraction and selection steps. We have computed classifications using eight datasets from three organisms of growing genome complexity, from yeast to human. We have also studied the capability of the model trained on the most complex species to recognize nucleosomes of the other organisms.
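The sequence representation step mentioned above can be illustrated with a standard one-hot encoding of DNA, the usual input format for convolutional layers over genomic sequence (a generic sketch, not the paper's exact feature pipeline):

```python
import numpy as np

BASES = "ACGT"

def one_hot_dna(seq):
    """Encode a DNA string as a (len(seq), 4) one-hot matrix.

    Each row has a single 1 in the column of its base (A, C, G, T);
    unknown symbols such as 'N' map to an all-zero row.
    """
    idx = {b: i for i, b in enumerate(BASES)}
    out = np.zeros((len(seq), 4), dtype=np.float32)
    for pos, base in enumerate(seq.upper()):
        if base in idx:
            out[pos, idx[base]] = 1.0
    return out
```

A stack of matrices of this shape (sequence length × 4 channels) is what a convolutional front-end consumes before passing its feature maps to the recurrent layers.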
Polymorphism of cytochrome P450 (CYP) genes and response to chemotherapy in patients with colorectal cancer (CRC)
Background: Genes coding for the cytochrome P450 (CYP) enzyme
system implied in antineoplastic drug metabolism pathways are highly
polymorphic. This may influence both carcinogen metabolism and drug
pharmacodynamics modifying their therapeutic efficacy and side effects.
Methods: We investigated the influence of genetic polymorphisms of CYP
enzymes: rs1799853 (CYP2C9), rs35742686 (CYP2D), rs5030655
(CYP2D6/3), rs2740574 (CYP3A4/1), rs776746 (CYP3A5) on the
response to chemotherapy and clinical outcomes in a group of 56
patients affected by sporadic CRC, treated with the standard protocols. A
total of 44 patients were in complete remission after treatment, 12 had
persistence of the disease. Polymorphisms were typed using a
competitive allele specific PCR assay (KASPar), developed by
KBioscience. Statistical analyses were performed using the χ2 test with Yates
correction and Fisher's exact test. Significance was defined as p values
<0.05.
Results: No significant genetic contribution was observed for 4 of the 5
SNPs tested. A significantly different genetic distribution between patients in
complete remission after treatment and symptomatic patients was
observed for the polymorphism C→T (rs1799853), responsible for an
Arg144Cys change in CYP2C9 and associated with reduced enzyme
activity (p=0.031, O.R.=4.760, 95% C.I.: 1.237 to 18.311).
Conclusions: These results suggest that rs1799853 is a functionally
relevant SNP of CYP2C9 that may influence the efficacy of therapy. Thus,
pharmacogenetic biomarkers have the potential of optimizing
chemotherapy for individual patients.