247 research outputs found
Decompose et Impera: tensor methods in high-dimensional data
This thesis is written with the aim of exploring multiway data. Multiway
data, also referred to as tensor data, are collections of data points arranged
in multidimensional arrays. At first glance one may think that these objects
are merely a convenient representation of a dataset. They are not just a
collection of data points: they have a structure of their own. For this reason,
multiway data require specific models to be analysed correctly. In this spirit,
I developed my personal view on data analysis, which can be summarised by the
following statement:
"It is not the data that should fit models, but models that should fit the data."
However, this should not be taken literally. I do think that models are
important: giving a structure to our techniques is necessary. Nevertheless, I
believe that data should be the main driver. This means that, instead of
trimming data to fit existing models, researchers should develop new models
to reflect the complexity of the data.
The purpose of this work is to provide an overview of tensor methods applied
to Economics and Finance. The most important aspects of this thesis are the
ideas and applications rather than the mathematical content. New models
are proposed and fitted to data in order to test their performance and gain
insights from the datasets analysed.
The description of the tensor methods provided in this thesis is not intended
to be complete, but is rather restricted to the models applicable to the
analysed data.
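To make the idea of multiway data concrete, the following is a minimal sketch (not taken from the thesis; the dimensions and interpretation are hypothetical): a small 3-way tensor and its mode-n unfolding, the matricisation that flattens a tensor so that ordinary matrix methods can be applied to it. The unfolding discards the joint structure across the remaining modes, which is precisely why dedicated tensor models are needed.

```python
import numpy as np

# Hypothetical 3-way tensor: 4 assets x 3 indicators x 5 time periods.
X = np.arange(60, dtype=float).reshape(4, 3, 5)

def unfold(tensor, mode):
    """Mode-n unfolding (matricisation): arrange the mode-`mode`
    fibres of the tensor as the rows of a matrix."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

# The mode-0 unfolding is a 4 x 15 matrix: each asset's 3 x 5 slice
# flattened into a single row. Applying, say, PCA to this matrix treats
# indicator-time pairs as unstructured columns.
X0 = unfold(X, 0)
print(X0.shape)  # (4, 15)
```

Tensor decompositions (CP, Tucker, and related models) operate on the multiway object directly instead of on such an unfolding.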
Epidemics of Liquidity Shortages in Interbank Markets
Financial contagion from liquidity shocks has recently been identified as a
prominent driver of systemic risk in interbank lending markets. Building on
standard compartment models used in epidemiology, in this work we develop an
EDB (Exposed-Distressed-Bankrupted) model for the dynamics of liquidity-shock
propagation between banks, and validate it on data from an electronic market
for interbank deposits. We show that the interbank network was highly
susceptible to liquidity contagion at the beginning of the 2007/2008 global
financial crisis, and that the micro-prudential and liquidity-hoarding
policies subsequently adopted by banks increased the network's resilience to
systemic risk, albeit with the undesired side effect of drying out liquidity
from the market. Finally, we show that the individual riskiness of a bank is
better captured by its network centrality than by its participation in the
market, in line with the currently debated concept of "too interconnected to
fail".
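The abstract does not specify the model's transition rules, but a compartment model of this kind can be sketched as follows. Everything below is an assumption for illustration: the network, the per-counterparty contagion probability `beta`, and the bankruptcy probability `gamma` are hypothetical, not the paper's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical interbank exposure network: adjacency[i, j] = 1 means bank i
# is exposed to bank j, so j's distress can infect i.
n_banks = 50
adjacency = (rng.random((n_banks, n_banks)) < 0.1).astype(int)
np.fill_diagonal(adjacency, 0)

E, D, B = 0, 1, 2        # Exposed (healthy), Distressed, Bankrupted
beta, gamma = 0.3, 0.2   # illustrative contagion / bankruptcy probabilities

state = np.full(n_banks, E)
state[rng.choice(n_banks, size=2, replace=False)] = D  # initial liquidity shock

for _ in range(20):
    distressed = state == D
    # An exposed bank becomes distressed with probability beta per
    # distressed counterparty (independent Bernoulli trials).
    n_distressed_counterparties = adjacency @ distressed
    p_infect = 1 - (1 - beta) ** n_distressed_counterparties
    newly_distressed = (state == E) & (rng.random(n_banks) < p_infect)
    # A distressed bank fails with probability gamma per step.
    newly_bankrupt = distressed & (rng.random(n_banks) < gamma)
    state[newly_distressed] = D
    state[newly_bankrupt] = B

print((state == B).sum(), "banks bankrupted")
```

In such simulations, the fraction of bankrupted banks at the end of a run serves as a simple systemic-risk indicator for a given network topology.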
Surgical Implications of Ischemia Reperfusion Damage and Future Perspectives.
Flaps are often used in plastic and reconstructive surgery, and random pattern flaps were the first to be described. The indications for their use are numerous and include post-traumatic loss of su..
Disaggregating Time-Series with Many Indicators: An Overview of the DisaggregateTS Package
Low-frequency time series (e.g., quarterly data) are often treated as
benchmarks for interpolation to higher frequencies, since they generally
exhibit greater precision and accuracy than their high-frequency
counterparts (e.g., monthly data) reported by governmental bodies. An array of
regression-based methods has been proposed in the literature to estimate a
target high-frequency series using higher-frequency indicators. However, in
the era of big data, and with the prevalence of large volumes of
administrative data sources, there is a need to extend traditional methods to
high-dimensional settings, i.e., where the number of indicators is similar to
or larger than the number of low-frequency samples. The DisaggregateTS package
includes both classical regression-based disaggregation methods and recent
extensions to high-dimensional settings, cf. Mosley et al. (2022). This paper
provides guidance on how to implement these methods via the package in R, and
demonstrates their use in an application to disaggregating CO2 emissions.
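The regression-based idea behind such methods can be sketched independently of the package. The following is a minimal Chow-Lin-style illustration in Python (not the package's R API; the data, dimensions, and white-noise residual assumption are all hypothetical): regress the observed low-frequency totals on aggregated indicators, then distribute the low-frequency residuals so the estimate respects the aggregation constraint.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: disaggregate 10 yearly totals into 40 quarterly
# values using 2 quarterly indicators plus an intercept.
n_low, s = 10, 4
n_high = n_low * s
C = np.kron(np.eye(n_low), np.ones((1, s)))    # aggregation matrix (sums)

X = np.column_stack([np.ones(n_high), rng.random(n_high), rng.random(n_high)])
beta_true = np.array([1.0, 2.0, -1.0])
y_high_true = X @ beta_true + 0.1 * rng.standard_normal(n_high)
y_low = C @ y_high_true                        # only the aggregates are observed

# Chow-Lin-style estimator with white-noise residuals: GLS on the
# aggregated model reduces to OLS of y_low on C @ X, after which the
# low-frequency residuals are spread back across the quarters.
CX = C @ X
beta_hat = np.linalg.lstsq(CX, y_low, rcond=None)[0]
resid_low = y_low - CX @ beta_hat
y_hat = X @ beta_hat + C.T @ np.linalg.solve(C @ C.T, resid_low)

# The estimate satisfies the aggregation constraint exactly.
print(np.allclose(C @ y_hat, y_low))  # True
```

The high-dimensional extensions discussed in the paper address the regime where `X` has more columns than `y_low` has observations, where this least-squares step must be replaced by a regularised estimator.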
- …