7,578 research outputs found
Designing capital-ratio triggers for Contingent Convertibles
Contingent Convertible (CoCo) bonds are a novel category of debt instrument recently introduced into the financial landscape. Their primary role is to bolster financial stability by maintaining healthy capital levels at the issuing entity. This is achieved by converting the bond principal into equity, or writing it down, once minimum capital ratios are breached. CoCos aim to recapitalize the bank before it reaches the brink of collapse, thereby avoiding a state bailout at huge cost to the taxpayer. Under normal circumstances, CoCo bonds operate as ordinary coupon-paying bonds, and are converted into the issuer's equity only when its capital ratios become insufficient.
However, the CoCo market has struggled to expand over the years, and the recent tumult involving Credit Suisse and its enforced CoCo write-off has underscored these challenges. The focus of this research is, on the one hand, to understand the reasons for this failure and, on the other, to modify the instrument's underlying design so as to restore its intended purpose: to act as a liquidity buffer that strengthens the capital structure of the issuing firm.
The cornerstone of the proposed work is the design of a self-adaptive model for leverage. This model features an automatic conversion that does not hinge on the judgment of regulatory authorities. Notably, it keeps the issuer's debt-to-assets ratio within predetermined boundaries, where the likelihood of default on outstanding liabilities remains minimal. The pricing of the proposed instruments is difficult because the conversion is dynamic. We view CoCos essentially as a portfolio of different financial instruments; this treatment makes it easier to analyze their response to different market events that may or may not trigger their conversion to equity.
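The portfolio view lends itself to a simple numerical sketch. The snippet below prices a stylised CoCo as a straight coupon bond minus the expected value destroyed on conversion; every parameter, the geometric-Brownian-motion dynamics assumed for the capital ratio, and the recovery fraction are hypothetical illustrations, not values or assumptions taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical contract: annual coupons, conversion to equity if the
# simulated capital ratio ever falls below the trigger level.
face, coupon_rate, maturity, r = 100.0, 0.06, 5, 0.03
ratio0, trigger, sigma = 0.14, 0.07, 0.25   # capital ratio, trigger, volatility
recovery = 0.40                             # fraction of face recovered as equity

n_paths, n_steps = 20_000, 250
dt = maturity / n_steps

# Stylised assumption: the capital ratio follows geometric Brownian motion.
z = rng.standard_normal((n_paths, n_steps))
log_steps = (-0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
paths = ratio0 * np.exp(np.cumsum(log_steps, axis=1))
converted = (paths <= trigger).any(axis=1)  # was the trigger ever breached?

# Portfolio view: straight bond minus expected loss on conversion.
coupons = sum(coupon_rate * face * np.exp(-r * t) for t in range(1, maturity + 1))
straight_bond = coupons + face * np.exp(-r * maturity)
loss_on_conversion = (1 - recovery) * face * np.exp(-r * maturity)  # crude timing
coco_price = straight_bond - converted.mean() * loss_on_conversion
print(f"P(conversion) = {converted.mean():.3f}, CoCo price = {coco_price:.2f}")
```

The decomposition makes the qualitative behaviour transparent: anything that raises the probability of hitting the trigger widens the gap between the CoCo and the equivalent straight bond.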
We provide evidence of the model's effectiveness and discuss the implications of its implementation in light of the regulatory environment and best market practices.
RU Research Fund
Icelandic Research Fund
Opportunities and risks of stochastic deep learning
This thesis studies opportunities and risks associated with stochasticity in deep learning that specifically manifest in the context of adversarial robustness and neural architecture search (NAS). On the one hand, opportunities arise because stochastic methods have a strong impact on robustness and generalisation, both from a theoretical and an empirical standpoint. In addition, they provide a framework for navigating non-differentiable search spaces, and for expressing data and model uncertainty. On the other hand, trade-offs (i.e., risks) that are coupled with these benefits need to be carefully considered. The three novel contributions that comprise the main body of this thesis are, by these standards, instances of opportunities and risks.
In the context of adversarial robustness, our first contribution proves that the impact of an adversarial input perturbation on the output of a stochastic neural network (SNN) is theoretically bounded. Specifically, we demonstrate that SNNs are maximally robust when they achieve weight-covariance alignment, i.e., when the vectors of their classifier layer are aligned with the eigenvectors of that layer's covariance matrix. Based on our theoretical insights, we develop a novel SNN architecture with excellent empirical adversarial robustness and show that our theoretical guarantees also hold experimentally.
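To make the alignment condition concrete, the sketch below measures how closely classifier weight vectors match the eigenvectors of a covariance matrix, using a random symmetric positive-definite matrix as a stand-in covariance. The `alignment` metric and all numbers are illustrative assumptions, not the thesis's actual construction.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_classes = 16, 4

# Stand-in covariance of a stochastic layer's noise (random SPD matrix).
A = rng.standard_normal((d, d))
cov = A @ A.T / d
eigvals, eigvecs = np.linalg.eigh(cov)   # columns are orthonormal eigenvectors

def alignment(W, eigvecs):
    """Mean squared cosine between each weight vector and its best-matching
    covariance eigenvector; 1.0 means perfect weight-covariance alignment."""
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
    cos = np.abs(Wn @ eigvecs)           # |cosine| against every eigenvector
    return float((cos.max(axis=1) ** 2).mean())

W_random = rng.standard_normal((n_classes, d))
W_aligned = eigvecs[:, -n_classes:].T    # rows are top eigenvectors

print(alignment(W_random, eigvecs))      # well below 1
print(alignment(W_aligned, eigvecs))     # 1.0 by construction
```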
Furthermore, we discover that SNNs partially owe their robustness to having a noisy loss landscape. Gradient-based adversaries find this landscape difficult to ascend during adversarial perturbation search, and therefore fail to create strong adversarial examples. We show that inducing a noisy loss landscape is not an effective defence mechanism, as it is easy to circumvent. To demonstrate that point, we develop a stochastic loss-smoothing extension to state-of-the-art gradient-based adversaries that allows them to attack successfully. Interestingly, our loss-smoothing extension can also (i) be successful against non-stochastic neural networks that defend by altering their loss landscape in different ways, and (ii) strengthen gradient-free adversaries.
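The core idea behind loss smoothing can be sketched in a few lines: instead of trusting one noisy gradient query, the adversary averages gradients over many stochastic draws, which shrinks the noise while leaving the true ascent direction intact. The toy "noisy landscape" below is a hypothetical stand-in, not the SNN or the attack implementation from the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stochastic loss surface: gradient of ||x||^2 corrupted by heavy
# per-query noise, standing in for an SNN's noisy loss landscape.
def noisy_loss_grad(x):
    noise = rng.standard_normal(x.shape)
    return 2 * x + 5.0 * noise

def smoothed_grad(x, n_samples=64):
    """Loss smoothing: average the stochastic gradient over many draws,
    as a gradient-based adversary would to defeat a noisy landscape."""
    return np.mean([noisy_loss_grad(x) for _ in range(n_samples)], axis=0)

x = np.ones(8)
g_true = 2 * x
err_raw = np.linalg.norm(noisy_loss_grad(x) - g_true)
err_smooth = np.linalg.norm(smoothed_grad(x) - g_true)
print(err_raw, err_smooth)   # smoothing shrinks the gradient error
```

Because the noise is zero-mean, averaging n samples reduces the gradient error by roughly a factor of sqrt(n), which is why a noisy landscape alone is easy to circumvent.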
Our third and final contribution lies in the field of few-shot learning, where we develop a stochastic NAS method for adapting pre-trained neural networks to previously unseen classes by observing only a few training examples of each new class. We determine that the adaptation of a pre-trained backbone is not as simple as adapting all of its parameters. In fact, adapting or fine-tuning the entire architecture is sub-optimal, as many layers already encode knowledge optimally. Our NAS algorithm searches for the optimal subset of pre-trained parameters to be adapted or fine-tuned, which yields a significant improvement over the existing paradigm for few-shot adaptation.
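The subset-search problem can be illustrated with a stylised stochastic search over binary masks: each layer has a hidden benefit or cost to being adapted, and a cross-entropy-style sampler learns which subset scores best. This toy is purely illustrative; the hidden `benefit` vector, the scoring function, and the update rule are assumptions, not the thesis's NAS algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)

# Each of n_layers has a hidden benefit (>0: adapting this layer helps).
n_layers = 12
benefit = rng.normal(0.0, 1.0, n_layers)

def score(mask):
    return float(benefit @ mask)               # toy few-shot validation score

probs = np.full(n_layers, 0.5)                 # sampling distribution over masks
for _ in range(200):
    masks = rng.random((64, n_layers)) < probs # sample candidate subsets
    scores = np.array([score(m) for m in masks])
    elite = masks[np.argsort(scores)[-8:]]     # keep the 8 best masks
    probs = np.clip(0.7 * probs + 0.3 * elite.mean(axis=0), 0.02, 0.98)

best = probs > 0.5
print(best.astype(int))   # tends to select only the helpful layers
```

The stochastic relaxation is what makes the non-differentiable subset space searchable, echoing the thesis's point that stochastic methods provide a framework for navigating such spaces.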
Dynamic PET-Tau Quantification for Progressive Supranuclear Palsy Diagnosis
Final Degree Project in Biomedical Engineering (Treballs Finals de Grau d'Enginyeria Biomèdica). Facultat de Medicina i Ciències de la Salut, Universitat de Barcelona. Academic year 2023-2024. Tutor: Raúl Tudela; Directors: Aida Niñerola, Raúl Tudela
Tauopathies are neurodegenerative diseases caused by the abnormal accumulation of tau proteins in the brain. One uncommon tauopathy is progressive supranuclear palsy (PSP), whose symptoms often overlap with those of other brain disorders; its detection is currently only possible postmortem, since no ideal biomarker is available.
PET-tau imaging has the potential to revolutionize the early detection of this disease. PET is a nuclear imaging technique that shows the function of organs and tissues in vivo using a radiotracer that emits radiation from inside the body. A new PET tracer, 18F-PI-2620, has shown promising results for the detection of PSP, with high affinity for tau aggregates and low off-target binding.
This project consists of designing and testing software for the quantification of dynamically acquired brain PET images, which show the radiotracer distribution over time. The software coregisters the images to standard space, where the different brain regions can be segmented using an atlas, and provides two physiologically meaningful parameters: the Distribution Volume Ratio (DVR) and the Standardized Uptake Value Ratio (SUVR). It outputs DVR and SUVR values for any region of interest, as well as parametric images that help visualize the radiotracer distribution in the brain.
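Of the two parameters, SUVR is the simpler to sketch: it is the ratio of mean target-region activity to mean reference-region activity over a late window of the dynamic scan (DVR additionally requires kinetic modelling, e.g. a reference-region graphical analysis, omitted here). The time-activity curves, the 30-60 min window, and the implicit cerebellar reference below are illustrative assumptions, not data or settings from the project.

```python
import numpy as np

# Hypothetical time-activity curves (kBq/mL) from a dynamic acquisition:
# mean uptake in a target region and a reference region (illustrative only).
frames_min = np.array([30, 35, 40, 45, 50, 55])        # frame mid-times (min)
target = np.array([12.0, 11.5, 11.2, 11.0, 10.8, 10.6])
reference = np.array([8.0, 7.6, 7.3, 7.1, 6.9, 6.8])

def suvr(target_tac, ref_tac, t, t_start=30, t_end=60):
    """SUVR: mean target / mean reference activity over a late window."""
    m = (t >= t_start) & (t <= t_end)
    return target_tac[m].mean() / ref_tac[m].mean()

print(f"SUVR = {suvr(target, reference, frames_min):.2f}")
```

An SUVR well above 1 indicates higher tracer retention in the target region than in the reference, consistent with tau deposition.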
A set of brain PET images from 13 subjects acquired with 18F-PI-2620 was used for the development and testing of the software. The subjects comprised healthy controls and subjects with Down syndrome, some of whom had developed Alzheimer's disease (AD), which implies a greater amount of abnormally deposited tau protein. The results showed higher DVR and SUVR values in several brain regions for the subjects who had developed AD, confirming higher radiotracer uptake and a greater amount of deposited tau protein. This demonstrates the correct functionality of the software and its potential, in combination with the radiotracer, as a future tool for detecting tauopathies such as PSP.
Physiological responses and cognitive behaviours: Measures of heart rate variability index language knowledge
Over the past decades, research has focused on developing methods that allow tapping into aspects of cognition that are not directly observable. This includes linguistic knowledge and skills, which develop largely without awareness and may therefore be difficult or impossible to articulate. Building on the relation between language cognition and the nervous system, we examine whether Heart Rate Variability (HRV), a cardiovascular measure that indexes Autonomic Nervous System activity, can be used to assess implicit language knowledge. We test the potential of HRV to detect whether individuals possess grammatical knowledge and explore how sensitive the cardiovascular response is.
Forty-one healthy, British English-speaking adults listened to 40 English speech samples, half of which contained grammatical errors. Thought Technology's 5-channel ProComp 5 encoder tracked heart rate via a BVP-Flex/Pro sensor attached to the middle finger of the non-dominant hand, at a rate of 2048 samples per second. A Generalised Additive Mixed Effects Model confirmed a cardiovascular response to grammatical violations: there was a statistically significant reduction in HRV, as indexed by NN50, in response to stimuli containing errors. The cardiovascular response reflects the extent of the linguistic violations: NN50 decreases linearly as the number of errors increases, up to a certain level, after which HRV remains constant.
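NN50 is a standard time-domain HRV index: the number of pairs of successive inter-beat (NN) intervals that differ by more than 50 ms. The sketch below computes it from a short interval series; the interval values are invented for illustration and are not data from the study.

```python
import numpy as np

# Hypothetical inter-beat (NN) intervals in milliseconds, as would be
# derived from the BVP signal after peak detection (illustrative values).
nn_ms = np.array([812, 790, 845, 900, 870, 810, 795, 860, 855, 905])

def nn50(intervals_ms):
    """NN50: count of successive NN-interval pairs differing by > 50 ms."""
    diffs = np.abs(np.diff(intervals_ms))
    return int((diffs > 50).sum())

def pnn50(intervals_ms):
    """pNN50: NN50 as a proportion of all successive pairs."""
    diffs = np.abs(np.diff(intervals_ms))
    return float((diffs > 50).mean())

print(nn50(nn_ms), round(pnn50(nn_ms), 2))
```

A reduction in NN50, as reported for error-containing stimuli, corresponds to fewer large beat-to-beat fluctuations, i.e. lower parasympathetically mediated variability.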
This observation brings into focus a new dimension of the intricate relationship between physiology and cognition. Being able to use a highly portable and non-intrusive technique with language stimuli also creates exciting possibilities for assessing the language knowledge of individuals from a range of populations in their natural environment and in authentic communicative situations.
Brittle-viscous deformation cycles at the base of the seismogenic zone in the continental crust
The main goal of this study was to determine the dynamic cycle of ductile-brittle deformation and to characterise the fluid pathways, at different scales, of a brittle-viscous fault zone active at the base of the seismogenic crust. The objects of analysis are samples from the sinistral strike-slip fault zone BFZ045 at Olkiluoto (SW Finland), located at the site of a deep geological repository for nuclear waste.
Combined microstructural analysis, electron backscatter diffraction (EBSD), and mineral chemistry were applied to reconstruct the variations in pressure, temperature, fluid pressure, and differential stress that mediated deformation and strain localization along BFZ045 across the brittle-ductile transition zone (BDTZ). Ductile deformation took place at 400-500 °C and 3-4 kbar, and recrystallized grain size piezometry on quartz documents a progressive increase in differential stress during mylonitization, from ca. 50 MPa to ca. 120 MPa. The increase in differential stress was localised towards the shear zone centre, which was eventually overprinted by brittle deformation in a narrowing shear zone. Cataclastic deformation occurred under lower temperatures, down to T ≥ 320 °C, and was not further overprinted by mylonitic creep. Porosity estimates were obtained by combining X-ray micro-computed tomography (µCT), mercury intrusion porosimetry, He pycnometry, and microstructural analysis. Low porosity values (0.8-4.4%) for the different rock types, pore sizes of 2-20 µm representative of pore connectivity, and microstructural observations suggest a dynamic cycle of fracturing and sealing, mostly controlled by ductile deformation. Similarly, fracture orientation analysis indicates that the mylonitic precursor of BFZ045 played an important role in localizing the brittle deformation. This thesis highlights that the ductile-brittle deformation cycle in BFZ045 was controlled by transient oscillations in fluid pressure in a narrowing shear zone deforming at progressively higher differential stress during cooling.
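Recrystallized grain size piezometry inverts a measured grain size to a differential stress via an empirical power law. The sketch below uses the widely applied Stipp & Tullis (2003) quartz calibration, D = 10^3.56 · σ^(-1.26) (D in µm, σ in MPa), as an assumption; the thesis does not state which calibration was used, and the grain sizes below are chosen only to bracket the reported 50-120 MPa range.

```python
def quartz_piezometer_stress(d_um):
    """Differential stress (MPa) from recrystallized quartz grain size (µm),
    assuming the Stipp & Tullis (2003) piezometer D = 10**3.56 * sigma**-1.26."""
    return (d_um / 10**3.56) ** (-1 / 1.26)

# Grain sizes chosen to bracket the stress range reported in the abstract:
for d in (26.0, 9.0):
    print(f"D = {d:5.1f} um -> sigma = {quartz_piezometer_stress(d):6.1f} MPa")
```

The inverse power law is why the grain-size decrease towards the shear zone centre records the increase in differential stress during mylonitization.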
Information actors beyond modernity and coloniality in times of climate change:A comparative design ethnography on the making of monitors for sustainable futures in Curaçao and Amsterdam, between 2019-2022
In his dissertation, Mr. Goilo developed a cutting-edge theoretical framework for an Anthropology of Information. The study compares information in the context of modernity in Amsterdam and of coloniality in Curaçao through the process of making monitors, and develops five ways to understand how information can act towards sustainable futures. The research also discusses how the two contexts, modernity and coloniality, have been in informational symbiosis for centuries, which is producing negative informational side effects in the age of the Anthropocene. By exploring the modernity-coloniality symbiosis of information, the author explains how scholars, policymakers, and data analysts can act on the historical and structural roots of contemporary global inequities related to the production and distribution of information. Ultimately, the five theses propose conditions for the collective production of knowledge towards a more sustainable planet.
Influence of Preaching’s Rhetorical Appeal on Evangelical Listeners’ Motivation
Preaching is a form of rhetorical narratology aimed at persuading its audience, via sermons, to experience a renewal of the mind and the transformation of their lives. While previous research established that listeners comprehend sermons through their rhetorical appeal, it has been unclear how this motivates evangelical listeners to act. The purpose of this qualitative narrative study was to explore how the rhetorical appeal of preaching influences evangelical listeners' motivation at evangelical churches in Savannah, Georgia. A comprehensive approach to exploring a sermon's rhetorical appeal was used, focusing jointly on individual perception and social context. Narrative Transportation Theory served as the theoretical framework, and 34 participants from six churches were interviewed to reach saturation. The findings showed that the rhetorical appeal embedded in preaching, together with its narrative essence, influences evangelical listener motivation. In addition, listeners subconsciously understand that aspects of rhetoric and narrative work together in sermons to influence their motivation. This study identified three themes, seven categories, 13 conditions, and 32 codes relevant for rhetorical appeal to be effective and to help motivation occur. The three themes of Relatability, Applicability, and Engagement were aligned with Ethos, Logos, and Pathos, and then integrated with Environmental, Cognitive, and Behavioral functions to create the Sermon Listener Motivation Triangle. This study's corroboration of preaching's collaborative nature between the perfectly divine and the imperfectly human is shared in hopes of helping speakers prepare scripturally authentic sermons and communicate in engaging ways that inspire change.
Spatial epidemiology of a highly transmissible disease in urban neighbourhoods: Using COVID-19 outbreaks in Toronto as a case study
The emergence of infectious diseases in an urban area involves a complex interaction between neighbourhood socioecological processes and urbanization. As a result, an urban environment can incubate new epidemics and spread diseases more rapidly in densely populated areas than elsewhere. Most recently, the coronavirus disease 2019 (COVID-19) pandemic has brought unprecedented challenges around the world. Toronto, the capital of Ontario, Canada, has been severely impacted by COVID-19. Understanding the spatiotemporal patterns of the disease and the key drivers of those patterns is imperative for designing and implementing effective public health programs to control the spread of the pandemic. This dissertation contributes to the global research effort on the COVID-19 pandemic through spatial epidemiological studies that enhance our understanding of the disease's epidemiology in a spatial context and guide public health strategies for controlling it.
Comprising three original research manuscripts, this dissertation focuses on the spatial epidemiology of COVID-19 at a neighbourhood scale in Toronto. Each manuscript makes scientific contributions and enhances our knowledge of how interactions between neighbourhood socioecological processes and urbanization can influence the spatial spread and patterns of COVID-19 in Toronto, applying novel and advanced methodological approaches. The findings are intended to contribute to public health policy that informs neighbourhood-based disease intervention initiatives by public health authorities, local government, and policymakers.
The first manuscript analyzes the globally and locally variable socioeconomic drivers of COVID-19 incidence and examines how these relationships vary across neighbourhoods. In the global model, lower levels of education and the percentage of immigrants were found to be positively associated with increased COVID-19 risk. The study provides a methodological framework for identifying local variations in the association between COVID-19 risk and socioeconomic factors in an urban environment by applying a multiscale geographically weighted regression (MGWR) modelling approach. The MGWR model improves on the methods used in earlier COVID-19 studies for identifying local variations by incorporating a correction factor for the multiple testing problem in geographically weighted regression models.
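The machinery MGWR builds on can be sketched with ordinary geographically weighted regression: at each location, a kernel down-weights distant observations and a weighted least-squares fit yields a local coefficient. Everything below is synthetic toy data and a single fixed bandwidth, purely to illustrate the idea (MGWR additionally fits one bandwidth per covariate); it is not the manuscript's model or data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic neighbourhood centroids, one covariate, and an outcome whose
# coefficient varies over space (all invented for illustration).
n = 200
coords = rng.uniform(0, 10, size=(n, 2))
x = rng.standard_normal(n)
beta_true = 1.0 + 0.3 * coords[:, 0]           # slope grows eastward
y = beta_true * x + 0.1 * rng.standard_normal(n)

def gwr_coef_at(i, bandwidth=2.0):
    """Local slope at location i via Gaussian-kernel weighted least squares."""
    d = np.linalg.norm(coords - coords[i], axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)    # spatial kernel weights
    X = np.column_stack([np.ones(n), x])
    XtW = X.T * w
    return np.linalg.solve(XtW @ X, XtW @ y)[1]

# Local estimates should track the spatially varying true coefficient.
i_west = int(np.argmin(coords[:, 0]))
i_east = int(np.argmax(coords[:, 0]))
print(gwr_coef_at(i_west), gwr_coef_at(i_east))
```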
The second manuscript quantifies the associations between COVID-19 cases and urban socioeconomic factors and land surface temperature (LST) at the neighbourhood scale in Toronto. Four spatiotemporal Bayesian hierarchical models with spatial, temporal, and varying space-time interaction terms are compared. This study identified the seasonal trends of COVID-19 risk, where the spatiotemporal trends show increasing, decreasing, or stable patterns, and identified area-specific spatial risk for targeted interventions. Educational level and high land surface temperature were shown to be positively associated with COVID-19 risk. Satellite images of high spatial and temporal resolution were used to extract LST, and atmospheric correction methods were applied to these images by adopting a land surface emissivity (LSE) model, which provided high estimation accuracy. This methodological approach will help researchers acquire long time-series LST data at a spatial scale from satellite images, develop atmospheric correction methods, and create environmental data with high estimation accuracy for disease modelling. In terms of policy, the findings can inform the design and implementation of urban planning strategies and programs to control disease risks.
The third manuscript develops a novel approach to visualizing the spread of infectious disease outbreaks by incorporating neighbourhood networks and neighbourhood-level time-series data on the disease. The model provides an understanding of the direction and magnitude of the spatial risk of the outbreak and underscores the importance of early intervention in stopping its spread. The manuscript also identifies hotspots using incidence rates and disease persistence; these findings may help public health planners develop priority-based intervention plans in resource-constrained situations.
Spike timing reshapes robustness against attacks in spiking neural networks
The success of deep learning in the past decade is partially shrouded in the shadow of adversarial attacks. In contrast, the brain is far more robust at complex cognitive tasks. Exploiting the fact that neurons in the brain communicate via spikes, spiking neural networks (SNNs) are emerging as a new type of neural network model, advancing the frontier of theoretical investigation and empirical application of artificial neural networks and deep learning. Neuroscience research proposes that the precise timing of neural spikes plays an important role in the information coding and sensory processing of the biological brain. However, the role of spike timing in SNNs is less considered and far from understood. Here we systematically explore the timing mechanism of spike coding in SNNs, focusing on the robustness of the system against various types of attacks. We find that SNNs achieve greater improvements in robustness when the coding principle of precise spike timing is used in neural encoding and decoding, facilitated by different learning rules. Our results suggest that spike timing coding in SNNs can improve robustness against attacks, providing a new approach to reliable coding principles for developing next-generation brain-inspired deep learning.
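A minimal example of precise spike timing as a code is time-to-first-spike (latency) coding: a leak-free integrate-and-fire neuron driven by constant input I crosses threshold theta at t = theta / I, so stronger inputs spike earlier and the input can be recovered from the spike time. This stylised, leak-free sketch is an assumption for illustration, not the encoding scheme studied in the paper.

```python
import numpy as np

def first_spike_times(intensities, t_max=100.0, theta=1.0):
    """Latency coding: threshold crossing at t = theta / I for a constant
    input I; inputs too weak to spike within t_max are assigned t_max."""
    intensities = np.asarray(intensities, dtype=float)
    t = np.full(intensities.shape, t_max)
    firing = intensities > theta / t_max       # strong enough to spike in time
    t[firing] = theta / intensities[firing]
    return t

def decode(times, t_max=100.0, theta=1.0):
    """Invert the latency code back to input intensities (0 if no spike)."""
    return np.where(times < t_max, theta / times, 0.0)

x = np.array([0.5, 0.2, 0.05, 0.001])  # last input too weak to spike
t = first_spike_times(x)
print(t)          # [  2.   5.  20. 100.]
print(decode(t))  # recovers [0.5  0.2  0.05 0. ]
```

Because the information sits in *when* a spike occurs rather than in a graded activation, small perturbations to the input shift spike times rather than corrupting an analogue value, which is one intuition for why timing codes can interact differently with attacks.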