The Emergence of Gravitational Wave Science: 100 Years of Development of Mathematical Theory, Detectors, Numerical Algorithms, and Data Analysis Tools
On September 14, 2015, the newly upgraded Laser Interferometer
Gravitational-wave Observatory (LIGO) recorded a loud gravitational-wave (GW)
signal, emitted a billion light-years away by a coalescing binary of two
stellar-mass black holes. The detection was announced in February 2016, in time
for the hundredth anniversary of Einstein's prediction of GWs within the theory
of general relativity (GR). The signal represents the first direct detection of
GWs, the first observation of a black-hole binary, and the first test of GR in
its strong-field, high-velocity, nonlinear regime. In the remainder of its
first observing run, LIGO observed two more signals from black-hole binaries,
one moderately loud, another at the boundary of statistical significance. The
detections mark the end of a decades-long quest, and the beginning of GW
astronomy: finally, we are able to probe the unseen, electromagnetically dark
Universe by listening to it. In this article, we present a short historical
overview of GW science: this young discipline combines GR, arguably the
crowning achievement of classical physics, with record-setting, ultra-low-noise
laser interferometry, and with some of the most powerful developments in the
theory of differential geometry, partial differential equations,
high-performance computation, numerical analysis, signal processing,
statistical inference, and data science. Our emphasis is on the synergy between
these disciplines, and how mathematics, broadly understood, has historically
played, and continues to play, a crucial role in the development of GW science.
We focus on black holes, which are very pure mathematical solutions of
Einstein's gravitational-field equations that are nevertheless realized in
Nature, and that provided the first observed signals.
Comment: 41 pages, 5 figures. To appear in Bulletin of the American Mathematical Society.
The Devil's Invention: Asymptotic, Superasymptotic and Hyperasymptotic Series
Singular perturbation methods, such as the method of multiple scales and the method of matched asymptotic expansions, give series in a small parameter ε which are asymptotic but (usually) divergent. In this survey, we use a plethora of examples to illustrate the cause of the divergence, and explain how this knowledge can be exploited to generate a 'hyperasymptotic' approximation. This adds a second asymptotic expansion, with different scaling assumptions about the size of various terms in the problem, to achieve a minimum error much smaller than the best possible with the original asymptotic series. (This rescale-and-add process can be repeated further.) Weakly nonlocal solitary waves are used as an illustration.
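The optimal-truncation phenomenon described above can be seen numerically on the classic Stieltjes function, a standard textbook illustration (this example is not taken from the survey's own text): its asymptotic series diverges, yet truncating at the smallest term gives a "superasymptotic" error far below any fixed low-order truncation.

```python
import math
import numpy as np

# The Stieltjes function S(eps) = ∫_0^∞ e^(-t) / (1 + eps*t) dt has the
# divergent asymptotic series sum_n (-1)^n n! eps^n.
eps = 0.1

# Accurate reference value by trapezoidal quadrature on [0, 60]
# (the tail beyond t = 60 is smaller than e^-60, negligible here).
t = np.linspace(0.0, 60.0, 60001)
f = np.exp(-t) / (1.0 + eps * t)
h = t[1] - t[0]
S_true = h * (0.5 * f[0] + f[1:-1].sum() + 0.5 * f[-1])

# Error of the partial sums: it first shrinks, then blows up again.
terms = [(-1) ** n * math.factorial(n) * eps ** n for n in range(21)]
errors = [abs(sum(terms[:N + 1]) - S_true) for N in range(21)]
best = min(range(21), key=lambda N: errors[N])
print(best)  # optimal truncation order, near 1/eps = 10
```

Truncating near N = 1/ε leaves an error of order e^(-1/ε), roughly the size of the smallest term; adding more terms past that point only makes things worse, which is exactly the behaviour hyperasymptotics then improves upon.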
Fear Classification using Affective Computing with Physiological Information and Smart-Wearables
Among the 17 Sustainable Development Goals proposed within the 2030 Agenda
and adopted by all of the United Nations member states, the fifth SDG is a call
for action to effectively turn gender equality into a fundamental human right and
an essential foundation for a better world. It includes the eradication of all types
of violence against women. Focusing on the technological perspective, the range of
available solutions intended to prevent this social problem is very limited. Moreover,
most of the solutions are based on a panic button approach, leaving aside
the usage and integration of current state-of-the-art technologies, such as the Internet
of Things (IoT), affective computing, cyber-physical systems, and smart-sensors.
Thus, the main purpose of this research is to provide new insight into the design and
development of tools to prevent and combat risky situations, and even aggressions,
related to Gender-based Violence, from a technological perspective, without leaving
aside the sociological considerations directly related to the problem. To achieve
such an objective, we rely on the application of affective computing from a realist
point of view, i.e. targeting the generation of systems and tools capable of being implemented
and used nowadays or within an achievable time-frame. This pragmatic
vision is channelled through: 1) an exhaustive study of the existing technological
tools and mechanisms oriented to fighting Gender-based Violence, 2) the proposal
of a new smart-wearable system intended to overcome some of the limitations
encountered in current technology, 3) a novel fear-related emotion classification approach
to disentangle the relation between emotions and physiology, and 4) the definition
and release of a new multi-modal dataset for emotion recognition in women.
Firstly, different fear classification systems using a reduced set of physiological signals are explored and designed, employing open datasets together with a combination of time-, frequency- and non-linear-domain techniques. The design process is shaped by trade-offs between physiological considerations and embedded capabilities. The latter is of paramount importance due to
the edge-computing focus of this research. Two results are highlighted in this first
task: a fear classification system designed on the DEAP dataset, achieving an average
AUC of 81.60% and a Gmean of 81.55% in a subject-independent approach using only
two physiological signals; and a fear classification system designed on the MAHNOB
dataset, achieving an average AUC of 86.00% and a Gmean of 73.78% in a
subject-independent, Leave-One-Subject-Out configuration using only three
physiological signals. A detailed comparison with other emotion recognition systems
proposed in the literature is presented, which shows that the obtained metrics are in
line with the state-of-the-art.
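The subject-independent, Leave-One-Subject-Out (LOSO) protocol and the Gmean metric mentioned above can be sketched in a few lines. This is a minimal illustration only: the data, the nearest-centroid classifier, and all numbers below are synthetic stand-ins, not the thesis's actual pipeline or datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for per-subject physiological features (hypothetical
# data): 8 subjects, 20 samples each, 2 features; the "fear" class
# (label 1) is shifted by +1.5 in both features.
subjects = np.repeat(np.arange(8), 20)
labels = np.tile(np.array([0] * 10 + [1] * 10), 8)
X = rng.normal(size=(160, 2)) + 1.5 * labels[:, None]

def nearest_centroid_predict(X_train, y_train, X_test):
    # A deliberately simple classifier: assign each test sample to the
    # class whose training centroid is closer.
    c0 = X_train[y_train == 0].mean(axis=0)
    c1 = X_train[y_train == 1].mean(axis=0)
    d0 = np.linalg.norm(X_test - c0, axis=1)
    d1 = np.linalg.norm(X_test - c1, axis=1)
    return (d1 < d0).astype(int)

gmeans = []
for s in np.unique(subjects):
    test = subjects == s          # hold out one whole subject per fold
    pred = nearest_centroid_predict(X[~test], labels[~test], X[test])
    y = labels[test]
    tpr = np.mean(pred[y == 1] == 1)     # sensitivity
    tnr = np.mean(pred[y == 0] == 0)     # specificity
    gmeans.append(np.sqrt(tpr * tnr))    # geometric mean of both rates

print(round(float(np.mean(gmeans)), 3))
```

Holding out entire subjects, rather than random samples, is what makes the evaluation subject-independent: the model never sees any data from the test subject during training.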
Secondly, Bindi is presented: an end-to-end autonomous multimodal system
leveraging affective IoT through auditory and physiological commercial off-the-shelf
smart-sensors, hierarchical multisensorial fusion, and a secured server architecture
to combat Gender-based Violence by automatically detecting risky situations
based on a multimodal intelligence engine and then triggering a protection protocol.
Specifically, this research is focused on the hardware and software design of one of
the two edge-computing devices within Bindi. This is a bracelet integrating three
physiological sensors, actuators, power monitoring integrated chips, and a System-
On-Chip with wireless capabilities. Within this context, different embedded design
space explorations are presented: embedded filtering evaluation, online physiological
signal quality assessment, feature extraction, and power consumption analysis.
The results of all these processes are successfully validated and, for some
of them, compared against standard physiological measurement equipment.
Amongst the results obtained regarding the embedded design and implementation
of the Bindi bracelet, it should be highlighted that its low power consumption
yields a battery life of approximately 40 hours with a 500 mAh battery.
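As a quick sanity check on the reported figures, the implied average current draw follows directly from capacity over runtime (assuming an ideal battery under a constant load):

```python
# Implied average current draw for the reported battery figures
# (idealised assumption: constant load, full usable capacity).
capacity_mah = 500.0
runtime_h = 40.0
avg_current_ma = capacity_mah / runtime_h
print(avg_current_ma)  # 12.5 mA average draw implied by the figures
```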
Finally, the particularities of our use case and the scarcity of open multimodal datasets dealing with emotional immersive technology, labelling methodology considering
the gender perspective, balanced stimuli distribution regarding the target
emotions, and recovery processes based on the physiological signals of the volunteers
to quantify and isolate the emotional activation between stimuli, led us to the definition
and elaboration of the Women and Emotion Multi-modal Affective Computing
(WEMAC) dataset. This is a multimodal dataset in which 104 women who had never
experienced Gender-based Violence performed different emotion-related stimuli
visualisations in a laboratory environment. The previous binary fear classification
systems were improved and applied to this novel multimodal dataset. For instance,
the proposed multimodal fear recognition system using this dataset reports up to
60.20% ACC and 67.59% F1-score. These values represent a competitive result in
comparison with state-of-the-art works dealing with similar multi-modal use cases.
In general, this PhD thesis has opened a new research line within the research group
under which it has been developed. Moreover, this work has established a solid base
from which to expand knowledge and continue research targeting the generation of
both mechanisms to help vulnerable groups and socially oriented technology.
Doctoral Programme in Electrical, Electronic and Automatic Engineering, Universidad Carlos III de Madrid. Thesis committee: David Atienza Alonso (President), Susana Patón Álvarez (Secretary), Eduardo de la Torre Arnan (Member)
Routines and Applications of Symbolic Algebra Software
Computing has become an essential resource in modern research and has found application
across a wide range of scientific disciplines. Developments in symbolic algebra tools have been
particularly valuable in physics, where calculations in fields such as general relativity, quantum
field theory and physics beyond the standard model are becoming increasingly complex and
impractical to work with by hand. The computer algebra system Cadabra is a tensor-first
approach to symbolic algebra built on the programming language Python. It has been used
extensively in research in these fields while also having a shallow learning curve, making it an
excellent way to introduce students to methods in computer algebra.
The work in this thesis has been concentrated on developing Cadabra, which has involved
looking at two different elements which make up a computer algebra program. Firstly, the
implementation of algebraic routines is discussed. This has primarily been focused on the
introduction of an algorithm for detecting the equivalence of tensorial expressions related by
index permutation symmetries. The method employed differs considerably from traditional
canonicalisation routines which are commonly used for this purpose by using Young projection
operators to make such symmetries manifest.
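The underlying equivalence problem can be illustrated in plain Python with a toy orbit enumeration (a sketch of the general idea only, not Cadabra's Young-projector algorithm): list every index arrangement reachable from a given one under a tensor's permutation symmetries, tracking signs, and two arrangements are equivalent exactly when one lies in the orbit of the other.

```python
from collections import deque

def signed_orbit(start, generators):
    # Breadth-first enumeration of all index arrangements reachable from
    # `start` under the symmetry generators, tracking the accumulated sign.
    seen = {start: 1}
    queue = deque([(start, 1)])
    while queue:
        idx, sign = queue.popleft()
        for perm, gen_sign in generators:
            new = tuple(idx[j] for j in perm)
            if new not in seen:
                seen[new] = sign * gen_sign
                queue.append((new, sign * gen_sign))
    return seen

# Riemann-tensor symmetries as (position permutation, sign) pairs:
# antisymmetry in each index pair, symmetry under pair exchange.
riemann = [((1, 0, 2, 3), -1), ((0, 1, 3, 2), -1), ((2, 3, 0, 1), +1)]
reach = signed_orbit(('a', 'b', 'c', 'd'), riemann)

print(reach[('c', 'd', 'a', 'b')])   # R_abcd =  R_cdab  ->  1
print(reach[('b', 'a', 'c', 'd')])   # R_abcd = -R_bacd  -> -1
```

Orbit enumeration like this scales as the size of the symmetry group; canonicalisation and the Young-projection approach discussed in the thesis exist precisely to avoid that blow-up on realistic expressions.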
The other element of writing a computer algebra program which is covered is the infrastructure
and environment. The importance of this aspect of software design is often overlooked by
funding committees and academic software users, resulting in an anti-pattern of code not being
shared and contributed to in the way in which research itself is published and promulgated.
The focus in this area has been on implementing a packaging system for Cadabra which allows
the writing of generic libraries that can be shared by the community, and on interfacing with
other scientific computing packages to increase the capabilities of Cadabra.
A formal analysis of complexity and systemic risk in financial networks with derivatives
The 2008 financial crisis has been attributed by policymakers to “excessive complexity” of the financial network, especially due to financial derivatives. In a financial network, financial institutions (“banks” for short) are connected by financial contracts. As banks depend on payments from contracts with other banks to cover their own obligations, such a situation creates systemic risk, i.e., the risk of a financial crisis. Some of the contracts are financial derivatives, where an obligation to pay depends on another variable.
In this thesis, I study in what sense derivatives make a financial network fundamentally “more complex” compared to one without derivatives. I capture the notion of “complexity” formally using tools from finance and theoretical computer science. I reveal new kinds of systemic risk that arise in financial networks specifically because of derivatives and I discuss the impact of recent regulatory policy.
I first focus on a type of derivative called a credit default swap (CDS), in which the writer insures the holder of the contract against the default (i.e., bankruptcy) of a third party, the reference entity. I show that, when the reference entity is another bank, then such CDSs introduce a new kind of systemic risk arising from what I call default ambiguity. Default ambiguity is a situation where it is impossible to decide which banks are in default following a shock (i.e., a loss in banks’ assets). At a technical level, I show that the clearing problem may have no solution or multiple incompatible solutions. In contrast, without CDSs, a unique canonical solution always exists.
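The "unique canonical solution" without CDSs refers to clearing in the style of the standard Eisenberg–Noe model, where each bank pays the minimum of what it owes and what it has, and iterating from full payment converges to the greatest fixed point. A minimal sketch (the three-bank network below is invented purely for illustration):

```python
import numpy as np

# Hypothetical three-bank network, invented for illustration.
L = np.array([[0.0, 2.0, 0.0],   # L[i, j]: nominal liability of bank i to j
              [0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0]])
e = np.array([0.1, 0.0, 0.0])    # external assets of each bank

bar_p = L.sum(axis=1)            # total obligations per bank
Pi = np.divide(L, bar_p[:, None], out=np.zeros_like(L),
               where=bar_p[:, None] > 0)   # relative liabilities

p = bar_p.copy()
for _ in range(100):
    # Each bank pays min(what it owes, what it has); starting from full
    # payment, the iteration decreases monotonically to the greatest
    # clearing vector.
    p_new = np.minimum(bar_p, e + Pi.T @ p)
    if np.allclose(p_new, p):
        break
    p = p_new

print(np.round(p, 6))  # banks paying less than bar_p are in default
```

The monotone fixed-point structure is what guarantees existence and canonicity here; it is exactly this structure that breaks down once CDS payments, which can increase when a reference entity defaults, enter the network.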
I then demonstrate that increased “complexity” due to CDSs also manifests as computational complexity. In more detail, I show that the clearing problem leads to NP-complete decision and PPAD-complete approximation problems if CDSs are allowed. This implies a fundamental barrier to the computational analysis of these networks, specifically to macroprudential stress testing. Without CDSs, the problems are either trivial or in P. I study the impact of different regulatory policies. My main result is that the aforementioned phenomena can be attributed to naked CDS positions.
In a final step, I focus on one specific regulatory policy: mandatory portfolio compression, which is a post-trade mechanism by which cycles in the financial network are eliminated. While this always reduces individual exposures, I show that, surprisingly, it can worsen the impact of certain shocks. Banks’ incentives to compress may further be misaligned with social welfare. I provide sufficient conditions on the network structure under which these issues are eliminated. Overall, my results in this thesis contribute to a better understanding of systemic risk and the effects of regulatory policy.
Towards the improvement of machine learning peak runoff forecasting by exploiting ground- and satellite-based precipitation data: A feature engineering approach
Peak runoff forecasting in complex mountain systems poses significant challenges in hydrology
due to limitations in traditional physically-based models and data scarcity. The
integration of machine learning (ML) techniques offers a promising solution, balancing
computational efficiency with the ability to incorporate satellite precipitation products (SPPs).
However, debates have emerged regarding the effectiveness of ML in hydrology, as its black-box
nature lacks an explicit representation of hydrological processes, hindering performance
improvement and result reproducibility. To address these concerns, recent studies emphasize the
inclusion of feature engineering (FE) strategies to incorporate physical knowledge into ML
models, enabling a better understanding of the system and improved forecasting accuracy.
enhance the effectiveness of ML in peak runoff forecasting by integrating hydrological concepts
through FE techniques, utilizing both ground-based and satellite-based precipitation data. For
this, we explore ML techniques and strategies to enhance accuracy in complex macro- and mesoscale
hydrological systems.
Additionally, we propose an FE strategy for the proper utilization of SPP information, which is crucial for overcoming spatial and temporal data scarcity.
The integration of advanced ML techniques and FE represents a significant advancement in hydrology,
particularly for complex mountain systems with limited or nonexistent monitoring networks.
The findings of this study will provide valuable insights for decision-makers and hydrologists, facilitating effective mitigation of the impacts of peak runoffs. Moreover, the developed methodologies can be adapted
to other macro- and meso-scale systems, with necessary adjustments based on available data
and system-specific characteristics, thus benefiting the broader scientific community.
Wavelet Theory
The wavelet is a powerful mathematical tool that plays an important role in science and technology. This book looks at some of the most creative and popular applications of wavelets, including biomedical signal processing, image processing, communication signal processing, Internet of Things (IoT), acoustical signal processing, financial market data analysis, energy and power management, and COVID-19 pandemic measurements and calculations. The editor’s personal interest is in applying the wavelet transform to identify time-domain changes in signals and their corresponding frequency components, and in improving power amplifier behavior.
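The time-localisation property mentioned here can be shown with the simplest wavelet of all. Below is a minimal NumPy sketch of a single-level Haar transform (illustrative code, not taken from the book), which pinpoints a step change in a signal through its detail coefficients:

```python
import numpy as np

def haar_dwt(x):
    # Single-level Haar transform: pairwise sums give the coarse
    # approximation, pairwise differences the detail; the 1/sqrt(2)
    # factor makes the transform orthonormal.
    x = np.asarray(x, dtype=float)
    approx = (x[::2] + x[1::2]) / np.sqrt(2)
    detail = (x[::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

# A step change at sample 7; the only nonzero detail coefficient is the
# one whose sample pair (6, 7) straddles the step.
signal = np.concatenate([np.zeros(7), np.ones(9)])
approx, detail = haar_dwt(signal)
print(int(np.argmax(np.abs(detail))))  # -> 3
```

Unlike a Fourier coefficient, which spreads a discontinuity across all frequencies, the Haar detail coefficient is nonzero only where the change happens, which is the property exploited in the signal-processing applications the book surveys.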