Energy storage design and integration in power systems by system-value optimization
Energy storage can play a crucial role in decarbonising power systems by balancing
power and energy over time. Wider power system benefits arising from these
balancing technologies include reduced grid expansion, renewable curtailment, and
average electricity costs. However, with the proliferation of new energy storage
technologies, it becomes increasingly difficult to identify which technologies are
economically viable and how to design and integrate them effectively.
Using large-scale energy system models in Europe, the dissertation shows that solely
relying on Levelized Cost of Storage (LCOS) metrics for technology assessments can
be misleading, and that traditional system-value methods raise important questions about
how to assess multiple energy storage technologies. Further, the work introduces a
new complementary system-value assessment method called the market-potential
method, which provides a systematic deployment analysis for assessing multiple
storage technologies under competition. However, integrating energy storage in
system models can lead to the unintended storage cycling effect, which occurs in
approximately two-thirds of models and significantly distorts results. The thesis
finds that traditional remedies for this issue, such as multi-stage optimization
or mixed-integer linear programming, are either ineffective
or computationally inefficient. A new approach is suggested that requires only
appropriate model parameterization with variable costs while keeping the model
convex, reducing the risk of misleading results.
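The LCOS metric critiqued above divides the discounted lifetime cost of a storage asset by its discounted lifetime discharged energy. A minimal sketch of such a calculation (all figures are hypothetical illustrations, not values from the thesis):

```python
def lcos(capex, opex_per_year, energy_per_year, lifetime_years, discount_rate):
    """Levelized Cost of Storage: discounted lifetime costs divided by
    discounted lifetime discharged energy (here per kWh of capacity)."""
    costs = capex + sum(
        opex_per_year / (1 + discount_rate) ** t
        for t in range(1, lifetime_years + 1)
    )
    energy = sum(
        energy_per_year / (1 + discount_rate) ** t
        for t in range(1, lifetime_years + 1)
    )
    return costs / energy  # cost per unit of discharged energy


# Hypothetical battery: 300 EUR/kWh capex, 5 EUR/kWh/a O&M,
# 300 kWh discharged per kWh of capacity per year, 15-year lifetime.
print(lcos(capex=300.0, opex_per_year=5.0, energy_per_year=300.0,
           lifetime_years=15, discount_rate=0.07))
```

As the thesis argues, a per-technology number like this ignores the system context — when the storage charges, what it displaces, and how it interacts with the grid — which is why LCOS rankings alone can mislead.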
In addition, to enable energy storage assessments and energy system research around
the world, the thesis extended the geographical scope of an existing European open-source
model to global coverage. The newly built energy system model ‘PyPSA-Earth’
is thereby demonstrated and validated in Africa. Using PyPSA-Earth, the thesis
assesses for the first time the system value of 20 energy storage technologies across
multiple scenarios in a representative future power system in Africa. The results offer
insights into approaches for assessing multiple energy storage technologies under
competition in large-scale energy system models. In particular, the dissertation
addresses extreme cost uncertainty through a comprehensive scenario tree and finds
that, apart from lithium and hydrogen, only seven energy storage technologies are
optimization-relevant. The work also discovers that a heterogeneous storage design
can increase power system benefits and that some energy storage technologies are more
important than others. Finally, in contrast to traditional methods that consider only
a single energy storage technology, the thesis finds that optimizing multiple energy
storage options can significantly reduce total system costs, by up to 29%.
The presented research findings have the potential to inform decision-making processes
for the sizing, integration, and deployment of energy storage systems in
decarbonized power systems, contributing to a paradigm shift in scientific methodology
and advancing efforts towards a sustainable future.
LIPIcs, Volume 251, ITCS 2023, Complete Volume
Sequential Gibbs Posteriors with Applications to Principal Component Analysis
Gibbs posteriors are proportional to a prior distribution multiplied by an
exponentiated loss function, with a key tuning parameter weighting information
in the loss relative to the prior and providing a control of posterior
uncertainty. Gibbs posteriors provide a principled framework for
likelihood-free Bayesian inference, but in many situations, including a single
tuning parameter inevitably leads to poor uncertainty quantification. In
particular, regardless of the value of the parameter, credible regions have
coverage far from the nominal frequentist level, even in large samples. We propose a
sequential extension to Gibbs posteriors to address this problem. We prove the
proposed sequential posterior exhibits concentration and a Bernstein-von Mises
theorem, which holds under easy-to-verify conditions in Euclidean space and on
manifolds. As a byproduct, we obtain the first Bernstein-von Mises theorem for
traditional likelihood-based Bayesian posteriors on manifolds. All methods are
illustrated with an application to principal component analysis.
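The Gibbs posterior construction described above — a density proportional to a prior times an exponentiated loss, with a learning rate weighting the loss against the prior — can be sketched on a grid for a one-dimensional location parameter with squared-error loss. This is a toy illustration of the basic object, not of the paper's sequential extension:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=1.0, size=50)

theta = np.linspace(-4.0, 6.0, 4001)    # parameter grid
dx = theta[1] - theta[0]
log_prior = -theta**2 / (2 * 10.0**2)   # vague N(0, 10^2) prior, up to a constant


def gibbs_posterior(eta):
    """Gibbs posterior on the grid: prior * exp(-eta * total squared-error loss)."""
    loss = ((data[None, :] - theta[:, None]) ** 2).sum(axis=1)
    log_w = log_prior - eta * loss
    w = np.exp(log_w - log_w.max())     # stabilise before exponentiating
    return w / (w.sum() * dx)           # normalise to a density


def posterior_sd(eta):
    p = gibbs_posterior(eta)
    m = (theta * p).sum() * dx
    return np.sqrt((((theta - m) ** 2) * p).sum() * dx)


# Larger eta -> the loss dominates the prior -> tighter posterior.
print(posterior_sd(0.1), posterior_sd(1.0))
```

The spread of the posterior moves monotonically with the single tuning parameter, which illustrates the abstract's point: one scalar cannot simultaneously deliver correct centering and nominal frequentist coverage in general.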
A Survey on Event-based News Narrative Extraction
Narratives are fundamental to our understanding of the world, providing us
with a natural structure for knowledge representation over time. Computational
narrative extraction is a subfield of artificial intelligence that makes heavy
use of information retrieval and natural language processing techniques.
Despite the importance of computational narrative extraction, relatively little
scholarly work exists on synthesizing previous research and strategizing future
research in the area. In particular, this article focuses on extracting news
narratives from an event-centric perspective. Extracting narratives from news
data has multiple applications in understanding the evolving information
landscape. This survey presents an extensive study of research in the area of
event-based news narrative extraction. In particular, we screened over 900
articles that yielded 54 relevant articles. These articles are synthesized and
organized by representation model, extraction criteria, and evaluation
approaches. Based on the reviewed studies, we identify recent trends, open
challenges, and potential research lines.
Comment: 37 pages, 3 figures, to be published in the journal ACM CSU
Tradition and Innovation in Construction Project Management
This book is a reprint of the Special Issue 'Tradition and Innovation in Construction Project Management' that was published in the journal Buildings.
Introduction to Riemannian Geometry and Geometric Statistics: from basic theory to implementation with Geomstats
As data is a predominant resource in applications, Riemannian geometry is a natural framework to model and unify complex nonlinear sources of data. However, the development of computational tools from the basic theory of Riemannian geometry is laborious. The work presented here forms one of the main contributions to the open-source project geomstats, which consists of a Python package providing efficient implementations of the concepts of Riemannian geometry and geometric statistics, both for mathematicians and for applied scientists, for whom most of the difficulties are hidden under high-level functions. The goal of this monograph is two-fold. First, we aim at giving a self-contained exposition of the basic concepts of Riemannian geometry, providing illustrations and examples at each step and adopting a computational point of view. The second goal is to demonstrate how these concepts are implemented in Geomstats, explaining the choices that were made and the conventions chosen. The general concepts are exposed and specific examples are detailed along the text. The culmination of this implementation is to be able to perform statistics and machine learning on manifolds, with as few lines of code as in the widespread machine learning tool scikit-learn. We exemplify this with an introduction to geometric statistics.
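The kind of manifold statistics this package enables can be illustrated without geomstats itself: the Fréchet mean of points on the unit sphere — the minimizer of summed squared geodesic distances — computed by Riemannian gradient descent with explicit exponential and logarithm maps. This is a self-contained numpy sketch of the concept, not the geomstats API:

```python
import numpy as np


def exp_map(p, v):
    """Exponential map on the unit sphere: follow the geodesic from p along tangent v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return p
    return np.cos(nv) * p + np.sin(nv) * v / nv


def log_map(p, q):
    """Logarithm map: tangent vector at p pointing to q, with length dist(p, q)."""
    cos_t = np.clip(p @ q, -1.0, 1.0)
    t = np.arccos(cos_t)
    if t < 1e-12:
        return np.zeros_like(p)
    u = q - cos_t * p
    return t * u / np.linalg.norm(u)


def frechet_mean(points, n_iter=100):
    """Riemannian gradient descent: repeatedly average log-mapped points
    in the tangent space and map the result back onto the sphere."""
    p = points[0] / np.linalg.norm(points[0])
    for _ in range(n_iter):
        tangent = np.mean([log_map(p, q) for q in points], axis=0)
        p = exp_map(p, tangent)
    return p


raw = np.array([[0.1, 0.0, 1.0], [-0.1, 0.0, 1.0],
                [0.0, 0.1, 1.0], [0.0, -0.1, 1.0]])
points = raw / np.linalg.norm(raw, axis=1, keepdims=True)
print(frechet_mean(points))
```

In geomstats the same computation is a few high-level calls on a manifold object, mirroring the scikit-learn estimator interface mentioned above; the sketch here only exposes the exp/log machinery those calls hide.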
Markov field models of molecular kinetics
Computer simulations such as molecular dynamics (MD) provide a possible means to understand protein dynamics and mechanisms on an atomistic scale. The resulting simulation data can be analyzed with Markov state models (MSMs), yielding a quantitative kinetic model that, e.g., encodes state populations and transition rates. However, the larger an investigated system, the more data is required to estimate a valid kinetic model. In this work, we show that this scaling problem can be escaped when decomposing a system into smaller ones, leveraging weak couplings between local domains. Our approach, termed independent Markov decomposition (IMD), is a first-order approximation neglecting couplings, i.e., it represents a decomposition of the underlying global dynamics into a set of independent local ones. We demonstrate that for truly independent systems, IMD can reduce the sampling by three orders of magnitude. IMD is applied to two biomolecular systems. First, synaptotagmin-1 is analyzed, a rapid calcium switch from the neurotransmitter release machinery. Within its C2A domain, local conformational switches are identified and modeled with independent MSMs, shedding light on the mechanism of its calcium-mediated activation. Second, the catalytic site of the serine protease TMPRSS2 is analyzed with a local drug-binding model. Equilibrium populations of different drug-binding modes are derived for three inhibitors, mirroring experimentally determined drug efficiencies. IMD is subsequently extended to an end-to-end deep learning framework called iVAMPnets, which learns a domain decomposition from simulation data and simultaneously models the kinetics in the local domains. We finally classify IMD and iVAMPnets as Markov field models (MFM), which we define as a class of models that describe dynamics by decomposing systems into local domains. 
Overall, this thesis introduces a local approach to Markov modeling that enables quantitative assessment of the kinetics of large macromolecular complexes, opening up possibilities to tackle current and future questions in computational molecular biology.
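The basic MSM estimate that this decomposition builds on can be sketched in a few lines: count lagged transitions in a discretized trajectory and row-normalize to obtain a transition matrix. This is a toy two-state illustration, not the thesis's IMD or iVAMPnets code:

```python
import numpy as np


def estimate_msm(dtraj, n_states, lag=1):
    """Estimate an MSM transition matrix from a discrete trajectory:
    count transitions at the given lag time, then row-normalize."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(dtraj[:-lag], dtraj[lag:]):
        counts[i, j] += 1
    return counts / counts.sum(axis=1, keepdims=True)


# Simulate a two-state Markov chain with a known transition matrix,
# then recover it from the trajectory.
rng = np.random.default_rng(1)
true_T = np.array([[0.9, 0.1],
                   [0.2, 0.8]])
state, dtraj = 0, []
for _ in range(5000):
    dtraj.append(state)
    state = rng.choice(2, p=true_T[state])

T_hat = estimate_msm(np.array(dtraj), n_states=2)
print(T_hat)
```

The scaling problem the thesis addresses is visible here in miniature: a global model over coupled domains has a state space that grows as the product of the local ones, so the count matrix needs far more data, whereas independent local MSMs like this one stay small.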
Optimization of Sustainable Urban Energy Systems: Model Development and Application
Digital Appendix: Optimization of Sustainable Urban Energy Systems: Model Development and Application
Measuring the impact of COVID-19 on hospital care pathways
Care pathways in hospitals around the world reported significant disruption during the recent COVID-19 pandemic, but measuring the actual impact is more problematic. Process mining can be useful for hospital management to measure the conformance of real-life care to what might be considered normal operations. In this study, we aim to demonstrate that process mining can be used to investigate process changes associated with complex disruptive events. We studied perturbations to accident and emergency (A&E) and maternity pathways in a UK public hospital during the COVID-19 pandemic. Coincidentally, the hospital had implemented a Command Centre approach for patient-flow management, affording an opportunity to study both the planned improvement and the disruption due to the pandemic. Our study proposes and demonstrates a method for measuring and investigating the impact of such planned and unplanned disruptions affecting hospital care pathways. We found that during the pandemic, both A&E and maternity pathways had measurable reductions in the mean length of stay and a measurable drop in the percentage of pathways conforming to normative models. There were no distinctive patterns in the monthly mean values of length of stay or conformance throughout the phases of the installation of the hospital’s new Command Centre approach. Due to a deficit in the available A&E data, the findings for A&E pathways could not be interpreted.
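The conformance measurement described above — the share of observed traces that follow a normative pathway model — can be sketched with a toy model in which normative behaviour is a set of allowed activity-to-activity transitions. The activity names are invented for illustration and are not taken from the study:

```python
# Hypothetical normative model: allowed (from_activity, to_activity) pairs.
NORMATIVE = {
    ("arrival", "triage"),
    ("triage", "treatment"),
    ("treatment", "discharge"),
}


def trace_conforms(trace):
    """A trace conforms if every consecutive pair of activities is allowed."""
    return all((a, b) in NORMATIVE for a, b in zip(trace, trace[1:]))


def conformance_rate(traces):
    """Fraction of traces that conform to the normative model."""
    return sum(trace_conforms(t) for t in traces) / len(traces)


traces = [
    ["arrival", "triage", "treatment", "discharge"],  # conforming
    ["arrival", "treatment", "discharge"],            # triage skipped
]
print(conformance_rate(traces))
```

Real process-mining conformance checking (e.g. alignment-based fitness over a Petri-net model) is considerably richer, but the quantity tracked month by month in the study is of this form: a percentage of pathways matching a normative model.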
A Tale of Two Approaches: Comparing Top-Down and Bottom-Up Strategies for Analyzing and Visualizing High-Dimensional Data
The proliferation of high-throughput and sensory technologies in various fields has led to a considerable increase in data volume, complexity, and diversity. Traditional data storage, analysis, and visualization methods are struggling to keep pace with the growth of modern data sets, necessitating innovative approaches to overcome the challenges of managing, analyzing, and visualizing data across various disciplines.
One such approach is utilizing novel storage media, such as deoxyribonucleic acid (DNA), which presents an efficient, stable, compact, and energy-saving storage option. Researchers are exploring the potential use of DNA as a storage medium for long-term storage of significant cultural and scientific materials.
In addition to novel storage media, scientists are also focusing on developing new techniques that can integrate multiple data modalities and leverage machine learning algorithms to identify complex relationships and patterns in vast data sets. These newly developed data management and analysis approaches have the potential to unlock previously unknown insights into various phenomena and to facilitate more effective translation of basic research findings to practical and clinical applications.
Addressing these challenges necessitates different problem-solving approaches. Researchers are developing novel tools and techniques that require different viewpoints. Top-down and bottom-up approaches are essential techniques that offer valuable perspectives for managing, analyzing, and visualizing complex high-dimensional multi-modal data sets. This cumulative dissertation explores the challenges associated with handling such data and highlights top-down, bottom-up, and integrated approaches that are being developed to manage, analyze, and visualize this data. The work is conceptualized in two parts, each reflecting the two problem-solving approaches and their uses in published studies. The proposed work showcases the importance of understanding both approaches, the steps of reasoning about the problem within them, and their concretization and application in various domains.