Implicit Loss of Surjectivity and Facial Reduction: Theory and Applications
Facial reduction, pioneered by Borwein and Wolkowicz, is a preprocessing method that is commonly used to obtain strict feasibility in the reformulated, reduced constraint system.
The importance of strict feasibility is often addressed in the context of the convergence results for interior point methods.
Beyond these theoretical properties, we show that facial reduction, not limited to interior point methods, leads to strong numerical performance across different classes of algorithms.
In this thesis we study various consequences and the broad applicability of facial reduction.
The thesis is organized in two parts.
In the first part, we show the instabilities that accompany the absence of strict feasibility, viewed through the lens of facially reduced systems.
In particular, we exploit the implicit redundancies, revealed by each nontrivial facial reduction step, resulting in the implicit loss of surjectivity.
This leads to the two-step facial reduction and two novel related notions of singularity.
For the area of semidefinite programming, we use these singularities to strengthen a known bound on the solution rank, the Barvinok-Pataki bound.
For the area of linear programming, we reveal degeneracies caused by the implicit redundancies.
Furthermore, we propose a preprocessing tool that uses the simplex method.
In the second part of this thesis, we continue with the semidefinite programs that do not have strictly feasible points.
We focus on the doubly-nonnegative relaxation of the binary quadratic program and a semidefinite program with a nonlinear objective function.
We work closely with two classes of algorithms: the splitting method and the Gauss-Newton interior point method.
We elaborate on the advantages in building models from facial reduction. Moreover, we develop algorithms for real-world problems including the quadratic assignment problem, the protein side-chain positioning problem, and the key rate computation for quantum key distribution.
Facial reduction continues to play an important role in providing robust reformulated models, both theoretically and practically, resulting in strong numerical performance.
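The facial reduction step described above can be sketched in standard SDP form (generic textbook notation, not the thesis's own). Given a feasible spectrahedron

```latex
\mathcal{F} \;=\; \{\, X \in \mathbb{S}^n_+ \;:\; \langle A_i, X\rangle = b_i,\ i = 1,\dots,m \,\},
```

if no strictly feasible $X \succ 0$ exists, then (Borwein-Wolkowicz) there is an exposing vector

```latex
0 \neq Z \;=\; \sum_{i=1}^m y_i A_i \;\succeq\; 0, \qquad b^\top y = 0,
```

so every feasible $X$ satisfies $\langle Z, X\rangle = b^\top y = 0$ and hence $ZX = 0$. Taking $V$ with columns spanning $\operatorname{null}(Z)$ and substituting $X = VRV^\top$ with $R \succeq 0$ yields an equivalent system in a smaller matrix variable; repeating until strict feasibility holds is the facial reduction procedure, and each nontrivial step reveals an implicit redundancy of the kind the abstract refers to.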
Towards A Practical High-Assurance Systems Programming Language
Writing correct and performant low-level systems code is a notoriously demanding job, even for experienced developers. To make matters worse, formally reasoning about its correctness properties introduces yet another level of complexity, requiring considerable expertise in both systems programming and formal verification. Without appropriate tools that provide abstraction and automation, development can be extremely costly due to the sheer complexity of these systems and the nuances within them.
Cogent is designed to alleviate the burden on developers when writing and verifying systems code. It is a high-level functional language with a certifying compiler, which automatically proves the correctness of the compiled code and also provides a purely functional abstraction of the low-level program to the developer. Equational reasoning techniques can then be used to prove functional correctness properties of the program on top of this abstract semantics, which is notably less laborious than directly verifying the C code.
To make Cogent a more approachable and effective tool for developing real-world systems, we further strengthen the framework by extending the core language and its ecosystem. Specifically, we enrich the language to allow users to control the memory representation of algebraic data types, while retaining the automatic proof via a data layout refinement calculus. We repurpose existing tools in a novel way and develop an intuitive foreign function interface, which gives users a seamless experience when using Cogent in conjunction with native C. We augment the Cogent ecosystem with a property-based testing framework, which helps developers better understand the impact formal verification has on their programs and enables a progressive approach to producing high-assurance systems. Finally, we explore refinement type systems, which we plan to incorporate into Cogent for more expressiveness and better integration of systems programmers with the verification process.
Beam scanning by liquid-crystal biasing in a modified SIW structure
A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for application in 2D scanning with lateral alignment. The 2D array environment demands full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing; the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW), modified to work as a Groove Gap Waveguide with radiating slots etched on the upper broad wall, which radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs. At the same time, the RF field remains laterally confined, making it possible to lay several antennas in parallel and achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.
Swarm Reinforcement Learning For Adaptive Mesh Refinement
The Finite Element Method, an important technique in engineering, is aided by
Adaptive Mesh Refinement (AMR), which dynamically refines mesh regions to allow
for a favorable trade-off between computational speed and simulation accuracy.
Classical methods for AMR depend on task-specific heuristics or expensive error
estimators, hindering their use for complex simulations. Recent learned AMR
methods tackle these problems, but so far scale only to simple toy examples. We
formulate AMR as a novel Adaptive Swarm Markov Decision Process in which a mesh
is modeled as a system of simple collaborating agents that may split into
multiple new agents. This framework allows for a spatial reward formulation
that simplifies the credit assignment problem, which we combine with Message
Passing Networks to propagate information between neighboring mesh elements. We
experimentally validate the effectiveness of our approach, Adaptive Swarm Mesh
Refinement (ASMR), showing that it learns reliable, scalable, and efficient
refinement strategies on a set of challenging problems. Our approach
significantly speeds up computation, achieving up to 30-fold improvement
compared to uniform refinements in complex simulations. Additionally, we
outperform learned baselines and achieve a refinement quality that is on par
with a traditional error-based AMR strategy without expensive oracle
information about the error signal.

Comment: Version 1 of this paper is a preliminary workshop version that was accepted as a workshop paper at the ICLR 2023 Workshop on Physics for Machine Learning.
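The element-as-agent idea behind ASMR can be illustrated with a toy 1D sketch. This is entirely illustrative: a hand-coded jump indicator stands in for the learned policy, and there is no message passing or reward signal.

```python
import numpy as np

def indicator(a, b, f):
    # crude local error proxy: jump of f across the element
    return abs(f(b) - f(a))

def adapt(f, max_depth=8, tol=1e-3):
    # each element acts as a simple agent that may split into two children
    active, done = [(0.0, 1.0)], []
    for _ in range(max_depth):
        next_active = []
        for a, b in active:
            if indicator(a, b, f) > tol:
                m = 0.5 * (a + b)
                next_active += [(a, m), (m, b)]  # agent splits
            else:
                done.append((a, b))              # agent keeps its element
        active = next_active
        if not active:
            break
    return sorted(done + active)

f = lambda x: np.tanh(50.0 * (x - 0.5))  # sharp feature near x = 0.5
mesh = adapt(f)
print(len(mesh))  # far fewer elements than the 256 of a uniform depth-8 mesh
```

The mesh concentrates small elements around the sharp feature while leaving flat regions coarse, which is the trade-off between speed and accuracy that AMR targets.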
Minimum income support systems as elements of crisis resilience in Europe: Final Report
Minimum income support systems serve as a safety net of last resort in most developed welfare states. Accordingly, they play a particularly important role in times of economic crisis. The extent to which minimum income systems are drawn upon during a crisis also depends on the design of upstream social protection systems. This study examines the importance of minimum income systems, as well as upstream systems such as unemployment insurance, short-time work schemes and employment protection legislation, for crisis resilience in Europe. In the context of the 2008/2009 financial crisis and the COVID-19 crisis, it assesses the capacity of social policy measures to contain poverty and income losses and to prevent social exclusion. The study draws on quantitative and qualitative methods, including multivariate analyses, microsimulation methods and in-depth case studies of Denmark, France, Ireland, Poland and Spain, which represent different types of welfare states.

The aim of this study is to analyse the role of social policies in different European welfare states regarding minimum income protection and active inclusion. The core focus lies on crisis resilience, i.e. the capacity of social policy arrangements to contain poverty and inequality and avoid exclusion before, during and after periods of economic shocks. To achieve this goal, the study expands its analytical focus to include other tiers of social protection, in particular upstream systems such as unemployment insurance, job retention and employment protection, as they play an additional and potentially prominent role in providing income and job protection in situations of crisis. A mixed-method approach is used that combines quantitative and qualitative research, such as descriptive and multivariate quantitative analyses, microsimulation methods and in-depth case studies.
The study finds consistent differences in crisis resilience across countries and welfare state types. In general, Nordic and Continental European welfare states with strong upstream systems and minimum income support (MIS) perform better on core socio-economic outcomes such as poverty and exclusion risks. However, labour market integration shows some dualisms in Continental Europe. The study shows that MIS is particularly important where there are gaps in upstream systems or in cases of severe and lasting crises.
Preferentialism and the conditionality of trade agreements. An application of the gravity model
Modern economic growth is driven by international trade, and the preferential trade agreement constitutes the primary fit-for-purpose mechanism for establishing, facilitating, and governing its flows. However, too little attention has been paid to the differences in content and conditionality across trade agreements, leading to an under-considered mischaracterisation of the design-flow relationship. Similarly, while the relationship between trade facilitation and trade is clear, the way trade facilitation affects other areas of economic activity, with respect to preferential trade agreements, has received considerably less attention. In light of an increasingly globalised and interdependent trading system, the interplay between trade facilitation and foreign direct investment is of particular importance.
Accordingly, this thesis explores the bilateral trade and investment effects of specific conditionality sets, as established within Preferential Trade Agreements (PTAs).
Chapter one utilises recent content condition-indexes for depth, flexibility, and constraints on flexibility, established by Dür et al. (2014) and Baccini et al. (2015), within a gravity framework to estimate the average treatment effect of trade agreement characteristics across bilateral trade relationships in the Association of Southeast Asian Nations (ASEAN) from 1948 to 2015. This chapter finds that the composition of a given ASEAN trade agreement's characteristic set has significantly determined the concomitant bilateral trade flows. Conditions determining the classification of a trade agreement's depth are positively associated with an increase in bilateral trade, representing the furthered removal of trade barriers and frictions facilitated by deeper trade agreements. Flexibility conditions, and constraints on flexibility conditions, are also identified as significant determinants of a given trade agreement's treatment effect on subsequent bilateral trade flows. Given the political nature of their inclusion (i.e., addressing short-term domestic discontent), this influence on trade flows is negative. These results highlight the longer implementation time frames required for trade impediments to be removed in a market with higher domestic uncertainty.
Chapter two explores the incorporation of non-trade issue (NTI) conditions in PTAs. Such conditions are increasing at both the intensive and extensive margins. There is a concern among developing nations that this growth of NTI inclusions serves as a way for high-income (HI) nations to dictate the trade agenda, such that developing nations are subject to ‘principled protectionism’. There is evidence that NTI provisions are partly driven by protectionist motives, but the effect on trade flows remains largely undiscussed. Utilising the gravity model for trade, I test Lechner’s (2016) comprehensive NTI dataset for 202 bilateral country pairs across a 32-year timeframe and find that, on average, NTIs are associated with an increase in bilateral trade. This boost can primarily be attributed to the market access that a PTA utilising NTIs facilitates. These results also align theoretically with discussions of market harmonisation, shared values, and the erosion of artificial production advantages. Instead of inhibiting trade through burdensome costs, NTIs act to support a more stable production and trading environment, motivated by enhanced market access. Employing a novel classification to capture the power supremacy associated with shaping NTIs, this chapter highlights that the positive impact of NTIs is largely driven by the relationship between HI nations and middle-to-low-income (MTLI) counterparts.
Chapter three employs the gravity model, theoretically augmented for foreign direct investment (FDI), to estimate the effects of trade facilitation conditions, utilising indexes established by Neufeld (2014) and the bilateral FDI data curated by UNCTAD (2014). The resultant dataset covers 104 countries over a period of 12 years (2001–2012), containing 23,640 observations. The results highlight the bilateral-FDI-enhancing effects of trade facilitation conditions in the ASEAN context, aligning with the theoretical branch of the FDI-PTA literature which outlines how the ratification of a trade agreement results in improved economic prospects between partners (Medvedev, 2012), stemming from the interrelation between trade and investment within an improving regulatory environment. The results align with the expectation that an enhanced trade facilitation landscape (one in which formalities, procedures, information, and expectations around trade facilitation are conditioned for) incentivises and attracts FDI.
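The gravity framework used throughout these chapters can be summarised in a standard log-linearised form (a generic textbook specification, not the thesis's exact estimating equation):

```latex
\ln X_{ijt} \;=\; \beta_0
            + \beta_1 \ln \mathrm{GDP}_{it}
            + \beta_2 \ln \mathrm{GDP}_{jt}
            + \beta_3 \ln \mathrm{Dist}_{ij}
            + \gamma \, \mathrm{PTA}_{ijt}
            + \varepsilon_{ijt}
```

Here $X_{ijt}$ is the bilateral flow (trade, or FDI in chapter three) from country $i$ to country $j$ in year $t$, $\mathrm{Dist}_{ij}$ is bilateral distance, and $\mathrm{PTA}_{ijt}$ is the agreement variable of interest: a simple dummy in the basic model, or a depth, flexibility, NTI, or trade-facilitation index in the respective chapters, with $\gamma$ interpreted as the average treatment effect of that agreement characteristic.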
Multiscale Modelling of Self-assembly in Soft Matter
This thesis presents all-atom molecular dynamics simulations and the development of coarse-grained models for various classes of liquid crystals. The overall aim was to parametrise chemically specific models, propagating information between different resolutions through multiscale modelling approaches, to investigate hierarchical self-assembly in soft matter systems. Common coarse-graining methods were assessed in terms of their representability
and transferability for applications involving thermotropic calamitic and discotic mesogens, and lyotropic chromonic liquid crystals.
Extensive all-atom simulations were performed on: bent liquid crystal dimers, such as CB7CB; ionic cyanine dyes in aqueous solution (PIC, PCYN, TTBC and BIC); a chromonic
perylene bisimide dye (PER); and its thermotropic discotic analogue (PEROEG). These serve as references to parametrise and validate lower-resolution models and to provide molecular-level insight into these systems. For CB7CB, the twist-bend nematic (NTB) phase is observed and characterised. The self-assembly of cyanine dyes and chromonic mesogens was studied by calculating association free energies of n-mers (where n = 2, 3 or 4). Structures of H-aggregate stacks, with shift and Y-junction defects, and J-aggregates with a brickwork arrangement were detected.
Coarse-graining approaches, including iterative Boltzmann inversion (IBI), multiscale coarse-graining (MS-CG) in the form of hybrid force matching (FM), and the Martini 3 force field, were utilised for the aforementioned systems. An FM model of CB7CB demonstrates high representability and transferability; the NTB phase is captured and the full phase diagram can be explored via heating or cooling. An optimised Martini model correctly exhibits the chromonic nematic and hexagonal phases for PER at the expected concentrations. For PEROEG, an IBI model was found to be superior in modelling the columnar-hexagonal phase. This thesis discusses, in detail, the successes and failures of the various coarse-graining strategies. While successful coarse-graining of liquid crystals remains a challenge, this thesis demonstrates that, with the right choice of method, high-quality coarse-grained models can be developed for both thermotropic and lyotropic systems.
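The IBI method mentioned above follows the standard update V(r) ← V(r) + k_B T ln[g_sim(r)/g_target(r)]. A minimal sketch of that loop, in which a dilute-limit closure g(r) = exp(−V(r)/k_B T) stands in for the coarse-grained MD simulation a real workflow would run at each iteration:

```python
import numpy as np

kT = 1.0  # reduced units

def simulate_rdf(V):
    # stand-in for an MD run: dilute-limit response g(r) = exp(-V(r)/kT).
    # in a real workflow this would be a full coarse-grained simulation.
    return np.exp(-V / kT)

def ibi(g_target, n_iter=20):
    V = np.zeros_like(g_target)  # start from a zero potential
    for _ in range(n_iter):
        g_sim = simulate_rdf(V)
        # IBI update: V <- V + kT * ln(g_sim / g_target)
        V = V + kT * np.log(g_sim / g_target)
    return V

r = np.linspace(0.8, 3.0, 200)
g_target = 1.0 + 0.5 * np.exp(-(r - 1.1) ** 2 / 0.02)  # toy RDF, one peak
V = ibi(g_target)
print(np.max(np.abs(simulate_rdf(V) - g_target)))  # ~0: target RDF reproduced
```

In this dilute-limit toy the loop converges after one update (the potential of mean force is exact there); with a real simulator, many iterations are needed and the match is approximate, which is one source of the representability issues the thesis assesses.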
Differential Models, Numerical Simulations and Applications
This Special Issue includes 12 high-quality articles containing original research findings in the fields of differential and integro-differential models, numerical methods, and efficient algorithms for parameter estimation in inverse problems, with applications to biology, biomedicine, land degradation, traffic flow problems, and manufacturing systems.