
    Sharing sovereignty in the EU’s Area of Freedom, Security and Justice

    European integration has turned the EU neither into a state, in which authority is fully centralized in Brussels, nor into a classic international organization, in which member states remain fully sovereign. Instead, European integration is patchy: for some policies, decision-making authority still rests with the member states, whereas for others it has been transferred to the EU. Why does the EU’s authority vary across policies? Taking policies belonging to the EU’s Area of Freedom, Security and Justice as a sample, Stefan Jagdhuber theorizes and empirically analyzes why integration proceeded on illegal immigration policy and judicial cooperation in civil law matters, whereas it stagnated for legal immigration policy and judicial cooperation in criminal law matters. The findings show that uneven integration trajectories in the EU are likely when policy interdependence, supranational activism and domestic constraints differ across policies. Stefan Jagdhuber studied Political Science, Contemporary History and Sociology at LMU Munich. During his doctoral studies at LMU Munich, he specialized in questions of differentiated integration in the European Union and the EU in international negotiations. His research has appeared in journals such as Politique Européenne, the Journal of European Public Policy and West European Politics.

    Modelling with feature costs under a total cost budget constraint

    In modern high-dimensional data sets, feature selection is an essential pre-processing step for many statistical modelling tasks. The field of cost-sensitive feature selection extends the concepts of feature selection by introducing so-called feature costs. These do not necessarily relate to financial costs, but can be seen as a general construct to numerically valuate any disfavored aspect of a feature, such as the run-time of a measurement procedure or the patient harm of a biomarker test. There are multiple ways to define a cost-sensitive feature selection setup. The strategy applied in this thesis is to introduce an additive cost budget as an upper bound on the total costs. This extends the standard feature selection problem by an additional constraint on the sum of costs of the included features. Main areas of research in this field include adaptations of standard feature selection algorithms to account for this additional constraint. However, cost-aware selection criteria also play an important role in the overall performance of these methods and need to be discussed in detail as well. This cumulative dissertation summarizes the work of three papers in this field. Two of them introduce new methods for cost-sensitive feature selection with a fixed budget constraint; the third discusses a common trade-off criterion of performance and cost. For this criterion, an analysis of the selection outcome in different setups revealed a reduced ability to distinguish between information and noise, which can be counteracted, for example, by introducing a hyperparameter into the criterion. The presented research on new cost-sensitive methods comprises adaptations of Greedy Forward Selection, Genetic Algorithms and filter approaches, as well as a novel Random Forest based algorithm that selects individual trees from a low-cost tree ensemble. Central concepts of each method are discussed, and thorough simulation studies are provided to evaluate individual strengths and weaknesses. Every simulation study includes artificial as well as real-world data examples to validate results in a broad context. Finally, all chapters present discussions with practical recommendations on the application of the proposed methods and conclude with an outlook on possible further research for the respective topics.
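
    To make the budget constraint concrete, the following is a minimal, hypothetical sketch of greedy forward selection under an additive cost budget, in the spirit of the adaptations described above but not the thesis's actual implementation; the synthetic data, the logistic-regression scorer and the random feature costs are illustrative assumptions.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    def greedy_budgeted_selection(X, y, costs, budget):
        """Add the feature with the best cross-validated score one at a
        time, skipping any feature that would push the running cost
        total over the budget."""
        selected = []
        remaining = list(range(X.shape[1]))
        total_cost = 0.0
        while remaining:
            best_j, best_score = None, -np.inf
            for j in remaining:
                if total_cost + costs[j] > budget:
                    continue  # this feature is no longer affordable
                score = cross_val_score(LogisticRegression(max_iter=1000),
                                        X[:, selected + [j]], y, cv=5).mean()
                if score > best_score:
                    best_j, best_score = j, score
            if best_j is None:  # nothing affordable is left: budget exhausted
                break
            selected.append(best_j)
            remaining.remove(best_j)
            total_cost += costs[best_j]
        return selected, total_cost

    X, y = make_classification(n_samples=200, n_features=10, random_state=0)
    costs = np.random.default_rng(0).uniform(0.5, 3.0, size=X.shape[1])
    subset, spent = greedy_budgeted_selection(X, y, costs, budget=5.0)
    print(subset, spent)

    The same feasibility check transfers naturally to other search strategies: in a genetic algorithm, for instance, one natural option is to declare any candidate feature set infeasible when its summed costs exceed the budget.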

    Modelling PolSAR Scattering Signatures at Long Wavelengths of Glacier Ice Volumes

    The crucial role of the cryosphere for understanding global climate change has been widely recognized in recent decades [1]. Glaciers and ice sheets are the main components of the cryosphere and constitute the basic reservoir of fresh water for high latitudes and for many densely populated areas at mid and low latitudes. The need for information on large scales and the inaccessibility of polar regions qualify synthetic aperture radar (SAR) sensors for glaciological applications. At long wavelengths (e.g. P- and L-band), SAR systems are capable of penetrating several tens of meters into the ice body. Consequently, they are sensitive to the glacier surface as well as to sub-surface ice structures. However, the complexity of the scattering mechanisms occurring within the glacier ice volume turns the interpretation of SAR scattering signatures into a challenge, and large uncertainties remain in reliably estimating glacier accumulation rates, ice thickness, subsurface structures and discharge rates. In the literature, great attention has been given to model-based decomposition techniques for polarimetric SAR (PolSAR) data. The first model-based decomposition for glacier ice was proposed in [2] as an adaptation and extension of the well-known Freeman-Durden model [3]. Although this approach was able to interpret many effects in the experimental data, it could not explain, for instance, co-polarization phase differences. The objective of this study is to develop a novel polarimetric model that attempts to explain PolSAR signatures of glacier ice. A new volume scattering component from a cloud of oriented particles will be presented. In particular, inclusions of air and atmospheric gases, typically present in ice volumes [4], are modeled as oblate spheroidal particles, mainly horizontally oriented and embedded in a glacier ice background. Since the model has to account for an oriented ice volume, the anisotropic nature of the ice medium has to be incorporated. This phenomenon, neglected in [2], leads to different refractive indices, i.e. differential propagation velocities (phase differences) and losses of the electromagnetic wave along different polarizations [5]. Furthermore, the introduction of additional scattering components (e.g. from the glacier surface) will extend and complete the polarimetric model. For a first quality assessment, modeled polarimetric signatures are compared to airborne fully polarimetric SAR data at L- and P-band, collected over the Austfonna ice cap in Svalbard, Norway, by DLR’s E-SAR system within the ICESAR 2007 campaign.
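
    As a rough sketch of where the differential propagation enters (a generic textbook form, assuming a uniaxial medium whose optical axis is aligned with the H and V polarization axes; not the exact formulation of this study), the observed scattering matrix of a particle at depth z inside the birefringent ice can be written as

    \[
      [S]_{\mathrm{obs}}(z) \;=\; [P](z)\,[S]\,[P](z),
      \qquad
      [P](z) \;=\;
      \begin{pmatrix}
        e^{-j\kappa_H z} & 0 \\
        0 & e^{-j\kappa_V z}
      \end{pmatrix},
    \]

    where [S] is the scattering matrix of a single oblate spheroidal inclusion, [P](z) appears twice because the wave propagates through the ice on both transmit and receive, and κ_H ≠ κ_V are the complex propagation constants of the two polarizations. Differences in the real parts of κ_H and κ_V accumulate as co-polarization phase differences, while differences in the imaginary parts produce differential extinction; a volume covariance component would then follow from averaging over particle orientation and integrating over depth.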

    Politicisation and international negotiations: why delivering on Brexit proved impossible for Theresa May

    The Brexit negotiations led by Theresa May ultimately ended in failure for both British and European negotiators. Drawing on a new study, Felix Biermann and Stefan Jagdhuber explain why reaching a workable compromise proved impossible.

    Implications on feature detection when using the benefit–cost ratio

    In many practical machine learning applications there are two objectives: one is to maximize predictive accuracy and the other is to minimize the costs of the resulting model. These costs of individual features may be financial costs, but can also refer to other aspects, for example evaluation time. Feature selection addresses both objectives, as it reduces the number of features and can improve the generalization ability of the model. If costs differ between features, the feature selection needs to trade off the individual benefit and cost of each feature. A popular trade-off choice is the ratio of the two, the benefit–cost ratio (BCR). In this paper, we analyze the implications of using this measure, with special focus on the ability to distinguish relevant features from noise. We perform simulation studies for different cost and data settings and obtain detection rates of relevant features and empirical distributions of the trade-off ratio. Our simulation studies exposed a clear impact of the cost setting on the detection rate. In situations with large cost differences and small effect sizes, the BCR missed relevant features and preferred cheap noise features. We conclude that a trade-off between predictive performance and costs without a controlling hyperparameter can easily overemphasize very cheap noise features. While the simple benefit–cost ratio offers an easy solution to incorporate costs, it is important to be aware of its risks. Avoiding costs close to 0, rescaling large cost differences, or using a hyperparameter trade-off are ways to counteract the adverse effects exposed in this paper.
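
    As an illustration of the failure mode described above, here is a small hypothetical sketch; the correlation-based benefit measure, the exponent hyperparameter and the toy data are assumptions made for illustration, not the paper's exact setup.

    import numpy as np

    def bcr_scores(X, y, costs, lam=1.0):
        """Score each feature by benefit / cost**lam; lam = 1 is the
        plain benefit-cost ratio, lam < 1 damps the influence of very
        cheap features."""
        benefit = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                            for j in range(X.shape[1])])
        return benefit / np.asarray(costs) ** lam

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 5))
    y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=300)  # features 0 and 1 carry signal
    costs = np.array([2.0, 2.0, 0.01, 0.01, 0.01])      # the noise features are very cheap
    print(np.argsort(bcr_scores(X, y, costs))[::-1])            # plain BCR ranks cheap noise first
    print(np.argsort(bcr_scores(X, y, costs, lam=0.1))[::-1])   # damped exponent recovers 0 and 1

    Dividing a near-zero benefit by a near-zero cost inflates the score of a cheap noise feature past that of an informative but expensive one, which is exactly the effect the hyperparameter (or cost rescaling) is meant to counteract.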