Neo-Statecraft Theory, Historical Institutionalism and Institutional Change
This article provides a critical examination of the contribution that statecraft theory, which has been subject to recent revision and development, makes to the literature on institutional change. It articulates an emergent neo-statecraft approach that offers an agent-led form of historical institutionalism. This overcomes the common criticism that historical institutionalists underplay the creative role of actors. The article also argues that the approach brings back into focus the imperatives of electoral politics as a source of institutional change and provides a macro theory of change, which is also commonly missing from historical institutionalist work. It can therefore identify previously unnoticed sources of stability and change, especially in states with strong executives and top-down political cultures.
Political Leadership as Statecraft? Aligning Theory with Praxis in Conversation with British Party Leaders
How should prime ministerial and party leadership be understood and assessed? One leading approach posits that we should assess them in terms of whether they achieve statecraft, that is, winning and maintaining office in government. This article supplements and then assesses that theory by drawing on Pawson and Tilley's (1997) concept of the realist interview, in which practitioners are deployed as co-researchers to assess and revise theory. Unprecedented interviews with British party leaders were therefore undertaken. The article provides new empirical support for the framework because many of the key generative mechanisms identified within the neo-statecraft model were present in an analysis of the interviews. The interviews also allowed the limitations of the model to be demarcated. Statecraft focusses purely on cunning leadership, where the aim is to maximise power and influence. This differs from leadership by conscience, where the aim is to achieve normative goals.
Epistemic Uncertainty-Weighted Loss for Visual Bias Mitigation
Deep neural networks are highly susceptible to learning biases in visual data. While various methods have been proposed to mitigate such bias, the majority require explicit knowledge of the biases present in the training data in order to mitigate them. We argue the relevance of exploring methods which are completely ignorant of the presence of any bias, yet capable of identifying and mitigating it. Furthermore, we propose using Bayesian neural networks with an epistemic uncertainty-weighted loss function to dynamically identify potential bias in individual training samples and to weight them during training. We find a positive correlation between samples subject to bias and higher epistemic uncertainties. Finally, we show the method has potential to mitigate visual bias on a bias benchmark dataset and on a real-world face detection problem, and we consider the merits and weaknesses of our approach.
Comment: To be published in the 2022 IEEE CVPR Workshop on Fair, Data Efficient and Trusted Computer Vision.
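The weighting scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the Bayesian network is approximated by several stochastic forward passes (e.g. Monte Carlo dropout) yielding class probabilities, and takes the epistemic term to be the mutual-information (BALD-style) decomposition of predictive entropy; the function names and the `1 + normalised uncertainty` weighting rule are illustrative choices.

```python
import numpy as np

def epistemic_weights(mc_probs):
    """mc_probs: array of shape (T, N, C) holding class probabilities from
    T stochastic forward passes over N samples. Returns one weight per
    sample that up-weights samples with high epistemic uncertainty."""
    mean_p = mc_probs.mean(axis=0)  # (N, C) posterior-averaged probabilities
    # Total predictive entropy of the averaged distribution.
    predictive_entropy = -(mean_p * np.log(mean_p + 1e-12)).sum(axis=-1)
    # Expected entropy across passes (the aleatoric part).
    expected_entropy = -(mc_probs * np.log(mc_probs + 1e-12)).sum(axis=-1).mean(axis=0)
    # Their difference (mutual information) is the epistemic part.
    epistemic = np.clip(predictive_entropy - expected_entropy, 0.0, None)
    return 1.0 + epistemic / (epistemic.mean() + 1e-12)

def weighted_nll(mc_probs, labels):
    """Per-sample negative log-likelihood, weighted by epistemic uncertainty."""
    mean_p = mc_probs.mean(axis=0)
    nll = -np.log(mean_p[np.arange(len(labels)), labels] + 1e-12)
    return (epistemic_weights(mc_probs) * nll).mean()
```

A sample on which the stochastic passes disagree (high epistemic uncertainty) receives a larger weight than one on which they agree, so the loss pushes the model to attend to potentially bias-affected samples without any explicit bias labels.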
Bayesian uncertainty-weighted loss for improved generalisability on polyp segmentation task
While several previous studies have devised methods for the segmentation of polyps, most of these methods are not rigorously assessed on multi-center datasets. Variability in the appearance of polyps from one center to another, differences in endoscopic instrument grades, and acquisition quality result in methods with good performance on in-distribution test data and poor performance on out-of-distribution or underrepresented samples. Unfair models have serious implications and pose a critical challenge to clinical applications. We adapt an implicit bias mitigation method which leverages Bayesian epistemic uncertainties during training to encourage the model to focus on underrepresented sample regions. We demonstrate the potential of this approach to improve generalisability without sacrificing state-of-the-art performance on a challenging multi-center polyp segmentation dataset (PolypGen) with different centers and image modalities.
Comment: To be presented at the Fairness of AI in Medical Imaging (FAIMI) MICCAI 2023 Workshop and published in the Springer Lecture Notes in Computer Science (LNCS) series.
Left Ventricular Hypertrophy is a predictor of cardiovascular events in elderly hypertensives: Hypertension in the Very Elderly Trial (HYVET)
Objective: We assessed the prognostic value of electrocardiographic left ventricular hypertrophy (LVH) using Sokolow-Lyon (SL-LVH), Cornell Voltage (CV-LVH) or Cornell Product (CP-LVH) Criteria in 3043 hypertensive people aged 80 years and over enrolled in the Hypertension in the Very Elderly Trial.
Methods: Multivariate Cox proportional hazard models were used to estimate hazard ratios (HR) with 95% confidence intervals (CI) for all-cause mortality, cardiovascular diseases, stroke and heart failure in participants with and without LVH at baseline. The mean follow-up was 2.1 years.
Results: LVH identified by CV-LVH or CP-LVH criteria was associated with a 1.6- to 1.9-fold risk of cardiovascular disease and stroke. The presence of CP-LVH was associated with an increased risk of heart failure (HR 2.38, 95% CI 1.16-4.86). In gender-specific analyses, CV-LVH (HR 1.94, 95% CI 1.06-3.55) and CP-LVH (HR 2.36, 95% CI 1.25-4.45) were associated with an increased risk of stroke in women, while in men CV-LVH (HR 6.47, 95% CI 1.41-29.79) and CP-LVH (HR 10.63, 95% CI 3.58-31.57) were associated with an increased risk of heart failure. There was no significant increase in the risk of any outcome associated with SL-LVH. LVH identified by these three methods was not a significant predictor of all-cause mortality.
Conclusions: Use of the Cornell Voltage and Cornell Product criteria for LVH predicted the risk of cardiovascular disease and stroke. Only Cornell Product was associated with an increased risk of heart failure, particularly in men. The identification of electrocardiographic LVH proved to be important in very elderly hypertensive people.
First Extended Catalogue of Galactic bubble infrared fluxes from WISE and Herschel surveys
In this paper, we present the first extended catalogue of far-infrared fluxes of Galactic bubbles. Fluxes were estimated for 1814 bubbles, defined here as the 'golden sample', which were selected from the Milky Way Project First Data Release (Simpson et al.). The golden sample comprised bubbles identified within the Wide-field Infrared Survey Explorer (WISE) dataset (using 12- and 22-μm images) and Herschel data (using 70-, 160-, 250-, 350- and 500-μm images). Flux estimation was achieved initially via classical aperture photometry and then by an alternative image analysis algorithm that used active contours. The accuracy of the two methods was tested by comparing the estimated fluxes for a sample of bubbles, made up of 126 H II regions and 43 planetary nebulae, which were identified by Anderson et al. The results demonstrate good agreement between the two methods. This is by far the largest and most homogeneous catalogue of infrared fluxes measured for Galactic bubbles, and it is a step towards the fully automated analysis of astronomical datasets.
Fracture risk and the use of a diuretic (indapamide sr) ± perindopril: a substudy of the Hypertension in the Very Elderly Trial (HYVET)
BACKGROUND: The Hypertension in the Very Elderly Trial (HYVET) is a placebo-controlled, double-blind trial of treating hypertension with indapamide Slow Release (SR) ± perindopril in subjects over the age of 80 years. The primary endpoints are stroke (fatal and non-fatal). Given that thiazide diuretics and indapamide reduce urinary calcium and may increase bone mineral density, a fracture substudy was designed to investigate whether or not the trial antihypertensive treatment will reduce the fracture rate in very elderly hypertensive subjects. METHODS: In the trial, considerable care is taken to ascertain any fractures and to identify risk factors for fracture, such as falls, co-morbidity, drug treatment, smoking and drinking habits, levels of activity, biochemical abnormalities, cardiac irregularities, impaired cognitive function and symptoms of orthostatic hypotension. POTENTIAL RESULTS: The trial is expected to provide 10,500 patient-years of follow-up. Given a fracture rate of 40/1000 patient-years and a 20% difference in fracture rate, the power of the substudy is 58% to detect this difference at the 5% level of significance. The corresponding power for a reduction of 25% is 78%. CONCLUSION: The trial is well under way, is expected to complete in 2009, and is on target to detect, if present, the above differences in fracture rate.
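The quoted power figures can be reproduced with a standard calculation. The sketch below is illustrative only, under assumptions the abstract does not state explicitly: equal allocation of the 10,500 patient-years between the two arms, a two-sided 5% test, and a normal approximation to the comparison of two Poisson event rates; the function name and interface are made up for this example.

```python
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_two_rates(rate, reduction, patient_years):
    """Approximate power to detect a relative `reduction` in an event rate
    (events per patient-year), comparing two Poisson rates with patient_years
    split equally between arms, two-sided test at the 5% level."""
    t = patient_years / 2.0            # person-years per arm
    r1 = rate                          # control-arm rate
    r2 = rate * (1.0 - reduction)      # treated-arm rate
    se = sqrt(r1 / t + r2 / t)         # SE of the rate difference
    z_crit = 1.959963984540054         # two-sided 5% critical value
    return phi(abs(r1 - r2) / se - z_crit)
```

Under these assumptions, `power_two_rates(0.04, 0.20, 10500)` gives about 0.58 and `power_two_rates(0.04, 0.25, 10500)` about 0.78, matching the 58% and 78% figures quoted above.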
Understanding the power of the prime minister : structure and agency in models of prime ministerial power
Understanding the power of the prime minister is important because of the centrality of the prime minister within the core executive of British government, but existing models of prime ministerial power are unsatisfactory for various reasons. This article makes an original contribution by providing an overview and critique of the dominant models of prime ministerial power, highlighting their largely positivist bent and the related prevalence of overly parsimonious conceptions of the structural contexts prime ministers face. The central argument is that much of the existing literature on prime ministerial power is premised on flawed understandings of the relationship between structure and agency, that this leads to misunderstandings of the real scope of prime ministerial agency, as well as its determinants, and that this can be rectified by adopting a strategic-relational view of structure and agency.
Democratic cultural policy : democratic forms and policy consequences
The forms that are adopted to give practical meaning to democracy are assessed to identify their implications for the production of public policies in general and cultural policies in particular. A comparison of direct, representative, democratic elitist and deliberative versions of democracy identifies clear differences between them in terms of policy form and democratic practice. Further elaboration of these differences and their consequences is identified as an area for further research.
Forty years studying British politics : the decline of Anglo-America
The belief, still present some 40 years ago, that British politics was both exceptional and superior has been replaced by more theoretically sophisticated analyses based on a wider and more rigorously deployed range of research techniques, although historical analysis appropriately remains important. The American influence on the study of British politics has declined, but the European Union dimension has not been fully integrated. The study of interest groups has been in some respects a fading paradigm, but important questions related to democratic health have still to be addressed. Public administration has been supplanted by public policy, but economic policy remains under-studied. A key challenge for the future is the study of the management of expectations.