EU Turf War in External Affairs
Abstract: After the Lisbon Treaty came into effect in 2009, the European Commission lost its position as the main representative of the European Union in external affairs to the High Representative for Foreign Affairs and the newly established European External Action Service. This thesis explores why the European Commission has nevertheless remained a major player in external affairs. The Commission's power retention is analysed through two case studies: one examining the functioning of the External Action Service in comparison with the European Commission, and another examining the Commission's involvement in policy areas directly linked to the external dimension of the Union. Historical institutionalism, role theory, and the concept of institutional overlap are employed to analyse the methods the European Commission has used to retain its position in external affairs. The thesis concludes that the Commission has successfully retained power through long-term policy development, multilateral frameworks for foreign policy negotiation, policy overlap that maximises the use of its mandate, and the path dependency of expertise-based legitimacy, which together create a policy legacy with the other European Union institutions.
LoadSplunker
IKEA today uses HP's application lifecycle testing environment, which consists of several tools. One of these is Performance Center, used to schedule and run performance tests. When a test run finishes, the tester uses LoadRunner Analysis, another tool in HP's testing environment, to visualise the run as graphs. The tester then uses these graphs to manually write a static report that is sent to the stakeholders. Creating the graphs and test reports can be very time-consuming, and easing this process would likely lead to higher-quality testing and therefore better deliverables. That was the objective of this thesis work, and it was accomplished by building a system that replaces the analysis tool with Splunk, a log-analysis tool that can manage large volumes of data of many different types. The program developed during this thesis work, called LoadSplunker, integrates Performance Center and Splunk. LoadSplunker is a real-time Java system that is notified when a test run in Performance Center finishes and automatically retrieves the results from the run that are needed for analysis. The test results are then uploaded into Splunk, where different views of them can be created and modified.
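The thesis abstract does not include LoadSplunker's source code, but the final upload step can be illustrated with Splunk's standard HTTP Event Collector (HEC) API. The endpoint path and port below are Splunk's HEC defaults; the host, token, and event fields are placeholders, and the payload shape is a hypothetical reduction of a Performance Center result, not LoadSplunker's actual format.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

/** Minimal sketch: push one performance-test result into Splunk via the
 *  HTTP Event Collector (HEC). Host, token and payload are hypothetical. */
public class SplunkUploader {
    public static void main(String[] args) throws Exception {
        String hecUrl = "https://splunk.example.com:8088/services/collector/event";
        String hecToken = "00000000-0000-0000-0000-000000000000"; // placeholder

        // A finished Performance Center run, reduced to a JSON event.
        String payload = """
            {"sourcetype": "loadtest",
             "event": {"runId": 4711, "transaction": "checkout",
                       "avgResponseMs": 812, "errors": 3}}""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(hecUrl))
                .header("Authorization", "Splunk " + hecToken)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```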
A comparative laboratory trial evaluating the immediate efficacy of fluralaner, afoxolaner, sarolaner and imidacloprid + permethrin against adult Rhipicephalus sanguineus (sensu lato) ticks attached to dogs
Variational methods are used for targeting specific correlation effects by tailoring the configuration space. Independent sets of correlation orbitals, embedded in partitioned correlation functions (PCFs), are produced from multiconfiguration Hartree-Fock (MCHF) and multiconfiguration Dirac-Hartree-Fock (MCDHF) calculations. These non-orthogonal functions span configuration state function (CSF) spaces that are coupled to each other by solving the associated generalized eigenvalue problem. The Hamiltonian and overlap matrix elements are evaluated using biorthonormal orbital transformations and efficient counter-transformations of the configuration interaction eigenvectors [1]. This method was successfully applied to describe the total energy of the ground state of beryllium [2], demonstrating fast energy convergence in comparison with the conventional SD-MCHF method, which optimizes a single set of orthonormal one-electron orbitals for the complete configuration space.
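Schematically (notation ours; the abstract names only the ingredients), coupling the PCFs amounts to a generalized eigenvalue problem in a non-orthogonal basis:

```latex
% Schematic PCFI expansion: the wave function mixes a reference function
% with independently optimized partitioned correlation functions (PCFs),
\Psi \;=\; c_0\,\Phi_{\mathrm{ref}} \;+\; \sum_i c_i\,\Lambda_{\mathrm{PCF},i},
% and the mixing coefficients solve the generalized eigenvalue problem
\mathbf{H}\,\mathbf{c} \;=\; E\,\mathbf{S}\,\mathbf{c},
% where the Hamiltonian H_{ij} and overlap S_{ij} \neq \delta_{ij} matrix
% elements are evaluated via the biorthonormal transformation, since each
% PCF carries its own, mutually non-orthogonal orbital set.
```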
In the present work, we investigate the Partitioned Correlation Function Interaction (PCFI)
approach for the two lowest states of neutral lithium, i.e. $1s^2 2s\;{}^2\mathrm{S}$ and $1s^2 2p\;{}^2\mathrm{P}^o$. For both states,
we evaluate the total energy, as well as the expectation values of the specific mass shift operator,
the hyperfine structure parameters and the transition probabilities using different models for
tailoring the configuration space. We quantify the “constraint effect” due to the use of fixed PCF
eigenvector compositions and illustrate the possibility of a progressive deconstraint, up to the
non-orthogonal configuration interaction limit case. The PCFI estimation of the position of the
quartet system relative to the ground state of B I will also be presented.
The PCFI method leads to an impressive improvement in the convergence pattern of all the
spectroscopic properties. As such, Li I, Be I and B I constitute perfect benchmarks for the PCFI
method. For larger systems, it becomes hopeless to saturate a single common set of orthonormal
orbitals and the PCFI method is a promising approach for getting high quality correlated wave
functions. The present study constitutes a major step in the current developments of both atsp2K
and grasp2K packages that adopt the biorthonormal treatment for estimating energies, isotope
shifts, hyperfine structures and transition probabilities.
Toward Quantum Superposition of Living Organisms
The most striking feature of quantum mechanics is the existence of
superposition states, where an object appears to be in different situations at
the same time. The existence of such states has been tested with small objects,
like atoms, ions, electrons and photons, and even with molecules. More
recently, it has been possible to create superpositions of collections of
photons, atoms, or Cooper pairs. Current progress in optomechanical systems may
soon allow us to create superpositions of even larger objects, like micro-sized
mirrors or cantilevers, and thus to test quantum mechanical phenomena at larger
scales. Here we propose a method to cool down and create quantum superpositions
of the motion of sub-wavelength, arbitrarily shaped dielectric objects trapped
inside a high-finesse cavity at very low pressure. Our method is ideally
suited for the smallest living organisms, such as viruses, which survive under
low vacuum pressures, and optically behave as dielectric objects. This opens up
the possibility of testing the quantum nature of living organisms by creating
quantum superposition states in very much the same spirit as the original
Schrödinger's cat "gedanken" paradigm. We anticipate our essay to be a
starting point for experimentally addressing fundamental questions, such as the
role of life and consciousness in quantum mechanics.
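For orientation only (textbook optomechanics rather than equations from the paper): proposals of this kind rely on the dispersive coupling between a cavity mode and the centre-of-mass motion of the trapped object.

```latex
% Standard dispersive optomechanical Hamiltonian (schematic; not taken from
% the paper): cavity mode a (frequency \omega_c) couples to the
% centre-of-mass mode b (frequency \omega_m) of the levitated dielectric,
H \;=\; \hbar\omega_c\, a^\dagger a \;+\; \hbar\omega_m\, b^\dagger b
    \;+\; \hbar g_0\, a^\dagger a\,(b + b^\dagger).
% Driving the cavity red-detuned cools the motion toward its ground state,
% the starting point for preparing motional superposition states.
```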
Reducing distance errors for standard candles and standard sirens with weak-lensing shear and flexion maps
Gravitational lensing induces significant errors in the measured distances to
high-redshift standard candles and standard sirens such as type-Ia supernovae,
gamma-ray bursts, and merging supermassive black hole binaries. There will
therefore be a significant benefit from correcting for the lensing error by
using independent and accurate estimates of the lensing magnification. We
investigate how accurately the magnification can be inferred from convergence
maps reconstructed from galaxy shear and flexion data. We employ ray-tracing
through the Millennium Simulation to simulate lensing observations in large
fields, and perform a weak-lensing reconstruction on these fields. We identify
optimal ways to filter the reconstructed convergence maps and to convert them
to magnification maps. We find that a shear survey with 100 galaxies/arcmin^2
can help to reduce the lensing-induced distance errors for standard
candles/sirens at redshifts z=1.5 (z=5) on average by 20% (10%), whereas a
futuristic survey with shear and flexion estimates from 500 galaxies/arcmin^2
yields much larger reductions of 50% (35%). For redshifts z>=3, a further
improvement by 5% can be achieved, if the individual redshifts of the galaxies
are used in the reconstruction. Moreover, the reconstruction allows one to
identify regions for which the convergence is low, and in which an error
reduction by up to 75% can be achieved.
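For orientation (standard lensing relations, not results specific to this paper): magnification rescales the observed flux or strain, and therefore biases the inferred luminosity distance, which is exactly what a reconstructed convergence map can correct.

```latex
% Standard relations: magnification \mu rescales the observables,
F_{\mathrm{obs}} = \mu\, F_{\mathrm{true}} \quad (\text{candles}), \qquad
h_{\mathrm{obs}} = \sqrt{\mu}\; h_{\mathrm{true}} \quad (\text{sirens}),
% and since F \propto d_L^{-2} and h \propto d_L^{-1}, both imply
d_L^{\mathrm{obs}} \;=\; d_L^{\mathrm{true}}/\sqrt{\mu},
% with \mu \simeq 1 + 2\kappa in the weak-lensing limit, so an estimate of
% the convergence \kappa yields a multiplicative distance correction.
```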
Global data for ecology and epidemiology: a novel algorithm for temporal Fourier processing MODIS data
Background. Remotely-sensed environmental data from earth-orbiting satellites are increasingly used to model the distribution and abundance of both plant and animal species, especially those of economic or conservation importance. Time series of data from the MODerate-resolution Imaging Spectroradiometer (MODIS) sensors on board NASA's Terra and Aqua satellites offer the potential to capture environmental thermal and vegetation seasonality, through temporal Fourier analysis, more accurately than was previously possible using NOAA Advanced Very High Resolution Radiometer (AVHRR) sensor data. MODIS data are composited over 8- or 16-day intervals, which poses unique problems for temporal Fourier analysis: applying standard techniques to MODIS data can introduce errors of up to 30% in the estimated amplitudes and phases of the Fourier harmonics. Methodology/Principal Findings. We present a novel spline-based algorithm that overcomes the processing problems of composited MODIS data. The algorithm is tested on artificial data generated using randomly selected values of both amplitudes and phases, and it provides an accurate estimate of the input variables under all conditions. The algorithm was then applied to produce layers that capture the seasonality in MODIS data for the period from 2001 to 2005. Conclusions/Significance. Global temporal Fourier-processed images of 1 km MODIS data for Middle Infrared Reflectance, day- and night-time Land Surface Temperature (LST), Normalised Difference Vegetation Index (NDVI), and Enhanced Vegetation Index (EVI) are presented for ecological and epidemiological applications. The finer spatial and temporal resolution and the greater geolocational and spectral accuracy of the MODIS instruments, compared with previous multi-temporal data sets, mean that these data may be used with greater confidence in species distribution modelling.
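The paper's spline-based algorithm is not reproduced here, but the quantities it estimates can be sketched generically: the amplitude and phase of the annual Fourier harmonic of an evenly resampled one-year series. The sketch below is a plain discrete-Fourier estimate, not the authors' code; the paper's contribution is the spline resampling that makes such estimates accurate for 8/16-day composites.

```java
/** Generic sketch: estimate amplitude and phase of the first (annual)
 *  Fourier harmonic of a one-year, evenly sampled time series. */
public class FourierHarmonic {
    public static double[] firstHarmonic(double[] series) {
        int n = series.length;
        double re = 0.0, im = 0.0;
        for (int t = 0; t < n; t++) {
            double angle = 2.0 * Math.PI * t / n;   // one cycle per year
            re += series[t] * Math.cos(angle);
            im += series[t] * Math.sin(angle);
        }
        double amplitude = 2.0 * Math.hypot(re, im) / n;
        double phase = Math.atan2(im, re);          // radians
        return new double[] {amplitude, phase};
    }

    public static void main(String[] args) {
        // Synthetic NDVI-like annual cycle, 46 samples/year (8-day steps):
        // mean 0.5, amplitude 0.2, phase 1.0 rad.
        double[] ndvi = new double[46];
        for (int t = 0; t < 46; t++)
            ndvi[t] = 0.5 + 0.2 * Math.cos(2.0 * Math.PI * t / 46.0 - 1.0);
        double[] h = firstHarmonic(ndvi);
        System.out.printf("amplitude=%.3f phase=%.3f rad%n", h[0], h[1]);
    }
}
```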
A cell topography-based mechanism for ligand discrimination by the T cell receptor.
The T cell receptor (TCR) initiates the elimination of pathogens and tumors by T cells. To avoid damage to the host, the receptor must be capable of discriminating between wild-type and mutated self and nonself peptide ligands presented by host cells. Exactly how the TCR does this is unknown. In resting T cells, the TCR is largely unphosphorylated due to the dominance of phosphatases over the kinases expressed at the cell surface. However, when agonist peptides are presented to the TCR by major histocompatibility complex proteins expressed by antigen-presenting cells (APCs), very fast receptor triggering, i.e., TCR phosphorylation, occurs. Recent work suggests that this depends on the local exclusion of the phosphatases from regions of contact of the T cells with the APCs. Here, we developed and tested a quantitative treatment of receptor triggering reliant only on TCR dwell time in phosphatase-depleted cell contacts constrained in area by cell topography. Using the model and experimentally derived parameters, we found that ligand discrimination likely depends crucially on individual contacts being ∼200 nm in radius, matching the dimensions of the surface protrusions used by T cells to interrogate their targets. The model not only correctly predicted the relative signaling potencies of known agonists and nonagonists but also achieved this in the absence of kinetic proofreading. Our work provides a simple, quantitative, and predictive molecular framework for understanding why TCR triggering is so selective and fast and reveals that, for some receptors, cell topography likely influences signaling outcomes.

This work was funded by the Wellcome Trust, the UK Medical Research Council, the UK Biotechnology and Biological Sciences Research Council, and Cancer Research UK. We thank the Wolfson Imaging Centre, University of Oxford, for access to their microscope facility. We would like to thank the Wellcome Trust for the Sir Henry Dale Fellowship of R.A.F. (WT101609MA) and the Royal Society for the University Research Fellowship of S.F.L. (UF120277), and acknowledge a GSK Professorship (D.K.). We are also grateful to Doug Tischer (UCSF, US) and Muaz Rushdi (Georgia Tech, US) for their critical comments on the manuscript.
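A purely illustrative formalization (assumptions and notation ours, not the paper's equations): if a TCR must remain inside a phosphatase-depleted contact long enough to accumulate net phosphorylation, then for a receptor diffusing within a contact of radius R the dwell time ties discrimination directly to contact size.

```latex
% Illustrative sketch only; D, t^{*} and the escape-time scaling are our
% assumptions, not parameters taken from the paper.
% Diffusive dwell time of a receptor (diffusion coefficient D) inside a
% phosphatase-depleted contact of radius R:
\tau_{\mathrm{dwell}} \;\sim\; \frac{R^{2}}{4D}.
% If triggering requires \tau_{\mathrm{dwell}} > t^{*}, the time needed for
% net TCR phosphorylation, the contact radius R sets a dwell-time threshold
% that long-lived agonist-bound TCR--ligand pairs exceed while briefly
% bound ligands do not.
```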
Interpreting Quantum Particles as Conceptual Entities
We elaborate an interpretation of quantum physics founded on the hypothesis
that quantum particles are conceptual entities playing the role of
communication vehicles between material entities composed of ordinary matter
which function as memory structures for these quantum particles. We show in
which way this new interpretation gives rise to a natural explanation for the
quantum effects of interference and entanglement by analyzing how interference
and entanglement emerge for the case of human concepts. We put forward a scheme
to derive a metric based on similarity as a predecessor for the structure of
'space, time, momentum, energy' and 'quantum particles interacting with
ordinary matter' underlying standard quantum physics, within the new
interpretation, and making use of aspects of traditional quantum axiomatics.
More specifically, we analyze how the effect of non-locality arises as a
consequence of the confrontation of such an emerging metric type of structure
and the remaining presence of the basic conceptual structure on the fundamental
level, with the potential of being revealed in specific situations.