435 research outputs found
Fundamental Moral Attitudes to Animals and Their Role in Judgment: An Empirical Model to Describe Fundamental Moral Attitudes to Animals and Their Role in Judgment on the Culling of Healthy Animals During an Animal Disease Epidemic
In this paper, we present and defend the theoretical framework of an empirical model to describe people's fundamental moral attitudes (FMAs) to animals, the stratification of FMAs in society, and the role of FMAs in judgment on the culling of healthy animals in an animal disease epidemic. We used philosophical animal ethics theories to understand the moral basis of FMA convictions. Moreover, these theories provide us with a moral language for communication between animal ethics, FMAs, and public debates. We defend that FMA is a two-layered concept. The first layer consists of deeply felt convictions about animals. The second layer consists of convictions derived from the first layer that serve as arguments in a debate on animal issues. In a debate, the latter convictions are variable, depending on the animal issue in a specific context, time, and place. This variability facilitates finding common ground in an animal issue between actors with opposing convictions.
Beyond the Prevention of Harm: Animal Disease Policy as a Moral Question
European animal disease policy seems to find its justification in a 'harm to other' principle. Limiting the freedom of animal keepers, e.g., by culling their animals, is justified by the aim to prevent harm, i.e., the spreading of the disease. The picture, however, is more complicated. Both during the control of outbreaks and in the prevention of notifiable animal diseases, the government is confronted with conflicting claims of stakeholders who anticipate running a risk of being harmed by each other, and who ask for government intervention. In this paper, we first argue that in a policy that aims to prevent animal diseases, the focus shifts from limiting 'harm' to weighing conflicting claims with respect to 'risks of harm.' Therefore, we claim that the harm principle is no longer a sufficient justification for governmental intervention in animal disease prevention. A policy that has to deal with and distribute conflicting risks of harm needs additional value assumptions that guide this process of assessment and distribution. We show that current policies are based on assumptions that are mainly economic considerations. In order to show the limitations of these considerations, we use the interests and position of keepers of backyard animals as an example. Based on the problems they faced during and after the recent outbreaks, we defend the thesis that, in order to develop a sustainable animal disease policy, assumptions other than economic ones need to be taken into account.
World-Sheet Duality, Space-Time Foam, and the Quantum Fate of a Stringy Black Hole
We interpret Minkowski black holes as world-sheet spikes which are related by world-sheet duality to vortices that correspond to Euclidean black holes. These world-sheet defects induce defects in the gauge fields of the corresponding coset Wess-Zumino descriptions of spherically-symmetric black holes. The low-temperature target space-time foam is a Minkowski black hole (spike) plasma with confined Euclidean black holes (vortices). The high-temperature phase is a dense vortex plasma described by a topological gauge field theory on the world-sheet, which possesses enhanced symmetry as in the target space-time singularity at the core of a black hole. Quantum decay via higher-genus effects induces a back-reaction which causes a Minkowski black hole to lose mass until it is indistinguishable from intrinsic fluctuations in the space-time foam.
Comment: 16 pages, CERN-TH.6534/92 (correction of a minor typographical error on page 12)
Correlation between centrality metrics and their application to the opinion model
In recent decades, a number of centrality metrics describing network properties of nodes have been proposed to rank the importance of nodes. In order to understand the correlations between centrality metrics and to approximate a high-complexity centrality metric by a strongly correlated low-complexity metric, we first study the correlation between centrality metrics in terms of their Pearson correlation coefficient and their similarity in ranking of nodes. In addition to considering the widely used centrality metrics, we introduce a new centrality measure, the degree mass. The m-th order degree mass of a node is the sum of the weighted degrees of the node and of its neighbors no further than m hops away. We find that B_{n}, the closeness, and the components of x_{1} are strongly correlated with the degree, the 1st-order degree mass, and the 2nd-order degree mass, respectively, in both network models and real-world networks. We then theoretically prove that the Pearson correlation coefficient between x_{1} and the 2nd-order degree mass is larger than that between x_{1} and a lower-order degree mass. Finally, we investigate the effect of inflexible antagonists selected according to different centrality metrics in helping one opinion compete with another in the inflexible antagonists opinion model. Interestingly, we find that selecting the inflexible antagonists based on the leverage, B_{n}, or the degree is more effective in opinion competition than using other centrality metrics in all types of networks. This observation is supported by our earlier observations, i.e., that there is a strong linear correlation between the degree and B_{n}, as well as a high centrality similarity between the leverage and the degree.
Comment: 20 pages
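As a rough illustration of the degree-mass definition above, the sketch below computes the m-th order degree mass of every node with networkx; the function name and the use of plain (unweighted) degrees are assumptions made for illustration, not the authors' implementation.

```python
import networkx as nx

def degree_mass(G, m):
    """m-th order degree mass: for each node, sum the degrees of the node
    itself and of all neighbors at most m hops away (illustrative sketch;
    the paper uses weighted degrees)."""
    mass = {}
    for v in G.nodes():
        # nodes within m hops of v, including v itself (distance 0)
        ball = nx.single_source_shortest_path_length(G, v, cutoff=m)
        mass[v] = sum(G.degree(u) for u in ball)
    return mass

if __name__ == "__main__":
    G = nx.karate_club_graph()
    dm1 = degree_mass(G, 1)   # 1st-order degree mass
    dm2 = degree_mass(G, 2)   # 2nd-order degree mass
    print(sorted(dm1.items(), key=lambda kv: -kv[1])[:5])
```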
Greedy D-Approximation Algorithm for Covering with Arbitrary Constraints and Submodular Cost
This paper describes a simple greedy D-approximation algorithm for any covering problem whose objective function is submodular and non-decreasing, and whose feasible region can be expressed as the intersection of arbitrary (closed upwards) covering constraints, each of which constrains at most D variables of the problem. (A simple example is Vertex Cover, with D = 2.) The algorithm generalizes previous approximation algorithms for fundamental covering problems and online paging and caching problems.
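To make the D = 2 special case concrete, here is a minimal sketch of the classic 2-approximation for unweighted Vertex Cover that repeatedly takes both endpoints of an uncovered edge; it is offered as an illustration of a D-approximation in the sense above, not a reconstruction of the paper's general algorithm.

```python
def vertex_cover_2_approx(edges):
    """Greedy 2-approximation for unweighted Vertex Cover (D = 2).

    Take any edge not yet covered and add both of its endpoints;
    every optimal cover contains at least one endpoint of each chosen
    edge, so the result is at most twice the optimum."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

if __name__ == "__main__":
    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
    print(vertex_cover_2_approx(edges))
```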
Lorentz breaking Effective Field Theory and observational tests
Analogue models of gravity have provided an experimentally realizable test field for our ideas on quantum field theory in curved spacetimes, but they have also inspired the investigation of possible departures from exact Lorentz invariance at microscopic scales. In this role they have joined, and sometimes anticipated, several quantum gravity models characterized by Lorentz-breaking phenomenology. A crucial difference between these speculations and others associated with quantum gravity scenarios is the possibility of carrying out observational and experimental tests, which have nowadays led to a broad range of constraints on departures from Lorentz invariance. We shall review here the effective field theory approach to Lorentz breaking in the matter sector, present the constraints provided by the available observations, and finally discuss the implications of the persisting uncertainty on the composition of the ultra-high-energy cosmic rays for the constraints on the higher-order, analogue-gravity-inspired, Lorentz violations.
Comment: 47 pages, 4 figures. Lecture Notes for the IX SIGRAV School on "Analogue Gravity", Como (Italy), May 2011. V.3: typo corrected, references added.
Spinor condensates and light scattering from Bose-Einstein condensates
These notes discuss two aspects of the physics of atomic Bose-Einstein condensates: optical properties and spinor condensates. The first topic includes light scattering experiments which probe the excitations of a condensate in both the free-particle and phonon regimes. At higher light intensity, a new form of superradiance and phase-coherent matter wave amplification were observed. We also discuss properties of spinor condensates and describe studies of ground-state spin domain structures and dynamical studies which revealed metastable excited states and quantum tunneling.
Comment: 58 pages, 33 figures, to appear in Proceedings of Les Houches 1999 Summer School, Session LXXI.
A laboratory-based scoring system predicts early treatment in Rai 0 chronic lymphocytic leukemia
We present a laboratory-based prognostic calculator (designated CRO score) to risk-stratify treatment-free survival in early-stage (Rai 0) chronic lymphocytic leukemia, developed using a training-validation model in a series of 1,879 cases from Italy, the United Kingdom, and the United States. By means of regression analysis, we identified five prognostic variables with weighting as follows: deletion of the short arm of chromosome 17 and unmutated immunoglobulin heavy chain gene status, 2 points each; deletion of the long arm of chromosome 11, trisomy of chromosome 12, and white blood cell count > 32.0×10³/µL, 1 point each. Low-, intermediate-, and high-risk categories were established by recursive partitioning in a training cohort of 478 cases, and then validated in four independent cohorts of 144/395/540/322 cases, as well as in the composite validation cohort. Concordance indices were 0.75 in the training cohort and ranged from 0.63 to 0.74 in the four validation cohorts (0.69 in the composite validation cohort). These findings support the potential application of our novel prognostic calculator to better stratify early-stage chronic lymphocytic leukemia and to aid case selection in risk-adapted treatment for early disease. Furthermore, they support performing immunocytogenetic analysis in Rai 0 chronic lymphocytic leukemia at the time of diagnosis to aid prognosis and treatment, particularly in today's chemo-free era.
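A minimal sketch of how the point assignment described above could be computed; the risk-category cut-offs were derived by recursive partitioning and are not given in the abstract, so the sketch stops at the raw score, and the function and parameter names are illustrative rather than the authors' implementation.

```python
def cro_score(del17p, ighv_unmutated, del11q, trisomy12, wbc_per_uL):
    """Sum the CRO-score points as weighted in the abstract:
    del(17p) and unmutated IGHV, 2 points each; del(11q), trisomy 12,
    and WBC > 32.0 x 10^3 / microliter, 1 point each."""
    score = 0
    score += 2 if del17p else 0
    score += 2 if ighv_unmutated else 0
    score += 1 if del11q else 0
    score += 1 if trisomy12 else 0
    score += 1 if wbc_per_uL > 32.0e3 else 0
    # Mapping of the score to low/intermediate/high risk was established
    # by recursive partitioning in the training cohort and is not
    # specified in the abstract.
    return score

# Example: unmutated IGHV, trisomy 12, WBC of 45,000/microliter
print(cro_score(False, True, False, True, 45_000))  # -> 4
```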
- …