Why equality? On justifying liberal egalitarianism
The debate over the nature of egalitarianism has come to dominate political philosophy. As ever more sophisticated attempts are made to describe the principles of an egalitarian distribution or to specify the good or goods that should be distributed equally, little is said about the fundamental basis of equality. In virtue of what should people be regarded as equal? Egalitarians have tended to dismiss this question of fundamental equality. In the first part of the paper I examine some of these strategies of marginalisation and assess whether the issue of fundamental equality matters. Jeremy Waldron has criticised this strategy of avoidance in his recent book God, Locke, and Equality. He argues that Locke's turn to a theistic grounding for fundamental equality provides a better approach to the problem than the approach taken by contemporary liberals such as John Rawls. I examine Waldron's critique of Rawls and show that it is wanting. I conclude by suggesting that Rawls's approach to the issue has a bearing on the way in which equality should be understood as a political value. This argument for the primacy of a political conception of egalitarianism bears on the interconnection between core liberal values and the idea of the state that has been emphasised by Rawls, Dworkin and Nagel.
Quantum Gravitational Corrections to the Nonrelativistic Scattering Potential of Two Masses
We treat general relativity as an effective field theory, obtaining the full nonanalytic component of the scattering matrix potential to one-loop order. The lowest-order vertex rules for the resulting effective field theory are presented, and the one-loop diagrams which yield the leading nonrelativistic post-Newtonian and quantum corrections to the gravitational scattering amplitude to second order in G are calculated in detail. The Fourier-transformed amplitudes yield a nonrelativistic potential, and our result is discussed in relation to previous calculations. The definition of a potential is discussed as well, and we show how the ambiguity of the potential under coordinate changes is resolved.
Comment: 27 pages, 17 figures
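The abstract does not quote the final expression. For orientation, the corrected one-loop potential reported in the published literature for this calculation takes the form below; the coefficients are quoted from memory of the published result and should be treated as a cross-check against the paper itself, not as extracted from this abstract:

\[
V(r) = -\frac{G m_1 m_2}{r}
\left[ 1 + 3\,\frac{G\,(m_1 + m_2)}{r c^2}
         + \frac{41}{10\pi}\,\frac{G \hbar}{r^2 c^3} \right]
\]

The second term is the classical post-Newtonian correction (second order in G but independent of hbar), while the third, of order G*hbar, is the genuine quantum correction.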
Coercive redistribution and public agreement: re-evaluating the libertarian challenge of charity
In this article, we evaluate the capacity of liberal egalitarianism to rebut what we call the libertarian challenge of charity. This challenge states that coercive redistributive taxation is neither needed nor justified, since those who endorse redistribution can give charitably, and those who do not endorse redistribution cannot justifiably be coerced. We argue that contemporary developments in liberal political thought render liberalism more vulnerable to this libertarian challenge. Many liberals have, in recent years, sought to recast liberalism such that it is more hospitable to cultural, religious, and ethnic diversity. This move has resulted in increased support for the claim that liberalism should be understood as a political rather than comprehensive doctrine, and that liberal institutions should draw their legitimacy from agreements made among members of an appropriately conceived deliberative community, rather than from controversial liberal principles like individual autonomy. We argue that, while this move may indeed make liberalism more compatible with cultural diversity, it also makes it more vulnerable to the libertarian challenge of charity. Not all versions of liberalism are troubled by the challenge, but those that are troubled by it are increasingly dominant. We also discuss G. A. Cohen's claim that liberal equality requires an 'egalitarian ethos' and argue that, if Cohen is right, it is difficult to see how there can be an adequate response to the libertarian challenge of charity. In general, our argument can be summarised as follows: the more that liberalism is concerned to accurately model the actual democratic wishes and motivations of the people it governs, the less it is able to justify coercively imposing redistributive principles of justice.
Unified concepts for understanding and modelling turnover of dissolved organic matter from freshwaters to the ocean: the UniDOM model
The transport of dissolved organic matter (DOM) across the land-ocean-aquatic continuum (LOAC), from freshwater to the ocean, is an important yet poorly understood component of the global carbon budget. Exploring and quantifying this flux is a significant challenge given the complexities of DOM cycling across these contrasting environments. We developed a new model, UniDOM, that unifies concepts, state variables and parameterisations of DOM turnover across the LOAC. Terrigenous DOM is divided into two pools, T1 (strongly UV-absorbing) and T2 (non- or weakly UV-absorbing), that exhibit contrasting responses to microbial consumption, photooxidation and flocculation. Data are presented to show that these pools are amenable to routine measurement based on specific UV absorbance (SUVA). In addition, an autochthonous DOM pool is defined to account for aquatic DOM production. A novel aspect of UniDOM is that rates of photooxidation and microbial turnover are parameterised as an inverse function of DOM age. Model results, which indicate that ~5% of the DOM originating in streams may penetrate into the open ocean, are sensitive to this parameterisation, as well as to the rates assigned to turnover of freshly produced DOM. The predicted contribution of flocculation to DOM turnover is remarkably low, although a mechanistic representation of this process in UniDOM was considered unachievable because of the complexities involved. Our work highlights the need for ongoing research into the mechanistic understanding and rates of photooxidation, microbial consumption and flocculation of DOM across the different environments of the LOAC, along with the development of models based on unified concepts and parameterisations.
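The SUVA-based split of terrigenous DOM into the T1 and T2 pools described above can be sketched as follows. This is an illustrative sketch only: the function names are invented here, and the 3.0 L mg-C^-1 m^-1 cutoff is an assumed placeholder, not a value taken from the paper.

```python
def suva254(abs254_per_m, doc_mg_per_l):
    """Specific UV absorbance at 254 nm: decadal absorbance (per metre of
    optical path length) normalised by DOC concentration (mg C per litre).
    Units: L mg-C^-1 m^-1."""
    return abs254_per_m / doc_mg_per_l

def classify_terrigenous_dom(suva, cutoff=3.0):
    """Assign a sample's terrigenous DOM to the strongly UV-absorbing pool
    (T1) or the weakly/non-UV-absorbing pool (T2). The cutoff value is a
    hypothetical illustration, not the paper's operational threshold."""
    return "T1" if suva >= cutoff else "T2"
```

A routine monitoring workflow would measure absorbance at 254 nm and DOC on the same filtered sample, then use the resulting SUVA to apportion the terrigenous DOM between the two model pools.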
Integrating group Delphi, fuzzy logic and expert systems for marketing strategy development: the hybridisation and its effectiveness
A hybrid approach integrating group Delphi, fuzzy logic and expert systems for developing marketing strategies is proposed in this paper. Within this approach, the group Delphi method is employed to help groups of managers undertake SWOT analysis. Fuzzy logic is applied to fuzzify the results of the SWOT analysis. Expert systems are utilised to formulate marketing strategies based upon the fuzzified strategic inputs. In addition, guidelines are provided to help users link the hybrid approach with managerial judgement and intuition. The effectiveness of the hybrid approach has been validated with MBA and MA marketing students. It is concluded that the hybrid approach is more effective in terms of decision confidence, group consensus, understanding of strategic factors, strategic thinking, and the coupling of analysis with judgement.
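The fuzzification step described above can be sketched minimally with triangular membership functions over a 0-10 SWOT factor score. The membership shapes, score range, and linguistic terms are assumptions for illustration; the paper's actual functions and rule base are not given in the abstract.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], 1 at the peak b,
    rising linearly on [a, b] and falling linearly on [b, c]."""
    if x < a or x > c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzify_swot_score(score):
    """Map a crisp SWOT factor score in [0, 10] to degrees of membership
    in three assumed linguistic terms."""
    return {
        "weak": tri(score, 0.0, 0.0, 5.0),
        "medium": tri(score, 2.5, 5.0, 7.5),
        "strong": tri(score, 5.0, 10.0, 10.0),
    }
```

A downstream rule such as "IF strength is strong AND opportunity is strong THEN pursue an aggressive strategy" would then fire with degree min(strength["strong"], opportunity["strong"]), which the expert-system layer maps to a strategy recommendation.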
Drug-gene interactions of antihypertensive medications and risk of incident cardiovascular disease: A pharmacogenomics study from the CHARGE consortium
Background: Hypertension is a major risk factor for a spectrum of cardiovascular diseases (CVD), including myocardial infarction, sudden death, and stroke. In the US, over 65 million people have high blood pressure, and a large proportion of these individuals are prescribed antihypertensive medications. Although large long-term clinical trials conducted in the last several decades have identified a number of effective antihypertensive treatments that reduce the risk of future clinical complications, responses to therapy and protection from cardiovascular events vary among individuals. Methods: Using a genome-wide association study among 21,267 participants with pharmaceutically treated hypertension, we explored the hypothesis that genetic variants might influence or modify the effectiveness of common antihypertensive therapies on the risk of major cardiovascular outcomes. The classes of drug treatments included angiotensin-converting enzyme inhibitors, beta-blockers, calcium channel blockers, and diuretics. In the setting of the Cohorts for Heart and Aging Research in Genomic Epidemiology (CHARGE) consortium, each study performed array-based genome-wide genotyping, imputed to HapMap Phase II reference panels, and used additive genetic models in proportional hazards or logistic regression models to evaluate drug-gene interactions for each of four therapeutic drug classes. We used meta-analysis to combine study-specific interaction estimates for approximately 2 million single nucleotide polymorphisms (SNPs) in a discovery analysis among 15,375 European-ancestry participants (3,527 CVD cases), with targeted follow-up in a case-only study of 1,751 European-ancestry GenHAT participants as well as among 4,141 African-Americans (1,267 CVD cases).
Results: Although drug-SNP interactions were biologically plausible, exposures and outcomes were well measured, and power was sufficient to detect modest interactions, we did not identify any statistically significant interactions from the four antihypertensive therapy meta-analyses (all P for interaction > 5.0 × 10⁻⁸). Similarly, findings were null for meta-analyses restricted to 66 SNPs with significant main effects on coronary artery disease or blood pressure in large published genome-wide association studies.
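The meta-analytic step described above, combining study-specific drug-SNP interaction estimates, reduces in its simplest fixed-effect form to an inverse-variance-weighted average. The sketch below illustrates that standard calculation; the function and variable names are illustrative and are not the consortium's actual pipeline.

```python
import math

def inverse_variance_meta(betas, ses):
    """Fixed-effect inverse-variance meta-analysis: pool per-study
    interaction coefficients (betas) with their standard errors (ses)
    into a single estimate and standard error."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled_beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled_beta, pooled_se
```

The pooled z-statistic (pooled_beta / pooled_se) is then converted to a P-value and compared against the genome-wide significance threshold of 5.0 × 10⁻⁸ used in the study.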
Use of Repeated Blood Pressure and Cholesterol Measurements to Improve Cardiovascular Disease Risk Prediction: An Individual-Participant-Data Meta-Analysis
The added value of incorporating information from repeated blood pressure and cholesterol measurements to predict cardiovascular disease (CVD) risk has not been rigorously assessed. We used data on 191,445 adults from the Emerging Risk Factors Collaboration (38 cohorts from 17 countries, with data encompassing 1962-2014) with more than 1 million measurements of systolic blood pressure, total cholesterol, and high-density lipoprotein cholesterol. Over a median of 12 years of follow-up, 21,170 CVD events occurred. Risk prediction models using cumulative mean values of repeated measurements and summary measures from longitudinal modeling of the repeated measurements were compared with models using measurements from a single time point. Risk discrimination (C-index) and net reclassification were calculated, and changes in C-indices were meta-analyzed across studies. Compared with the single-time-point model, the cumulative-means and longitudinal models increased the C-index by 0.0040 (95% confidence interval (CI): 0.0023, 0.0057) and 0.0023 (95% CI: 0.0005, 0.0042), respectively. Reclassification was also improved in both models; compared with the single-time-point model, overall net reclassification improvements were 0.0369 (95% CI: 0.0303, 0.0436) for the cumulative-means model and 0.0177 (95% CI: 0.0110, 0.0243) for the longitudinal model. In conclusion, incorporating repeated measurements of blood pressure and cholesterol into CVD risk prediction models slightly improves risk prediction.
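The discrimination metric reported above, Harrell's C-index, can be computed from scratch from predicted risks, follow-up times, and event indicators. This is a minimal from-scratch sketch of the standard definition, not the collaboration's analysis code.

```python
def c_index(risk, time, event):
    """Harrell's concordance index for right-censored data. A pair (i, j)
    is comparable when subject i had an event and was observed to fail
    before subject j; the pair is concordant when the earlier-failing
    subject also has the higher predicted risk. Ties in predicted risk
    count as half-concordant."""
    concordant, comparable = 0.0, 0
    n = len(risk)
    for i in range(n):
        for j in range(n):
            if event[i] == 1 and time[i] < time[j]:
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / comparable if comparable else float("nan")
```

The C-index changes reported in the abstract correspond to the difference between this quantity computed under the repeated-measurement model and under the single-time-point model, with the per-study differences then meta-analyzed.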
E-retailing ethics in Egypt and its effect on customer repurchase intention
The theoretical understanding of online shopping behaviour has received much attention. Less focus has been given to the ethical issues that arise from online shoppers' interactions with e-retailers. The vast majority of earlier research in this area is conceptual in nature and limited in scope by focusing on consumers' privacy issues. Therefore, the purpose of this paper is to propose a theoretical model explaining what factors contribute to online retailing ethics and their effect on customer repurchase intention. The data were analysed using variance-based structural equation modelling, employing partial least squares regression. Findings indicate that the five factors of online retailing ethics (security, privacy, non-deception, fulfilment/reliability, and corporate social responsibility) are strongly predictive of online consumers' repurchase intention. The results offer important implications for e-retailers and are likely to stimulate further research in the area of e-ethics from the consumers' perspective.
Nanoparticles for Applications in Cellular Imaging
In the following review we discuss several types of nanoparticles (such as TiO2, quantum dots, and gold nanoparticles) and their impact on the ability to image biological components in fixed cells. The review also discusses factors influencing nanoparticle imaging and uptake in live cells in vitro. Due to their unique size-dependent properties, nanoparticles offer numerous advantages over traditional dyes and proteins. For example, the photostability, narrow emission peak, and ability to rationally modify both the size and surface chemistry of quantum dots allow for simultaneous analyses of multiple targets within the same cell. On the other hand, the surface characteristics of nanometer-sized TiO2 allow efficient conjugation to nucleic acids, which enables their retention in specific subcellular compartments. We discuss cellular uptake mechanisms for the internalization of nanoparticles and studies showing the influence of nanoparticle size, charge, and target cell type on nanoparticle uptake. The predominant nanoparticle uptake mechanisms include clathrin-dependent mechanisms, macropinocytosis, and phagocytosis.