459 research outputs found
A methodology for full-system power modeling in heterogeneous data centers
The need for energy awareness in current data centers has encouraged the use of power modeling to estimate their power consumption. However, existing models present noticeable limitations, which make them application-dependent, platform-dependent, inaccurate, or computationally complex. In this paper, we propose a platform- and application-agnostic methodology for full-system power modeling in heterogeneous data centers that overcomes those limitations. It derives a single model per platform, which works with high accuracy for heterogeneous applications with different patterns of resource usage and energy consumption, by systematically selecting a minimum set of resource usage indicators and extracting complex relations among them that capture the impact on energy consumption of all the resources in the system. We demonstrate our methodology by generating power models for heterogeneous platforms with very different power consumption profiles. Our validation experiments with real Cloud applications show that such models provide high accuracy (around 5% average estimation error). This work is supported by the Spanish Ministry of Economy and Competitiveness under contract TIN2015-65316-P, by the Generalitat de Catalunya under contract 2014-SGR-1051, and by the European Commission under FP7-SMARTCITIES-2013 contract 608679 (RenewIT) and FP7-ICT-2013-10 contracts 610874 (ASCETiC) and 610456 (EuroServer). Peer reviewed. Postprint (author's final draft).
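The abstract gives the general recipe (select a minimal set of resource-usage indicators, then learn possibly non-linear relations between them and measured power) without fixing a concrete algorithm. The sketch below is one plausible realisation, not the authors' implementation: the counter names, the mutual-information feature selection, and the gradient-boosting regressor are all assumptions made for illustration.

```python
# Minimal sketch of a full-system power model: select a small set of
# resource-usage indicators, then learn their (non-linear) relation to
# measured wall power. Counter names and model choices are illustrative only.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

# Hypothetical training data: one row of system-wide counters per sample,
# one measured power value (watts) per row.
rng = np.random.default_rng(0)
counters = ["cpu_util", "mem_bw", "disk_iops", "net_mbps", "freq_ghz", "fan_rpm"]
X = rng.random((2000, len(counters)))
power = 80 + 120 * X[:, 0] + 30 * X[:, 1] * X[:, 4] + rng.normal(0, 2, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, power, test_size=0.3, random_state=0)

# 1) Systematically keep only the most informative indicators.
selector = SelectKBest(mutual_info_regression, k=3).fit(X_tr, y_tr)
kept = [c for c, keep in zip(counters, selector.get_support()) if keep]

# 2) Fit a single per-platform model able to capture interactions
#    among the selected indicators.
model = GradientBoostingRegressor(random_state=0)
model.fit(selector.transform(X_tr), y_tr)

# 3) Report the average estimation error (the paper quotes ~5% on real apps).
mape = mean_absolute_percentage_error(y_te, model.predict(selector.transform(X_te)))
print(f"kept indicators: {kept}, mean abs. percentage error: {100 * mape:.1f}%")
```

Any regression family that captures interactions (polynomial terms, tree ensembles, and so on) would fill the same role; the point is the two-step structure of indicator selection followed by a single per-platform model.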
The Lore of Low Methane Livestock: Co-Producing Technology and Animals for Reduced Climate Change Impact
Methane emissions from sheep and cattle production have gained an increasing profile in the context of climate change. Policy and scientific research communities have suggested a number of technological approaches to mitigate these emissions. This paper uses the concept of co-production as an analytical framework to understand farmers' evaluation of a 'good animal'. It examines how technology and sheep and beef cattle are co-produced in the context of concerns about the climate change impact of methane. Drawing on 42 semi-structured interviews, this paper demonstrates that methane emissions are viewed by farmers as a natural and integral part of sheep and beef cattle, rather than as a pollutant. Sheep and beef cattle farmers in the UK are found to be an extremely heterogeneous group whose members need to be understood in their specific social, environmental and consumer contexts. Some are more amenable to appropriating methane-reducing measures than others, but largely because animals are already co-constructed from the natural and the technical for reasons of increased production efficiency.
Gauge-invariant correlation functions in light-cone superspace
We initiate a study of correlation functions of gauge-invariant operators in N=4 super Yang-Mills theory using the light-cone superspace formalism. Our primary aim is to develop efficient methods to compute perturbative corrections to correlation functions. This analysis also allows us to examine potential subtleties which may arise when calculating off-shell quantities in light-cone gauge. We comment on the intriguing possibility that the manifest N=4 supersymmetry in this approach may allow for a compact description of entire multiplets and their correlation functions. Comment: 35 pages, several figures.
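For orientation only (this equation is not taken from the paper): in a conformal theory such as N=4 super Yang-Mills, the two-point function of a gauge-invariant scalar operator is fixed by symmetry up to a normalisation, and the perturbative corrections the abstract refers to enter through the anomalous dimension and the normalisation constant:

```latex
% Two-point function of a gauge-invariant scalar operator of dimension \Delta;
% perturbative corrections enter through C(g) and the anomalous dimension \gamma(g).
\langle \mathcal{O}(x)\,\bar{\mathcal{O}}(y) \rangle
  = \frac{C(g)}{\left[(x-y)^{2}\right]^{\Delta}},
\qquad
\Delta = \Delta_{0} + \gamma(g), \quad
\gamma(g) = g^{2}\gamma^{(1)} + g^{4}\gamma^{(2)} + \dots
```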
Is There A String Theory Landscape?
We examine recent claims of a large set of flux compactification solutions of string theory. We conclude that the arguments for AdS solutions are plausible. The analysis of meta-stable dS solutions inevitably leads to situations where long-distance effective field theory breaks down. We then examine whether these solutions are likely to lead to a description of the real world. We conclude that one must invoke a strong version of the anthropic principle. We explain why it is likely that this leads to a prediction of low-energy supersymmetry breaking, but that many features of anthropically selected flux compactifications are likely to disagree with experiment. Comment: 39 pages, LaTeX. Terminology surrounding the anthropic principle revised to conform with accepted usage. More history of the anthropic principle included. Various references added.
Genetic characterisation of Norovirus strains in outpatient children from rural communities of Vhembe district / South Africa, 2014-2015
Background: Norovirus (NoV) is now one of the most common causes of both outbreaks and sporadic non-bacterial gastroenteritis worldwide. However, data supporting the role of NoV in diarrheal disease are limited for the African continent. Objectives: This study investigates the distribution of NoV genotypes circulating in outpatient children from rural communities of the Vhembe district, South Africa. Study design: Stool specimens were collected from children under five years of age with diarrhea, and from controls without diarrhea, between July 2014 and April 2015. NoV-positive samples, detected previously by real-time PCR, were analysed using conventional RT-PCR targeting the partial capsid and polymerase genes. Nucleotide sequencing was performed to genotype the strains. Results: The sequence analyses demonstrated multiple NoV genotypes, including GI.4 (13.8%), GI.5 (6.9%), GII.14 (6.9%), GII.4 (31%), GII.6 (3.4%), GII.P15 (3.4%), GII.P21 (3.4%) and GII.Pe (31%). The most prevalent NoV genotypes were the GII.4 Sydney 2012 variants (n=7) among the capsid genotypes, GII.Pe (n=9) among the polymerase genotypes, and GII.Pe/GII.4 Sydney 2012 (n=8) putative recombinants among the RdRp/capsid genotypes. Two unassigned GII.4 variants were found. Conclusions: The findings highlight NoV genetic diversity and reveal the continuing pandemic spread and predominance of GII.Pe/GII.4 Sydney 2012, indicative of increased NoV activity. An unusual RdRp genotype, GII.P15, and two unassigned GII.4 variants were also identified in rural settings of the Vhembe district, South Africa. NoV surveillance is warranted to help inform investigations into NoV evolution and disease burden, and to support ongoing vaccine development programmes.
Risky business: factor analysis of survey data – assessing the probability of incorrect dimensionalisation
This paper undertakes a systematic assessment of the extent to which factor analysis recovers the correct number of latent dimensions (factors) when applied to ordered categorical survey items (so-called Likert items). We simulate 2400 data sets of uni-dimensional Likert items that vary systematically over a range of conditions, such as the underlying population distribution, the number of items, the level of random error, and characteristics of items and item-sets. Each of these datasets is factor analysed in a variety of ways that are frequently used in the extant literature or recommended in current methodological texts. These include exploratory factor retention heuristics such as Kaiser's criterion, parallel analysis and a non-graphical scree test, and (for exploratory and confirmatory analyses) evaluations of model fit. These analyses are conducted on the basis of both Pearson and polychoric correlations. We find that, irrespective of the particular mode of analysis, factor analysis applied to ordered-categorical survey data very often leads to over-dimensionalisation. The magnitude of this risk depends on the specific way in which factor analysis is conducted, the number of items, the properties of the set of items, and the underlying population distribution. The paper concludes with a discussion of the consequences of over-dimensionalisation, and a brief mention of alternative modes of analysis that are much less prone to such problems.
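The core of this design (simulate items generated by a single latent factor, then ask standard retention rules how many factors to keep) can be illustrated with a short simulation. This is a toy reconstruction, not the authors' code: it uses only Pearson correlations, five-point items, Kaiser's criterion and a basic parallel analysis, and all constants (item count, sample size, loadings, thresholds) are arbitrary.

```python
# Toy simulation: one latent factor generates ordered 5-point Likert items;
# Kaiser's criterion and parallel analysis are then asked how many factors
# to retain. All constants are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(42)
n_obs, n_items, loading = 500, 8, 0.6
cuts = [-1.5, -0.5, 0.5, 1.5]          # thresholds mapping z-scores to categories 1..5

def simulate_likert():
    theta = rng.normal(size=(n_obs, 1))                      # single latent trait
    latent = loading * theta + np.sqrt(1 - loading**2) * rng.normal(size=(n_obs, n_items))
    return np.digitize(latent, cuts) + 1                     # ordered categories 1..5

def eigenvalues(data):
    return np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

obs_eig = eigenvalues(simulate_likert())

# Kaiser's criterion: retain factors with eigenvalue > 1.
kaiser_k = int(np.sum(obs_eig > 1.0))

# Parallel analysis: compare against mean eigenvalues of uncorrelated random data.
rand_eig = np.mean(
    [eigenvalues(rng.integers(1, 6, size=(n_obs, n_items))) for _ in range(200)],
    axis=0,
)
pa_k = int(np.sum(obs_eig > rand_eig))

print(f"eigenvalues: {np.round(obs_eig, 2)}")
print(f"factors retained -> Kaiser: {kaiser_k}, parallel analysis: {pa_k}")
```

Since the true number of factors is one, any run in which either rule reports two or more illustrates the over-dimensionalisation that the paper quantifies across 2400 such datasets under varying distributions, correlation types and fit criteria.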
QCD and strongly coupled gauge theories: challenges and perspectives
We highlight the progress, current status, and open challenges of QCD-driven physics, in theory and in experiment. We discuss how the strong interaction is intimately connected to a broad sweep of physical problems, in settings ranging from astrophysics and cosmology to strongly coupled, complex systems in particle and condensed-matter physics, as well as to searches for physics beyond the Standard Model. We also discuss how success in describing the strong interaction impacts other fields, and, in turn, how such subjects can impact studies of the strong interaction. In the course of the work we offer a perspective on the many research streams which flow into and out of QCD, as well as a vision for future developments. Peer reviewed.
Antiinflammatory Therapy with Canakinumab for Atherosclerotic Disease
Background: Experimental and clinical data suggest that reducing inflammation without affecting lipid levels may reduce the risk of cardiovascular disease. Yet, the inflammatory hypothesis of atherothrombosis has remained unproved. Methods: We conducted a randomized, double-blind trial of canakinumab, a therapeutic monoclonal antibody targeting interleukin-1β, involving 10,061 patients with previous myocardial infarction and a high-sensitivity C-reactive protein level of 2 mg or more per liter. The trial compared three doses of canakinumab (50 mg, 150 mg, and 300 mg, administered subcutaneously every 3 months) with placebo. The primary efficacy end point was nonfatal myocardial infarction, nonfatal stroke, or cardiovascular death. Results: At 48 months, the median reduction from baseline in the high-sensitivity C-reactive protein level was 26 percentage points greater in the group that received the 50-mg dose of canakinumab, 37 percentage points greater in the 150-mg group, and 41 percentage points greater in the 300-mg group than in the placebo group. Canakinumab did not reduce lipid levels from baseline. At a median follow-up of 3.7 years, the incidence rate for the primary end point was 4.50 events per 100 person-years in the placebo group, 4.11 events per 100 person-years in the 50-mg group, 3.86 events per 100 person-years in the 150-mg group, and 3.90 events per 100 person-years in the 300-mg group. The hazard ratios as compared with placebo were as follows: in the 50-mg group, 0.93 (95% confidence interval [CI], 0.80 to 1.07; P = 0.30); in the 150-mg group, 0.85 (95% CI, 0.74 to 0.98; P = 0.021); and in the 300-mg group, 0.86 (95% CI, 0.75 to 0.99; P = 0.031). The 150-mg dose, but not the other doses, met the prespecified multiplicity-adjusted threshold for statistical significance for the primary end point and the secondary end point that additionally included hospitalization for unstable angina that led to urgent revascularization (hazard ratio vs. placebo, 0.83; 95% CI, 0.73 to 0.95; P = 0.005). Canakinumab was associated with a higher incidence of fatal infection than was placebo. There was no significant difference in all-cause mortality (hazard ratio for all canakinumab doses vs. placebo, 0.94; 95% CI, 0.83 to 1.06; P = 0.31). Conclusions: Antiinflammatory therapy targeting the interleukin-1β innate immunity pathway with canakinumab at a dose of 150 mg every 3 months led to a significantly lower rate of recurrent cardiovascular events than placebo, independent of lipid-level lowering. (Funded by Novartis; CANTOS ClinicalTrials.gov number, NCT01327846.)
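The reported effect can be sanity-checked from the incidence rates in the abstract: an incidence rate is events divided by accumulated person-years (expressed here per 100 person-years), and the crude rate ratio of each canakinumab arm versus placebo lands close to the adjusted hazard ratios. The short sketch below redoes that arithmetic; it uses only figures quoted above and is not part of the trial analysis.

```python
# Sanity-check arithmetic on the published incidence rates (events per
# 100 person-years): crude rate ratios vs. placebo, compared with the
# reported hazard ratios. Uses only figures quoted in the abstract.
rates = {"placebo": 4.50, "50 mg": 4.11, "150 mg": 3.86, "300 mg": 3.90}
reported_hr = {"50 mg": 0.93, "150 mg": 0.85, "300 mg": 0.86}

for arm, rate in rates.items():
    if arm == "placebo":
        continue
    crude_ratio = rate / rates["placebo"]
    print(f"{arm}: crude rate ratio {crude_ratio:.2f} (reported HR {reported_hr[arm]})")
# e.g. 150 mg: crude rate ratio 0.86 vs. reported HR 0.85 -- the small
# differences reflect the time-to-event (Cox) modelling behind the hazard ratios.
```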
- …
