
    Quantifying the Impact of Replication on the Quality-of-Service in Cloud Databases

    No full text
    Cloud databases achieve high availability by automatically replicating data on multiple nodes. However, the overhead caused by the replication process can lead to an increase in the mean and variance of transaction response times, causing unforeseen impacts on the offered quality-of-service (QoS). In this paper, we propose a measurement-driven methodology to predict the impact of replication on Database-as-a-Service (DBaaS) environments. Our methodology uses operational data to parameterize a closed queueing network model of the database cluster together with a Markov model that abstracts the dynamic replication process. Experiments on Amazon RDS show that our methodology predicts response time mean and percentiles with errors of just 1% and 15% respectively, under operational conditions that are significantly different from the ones used for model parameterization. We show that our modeling approach surpasses standard modeling methods and illustrate the applicability of our methodology for automated DBaaS provisioning.
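
    To make the queueing ingredient concrete, below is a minimal sketch of exact Mean Value Analysis for a single-class closed queueing network, the kind of model the abstract parameterizes from operational data; the station demands, think time, and client count are hypothetical, and the paper's Markov replication model is not reproduced here.

```python
# Minimal sketch: exact Mean Value Analysis (MVA) for a single-class closed
# queueing network. Service demands, think time, and client count are
# hypothetical; this is not the paper's parameterization.

def mva(service_demands, think_time, n_clients):
    """Return per-station queue lengths, throughput, and mean response time."""
    queues = [0.0] * len(service_demands)               # Q_k(0) = 0
    throughput, response = 0.0, 0.0
    for n in range(1, n_clients + 1):
        # Residence time at each station: R_k(n) = D_k * (1 + Q_k(n-1))
        residences = [d * (1.0 + q) for d, q in zip(service_demands, queues)]
        response = sum(residences)
        throughput = n / (think_time + response)        # X(n) = n / (Z + R(n))
        queues = [throughput * r for r in residences]   # Little's law per station
    return queues, throughput, response

if __name__ == "__main__":
    # Hypothetical per-transaction demands (seconds): primary node, replica, network.
    demands = [0.012, 0.009, 0.003]
    _, x, r = mva(demands, think_time=0.5, n_clients=50)
    print(f"throughput = {x:.1f} tps, mean response time = {r * 1000:.1f} ms")
```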

    Estimating the optimal sampling rate using wavelet transform: an application to optical turbulence

    Get PDF
    Sampling rate and frequency content determination for optical quantities related to light propagation through turbulence are paramount experimental topics. Some papers about estimating properties of the optical turbulence seem to use ad hoc assumptions to set the sampling frequency; this chosen sampling rate is assumed good enough to perform a proper measurement. Other authors estimate the optimal sampling rate via the fast Fourier transform of data series associated with the experiment. When possible, cut-off frequencies or frequency content can be determined with the help of analytical models; yet, these approaches require prior knowledge of the optical turbulence. The aim of this paper is to propose an alternative, practical, experimental method to estimate a proper sampling rate. By means of the discrete wavelet transform, this approach can prevent any loss of information and, at the same time, avoid oversampling. Moreover, it is independent of the statistical model imposed on the turbulence.
    Fil: Funes, Gustavo Luis. Consejo Nacional de Investigaciones Científicas y Técnicas. Centro Científico Tecnológico La Plata. Centro de Investigaciones Opticas (i); Argentina
    Fil: Fernández, Angel. Pontificia Universidad Catolica de Valparaiso; Chile. Universidad Tecnica Federico Santa María; Chile
    Fil: Peréz, Darío G.. Pontificia Universidad Catolica de Valparaiso; Chile
    Fil: Zunino, Luciano José. Consejo Nacional de Investigaciones Científicas y Técnicas. Centro Científico Tecnológico La Plata. Centro de Investigaciones Opticas (i); Argentina
    Fil: Serrano, Eduardo. Universidad Nacional de San Martín; Argentina
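
    As a rough illustration of the approach described above, the sketch below uses the discrete wavelet transform (via PyWavelets) to locate the finest scale that still carries appreciable signal energy and hence bound the bandwidth; the wavelet, decomposition depth, interpretation threshold, and test signal are assumptions made for the example, not the authors' exact criterion.

```python
# Sketch: estimate how much a sampled signal could be decimated by checking which
# wavelet detail levels carry significant energy. The test signal, wavelet, and
# threshold idea are illustrative assumptions.
import numpy as np
import pywt

fs = 10_000.0                                   # original (possibly oversampled) rate, Hz
t = np.arange(0.0, 1.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 40 * t) + 0.5 * np.sin(2 * np.pi * 90 * t)

coeffs = pywt.wavedec(signal, "db4", level=6)   # [cA6, cD6, cD5, ..., cD1]
details = coeffs[1:]                            # detail coefficients, coarse to fine
total_energy = sum(np.sum(c ** 2) for c in coeffs)

# Detail level j (1 = finest) covers roughly the band [fs / 2**(j+1), fs / 2**j].
for j, c in zip(range(len(details), 0, -1), details):
    fraction = np.sum(c ** 2) / total_energy
    lo, hi = fs / 2 ** (j + 1), fs / 2 ** j
    print(f"level {j}: band {lo:7.1f}-{hi:7.1f} Hz, energy fraction {fraction:.4f}")

# The finest level whose energy fraction exceeds a small threshold bounds the
# signal bandwidth; sampling at roughly twice the upper edge of that band suffices.
```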

    Stochastic Resonance in a Dipole

    Get PDF
    We show that the dipole, a system usually proposed to model relaxation phenomena, exhibits a maximum in the signal-to-noise ratio at a non-zero noise level, thus indicating the appearance of stochastic resonance. The phenomenon occurs in two different situations, i.e. when the minimum of the potential of the dipole remains fixed in time and when it switches periodically between two equilibrium points. We have also found that the signal-to-noise ratio has a maximum for a certain value of the amplitude of the oscillating field.
    Comment: 4 pages, RevTex, 6 PostScript figures available upon request; to appear in Phys. Rev.
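
    For readers unfamiliar with how such signal-to-noise curves are obtained, here is a minimal simulation sketch in the spirit of the abstract: an overdamped, periodically driven bistable system stands in for the dipole potential (an assumption for illustration), and the SNR at the driving frequency is estimated from the power spectrum for several noise strengths.

```python
# Sketch: SNR versus noise strength for an overdamped, weakly driven bistable
# system (a stand-in for the dipole potential; all parameters are illustrative).
import numpy as np

def snr_at_drive(noise_d, amp=0.3, f_drive=0.01, dt=0.05, n_steps=200_000, seed=0):
    rng = np.random.default_rng(seed)
    kicks = np.sqrt(2.0 * noise_d * dt) * rng.standard_normal(n_steps)
    x = np.empty(n_steps)
    x[0] = -1.0
    for i in range(1, n_steps):
        drift = x[i - 1] - x[i - 1] ** 3 + amp * np.cos(2 * np.pi * f_drive * i * dt)
        x[i] = x[i - 1] + drift * dt + kicks[i]          # Euler-Maruyama step
    spec = np.abs(np.fft.rfft(x - x.mean())) ** 2
    freqs = np.fft.rfftfreq(n_steps, d=dt)
    k = int(np.argmin(np.abs(freqs - f_drive)))          # bin of the driving frequency
    background = np.median(spec[max(k - 20, 1):k + 20])  # local noise-floor estimate
    return spec[k] / background

for d in (0.05, 0.15, 0.4, 1.0):
    print(f"noise D = {d:4.2f}: SNR ~ {snr_at_drive(d):7.1f}")
# A maximum of the SNR at an intermediate noise level is the stochastic-resonance signature.
```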

    The open future, bivalence and assertion

    Get PDF
    It is highly intuitive that the future is open and the past is closed—whereas it is unsettled whether there will be a fourth world war, it is settled that there was a first. Recently, it has become increasingly popular to claim that the intuitive openness of the future implies that contingent statements about the future, such as 'there will be a sea battle tomorrow,' are non-bivalent (neither true nor false). In this paper, we argue that the non-bivalence of future contingents is at odds with our pre-theoretic intuitions about the openness of the future. These are revealed by our pragmatic judgments concerning the correctness and incorrectness of assertions of future contingents. We argue that the pragmatic data, together with a plausible account of assertion, show that in many cases we take future contingents to be true (or to be false), even though we take the future to be open in relevant respects. It follows that appeals to intuition to support the non-bivalence of future contingents are untenable. Intuition favours bivalence.

    Spitzer Observations of Cold Dust Galaxies

    Full text link
    We combine new Spitzer Space Telescope observations in the mid- and far-infrared with SCUBA 850 micron observations to improve the measurement of dust temperatures, masses and luminosities for 11 galaxies of the SCUBA Local Universe Galaxy Survey (SLUGS). By fitting dust models we measure typical dust masses of 10E7.9 M_sol and dust luminosities of ~10E10 L_sol, for galaxies with modest star formation rates. The data presented in this paper combined with previous observations show that cold dust is present in all types of spiral galaxies and is a major contributor to their total luminosity. Because of the lower dust temperature of the SCUBA sources measured in this paper, they have flatter far-IR nu F_nu(160um)/nu F_nu(850um) slopes than those of the larger Spitzer Nearby Galaxies Survey (SINGS), the sample that provides the best measurements of the dust properties of galaxies in the nearby universe. The new data presented here, added to SINGS, extend the parameter space that is well covered by local galaxies, providing a comprehensive set of templates that can be used to interpret the observations of nearby and distant galaxies.
    Comment: Accepted by A.J. 16 pages, 10 figures, 7 tables. High resolution version at http://mips.as.arizona.edu/~cnaw/slugs_hires.pd
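
    As an illustration of the dust-model fitting step, a minimal sketch of a single-temperature modified-blackbody ("greybody") fit to far-infrared/submillimetre photometry; the flux densities, emissivity index, and pivot wavelength below are hypothetical values, not SLUGS measurements.

```python
# Sketch: fit an optically thin modified blackbody, S(nu) ~ nu**beta * B_nu(T),
# to hypothetical far-IR/submm flux densities to estimate a dust temperature.
import numpy as np
from scipy.optimize import curve_fit

H, K, C = 6.626e-34, 1.381e-23, 2.998e8          # Planck, Boltzmann, speed of light (SI)
NU0 = C / 160e-6                                 # pivot frequency at 160 um

def greybody(nu, s0, temp, beta=2.0):
    """S(nu) = s0 * (nu / NU0)**beta * B_nu(T) / B_NU0(T), normalised at the pivot."""
    def planck(f):
        return 2 * H * f ** 3 / C ** 2 / np.expm1(H * f / (K * temp))
    return s0 * (nu / NU0) ** beta * planck(nu) / planck(NU0)

wavelengths_um = np.array([70.0, 160.0, 450.0, 850.0])
nu = C / (wavelengths_um * 1e-6)                 # Hz
flux_jy = np.array([5.0, 18.0, 2.8, 0.35])       # hypothetical flux densities (Jy)
flux_err = 0.1 * flux_jy

popt, _ = curve_fit(greybody, nu, flux_jy, p0=(15.0, 25.0),
                    sigma=flux_err, absolute_sigma=True)
print(f"best-fit dust temperature: {popt[1]:.1f} K")
# The dust mass then follows from the fitted amplitude, a dust opacity, and the distance.
```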

    One-mode Bosonic Gaussian channels: a full weak-degradability classification

    Get PDF
    A complete degradability analysis of one-mode Gaussian Bosonic channels is presented. We show that, apart from the class of channels which are unitarily equivalent to the channels with additive classical noise, these maps can be characterized in terms of weak- and/or anti-degradability. Furthermore, a new set of channels which have null quantum capacity is identified. This is done by exploiting the composition rules of one-mode Gaussian maps and the fact that anti-degradable channels cannot be used to transfer quantum information.
    Comment: 23 pages, 3 figures
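
    For context, here is a reference sketch of the standard (anti-)degradability definitions that the weak-degradability classification builds on, stated in textbook form rather than in the paper's own notation; the weak notions additionally allow physical dilations with a non-pure environment input.

```latex
% Reference sketch (standard definitions, not the paper's notation).
% For a channel $\Phi$ with Stinespring isometry
% $V : \mathcal{H}_A \to \mathcal{H}_B \otimes \mathcal{H}_E$:
\[
  \Phi(\rho) = \operatorname{Tr}_E\!\bigl[V \rho V^\dagger\bigr],
  \qquad
  \tilde{\Phi}(\rho) = \operatorname{Tr}_B\!\bigl[V \rho V^\dagger\bigr].
\]
\[
  \Phi \ \text{degradable} \iff \exists\, \mathcal{D}\ \text{CPTP}:\ \tilde{\Phi} = \mathcal{D} \circ \Phi,
  \qquad
  \Phi \ \text{anti-degradable} \iff \exists\, \mathcal{A}\ \text{CPTP}:\ \Phi = \mathcal{A} \circ \tilde{\Phi}.
\]
% Anti-degradability implies zero quantum capacity: otherwise the environment's copy
% of the input could be used to clone quantum information, contradicting no-cloning.
```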

    PEP: first Herschel probe of dusty galaxy evolution up to z~3

    Full text link
    We exploit the deepest existing far-infrared (FIR) data obtained so far by Herschel at 100 and 160 um in GOODS-N, as part of the PACS Evolutionary Probe (PEP) survey, to derive for the first time the evolution of the rest-frame 60-um, 90-um, and total IR luminosity functions (LFs) of galaxies and AGNs from z=0 to unprecedentedly high redshifts (z~2-3). The PEP LFs were computed using the 1/Vmax method. The FIR sources were classified by means of a detailed broad-band SED-fitting analysis and spectral characterisation. Based on the best-fit model results, the k-correction and total IR (8-1000 um) luminosity were obtained for each source. LFs (monochromatic and total) were then derived for various IR populations separately in different redshift bins and compared to backward evolution model predictions. We detect strong evolution in the LF to at least z~2. Objects with SEDs similar to local spiral galaxies are the major contributors to the star formation density (SFD) at z<0.3; then, as redshift increases, moderate SF galaxies - most likely containing a low-luminosity AGN - start dominating up to z~1.5. At z>1.5 the SFD is dominated by the contributions of starburst galaxies. In agreement with previous findings, the comoving IR LD derived from our data evolves approximately as (1+z)^(3.8+/-0.3) up to z~1, with some evidence of flattening up to z~2.
    Comment: Accepted for publication in the A&A Herschel first results Special Issue
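
    The 1/Vmax estimator named above is simple enough to sketch: each detected source contributes the inverse of the largest comoving volume in which it would still have made the selection cut. The luminosities, Vmax values, and binning below are hypothetical placeholders, not PEP measurements.

```python
# Sketch of the 1/Vmax luminosity-function estimator:
# Phi(bin) = sum_i 1/Vmax_i / d(logL). All numbers are hypothetical.
import numpy as np

log_l = np.array([10.2, 10.8, 11.1, 11.4, 11.9, 12.1, 12.3])   # log10(L_IR / L_sun)
v_max = np.array([2e6, 5e6, 1.2e7, 3e7, 8e7, 1.5e8, 2.4e8])    # Mpc^3, per source

edges = np.arange(10.0, 12.6, 0.5)               # log-luminosity bin edges (dex)
widths = np.diff(edges)

phi = np.zeros(len(widths))
for i in range(len(phi)):
    in_bin = (log_l >= edges[i]) & (log_l < edges[i + 1])
    phi[i] = np.sum(1.0 / v_max[in_bin]) / widths[i]            # Mpc^-3 dex^-1

for lo, hi, p in zip(edges[:-1], edges[1:], phi):
    print(f"log L in [{lo:.1f}, {hi:.1f}): Phi = {p:.2e} Mpc^-3 dex^-1")
```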

    Effect of ABCB1 and ABCC3 Polymorphisms on Osteosarcoma Survival after Chemotherapy: A Pharmacogenetic Study

    Get PDF
    Background: Standard treatment for osteosarcoma patients consists of a combination of cisplatin, adriamycin, and methotrexate before surgical resection of the primary tumour, followed by postoperative chemotherapy including vincristine and cyclophosphamide. Unfortunately, many patients still relapse or suffer adverse events. We examined whether common germline polymorphisms in transporter and metabolic pathway genes of the drugs used in standard osteosarcoma treatment may predict treatment response. Methodology/Principal Findings: In this study we screened 102 osteosarcoma patients for 346 Single Nucleotide Polymorphisms (SNPs) and 2 Copy Number Variants (CNVs) in 24 genes involved in the metabolism or transport of cisplatin, adriamycin, methotrexate, vincristine, and cyclophosphamide. We studied the association of the genotypes with tumour response and overall survival. We found that four SNPs in two ATP-binding cassette genes were significantly associated with overall survival: rs4148416 in ABCC3 (per-allele HR = 8.14, 95%CI = 2.73-20.2, p-value = 5.1×10^-5), and three SNPs in ABCB1: rs4148737 (per-allele HR = 3.66, 95%CI = 1.85-6.11, p-value = 6.9×10^-5), rs1128503 and rs10276036 (r^2 = 1, per-allele HR = 0.24, 95%CI = 0.11-0.47, p-value = 7.9×10^-5). Associations with these SNPs remained statistically significant after correction for multiple testing (all corrected p-values [permutation test] ≤0.03). Conclusions: Our findings suggest that these polymorphisms may affect osteosarcoma treatment efficacy. If these associations are independently validated, these variants could be used as genetic predictors of clinical outcome in the treatment of osteosarcoma, helping in the design of individualized therapy.
    This work was supported by the AECC (Asociación Española contra el Cáncer), FIS (Fondo de Investigación Sanitaria-Instituto de Salud Carlos III) and the "Inocente Inocente" Foundation.
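
    The per-allele hazard ratios quoted above come from survival modelling; a minimal sketch of an additively coded Cox proportional-hazards fit (using the lifelines package) is given below. The simulated cohort, effect size, and censoring rule are invented for illustration and bear no relation to the study's data.

```python
# Sketch: per-allele Cox proportional-hazards model with an additive genotype
# coding (0/1/2 minor alleles). The cohort below is simulated, not study data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 120
allele_count = rng.integers(0, 3, size=n)             # 0, 1, or 2 minor alleles
baseline = rng.exponential(scale=60.0, size=n)         # baseline survival time (months)
time = baseline / (1.5 ** allele_count)                # each extra allele raises the hazard
event = (time < 48.0).astype(int)                      # administrative censoring at 48 months
time = np.minimum(time, 48.0)

df = pd.DataFrame({"months": time, "death": event, "allele_count": allele_count})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="death")
per_allele_hr = float(np.exp(cph.params_["allele_count"]))
print(f"per-allele hazard ratio: {per_allele_hr:.2f}")
# A permutation test for multiple-testing control would reshuffle allele_count many
# times and compare the observed statistic with the permuted distribution.
```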

    Litter mixture interactions at the level of plant functional types are additive.

    Get PDF
    It is very difficult to estimate litter decomposition rates in natural ecosystems because litters of many species are mixed and idiosyncratic interactions occur among those litters. A way to tackle this problem is to investigate litter mixing effects not at the species level but at the level of Plant Functional Types (PFTs). We tested the hypothesis that at the PFT level positive and negative interactions balance each other, causing an overall additive effect (no significant interactions among PFTs). To this end, we used litter of four PFTs from a temperate peatland, in which random draws were taken from the litter species pool of each PFT for every combination of 2, 3, and 4 PFTs. Decomposition rates clearly differed among the 4 PFTs (Sphagnum spp. < graminoids = N-fixing tree < forbs) and showed little variation within the PFTs (notably for the Sphagnum mosses and the graminoids). Significant positive interactions (4 out of 11) in the PFT mixtures were found only after 20 weeks, and Sphagnum was involved in all of these combinations. After 36 and 56 weeks of incubation, interactions were not significantly different from zero. However, standard deviations were larger than the means, indicating that positive and negative interactions balanced each other. Thus, when litter mixture interactions are considered at the PFT level, the interactions are additive. From this we conclude that, for estimating litter decomposition rates at the ecosystem level, it is sufficient to use the weighted (by litter production) average decomposition rates of the contributing PFTs. © 2009 The Author(s)
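
    The closing recommendation reduces to simple arithmetic: weight each PFT's decomposition rate by its litter production. A small sketch with hypothetical rates and production figures (the ordering mirrors the abstract; the numbers do not come from the study):

```python
# Sketch: litter-production-weighted mean decomposition rate across PFTs.
# Rates (k, yr^-1) and production (g m^-2 yr^-1) are hypothetical.
rates = {"Sphagnum": 0.10, "graminoids": 0.35, "N-fixing tree": 0.35, "forbs": 0.60}
production = {"Sphagnum": 250.0, "graminoids": 120.0, "N-fixing tree": 60.0, "forbs": 40.0}

weighted_k = sum(rates[p] * production[p] for p in rates) / sum(production.values())
print(f"ecosystem-level decomposition rate: {weighted_k:.3f} yr^-1")
```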