
    The gaseous extent of galaxies and the origin of Lyman alpha absorption systems. IV: Lyman alpha absorbers arising in a galaxy group

    We present new GHRS observations of Lyman alpha absorption lines associated with a group of galaxies towards the QSO 1545+2101. We have identified eight distinct Lyman alpha absorption features in the spectrum of QSO 1545+2101 at a mean redshift of z=0.2648 with a velocity dispersion of 163 km/s. A group of galaxies is detected in the vicinity of this QSO at a mean redshift of z=0.2645 and velocity dispersion 239 km/s. The identification of discrete absorption systems indicates that they arise in clouds of neutral hydrogen rather than in a diffuse intragroup medium. Our analysis suggests that the Lyman alpha absorption lines are associated with individual galaxies in the group, although a one-to-one relationship between absorbers and galaxies is difficult to establish in such a dense environment.
    Comment: 16 pages, 3 figures. Accepted for publication in Ap
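As a rough illustration of how a velocity dispersion like the one quoted above relates to a set of measured redshifts, each absorber's line-of-sight velocity offset can be computed as v = c(z − z̄)/(1 + z̄) and the dispersion taken as the sample standard deviation. The redshift values below are invented for illustration (only their mean is chosen to match the abstract); this is a sketch, not the paper's analysis.

```python
import numpy as np

C_KM_S = 299_792.458  # speed of light in km/s

def velocity_dispersion(redshifts):
    """Mean redshift and line-of-sight velocity dispersion (km/s),
    using v = c * (z - z_mean) / (1 + z_mean) about the sample mean."""
    z = np.asarray(redshifts, dtype=float)
    z_mean = z.mean()
    v = C_KM_S * (z - z_mean) / (1.0 + z_mean)
    return z_mean, v.std(ddof=1)  # sample standard deviation

# Illustrative (not the published) absorber redshifts near z ~ 0.2648:
z_abs = [0.2641, 0.2644, 0.2646, 0.2648, 0.2649, 0.2651, 0.2653, 0.2652]
z_mean, sigma_v = velocity_dispersion(z_abs)
```

The (1 + z̄) factor converts the redshift offsets to rest-frame velocities at the group's mean redshift.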

    A new approach to the reduction of "Carte du Ciel" plates

    A new procedure for the reduction of "Carte du Ciel" plates is presented. A typical "Carte du Ciel" plate corresponding to the Bordeaux zone has been taken as an example. It shows triple exposures for each object, and the modelling of the data has been performed by means of a non-linear least squares fitting of the sum of three bivariate Gaussian distributions. A number of solutions for the problems present in this kind of plate (optical aberrations, adjacency photographic effects, presence of grid lines, emulsion saturation) have been investigated. An internal accuracy of 0.1'' in x and y was obtained for the position of each of the individual exposures. The external reduction to a catalogue led to results with an accuracy of 0.16'' in x and 0.13'' in y for the mean position of the three exposures. A photometric calibration has also been performed and magnitudes were determined with an accuracy of 0.09 mag.
    Comment: 10 pages, 12 enclosed PostScript figures, uses l-aa.sty (included). Accepted for publication in Astronomy and Astrophysics Supp. Serie
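The core of the fitting step described above (non-linear least squares on a sum of three bivariate Gaussians) can be sketched as follows. For simplicity this sketch assumes circular Gaussians plus a constant background and uses `scipy.optimize.least_squares`; the stamp size, initial guesses, and parametrization are illustrative assumptions, not the paper's actual reduction pipeline.

```python
import numpy as np
from scipy.optimize import least_squares

def triple_gaussian(params, x, y):
    """Sum of three circular bivariate Gaussians plus a constant background.
    params = [bg, A1, x1, y1, s1, A2, x2, y2, s2, A3, x3, y3, s3]."""
    model = np.full_like(x, params[0], dtype=float)
    for i in range(3):
        A, xc, yc, s = params[1 + 4 * i: 5 + 4 * i]
        model += A * np.exp(-((x - xc) ** 2 + (y - yc) ** 2) / (2.0 * s ** 2))
    return model

def fit_triple_exposure(image):
    """Fit the three exposures of one object on a small image stamp."""
    ny, nx = image.shape
    yg, xg = np.mgrid[0:ny, 0:nx]
    x, y, z = xg.ravel().astype(float), yg.ravel().astype(float), image.ravel().astype(float)

    # Crude initial guess: three peaks spread horizontally across the stamp.
    p0 = [z.min()]
    for xc in (nx * 0.25, nx * 0.5, nx * 0.75):
        p0 += [z.max() - z.min(), xc, ny / 2.0, 2.0]

    result = least_squares(lambda p: triple_gaussian(p, x, y) - z, p0)
    return result.x

# Synthetic noiseless stamp with three exposures at known positions.
ny, nx = 21, 31
yg, xg = np.mgrid[0:ny, 0:nx]
truth = [0.0, 1.0, 8.0, 10.0, 1.5, 1.0, 15.0, 10.0, 1.5, 1.0, 22.0, 10.0, 1.5]
stamp = triple_gaussian(np.array(truth),
                        xg.ravel().astype(float),
                        yg.ravel().astype(float)).reshape(ny, nx)
fitted = fit_triple_exposure(stamp)
```

On real plate data, the additional effects listed in the abstract (saturation, grid lines, adjacency effects) would have to be masked or modelled before a fit like this converges reliably.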

    A methodology for economic evaluation of cloud-based web applications

    Cloud technology is an attractive infrastructure solution to optimize the scalability and performance of web applications. The workload of these applications typically fluctuates between peak and valley loads, sometimes in an unpredictable way. Cloud systems can easily deal with this fluctuation because they provide customers with an almost unlimited on-demand infrastructure capacity using a pay-per-use model, which enables internet-based companies to pay for actual consumption instead of peak capacity. In this paradigm, this paper links the business model of an internet-based company to the performance evaluation of the infrastructure. To this end, the paper develops a new methodology for assessing the costs and benefits of implementing web-based applications in the cloud. Traditional performance models and indexes related to usage of the main system resources (such as processor, memory, storage, and bandwidth) have been reformulated to include new metrics (such as customer losses and service costs) that are useful for business managers. Additionally, the proposed methodology has been illustrated with a case study of a typical e-commerce scenario. Experimental results show that the proposed metrics enable internet-based companies to estimate the cost of adopting a particular cloud configuration more accurately in terms of the infrastructure cost and the cost of losing customers due to performance degradation. Consequently, the methodology can be a useful tool to assess the feasibility of business plans.
    This work has been partially supported by the Spanish Ministry of Economy and Competitiveness under Grant TIN2013-43913-R.
    Domenech, J.; Peña Ortiz, R.; Gil, JA.; Pont Sanjuan, A. (2016). A methodology for economic evaluation of cloud-based web applications. International Journal of Information Technology and Decision Making. 15(6):1555-1578. https://doi.org/10.1142/S021962201650036X
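A minimal sketch of the kind of combined metric the methodology proposes: total cost as infrastructure cost plus the revenue lost from customers turned away when demand exceeds provisioned capacity. All function names, prices, rates, and the saturation rule below are hypothetical simplifications; the paper's actual model is richer (per-resource indexes and response-time-driven losses).

```python
def total_cost(req_rate, capacity, price_per_instance, instances,
               conversion_rate, avg_order_value):
    """Hypothetical cost model: infrastructure cost plus revenue lost
    from customers that cannot be served once capacity is exceeded."""
    infra_cost = instances * price_per_instance
    served = min(req_rate, capacity * instances)   # requests actually handled
    lost_customers = req_rate - served             # overflow is assumed lost
    loss_cost = lost_customers * conversion_rate * avg_order_value
    return infra_cost + loss_cost

# Compare two hypothetical configurations for the same peak demand.
small = total_cost(req_rate=1000, capacity=300, price_per_instance=5.0,
                   instances=2, conversion_rate=0.02, avg_order_value=40.0)
large = total_cost(req_rate=1000, capacity=300, price_per_instance=5.0,
                   instances=4, conversion_rate=0.02, avg_order_value=40.0)
```

With these made-up numbers the over-provisioned configuration is cheaper overall, which is exactly the trade-off the proposed metrics are designed to expose.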

    On the genesis of the Haumea system

    The scenarios proposed in the literature for the genesis of the system formed by the dwarf planet 136108 Haumea, its two satellites and a group of some 10 bodies (the family) with semimajor axes, eccentricities and inclinations close to Haumea's values, are analysed against collisional, physical, dynamical and statistical arguments in order to assess their likelihood. All scenarios based on collisional events are reviewed under physical arguments, and the corresponding formation probabilities in a collisional environment are evaluated according to the collisional evolution model ALICANDEP. An alternative mechanism is proposed based on the possibility of a (quasi-)independent origin of the family with respect to Haumea and its satellites. As a general conclusion, the formation of the Haumea system is a low-probability event in the currently assumed frame for the evolution of the outer Solar system. However, it is possible that current knowledge is missing some key element in the whole story that may increase the odds for the formation of such a system.
    Facultad de Ciencias Astronómicas y Geofísica

    Transneptunian objects and Centaurs from light curves

    We analyze a vast light curve database by obtaining mean rotational properties of the entire sample, determining the spin frequency distribution and comparing those data with a simple model based on hydrostatic equilibrium. For the rotation periods, the mean value obtained is 6.95 h for the whole sample, 6.88 h for the Trans-neptunian objects (TNOs) alone and 6.75 h for the Centaurs. From Maxwellian fits to the rotational frequency distribution, the mean rotation rates are 7.35 h for the entire sample, 7.71 h for the TNOs alone and 8.95 h for the Centaurs. These results are obtained by applying the criterion of considering a single-peak light curve for objects with amplitudes lower than 0.15 mag and a double-peak light curve for objects with variability >0.15 mag. The best Maxwellian fits were obtained with the threshold between 0.10 and 0.15 mag. The mean light-curve amplitude for the entire sample is 0.26 mag, 0.25 mag for TNOs only, and 0.26 mag for the Centaurs. The amplitude versus Hv correlation clearly indicates that the smaller (and collisionally evolved) objects are more elongated than the bigger ones. From the model results, it appears that hydrostatic equilibrium can explain the statistical results of almost the entire sample, which means hydrostatic equilibrium is probably reached by almost all TNOs in the H range [-1,7]. This implies that for plausible albedos of 0.04 to 0.20, objects with diameters from 300 km down to even 100 km would likely be in equilibrium. Thus, the great majority of objects would qualify as being dwarf planets because they would meet the hydrostatic equilibrium condition. The best model density corresponds to 1100 kg/m3.
    Comment: 21 pages, 8 figures. Astronomy & Astrophysics, in pres
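A Maxwellian fit to spin frequencies like the one described above can be sketched with the maximum-likelihood estimate of the scale parameter: for f(ω) ∝ ω² exp(−ω²/2σ²), the MLE is σ² = ⟨ω²⟩/3, and the distribution's mean is 2σ√(2/π). The sample periods below are invented, and the paper's actual procedure may differ (e.g. binned χ² fits to the frequency histogram); this is only a sketch of the technique.

```python
import numpy as np

def maxwellian_sigma(frequencies):
    """Maximum-likelihood scale of a Maxwellian f(w) ~ w^2 exp(-w^2 / (2 s^2)):
    s^2 = <w^2> / 3."""
    w = np.asarray(frequencies, dtype=float)
    return np.sqrt(np.mean(w ** 2) / 3.0)

def mean_rotation_period_hours(periods_hours):
    """Convert periods to spin frequencies (cycles/day), fit the Maxwellian,
    and return the mean period implied by the fitted distribution."""
    w = 24.0 / np.asarray(periods_hours, dtype=float)  # cycles per day
    s = maxwellian_sigma(w)
    mean_freq = 2.0 * s * np.sqrt(2.0 / np.pi)  # mean of a Maxwellian
    return 24.0 / mean_freq

# Illustrative rotation periods in hours (not the published sample):
periods = [5.5, 6.0, 6.8, 7.1, 7.4, 8.0, 8.8, 9.5]
mean_p = mean_rotation_period_hours(periods)
```

Because the fit is done in frequency space, the mean period implied by the Maxwellian generally differs from the plain average of the periods, which is why the abstract quotes both kinds of values.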

    Water-ice driven activity on Main-Belt Comet P/2010 A2 (LINEAR) ?

    The dust ejecta of Main-Belt Comet P/2010 A2 (LINEAR) have been observed with several telescopes at the Observatorio del Roque de los Muchachos on La Palma, Spain. Application of an inverse dust tail Monte Carlo method to the images of the dust ejecta from the object indicates that a sustained, likely water-ice driven, activity over some eight months is the mechanism responsible for the formation of the observed tail. The total amount of dust released is estimated to be 5E7 kg, which represents about 0.3% of the nucleus mass. While the event could have been triggered by a collision, this cannot be decided from the currently available data.
    Comment: Accepted for ApJ Letter
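The two quoted figures together imply a nucleus mass: if the released dust (~5 × 10⁷ kg) is ~0.3% of the nucleus mass, the nucleus is roughly 1.7 × 10¹⁰ kg. A one-line check of that arithmetic:

```python
# Implied nucleus mass from the numbers quoted in the abstract.
dust_mass = 5e7          # kg, total dust released
dust_fraction = 0.003    # dust is ~0.3% of the nucleus mass
nucleus_mass = dust_mass / dust_fraction  # ~1.7e10 kg
```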

    Analyzing web server performance under dynamic user workloads

    The increasing popularity of web applications has introduced a new paradigm in which users are no longer passive web consumers but active contributors to the web, especially in the contexts of social networking, blogs, wikis and e-commerce. In this new paradigm, contents and services are even more dynamic, which consequently increases the level of dynamism in users' behavior. Moreover, this trend is expected to grow in the future web. This dynamism is a major obstacle to defining and modeling representative web workloads; in fact, this characteristic is not fully represented in most current web workload generators. This work shows that the dynamic behavior of web users is a crucial point that must be addressed in web performance studies in order to accurately estimate system performance indexes. In this paper, we analyze the effect of using a more realistic dynamic workload on web performance metrics. To this end, we evaluate a typical e-commerce scenario and compare the results obtained using different levels of dynamic workload instead of traditional workloads. Experimental results show that, when a more dynamic and interactive workload is taken into account, performance indexes can differ widely and noticeably affect the stress borderline of the server. For instance, processor usage can increase by 30% due to dynamism, negatively affecting the average response time perceived by users, which can also translate into unwanted effects on marketing and loyalty policies. © 2012 Elsevier B.V. All rights reserved.
    This work has been partially supported by the Spanish Ministry of Science and Innovation under Grant TIN-2009-08201.
    Peña Ortiz, R.; Gil Salinas, JA.; Sahuquillo Borrás, J.; Pont Sanjuan, A. (2013). Analyzing web server performance under dynamic user workloads. Computer Communications. 36(4):386-395. https://doi.org/10.1016/j.comcom.2012.11.005
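The kind of effect reported above can be illustrated with a toy open-queue model: for the same user population, a more interactive (dynamic) workload raises the effective request rate, which raises utilization and, non-linearly, response time. The service and arrival rates below are hypothetical, chosen so the utilization increase matches the 30% figure quoted in the abstract; the paper's results come from measurements on an actual e-commerce testbed, not from this formula.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Open M/M/1 approximation: server utilization and mean response time."""
    rho = arrival_rate / service_rate
    assert rho < 1.0, "server saturated"
    resp = 1.0 / (service_rate - arrival_rate)
    return rho, resp

# Hypothetical rates (requests/s): the 'dynamic' workload generates more
# requests per user session than the traditional static one.
static_rho, static_rt = mm1_metrics(arrival_rate=50.0, service_rate=100.0)
dynamic_rho, dynamic_rt = mm1_metrics(arrival_rate=65.0, service_rate=100.0)
utilization_increase = (dynamic_rho - static_rho) / static_rho  # 0.30
```

Note how a 30% utilization increase produces a proportionally larger response-time increase as the server approaches its stress borderline, which is the effect the abstract warns about.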

    Sex-Specific Effects of High Yolk Androgen Levels on Constitutive and Cell-Mediated Immune Responses in Nestlings of an Altricial Passerine

    Avian embryos are exposed to yolk androgens that are incorporated into the egg by the ovulating female. These steroids can affect several aspects of embryo development, often resulting in increases in overall size or the speed of growth of different traits. However, several studies suggest that they also entail immune costs to the offspring. In this study, we explored whether variation in yolk androgen concentration affected several measures of the constitutive and cell-mediated immune axes in the spotless starling (Sturnus unicolor). Using a within-brood design, we injected different doses of androgens (testosterone and androstenedione) into the eggs. Our study showed that experimentally increased yolk androgens led to sex-specific immunosuppression in both the innate and adaptive axes of the immune system. Both cell-mediated immune response (CMI) and lysozyme activity decreased with increasing androgen levels injected into the egg in the case of male nestlings, whereas there were no effects on females. The effects that we found were always linear: no quadratic or threshold patterns were detected. We found no effects of the experimental treatment on hemolysis or agglutination capacity, but these measures were negatively correlated with CMI, suggesting a negative correlation among different branches of the immune system. Blood (trypanosomes and hemosporidians) and intestinal (coccidia) parasites were not affected by the experimental increase of yolk androgen levels. Our results show that in our study species yolk androgens induce immunosuppression in some axes of the male nestling immune system. Further studies should analyze the proximate causes of these contrasting effects in different axes of the immune system and the reason for the differential impact on males and females.

    MOGEDA: Modelo Genérico de Desensamblado Automático (Generic Model for Automatic Disassembly)

    Product disassembly is the key to the recycling process. This article addresses the modelling of the automatic disassembly process for products. It examines both the requirements for tackling the process automatically and the tools needed to carry it out: a model-based knowledge base, and techniques for three-dimensional recognition and localization of objects using computer vision.
    Both the work carried out and future work are framed within the CICYT project "Sistema Robotizado de Desensamblado Automático basado en Modelos y Visión Artificial" (Robotized System for Automatic Disassembly based on Models and Computer Vision, TAP1999-0436).