211 research outputs found

    High-Performance Cloud Computing: A View of Scientific Applications

    Scientific computing often requires the availability of a massive number of computers for performing large-scale experiments. Traditionally, these needs have been addressed with high-performance computing solutions and installed facilities such as clusters and supercomputers, which are difficult to set up, maintain, and operate. Cloud computing gives scientists a completely new model for utilizing computing infrastructure: compute resources, storage resources, and applications can be dynamically provisioned (and integrated within the existing infrastructure) on a pay-per-use basis, and released when they are no longer needed. Such services are often offered within the context of a Service Level Agreement (SLA), which ensures the desired Quality of Service (QoS). Aneka, an enterprise Cloud computing solution, harnesses the power of compute resources by relying on private and public Clouds and delivers the desired QoS to users. Its flexible, service-based infrastructure supports multiple programming paradigms that let Aneka address a variety of scenarios, from finance applications to computational science. As examples of scientific computing in the Cloud, we present a preliminary case study on using Aneka for the classification of gene expression data and the execution of an fMRI brain imaging workflow. Comment: 13 pages, 9 figures, conference paper
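
    The abstract above describes a task-farming pattern: many independent units of work (for example, classifying one gene-expression sample each) submitted to an elastic pool of provisioned resources. Aneka's actual SDK is .NET; the Java sketch below only illustrates that pattern, with a local thread pool standing in for the cloud back end, and every name in it is a hypothetical placeholder.

        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.*;

        public class TaskFarmSketch {
            // One independent unit of work per gene-expression sample.
            record Sample(String id, double[] expression) {}

            // Placeholder classifier: sign of the mean expression level.
            static String classify(Sample s) {
                double mean = 0;
                for (double v : s.expression) mean += v;
                return mean / s.expression.length >= 0 ? "case" : "control";
            }

            public static void main(String[] args) throws Exception {
                // The pool models resources provisioned on demand and
                // released when no longer needed (pay-per-use in a real Cloud).
                ExecutorService cloud = Executors.newFixedThreadPool(8);
                List<Sample> samples = List.of(
                        new Sample("s1", new double[]{0.4, -0.1, 0.7}),
                        new Sample("s2", new double[]{-0.9, -0.2, 0.1}));
                List<Future<String>> results = new ArrayList<>();
                for (Sample s : samples)
                    results.add(cloud.submit(() -> classify(s)));
                for (int i = 0; i < samples.size(); i++)
                    System.out.println(samples.get(i).id() + " -> " + results.get(i).get());
                cloud.shutdown(); // release the "provisioned" resources
            }
        }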

    Aneka: A Software Platform for .NET-based Cloud Computing

    Aneka is a platform for deploying Clouds and developing applications on top of them. It provides a runtime environment and a set of APIs that allow developers to build .NET applications that run their computation on either public or private Clouds. One of the key features of Aneka is its support for multiple programming models, which are ways of expressing the execution logic of applications through specific abstractions. This is accomplished by creating a customizable and extensible service-oriented runtime environment represented by a collection of software containers connected together. By leveraging this architecture, advanced services including resource reservation, persistence, storage management, security, and performance monitoring have been implemented. On top of this infrastructure, different programming models can be plugged in to support different scenarios, as demonstrated by the engineering, life science, and industry applications. Comment: 30 pages, 10 figures
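
    Since this abstract centers on pluggable programming models, here is a minimal Java sketch of that abstraction under stated assumptions: the interface and the bag-of-tasks model below are illustrative inventions, not Aneka's real (.NET) API.

        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.*;

        // A programming model hides how units of work are expressed and run.
        interface ProgrammingModel {
            void submit(Runnable unitOfWork);  // a task, a thread, a map step...
            void awaitCompletion() throws Exception;
        }

        // One pluggable model: independent bag-of-tasks execution.
        class TaskModel implements ProgrammingModel {
            private final ExecutorService pool = Executors.newCachedThreadPool();
            private final List<Future<?>> pending = new ArrayList<>();

            public void submit(Runnable unitOfWork) {
                pending.add(pool.submit(unitOfWork));
            }

            public void awaitCompletion() throws Exception {
                for (Future<?> f : pending) f.get(); // propagate failures
                pool.shutdown();
            }
        }

        public class ModelDemo {
            public static void main(String[] args) throws Exception {
                ProgrammingModel model = new TaskModel();
                for (int i = 0; i < 4; i++) {
                    int n = i;
                    model.submit(() -> System.out.println("unit " + n + " done"));
                }
                model.awaitCompletion();
            }
        }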

    Plaga

    The Plague, understood as a threat that inflicts massive harm on a community, is a construction that at times operates as a strategy for identifying and segregating an "other". Starting from the question "What things provoke our rejection?", this project proposes a reflection on the constitution of an identity that is heterogeneous and in transit. To this end, the plague becomes a metaphor for those others in a relocation that operates, through large-scale prints of vermin, both in public space and inside the Centro de Arte de la UNLP. In both places, a filthy fauna seeps in, penetrates, and takes over. In this way, the definition of one's own identity is called into question, seeking to widen sensitive borders by incorporating the subjectivity of the other. La Plaga thus reveals new possibilities within its own social context. Facultad de Bellas Artes (FBA)

    InterCloud: Utility-Oriented Federation of Cloud Computing Environments for Scaling of Application Services

    Cloud computing providers have set up several data centers at different geographical locations over the Internet in order to optimally serve the needs of their customers around the world. However, existing systems do not support mechanisms and policies for dynamically coordinating load distribution among different Cloud-based data centers to determine the optimal location for hosting application services that achieve reasonable QoS levels. Further, Cloud computing providers are unable to predict the geographic distribution of users consuming their services, so load coordination must happen automatically and the distribution of services must change in response to changes in the load. To counter this problem, we advocate the creation of a federated Cloud computing environment (InterCloud) that facilitates just-in-time, opportunistic, and scalable provisioning of application services, consistently achieving QoS targets under variable workload, resource, and network conditions. The overall goal is to create a computing environment that supports dynamic expansion or contraction of capabilities (VMs, services, storage, and databases) for handling sudden variations in service demands. This paper presents the vision, challenges, and architectural elements of InterCloud for utility-oriented federation of Cloud computing environments. The proposed InterCloud environment supports scaling of applications across multiple vendor Clouds. We have validated our approach by conducting a set of rigorous performance evaluation studies using the CloudSim toolkit. The results demonstrate that the federated Cloud computing model has immense potential, offering significant performance gains in response time and cost savings under dynamic workload scenarios. Comment: 20 pages, 4 figures, 3 tables, conference paper
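
    As a concrete illustration of the coordination problem described above, the Java sketch below shows one plausible placement policy for a federation broker: route each new service request to the member data center with the most spare capacity. All names are hypothetical assumptions; the paper's own evaluation uses the CloudSim toolkit rather than this toy model.

        import java.util.Comparator;
        import java.util.List;

        public class FederationBroker {
            record DataCenter(String region, int capacity, int load) {
                double utilization() { return (double) load / capacity; }
            }

            // Least-utilized-first placement; a real broker would also weigh
            // network latency to the user and per-provider cost.
            static DataCenter place(List<DataCenter> federation) {
                return federation.stream()
                        .min(Comparator.comparingDouble(DataCenter::utilization))
                        .orElseThrow();
            }

            public static void main(String[] args) {
                List<DataCenter> federation = List.of(
                        new DataCenter("us-east", 100, 90),
                        new DataCenter("eu-west", 80, 40),
                        new DataCenter("ap-south", 60, 45));
                System.out.println("Place on: " + place(federation).region());
            }
        }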

    Development of local electrical resistance imaging by conductive-tip AFM in intermittent contact mode

    The atomic force microscope (AFM) can characterize the surfaces of many kinds of samples with excellent spatial resolution and can be operated in a variety of environments. This versatility has encouraged the development of a large number of derived techniques intended to probe various local physical properties. The LGEP has thus built a module, the Résiscope, capable of measuring the local electrical resistance at the surface of a continuously biased sample over a range of 11 decades. Developed for contact mode, in which the tip continuously exerts a force on the sample, this technique works well on hard materials but reaches its limits on soft or fragile samples, since under certain conditions the tip can damage the surface. For such samples an intermittent contact mode, in which the tip touches the surface very briefly at regular intervals, is more appropriate, but it complicates the electrical measurements. The aim of this thesis was to overcome this difficulty by modifying the Résiscope so that it can be coupled to the Pulsed Force Mode, an intermittent mode in which the tip oscillates at a frequency of 100 Hz to 2000 Hz.
    Various hardware and software changes were made to allow detailed time-resolved monitoring of the electrical resistance signal at each make/break of contact (necessary for examining the phenomena related to intermittency), as well as to work at acceptable scan speeds. For imaging, the best contrasts were obtained with synchronization and processing electronics that take the electrical resistance values at precisely defined instants. To test this new system, we first compared the resistance and deflection curves obtained in this mode with those classically considered in force-distance curve mode. We then investigated the influence of the main parameters (oscillation frequency and amplitude, setpoint force, tip coating, etc.) on the topographic and electrical measurements, using HOPG as a reference material. These tests highlighted a nearly systematic delay of the electrical signal with respect to the deflection signal (beyond the Résiscope's own measurement time), whose origin we were unable to elucidate.
    With this knowledge in hand, we studied two types of organic samples: one academic in nature, self-assembled monolayers of alkanethiols (SAMs), and the other more application-oriented, thin films of an interpenetrating network of two components (P3HT:PCBM) for photovoltaic cells. In both cases we showed the relevance of the Résiscope tool in intermittent mode for obtaining qualitative and quantitative information. Alongside this work on fragile materials, we conducted a side study of a growth of insulating material observed under particular conditions on various hard materials, which was interpreted as the formation of friction polymer resulting from the repeated nano-sliding associated with the deflection of the cantilever. This work was carried out under a CIFRE agreement with the company Concept Scientific Instruments, backed by the ANR "MELAMIN" (P2N 2011) project.
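
    The key signal-processing step this abstract describes is keeping the resistance value measured at a precise instant of each make/break cycle. The Java sketch below shows one plausible reading of that step under stated assumptions (a fixed number of digitized samples per oscillation cycle, with "deepest contact" taken as the deflection maximum); signal names and units are illustrative, not the Résiscope's actual implementation.

        public class PulsedForceSampling {
            // deflection[i] and resistanceOhm[i] are digitized in lockstep;
            // each oscillation cycle spans samplesPerCycle consecutive points.
            static double[] resistancePerCycle(double[] deflection,
                                               double[] resistanceOhm,
                                               int samplesPerCycle) {
                int cycles = deflection.length / samplesPerCycle;
                double[] out = new double[cycles];
                for (int c = 0; c < cycles; c++) {
                    int best = c * samplesPerCycle;
                    for (int i = best; i < (c + 1) * samplesPerCycle; i++)
                        if (deflection[i] > deflection[best]) best = i;
                    out[c] = resistanceOhm[best]; // value at deepest contact
                }
                return out;
            }

            public static void main(String[] args) {
                double[] defl = {0, 1, 2, 1, 0, 1, 3, 1};   // two 4-sample cycles
                double[] res  = {9e9, 1e6, 5e4, 1e6, 9e9, 1e6, 2e4, 1e6};
                for (double r : resistancePerCycle(defl, res, 4))
                    System.out.println(r + " ohm");
            }
        }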

    Cloudbus Toolkit for Market-Oriented Cloud Computing

    This keynote paper: (1) presents the 21st-century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing environments by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; and (4) presents the work carried out as part of our new Cloud computing initiative, called Cloudbus: (i) Aneka, a Platform-as-a-Service software system containing an SDK (Software Development Kit) for constructing Cloud applications and deploying them on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of third-party Cloud brokering services for building content delivery networks and e-Science applications and deploying them on the capabilities of IaaS providers such as Amazon, along with Grid mashups; (iv) CloudSim, supporting modelling and simulation of Clouds for performance studies; (v) energy-efficient resource allocation mechanisms and techniques for the creation and management of Green Clouds; and (vi) pathways for future research. Comment: 21 pages, 6 figures, 2 tables, conference paper
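
    Item (3) above concerns SLA-oriented, market-based allocation. Here is a minimal sketch of that idea, assuming a simple pricing model and invented names (none of this is the Cloudbus API): pick the cheapest provider offer that still meets the job's deadline.

        import java.util.List;
        import java.util.Optional;

        public class SlaScheduler {
            record Offer(String provider, double pricePerHour, double hoursToFinish) {}

            // Cheapest total cost among offers that satisfy the SLA deadline.
            static Optional<Offer> choose(List<Offer> offers, double deadlineHours) {
                return offers.stream()
                        .filter(o -> o.hoursToFinish() <= deadlineHours)
                        .min((a, b) -> Double.compare(
                                a.pricePerHour() * a.hoursToFinish(),
                                b.pricePerHour() * b.hoursToFinish()));
            }

            public static void main(String[] args) {
                var offers = List.of(
                        new Offer("providerA", 0.50, 10),
                        new Offer("providerB", 0.30, 20),
                        new Offer("providerC", 0.45, 12));
                // With a 15-hour deadline, providerA wins (total cost 5.0).
                System.out.println(choose(offers, 15).map(Offer::provider).orElse("none"));
            }
        }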