22 research outputs found
Licensing the Use of Grid Services
In this paper, a flexible approach to licensing the use of WSRF-compliant Grid services, as implemented in the Globus Toolkit 4, is presented. A license definition and recombination language is introduced that allows new licenses to be created on demand in a fine-grained, user-dependent manner. Implementation issues for several components of the proposed licensing system are described.
Floating License Management - Automation Using Web Technologies
This paper examines the use of distributed computing based on web services, with application to floating license management. The main goal is to automate the processes pertaining to management activities while ensuring that security and flexibility requirements are met. We present the challenges posed by these requirements and propose a design, along with some implementation aspects, using the latest .NET development platform.
WISDOM: A Grid-Enabled Drug Discovery Initiative Against Malaria
The goal of this chapter is to present the WISDOM initiative, which is one of
the main accomplishments in the use of grids for biomedical sciences
achieved on grid infrastructures in Europe. Researchers in life sciences are
among the most active scientific communities on the EGEE infrastructure.
As a consequence, the biomedical virtual organization stands fourth in
terms of resources consumed in 2007, with an average of 7,000 jobs submitted
every day to the grid and more than 4 million hours of CPU consumed in
the last 12 months. Only three experiments on the CERN Large Hadron
Collider have used more resources. Compared to particle physics, the use of
resources is much less centralized, as about 40 different scientific applications
are currently deployed on EGEE. Each of them requires an amount
of CPU which ranges from a few to a few hundred CPU years. Thanks to the
20,000 processors available to the users of the biomedical virtual organization,
crunching factors in the hundreds are routinely observed. Such
performance was previously achievable only on supercomputers, and at the cost of
reservations and long delays in accessing resources. In contrast, grid
infrastructures are continuously open to their user communities.
Such changes in the scale of the computing resources made continuously
available to researchers in the biomedical sciences open opportunities for
exploring new fields or changing the approach to existing challenges. In
this chapter, we show the potential impact of grids in the field
of drug discovery through the example of the WISDOM initiative.
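The crunching factor cited above is simply the ratio of accumulated CPU time to elapsed wall-clock time. A minimal sketch, using the figures quoted in the abstract (roughly 4 million CPU hours over 12 months) to check that the claim of "crunching factors in the hundreds" is plausible:

```python
def crunching_factor(cpu_hours: float, wall_hours: float) -> float:
    """Ratio of total CPU time consumed to elapsed wall-clock time."""
    return cpu_hours / wall_hours

# Figures quoted for the biomedical virtual organization:
# ~4 million CPU hours consumed over the last 12 months.
cpu_hours = 4_000_000
wall_hours = 365 * 24  # one year of wall-clock time

print(round(crunching_factor(cpu_hours, wall_hours)))  # 457, i.e. in the hundreds
```

This averages over the whole year; individual applications running in bursts on the 20,000 available processors would see factors in the same range or higher.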
Requirements Management Tools: A Quantitative Assessment
This report is primarily aimed at people with some background in Requirements Engineering, or at practitioners wishing to assess tools available for managing requirements. We provide a starting point for this assessment by presenting a brief survey of existing Requirements Management tools. As part of the survey, we characterize a set of requirements management tools by outlining their features, capabilities and goals. The characterization offers a foundation for selecting, and possibly customizing, a requirements engineering tool for a software project. This report consists of three parts. In Part I we define the terms requirements and requirements engineering and briefly point out the main components of the requirements engineering process. In Part II, we survey the characteristics and capabilities of 6 popular requirements management tools available on the market. We enumerate the salient features of each of these tools. In Part III, we briefly describe a Synergistic Environment for Requirement Generation. This environment captures additional tools augmenting the requirements generation process. A description of these tools is provided. In the concluding section, we present a discussion defining the ideal set of characteristics that should be embodied in a requirements management tool. This report is adapted from a compendium of assignments prepared by the students in a Requirements Engineering class offered in the Department of Computer Science at Virginia Tech.
Grid enabled high throughput virtual screening against four different targets implicated in malaria
After having deployed a first data challenge on malaria and a second on avian flu, in summer 2005 and spring 2006 respectively, we demonstrate here again how efficiently computational grids can be used to produce massive docking data at high throughput. Over more than two and a half months, we achieved at least 140 million dockings, representing an average throughput of almost 80,000 dockings per hour. This was made possible by the availability of thousands of CPUs through different infrastructures worldwide. Through the experience acquired, the WISDOM production environment is evolving to enable easy and fault-tolerant deployment of biological tools; in this case, the FlexX commercial docking software is used to dock the whole ZINC database against 4 different targets.
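The quoted throughput figures are mutually consistent, as a quick back-of-the-envelope check shows (assuming roughly 2.5 months of continuous running; the exact campaign duration is not given):

```python
# Rough consistency check of the quoted throughput.
dockings = 140_000_000          # "at least 140 million dockings"
hours = 2.5 * 30 * 24           # ~2.5 months of wall-clock time, ~1800 hours

print(round(dockings / hours))  # 77778, i.e. "almost 80,000 dockings per hour"
```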
The Feature-Architecture Mapping Method for Feature-Oriented Development of Software Product Lines
Software product lines are the answer of software engineering to the
increasing complexity and shorter time-to-market of contemporary software
systems. Nonetheless, software product lines demand advanced
maintainability and high flexibility. The latter can be achieved through the
proper separation of concerns. Features pose the main concerns in the
context of software product lines. Consequently, one feature should ideally
be implemented in exactly one architectural component. In practice, this
is not always feasible. Therefore, at least a strong mapping between
features and the architecture must exist. The state-of-the-art product line
development methodologies introduce significant scattering and tangling
of features. In this work, the Feature-Architecture Mapping (FArM) method is
developed to provide a stronger mapping between features and the product
line architecture. FArM receives as input an initial feature model created
by a domain analysis method. The initial feature model undergoes a series of
transformations. The transformations strive to achieve a balance between the
customer and architectural perspectives. Feature interaction is explicitly
optimized during the feature model transformations. For each feature of the
transformed feature model, one architectural component is derived. The
architectural components implement the application logic of the
respective features. The component communication reflects the feature
interaction. This approach, compared to the state-of-the-art product line
methodologies, allows a stronger feature-architecture mapping and higher
variability on the feature level. These attributes provide higher
maintainability and an improved generative approach to product
instantiation, which in turn enhances product line flexibility. FArM has
been evaluated through its application in a number of domains, e.g. the
mobile phone domain and the Integrated Development Environment (IDE) domain.
This work presents FArM on the basis of a case study in the domain
of artificial Neural Networks.
Jetstream: A self-provisioned, scalable science and engineering cloud environment
The paper describes the motivation behind Jetstream, its functions, hardware configuration, software environment, user interface, design, use cases, relationships with other projects such as Wrangler and iPlant, and challenges in implementation. Funded by National Science Foundation Award #ACI-144560.
Proceedings of the 1994 Monterey Workshop, Increasing the Practical Impact of Formal Methods for Computer-Aided Software Development: Evolution Control for Large Software Systems Techniques for Integrating Software Development Environments
Office of Naval Research, Advanced Research Projects Agency, Air Force Office of Scientific Research, Army Research Office, Naval Postgraduate School, National Science Foundation
Analyzing and Developing Aspects of the Artist Pipeline for Clemson University Art
Major digital production facilities such as Sony Pictures Imageworks, Pixar Animation Studios, Walt Disney Animation Studios, and Epic Games use a production system called a pipeline. The term "pipeline" refers to the structure and process of data flow between the various phases of production, from story to final edit. This paper examines current production pipeline practices in the Digital Production Arts program at Clemson University and proposes updates and modifications to the workflow. Additionally, this thesis suggests tools intended to improve the pipeline with artist-friendly interfaces and customizable integration between software and remote-production capabilities.