
    Focus Issue on Legacy Information Systems and Business Process Change: Migrating Large-Scale Legacy Systems to Component-Based and Object Technology: The Evolution of a Pattern Language

    The process of developing large-scale, business-critical software systems must boost the productivity of both the users and the developers of software, while at the same time responding flexibly to changing business requirements in the face of sharpening competition. Historically, these two forces were viewed as mutually hostile. Component-based software development using object technology promises a way of resolving this apparent contradiction. This paper presents a successful new approach to migrating an existing system to a new form that focuses primarily on the architecture of the software system. Best practice is captured by software patterns that address not only design but also process and organizational issues. The approach was developed through four completed, successful live projects in different business and technical areas, and resulted in a still-evolving pattern language called ADAPTOR (Architecture-Driven and Pattern-based Techniques for Object Re-engineering). This article outlines the approach that underlies ADAPTOR. It challenges popular notions of legacy systems by emphasizing business requirements. Architectural approaches to migration are then contrasted with traditional reverse-engineering approaches, including the weakness of reverse engineering in the face of paradigm shifts. Finally, the evolution of the ADAPTOR pattern language is outlined, with a brief history of the projects from which the patterns were abstracted.
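
    The patterns themselves operate at the process and organizational level; as a purely illustrative companion, the sketch below shows one low-level move that such migrations often begin with: wrapping a legacy procedural routine behind an object interface. All names here (legacy_calc_balance, Account) are invented for this example and are not taken from ADAPTOR.

    # Hypothetical stand-in for an existing procedural routine in the legacy system.
    def legacy_calc_balance(account_id, txn_amounts):
        return sum(txn_amounts)

    class Account:
        # Object facade: new clients program against this interface while the
        # implementation still delegates to the untouched legacy code.
        def __init__(self, account_id):
            self.account_id = account_id
            self._transactions = []

        def post(self, amount):
            self._transactions.append(amount)

        def balance(self):
            # Delegation point: a native implementation can replace the legacy
            # call later without changing the interface clients depend on.
            return legacy_calc_balance(self.account_id, self._transactions)

    acct = Account(42)
    acct.post(100.0)
    acct.post(-30.0)
    print(acct.balance())  # 70.0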

    A Methodological Framework for Socio-Cognitive Analyses of Collaborative Design of Open Source Software

    Open Source Software (OSS) development challenges traditional software engineering practices. In particular, OSS projects are managed by large numbers of volunteers, working freely on the tasks they choose to undertake. OSS projects also rarely rely on explicit system-level design, or on project plans or schedules. Moreover, OSS developers work in arbitrary locations and collaborate almost exclusively over the Internet, using simple tools such as email and software code-tracking databases (e.g. CVS). All of these characteristics make OSS development akin to weaving a tapestry of heterogeneous components. The OSS design process relies on various types of actors: people with prescribed roles, but also elements coming from a variety of information spaces (such as email and software code). The objective of our research is to understand the specific hybrid weaving accomplished by the actors of this distributed, collective design process. This, in turn, challenges the traditional methodologies used to understand distributed software engineering: OSS development is simply too "fibrous" to lend itself well to analysis under a single methodological lens. In this paper, we describe the methodological framework we articulated to analyze collaborative design in the Open Source world. Our framework focuses on the links between the heterogeneous components of a project's hybrid network. We combine ethnography, text mining, and socio-technical network analysis and visualization to understand OSS development in its totality. In this way, we are able to simultaneously consider the social, technical, and cognitive aspects of OSS development. We describe our methodology in detail and discuss its implications for future research on distributed collective practices.
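
    As a concrete illustration of the socio-technical network component of such a framework, the sketch below builds a bipartite developer-artifact graph and projects it onto the social layer. The records are invented; a real analysis would extract them from mailing-list archives and the CVS history, and this is only one plausible realization of the idea, not the authors' implementation.

    import networkx as nx
    from networkx.algorithms import bipartite

    # Invented toy records: (developer, artifact touched by a commit).
    commits = [
        ("alice", "parser.c"),
        ("alice", "lexer.c"),
        ("bob", "parser.c"),
        ("carol", "docs/README"),
    ]

    # Bipartite socio-technical network: people on one side, artifacts on the other.
    G = nx.Graph()
    for developer, artifact in commits:
        G.add_node(developer, kind="person")
        G.add_node(artifact, kind="artifact")
        G.add_edge(developer, artifact)

    # Project onto the social layer: developers are linked when they co-modify a file.
    people = [n for n, d in G.nodes(data=True) if d["kind"] == "person"]
    social = bipartite.weighted_projected_graph(G, people)
    print(list(social.edges(data=True)))  # [('alice', 'bob', {'weight': 1})]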

    Integrating Seismic Activity Into Land Use Management: A Case Study From Central Arkansas Using HAZUS Software Application

    Almost 20 years after a remarkable swarm of more than 30,000 micro-earthquakes, a new swarm revisited the same region of central Arkansas, less than 30 miles northeast of Conway, Arkansas. A main shock on May 4, 2001 of magnitude MR = 4.4 was followed by a large number of aftershocks, about 2,500 events over roughly two months, in a small crustal volume. Preliminary locations of aftershocks from the portable network, together with locations based on data from regional networks, led us to conclude that both swarms (2001 and 1982) occupy virtually the same crustal volume. In the following years several other active faults were found in Arkansas, yet few studies have investigated the potential damage that an earthquake would produce in central Arkansas. For this study, the HAZUS-MH software tool, developed by the Federal Emergency Management Agency and the National Institute of Building Sciences, was used to identify the areas most physically and socially vulnerable to earthquake ground shaking and to present earthquake loss estimations for downtown Conway, Arkansas. The central finding of this research is that the accuracy of the loss estimation depends on several factors. The greatest losses occurred when (a) strong ground shaking (greater than MR = 5.0) hit (b) unreinforced masonry, such as brick-and-mortar construction without rebar, and (c) commercial buildings, such as large open-beamed warehouses, (d) in the afternoon, 3pm-5pm.
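
    To make the dependence on building stock and shaking level concrete, here is a deliberately simplified expected-loss calculation in the spirit of (but far cruder than) the HAZUS-MH methodology: expected loss = replacement cost x P(damage | shaking). The probabilities and inventory below are invented for this sketch.

    # P(at least moderate damage) by building type and shaking level; the numbers
    # are made up, loosely mirroring the finding that unreinforced masonry fares worst.
    damage_prob = {
        ("unreinforced_masonry", "strong"):   0.60,
        ("unreinforced_masonry", "moderate"): 0.25,
        ("steel_frame", "strong"):            0.15,
        ("steel_frame", "moderate"):          0.05,
    }

    # Invented inventory: (building type, replacement cost in USD, shaking at site).
    inventory = [
        ("unreinforced_masonry",   400_000, "strong"),
        ("steel_frame",          1_200_000, "moderate"),
    ]

    total = sum(cost * damage_prob[(btype, shaking)]
                for btype, cost, shaking in inventory)
    print(f"expected loss: ${total:,.0f}")  # expected loss: $300,000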

    A unifying mathematical definition enables the theoretical study of the algorithmic class of particle methods.

    Mathematical definitions provide a precise, unambiguous way to formulate concepts. They also provide a common language between disciplines. Thus, they are the basis for a well-founded scientific discussion. In addition, mathematical definitions allow for deeper insights into the defined subject based on mathematical theorems that are incontrovertible under the given definition. Besides their value in mathematics, mathematical definitions are indispensable in other sciences like physics, chemistry, and computer science. In computer science, they help to derive the expected behavior of a computer program and provide guidance for the design and testing of software. Therefore, mathematical definitions can be used to design and implement advanced algorithms. One class of widely used algorithms in computer science is the class of particle-based algorithms, also known as particle methods. Particle methods can solve complex problems in various fields, such as fluid dynamics, plasma physics, or granular flows, using diverse simulation methods, including Discrete Element Methods (DEM), Molecular Dynamics (MD), Reproducing Kernel Particle Methods (RKPM), Particle Strength Exchange (PSE), and Smoothed Particle Hydrodynamics (SPH). Despite the increasing use of particle methods driven by improved computing performance, the relation between these algorithms remains formally unclear. In particular, particle methods lack a unifying mathematical definition and precisely defined terminology. This prevents determining whether an algorithm belongs to the class and what distinguishes the class. Here we present a rigorous mathematical definition of particle methods and demonstrate its importance by applying it to several canonical algorithms, as well as to algorithms not previously recognized as particle methods. Furthermore, we base proofs of theorems about parallelizability and computational power on it and use it to develop scientific computing software. Our definition unifies, for the first time, the so far loosely connected notions of particle methods.
    Thus, it marks the necessary starting point for a broad range of joint formal investigations and applications across fields.

    Contents:
    1 Introduction
      1.1 The Role of Mathematical Definitions
      1.2 Particle Methods
      1.3 Scope and Contributions of this Thesis
    2 Terminology and Notation
    3 A Formal Definition of Particle Methods
      3.1 Introduction
      3.2 Definition of Particle Methods
        3.2.1 Particle Method Algorithm
        3.2.2 Particle Method Instance
        3.2.3 Particle State Transition Function
      3.3 Explanation of the Definition of Particle Methods
        3.3.1 Illustrative Example
        3.3.2 Explanation of the Particle Method Algorithm
        3.3.3 Explanation of the Particle Method Instance
        3.3.4 Explanation of the State Transition Function
      3.4 Conclusion
    4 Algorithms as Particle Methods
      4.1 Introduction
      4.2 Perfectly Elastic Collision in Arbitrary Dimensions
      4.3 Particle Strength Exchange
      4.4 Smoothed Particle Hydrodynamics
      4.5 Lennard-Jones Molecular Dynamics
      4.6 Triangulation Refinement
      4.7 Conway's Game of Life
      4.8 Gaussian Elimination
      4.9 Conclusion
    5 Parallelizability of Particle Methods
      5.1 Introduction
      5.2 Particle Methods on Shared Memory Systems
        5.2.1 Parallelization Scheme
        5.2.2 Lemmata
        5.2.3 Parallelizability
        5.2.4 Time Complexity
        5.2.5 Application
      5.3 Particle Methods on Distributed Memory Systems
        5.3.1 Parallelization Scheme
        5.3.2 Lemmata
        5.3.3 Parallelizability
        5.3.4 Bounds on Time Complexity and Parallel Scalability
      5.4 Conclusion
    6 Turing Powerfulness and Halting Decidability
      6.1 Introduction
      6.2 Turing Machine
      6.3 Turing Powerfulness of Particle Methods Under a First Set of Constraints
      6.4 Turing Powerfulness of Particle Methods Under a Second Set of Constraints
      6.5 Halting Decidability of Particle Methods
      6.6 Conclusion
    7 Particle Methods as a Basis for Scientific Software Engineering
      7.1 Introduction
      7.2 Design of the Prototype
      7.3 Applications, Comparisons, Convergence Study, and Run-time Evaluations
      7.4 Conclusion
    8 Results, Discussion, Outlook, and Conclusion
      8.1 Problem
      8.2 Results
      8.3 Discussion
      8.4 Outlook
      8.5 Conclusion
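
    As a purely illustrative companion to the abstract, the sketch below shows the generic interact/evolve structure that such a definition formalizes: particles carry a (position, strength) state, a symmetric pairwise interaction exchanges strength (PSE-like), and a per-particle evolution advances positions. The concrete functions are invented and do not reproduce the thesis's notation.

    def interact(p, q):
        # Symmetric pairwise interaction: exchange a fraction of strength.
        (xp, sp), (xq, sq) = p, q
        flux = 0.1 * (sq - sp)
        return (xp, sp + flux), (xq, sq - flux)

    def evolve(p, dt=0.1):
        # Per-particle evolution: drift each particle by its strength.
        x, s = p
        return (x + dt * s, s)

    def particle_method(particles, steps):
        for _ in range(steps):  # stopping condition: a fixed number of steps
            for i in range(len(particles)):
                for j in range(i + 1, len(particles)):
                    particles[i], particles[j] = interact(particles[i], particles[j])
            particles = [evolve(p) for p in particles]
        return particles

    # Total strength (1.0 + 3.0) is conserved by the symmetric interaction.
    print(particle_method([(0.0, 1.0), (1.0, 3.0)], steps=5))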

    Annual report of the officers of the town of Conway, New Hampshire for the fiscal year ending December 31, 1992.

    This is an annual report containing vital statistics for a town/city in the state of New Hampshire.

    An Agent-Based Variogram Modeller: Investigating Intelligent, Distributed-Component Geographical Information Systems

    Geo-Information Science (GIScience) is the field of study that addresses substantive questions concerning the handling, analysis and visualisation of spatial data. Geo-Information Systems (GIS), including software, data acquisition and organisational arrangements, are the key technologies underpinning GIScience. A GIS is normally tailored to the service it is supposed to perform. However, there is often the need to perform a function that is not supported by the GIS tool in use. The usual solution in these circumstances is to look for another tool that can provide the service, and often for an expert to use that tool. This is expensive, time consuming and certainly stressful to the geographical data analyst. On the other hand, GIS is often used in conjunction with other technologies to form a geocomputational environment. One of the complex tools in geocomputation is geostatistics, one function of which is to provide the means to determine the extent of spatial dependencies within geographical data and processes. Spatial datasets are often large and complex. Currently, agent systems are being integrated into GIS to offer flexibility and allow better data analysis. The thesis examines the current application of agents within the GIS community, determining whether they are used to represent data or processes, or to act as a service. It sets out to prove the applicability of an agent-oriented paradigm as a service-based GIS, with the potential of providing greater interoperability and reducing resource requirements (human and tools). In particular, analysis was undertaken to determine the need to introduce enhanced features to agents in order to maximise their effectiveness in GIS. This was achieved by addressing the complexity of software-agent design and implementation for the GIS environment and by suggesting possible solutions to the problems encountered. The characteristics and features of software agents (which include the dynamic binding of plans to software agents in order to tackle the levels of complexity and range of contexts) were examined, alongside a discussion of current GIScience and the applications of agent technology to GIS: agents as entities, objects and processes. These concepts and their functionalities in GIS are then analysed and discussed. The extent of agent functionality, an analysis of the gaps, and the use of these technologies to express a distributed service as an agent-based GIS framework are then presented. Thus, a general agent-based framework for GIS, and a novel agent-based architecture for a specific part of GIS, the variogram, were devised to examine the applicability of the agent-oriented paradigm to GIS. An examination of the current mechanisms for constructing variograms and their underlying processes and functions was undertaken, and these processes were then embedded into a novel agent architecture for GIS. Once a successful software-agent implementation had been achieved, the corresponding tool was tested and validated, internally for code errors and externally to determine its functional requirements and whether it enhances the GIS process of dealing with data. Thereafter, it is compared with other known service-based GIS agents and its advantages and disadvantages analysed.
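
    At the heart of the tool is the variogram itself; the sketch below computes the classical empirical semivariogram that any such agent would estimate, gamma(h) = (1/2) * mean[(z_i - z_j)^2] over point pairs whose separation falls in lag bin h. The 1-D data are invented, and this is a generic illustration rather than the agent architecture described above.

    import numpy as np

    coords = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # sample locations (invented)
    values = np.array([1.0, 1.2, 1.9, 2.1, 3.0])  # measured attribute (invented)

    def empirical_semivariogram(coords, values, bin_width=1.0, n_bins=4):
        gamma = np.zeros(n_bins)
        counts = np.zeros(n_bins, dtype=int)
        for i in range(len(coords)):
            for j in range(i + 1, len(coords)):
                h = abs(coords[i] - coords[j])   # pair separation
                b = int(h // bin_width)          # lag bin index
                if b < n_bins:
                    gamma[b] += 0.5 * (values[i] - values[j]) ** 2
                    counts[b] += 1
        return gamma / np.maximum(counts, 1)     # mean per occupied bin

    print(empirical_semivariogram(coords, values))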

    Risk management in data science projects in Portugal

    The increasing popularity of data projects has driven many development initiatives aimed at improving business performance and decision-making. However, Data Science projects carry in their essence a set of specific risks and uncertainties. Good risk management is one of the most crucial components of a project: conducted effectively, it increases the probability of project success, but this requires understanding the environment and the factors surrounding the risks. In this context, this investigation was conducted to create a baseline list of the risks of Data Science projects and their surrounding factors. The research was guided by the Design Science Research approach, and data collection was conducted through the Delphi technique, which made it possible to identify and analyze the risks and their factors, the failure scenarios of such projects, and the contribution of development methodologies to these projects. The study enabled the creation of an artifact consisting of a list of specific data-management-related risks and best-practice recommendations. It was found, however, that more than half of the risks at the top of the rankings are similar to the risks of other types of IT projects. This research contributes a consolidated list of 25 risks of Data Science projects, intended to help decrease the failure rate of projects in this area.

    Eleventh European Powder Diffraction Conference. Warsaw, September 19-22, 2008

    Zeitschrift für Kristallographie, Supplement Volume 30, presents the complete Proceedings of all contributions to the XI European Powder Diffraction Conference in Warsaw, 2008: Method Development and Application, Instrumental, Software Development, and Materials. The Supplement Series of Zeitschrift für Kristallographie publishes Proceedings and Abstracts of international conferences on the interdisciplinary field of crystallography.