
    A method for re-modularising legacy code

    This thesis proposes a method for the re-modularisation of legacy COBOL. Legacy code often performs a number of functions that, if split, would improve software maintainability; for instance, program comprehension would benefit from a reduction in the size of the code modules. The method aims to identify potential reuse candidates among the re-modularised functions and to ensure that clear interfaces are present between the new modules. Furthermore, functionality is often replicated across applications, so the re-modularisation process can also seek to reduce this duplication and hence the overall amount of a company's code requiring maintenance. A 10-step method is devised which assembles a number of new and existing techniques into an approach suitable for use by staff without significant reengineering experience. Three main approaches are used throughout the method: the analysis of the PERFORM structure, the analysis of the data, and the use of graphical representations. Both top-down and bottom-up strategies for program comprehension are incorporated within the method, as are automatable and user-controlled processes for reuse candidate selection. Three industrial case studies are used to demonstrate and evaluate the method. The case studies range in size to give an indication of the scalability of the method, and they are used to evaluate the method on a step-by-step basis; both strong points and deficiencies are identified, as well as potential solutions to the deficiencies. A review is also presented to assess the three main approaches of the method: the analysis of the PERFORM and data structures, and the use of graphical representations. The review uses the process of software evolution for its evaluation, drawing on successive versions of COBOL software: the method is retrospectively applied to the earliest version, and the known changes identified from the following versions are used to evaluate the re-modularisations. Within the evaluation chapters a new link within the dominance tree is proposed, as is an approach for dealing with multiple dominance trees. The results show that each approach provides an important contribution to the method, as well as giving a useful insight (in the form of graphical representations) into the process of software evolution.
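
    A minimal sketch of the PERFORM-structure analysis and dominance trees the abstract mentions: build a directed call graph of paragraphs from their PERFORM relationships, then derive a dominance tree whose subtrees suggest re-modularisation candidates. The paragraph names, the graph, and the particular dominator algorithm are illustrative assumptions, not the thesis's actual tooling.

```python
# Hypothetical PERFORM graph: paragraph -> paragraphs it PERFORMs.
# (Invented names; a real tool would extract these from COBOL source.)
perform_graph = {
    "MAIN":         ["READ-INPUT", "PROCESS", "WRITE-REPORT"],
    "READ-INPUT":   ["VALIDATE"],
    "PROCESS":      ["VALIDATE", "CALC-TOTALS"],
    "CALC-TOTALS":  [],
    "VALIDATE":     [],
    "WRITE-REPORT": [],
}

def dominators(graph, root):
    """Iteratively compute the set of dominators of every paragraph."""
    nodes = set(graph)
    preds = {n: [p for p in nodes if n in graph[p]] for n in nodes}
    dom = {n: set(nodes) for n in nodes}
    dom[root] = {root}
    changed = True
    while changed:
        changed = False
        for n in nodes - {root}:
            new = {n} | set.intersection(*(dom[p] for p in preds[n])) if preds[n] else {n}
            if new != dom[n]:
                dom[n], changed = new, True
    return dom

def dominance_tree(graph, root):
    """Immediate dominator = the strict dominator with the most dominators of its own."""
    dom = dominators(graph, root)
    return {n: max(dom[n] - {n}, key=lambda d: len(dom[d]))
            for n in graph if dom[n] - {n}}

# VALIDATE is PERFORMed from two places, so it hangs directly off MAIN:
# a hint that it is a shared, separately packageable function.
print(dominance_tree(perform_graph, "MAIN"))
```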

    Extraction of objects from legacy systems: an example using cobol legacy systems

    In the last few years, interest in legacy information systems has increased because of the escalating resources spent on their maintenance. On the other hand, the importance of extracting knowledge from business rules is becoming a crucial issue for modern business: sometimes, because of inadequate documentation, this knowledge is essentially stored only in the code. A way to improve the use and maintainability of these systems in the present environment is to migrate them to a new hardware/software platform, reusing as much of the embedded knowledge as possible during this process. This migration process promotes the population of a repository of reusable software components for reuse in the development of new systems in that application domain or in later maintenance processes. The current trend in the migration of a legacy information system is to exploit the potential of object-oriented technology as a natural extension of earlier structured programming techniques. This is done by decomposing the program into several agent-like modules communicating via message passing, and providing this system with some key object-oriented features. The key step is "object isolation", i.e. the isolation of groups of routines and related data items as candidates for implementing an abstraction in the application domain. The main idea of the object isolation method presented here is to extract information from the data flow and to cluster the procedures on the basis of their data accesses. The method examines "how" a procedure accesses the data, in order to distinguish several types of access and to permit a better understanding of the functionality of the candidate objects. These candidate modules support the population of a repository of reusable software components that might be used as a basis for the process of evolution leading to a new object-oriented system reusing the extracted objects.
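
    The clustering step described here can be pictured with a short sketch: procedures are grouped around the data items they access, with writes treated as stronger evidence of "ownership" than reads. The procedure names, data items, and weighting scheme below are invented for illustration; the paper's actual access categories may differ.

```python
from collections import defaultdict

# Hypothetical access table: (procedure, data item, access kind).
accesses = [
    ("OPEN-ACCOUNT",  "ACCOUNT-REC", "write"),
    ("CLOSE-ACCOUNT", "ACCOUNT-REC", "write"),
    ("PRINT-STMT",    "ACCOUNT-REC", "read"),
    ("POST-TXN",      "TXN-REC",     "write"),
    ("PRINT-STMT",    "TXN-REC",     "read"),
]

def candidate_objects(accesses):
    """Attach each procedure to the data item it most strongly uses;
    the resulting groups are candidate objects (data plus methods)."""
    weight = {"write": 2, "read": 1}          # writes bind tighter than reads
    score = defaultdict(lambda: defaultdict(int))
    for proc, item, kind in accesses:
        score[proc][item] += weight[kind]
    objects = defaultdict(list)
    for proc, items in score.items():
        owner = max(items, key=items.get)     # data item with the highest score
        objects[owner].append(proc)
    return dict(objects)

print(candidate_objects(accesses))
# e.g. {'ACCOUNT-REC': ['OPEN-ACCOUNT', 'CLOSE-ACCOUNT', 'PRINT-STMT'],
#       'TXN-REC': ['POST-TXN']}
```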

    The consistent representation of scientific knowledge : investigations into the ontology of karyotypes and mitochondria

    Ontologies are widely used in the life sciences to model scientific knowledge. The engineering of these ontologies is well-studied and there are a variety of methodologies and techniques, some of which have been re-purposed from software engineering. However, due to the complex nature of bio-ontologies, they are not resistant to errors and mistakes; this is especially true for more expressive and/or larger ontologies. In order to improve on this issue, we explore a variety of software engineering techniques that were re-purposed to aid ontology engineering. This exploration is driven by the construction of two light-weight ontologies, The Mitochondrial Disease Ontology and The Karyotype Ontology. These ontologies have specific and useful computational goals, as well as providing exemplars for our methodology. This thesis discusses the modelling decisions undertaken, as well as the overall success of each ontological model. Due to the added knowledge-capture steps required for the mitochondrial knowledge, The Karyotype Ontology is further developed than The Mitochondrial Disease Ontology. Specifically, this thesis explores the use of a pattern-driven and programmatic approach to biomedical ontology engineering. During the engineering of our biomedical ontologies, we found that many of the components of each model were similar in their logical and textual definitions; this was especially true for The Karyotype Ontology. In software engineering a common technique to avoid replication is to abstract through the use of patterns, so we utilised localised patterns to model these highly repetitive models. There are a variety of possible tools for the encoding of these patterns, but we found ontology development using Graphical User Interface (GUI) tools to be time-consuming due to the necessity of manual GUI interaction whenever the ontology needed updating. With the development of Tawny-OWL, a programmatic tool for ontology construction, we are able to overcome this issue, with the added benefit of using a single syntax to express both simple and patternised parts of the ontology. Lastly, we briefly discuss how other methodologies and tools from software engineering, namely unit tests, diffing, version control and Continuous Integration (CI), were re-purposed and how they aided the engineering of our two domain ontologies. Together, this knowledge increases our understanding of ontology engineering techniques. By re-purposing software engineering methodologies, we have aided the construction, quality and maintainability of two novel ontologies, and have demonstrated their applicability more generally.
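
    The localised-pattern idea can be illustrated with a small sketch. Tawny-OWL itself is a Clojure library; the Python fragment below only mirrors the general approach of generating many near-identical class definitions from one pattern function, and the class names are simplified, hypothetical examples rather than terms from The Karyotype Ontology.

```python
# One pattern instance: gain/loss abnormality classes for a chromosome.
# A GUI tool would require hand-editing each of these classes separately;
# a programmatic pattern regenerates them all when the design changes.
def gain_loss_pattern(chromosome):
    return [
        {
            "name": f"{kind}OfChromosome{chromosome}",
            "superclass": "ChromosomalAbnormality",
            "definition": f"A karyotype with one {kind.lower()} "
                          f"of chromosome {chromosome}.",
        }
        for kind in ("Gain", "Loss")
    ]

# Expanding the pattern over all human chromosomes yields 48 logically
# uniform class definitions from a few lines of pattern code.
chromosomes = [str(n) for n in range(1, 23)] + ["X", "Y"]
classes = [cls for c in chromosomes for cls in gain_loss_pattern(c)]
print(len(classes), classes[0]["name"])   # 48 GainOfChromosome1
```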

    Reusable Space Vehicle Ground Operations Baseline Conceptual Model

    Modeling efforts for future space operations vehicles at the United States Air Force Research Labs Air Vehicles Directorate have focused on the in-flight mission. To better serve the research and development effort, a simulation of ground operations is required, allowing for trade-offs within turnaround operations and between the components that drive those procedures. However, before a simulation can be developed, a conceptual model must be generated to guide the model-building process. This research provides a baseline conceptual model for reusable space vehicles based on the space shuttle, the only operational vehicle of its kind. The model is built using the Integrated Definition (IDEF) methodology, specifically IDEF3, which is aimed at process-viewpoint diagramming and layout. The model is developed using the hierarchical development capabilities of the IDEF3 methodology and is broken into modules, allowing for greater reuse and usability. The model captures the scheduled maintenance performed to turn around the space shuttle for the next launch, but does not contain every activity; the idea was to capture the baseline activities that may be found in future reusable space vehicles and to provide a description of what happens at Kennedy Space Center when preparing the space shuttle for the next launch.
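
    A rough sketch of the hierarchical decomposition that IDEF3 modelling supports: each turnaround activity is either atomic or decomposes into sub-activities, so modules can be reused across vehicle models. The activity names and durations below are illustrative guesses, not figures from the actual conceptual model.

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    hours: float = 0.0                      # duration if atomic
    subs: list["Activity"] = field(default_factory=list)

    def duration(self) -> float:
        """Roll up durations, assuming sequential sub-activities."""
        return self.hours if not self.subs else sum(s.duration() for s in self.subs)

# Hypothetical three-level decomposition of shuttle turnaround.
turnaround = Activity("Orbiter turnaround", subs=[
    Activity("Safing and deservicing", hours=48),
    Activity("Scheduled maintenance", subs=[
        Activity("TPS inspection", hours=120),
        Activity("Main engine checkout", hours=96),
    ]),
    Activity("Stacking and pad operations", hours=200),
])

print(turnaround.duration())   # 464.0 hours under these assumed figures
```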

    Design Information Recovery from Legacy System COBOL Source Code: Research on a Reverse Engineering Methodology

    Much of the software in the world today was developed from the mid-1960s to the mid-1970s. This legacy software deteriorates as it is modified to satisfy new organizational requirements. Currently, legacy system maintenance requires more time than new system development. Eventually, legacy systems must be replaced, and identifying their functionality is a critical part of the replacement effort. Recovering functions from source code is difficult because the domain knowledge used to develop the system is not routinely retained; the source code is frequently the only reliable source of functional information. This dissertation describes functional process information recovery from COBOL source code in the military logistics system domain. The methodology was developed as an information processing application. Conceptual and logical models to convert source code to functional design information were created to define the process, and a supporting data structure was also developed. The process reverse engineering methodology was manually applied to a test case to demonstrate feasibility, practicality, and usefulness. Metrics for predicting the time required were developed and analyzed based on the results of the test case. The methodology was found to be effective in recovering functional process information from source code. A prototype program information database was developed and implemented to aid in data collection and manipulation; it also supported the process of preparing program structure models. Recommendations for further research include applying the methodology to a larger test case to validate findings and extending it to include a comparable data reverse engineering procedure.
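
    As a hedged sketch of what such a program information database might hold, the fragment below keeps one table of paragraphs per program and one of calls between them, enough to regenerate a simple structure chart. The schema and sample rows are assumptions for illustration, not the dissertation's actual design.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE paragraph (program TEXT, name TEXT, purpose TEXT);
    CREATE TABLE call (program TEXT, caller TEXT, callee TEXT);
""")
# Hypothetical rows recovered from a logistics program's source code.
db.executemany("INSERT INTO paragraph VALUES (?, ?, ?)", [
    ("REQUISITION", "MAIN-CTL",  "Top-level control"),
    ("REQUISITION", "EDIT-REQ",  "Validate requisition record"),
    ("REQUISITION", "UPD-STOCK", "Apply stock balance update"),
])
db.executemany("INSERT INTO call VALUES (?, ?, ?)", [
    ("REQUISITION", "MAIN-CTL", "EDIT-REQ"),
    ("REQUISITION", "MAIN-CTL", "UPD-STOCK"),
])

# Recover a simple structure model: each paragraph with its callees.
for caller, callee in db.execute(
        "SELECT caller, callee FROM call WHERE program = ?", ("REQUISITION",)):
    print(f"{caller} -> {callee}")
```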

    Being mathematical : an exploration of epistemological implications of embodied cognition

    In this thesis I explore epistemological implications of embodied cognition in the hope of developing my apprehension of what it means to think mathematically. I allow my understanding of embodied cognition to emerge in stages, early in the piece laying contrasts against which it may be set, infolding elements to the purpose of qualitatively interpreting data as my thesis finds form. I use the language of autopoiesis to frame an understanding of change in the context of an individual’s learning and also within broader constructs, such as in mathematics classrooms. I recognise dualisms and set them aside in an attempt to reread what it means to think mathematically. Research from a variety of fields constitutes one part of my data, the second part being a selection of experiences drawn from mathematics classes I have taught. In balancing the two, I find that an embodied account contributes a means of interpreting mathematical experience wherein received boundaries, such as between you and me, and categories, such as "number", are not globally robust, and intentionality pervades and shapes the worlds we create. The perspective that embodiment affords my apprehension of mathematical thinking is consistent with a formulation in which judgements of what is good are aligned in part with a kind of aesthetic, whereby being moral is founded in innate dispositions. The question of what one is to do with an embodied epistemology is therefore focused on a consideration of how I am to orient myself to teaching mathematics. Throughout all of this, the locus of my attention remains within the classroom, fixed upon the goal of eliciting perspective and on developing skill in interpreting experience; on becoming a tactful teacher, sensitive to the tacit language of the body.

    On Being the Right Size, Revisited: The Problem with Engineering Metaphors in Molecular Biology

    In 1926, Haldane published an essay titled 'On Being the Right Size' in which he argued that the structure, function, and behavior of an organism are strongly conditioned by the physical forces that exert the greatest impact at the scale at which it exists. This chapter puts Haldane’s insight to work in the context of contemporary cell and molecular biology. Owing to their minuscule size, cells and molecules are subject to very different forces than macroscopic organisms. In a sense, macroscopic and microscopic entities inhabit different “worlds”: the former is ruled by gravity and inertia, whereas the latter is governed by Brownian motion. One implication is that we should be extremely skeptical of models and analogies that seek to explain properties of microscopic entities by appealing to properties of macroscopic ones. Unfortunately, this is precisely what the appeal to engineering metaphors in molecular biology attempts to do. Molecular biologists routinely resort to such metaphors because they are familiar and intuitively intelligible. But if our machines were the size of molecules it would be impossible for them to function the way they do. It follows that we should avoid distorting biological reality by construing it in engineering terms. In this chapter I examine four key metaphors in molecular biology – “genetic program,” “cellular circuitry,” “molecular machine,” and “molecular motor” – and I argue that their deficiencies derive from their neglect of scale. I also try to explain why many biologists today appear to have forgotten the importance of scale that Haldane drew attention to in his essay. I suggest that the reason has to do with the influence of Schrödinger’s argument in 'What is Life?' regarding the stability of the gene.
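
    The "different worlds" claim can be made concrete with back-of-envelope arithmetic: for a typical protein, the characteristic thermal (Brownian) energy dwarfs the gravitational energy of falling through its own height. The protein mass and size below are representative textbook values, not figures from the chapter.

```python
k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 310.0                 # body temperature, K
m = 1.7e-22               # ~100 kDa protein, kg (assumed typical value)
g = 9.81                  # gravitational acceleration, m/s^2
h = 1e-8                  # 10 nm, roughly the protein's own size, m

thermal = k_B * T         # characteristic thermal energy, ~4.3e-21 J
gravity = m * g * h       # gravitational PE over 10 nm, ~1.7e-29 J

# Thermal agitation exceeds gravity by roughly eight orders of magnitude,
# so gravity is irrelevant at this scale while Brownian motion dominates.
print(f"thermal/gravity ~ {thermal / gravity:.0e}")   # ~3e+08
```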

    Argumentative zoning information extraction from scientific text

    Let me tell you, writing a thesis is not always a barrel of laughs, and strange things can happen, too. For example, at the height of my thesis paranoia, I had a recurrent dream in which my cat Amy gave me detailed advice on how to restructure the thesis chapters, which was awfully nice of her. But I also had a lot of human help throughout this time, whether things were going fine or berserk. Most of all, I want to thank Marc Moens: I could not have had a better or more knowledgeable supervisor. He always took time for me, however busy he might have been, reading chapters thoroughly in two days. He both had the calmness of mind to give me lots of freedom in research, and the right judgement to guide me away, tactfully but determinedly, from the occasional catastrophe or other waiting along the way. He was great fun to work with and also became a good friend. My work has profited from the interdisciplinary, interactive and enlightened atmosphere at the Human Communication Research Centre and the Centre for Cognitive Science (which is now called something else). The Language Technology Group was a great place to work in, as my research was grounded in practical applications developed there.

    An holistic approach to architectural theory and structuralism

    The author's interest in this subject emerges from seeing the environment as a whole, consisting of entities which are systems for transformation and which are responsible for the evolution of society. The approach comes from the mutual interaction of man and the environment. This interaction is expressed in many cases by building concepts, rules and theories. Architecture is considered one of the obvious means of this type of interaction, by which man, over time, tried to clarify this interaction by building his shelter to accommodate his different life activities. This led to the creation and establishment of rules, constraints, and then theories in architecture that control this interaction. Architecture cannot be seen as a synchronic phenomenon; it is diachronic, in continuous evolution and development. There is a distinction between what one can see in the environment as surface structure and the embedded meaning and symbolism of deep structure. In order to analyse this distinction, the research adopts structuralism as an holistic tool to address this relationship within the environment. For this reason, architectural theories and structuralism are the two pillars used to build and test the statement of the study, which leads to the provision of an holistic approach to architectural theory based on structuralism. The study takes an empirical approach to test and confirm the holistic approach; hence, it adopts a methodology to analyse and interpret the case study entities. This methodology follows two main approaches to fulfil these objectives. Deductive: a theoretical investigation of the ideas of the interaction between man and the environment, which leads to emphasising environmental entities as systems for transformation; this premise leads to the adoption of structuralism as an holistic method and as a tool for the better understanding and analysis of these entities. Inductive: an empirical approach taking Salt city in Jordan as a case study area; this part represents a real field of information and application. The empirical work supports the propositions that architectural phenomena are an embodiment of cultural values and the social structure. The empirical work collected and elicited people's opinions and preferences through an open-ended questionnaire and drawings of cognitive maps. This study helps architects and designers to understand and then analyse the deep structure of the society as a base for design, after taking into consideration the mechanism that connects the surface structure to the underlying cultural values and meanings that respond to people's needs and requirements. This may be achieved in architecture and urban planning through holistic thinking that is based on structuralism.

    Physics Avoidance & Cooperative Semantics: Inferentialism and Mark Wilson’s Engagement with Naturalism Qua Applied Mathematics

    Mark Wilson argues that the standard categorizations of "Theory T thinking" (logic-centered conceptions of scientific organization, canonized via the logical empiricists in the mid-twentieth century) dampen the understanding and appreciation of the strategic subtleties at work within science. By "Theory T thinking" we mean the simplistic methodology in which mathematical science allegedly supplies 'processes' that parallel nature's own in a tidily isomorphic fashion, wherein "Theory T's" feigned rigor and methodological dogmas advance an inadequate discrimination that fails to distinguish between explanatory structures that are architecturally distinct. One of Wilson's main goals is to reverse such premature exclusions; thus, early on, Wilson returns to John Locke's original physical concerns regarding material science and the congeries of descriptive concerns involved in capturing the varied phenomena (i.e., cohesion, elasticity, fracture, and the transmission of coherent work) encountered amongst ordinary solids like wood and steel. Of course, Wilson methodologically updates such a purview by appealing to the multiscalar techniques of modern computing, drawing from Robert Batterman's work on the greediness of scales and Jim Woodward's insights on causation.