
    Data re-engineering using formal transformations

    This thesis presents and analyses a solution to the problem of formally re-engineering program data structures, allowing new representations of a program to be developed. The work is based on Ward's theory of program transformations, which uses a Wide Spectrum Language, WSL, whose semantics were developed specifically for use in proofs of program transformations. The re-engineered code exhibits functionality equivalent to the original but differs in its degree of data abstraction and representation. Previous transformational re-engineering work has concentrated on control-flow restructuring, which has highlighted a lack of support for data restructuring in the maintainer's tool-set. Problems have been encountered during program transformation because of this lack of support for data re-engineering. The absence of strict data semantics and manipulation capabilities has left the maintainer unable to produce optimally re-engineered solutions. It has also hindered the migration of programs into other languages, because it has not been possible to convert data structures into an appropriate form in the target language. The main contribution of the thesis is the Data Re-Engineering and Abstraction Mechanism (DREAM), which allows theories about type equivalence to be represented and used in a re-engineering environment. DREAM is based on the technique of "ghosting", a way of introducing different representations of data, which provides the theoretical underpinning of the changes applied to the program. A second major contribution is the introduction of data typing into the WSL language, which allows DREAM to be integrated into the existing transformation theories within WSL. These theoretical extensions of the original work have been shown to be practically viable by implementation within a prototype transformation tool, the Maintainer's Assistant. The extended tool has been used to re-engineer heavily modified, commercial legacy code. The results show that useful re-engineering work can be performed and that DREAM integrates well with existing control-flow transformations.
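    The core idea behind "ghosting" can be illustrated with a short sketch. This is not WSL or DREAM itself, and the classes and names below are hypothetical; it only shows, in Python, what it means for two data representations to be interchangeable under an abstraction function, which is the property a representation-changing transformation must preserve.

```python
# Illustrative sketch (not WSL/DREAM): run two representations of the
# same abstract data side by side and check, via an abstraction
# function, that they remain equivalent. Equivalence under abstraction
# is what justifies replacing one representation with the other.

class ListSet:
    """A set of small non-negative ints, stored as a duplicate-free list."""
    def __init__(self):
        self.items = []
    def add(self, x):
        if x not in self.items:
            self.items.append(x)
    def contains(self, x):
        return x in self.items

class BitSet:
    """The same abstract set, stored as a bitmask."""
    def __init__(self):
        self.bits = 0
    def add(self, x):
        self.bits |= 1 << x
    def contains(self, x):
        return bool(self.bits >> x & 1)

def abstract(rep):
    """Abstraction function: map either representation to a plain set."""
    if isinstance(rep, ListSet):
        return set(rep.items)
    return {i for i in range(rep.bits.bit_length()) if rep.bits >> i & 1}

old, new = ListSet(), BitSet()
for x in (3, 1, 3, 7):
    old.add(x)
    new.add(x)
assert abstract(old) == abstract(new) == {1, 3, 7}
```

    In a transformation setting the "ghost" representation is carried alongside the original while the proof of equivalence is established, after which the original can be discarded.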

    Generalized and Customizable Sets in R

    We present data structures and algorithms for sets and some generalizations thereof (fuzzy sets, multisets, and fuzzy multisets) available for R through the sets package. Fuzzy (multi-)sets are based on dynamically bound fuzzy logic families. Further extensions include user-definable iterators and matching functions.
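    The fuzzy-set generalization mentioned above can be sketched briefly. The snippet below is plain Python, not the R sets package API; it only illustrates the underlying idea that elements carry membership grades in [0, 1] and that union and intersection are taken pointwise with the standard Zadeh operators max and min.

```python
# Minimal fuzzy-set sketch (illustrative only, NOT the R "sets" API):
# a fuzzy set is a dict mapping elements to membership grades in [0, 1].

def fuzzy_union(a, b):
    """Pointwise max of membership grades (Zadeh union)."""
    return {x: max(a.get(x, 0.0), b.get(x, 0.0)) for x in a.keys() | b.keys()}

def fuzzy_intersection(a, b):
    """Pointwise min of membership grades (Zadeh intersection)."""
    return {x: min(a.get(x, 0.0), b.get(x, 0.0)) for x in a.keys() | b.keys()}

tall = {"ann": 0.9, "bob": 0.4}
fast = {"bob": 0.7, "cid": 0.5}

assert fuzzy_union(tall, fast) == {"ann": 0.9, "bob": 0.7, "cid": 0.5}
assert fuzzy_intersection(tall, fast) == {"ann": 0.0, "bob": 0.4, "cid": 0.0}
```

    The "dynamically bound fuzzy logic families" of the package generalize exactly this point: max/min is only one choice of conjunction/disjunction operators, and others can be substituted.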

    A method for maintaining new software

    This thesis describes a novel method for the perfective maintenance of software that has been developed from specifications using formal transformations. The list of applied transformations provides a suitable derivation history to use when changes are made to the software. The method uses transformations that have been implemented, for the purpose of restructuring code, in a tool called the Maintainer's Assistant, and applies these transformations to refinement. Comparisons are made between sequential transformations, refinement calculi and standard proof-based refinement techniques for providing a suitable derivation history to use when changes are made to the requirements of a system. Two case studies are presented on which these comparisons are based and on which the method is tested. Criteria such as saleability, speed, ease, design improvements and software quality are used to argue that transformations are a more favourable basis for refinement. Metrics are used to evaluate the complexity of the code developed using the method. Conclusions are presented on how to develop different types of specification into code and on how best to apply various changes. A recommended approach is to use transformations to split the specification so that original refinement paths can still be used. Using transformations to refine a specification, and recording this path, produces software of better structure and higher maintainability. Having such a path improves the speed and ease of future alterations to the system, which is more cost-effective than redeveloping the software from a new specification.

    The Concept of Mandatory Jurisdiction


    Classifying the suras by their lexical semantics :an exploratory multivariate analysis approach to understanding the Qur'an

    PhD thesis. The Qur'an is at the heart of Islamic culture. Careful, well-informed interpretation of it is fundamental both to the faith of millions of Muslims throughout the world and to the non-Islamic world's understanding of their religion. There is a long and venerable tradition of Qur'anic interpretation, and it has necessarily been based on literary-historical methods for the exegesis of hand-written and printed text. Developments in electronic text representation and analysis since the second half of the twentieth century now offer the opportunity to supplement traditional techniques by applying the newly emergent computational technology of exploratory multivariate analysis to interpretation of the Qur'an. The general aim of the present discussion is to take up that opportunity. Specifically, the discussion develops and applies a methodology for discovering the thematic structure of the Qur'an based on a fundamental idea in a range of computationally oriented disciplines: that, with respect to some collection of texts, the lexical frequency profiles of the individual texts are a good indicator of their semantic content, and thus provide a reliable criterion for their conceptual categorization relative to one another. This idea is applied to the discovery of thematic interrelationships among the suras that constitute the Qur'an by abstracting lexical frequency data from them and then analyzing that data using exploratory multivariate methods, in the hope that this will generate hypotheses about the thematic structure of the Qur'an. The discussion is in eight main parts. The first part introduces the discussion. The second gives an overview of the structure and thematic content of the Qur'an and of the tradition of Qur'anic scholarship devoted to its interpretation. The third part defines the research question to be addressed, together with a methodology for doing so. The fourth reviews the existing literature on the research question. The fifth outlines general principles of data creation and applies them to the creation of the data on which the analysis of the Qur'an in this study is based. The sixth outlines general principles of exploratory multivariate analysis, describes in detail the analytical methods selected for use, and applies them to the data created in part five. The seventh part interprets the results of the analyses conducted in part six with reference to the existing results in Qur'anic interpretation described in part two. Finally, the eighth part draws conclusions relative to the research question and identifies directions along which the work presented in this study can be developed.
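    The central premise, that lexical frequency profiles indicate semantic content, can be made concrete with a small sketch. This is not the thesis's actual pipeline and the example texts are invented; it only shows the mechanics of building relative word-frequency vectors and comparing them numerically, the first step before any multivariate analysis.

```python
# Illustrative sketch of lexical frequency profiling (not the thesis's
# pipeline): each text becomes a relative word-frequency vector, and
# cosine similarity between vectors serves as a crude proxy for
# thematic relatedness.
from collections import Counter
import math

def profile(text):
    """Relative word-frequency vector for one text."""
    words = text.lower().split()
    counts = Counter(words)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def cosine(p, q):
    """Cosine similarity between two frequency profiles."""
    dot = sum(p[w] * q.get(w, 0.0) for w in p)
    norm_p = math.sqrt(sum(v * v for v in p.values()))
    norm_q = math.sqrt(sum(v * v for v in q.values()))
    return dot / (norm_p * norm_q)

a = profile("mercy and judgement and mercy")
b = profile("judgement and mercy")
c = profile("trade caravan winter journey")

# Texts sharing vocabulary score higher than texts sharing none.
assert cosine(a, b) > cosine(a, c)
```

    Exploratory multivariate methods such as clustering or principal component analysis then operate on exactly such vectors, grouping texts whose profiles lie close together.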

    Differential Probability Functions for Investigating Long-Term Changes in Local and Regional Air Pollution Sources

    Conditional probability functions are commonly used for source identification in air pollution studies. The CBPF (conditional bivariate probability function) categorizes the probability of high concentrations being observed at a location by wind direction and speed, and investigates the directionality of local sources. The PSCF (potential source contribution function), a trajectory-ensemble method, identifies the source regions most likely to be associated with high measured concentrations. However, these techniques do not allow the direct identification of areas where changes in emissions have occurred. This study presents an extension of conditional probability methods in which the differences between conditional probability values for temporally distinct sets of data are used to explore changes in emissions from source locations. The differential CBPF and differential PSCF were tested using a long-term series of air quality data (12 years; 2005-2016) collected in Rochester, NY. The probability functions were computed for each of four periods that represent known changes in emissions. Correlation analyses were also performed on the results to find pollutants undergoing similar changes in local and regional sources. The differential probability functions permitted the identification of major changes in local and regional emission locations. In Rochester, changes in local air pollution were related to the shutdown of a large coal power plant (SO2) and to the abatement measures applied to road and off-road traffic (primary pollutants). The concurrent effects of these changes in local emissions were also linked to reduced concentrations of nucleation-mode particles. Changes in regional source areas were related to the decreases in secondary inorganic aerosol and organic carbon. The differential probabilities for sulfate, nitrate, and organic aerosol were consistent with differences in the available National Emission Inventory annual emission values. Changes in the source areas of black carbon and PM2.5 mass concentrations were highly correlated.
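    The differential-probability idea can be sketched in a few lines. The sector names, threshold, and data below are invented for illustration, and real CBPF bins jointly by wind direction and speed; the sketch keeps only a single categorical wind sector to show the arithmetic: per-bin exceedance probabilities are computed for two periods and subtracted, so negative cells mark directions where high concentrations became rarer.

```python
# Hedged sketch of a differential conditional probability function
# (illustrative data and bins, not the study's): per wind sector,
# CBPF = fraction of observations exceeding a high threshold, and the
# differential CBPF subtracts an earlier period's CBPF from a later one's.
from collections import defaultdict

def cbpf(samples, threshold):
    """samples: iterable of (wind_sector, concentration).
    Returns {sector: P(concentration >= threshold | sector)}."""
    hits, totals = defaultdict(int), defaultdict(int)
    for sector, conc in samples:
        totals[sector] += 1
        if conc >= threshold:
            hits[sector] += 1
    return {s: hits[s] / totals[s] for s in totals}

def differential_cbpf(early, late, threshold):
    """CBPF(late) - CBPF(early), per sector present in either period."""
    p_early, p_late = cbpf(early, threshold), cbpf(late, threshold)
    sectors = p_early.keys() | p_late.keys()
    return {s: p_late.get(s, 0.0) - p_early.get(s, 0.0) for s in sectors}

early = [("NE", 12.0), ("NE", 15.0), ("SW", 3.0), ("SW", 2.0)]
late  = [("NE", 4.0),  ("NE", 5.0),  ("SW", 3.0), ("SW", 2.0)]

# A hypothetical source to the NE shut down between the two periods,
# so its exceedance probability falls from 1.0 to 0.0.
print(differential_cbpf(early, late, threshold=10.0)["NE"])  # -1.0
```

    A plant shutdown or traffic abatement measure shows up as a strongly negative cell in the wind sector pointing toward the source, which is exactly the signal the differential functions are designed to expose.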

    Price Transmission and Marketing Margins in the Slovenian Beef and Pork Markets During Transition

    As in many other transition countries, processing and marketing margins in the Slovenian meat market are larger than the respective margins in market economies. In addition, the margin of the Slovenian pork chain is greater than that of the beef chain; its decline in the pork market indicates an adjustment to more competitive markets. Co-integration models are applied to estimate vertical price transmission and to examine margins and the degree of competition in the meat marketing chains. Results indicate the existence of a long-run equilibrium in vertical price transmission in the beef and pork sectors. Both the farm-gate beef and pork prices are identified as weakly exogenous in the long run. The structural tests imposing a homogeneity restriction suggest a mark-up long-run price strategy for beef and a competitive price strategy for pork after 1994 in the meat processing and marketing chains.
    Keywords: price transmission, marketing margin, co-integration, competition, Marketing, D4, L1, C3, Q1