10,068 research outputs found

    Abstracting object interactions using composition filters

    It is generally claimed that object-based models are well suited to building distributed system architectures, since object interactions follow the client-server model. To cope with the complexity of today's distributed systems, however, we think that high-level linguistic mechanisms are needed to effectively structure, abstract and reuse object interactions. For example, the conventional object-oriented model does not provide high-level language mechanisms to model layered system architectures. Moreover, we consider the message passing model of the conventional object-oriented model too low-level, because it can only specify object interactions that involve two partner objects at a time and its semantics cannot easily be extended. This paper introduces Abstract Communication Types (ACTs), which are objects that abstract interactions among objects. ACTs make it easier to model layered communication architectures, to enforce invariant behavior among objects, to reduce the complexity of programs by hiding interaction details in separate modules, and to improve reusability through the application of object-oriented principles to ACT classes. We illustrate the concept of ACTs using the composition filters model.
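
    The composition filters mechanism itself is not reproduced in this abstract, but the core idea, reifying a multi-object interaction as an object in its own right, can be sketched in plain Python. The sketch below is an illustration under that reading only; the class and method names (Act, join, send) are invented and do not come from the paper.

```python
# Rough sketch of the intuition behind Abstract Communication Types:
# an interaction among several objects is reified as an object of its
# own, which intercepts messages, hides interaction detail, and gives
# a single place to enforce invariant behavior. All names here are
# invented; this is not the paper's composition-filters notation.

class Act:
    """Reifies a multi-party interaction as a first-class object."""

    def __init__(self):
        self.participants = []
        self.log = []                 # interaction detail hidden in the ACT

    def join(self, obj):
        self.participants.append(obj)

    def send(self, sender, selector, *args):
        # Every message passes through here, so invariants over the
        # whole interaction can be checked or enforced at this point.
        self.log.append((type(sender).__name__, selector, args))
        for obj in self.participants:
            if obj is not sender and hasattr(obj, selector):
                getattr(obj, selector)(*args)

class Producer:
    def __init__(self, act):
        self.act = act

    def produce(self, item):
        # No direct reference to any consumer: the ACT mediates.
        self.act.send(self, "consume", item)

class Consumer:
    def __init__(self):
        self.items = []

    def consume(self, item):
        self.items.append(item)

act = Act()
consumer = Consumer()
act.join(consumer)
producer = Producer(act)
act.join(producer)
producer.produce("grain")
assert consumer.items == ["grain"]
```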

    Specifying Reusable Components

    Reusable software components need expressive specifications. This paper outlines a rigorous foundation for model-based contracts, a method to equip classes with strong contracts that support accurate design, implementation, and formal verification of reusable components. Model-based contracts conservatively extend classic Design by Contract with a notion of model, which underpins precise definitions of concepts such as abstract equivalence and specification completeness. Experiments applying model-based contracts to libraries of data structures suggest that the method enables accurate specification of practical software.
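
    As a loose illustration of the model-based idea, the following Python sketch specifies a stack against an abstract model (a tuple standing for a mathematical sequence) rather than against its private representation. The paper itself works with Eiffel-style contracts; everything below is an invented analogy, not its notation.

```python
# Sketch of a model-based contract expressed with Python assertions:
# the class's observable behavior is specified against an abstract
# model (an immutable sequence), not against its private state.

class Stack:
    def __init__(self):
        self._items = []

    @property
    def model(self):
        """Abstract model: the stack viewed as a mathematical sequence."""
        return tuple(self._items)

    def push(self, x):
        old = self.model
        self._items.append(x)
        # Postcondition stated purely in terms of the model:
        assert self.model == old + (x,)

    def pop(self):
        assert len(self.model) > 0                       # precondition
        old = self.model
        x = self._items.pop()
        assert self.model == old[:-1] and x == old[-1]   # postcondition
        return x

s = Stack()
s.push(1); s.push(2)
assert s.pop() == 2 and s.model == (1,)
```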

    Modelling mechanisms of change in crop populations

    Computer-based simulation models of the changes occurring within crop populations when subjected to agents of phenotypic change have been developed for use on commonly available personal computer equipment. As an underlying developmental principle, the models have been designed as general-case, mechanistic, stochastic models, in contrast to the predominantly empirically derived, system-specific, deterministic (predictive) models currently available. A modelling methodology has evolved to develop portable simulation models, written in high-level, general-purpose code, allowing for use, modification and continued development by biologists with little requirement for computer programming expertise.

    The initial subject of these modelling activities was the simulation of the effects of selection and other agents of genetic change in crop populations, resulting in the computer model PSELECT. Output from PSELECT, specifically phenotypic and genotypic response to phenotypic truncation selection, conformed to expectation as defined by results from established analogue modelling work. Validation of the model by comparison of output with the results from an experimental-scale plant breeding exercise was less conclusive: because the genetic basis of the phenotypic characters used in the selection programme was insufficiently defined, the validation exercise provided only broad qualitative agreement with the model output. Owing to the predominantly subjective nature of plant breeding programmes, the development of PSELECT resulted in a model of theoretical interest but with little current practical application.

    Modelling techniques from the development of PSELECT were applied to the simulation of plant disease epidemics, where the modelled system is well characterised and simulation modelling is an area of active research. The resulting model, SATSUMA, simulates the spatial and temporal development of diseases within crop populations. It generates output which conforms to current epidemiological theory and is compatible with contemporary methods of temporal and spatial analysis of crop disease epidemics. Temporal disease progress in the simulations was accurately described by variations of a generalised logistic model, and analysis of the spatial pattern of simulated epidemics by frequency distribution fitting or distance class methods gave good qualitative agreement with observed biological systems.

    The mechanistic nature of SATSUMA and its deliberate design as a general-case model make it especially suitable for investigating component processes in a generalised plant disease epidemic, and valuable as an educational tool. Subject to validation against observational data, such models can be used as predictive tools by incorporating information (concerning crop species, pathogen, etc.) specifically relevant to the modelled system. In addition to its educational use, SATSUMA has been used as a research tool to examine the effect of spatial pattern of disease and disease incidence on the efficiency of sampling protocols, and in parameterising a general theoretical model describing the spatio-temporal development of plant diseases.
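
    As a flavour of what such a mechanistic, stochastic model computes, here is a minimal Monte Carlo sketch of phenotypic truncation selection on an additive quantitative trait; all parameters are arbitrary and the code illustrates the general technique, not PSELECT itself. The simulated genetic response can be checked against the breeder's equation, R ≈ h²S.

```python
# Toy stochastic simulation of phenotypic truncation selection, in the
# spirit of (but far simpler than) PSELECT. Population size, heritability
# and selection fraction are arbitrary illustration values.
import random

N, H2, KEEP = 10_000, 0.4, 0.2     # population, heritability, fraction kept
random.seed(1)

# Phenotype = genotypic value + environmental noise; variances are chosen
# so that heritability (genetic variance / total variance) equals H2.
genotypes  = [random.gauss(0.0, H2 ** 0.5) for _ in range(N)]
phenotypes = [g + random.gauss(0.0, (1 - H2) ** 0.5) for g in genotypes]

# Truncation selection: keep the top KEEP fraction by phenotype.
ranked   = sorted(range(N), key=lambda i: phenotypes[i], reverse=True)
selected = ranked[: int(KEEP * N)]

pop_phen = sum(phenotypes) / N
pop_gen  = sum(genotypes) / N
S = sum(phenotypes[i] for i in selected) / len(selected) - pop_phen
# Mean breeding value of the selected parents = expected response under
# this purely additive model with random mating.
R = sum(genotypes[i] for i in selected) / len(selected) - pop_gen

print(f"S = {S:.3f}  observed R = {R:.3f}  predicted h2*S = {H2 * S:.3f}")
```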

    Logic-Based Decision Support for Strategic Environmental Assessment

    Strategic Environmental Assessment is a procedure aimed at introducing systematic assessment of the environmental effects of plans and programs. The procedure is based on so-called coaxial matrices that define dependencies between plan activities (infrastructures, plants, resource extractions, buildings, etc.) and positive and negative environmental impacts, and dependencies between these impacts and environmental receptors. Up to now, the procedure has been carried out manually by environmental experts to check the environmental effects of a given plan or program, but it has never been applied during plan/program construction. A decision support system based on a clear logic semantics would be an invaluable tool, not only for assessing a single, already defined plan, but also during the planning process, in order to produce an optimized, environmentally assessed plan and to study possible alternative scenarios. We propose two logic-based approaches to the problem, one based on Constraint Logic Programming and one on Probabilistic Logic Programming, which could in the future be conveniently merged to exploit the advantages of both. We test the proposed approaches on a real energy plan and discuss their limitations and advantages.

    Comment: 17 pages, 1 figure, 26th Int'l. Conference on Logic Programming (ICLP'10)
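
    The coaxial-matrix chain described above lends itself to a compact illustration: two dependency tables, one from activities to impacts and one from impacts to receptors, composed to derive the receptors a plan may affect. The sketch below uses plain Python with invented matrix entries; the paper encodes this chain in Constraint Logic Programming and Probabilistic Logic Programming instead.

```python
# Sketch of the coaxial-matrix dependency chain: plan activities cause
# environmental impacts, which in turn affect receptors. The entries
# and names below are invented for illustration only.

activity_impacts = {                       # first coaxial matrix
    "road":      {"noise", "habitat_loss"},
    "wind_farm": {"noise", "bird_collision"},
}
impact_receptors = {                       # second coaxial matrix
    "noise":          {"residents", "fauna"},
    "habitat_loss":   {"fauna", "flora"},
    "bird_collision": {"fauna"},
}

def affected_receptors(plan):
    """Receptors reachable from the plan's activities via both matrices."""
    impacts = set().union(*(activity_impacts[a] for a in plan))
    return set().union(*(impact_receptors[i] for i in impacts))

print(affected_receptors({"road", "wind_farm"}))
# -> fauna, flora, residents (in some order)
```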

    Probabilistic Methodology and Techniques for Artefact Conception and Development

    The purpose of this paper is to present a state of the art of probabilistic methodology and techniques for artefact conception and development. It is the 8th deliverable of the BIBA (Bayesian Inspired Brain and Artefacts) project. We first present the incompleteness problem as the central difficulty that both living creatures and artefacts have to face: how can they perceive, infer, decide and act efficiently with incomplete and uncertain knowledge? We then introduce a generic probabilistic formalism called Bayesian Programming. This formalism is then used to review the main probabilistic methodologies and techniques. The review is organized in three parts: first, the probabilistic models, from Bayesian networks to Kalman filters and from sensor fusion to CAD systems; second, the inference techniques; and finally, the learning, model acquisition and comparison methodologies. We conclude with the perspectives of the BIBA project as they arise from this state of the art.
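
    As a minimal worked instance of the Bayesian machinery the review covers, the following Python sketch fuses two independent Gaussian sensor readings by precision weighting, the same update rule that underlies the Kalman filter's measurement step; the numbers are invented for illustration.

```python
# Minimal Gaussian sensor fusion: two independent noisy measurements of
# the same quantity are combined by weighting each with its precision
# (inverse variance). Measurement values and variances are invented.

def fuse(mu1, var1, mu2, var2):
    """Posterior mean and variance of x given two Gaussian measurements."""
    w1, w2 = 1.0 / var1, 1.0 / var2        # precisions
    var = 1.0 / (w1 + w2)                  # fused variance shrinks
    mu = var * (w1 * mu1 + w2 * mu2)       # precision-weighted mean
    return mu, var

mu, var = fuse(10.0, 4.0, 12.0, 1.0)       # a coarse and a precise sensor
print(f"fused estimate: {mu:.2f} +/- {var ** 0.5:.2f}")
# The fused mean (11.60) lies closer to the more precise sensor.
```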