Prototyping Formal System Models with Active Objects
We propose active object languages as a development tool for formal system
models of distributed systems. In addition to a formalization based on a term
rewriting system, we use established software engineering concepts, including
software product lines and object orientation, which come with extensive tool
support. We illustrate our modeling approach by prototyping a weak memory
model. The resulting executable model is modular and has clear interfaces
between communicating participants through object-oriented modeling.
Relaxations of the basic memory model are expressed as self-contained variants
of a software product line. As a modeling language we use the formal active
object language ABS which comes with an extensive tool set. This permits rapid
formalization of core ideas, early validity checks in terms of formal invariant
proofs, and debugging support by executing test runs. Hence, our approach
supports the prototyping of formal system models with early feedback.
Comment: In Proceedings ICE 2018, arXiv:1810.0205
Extending the Real-Time Maude Semantics of Ptolemy to Hierarchical DE Models
This paper extends our Real-Time Maude formalization of the semantics of flat
Ptolemy II discrete-event (DE) models to hierarchical models, including modal
models. This is a challenging task that requires combining synchronous
fixed-point computations with hierarchical structure. The synthesis of a
Real-Time Maude verification model from a Ptolemy II DE model, and the formal
verification of the synthesized model in Real-Time Maude, have been integrated
into Ptolemy II, enabling a model-engineering process that combines the
convenience of Ptolemy II DE modeling and simulation with formal verification
in Real-Time Maude.
Comment: In Proceedings RTRTS 2010, arXiv:1009.398
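The synchronous fixed-point computation mentioned above can be sketched generically (this is my own minimal illustration, not the Real-Time Maude semantics of Ptolemy II): at each logical time tag, actor firings are iterated until the signal assignment stops changing, i.e., until a fixed point is reached.

```python
def fixed_point(actors, signals):
    """Iterate actor firings at one logical time tag until the
    signal assignment stabilizes (a least fixed point)."""
    changed = True
    while changed:
        changed = False
        for actor in actors:
            for name, value in actor(signals).items():
                if signals.get(name) != value:
                    signals[name] = value
                    changed = True
    return signals

# Two toy actors in an illustrative dependency chain:
# b depends on a, c depends on b.
actors = [
    lambda s: {"b": s["a"] + 1} if "a" in s else {},
    lambda s: {"c": s["b"] * 2} if "b" in s else {},
]
result = fixed_point(actors, {"a": 1})
```

Hierarchical and modal models complicate this picture considerably, since the fixed-point iteration must recurse into composite actors; the sketch shows only the flat case.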
Object-Oriented Modeling and Design Using DELTA, an Incremental Design Language.
Object-oriented technology has opened the doors for many new ideas in system development. The object-oriented paradigm has produced many new object-oriented programming languages. As with any new methodology, a need for formalism arises to remove ambiguities and inconsistencies and to bring a sense of continuity to software design. Formal languages provide a sound basis for software development throughout the software life cycle. This work presents a set of characteristic features for object-oriented design languages and defines a formal object-oriented design language, DELTA. The rapidly changing face of software has led to an ever-increasing need to update out-of-date methods and user interfaces. Software developers want to be able to use the same type of visual interfaces available in application software. The introduction of windowing environments has led to a market for methodologies which incorporate graphical features to supplement textual components of software. The present genre of formal languages must evolve in the same direction to be considered as effective in the design process. DELTA meets this need by providing a modern development environment with graphical features to complement the text that is necessary in any design specification. Researchers and prominent software engineers have provided a litany of object-oriented methodologies. The commonality of these methods is the step-by-step approach to software development. Software engineers agree in theory that the best approach to designing software which will stand the test of time is one which has a sound, established discipline. Such a discipline produces a design in increments. DELTA supports this theory by providing established levels of incremental design representation. The advent of computer-aided design has led to the evolution of rapid prototyping.
Changes in system requirements, detection of errors, competition in the market, and the ongoing maintenance of software systems can be addressed by the development of system prototypes. DELTA responds to this challenge by establishing a design specification representation which can be easily mapped to an object-oriented programming language. This transition from design to prototype can be enhanced by formal annotations to the chosen implementation language. Annotations have been developed for DELTA software designs prototyped in the object-oriented language Actor.
Knowledge and perceptions in participatory policy processes: lessons from the delta-region in the Netherlands
Water resources management issues tend to affect a variety of uses and users. Therefore, they often exhibit complex and unstructured problems. The complex, unstructured nature of these problems originates from uncertain knowledge and from the existence of divergent perceptions among various actors. Consequently, dealing with these problems is not just a knowledge problem; it is a problem of ambiguity too. This paper focuses on a complex, unstructured water resources management issue, the sustainable development—for ecology, economy and society—of the Delta-region of the Netherlands. In several areas in this region the ecological quality decreased due to hydraulic constructions for storm water safety, the Delta Works. To improve the ecological quality, the Dutch government regards the re-establishment of estuarine dynamics in the area as the most important solution. However, re-establishment of estuarine dynamics will affect other uses and other users. Among the affected users are farmers in the surrounding areas, who use freshwater from a lake for agricultural purposes. This problem has been addressed in a participatory decision-making process, which is used as a case study in this paper. We investigate how the dynamics in actors’ perceptions and the knowledge base contribute to the development of agreed upon and valid knowledge about the problem–solution combination, using our conceptual framework for problem structuring. We found that different knowledge sources—expert and practical knowledge—should be integrated to create a context-specific knowledge base, which is scientifically valid and socially robust. Furthermore, we conclude that for the convergence of actors’ perceptions, it is essential that actors learn about the content of the process (cognitive learning) and about the network in which they are involved (strategic learning). 
Our findings form a plea for practitioners in water resources management to adopt a problem structuring approach in order to deal explicitly with uncertainty and ambiguity.
Beyond Standard Assumptions - Semiparametric Models, A Dyadic Item Response Theory Model, and Cluster-Endogenous Random Intercept Models
In most statistical analyses, quantitative education researchers often make simplifying assumptions regarding the manner in which their data were generated in order to answer their research questions. These assumptions can help to reduce the complexity of the problem, and allow the researcher to describe their data using a simpler, and oftentimes more interpretable, statistical model. However, making some of these assumptions when they are not true can lead to biased estimates and misleading answers. While the standard sets of assumptions associated with commonly-used statistical models are usually sufficient in a wide range of contexts, it will always be beneficial for education researchers to understand what they are, when they are reasonable, and how to modify them if necessary. This dissertation focuses on three of the most common models used in quantitative education research (viz. parametric models like Linear Models (LMs), Item Response Theory (IRT) models, and Random-Intercept Models (RIMs)), discusses the standard sets of assumptions that accompany these models, and then describes related models with less stringent sets of assumptions. In each of the following three chapters, we either explicitly unpack existing models that are useful but are currently still uncommon in the field of education research, or propose novel models and/or estimation strategies for these models. We begin in Chapter 1 with a common parametric model known as the Gaussian LM, and use it as a scaffold to better understand semiparametric models and their estimation. We begin by reviewing how the coefficients of the Gaussian LM are usually estimated using Maximum Likelihood (ML) or Least-Squares (LS). We then introduce the notion of an M-estimator as well as that of a Regular Asymptotically Linear estimator, and show how they relate to the ML estimator.
In particular, we introduce the notion of influence functions/curves and discuss their geometry together with concepts such as Hilbert spaces and tangent spaces. We then demonstrate, concretely, how to derive the so-called efficient influence function under the Gaussian LM, and show that it is precisely the influence function of the ML and (Ordinary) LS estimators. This shows that the ML estimator (at least under the Gaussian LM) is efficient. Using the foundation built, we move on from the Gaussian LM by relaxing both the assumption that the residuals are normally distributed, as well as the assumption that they have a constant variance, and define this as the Heteroskedastic Linear Model. Unlike the Gaussian LM, this is a semiparametric model. Where possible, we make use of intuition and analogous results from the parametric setting to help describe the workflow for obtaining an efficient estimator for the coefficients of the Heteroskedastic Linear Model. In particular, we derive the nuisance tangent space for this semiparametric model, and use it to obtain the efficient influence function for our model. We then show how to use the efficient influence function to obtain an efficient estimator (which happens to be the Weighted LS estimator) from the (Ordinary) LS estimator via a one-step approach as well as an estimating equations approach. We then conclude by directing readers to more advanced material, including references on more modern approaches to estimating more general semiparametric models such as Targeted Maximum Likelihood Estimation. In Chapter 2, we focus on a class of measurement models known as Item Response Theory models which are useful for measuring latent traits of a subject based on the subject's response to items. 
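As a concrete miniature of the chapter's conclusion (my own illustration; the dissertation works in far greater generality), the efficient estimator under the heteroskedastic linear model turns out to be Weighted LS with weights w_i = 1/sigma_i^2. The sketch below fits a simple linear model y = b0 + b1*x by WLS in closed form; the data are constructed to lie exactly on a line, so the estimator recovers the coefficients exactly regardless of the weights:

```python
def wls(x, y, w):
    """Closed-form weighted least squares for y = b0 + b1*x,
    with per-observation weights w_i (e.g., w_i = 1/sigma_i^2)."""
    Sw   = sum(w)
    Swx  = sum(wi * xi for wi, xi in zip(w, x))
    Swy  = sum(wi * yi for wi, yi in zip(w, y))
    Swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Swxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b1 = (Sw * Swxy - Swx * Swy) / (Sw * Swxx - Swx ** 2)
    b0 = (Swy - b1 * Swx) / Sw
    return b0, b1

x = [0.0, 1.0, 2.0, 3.0]
y = [2.0 + 3.0 * xi for xi in x]   # exactly linear: b0 = 2, b1 = 3
w = [1.0, 0.25, 4.0, 1.0]          # unequal weights (heteroskedasticity)
b0, b1 = wls(x, y, w)
```

With noisy heteroskedastic data, OLS and WLS would both be consistent but WLS would have the smaller asymptotic variance, which is the sense in which it is efficient.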
We relax the condition that the responses are only a result of the individual's latent trait (and possibly an external rater), and propose a dyadic Item Response Theory (dIRT) model for measuring interactions of pairs of individuals when the responses to items represent the actions (or behaviors, perceptions, etc.) of each individual (actor) made within the context of a dyad formed with another individual (partner). Examples of its use in education include the assessment of collaborative problem solving among students, or the evaluation of intra-departmental dynamics among teachers. The dIRT model generalizes both Item Response Theory models for measurement and the Social Relations Model for dyadic data. Here, the responses of an actor when paired with a partner are modeled as a function of not only the actor's inclination to act and the partner's tendency to elicit that action, but also the unique relationship of the pair, represented by two directional, possibly correlated, interaction latent variables. We discuss generalizations such as accommodating triads or larger groups, but focus on demonstrating the key idea in the dyadic case. We show that estimation may be performed using Markov-chain Monte Carlo implemented in Stan, making it straightforward to extend the dIRT model in various ways. Specifically, we show how the basic dIRT model can be extended to accommodate latent regressions, random effects, and distal outcomes. We perform a simulation study that demonstrates that our estimation approach performs well.
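The description above suggests a linear predictor of roughly the following form (notation and parameterization are mine; the dissertation's exact model may differ):

```latex
\eta_{ijk} = \beta_k + \theta_i + \xi_j + \gamma_{ij},
\qquad
\begin{pmatrix} \gamma_{ij} \\ \gamma_{ji} \end{pmatrix}
\sim \mathcal{N}\!\left(\mathbf{0}, \Sigma_\gamma\right),
```

where $\beta_k$ is an item parameter, $\theta_i$ the actor's inclination to act, $\xi_j$ the partner's tendency to elicit that action, and $\gamma_{ij}, \gamma_{ji}$ the two directional, possibly correlated, dyadic interaction latent variables.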
In the absence of educational data of this form, we demonstrate the usefulness of our proposed approach using speed-dating data instead, and find new evidence of pairwise interactions between participants, describing a mutual attraction that is inadequately characterized by individual properties alone. Finally, in Chapter 3, we consider the often implicit assumption made when estimating the coefficients of structural Random Intercept Models (RIMs) that covariates at all levels do not co-vary with the random intercepts. A violation of this assumption (called cluster-level endogeneity) leads to inconsistent estimates when using standard estimation procedures. For two-level RIMs with such endogeneity, Hausman and Taylor (HT) devised a consistent multi-step instrumental variable estimator using only internal instruments. We instead approach this problem by explicitly modeling the endogeneity using a Structural Equation Model (SEM). In this chapter, we compare, through simulation, the HT and SEM estimators, and evaluate their asymptotic and finite sample properties. We show that the SEM approach is also flexible enough to deal with different exchangeability assumptions for the covariates (e.g., whether the correlations between pairs of all units in a cluster are the same) and investigate how these exchangeability assumptions affect finite sample properties of the HT estimator. For the simulations, we propose a new procedure for generating cluster- and unit-level covariates and random intercepts with a fully flexible covariance structure. We also compare our approach to another common approach known as Multilevel Matching using data from the High School and Beyond survey.
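For readers unfamiliar with cluster-level endogeneity, a standard two-level RIM can be written as follows (a generic textbook form, not necessarily the dissertation's notation):

```latex
y_{ij} = \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_j + \varepsilon_{ij},
```

where $u_j$ is the random intercept for cluster $j$. Standard estimation assumes $\operatorname{Cov}(\mathbf{x}_{ij}, u_j) = \mathbf{0}$; cluster-level endogeneity is precisely the failure of this condition for some covariates, which is what the HT instrumental-variable estimator and the SEM formulation each address.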
Reconciling Actors' Preferences in Agricultural Policy - Towards a New Management of Public Decisions
To attain sustainable development in the 21st century, the world's population still has to overcome many challenges: hunger, poverty, environmental degradation and depletion. Policy design in such a context is and will remain a complex task. On one hand, policy makers often lack information on stakeholders' strategies and constraints as well as on potential options for improvement. On the other hand, stakeholders do not always adhere to policies for lack of understanding of the pursued goals. It is not unusual to observe that real policy effects are often not those initially expected. Furthermore, existing decision-making mechanisms for public intervention are increasingly questioned due to pressure for market liberalization, decentralization processes and the increasing role of civil society. However, while the classical role of government is challenged, few methods have been proposed to enable the design of viable alternatives. The approach presented in this book is a contribution to the improvement of efficiency in public decision-making. Based on practical experience from Viet Nam, Indonesia and other countries, it proposes new methods for the identification of policy objectives, stakeholders and issues at stake, and for the definition and implementation of concrete actions. It also provides means and guidance to progressively foster actors' participation and involvement in decision-making and policy implementation processes. Key Words: analytical methods, tools, decision making, agricultural policies, Agricultural and Food Policy