
    In defense of mechanism

    In Life Itself and in Essays on Life Itself, Robert Rosen (1991, 2000) argued that machines were, in principle, incapable of modeling the defining feature of living systems, which he claimed to be the existence of closed causal loops. Rosen's argument has been used to support critiques of computational models in ecological psychology. This article shows that Rosen's attack on mechanism is fundamentally misconceived. It is, in fact, of the essence of a mechanical system that it contains closed causal loops. Moreover, Rosen's epistemology is based on a strong form of indirect realism, and his arguments, if correct, would call into question some of the fundamental principles of ecological psychology.
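
    A minimal sketch (entirely illustrative, not from the article) of the point that a machine can contain a closed causal loop: in this thermostat simulation, the temperature causally determines the heater state, and the heater state causally determines the temperature.

        # Hypothetical example: a simulated machine containing a closed causal loop.
        def simulate(steps=50, setpoint=20.0, temp=15.0, ambient=10.0):
            for _ in range(steps):
                heating = 1.0 if temp < setpoint else 0.0       # temperature causes heater state
                temp += 1.0 * heating - 0.1 * (temp - ambient)  # heater state causes temperature
            return temp

        print(round(simulate(), 1))  # oscillates near the setpoint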

    JPI Feature Models: Exploring a JPI and FOP Symbiosis for Software Modeling

    Looking toward a complete modular software development paradigm, this article presents Join Point Interface (JPI) Feature Models, in the context of a paradigm combining JPI and Feature-Oriented Programming (FOP). The article describes the pros and cons of the JPI and FOP approaches for modular software and software product line production, respectively, and highlights the benefits of combining them, in particular the benefits of JPI Feature Models for high-level software product line modeling. As an application example, the article applies JPI Feature Models to a classic FOP example already modeled using a previous aspect-oriented feature model proposal. The main goals of this application are to show that traditional feature model components are preserved, such as alternative and optional feature sets, optional and mandatory features, and special feature associations (cross-tree constraints), and to show the differences and advantages with respect to previous work on extending feature models to support aspect-oriented modeling principles.
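
    To make the feature-model vocabulary concrete, here is a hypothetical sketch (feature names and structure invented for illustration; this is not the paper's notation or example) of a configuration check over mandatory, optional, and alternative features with one cross-tree constraint.

        # Invented toy feature model: not the paper's example.
        MANDATORY = {"Core"}
        OPTIONAL = {"Logging", "Caching"}
        ALTERNATIVE = {"Storage": {"SQL", "NoSQL"}}   # exactly one option must be chosen
        REQUIRES = [("Caching", "Logging")]           # cross-tree constraint

        def valid(config):
            if not MANDATORY <= config:               # all mandatory features present
                return False
            for group, options in ALTERNATIVE.items():
                if len(config & options) != 1:        # alternative group: pick exactly one
                    return False
            return all(b in config for a, b in REQUIRES if a in config)

        print(valid({"Core", "SQL"}))                 # True
        print(valid({"Core", "Caching", "SQL"}))      # False: Caching requires Logging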

    Uncertainty quantification for kinetic models in socio-economic and life sciences

    Kinetic equations play a major role in modeling large systems of interacting particles. Recently the legacy of classical kinetic theory has found novel applications in socio-economic and life sciences, where processes characterized by large groups of agents exhibit spontaneous emergence of social structures. Well-known examples are the formation of clusters in opinion dynamics, the appearance of inequalities in wealth distributions, flocking and milling behaviors in swarming models, synchronization phenomena in biological systems, and lane formation in pedestrian traffic. The construction of kinetic models describing the above processes, however, has to face the difficulty of a lack of fundamental principles, since physical forces are replaced by empirical social forces. These empirical forces are typically constructed with the aim of reproducing qualitatively the observed system behaviors, such as the emergence of social structures, and are at best known in terms of statistical information about the modeling parameters. For this reason the presence of random inputs characterizing the parameter uncertainty should be considered an essential feature of the modeling process. In this survey we introduce several examples of such kinetic models, mathematically described by nonlinear Vlasov and Fokker-Planck equations, and present different numerical approaches for uncertainty quantification which preserve the main features of the kinetic solution. Comment: To appear in "Uncertainty Quantification for Hyperbolic and Kinetic Equations".
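
    As a toy illustration of the idea (not one of the survey's methods or models), the sketch below applies plain Monte Carlo sampling of an uncertain interaction strength z to a particle approximation of a simple consensus-type kinetic model, estimating the expected spread of opinions; all parameter values are invented.

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate(z, n=200, steps=100, dt=0.05):
            x = rng.uniform(-1, 1, n)            # initial opinions of n agents
            for _ in range(steps):
                x += dt * z * (x.mean() - x)     # alignment toward the mean, strength z
            return x.var()                       # spread of opinions at final time

        samples = rng.uniform(0.5, 1.5, 50)      # uncertain parameter z ~ U(0.5, 1.5)
        print(np.mean([simulate(z) for z in samples]))  # expected spread under uncertainty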

    The Semantic Student: Using Knowledge Modeling Activities to Enhance Enquiry-Based Group Learning in Engineering Education

    This paper argues that training engineering students in basic knowledge modeling techniques, using linked data principles and semantic Web tools within an enquiry-based group learning environment, enables them to enhance both their domain knowledge and their meta-cognitive skills. Knowledge modeling skills are in keeping with the principles of Universal Design for Instruction. Learners are empowered with the regulation of cognition as they become more aware of their own development. This semantic student approach was trialed with a group of third-year computer engineering students taking a module on computer architecture. An enquiry-based group learning activity was developed to help learners meet selected module learning outcomes. Learners were required to use semantic feature analysis and linked data principles to create a visual model of their knowledge structure. Results show that overall student attainment increased when knowledge modeling activities were included as part of the learning process. A recommendation for practice, incorporating knowledge modeling as a learning strategy within an overall engineering curriculum framework, is described. This can be achieved using semantic Web technologies such as semantic wikis and linked data tools.
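
    A minimal sketch of what such a knowledge modeling activity could look like (the vocabulary under example.org is hypothetical, not the module's actual material), using the rdflib library to express a fragment of a computer architecture knowledge model as linked data:

        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDFS

        EX = Namespace("http://example.org/arch/")  # hypothetical vocabulary
        g = Graph()
        g.bind("ex", EX)

        # A student's knowledge structure as subject-predicate-object triples.
        g.add((EX.Cache, RDFS.subClassOf, EX.MemoryComponent))
        g.add((EX.L1Cache, RDFS.subClassOf, EX.Cache))
        g.add((EX.L1Cache, RDFS.comment, Literal("Small, fast memory close to the CPU")))

        print(g.serialize(format="turtle"))         # the model in Turtle syntax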

    Data granulation by the principles of uncertainty

    Research in granular modeling has produced a variety of mathematical models, such as intervals, (higher-order) fuzzy sets, rough sets, and shadowed sets, which are all suitable to characterize so-called information granules. Modeling the uncertainty of the input data is recognized as a crucial aspect of information granulation. Moreover, uncertainty is a well-studied concept in many mathematical settings, such as probability theory, fuzzy set theory, and possibility theory. This fact suggests that an appropriate quantification of the uncertainty expressed by the information granule model could be used to define an invariant property, to be exploited in practical situations of information granulation. In this perspective, a procedure of information granulation is effective if the uncertainty conveyed by the synthesized information granule is in a monotonically increasing relation with the uncertainty of the input data. In this paper, we present a data granulation framework that elaborates on the principles of uncertainty introduced by Klir. Since uncertainty is a mesoscopic descriptor of systems and data, such principles can be applied regardless of the input data type and the specific mathematical setting adopted for the information granules. The proposed framework is conceived (i) to offer a guideline for the synthesis of information granules and (ii) to build a groundwork for comparing and quantitatively judging different data granulation procedures. To provide a suitable case study, we introduce a new data granulation technique based on the minimum sum of distances, which is designed to generate type-2 fuzzy sets. We analyze the procedure by performing different experiments on two distinct data types: feature vectors and labeled graphs. Results show that the uncertainty of the input data is suitably conveyed by the generated type-2 fuzzy set models. Comment: 16 pages, 9 figures, 52 references.
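
    As a toy illustration of the monotonicity requirement (using plain interval granules rather than the paper's type-2 fuzzy sets), the sketch below granulates 1-D data into an interval and checks that the granule's uncertainty, measured here simply as interval width, grows with the spread of the input data; all parameters are invented.

        import numpy as np

        rng = np.random.default_rng(1)

        def granulate(data, coverage=0.9):
            # Synthesize an interval information granule covering most of the data.
            lo, hi = np.quantile(data, [(1 - coverage) / 2, (1 + coverage) / 2])
            return lo, hi

        for sigma in (0.5, 1.0, 2.0):              # increasing input uncertainty
            lo, hi = granulate(rng.normal(0.0, sigma, 1000))
            print(f"sigma={sigma}: granule width={hi - lo:.2f}")  # widths increase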

    The Role of Tarski’s Declarative Semantics in the Design of Modeling Languages

    This paper focuses on Tarski's declarative semantics and their usefulness in the design of a modeling language. We introduce the principles behind Tarski's approach to semantics and explain what advantages this offers in the context of modeling languages. Using sentential logic, we demonstrate the necessity and sufficiency of Tarski's semantics for effectively addressing several issues that arise in the design of modeling languages. We explain what role Tarski's semantics play in the organization of a modeling language. This role is compared to the analogous roles of denotational semantics and operational semantics. We show that in the context of a modeling language Tarski's semantics are complementary to the other two kinds of semantics. The paper is intended to assist modeling language researchers and designers, particularly in connection with the UML, a language that in its current form does not feature Tarski's declarative semantics.
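
    To illustrate the Tarskian idea for sentential logic (a generic textbook construction, not the paper's formalization): a formula's meaning is a truth value determined compositionally relative to an interpretation, i.e., a truth assignment.

        # Tarski-style evaluation: truth of a formula relative to an interpretation.
        def evaluate(formula, interp):
            op = formula[0]
            if op == "var":
                return interp[formula[1]]            # atomic sentence: look up its value
            if op == "not":
                return not evaluate(formula[1], interp)
            if op == "and":
                return evaluate(formula[1], interp) and evaluate(formula[2], interp)
            if op == "or":
                return evaluate(formula[1], interp) or evaluate(formula[2], interp)
            raise ValueError(op)

        # (p and not q) under the interpretation {p: True, q: False}
        f = ("and", ("var", "p"), ("not", ("var", "q")))
        print(evaluate(f, {"p": True, "q": False}))  # True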

    Pharmacokinetic-Pharmacodynamic Modeling in Pediatric Drug Development, and the Importance of Standardized Scaling of Clearance.

    Pharmacokinetic/pharmacodynamic (PKPD) modeling is important in the design and conduct of clinical pharmacology research in children. During drug development, PKPD modeling and simulation should underpin rational trial design and facilitate extrapolation to investigate efficacy and safety. The application of PKPD modeling to optimize dosing recommendations and therapeutic drug monitoring is also increasing, and PKPD model-based dose individualization will become a core feature of personalized medicine. Following extensive progress on pediatric PK modeling, a greater emphasis now needs to be placed on PD modeling to understand age-related changes in drug effects. This paper discusses the principles of PKPD modeling in the context of pediatric drug development, summarizing how important PK parameters, such as clearance (CL), are scaled with size and age, and highlights a standardized method for CL scaling in children. One standard scaling method would facilitate comparison of PK parameters across multiple studies, thus increasing the utility of existing PK models and facilitating optimal design of new studies.
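
    One widely used standardization in the pediatric PK literature combines allometric weight scaling (exponent 0.75) with a sigmoidal maturation function; the sketch below is illustrative, and the TM50 and Hill values are placeholders rather than values from this paper.

        # Illustrative clearance scaling: allometric size term plus maturation.
        def scaled_clearance(cl_std_70kg, weight_kg, pma_weeks, tm50=45.0, hill=3.0):
            size = (weight_kg / 70.0) ** 0.75        # allometric scaling to a 70 kg adult
            maturation = pma_weeks**hill / (tm50**hill + pma_weeks**hill)  # sigmoidal Emax
            return cl_std_70kg * size * maturation

        # A 5 kg infant at 48 weeks postmenstrual age, adult CL of 10 L/h (invented values).
        print(round(scaled_clearance(10.0, 5.0, 48.0), 2))  # about 0.76 L/h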