
    Simulation Models of the Evolution of Cooperation as Proofs of Logical Possibilities. How Useful Are They?

    This paper critically examines what simulation models of the evolution of cooperation can possibly prove, taking Axelrod's "Evolution of Cooperation" (1984) and the modeling tradition it inspired as its case. Hardly any of the many simulation models in this tradition have been empirically applicable. Axelrod's role model suggested a research design that seemingly allowed general conclusions to be drawn from simulation models even when the mechanisms driving the simulation could not be identified empirically. But this research design was fundamentally flawed. At best, such simulations can claim to prove logical possibilities: they show that certain phenomena are possible as consequences of the modeling assumptions built into the simulation, but not that they are possible, or can be expected to occur, in reality. I suggest several requirements under which proofs of logical possibility can nevertheless be considered useful. Sadly, most Axelrod-style simulations do not meet these requirements. It would be better not to use this kind of simulation at all.
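The tournament design under discussion can be sketched in a few lines. The payoff matrix is the standard prisoner's dilemma formulation used by Axelrod; the two strategies and the round count are illustrative assumptions, not the paper's own experiment.

```python
from itertools import combinations_with_replacement

# Standard PD payoffs: (row score, column score) for moves (C)ooperate / (D)efect.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_hist, opp_hist):
    # Cooperate first, then copy the opponent's previous move.
    return opp_hist[-1] if opp_hist else "C"

def always_defect(my_hist, opp_hist):
    return "D"

def play_match(s1, s2, rounds=200):
    h1, h2, score1, score2 = [], [], 0, 0
    for _ in range(rounds):
        m1, m2 = s1(h1, h2), s2(h2, h1)
        p1, p2 = PAYOFF[(m1, m2)]
        h1.append(m1); h2.append(m2)
        score1 += p1; score2 += p2
    return score1, score2

# Round-robin tournament (including self-play), summing each strategy's score.
strategies = {"tit_for_tat": tit_for_tat, "always_defect": always_defect}
totals = {name: 0 for name in strategies}
for (n1, f1), (n2, f2) in combinations_with_replacement(sorted(strategies.items()), 2):
    sc1, sc2 = play_match(f1, f2)
    totals[n1] += sc1
    totals[n2] += sc2
```

With these two strategies, tit-for-tat loses its head-to-head match (199 vs 204 over 200 rounds) yet wins the tournament on total score, which is exactly the kind of "logical possibility" result the paper argues such models establish.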

    A parallelized micro-simulation platform for population and mobility behavior. Application to Belgium.

    In this book we aim to develop an agent-based micro-simulation framework for (large) population evolution and mobility behaviour. More specifically, we focus on the agent-generation and traffic-simulation parts of the platform and its application to Belgium. We first develop a synthetic population generator whose main characteristics are its sample-free nature and its ability to cope with moderate data inconsistencies and different levels of aggregation. We then generate the traffic demand forecast with a stochastic and flexible activity-based model relying on weak data requirements. Finally, a traffic simulation is completed by assigning the generated demand to the road network. We also give the initial developments of a strategic agent-based alternative to conventional simulation-based dynamic traffic assignment models.

    Graph Based Verification of Software Evolution Requirements

    Due to market demands and changes in the environment, software systems have to evolve. However, the size and complexity of current software systems make it time-consuming to incorporate changes. During our collaboration with industry, we observed that developers spend much time on the following evolution problems: designing runtime-reconfigurable software, obeying software design constraints while coping with evolution, and reusing old software solutions for new evolution problems. This thesis presents three processes and tool suites that aid developers and designers in tackling these problems.

    The first process and tool set allow early verification of runtime reconfiguration requirements. In this process the UML models are converted into a graph-based model, and the execution semantics of UML are modeled by graph transformation rules. Using these rules, the execution of the UML models is simulated. The simulation generates a state space showing all possible reconfigurations. The runtime reconfiguration requirements are expressed in computational tree logic or in a visual state-based language and are verified over the generated state space. When the verification fails, feedback on the problem is provided.

    The second process and tool set are developed for computer-aided detection of static program constraint violations. We developed a modeling language called Source Code Modeling Language (SCML) in which program elements from the source code can be represented. In the proposed process, the source code is converted into SCML models and constraint violations are detected by graph transformation rules. The rules detect each violation and extract information from the SCML model to provide feedback on the location of the problem.

    The third process and tool set provide computer-aided verification of whether a design idiom can be used to implement a change request. Developers tend to implement evolution requests using software structures that are familiar to them, called design idioms. Graph transformations are used to detect whether the constraints of the design idiom are satisfied. For a given design idiom and given source files in SCML, the implementation of the idiom is simulated; if the simulation succeeds, the models are converted to source code.
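The generate-then-check loop of the first tool chain can be illustrated with a tiny stand-in: configurations as sets of running components, reconfiguration rules as functions, a breadth-first state-space build, and a property checked over every reachable state. The component names and rules below are invented for illustration; the thesis itself operates on UML models, graph transformation rules, and CTL.

```python
from collections import deque

def generate_state_space(initial, rules):
    """Breadth-first closure of all configurations reachable via the rules."""
    seen, frontier, edges = {initial}, deque([initial]), []
    while frontier:
        state = frontier.popleft()
        for rule in rules:
            for nxt in rule(state):
                edges.append((state, nxt))
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return seen, edges

# Invented example rules: "app" may only start while "logger" runs,
# and "logger" may only stop once "app" has stopped.
def start(state):
    out = []
    if "logger" not in state:
        out.append(state | {"logger"})
    if "app" not in state and "logger" in state:
        out.append(state | {"app"})
    return out

def stop(state):
    return [state - {c} for c in state
            if not (c == "logger" and "app" in state)]

states, _ = generate_state_space(frozenset({"logger"}), [start, stop])
# A reachability-style requirement checked over the whole state space:
# "app" never runs without "logger".
violations = [s for s in states if "app" in s and "logger" not in s]
```

An empty `violations` list plays the role of a successful verification; a real checker would instead evaluate temporal-logic formulas over the edge relation and report a counterexample path on failure.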

    The Evolution of Complex Muscle Cell In Vitro Models to Study Pathomechanisms and Drug Development of Neuromuscular Disease

    Many neuromuscular disease entities carry a significant disease burden, and therapeutic options remain limited. Innovative human preclinical models may help to uncover relevant disease mechanisms and enhance the translation of therapeutic findings, strengthening neuromuscular disease precision medicine. Concentrating on idiopathic inflammatory muscle disorders, we summarize the recent evolution of novel in vitro models for studying disease mechanisms and therapeutic strategies. A particular focus is placed on the integration and simulation of multicellular interactions of muscle tissue in disease phenotypes in vitro. Finally, the requirements of a neuromuscular disease drug development workflow are discussed, with particular emphasis on cell sources, co-culture systems (including organoids), functionality, and throughput.

    Leveraging Evolutionary Changes for Software Process Quality

    Real-world software applications must constantly evolve to remain relevant. This evolution occurs when developing new applications or adapting existing ones to meet new requirements, make corrections, or incorporate future functionality. Traditional methods of software quality control involve software quality models and continuous code inspection tools, which focus on directly assessing the quality of the software. However, there is a strong correlation, and causation, between the quality of the development process and the resulting software product. Therefore, improving the development process indirectly improves the software product, too. Achieving this requires effective learning from past processes, often embraced through post-mortem organizational learning. While qualitative evaluation of large artifacts is common, the smaller quantitative changes captured by application lifecycle management are often overlooked. In addition to software metrics, these smaller changes can reveal complex phenomena related to project culture and management, and leveraging them can help detect and address such issues. Software evolution was previously measured by the size of changes, but the lack of consensus on a reliable and versatile quantification method prevents its use as a dependable metric, and different size classifications fail to reliably describe the nature of evolution. While application lifecycle management data is rich, it remains uncertain which artifacts can model detrimental managerial practices. Approaches such as simulation modeling, discrete-event simulation, or Bayesian networks have only a limited ability to exploit continuous-time process models of such phenomena; even worse, the accessibility and mechanistic insight of such gray- or black-box models are typically very low. To address these challenges, we suggest leveraging objectively [...]

    Comment: Ph.D. thesis without appended papers, 102 pages.

    Industrial process simulation for manufacturing performance assessment

    As industrial requirements change at a rapid pace due to the evolution of technology and the digitalization of manufacturing and production operations, the need to investigate potential alternatives toward more efficient industrial line design arises more intensely than ever. The push toward the digitalization of production within the Industry 4.0 framework has shaped the rise of simulation in the design and operation of manufacturing systems. Industrial system simulation is a powerful tool for designing and evaluating the performance of manufacturing systems, owing to its low cost, low risk, and the quick analysis and insight it provides. This paper studies the use of simulation models and the ARENA simulation software in the analysis and simulation of an industrial manufacturing line located in lab TR2 at UPC, using the discrete event simulation (DES) technique, which is based on queueing theory. The paper proposes a systematic method and steps for modelling the line with the DES technique, which describes a system's response to the occurrence of events that may be required to meet certain conditions. Finally, the paper addresses an improvement opportunity in the retainers of the line to increase its production capacity.
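The queueing-based DES approach the abstract refers to can be shown in miniature: a single M/M/1 station driven by a future-event list. The station, rates, and customer count are illustrative assumptions, not the TR2 lab model or an ARENA feature.

```python
import heapq
import random

def simulate_mm1(lam, mu, n_customers, seed=42):
    """Event-list simulation of a single M/M/1 station; returns the mean wait in queue."""
    rng = random.Random(seed)
    events = [(rng.expovariate(lam), "arrival")]   # future-event list (min-heap on time)
    queue = []              # arrival times of customers waiting for the server
    server_free = True
    waits = []
    arrivals_left = n_customers - 1
    while events:
        t, kind = heapq.heappop(events)
        if kind == "arrival":
            if arrivals_left > 0:
                arrivals_left -= 1
                heapq.heappush(events, (t + rng.expovariate(lam), "arrival"))
            if server_free:
                server_free = False
                waits.append(0.0)
                heapq.heappush(events, (t + rng.expovariate(mu), "departure"))
            else:
                queue.append(t)
        else:  # departure: serve the next waiting customer, if any
            if queue:
                waits.append(t - queue.pop(0))
                heapq.heappush(events, (t + rng.expovariate(mu), "departure"))
            else:
                server_free = True
    return sum(waits) / len(waits)
```

For arrival rate 0.5 and service rate 1.0, queueing theory gives an expected wait of λ/(μ(μ−λ)) = 1.0, which a long simulated run approaches; this is the standard sanity check for a DES model before using it to assess line changes.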

    Environmental Innovations and Industrial Dynamics (In French)

    This article presents an empirical and theoretical analysis of environmental innovations in an evolutionary framework. Such an evolutionary analysis enables us to develop a dynamic account of environmental innovations, emphasizing their multidimensional character and their evolution along technological trajectories embedded in dominant technological paradigms. In this perspective, environmental innovations appear as technological compromises that aim to combine regulatory objectives and environmental performance with firms' productivity and competitiveness objectives. The sectoral analyses presented in the first section illustrate these concepts, in particular the trajectories of clean technology and the sources of technological lock-in in the automotive industry and in green chemistry. In section 2, we focus on the role of demand and of environmental quality requirements in vertical relationships between firms. We present an evolutionary model of industrial dynamics that explicitly takes into account the innovative activities of suppliers and the environmental requirements of industrial clients. Simulation results underscore the determining role of demand in technological paradigm shifts, and more particularly the role of a critical mass of users characterized by high environmental requirements and a high willingness to pay.

    Keywords: environmental innovations; technological trajectories and paradigms; technological compromises; industrial dynamics and evolutionary simulation models
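The critical-mass effect the abstract reports can be caricatured with a replicator-style toy: suppliers imitate whichever technology earns more, and the clean technology's payoff rises with the share of environmentally demanding clients. Every payoff and coefficient here is an invented illustration, not the paper's calibrated model.

```python
def clean_share(green_clients, steps=1000, x0=0.05):
    """Replicator-style update for the share x of suppliers using the clean technology."""
    x = x0
    for _ in range(steps):
        payoff_clean = 1.0 + 2.0 * green_clients + 0.5 * x   # demanding clients + imitation
        payoff_dirty = 1.6 + 0.5 * (1.0 - x)                 # incumbent cost advantage
        mean_payoff = x * payoff_clean + (1.0 - x) * payoff_dirty
        x += 0.1 * x * (payoff_clean - mean_payoff)          # replicator dynamics
        x = min(max(x, 0.0), 1.0)                            # keep the share in [0, 1]
    return x
```

Under these assumed parameters, a small share of demanding clients leaves the dirty technology locked in, while a sufficiently large share tips the whole supplier population to the clean trajectory, which is the qualitative paradigm-shift pattern the abstract describes.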

    Agent-based simulation of open source evolution

    We present an agent-based simulation model developed to study how size, complexity, and effort relate to each other in the development of open source software (OSS). In the model, many developer agents generate, extend, and refactor code modules independently and in parallel, in accordance with empirical observations of OSS development. To our knowledge, this is the first model of OSS evolution that includes the complexity of software modules as a limiting factor in productivity, the fitness of the software to its requirements, and the motivation of developers. The model was validated by comparing the simulated results against four measures of software evolution (system size, proportion of highly complex modules, level of complexity-control work, and distribution of changes) for four large OSS systems. The simulated results resembled the observed data, except for system size: three of the OSS systems showed alternating patterns of super-linear and sub-linear growth, while the simulations produced only super-linear growth. However, the fidelity of the model for the other measures suggests that developer motivation and the limiting effect of complexity on productivity have a significant effect on the development of OSS systems and should be considered in any model of OSS development.
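The core mechanism described above, complexity as a brake on productivity with refactoring as the counterweight, can be sketched as follows. The rules and parameters are illustrative assumptions, not the authors' calibrated model.

```python
import random

def simulate_oss(n_devs=20, steps=1000, refactor_prob=0.3, seed=1):
    """Toy OSS evolution: returns (module complexities, system-size history)."""
    rng = random.Random(seed)
    modules = [1]                      # complexity score per code module
    size_history = []
    for _ in range(steps):
        for _dev in range(n_devs):
            i = rng.randrange(len(modules))
            # Complexity limits productivity: work on a module succeeds
            # with probability inversely proportional to its complexity.
            if rng.random() < 1.0 / modules[i]:
                r = rng.random()
                if r < refactor_prob and modules[i] > 1:
                    modules[i] -= 1            # refactoring lowers complexity
                elif r < refactor_prob + 0.1:
                    modules.append(1)          # start a new module
                else:
                    modules[i] += 1            # extending raises complexity
        size_history.append(len(modules))
    return modules, size_history
```

Running this with and without refactoring shows the system growing in both cases while mean module complexity stays markedly lower when complexity-control work is present, mirroring the qualitative role the abstract assigns to it.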
