16,683 research outputs found
Enterprise model verification and validation: an approach
This article presents a verification and validation approach intended to complete the classical toolbox available to industrial users in the enterprise modelling and integration domain. The approach, which has been defined independently of any application domain, is based on several formal concepts and tools presented in this paper: property concepts, a property reference matrix, property graphs, an enterprise modelling domain ontology, conceptual graphs, and formal reasoning mechanisms.
An Ontological Basis for Design Methods
This paper presents a view of design methods as process artefacts that can be represented using the function-behaviour-structure (FBS) ontology. This view allows the identification of five fundamental approaches to methods: black-box, procedural, artefact-centric, formal and managerial approaches. They all describe method structure but emphasise different aspects of it. Capturing these differences addresses common terminological confusions relating to methods. The paper provides an overview of the use of the fundamental method approaches for different purposes in designing. In addition, the FBS ontology is used to develop a notion of the prescriptiveness of design methods as an aggregate construct defined along four dimensions: certainty, granularity, flexibility and authority. The work presented in this paper provides an ontological basis for describing, understanding and managing design methods throughout their life cycle.
Keywords:
Design Methods; Function-Behaviour-Structure (FBS) Ontology; Prescriptive Design Knowledge
Modelling public transport accessibility with Monte Carlo stochastic simulations: A case study of Ostrava
Activity-based micro-scale simulation models for transport modelling provide better evaluations of public transport accessibility and enable researchers to overcome the shortage of reliable real-world data. Current simulation systems, however, rely on simplified personal behaviour and zonal patterns, do not optimise public transport trips (choosing only the fastest option), and do not work with real destinations and their characteristics. The new TRAMsim system uses a Monte Carlo approach that evaluates all possible public transport and walking origin-destination (O-D) trips for the k-nearest stops within a given time interval and selects appropriate variants according to the expected scenarios and parameters derived from local surveys. For the city of Ostrava, Czechia, two commuting models were compared based on simulated movements to reach (a) randomly selected large employers and (b) proportionally selected employers, using an appropriate distance-decay impedance function derived from various combinations of conditions. Validation of these models confirms the relevance of the proportional, gravity-based model. A multidimensional evaluation of the potential accessibility of employers reveals issues in several localities, including a high number of transfers, long total commuting times, a low variety of accessible employers and heavy reliance on the pedestrian mode. Transport accessibility evaluation based on synthetic trips offers an improved understanding of local situations and helps to assess the impact of planned changes.
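The proportional, gravity-based destination choice described above can be sketched as a weighted Monte Carlo draw, with each employer's weight equal to its size discounted by a distance-decay impedance function. This is a minimal illustration, not the TRAMsim implementation; the employer data, the negative-exponential impedance form, and the parameter beta are all assumptions for the example.

```python
import math
import random

# Hypothetical employer data: (name, number of jobs, distance in km from one
# origin). Figures are illustrative, not taken from the Ostrava surveys.
employers = [
    ("A", 5000, 2.0),
    ("B", 1200, 5.5),
    ("C", 800, 9.0),
    ("D", 300, 14.0),
]

def impedance(d_km, beta=0.15):
    """One common distance-decay choice: a negative exponential."""
    return math.exp(-beta * d_km)

def pick_destination(rng):
    """Gravity-based choice: weight = jobs * impedance(distance)."""
    weights = [jobs * impedance(d) for _, jobs, d in employers]
    total = sum(weights)
    r = rng.random() * total
    for (name, _, _), w in zip(employers, weights):
        r -= w
        if r <= 0:
            return name
    return employers[-1][0]  # guard against floating-point rounding

rng = random.Random(42)
trips = [pick_destination(rng) for _ in range(10_000)]
shares = {name: trips.count(name) / len(trips) for name, _, _ in employers}
print(shares)  # large, nearby employers dominate the simulated O-D trips
```

Replacing the weights with a uniform draw over large employers gives the "randomly selected" comparison model from the abstract.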
Propagation of Pericentral Necrosis During Acetaminophen-Induced Liver Injury: Evidence for Early Interhepatocyte Communication and Information Exchange.
Acetaminophen (APAP)-induced liver injury is clinically significant, and APAP overdose in mice often serves as a model for drug-induced liver injury in humans. By specifying that APAP metabolism, reactive metabolite formation, glutathione depletion, and mitigation of mitochondrial damage within individual hepatocytes are functions of intralobular location, an earlier virtual model mechanism provided the first concrete multiattribute explanation for how and why early necrosis occurs close to the central vein (CV). However, two characteristic features could not be simulated consistently: necrosis occurring first adjacent to the CV, and subsequent necrosis occurring primarily adjacent to hepatocytes that have already initiated necrosis. We sought parsimonious model-mechanism enhancements that would manage spatiotemporal heterogeneity sufficiently to meet two new target attributes, and we conducted virtual experiments to explore different ideas for improving the model mechanism at intrahepatocyte and multihepatocyte levels. For the latter, evidence supports intercellular communication via exosomes, gap junctions, and connexin hemichannels playing essential roles in the toxic effects of chemicals, including facilitating or counteracting cell-death processes. Logic requiring hepatocytes to obtain current information about whether downstream and lateral neighbors have triggered necrosis enabled virtual hepatocytes to achieve both new target attributes. A glutathione-depleted virtual hepatocyte uses that information to determine whether it will initiate necrosis; when a less-stressed hepatocyte is flanked by at least two neighbors that have triggered necrosis, it too will initiate necrosis. We hypothesize that the resulting intercellular communication-enabled model mechanism is analogous, at comparable levels of granularity, to the actual explanation for APAP-induced hepatotoxicity.
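The neighbor-information rule in the abstract has the flavor of a cellular-automaton update, and a toy version makes the propagation logic concrete. The one-dimensional grid, the three cell states, and the synchronous update below are illustrative assumptions for this sketch, not the authors' virtual-liver model.

```python
# Toy cellular-automaton sketch of the rule described above: stressed
# (glutathione-depleted) cells initiate necrosis, and a less-stressed cell
# flanked by at least two necrotic neighbors initiates necrosis too.
NECROTIC, STRESSED, HEALTHY = 2, 1, 0

def step(cells):
    """One synchronous update over a row of virtual hepatocytes."""
    new = cells[:]
    for i, state in enumerate(cells):
        neighbors = [cells[j] for j in (i - 1, i + 1) if 0 <= j < len(cells)]
        necrotic_neighbors = neighbors.count(NECROTIC)
        if state == STRESSED:
            new[i] = NECROTIC          # depleted cells trigger necrosis first
        elif state == HEALTHY and necrotic_neighbors >= 2:
            new[i] = NECROTIC          # propagation via neighbor information
    return new

# A row of hepatocytes; stressed cells cluster near the "central vein" end
# (index 0), as in the pericentral-necrosis scenario.
row = [STRESSED, STRESSED, HEALTHY, STRESSED, HEALTHY, HEALTHY]
for _ in range(3):
    row = step(row)
print(row)  # → [2, 2, 2, 2, 0, 0]: the gap flanked by necrosis fills in
```

Note that the healthy cell at index 2 dies only once both of its neighbors are necrotic, while the cells at indices 4 and 5 survive, reproducing the "subsequent necrosis adjacent to already-necrotic hepatocytes" feature in miniature.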
Microservice Transition and its Granularity Problem: A Systematic Mapping Study
Microservices have gained wide recognition and acceptance in software industries as an emerging architectural style for autonomic, scalable, and more reliable computing. The transition to microservices has been highly motivated by the need for better alignment of technical design decisions with the value potentials of architectures. Despite microservices' popularity, research still lacks a disciplined understanding of the transition and consensus on the principles and activities underlying "micro-ing" architectures. In this paper, we report on a systematic mapping study that consolidates the various views, approaches and activities that commonly assist in the transition to microservices. The study aims to provide a better understanding of the transition; it also contributes a working definition of the transition and the technical activities underlying it. We term the transition and the technical activities leading to microservice architectures microservitization. We then shed light on a fundamental problem of microservitization: microservice granularity and reasoning about its adaptation as a first-class entity. The study reviews the state of the art and practice related to reasoning about microservice granularity: the modelling approaches, aspects considered, guidelines and processes used to reason about it. Finally, it identifies opportunities for future research and development related to reasoning about microservice granularity.
Comment: 36 pages including references, 6 figures, and 3 tables
Microeconomic Structure determines Macroeconomic Dynamics. Aoki defeats the Representative Agent
Masanao Aoki developed a new methodology for a basic problem of economics: rigorously deducing macroeconomic dynamics as emerging from the interactions of many individual agents. This includes deducing the fractal / intermittent fluctuations of macroeconomic quantities from the granularity of the meso-economic collective objects (large individual wealth, highly productive geographical locations, emergent technologies, emergent economic sectors) in which the micro-economic agents self-organize. In particular, we present some theoretical predictions, which have also met extensive validation from empirical data in a wide range of systems:
- The fractal Lévy exponent of stock market index fluctuations equals the Pareto exponent of the investors' wealth distribution. The origin of the macroeconomic dynamics is therefore found in the granularity induced by the wealth / capital of the wealthiest investors.
- Economic cycles follow a Schumpeterian 'creative destruction' pattern in which the maxima are cusp-shaped while the minima are smooth. In between the cusps, the cycle consists of the sum of two 'crossing exponentials': one decaying and the other increasing.
This unification within the same theoretical framework of short-term market fluctuations and long-term economic cycles offers the perspective of a genuine conceptual synthesis between micro- and macroeconomics. Joining another giant of contemporary science, Phil Anderson, Aoki emphasized the role of rare, large fluctuations in the emergence of macroeconomic phenomena out of microscopic interactions, and in particular their non-self-averaging character, in the language of statistical physics. In this light, we present a simple stochastic multi-sector growth model.
Comment: 42 pages, 6 figures
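The "two crossing exponentials" shape of the cycle between cusps can be written as C(t) = A e^(-t/tau_d) + B e^(t/tau_g) and checked numerically: the minimum falls smoothly inside the inter-cusp interval while the values at the cusps are the local maxima. The functional form follows the abstract; the parameter values below are illustrative, not fitted to any data.

```python
import math

# Sum of a decaying and a growing exponential over one inter-cusp interval.
# A, B and the two time constants are illustrative assumptions.
def cycle(t, A=1.0, tau_decay=2.0, B=0.05, tau_grow=3.0):
    return A * math.exp(-t / tau_decay) + B * math.exp(t / tau_grow)

ts = [i * 0.1 for i in range(101)]   # one interval between cusps, t in [0, 10]
vals = [cycle(t) for t in ts]
t_min = ts[vals.index(min(vals))]

# The trough lies strictly inside the interval (a smooth minimum), while the
# endpoints, where the decaying branch of the next cycle takes over abruptly,
# form the cusp-shaped maxima.
print(round(t_min, 1), round(min(vals), 3))
```

Analytically the smooth minimum sits where the two slopes cancel, at t = ln(A·tau_grow / (B·tau_decay)) / (1/tau_decay + 1/tau_grow), which matches the grid search above.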