
    Supporting fine-grained generative model-driven evolution

    In the standard generative Model-Driven Architecture (MDA), adapting the models of an existing system requires re-generating and restarting that system, owing to a strong separation between the modeling environment and the runtime environment. Some current approaches remove this separation, allowing a system to change smoothly when its model changes; these approaches, however, are based on interpretation of modeling information rather than on generation as in MDA. This paper describes an architecture that supports fine-grained evolution combined with generative model-driven development: fine-grained changes are applied in a generative model-driven way to a system that has itself been developed in this way. To achieve this, model changes must be propagated correctly to the impacted elements. The impact of a model change flows along three dimensions: implementation, data (instances), and modeled dependencies. These three dimensions are explicitly represented in an integrated modeling-runtime environment to enable traceability, which implies a fundamental rethinking of MDA.
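
    A minimal sketch of the three-dimensional impact propagation described above; all names (`ModelElement`, `propagate_change`, the artifact and instance labels) are hypothetical illustrations, not the paper's API:

```python
# Hypothetical sketch: propagating a model change along the three impact
# dimensions named in the abstract: generated implementation artifacts,
# live data instances, and modeled dependencies.
from dataclasses import dataclass, field

@dataclass
class ModelElement:
    name: str
    implementation_artifacts: list = field(default_factory=list)  # generated code units
    instances: list = field(default_factory=list)                 # runtime data conforming to this element
    dependents: list = field(default_factory=list)                # elements that reference this one

def propagate_change(element, seen=None):
    """Walk all three impact dimensions and report what must be updated."""
    seen = seen or set()
    if element.name in seen:
        return
    seen.add(element.name)
    for artifact in element.implementation_artifacts:
        print(f"re-generate {artifact} for {element.name}")
    for instance in element.instances:
        print(f"migrate instance {instance} of {element.name}")
    for dep in element.dependents:
        propagate_change(dep, seen)  # modeled dependencies carry the impact onward

# Changing Customer impacts its generated class, its live instances,
# and the Order element that depends on it.
order = ModelElement("Order", ["order.py"], ["order#1"])
customer = ModelElement("Customer", ["customer.py"], ["cust#7"], [order])
propagate_change(customer)
```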

    A modelling approach for evaluating impacts of hydropeaking in a sub-arctic river

    The release of pulses of water to increase hydroelectric power production at hydropower dams to meet daily peaks in electricity demand is called hydropeaking. Because energy supply and demand fluctuate, the energy markets direct hydropower companies to balance load fluctuations through variations in power generation, which results in flow regulation. More recently, this regulation has been carried out at shorter time intervals, i.e., at intra-daily and intra-hourly levels. Hydropeaking intensifies drastically at these shorter intervals, severely impacting the riverine and riparian ecosystem. Short-term hydropeaking has social, economic, and ecological impacts, and the recreational services offered by the river are affected as well. This research develops a novel methodology for assessing these impacts on a strongly regulated sub-arctic river in Finland, the Ossauskoski-Tervola reach of the Kemijoki River. The methodology combines assessment of seasonal variations in sub-daily hydropeaking, two-dimensional hydrodynamic modelling, and a high-resolution land cover map developed through supervised land use classification with a machine learning algorithm. The results include the identification of a zone of influence of hydropeaking at sub-daily levels during each season, the total and class-wise area affected during each peaking event, and vulnerability zonation for water-based recreation in the river reach. The overall area of the reach affected by peaking was 1.05 km² in winter, 0.96 km² in spring, 1.39 km² in summer, and 0.66 km² in autumn. A vulnerability mapping was also carried out for the suitability of water-based recreation in the study reach. The novel methodology developed in this research, which delineates the zone vulnerable to hydropeaking, can serve as the first step in detailed impact assessment studies, such as those on fish habitat and sediment transport processes in the river. The hydropeaking-influenced zone can be used to set thresholds for ecological flows and ramping rates downstream of power stations, and it opens avenues for future research, development, and policy work on riparian ecosystem impact assessment and mitigation.
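
    One plausible way to derive such a zone of influence from hydrodynamic-model output is to compare water-depth rasters at base and peak flow; the sketch below uses synthetic data and hypothetical thresholds (cell size, wetting depth), not the study's actual model:

```python
# Hypothetical sketch: cells dry at base flow but wetted at peak flow form
# the hydropeaking zone of influence; area = cell count x cell area.
import numpy as np

CELL_AREA_M2 = 4.0      # assumed 2 m x 2 m hydrodynamic-model grid
WET_THRESHOLD_M = 0.05  # assumed minimum depth for a cell to count as wetted

rng = np.random.default_rng(0)
depth_base = rng.uniform(0.0, 0.5, size=(200, 300))  # stand-in for base-flow depths
depth_peak = depth_base + rng.uniform(0.0, 0.4, size=depth_base.shape)

zone_of_influence = (depth_base < WET_THRESHOLD_M) & (depth_peak >= WET_THRESHOLD_M)
affected_km2 = zone_of_influence.sum() * CELL_AREA_M2 / 1e6
print(f"area affected by this peaking event: {affected_km2:.2f} km^2")
```

    Intersecting such a mask with a land cover classification would yield the class-wise affected areas reported above.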

    A Framework to Build a Big Data Ecosystem Oriented to the Collaborative Networked Organization

    A Collaborative Networked Organization (CNO) is a set of entities that operate in heterogeneous contexts and collaborate to take advantage of a business opportunity or to solve a problem. Big data allows CNOs to be more competitive by improving their strategy, management, and business processes. To support the development of big data ecosystems in CNOs, several frameworks have been reported in the literature. However, these frameworks limit their application to a specific CNO manifestation and cannot conduct intelligent processing of big data to support decision making in the CNO. This paper makes two main contributions: (1) it proposes a metaframework for analyzing existing and future frameworks for the development of big data ecosystems in CNOs, and (2) it presents the Collaborative Networked Organizations–big data (CNO-BD) framework, which includes guidelines, tools, techniques, conceptual solutions, and good practices for building a big data ecosystem in different kinds of Collaborative Networked Organizations, overcoming the weaknesses of previous frameworks. The CNO-BD framework consists of seven dimensions: levels, approaches, data fusion, interoperability, data sources, big data assurance, and programmable modules. The framework was validated through expert assessment and a case study.
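
    In the spirit of the metaframework's gap analysis, a candidate framework could be checked against the seven CNO-BD dimensions; this sketch is purely illustrative, with hypothetical field values:

```python
# Hypothetical sketch: record how a candidate framework covers the seven
# CNO-BD dimensions and report the gaps.
CNO_BD_DIMENSIONS = (
    "levels", "approaches", "data fusion", "interoperability",
    "data sources", "big data assurance", "programmable modules",
)

def coverage_gaps(framework: dict) -> list:
    """Return the CNO-BD dimensions a framework leaves unaddressed."""
    return [d for d in CNO_BD_DIMENSIONS if not framework.get(d)]

candidate = {
    "levels": ["strategic", "operational"],
    "data sources": ["ERP", "IoT sensors"],
    "interoperability": ["REST APIs"],
}
print("missing dimensions:", coverage_gaps(candidate))
```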

    Observing a Moving Agent

    We address the problem of observing a moving agent. In particular, we propose a system for observing a manipulation process in which a robot hand manipulates an object. A discrete event dynamic systems (DEDS) framework is developed for the hand/object interaction over time, and a stabilizing observer is constructed. Low-level modules are developed for recognizing the events that cause state transitions within the dynamic manipulation system. The work examines closely the possibilities for errors, mistakes, and uncertainties in the manipulation system, the observer construction process, and the event identification mechanisms. The system utilizes different tracking techniques in order to observe the task in an active, adaptive, and goal-directed manner.
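
    The core idea of such an observer can be sketched as an automaton whose state estimate is refined by each recognized event; the states and events below are hypothetical stand-ins, not the paper's actual model:

```python
# Hypothetical sketch: a DEDS for a hand/object manipulation task, with an
# observer that tracks the set of states consistent with the events seen so
# far. A stabilizing observer converges to a singleton estimate.
TRANSITIONS = {
    ("idle", "hand_approaches"): "approach",
    ("approach", "contact_detected"): "grasp",
    ("grasp", "object_moves"): "manipulate",
    ("manipulate", "hand_departs"): "idle",
}

def observe(event_stream, initial_states=("idle", "approach")):
    """Refine the state estimate after each recognized low-level event."""
    estimate = set(initial_states)  # uncertainty about the starting state
    for event in event_stream:
        estimate = {TRANSITIONS[(s, event)]
                    for s in estimate if (s, event) in TRANSITIONS}
        print(f"after {event!r}: state estimate = {sorted(estimate)}")
    return estimate

observe(["hand_approaches", "contact_detected", "object_moves"])
```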

    Development Of A Methodology To Enhance Design For Assembly

    Assembly is the activity in which design, engineering, manufacturing, and logistics are brought together to create an object that performs a function. Design for Assembly (DFA) is an approach that addresses product structure simplification, where the total number of parts in a product is a key indicator of product assembly quality. The objective of this research is to develop a methodology that helps the designer improve product design and development. Two main concepts are central to the development of the DFA method: first, how to enhance DFA, and second, how to develop an evaluation system for it. The enhancement of DFA is characterized by a robust procedure to improve product design, a reliable method to solve problems in design-for-assembly activities, and the ability to generate more ideas in product design. An integrated framework and a software implementation are developed to realize the research objective. The integrated approach, called Enhanced Design for Assembly (EDFA), combines principles adopted from the Hitachi Assemblability Evaluation Method (AEM), a virtual manufacturing system, and assembly principles and guidelines. Case studies carried out using the EDFA approach show that it can minimize assembly complexity, reduce the overall assembly cost, and reduce the number of parts and components in product development and improvement. The EDFA software is easy to use and supports the designer's work in product design development and improvement. A comparison of the EDFA method with Boothroyd-Dewhurst shows its capability of producing good results.
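
    For context, the Boothroyd-Dewhurst method that EDFA is compared against scores a design by its assembly efficiency: the theoretical minimum part count times an ideal per-part assembly time (3 s), divided by the estimated total assembly time. A minimal sketch with illustrative part data:

```python
# Sketch of the classic Boothroyd-Dewhurst design-efficiency measure.
# The example numbers are illustrative, not from the paper's case studies.
IDEAL_TIME_PER_PART_S = 3.0  # ideal assembly time assumed per part

def dfa_efficiency(theoretical_min_parts: int, total_assembly_time_s: float) -> float:
    """Design efficiency: higher is better, 1.0 is the theoretical ideal."""
    return (theoretical_min_parts * IDEAL_TIME_PER_PART_S) / total_assembly_time_s

# A 12-part design that could theoretically be built from 5 parts
# and takes 95 s to assemble.
before = dfa_efficiency(theoretical_min_parts=5, total_assembly_time_s=95.0)
print(f"design efficiency before redesign: {before:.0%}")
```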

    Normal and abnormal tissue identification system and method for medical images such as digital mammograms

    A system and method for analyzing a medical image to determine whether an abnormality is present, for example in digital mammograms, includes the application of a wavelet expansion to a raw image to obtain subspace images of varying resolution. At least one subspace image is selected that has a resolution commensurate with a desired predetermined detection resolution range. A functional form of a probability distribution function is determined for each selected subspace image, and an optimal statistical normal image region test is determined for each selected subspace image. A threshold level for the probability distribution function is established from the optimal statistical normal image region test for each selected subspace image. A region size comprising at least one sector is defined, and an output image is created that includes a combination of all regions for each selected subspace image. Each region has a first value when the region intensity level is above the threshold and a second value when it is below the threshold. This permits the localization of a potential abnormality within the image.
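
    A minimal sketch of this pipeline's shape, using a synthetic image, a standard 2-D wavelet decomposition (PyWavelets), and a simple hypothetical thresholding rule in place of the patent's optimal statistical test:

```python
# Hypothetical sketch: wavelet expansion into subspace images, selection of
# one subspace at a target resolution, and thresholding to flag regions.
import numpy as np
import pywt

rng = np.random.default_rng(1)
image = rng.normal(0.0, 1.0, size=(256, 256))
image[100:110, 140:150] += 4.0  # planted "abnormality"

# Wavelet expansion: approximation plus detail subspaces at 3 resolutions.
coeffs = pywt.wavedec2(image, "db2", level=3)

# Select the coarsest detail subspace (resolution chosen for ~10-pixel
# targets) and threshold at 3 sigma of its coefficient distribution;
# the patent instead derives the threshold from a fitted PDF.
cH, cV, cD = coeffs[1]
detail = np.abs(cH) + np.abs(cV) + np.abs(cD)
threshold = detail.mean() + 3.0 * detail.std()
suspicious = detail > threshold  # binary output: first value above, second below
print("flagged coefficients:", int(suspicious.sum()))
```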

    Geometric Algorithms for Sampling the Flux Space of Metabolic Networks

    Systems biology is a fundamental field and paradigm that introduces a new era in biology. Its usefulness hinges on metabolic networks, which model the reactions occurring inside an organism and provide the means to understand the underlying mechanisms that govern biological systems. Metabolic networks also have a broader impact, ranging from the resolution of ecosystems to personalized medicine. Their analysis is a computational-geometry-oriented task, as one of the main operations it depends on is sampling points uniformly from polytopes; the polytope provides a representation of the steady states of the metabolic network. However, the polytopes that result from biological data are of very high dimension (on the order of thousands) and, in most if not all cases, considerably skinny. Therefore, to perform uniform random sampling efficiently in this setting, we need a novel algorithmic and computational framework specially tailored to the properties of metabolic networks. We present a complete software framework to handle sampling in metabolic networks. Its backbone is a Multiphase Monte Carlo Sampling (MMCS) algorithm that unifies rounding and sampling in one pass, obtaining both upon termination. It exploits an improved variant of the Billiard Walk that enjoys faster arithmetic complexity per step. We demonstrate the efficiency of our approach through extensive experiments on various metabolic networks. Notably, sampling the most complicated human metabolic network accessible today, Recon3D, which corresponds to a polytope of dimension 5335, took less than 30 hours; to our knowledge, that is out of reach for existing software.
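
    To make the core operation concrete, here is the simple hit-and-run walk on a polytope {x : Ax ≤ b}, shown on a toy 2-D example; the MMCS pipeline above uses a faster Billiard Walk variant plus rounding, but the idea of walking along feasible chords is the same:

```python
# Sketch: uniform polytope sampling with the hit-and-run walk (not the
# paper's Billiard Walk; a simpler walk on the same representation Ax <= b).
import numpy as np

def hit_and_run(A, b, x0, n_samples, rng):
    """Sample points from the polytope Ax <= b, starting at interior x0."""
    x, samples = x0.astype(float), []
    for _ in range(n_samples):
        d = rng.normal(size=x.size)
        d /= np.linalg.norm(d)                 # random direction
        Ad, slack = A @ d, b - A @ x           # x + t*d feasible iff t*Ad <= slack
        upper = np.min(slack[Ad > 0] / Ad[Ad > 0])
        lower = np.max(slack[Ad < 0] / Ad[Ad < 0])
        x = x + rng.uniform(lower, upper) * d  # jump uniformly along the chord
        samples.append(x.copy())
    return np.array(samples)

# Unit square as {x : Ax <= b}.
A = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]], dtype=float)
b = np.array([1, 0, 1, 0], dtype=float)
pts = hit_and_run(A, b, np.array([0.5, 0.5]), 1000, np.random.default_rng(2))
print("sample mean (should approach [0.5, 0.5]):", pts.mean(axis=0))
```

    The "skinny" high-dimensional polytopes mentioned above are exactly why rounding is unified with sampling: on an ill-conditioned body, chords in most directions are very short and an unrounded walk mixes slowly.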

    A Calculus for Variational Programming

    Variation is ubiquitous in software. Many applications can benefit from making this variation explicit and then manipulating and computing with it directly, a technique we call "variational programming". This idea has been independently discovered in several application domains, such as efficiently analyzing and verifying software product lines, combining bounded and symbolic model checking, and computing with alternative privacy profiles. Although these domains share similar core problems, and there are many similarities in the solutions, there is no dedicated programming-language support for variational programming. This makes the various implementations tedious, error-prone, hard to maintain and reuse, and difficult to compare. In this paper we present a calculus that forms the basis of a programming language with explicit support for representing, manipulating, and computing with variation in programs and data. We illustrate how such a language can simplify the implementation of variational programming tasks. We present the syntax and semantics of the core calculus, a sound type system, and a type inference algorithm that produces principal types.
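
    The flavor of variational programming can be sketched in a host language rather than the paper's calculus: values carry named binary choices, computations map over all variants at once, and a selection resolves one dimension. All names below (`Chc`, `vmap`, `select`) are hypothetical:

```python
# Hypothetical sketch of variational values with named choices.
from dataclasses import dataclass

@dataclass
class Chc:
    dim: str      # dimension name, e.g. "A"
    left: object  # variant chosen by selecting dim left
    right: object # variant chosen by selecting dim right

def vmap(f, v):
    """Apply f to every plain variant, preserving the variation structure."""
    if isinstance(v, Chc):
        return Chc(v.dim, vmap(f, v.left), vmap(f, v.right))
    return f(v)

def select(v, dim, pick_left):
    """Resolve one dimension, keeping the remaining variation intact."""
    if isinstance(v, Chc):
        if v.dim == dim:
            return select(v.left if pick_left else v.right, dim, pick_left)
        return Chc(v.dim, select(v.left, dim, pick_left),
                   select(v.right, dim, pick_left))
    return v

# A variational integer: 1 under A-left; under A-right, either 2 or 3 (dim B).
x = Chc("A", 1, Chc("B", 2, 3))
doubled = vmap(lambda n: n * 2, x)         # computes on all variants at once
print(select(doubled, "A", pick_left=False))  # Chc(dim='B', left=4, right=6)
```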