
    A Model-based Approach for Designing Cyber-Physical Production Systems

    The most recent development trend related to manufacturing is called "Industry 4.0". It proposes a transition from "blind" mechatronic systems to Cyber-Physical Production Systems (CPPSs). Such systems are capable of communicating with each other, acquiring and transmitting real-time production data. Their management and control require a structured software architecture, typically referred to as the "Automation Pyramid". The design of both the software architecture and the components (i.e., the CPPSs) is a complex task, where the complexity is induced by the heterogeneity of the required functionalities. In this context, the goal of this thesis is to propose a model-based framework for the analysis and design of production lines compliant with the Industry 4.0 paradigm. In particular, the framework exploits the Systems Modeling Language (SysML) as a unified representation for the different viewpoints of a manufacturing system. At the component level, the structural and behavioral diagrams provided by SysML are used to produce a set of logical propositions about the system and components under design. This approach is specifically tailored towards constructing Assume-Guarantee contracts. By exploiting reactive synthesis techniques, the contracts are used to prototype portions of the components' behaviors and to verify whether implementations are consistent with the requirements. At the software level, the framework proposes a particular architecture based on the concept of "service". Such an architecture facilitates the reconfiguration of components and integrates an advanced scheduling technique that takes advantage of the SysML model of the production recipe. The proposed framework was developed in conjunction with the construction of the ICE Laboratory, a research facility consisting of a full-fledged production line. The approach has been used to construct models of the laboratory, to virtually prototype parts of the system, and to manage the physical system through the proposed software architecture.
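    To make the Assume-Guarantee idea concrete, here is a minimal sketch, assuming contracts are reduced to boolean predicates over a component's observed state. The class, the conveyor example, and all thresholds are illustrative inventions, not the thesis's SysML-derived contracts; the thesis employs reactive synthesis rather than the runtime check shown here.

```python
# Minimal sketch of an Assume-Guarantee contract, assuming predicates are
# plain Python callables over a component-state dict. Names and numbers are
# illustrative only.
from dataclasses import dataclass
from typing import Callable, Dict

State = Dict[str, float]

@dataclass
class Contract:
    assume: Callable[[State], bool]     # what the component expects of its environment
    guarantee: Callable[[State], bool]  # what it promises whenever the assumption holds

    def satisfied_by(self, state: State) -> bool:
        # A contract is vacuously satisfied when the assumption fails.
        return (not self.assume(state)) or self.guarantee(state)

# Hypothetical example: a conveyor guarantees bounded belt speed
# as long as its supply voltage stays nominal.
conveyor = Contract(
    assume=lambda s: 22.0 <= s["voltage"] <= 26.0,
    guarantee=lambda s: s["belt_speed"] <= 1.5,
)

print(conveyor.satisfied_by({"voltage": 24.0, "belt_speed": 1.2}))  # True
print(conveyor.satisfied_by({"voltage": 24.0, "belt_speed": 2.0}))  # False
print(conveyor.satisfied_by({"voltage": 10.0, "belt_speed": 2.0}))  # True (assumption violated)
```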

    Prediction-based techniques for the optimization of mobile networks

    International Mention in the doctoral degree.
    Mobile cellular networks are complex systems whose behavior is characterized by the superposition of several random phenomena, most of which are related to human activities such as mobility, communications, and network usage. However, when observed in their totality, the many individual components merge into more deterministic patterns, and trends start to become identifiable and predictable. In this thesis we analyze a recent branch of network optimization, commonly referred to as anticipatory networking, which entails the combination of prediction solutions and network optimization schemes. The main intuition behind anticipatory networking is that knowing in advance what is going on in the network can help in understanding potentially severe problems and in mitigating their impact by applying solutions while the problems are still in their initial stages. Conversely, network forecasts might also indicate a future improvement in the overall network condition (e.g., a load reduction or better signal quality reported by users). In such a case, resources can be assigned more sparingly, requiring users to rely on buffered information until the improved conditions arrive and it becomes more convenient to grant more resources. At the beginning of this thesis we survey the current anticipatory networking panorama and the many prediction and optimization solutions proposed so far. In the main body of the work, we propose our novel solutions to the problem, along with the tools and methodologies we designed to evaluate them and to perform a real-world evaluation of our schemes. By the end of this work it will be clear that anticipatory networking is not only a very promising theoretical framework, but also feasible in practice, capable of delivering substantial benefits to current and next-generation mobile networks. In fact, with both our theoretical and practical results we show evidence that more than one third of the resources can be saved, and that even larger gains can be achieved for data-rate enhancements.
    Programa Oficial de Doctorado en Ingeniería Telemática. Committee: President: Albert Banchs Roca; President: Pablo Serrano Yañez-Mingot; Secretary: Jorge Ortín Gracia; Member: Guevara Noubi
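    As a toy illustration of the anticipatory principle described above, the sketch below defers part of a resource grant when a naive moving-average forecast predicts improving channel quality and the user's playout buffer can bridge the gap. The predictor, thresholds, and function names are all assumptions made for illustration, not the thesis's actual schemes.

```python
# Toy sketch of anticipatory resource allocation: a moving-average forecast
# of channel quality decides whether to grant resource blocks now or to let
# the user drain its buffer until conditions improve. All thresholds and
# names are illustrative.
from collections import deque

def forecast(history: deque) -> float:
    """Naive predictor: mean of the recent channel-quality samples."""
    return sum(history) / len(history)

def grant_blocks(current_quality: float, history: deque,
                 buffer_seconds: float, demand_blocks: int) -> int:
    predicted = forecast(history)
    if predicted > current_quality and buffer_seconds > 2.0:
        # Conditions are expected to improve and the playout buffer can
        # bridge the gap: grant sparingly now, spend resources later when
        # each block carries more bits.
        return demand_blocks // 2
    return demand_blocks

history = deque([0.4, 0.5, 0.7, 0.8], maxlen=8)   # rising signal quality
print(grant_blocks(current_quality=0.5, history=history,
                   buffer_seconds=5.0, demand_blocks=10))  # 5 (defer half)
```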

    Cross-Layer Optimization of Message Broadcast in MANETs


    A Blueprint for Knowledge Management in the Biopharmaceutical Sector

    This research examined current industry Knowledge Management (KM) methodologies and capabilities in order to gain insights into the level of maturity and understanding of KM within the biopharmaceutical sector. In addition, the researcher has developed models, tools, and processes that can assist the sector in gaining greater clarity about the value and merits that KM can offer to organizations. The researcher proposes that a systematic KM program can be used to "unlock" the knowledge and organizational capabilities necessary to convey real competitive advantage and, more importantly for the patient, to enable organizations to successfully develop and deliver the next generation of advanced therapeutics. The research questions asked: What are the current levels of adoption of KM within the biopharmaceutical sector? How is 'critical knowledge' defined within organizations? What might represent the core elements of a Pharma KM Blueprint to better enable knowledge flow within organizations? The research approach adopted a pragmatic worldview, which is most suited to a research topic that is both real-world practice oriented and problem-centered, and sought to examine the consequences of actions within the biopharmaceutical sector when knowledge is not managed effectively. There were three primary phases of inquiry employed in the thesis, and a mixed-methods approach was used to explore the problems addressed. The first phase involved quantitative and qualitative analysis of relevant literature sources, including available international KM benchmarking data. The second phase involved a biopharmaceutical industry consultation comprising focus groups, polls, and philosophical dialogues with KM experts, sector KM practitioners, and knowledge workers. The third and final phase of inquiry involved the adaptation and development of the Pharma KM Blueprint, including practical KM tools, frameworks, and models for use within the biopharmaceutical sector. This phase also included a detailed case study, executed within one large biopharmaceutical organization, of a KM diagnostic tool and process developed as part of this research. The research findings established a core principle that knowledge must be valued and managed as a critical asset within an organization, in the same manner as physical assets. In addition, the research identified that in order to realize the ambition of ICH Q10, stated as to 'enhance the quality and availability of medicines around the world in the interest of public health' (ICH Q10, 2008), there is a crucial need to enhance the effective and efficient flow of knowledge across the product lifecycle within organizations. The research finds that in order to extract value from this organizational knowledge, there must be practical, integrated, and systematic KM approaches implemented for the identification, capture, curation, and visibility of the critical knowledge assets before the matter of enhancing the flow of knowledge can be addressed. The research indicates that while these concepts are important to any business within the traditional biopharmaceutical sector planning on remaining competitive, they represent a "game changer" (or "game over") opportunity for any organization planning to develop, manufacture, or market advanced therapeutic products, personalized medicines, or next-generation products.
    A key output of the research is the Pharma KM Blueprint, which illustrates the holistic integration of core KM principles, models, and tools to deliver real benefits to patients and the business.

    Functional Ontologies and Their Application to Hydrologic Modeling: Development of an Integrated Semantic and Procedural Knowledge Model and Reasoning Engine

    This dissertation presents the research and development of new concepts and techniques for modeling the knowledge about the many concepts that we as hydrologists must understand, so that we can execute models that operate in terms of conceptual abstractions and have those abstractions translate to the data, tools, and models we use every day. This hydrologic knowledge includes conceptual (i.e., semantic) knowledge, such as the hydrologic cycle concepts and relationships, as well as functional (i.e., procedural) knowledge, such as how to compute the area of a watershed polygon, the average basin slope, or the topographic wetness index. This dissertation is presented as three papers and a reference manual for the software created. Because hydrologic knowledge includes both semantic and procedural aspects, we have developed, in the first paper, a new form of reasoning engine and knowledge base that extends the general-purpose analysis and problem-solving capability of reasoning engines by incorporating procedural knowledge, represented as computer source code, into the knowledge base. The reasoning engine is able to compile the code and then, if need be, execute the procedural code as part of a query. The potential advantage of this approach is that it simplifies the description of procedural knowledge in a form that can be readily utilized by the reasoning engine to answer a query. Further, since the procedural knowledge is represented as source code, it has the full capabilities of the underlying language. We use the term functional ontology to refer to the new semantic and procedural knowledge models. The first paper applies the new knowledge model to describing and analyzing polygons. The second and third papers address the application of the new functional-ontology reasoning engine and knowledge model to hydrologic applications. The second paper models concepts and procedures, including running external software, related to watershed delineation. The third paper models a project scenario that involves integrating several models. A key advance demonstrated in this paper is the use of functional ontologies to apply metamodeling concepts in a manner that both abstracts and fully utilizes computational models and data sets as part of the project modeling process.
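    A minimal sketch may make the functional-ontology idea concrete: the knowledge base below stores a fact alongside procedural knowledge kept as source code, which a toy "reasoner" compiles and executes to answer a query. The shoelace-area procedure echoes the dissertation's polygon example, but the data layout and query API are hypothetical simplifications.

```python
# Minimal sketch of a "functional ontology": the knowledge base holds both
# facts and procedural knowledge as source code, compiled and executed while
# answering a query. Layout and API are illustrative.
kb = {
    "facts": {"watershed_outline": [(0, 0), (4, 0), (4, 3), (0, 3)]},
    "procedures": {
        "area": """
def area(polygon):
    # Shoelace formula over the polygon's vertices.
    n = len(polygon)
    s = sum(polygon[i][0] * polygon[(i + 1) % n][1]
            - polygon[(i + 1) % n][0] * polygon[i][1] for i in range(n))
    return abs(s) / 2.0
""",
    },
}

def query(kb, procedure_name, fact_name):
    # Compile the stored source code on demand, then apply it to the fact.
    namespace = {}
    exec(compile(kb["procedures"][procedure_name], "<kb>", "exec"), namespace)
    return namespace[procedure_name](kb["facts"][fact_name])

print(query(kb, "area", "watershed_outline"))  # 12.0
```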

    Knowledge composition methodology for effective analysis problem formulation in simulation-based design

    In simulation-based design, a key challenge is to formulate and solve analysis problems efficiently in order to evaluate a large variety of design alternatives. The solution of analysis problems has benefited from advancements in commercial off-the-shelf math solvers and computational capabilities. However, the formulation of analysis problems is often a costly and laborious process. Traditional simulation templates used for representing analysis problems are typically brittle with respect to variations in artifact topology and the idealization decisions taken by analysts. These templates often require manual updates and "re-wiring" of the analysis knowledge embodied in them. This makes traditional simulation templates ineffective for multi-disciplinary design and optimization problems. Based on these issues, this dissertation defines a special class of problems, known as variable topology multi-body (VTMB) problems, that characterizes the types of variations seen in design-analysis interoperability. This research thus primarily answers the following question: How can we improve the effectiveness of the analysis problem formulation process for VTMB problems? The knowledge composition methodology (KCM) presented in this dissertation answers this question by addressing the following research gaps: (1) the lack of formalization of the knowledge used by analysts in formulating simulation templates, and (2) the inability to leverage this knowledge to define model composition methods for formulating simulation templates. KCM overcomes these gaps by providing: (1) a formal representation of analysis knowledge as modular, reusable, analyst-intelligible building blocks; (2) graph-transformation-based methods to automatically compose simulation templates from these building blocks based on analyst idealization decisions; and (3) meta-models for representing advanced simulation templates, VTMB design models, analysis models, and the idealization relationships between them. Applications of KCM to the thermo-mechanical analysis of multi-stratum printed wiring boards and multi-component chip packages demonstrate its effectiveness in handling VTMB and idealization variations with significantly enhanced formulation efficiency (from several hours with existing methods to a few minutes). In addition to enhancing the effectiveness of analysis problem formulation, KCM is envisioned to provide a foundational approach to model formulation for generalized variable topology problems.
    Ph.D. Committee Co-Chair: Dr. Christiaan J. J. Paredis; Committee Co-Chair: Dr. Russell S. Peak; Committee Member: Dr. Charles Eastman; Committee Member: Dr. David McDowell; Committee Member: Dr. David Rosen; Committee Member: Dr. Steven J. Fenve
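    The following is a hedged sketch of the composition idea, assuming building blocks expose required and provided ports that a simple wiring rule connects. The block names and the port-matching rule are invented for illustration and merely stand in for KCM's richer graph transformations over analysis meta-models.

```python
# Sketch of composing a simulation template from modular building blocks.
# Blocks and the port-matching rule are illustrative inventions.
from dataclasses import dataclass, field

@dataclass
class Block:
    name: str
    provides: set = field(default_factory=set)  # output ports (computed quantities)
    requires: set = field(default_factory=set)  # input ports

def compose(blocks):
    """Wire blocks together: connect every required port to some provider."""
    edges = []
    for consumer in blocks:
        for port in consumer.requires:
            provider = next((b for b in blocks if port in b.provides), None)
            if provider is None:
                raise ValueError(f"unresolved port {port!r} on {consumer.name}")
            edges.append((provider.name, consumer.name, port))
    return edges

# Toy thermo-mechanical pipeline in the spirit of the PWB application.
template = compose([
    Block("geometry",   provides={"mesh"}),
    Block("material",   provides={"conductivity"}),
    Block("thermal_fe", requires={"mesh", "conductivity"}, provides={"temperature"}),
    Block("stress_fe",  requires={"mesh", "temperature"}),
])
for edge in template:
    print(edge)
```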

    Industrialising Software Development in Systems Integration

    Compared to other disciplines, software engineering as of today still depends on the craftsmanship of highly skilled workers. However, with constantly increasing complexity and effort, existing software engineering approaches appear more and more inefficient, and a paradigm shift towards industrial production methods seems inevitable. Recent advances in academia and practice have led to the availability of key industrial principles in software development as well: specialization is represented in software product lines, standardization and systematic reuse are available with component-based development, and automation has become accessible through model-driven engineering. While each of the above is well researched in theory, only few cases of successful implementation in industry are known. This becomes even more evident in specialized areas of software engineering such as systems integration. Today's IT systems need to adapt quickly to new business requirements arising from mergers, acquisitions, and cooperation between enterprises. This inevitably leads to integration efforts, i.e., joining different subsystems into a cohesive whole in order to provide new functionality. In such an environment, the application of industrial methods for software development seems even more important. Unfortunately, software development in this field is a highly complex and heterogeneous undertaking, as IT environments differ from customer to customer. In such settings, existing industrialization concepts would never break even, due to one-time projects and thus insufficient economies of scale and scope. This thesis therefore describes a novel approach for a more efficient implementation of the above key principles that takes into account the characteristics of software development for systems integration. After identifying the characteristics of the field and their effects on currently known industrialization concepts, an organizational model for industrialized systems integration has been developed. It takes software product lines and adapts them in a way feasible for a systems integrator active in several business domains. The result is a three-tiered model that consolidates recurring activities and reduces the effort for individual product lines. For the implementation of component-based development, the thesis assesses current component approaches and applies an integration metamodel to the most suitable one. This ensures a common understanding of systems integration across different product lines and thus facilitates component reuse, even across product line boundaries. The approach is furthermore aligned with the organizational model to show in which way component-based development may be applied in industrialized systems integration. Automating software development in systems integration with model-driven engineering was found to be insufficient in its current state, owing to insufficient tool chains and a lack of modelling standards. As an alternative, an XML-based configuration of products within a software product line has been developed. It models a product line and its products with the help of a domain-specific language and utilizes stylesheet transformations to generate compilable artefacts. The approach has been tested for its feasibility in an exemplary implementation following a real-world scenario.
    As not all aspects of industrialized systems integration could be simulated in a laboratory environment, the concept was furthermore validated in several expert interviews with industry representatives, which also made it possible to assess cultural and economic aspects. The thesis concludes with a detailed summary of the contributions to the field and suggests further areas of research in the context of industrialized systems integration.
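    As a rough illustration of the XML-based configuration idea, the sketch below describes a product's selected features in a tiny invented DSL and applies an XSLT stylesheet to derive a build artefact. The element names are made up, and the example relies on the third-party lxml package for the XSLT step.

```python
# Sketch of XML-based product configuration: a small DSL describes a product
# within a product line, and an XSLT stylesheet transforms it into an
# artefact. Element names are invented; requires the lxml package.
from lxml import etree

product = etree.XML("""
<product name="billing-integration">
  <feature>message-routing</feature>
  <feature>audit-log</feature>
</product>""")

stylesheet = etree.XSLT(etree.XML("""
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/product">
    <components>
      <!-- One component per selected feature. -->
      <xsl:for-each select="feature">
        <component id="{.}"/>
      </xsl:for-each>
    </components>
  </xsl:template>
</xsl:stylesheet>"""))

print(etree.tostring(stylesheet(product), pretty_print=True).decode())
```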

    Model-Driven Development of Interactive Multimedia Applications

    The development of highly interactive multimedia applications is still a challenging and complex task. In addition to the application logic, multimedia applications typically provide a sophisticated user interface with integrated media objects. As a consequence, the development process involves different experts for software design, user interface design, and media design. There is still a lack of concepts for a systematic development process that integrates these aspects. This thesis provides a model-driven development approach addressing this problem. To this end, it introduces the Multimedia Modeling Language (MML), a visual modeling language supporting a design phase in multimedia application development. The language is oriented towards well-established software engineering concepts, like UML 2, and integrates concepts from the areas of multimedia development and model-based user interface development. MML allows the generation of code skeletons from the models. The core idea is to generate code skeletons that can be directly processed in multimedia authoring tools. In this way, the strengths of both are combined: authoring tools are used to perform the creative development tasks, while models are used to design the overall application structure and to enable a well-coordinated development process. This is demonstrated using the professional authoring tool Adobe Flash. MML is supported by modeling and code generation tools which have been used to validate the approach over several years in various student projects and teaching courses. Additional prototypes have been developed to demonstrate, for example, the ability to generate code for different target platforms. Finally, it is discussed how models can contribute in general to a better integration of well-structured software development and creative visual design.
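    A toy example of the code-skeleton idea, assuming a drastically simplified model representation: the generator below emits an ActionScript-style skeleton whose media placeholders a designer would then bind inside the authoring tool. The model dict and output format are illustrative, not MML's actual notation.

```python
# Toy model-to-code generator: produce a skeleton per scene so the creative
# parts can be completed in an authoring tool. Format is illustrative.
model = {
    "scenes": [
        {"name": "Intro", "media": ["logoAnimation"]},
        {"name": "Quiz",  "media": ["questionAudio"]},
    ]
}

def generate_skeleton(model) -> str:
    lines = []
    for scene in model["scenes"]:
        lines.append(f"class {scene['name']}Scene {{")
        for media in scene["media"]:
            # Placeholder the designer replaces inside the authoring tool.
            lines.append(f"    var {media}; // TODO: bind media object in Flash")
        lines.append("    function onEnter() {} // application-logic hook")
        lines.append("}")
    return "\n".join(lines)

print(generate_skeleton(model))
```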

    A model-driven approach to machine learning and software modeling for the IoT

    Models are used in both Software Engineering (SE) and Artificial Intelligence (AI). SE models may specify the architecture at different levels of abstraction and address different concerns at various stages of the software development life-cycle, from early conceptualization and design to verification, implementation, testing, and evolution. AI models, in contrast, may provide smart capabilities, such as prediction and decision-making support. For instance, in Machine Learning (ML), currently the most popular sub-discipline of AI, mathematical models may learn useful patterns in observed data and become capable of making predictions. The goal of this work is to create synergy by bringing the models of these two communities together and proposing a holistic approach to model-driven software development for intelligent systems that require ML. We illustrate how software models can become capable of creating and dealing with ML models in a seamless manner. The main focus is on the domain of the Internet of Things (IoT), where both ML and model-driven SE play a key role. Given the need to take a Cyber-Physical System-of-Systems perspective on the targeted architecture, an integrated design environment for both the SE and ML sub-systems would best support the optimization and overall efficiency of the resulting system's implementation. In particular, we implement the proposed approach, called ML-Quadrat, based on ThingML, and validate it using a case study from the IoT domain, as well as through an empirical user evaluation. It transpires that the proposed approach is not only feasible, but may also contribute to a performance leap in software development for smart Cyber-Physical Systems (CPS) connected to the IoT, as well as to an enhanced user experience for the practitioners who use the proposed modeling solution.
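    To illustrate the flavor of the approach, the sketch below lets a declarative model fragment specify an ML component that a generator materializes as a scikit-learn estimator. The dict-based model format and the registry are assumptions made for illustration; ML-Quadrat itself operates on ThingML models, not on this dict. Requires scikit-learn.

```python
# Sketch of a software model declaring an ML component that a generator
# turns into a trainable estimator. Model format and registry are invented.
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeClassifier

ml_component = {           # fragment of a hypothetical software model
    "name": "temperature_predictor",
    "algorithm": "linear_regression",
    "features": ["hour_of_day", "outdoor_temp"],
    "label": "indoor_temp",
}

REGISTRY = {
    "linear_regression": LinearRegression,
    "decision_tree": DecisionTreeClassifier,
}

def materialize(component):
    """Turn the declarative ML component into a concrete estimator."""
    return REGISTRY[component["algorithm"]]()

model = materialize(ml_component)
X = [[8, 5.0], [12, 12.0], [18, 9.0]]   # toy samples matching 'features'
y = [19.5, 22.0, 21.0]                  # toy 'indoor_temp' labels
model.fit(X, y)
print(model.predict([[10, 8.0]]))
```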