
    Multi-agent system for modelling the restructured energy market

    Master's thesis (Master of Engineering).

    Statistical Physics of Design

    Modern life increasingly relies on complex products that perform a variety of functions. The key difficulty of creating such products lies not in the manufacturing process, but in the design process. However, design problems are typically driven by multiple contradictory objectives and different stakeholders, have no obvious stopping criteria, and frequently prevent construction of prototypes or experiments. Such ill-defined, or "wicked," problems cannot be "solved" in the traditional sense with optimization methods. Instead, modern design techniques are focused on generating knowledge about the alternative solutions in the design space. In order to facilitate such knowledge generation, in this dissertation I develop the "Systems Physics" framework that treats the emergent structures within the design space as physical objects that interact via quantifiable forces. Mathematically, Systems Physics is based on maximal entropy statistical mechanics, which allows both drawing conceptual analogies between design problems and collective phenomena and performing numerical calculations to gain quantitative understanding. Systems Physics operates via a Model-Compute-Learn loop, with each step refining our thinking about design problems. I demonstrate the capabilities of Systems Physics in two very distinct case studies: Naval Engineering and self-assembly. For the Naval Engineering case, I focus on an established problem of arranging shipboard systems within the available hull space. I demonstrate the essential trade-off between minimizing the routing cost and maximizing the design flexibility, which can lead to abrupt phase transitions. I show how the design space can break into several locally optimal architecture classes that have very different robustness to external couplings. I illustrate how the topology of the shipboard functional network enters a tight interplay with the spatial constraints on placement. For the self-assembly problem, I show that the topology of self-assembled structures can be reliably encoded in the properties of the building blocks so that the structure and the blocks can be jointly designed. The work presented here provides both conceptual and quantitative advancements. In order to properly port the language and the formalism of statistical mechanics to the design domain, I critically re-examine such foundational ideas as system-bath coupling, coarse graining, particle distinguishability, and direct and emergent interactions. I show that the design space can be packed into a special information structure, a tensor network, which allows a seamless transition from graphical visualization to sophisticated numerical calculations. This dissertation provides the first quantitative treatment of the design problem that is not reduced to the narrow goals of mathematical optimization. Using a statistical mechanics perspective allows me to move beyond the dichotomy of "forward" and "inverse" design and frame design as a knowledge generation process instead. Such framing opens the way to further studies of the design space structures and the time- and path-dependent phenomena in design. The present work also benefits from, and contributes to, the philosophical interpretations of statistical mechanics developed by the soft matter community in the past 20 years. The discussion goes far beyond physics and engages with literature from materials science, naval engineering, optimization, design theory, network theory, and economic complexity.
    PhD, Physics, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/163133/1/aklishin_1.pd
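
    As a toy illustration of the maximum-entropy view described above (not code from the dissertation; the layout costs, temperature values, and function names are hypothetical), the sketch below weights candidate layouts by a Boltzmann-like factor and reports the resulting trade-off between expected routing cost and design flexibility (entropy):

```python
import numpy as np

# Hypothetical toy example: five candidate layouts with made-up routing costs.
routing_cost = np.array([3.0, 3.2, 4.1, 5.0, 8.5])

def design_ensemble(cost, T):
    """Maximum-entropy (Boltzmann-like) distribution over design alternatives at temperature T."""
    w = np.exp(-cost / T)
    p = w / w.sum()
    entropy = -np.sum(p * np.log(p))   # design flexibility
    mean_cost = np.sum(p * cost)       # expected routing cost
    return p, mean_cost, entropy

for T in (0.2, 1.0, 5.0):
    p, c, s = design_ensemble(routing_cost, T)
    print(f"T={T:>3}: mean cost={c:.2f}, flexibility={s:.2f}, p={np.round(p, 2)}")
```

    At low temperature the ensemble collapses onto the cheapest layout; at high temperature probability spreads across alternatives, which is the flexibility the trade-off above refers to.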

    Domain architecture: a design framework for system development and integration

    The ever-growing complexity of software systems has revealed many shortcomings in existing software engineering practices and has raised interest in architecture-driven software development. A system's architecture provides a model of the system that suppresses implementation detail, allowing the architects to concentrate on the analysis and decisions that are most critical to structuring the system to satisfy its requirements. Recently, the interests of researchers and practitioners have shifted from individual system architectures to architectures for classes of software systems, which provide more general, reusable solutions to the issues of overall system organization, interoperability, and allocation of services to system components. These generic architectures, such as product line architectures and domain architectures, promote reuse and interoperability, and create a basis for cost-effective construction of high-quality systems. Our focus in this dissertation is on domain architectures as a means of development and integration of large-scale, domain-specific business software systems. Business imperatives, including flexibility, productivity, quality, and ability to adapt to changes, have fostered demands for flexible, coherent, and enterprise-wide integrated business systems. The components of such systems, developed separately or purchased off the shelf, need to cohesively form an overall computational environment for the business. The inevitable complexity of such integrated solutions and the highly demanding process of their construction, management, and evolution support require new software engineering methodologies and tools. Domain architectures, prescribing the organization of software systems in a business domain, hold a promise to serve as a foundation on which such integrated business systems can be effectively constructed. To meet the above expectations, software architectures must be properly defined, represented, and applied, which requires suitable methodologies as well as process and tool support. Despite research efforts, however, state-of-the-art methods and tools for architecture-based system development do not yet meet the practical needs of system developers. The primary focus of this dissertation is on developing methods and tools to support domain architecture engineering and on leveraging architectures to achieve improved system development and integration in the presence of increased complexity. In particular, the thesis explores issues related to the following three aspects of software technology: system complexity and software architectures as tools to alleviate complexity; domain architectures as frameworks for construction of large-scale, flexible, enterprise-wide software systems; and architectural models and representation techniques as a basis for “good” design. The thesis presents an architectural taxonomy to help categorize and better understand architectural efforts. Furthermore, it clarifies the purpose of domain architectures and characterizes them in detail. To support the definition and application of domain architectures we have developed a method for domain architecture engineering and representation: GARM-ASPECT. GARM, the Generic Architecture Reference Model, underlying the method, is a system of modeling abstractions, relations and recommendations for building representations of reference software architectures. The model's focus on reference and domain architectures determines its main distinguishing features: multiple views of architectural elements, a separate rule system to express constraints on architecture element types, and annotations such as “libraries” of patterns and “logs” of guidelines. ASPECT is an architecture description language based on GARM. It provides a normalized vocabulary for representing the skeleton of an architecture, its structural view, and establishes a framework for capturing architectural constraints. It also allows extensions of the structural view with auxiliary information, such as behavior or quality specifications. In this respect, ASPECT provides facilities for establishing relationships among different specifications and gluing them together within an overall architectural description. This design allows flexibility and adaptability of the methodology to the specifics of a domain or a family of systems. ASPECT supports the representation of reference architectures as well as individual system architectures. The practical applicability of this method has been tested through a case study in an industrial setting. The approach to architecture engineering and representation, presented in this dissertation, is pragmatic and oriented towards software practitioners. GARM-ASPECT, as well as the taxonomy of architectures, are of use to architects, system planners and system engineers. Beyond these practical contributions, this thesis also creates a more solid basis for exploring the applicability of architectural abstractions, the practicality of representation approaches, and the changes required to the development process in order to achieve the benefits from an architecture-driven software technology.
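
    To make the shape of such an architecture description concrete, here is a deliberately simplified, hypothetical sketch (not the actual GARM/ASPECT notation; the class names, views, and rule are invented): elements carry multiple named views, and a separate rule system expresses constraints over element types:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Element:
    name: str
    element_type: str                                      # e.g. "component", "connector"
    views: Dict[str, dict] = field(default_factory=dict)   # structural, behavioral, quality, ...

@dataclass
class Architecture:
    elements: List[Element] = field(default_factory=list)
    rules: List[Callable] = field(default_factory=list)    # constraint rules kept separate from the elements

    def check(self) -> bool:
        """Evaluate every constraint rule against the architecture description."""
        return all(rule(self) for rule in self.rules)

# Example rule: every "component" must carry a structural view before the
# description counts as well-formed.
def every_component_has_structure(arch: Architecture) -> bool:
    return all("structural" in e.views
               for e in arch.elements if e.element_type == "component")

arch = Architecture(
    elements=[Element("OrderService", "component", {"structural": {"ports": ["orders_in"]}})],
    rules=[every_component_has_structure],
)
print(arch.check())  # True
```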

    An ethnographic investigation into the relationship between mental models and the implementation of total quality management

    Includes bibliography. The objective of my project was to find the reasons why the Quality Improvement Process (QIP), which started enthusiastically in Old Mutual in 1987, has lost momentum. Its initial implementation was characterised by success, but later, certain shortcomings became evident. In brief, the initial success of Crosby's QIP programme was attributable to its organised implementation throughout the organisation. It created a general awareness of key quality principles and gave a common understanding of a uniform language and standards throughout the organisation. However, after some years, senior management realised that this process was too simplistic, and that more was needed. A 'second phase' was implemented. This phase built onto the foundations laid by the QIP and focused on achieving client-orientated improvements in all business processes within the organisation. But this phase gradually lost momentum, as it failed to take into account the fact that lasting and continuous improvement in an organisation requires fundamental changes in almost every facet or part of the organisational whole. These fundamental changes include changes to the organisational structure, its management practices, its work processes and systems, and changes in the way that managers view the organisation (that is, their mental models), not merely a focus on process improvement within the organisation. The hypothesis propounded in this thesis attempts to prove or disprove a component of the aforementioned, namely that a certain dominant mental model, that is, a belief about how the organisation works, is needed amongst the management of an organisation to bring about genuine improvements. This hypothesis propounds that a high-performing organisation would exhibit a strong correlation between the mental model implied by quality improvement and the organisation's managers' dominant mental model of how their organisation works.

    Decision Support Systems

    Decision support systems (DSS) have evolved over the past four decades from theoretical concepts into real-world computerized applications. DSS architecture contains three key components: a knowledge base, a computerized model, and a user interface. DSS simulate cognitive decision-making functions of humans based on artificial intelligence methodologies (including expert systems, data mining, machine learning, connectionism, logistical reasoning, etc.) in order to perform decision support functions. The applications of DSS cover many domains, ranging from aviation monitoring, transportation safety, clinical diagnosis, weather forecasting, and business management to internet search strategy. By combining knowledge bases with inference rules, DSS are able to provide suggestions to end users to improve decisions and outcomes. This book is written as a textbook so that it can be used in formal courses examining decision support systems. It may be used by both undergraduate and graduate students from diverse computer-related fields. It will also be of value to established professionals as a text for self-study or for reference.
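
    As a minimal illustration of the knowledge-base-plus-inference-rules pattern mentioned above (a hypothetical sketch, not an example from the book; the facts and rules are invented), the snippet below forward-chains over a small rule set to produce suggestions for the end user:

```python
# Hypothetical knowledge base: facts observed for one case.
facts = {"fever": True, "cough": True, "shortness_of_breath": False}

# Inference rules: (condition over facts, conclusion asserted when it holds).
rules = [
    (lambda f: f.get("fever") and f.get("cough"), "suspect_respiratory_infection"),
    (lambda f: f.get("suspect_respiratory_infection") and f.get("shortness_of_breath"),
     "recommend_chest_imaging"),
    (lambda f: f.get("suspect_respiratory_infection") and not f.get("shortness_of_breath"),
     "recommend_rest_and_follow_up"),
]

def forward_chain(facts, rules):
    """Fire rules until no new conclusion is derived (simple forward chaining)."""
    derived = dict(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in rules:
            if condition(derived) and not derived.get(conclusion):
                derived[conclusion] = True
                changed = True
    return [k for k in derived if k not in facts and derived[k]]

print(forward_chain(facts, rules))
# ['suspect_respiratory_infection', 'recommend_rest_and_follow_up']
```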

    Alternate Means of Digital Design Communication

    This thesis reconceptualises communication in digital design as an integrated social and technical process. The friction in the communicative processes pertaining to digital design can be traced to the fact that current research and practice emphasise technical concerns at the expense of social aspects of design communication. With the advent of BIM (Building Information Modelling), a code model of communication (machine-to-machine) is inadequately applied to design communication. This imbalance is addressed in this thesis by using inferential models of communication to capture and frame the psychological and social aspects behind the communicative contracts between people. Three critical aspects of the communicative act have been analysed, namely (1) data representation, (2) data classification and (3) data transaction, with the help of a new digital design communication platform, Speckle, which was developed during this research project for this purpose. By virtue of an applied living laboratory context, Speckle facilitated both qualitative and quantitative comparisons against existing methodologies with data from real-world settings. Regarding data representation (1), this research finds that the communicative performance of a low-level composable object model is better than that of a complete and universal one, as it enables a more dynamic process of ontological revision. This implies that current practice and research operate at an inappropriate level of abstraction. On data classification (2), this thesis shows that a curatorial object-based data sharing methodology, as opposed to the current file-based approaches, leads to increased relevancy and a reduction in noise (information without intent, or meaning). Finally, on data transaction (3), the analysis shows that an object-based data sharing methodology is technically better suited to enable communicative contracts between stakeholders. It allows for faster and more meaningful change-dependent transactions, as well as allowing for the emergence of traceable communicative networks outside of the predefined exchanges of current practices.
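
    As a rough illustration of object-based (rather than file-based) data sharing, here is a hypothetical sketch; it is not Speckle's actual object model or API, and the object fields and store are invented. Individual design objects are content-addressed and a curated commit references only the objects relevant to the receiver:

```python
import hashlib
import json

def to_id(obj: dict) -> str:
    """Content-address an object by hashing its canonical JSON form."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()[:12]

store = {}   # stands in for a shared server/stream

def send(obj: dict) -> str:
    obj_id = to_id(obj)
    store[obj_id] = obj   # identical objects deduplicate automatically
    return obj_id

# A curated transaction shares only the relevant objects, with meaning attached,
# rather than an opaque whole-project file dump.
wall = send({"type": "Wall", "height_m": 3.0, "material": "concrete"})
slab = send({"type": "Slab", "thickness_m": 0.25})
commit = send({"type": "Commit", "message": "structural layer", "objects": [wall, slab]})

print(commit, "->", store[commit]["objects"])
```

    Because objects are addressed by content, unchanged objects are transmitted once and reused across commits, which is what makes change-dependent transactions cheap and traceable.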

    The 1993 Goddard Conference on Space Applications of Artificial Intelligence

    This publication comprises the papers presented at the 1993 Goddard Conference on Space Applications of Artificial Intelligence held at the NASA/Goddard Space Flight Center, Greenbelt, MD on May 10-13, 1993. The purpose of this annual conference is to provide a forum in which current research and development directed at space applications of artificial intelligence can be presented and discussed.

    Embedded System Design

    A unique feature of this open-access textbook is that it provides a comprehensive introduction to the fundamental knowledge in embedded systems, with applications in cyber-physical systems and the Internet of Things. It starts with an introduction to the field and a survey of specification models and languages for embedded and cyber-physical systems. It provides a brief overview of hardware devices used for such systems and presents the essentials of system software for embedded systems, including real-time operating systems. The author also discusses evaluation and validation techniques for embedded systems and provides an overview of techniques for mapping applications to execution platforms, including multi-core platforms. Embedded systems have to operate under tight constraints and, hence, the book also contains a selected set of optimization techniques, including software optimization techniques. The book closes with a brief survey on testing. This fourth edition has been updated and revised to reflect new trends and technologies, such as the importance of cyber-physical systems (CPS) and the Internet of Things (IoT), the evolution of single-core processors to multi-core processors, and the increased importance of energy efficiency and thermal issues.
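
    The evaluation and validation techniques mentioned above include timing analysis for real-time systems. As one small, self-contained flavor of that topic (an illustration chosen by the editor, not an excerpt from the book; the task set is made up), the sketch below applies the classic Liu and Layland utilization bound test for rate-monotonic scheduling of periodic tasks:

```python
# Hypothetical task set: (worst-case execution time, period) in milliseconds.
tasks = [(1.0, 4.0), (2.0, 8.0), (1.5, 10.0)]

def rm_utilization_test(tasks):
    """Liu & Layland sufficient test: a periodic task set is schedulable under
    rate-monotonic priorities if total utilization <= n * (2**(1/n) - 1)."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return utilization, bound, utilization <= bound

u, bound, ok = rm_utilization_test(tasks)
print(f"utilization={u:.3f}, bound={bound:.3f}, schedulable={ok}")
# utilization = 0.25 + 0.25 + 0.15 = 0.65; bound for n=3 is about 0.780 -> schedulable
```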

    Business strategy driven IT systems for engineer-to-order and make-to-order manufacturing enterprises

    This thesis reports research into the specification and implementation of an Information Technology (IT) Route Map. The purpose of the Route Map is to enable rapid design and deployment of IT solutions capable of semi-automating business processes in a manufacturing enterprise. The Map helps structure transition processes involved in “identification of key business strategies and design of business processes” and “choice of enterprise systems and supporting implementation techniques”. Common limitations of current Enterprise Resource Planning (ERP) systems are observed and incorporated as Route Map implications and constraints. The scope of investigation is targeted at Small to Medium Sized Enterprises (SMEs) that employ Engineer-To-Order (ETO) and Make-To-Order (MTO) business processes. However, a feature of the Route Map is that it takes into account contemporary business concerns related to “globalisation”, “mergers and acquisitions” and “typical resource constraint problems of SMEs”. In the course of the research a “Business Strategy Driven IT System Concept” was conceived and examined. The main purpose of this concept is to promote the development of agile and innovative business activity in SMEs. The Route Map encourages strategy-driven solutions to be (a) specified based on the use of emerging enterprise engineering theories and (b) implemented and changed using component-based systems design and composition techniques. A partial evaluation of the applicability and capabilities of the Route Map has been carried out by conducting industrial survey and case study work. This assesses the requirements of real industrial problems and solutions. The evaluation work has also been enabled by conducting a pilot implementation of the thesis concepts at the premises of a partner SME.

    Bayesian belief networks for dementia diagnosis and other applications: a comparison of hand-crafting and construction using a novel data driven technique

    The Bayesian network (BN) formalism is a powerful representation for encoding domains characterised by uncertainty. However, before it can be used it must first be constructed, which is a major challenge for any real-life problem. There are two broad approaches, namely the hand-crafted approach, which relies on a human expert, and the data-driven approach, which relies on data. The former approach is useful; however, issues such as human bias can introduce errors into the model. We have conducted a literature review of the expert-driven approach, selected a number of common methods, and engineered a framework to assist non-BN experts with expert-driven construction of BNs. The latter construction approach uses algorithms to construct the model from a data set. However, construction from data is provably NP-hard. To solve this problem, approximate, heuristic algorithms have been proposed; in particular, algorithms that assume an order between the nodes, thereby reducing the search space. However, traditionally, this approach relies on an expert providing the order among the variables; an expert may not always be available, or may be unable to provide the order. Nevertheless, if a good order is available, these order-based algorithms have demonstrated good performance. More recent approaches attempt to "learn" a good order and then use an order-based algorithm to discover the structure. To eliminate the need for order information during construction, we propose a search in the entire space of Bayesian network structures: we present a novel approach for carrying out this task, and we demonstrate its performance against existing algorithms that search in the entire space and in the space of orders. Finally, we employ the hand-crafting framework to construct models for the task of diagnosis in a "real-life" medical domain, dementia diagnosis. We collect real dementia data from clinical practice, and we apply the data-driven algorithms developed to assess the concordance between the reference models developed by hand and the models derived from real clinical data.
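
    The order-based algorithms mentioned above can be illustrated with a minimal, self-contained sketch: a K2-style greedy parent search with a BIC-like score. The scoring details, the synthetic chain data, and all names below are the editor's simplified assumptions, not the algorithms developed in the thesis:

```python
import numpy as np
from collections import defaultdict

def family_score(data, child, parents):
    """BIC-style score of `child` given `parents` (all variables binary)."""
    n = data.shape[0]
    groups = defaultdict(list)
    for idx in range(n):
        cfg = tuple(data[idx, parents]) if parents else ()
        groups[cfg].append(idx)
    log_lik = 0.0
    for rows in groups.values():
        counts = np.bincount(data[rows, child], minlength=2)
        probs = (counts + 1) / (counts.sum() + 2)        # Laplace smoothing
        log_lik += float(np.sum(counts * np.log(probs)))
    penalty = 0.5 * np.log(n) * (2 ** len(parents))      # free parameters per family
    return log_lik - penalty

def learn_structure(data, order, max_parents=2):
    """Greedily pick parents for each node from the nodes earlier in `order` (K2-style)."""
    parents_of = {}
    for pos, child in enumerate(order):
        current, best = [], family_score(data, child, [])
        improved = True
        while improved and len(current) < max_parents:
            improved = False
            candidates = [c for c in order[:pos] if c not in current]
            scored = [(family_score(data, child, current + [c]), c) for c in candidates]
            if scored and max(scored)[0] > best:
                best, new_parent = max(scored)
                current.append(new_parent)
                improved = True
        parents_of[child] = current
    return parents_of

# Synthetic data from a chain X0 -> X1 -> X2 (each child is a noisy copy of its parent).
rng = np.random.default_rng(0)
x0 = rng.integers(0, 2, 2000)
x1 = (x0 ^ (rng.random(2000) < 0.1)).astype(int)
x2 = (x1 ^ (rng.random(2000) < 0.1)).astype(int)
data = np.column_stack([x0, x1, x2])

print(learn_structure(data, order=[0, 1, 2]))   # expected: {0: [], 1: [0], 2: [1]}
```

    With a good order supplied, the search space shrinks to parent sets drawn from earlier nodes only, which is why order-based algorithms scale well; the thesis's contribution of searching the full structure space removes the dependence on that order.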