
    BRAHMS: Novel middleware for integrated systems computation

    Biological computational modellers are becoming increasingly interested in building large, eclectic models, including components on many different computational substrates, both biological and non-biological. At the same time, the rise of the philosophy of embodied modelling is generating a need to deploy biological models as controllers for robots in real-world environments. Finally, robotics engineers are beginning to find value in seconding biomimetic control strategies for use on practical robots. Together with the ubiquitous desire to make good on past software development effort, these trends are throwing up new challenges of intellectual and technological integration (for example across scales, across disciplines, and even across time) - challenges that are unmet by existing software frameworks. Here, we outline these challenges in detail, and go on to describe a newly developed software framework, BRAHMS, that meets them. BRAHMS is a tool for integrating computational process modules into a viable, computable system: its generality and flexibility facilitate integration across barriers, such as those described above, in a coherent and effective way. We go on to describe several cases where BRAHMS has been successfully deployed in practical situations. We also show excellent performance in comparison with a monolithic development approach. Additional benefits of developing in the framework include source code self-documentation, automatic coarse-grained parallelisation, cross-language integration, data logging and performance monitoring; dynamic load-balancing and 'pause and continue' execution are planned. BRAHMS is built on the nascent, and similarly general-purpose, model markup language, SystemML. This will, in future, also facilitate repeatability and accountability (same answers ten years from now), transparent automatic software distribution, and interfacing with other SystemML tools. (C) 2009 Elsevier Ltd. All rights reserved.
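    The core idea described above - wiring independent process modules into a single computable system - can be illustrated with a small sketch. All names here (Process, System, the port-linking scheme) are invented for illustration and do not reflect the actual BRAHMS API.

```python
# Hypothetical sketch of module integration: independent process modules
# exchange data through named input ports, and a scheduler steps the whole
# system in order, forwarding each module's output along declared links.

class Process:
    """A computational process with a name and a step function."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn
        self.inputs = {}

    def step(self):
        return self.fn(self.inputs)

class System:
    """Wires process outputs to process inputs and steps them in order."""
    def __init__(self):
        self.processes, self.links = [], []

    def add(self, proc):
        self.processes.append(proc)
        return proc

    def link(self, src, dst, port):
        self.links.append((src, dst, port))

    def step(self):
        outputs = {}
        for p in self.processes:
            # deliver any upstream outputs to this process's input ports
            for src, dst, port in self.links:
                if dst is p and src.name in outputs:
                    p.inputs[port] = outputs[src.name]
            outputs[p.name] = p.step()
        return outputs

system = System()
sensor = system.add(Process("sensor", lambda ins: 2.0))
gain   = system.add(Process("gain", lambda ins: ins.get("x", 0.0) * 10))
system.link(sensor, gain, "x")
print(system.step())  # {'sensor': 2.0, 'gain': 20.0}
```

    A real framework would add typed ports, cross-language module hosting and parallel scheduling; the sketch only shows the integration contract that makes modules composable.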

    FPGA design methodology for industrial control systems—a review

    This paper reviews the state of the art of field-programmable gate array (FPGA) design methodologies with a focus on industrial control system applications. It starts with an overview of FPGA technology development, followed by a presentation of design methodologies, development tools and relevant CAD environments, including the use of portable hardware description languages and system-level programming/design tools. These tools enable a holistic functional approach, with the major advantage of setting up a unique modeling and evaluation environment for complete industrial electronics systems. Three main design rules are then presented: algorithm refinement, modularity, and a systematic search for the best compromise between control performance and architectural constraints. An overview of the contributions and limits of FPGAs is also given, followed by a short survey of FPGA-based intelligent controllers for modern industrial systems. Finally, two complete and timely case studies are presented to illustrate the benefits of an FPGA implementation when using the proposed system modeling and design methodology: the direct torque control of induction motor drives, and the control of a diesel-driven synchronous stand-alone generator with the help of fuzzy logic.

    The Layer-Oriented Approach to Declarative Languages for Biological Modeling

    We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms. Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language.
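    The separation of layers can be made concrete with a toy example: a domain layer describes an ionic current declaratively, and a semantic transformation lowers it to a mathematical layer (a plain function). The structure, field names and leak-current parameters below are illustrative only, not the paper's actual language.

```python
# Domain layer: a declarative description of a leak current I = g * (V - E).
leak_current = {"kind": "ionic_current", "g": 0.3, "E": -54.4}

def lower_to_math(desc):
    """Semantic transformation: lower a domain-layer description to a
    math-layer function of membrane potential V."""
    if desc["kind"] == "ionic_current":
        return lambda V: desc["g"] * (V - desc["E"])
    raise ValueError("unknown domain concept: %r" % desc["kind"])

I_leak = lower_to_math(leak_current)
print(I_leak(-65.0))  # 0.3 * (-65.0 + 54.4), approximately -3.18
```

    A further lowering step from the math layer to simulator-specific code (the paper targets several environments) would be another transformation of the same kind, which is what makes the layers composable.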

    Safe-guarded multi-agent control for mechatronic systems: implementation framework and design patterns

    This thesis addresses two issues: (i) developing an implementation framework for Multi-Agent Control Systems (MACS); and (ii) developing a pattern-based safe-guarded MACS design method.

    The Multi-Agent Controller Implementation Framework (MACIF), developed by Van Breemen (2001), is selected as the starting point because of its capability to produce MACS for solving complex control problems with two useful features:
    • MACS is hierarchically structured in terms of a coordinated group of elementary and/or composite controller-agents;
    • MACS has an open architecture such that controller-agents can be added, modified or removed without redesigning and/or reprogramming the remaining part of the MACS.
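    The two features above - hierarchical composition of controller-agents and an open architecture - map naturally onto the Composite design pattern. The sketch below is a hypothetical illustration of that mapping, not MACIF's actual interfaces; the first-non-None coordination rule is an invented, simple priority scheme.

```python
# Hypothetical controller-agent hierarchy: elementary agents propose
# control actions, and a composite agent coordinates a group of them.

class ElementaryAgent:
    def __init__(self, name, control_fn):
        self.name, self.control_fn = name, control_fn

    def control(self, state):
        return self.control_fn(state)

class CompositeAgent:
    """Coordinates child agents; here coordination = return the first
    child action that is not None (a simple priority scheme)."""
    def __init__(self, children):
        self.children = list(children)

    def add(self, agent):
        # open architecture: agents can be added without changing the rest
        self.children.append(agent)

    def control(self, state):
        for child in self.children:
            action = child.control(state)
            if action is not None:
                return action
        return None

# A safe-guarding agent outranks a tracking agent in the hierarchy.
safety = ElementaryAgent("safety", lambda s: 0.0 if s["temp"] > 90 else None)
track  = ElementaryAgent("track", lambda s: s["setpoint"] - s["temp"])
macs = CompositeAgent([safety, track])
print(macs.control({"temp": 95, "setpoint": 80}))  # safety overrides: 0.0
print(macs.control({"temp": 70, "setpoint": 80}))  # tracking error: 10
```

    Because a CompositeAgent exposes the same control interface as an ElementaryAgent, composites can nest inside other composites, giving the hierarchical structure the thesis describes.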

    Machine Learning with Time Series: A Taxonomy of Learning Tasks, Development of a Unified Framework, and Comparative Benchmarking of Algorithms

    Time series data is ubiquitous in real-world applications. Such data gives rise to distinct but closely related learning tasks (e.g. time series classification, regression or forecasting). In contrast to the more traditional cross-sectional setting, these tasks are often not fully formalized. As a result, different tasks can become conflated under the same name, algorithms are often applied to the wrong task, and performance estimates are potentially unreliable. In practice, software frameworks such as scikit-learn have become essential tools for data science. However, most existing frameworks focus on cross-sectional data. To our knowledge, no comparable frameworks exist for temporal data. Moreover, despite the importance of these frameworks, their design principles have never been fully understood. Instead, discussions often concentrate on usage and features, while almost completely ignoring design. To address these issues, we develop in this thesis (i) a formal taxonomy of learning tasks, (ii) novel design principles for ML toolboxes, and (iii) a new unified framework for ML with time series. The framework has been implemented in an open-source Python package called sktime. The design principles are derived from existing state-of-the-art toolboxes and classical software design practices, using a domain-driven approach and a novel scientific type system. We show that these principles can not only explain key aspects of existing frameworks, but also guide the development of new ones like sktime. Finally, we use sktime to reproduce and extend the M4 competition, one of the major comparative benchmarking studies for forecasting. Reproducing the competition allows us to verify the published results and illustrate sktime's effectiveness. Extending the competition enables us to explore the potential of previously unstudied ML models. We find that, on a subset of the M4 data, simple ML models implemented in sktime can match the state-of-the-art performance of the hand-crafted M4 winner models.
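    The task distinction the taxonomy formalizes can be shown with a stdlib-only sketch: forecasting maps a series to its own future values, while time series classification maps a whole series to a label, so the two estimators need different interfaces. The estimators below are deliberately trivial (naive forecast, mean-threshold classifier) and illustrative; they are not sktime's API.

```python
class NaiveForecaster:
    """Forecasting task: series -> its own future values.
    Here, every future step is forecast as the last observed value."""
    def fit(self, y):
        self.last_ = y[-1]
        return self

    def predict(self, fh):
        # fh: number of steps in the forecasting horizon
        return [self.last_] * fh

class MeanThresholdClassifier:
    """Classification task: whole series -> a single label.
    Here, the label is decided by comparing the series mean to a threshold."""
    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, y):
        return "high" if sum(y) / len(y) > self.threshold else "low"

y = [3, 4, 5, 6]
print(NaiveForecaster().fit(y).predict(fh=3))   # [6, 6, 6]
print(MeanThresholdClassifier(4).predict(y))    # 'high' (mean 4.5 > 4)
```

    Conflating the two tasks under one interface - the failure mode the thesis describes - would force one of these signatures onto a problem it does not fit.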

    Assessment of Hand Gestures Using Wearable Sensors and Fuzzy Logic

    Hand dexterity and motor control are critical in our everyday lives because a significant portion of the daily motions we perform are with our hands and require some degree of repetition and skill. Therefore, the development of technologies for hand and extremity rehabilitation is a significant area of research that will directly help patients recovering from hand impairments sustained from causes ranging from stroke and Parkinson's disease to trauma and common injuries. Cyclic activity recognition and assessment is appropriate for hand and extremity rehabilitation because a majority of our essential motions are cyclic in nature. For a patient on the road to regaining functional independence with daily skills, improvement in cyclic motions constitutes an important and quantifiable rehabilitation goal. However, challenges exist with hand rehabilitation sensor technologies that prevent the acquisition of long-term, continuous, accurate and actionable motion data. These challenges include complicated and uncomfortable system assemblies, and a lack of integration with consumer electronics for easy readout. In our research, we have developed a glove-based system in which inertial measurement unit (IMU) sensors are used synergistically with flexible sensors to minimize the number of IMU sensors. The classification capability of our system is improved by utilizing a fuzzy logic data analysis algorithm. We tested a total of 25 different subjects using a glove-based apparatus to gather data on two-dimensional motions with one accelerometer and three-dimensional motions with one accelerometer and two flexible sensors. Our research provides an approach that has the potential to utilize both activity recognition and activity assessment using simple sensor systems to help patients recover and improve their overall quality of life.
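    The fuzzy-logic assessment idea can be sketched with triangular membership functions that grade a motion feature into overlapping quality classes. The feature (range of motion) and all thresholds below are invented for illustration; the thesis's actual rule base is not reproduced here.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def assess(range_of_motion_deg):
    """Fuzzy membership of a cyclic motion's range in each quality class.
    Class boundaries are hypothetical, chosen only to overlap."""
    return {
        "impaired": tri(range_of_motion_deg, 0, 15, 40),
        "moderate": tri(range_of_motion_deg, 25, 50, 75),
        "healthy":  tri(range_of_motion_deg, 60, 85, 110),
    }

grades = assess(45)
print(grades)                        # moderate dominates at 45 degrees
print(max(grades, key=grades.get))   # 'moderate'
```

    Because the classes overlap, a borderline motion gets partial membership in two classes rather than a hard label, which is what makes the assessment gradable over the course of rehabilitation.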

    A generic neural network framework using design patterns

    Designing object-oriented software is hard, and designing reusable object-oriented software is even harder. This task is even more daunting for a developer of computational intelligence applications, as optimising one design objective tends to make others inefficient or even impossible. Classic examples in computer science include ‘storage vs. time’ and ‘simplicity vs. flexibility.’ Neural network requirements are by their very nature very tightly coupled – a required design change in one area of an existing application tends to have severe effects in other areas, making the change impossible or inefficient. Often this situation leads to a major redesign of the system and in many cases a completely rewritten application. Many commercial and open-source packages do exist, but these cannot always be extended to support input from other fields of computational intelligence due to proprietary reasons or failing to fully take all design requirements into consideration. Design patterns make a science out of writing software that is modular, extensible and efficient as well as easy to read and understand. The essence of a design pattern is to avoid repeatedly solving the same design problem from scratch by reusing a solution that solves the core problem. This pattern or template for the solution has well understood prerequisites, structure, properties, behaviour and consequences. CILib is a framework that allows developers to develop new computational intelligence applications quickly and efficiently. Flexibility, reusability and clear separation between components are maximised through the use of design patterns. Reliability is also ensured as the framework is open source and thus has many people that collaborate to ensure that the framework is well designed and error free. 
    This dissertation discusses the design and implementation of a generic neural network framework that allows users to design, implement and use any neural network model or algorithm in such a way that it can reuse, and be reused by, any other computational intelligence algorithm in the framework or any external application. This is achieved by using object-oriented design patterns in the design of the framework.

    Dissertation (MSc)--University of Pretoria, 2007.
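    One pattern central to this kind of framework can be shown in a few lines: the Strategy pattern lets a neuron's activation function vary without touching the neuron code, so new functions plug in from outside the framework. The class names below are illustrative, not CILib's or the dissertation's actual types.

```python
import math

class Sigmoid:
    """One activation strategy."""
    def __call__(self, x):
        return 1.0 / (1.0 + math.exp(-x))

class Step:
    """Another activation strategy, interchangeable with Sigmoid."""
    def __call__(self, x):
        return 1.0 if x >= 0 else 0.0

class Neuron:
    """Weighted sum followed by a pluggable activation strategy.
    Changing the activation requires no change to this class."""
    def __init__(self, weights, activation):
        self.weights, self.activation = weights, activation

    def output(self, inputs):
        net = sum(w, i := 0) if False else sum(
            w * x for w, x in zip(self.weights, inputs))
        return self.activation(net)

n = Neuron([0.5, -1.0], Step())
print(n.output([2.0, 0.5]))  # net = 1.0 - 0.5 = 0.5 >= 0, so 1.0
n.activation = Sigmoid()     # swap strategies without redesigning Neuron
```

    This is the loose coupling the dissertation argues for: the tight coupling of neural network requirements is relaxed by isolating each point of variation behind a small interface.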

    Calculation of Energy Footprint of Manufacturing Assets

    Energy efficiency is one of the important topics in the manufacturing sector, primarily due to increasing energy prices, ecological concerns and stringent regulations. Industries that have already spent many years improving energy efficiency now find it difficult to propose and implement additional improvement measures. This thesis proposes a way to identify potential areas for improving energy efficiency by calculating the energy footprint of manufacturing assets; the footprint information is provided through analytic tools. The main goal of the thesis is to demonstrate how the energy footprint can support decisions about the manufacturing facility that may improve the overall energy efficiency of the system. For the purpose of energy data analysis, a methodology is proposed in which the analytic tools are developed as web services, making it suitable for implementation within a service-oriented manufacturing facility. The core analysis of data is carried out using MATLAB, due to its powerful data analysis capabilities, high-level programming, availability of relevant libraries, and ease of data handling and manipulation. MATLAB functionalities are deployed in Java and published as web services; the Java classes are mainly responsible for message parsing and for sending data to and receiving data from the deployed MATLAB functions. The work demonstrates the usefulness of analytic tools in analysing the energy footprint of manufacturing assets, as well as the applicability of the service-oriented approach to data analysis and the availability of analytic tools as web services in a service-oriented manufacturing system. The service-oriented approach enables extensibility of the analytic web services and their online availability.
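    The footprint calculation itself can be sketched simply: given power readings sampled at a fixed interval per asset, integrate to energy and rank assets to flag improvement candidates. The data, units and field names below are invented for illustration and do not come from the thesis.

```python
def energy_kwh(power_kw_samples, interval_h):
    """Approximate energy as the sum of power readings times the
    sampling interval (a rectangle-rule integration)."""
    return sum(power_kw_samples) * interval_h

def footprint(readings, interval_h):
    """Energy footprint per asset, sorted so the biggest consumers
    (the likeliest improvement candidates) come first."""
    totals = {asset: energy_kwh(samples, interval_h)
              for asset, samples in readings.items()}
    return sorted(totals.items(), key=lambda kv: -kv[1])

readings = {
    "press":    [40, 42, 41, 43],  # kW, sampled every 15 minutes
    "conveyor": [5, 5, 6, 5],
}
print(footprint(readings, interval_h=0.25))
```

    In the thesis's architecture this calculation would sit behind a web service, with MATLAB doing the analysis and Java handling message parsing; the sketch only shows the numeric core.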

    A distributed architecture for the monitoring and analysis of time series data

    It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant factor affecting the results being sought. This multiplicity of techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. This raises the question: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
    The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias that may exist. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
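    The production/interpretation/consumption workflow with a provenance record can be sketched in a few lines: each processing step records what it consumed and produced, so an independent third party can audit and replay the chain. The structure and step names are illustrative, not the dissertation's actual platform.

```python
class Workflow:
    """Applies analysis steps to data while keeping a provenance record
    of every step, so derived conclusions can be independently verified."""
    def __init__(self):
        self.provenance = []

    def apply(self, name, fn, data):
        out = fn(data)
        self.provenance.append(
            {"step": name, "n_in": len(data), "n_out": len(out)})
        return out

wf = Workflow()
raw = [3, 1, 4, 1, 5, 9, 2, 6]  # hypothetical raw sensor samples

# Production -> interpretation: drop extreme values, then smooth.
cleaned = wf.apply("drop_outliers", lambda d: [x for x in d if x < 9], raw)
smoothed = wf.apply("pairwise_mean",
                    lambda d: [(a + b) / 2 for a, b in zip(d, d[1:])],
                    cleaned)

print(smoothed)
print([p["step"] for p in wf.provenance])  # ['drop_outliers', 'pairwise_mean']
```

    Because the workflow, not the analysis technique, owns the provenance record, a second technique can be applied to the same raw data and audited the same way, which is the technique-independence the dissertation argues for.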
