
    Cyber-Physical Systems Technologies: Applications in Industry and Education

    The development of the Industry 4.0 concept has given rise to new trends such as cloud computing, big data analysis, the industrial internet of things, and machine-to-machine technologies. The cyber-physical systems (CPS) paradigm is based on these trends and integrates computation, networking, and physical processes. The Synergy Center at Peter the Great St. Petersburg Polytechnic University works in the areas of intelligent systems for data processing and control, motion control systems for robotics, and complex automation and mechatronics as components of CPS. Keywords: Industry 4.0, cyber-physical systems, digital twin, intelligent control system, automation, global digitalisation, practice-oriented online courses, skills training, joint international educational programmes

    Intelligent Sensors: An Integrated Systems Approach

    The need for intelligent sensors as a critical component of Integrated System Health Management (ISHM) is by now fairly well recognized. Even the definition of what constitutes an intelligent sensor (or smart sensor) is well documented, and stems from an intuitive desire to get the best-quality measurement data that forms the basis of any complex health monitoring and/or management system. If the sensors, i.e. the elements closest to the measurand, are unreliable, then the whole system works with a tremendous handicap. Hence, there has always been a desire to distribute intelligence down to the sensor level and give the sensor the ability to assess its own health, thereby improving confidence in the quality of the data at all times. This paper proposes the development of intelligent sensors as an integrated systems approach, i.e. treating each sensor as a complete system with its own sensing hardware (the traditional sensor), A/D converters, processing and storage capabilities, software drivers, self-assessment algorithms, communication protocols, and evolutionary methodologies that allow it to get better with time. Under a project being undertaken at the NASA Stennis Space Center, an integrated framework is being developed for the intelligent monitoring of smart elements. These smart elements can be sensors, actuators, or other devices. The immediate application is the monitoring of rocket test stands, but the technology should be generally applicable to the broader ISHM vision. This paper outlines some fundamental issues in the development of intelligent sensors under two categories: Physical Intelligent Sensors (PIS) and Virtual Intelligent Sensors (VIS).
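
    To make the "sensor as a complete system" idea concrete, the sketch below pairs data acquisition with a simple self-assessment step. It is a minimal illustration only: the class, the window size, and the 3-sigma outlier rule are assumptions for the example, not the Stennis framework's actual algorithms.

        # Illustrative sketch: an intelligent sensor that assesses its own health.
        # Class names and thresholds are hypothetical, not from the NASA framework.
        from dataclasses import dataclass
        import statistics

        @dataclass
        class Reading:
            value: float
            healthy: bool   # the sensor's own confidence in this measurement

        class IntelligentSensor:
            """A sensor packaged as a complete system: acquisition plus self-assessment."""
            def __init__(self, window: int = 50, max_sigma: float = 3.0):
                self.history: list[float] = []
                self.window = window
                self.max_sigma = max_sigma

            def acquire(self, raw_value: float) -> Reading:
                # Self-assessment: flag readings that deviate strongly from recent history.
                healthy = True
                if len(self.history) >= self.window:
                    recent = self.history[-self.window:]
                    mu, sigma = statistics.mean(recent), statistics.stdev(recent)
                    if sigma > 0 and abs(raw_value - mu) > self.max_sigma * sigma:
                        healthy = False
                self.history.append(raw_value)
                return Reading(raw_value, healthy)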

    Reasoning by SVD and morphotronic network

    The immune system of vertebrates possesses the capabilities of "intelligent" information processing, which include memory and the abilities to learn, to recognize, and to make decisions with respect to unknown situations. The mathematical formalization of these capabilities forms the basis of immune computing (IC), a new computing approach that replicates the principles of information processing by proteins and immune networks. This IC approach looks constructive as a basis for a new kind of computing. With the Morphotronic System, or the analogous singular value decomposition (SVD), an effective learning process can be created and an immune memory built from projection operators. Given this immune memory, it is possible to recognize and compare antigens and to take defensive action to eliminate dangerous cells.
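
    The projection-operator idea can be made concrete with a few lines of linear algebra. The sketch below is an illustrative assumption of how an immune memory might be built from an SVD, not the paper's actual morphotronic formulation: known "self" patterns define a subspace, and a new pattern far from that subspace is flagged as foreign.

        # Minimal sketch: "immune memory" as an SVD projection operator.
        # Pattern dimensions and the detection threshold are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(0)
        basis = rng.normal(size=(20, 3))
        M = basis @ rng.normal(size=(3, 40))          # self patterns live in a 3-D subspace

        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        k = 3                                         # rank of the memorised subspace
        P = U[:, :k] @ U[:, :k].T                     # projection operator = immune memory

        def is_antigen(x: np.ndarray, tol: float = 1e-6) -> bool:
            # A pattern is "foreign" if a significant part of it lies outside the memory.
            residual = np.linalg.norm(x - P @ x) / np.linalg.norm(x)
            return residual > tol

        print(is_antigen(M[:, 0]))                    # False: a remembered self pattern
        print(is_antigen(rng.normal(size=20)))        # True: an unknown (dangerous) pattern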

    JXTA-Overlay: a P2P platform for distributed, collaborative, and ubiquitous computing

    With the fast growth of the Internet infrastructure and the use of large-scale complex applications in industry, transport, logistics, government, health, and business, there is an increasing need to design and deploy multifeatured networking applications. Important features of such applications include the capability to be self-organized, be decentralized, integrate different types of resources (personal computers, laptops, and mobile and sensor devices), and provide global, transparent, and secure access to resources. Moreover, such applications should support not only traditional forms of reliable distributed computing and optimization of resources but also various forms of collaborative activities, such as business, online learning, and social networking in an intelligent and secure environment. In this paper, we present Juxtapose (JXTA)-Overlay, a JXTA-based peer-to-peer (P2P) platform designed to leverage the capabilities of Java, JXTA, and P2P technologies to support distributed and collaborative systems. The platform can be used not only for efficient and reliable distributed computing but also for collaborative activities and ubiquitous computing, by integrating end devices into the platform. The design of a user interface as well as security issues are also tackled. We evaluate the proposed system by experimental study and show its usefulness for massive processing computations and e-learning applications.
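
    JXTA-Overlay's actual Java API is not reproduced here; the Python sketch below only illustrates the underlying idea of peers advertising and discovering resources through an overlay. The class and method names are hypothetical, and the registry is a deliberately simplified, centralised stand-in for the overlay's discovery service.

        # Hypothetical sketch of the resource-discovery idea behind a P2P overlay;
        # these names are illustrative and are not the JXTA-Overlay API.
        class Overlay:
            """Registry that lets peers find each other's shared resources."""
            def __init__(self):
                self.resources: dict[str, set[str]] = {}   # resource name -> peer ids

            def advertise(self, peer_id: str, resource: str) -> None:
                self.resources.setdefault(resource, set()).add(peer_id)

            def discover(self, resource: str) -> set[str]:
                return self.resources.get(resource, set())

        class Peer:
            def __init__(self, peer_id: str, overlay: Overlay):
                self.peer_id, self.overlay = peer_id, overlay

            def share(self, resource: str) -> None:
                self.overlay.advertise(self.peer_id, resource)

        overlay = Overlay()
        Peer("laptop-1", overlay).share("dataset.csv")
        Peer("sensor-7", overlay).share("dataset.csv")
        print(overlay.discover("dataset.csv"))   # {'laptop-1', 'sensor-7'}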

    Issues in knowledge representation to support maintainability: A case study in scientific data preparation

    Scientific data preparation is the process of extracting usable scientific data from raw instrument data. This task involves noise detection (and subsequent noise classification and flagging or removal), extraction of data from compressed forms, and construction of derivative or aggregate data (e.g. spectral densities or running averages). A software system called PIPE provides intelligent assistance to users developing scientific data preparation plans in a programming language called Master Plumber. PIPE provides this assistance by using a process description to create a dependency model of the scientific data preparation plan. This dependency model can then be used to verify syntactic and semantic constraints on processing steps and so perform limited plan validation. PIPE also provides capabilities for using this model to assist in debugging faulty data preparation plans; in this case, the process model is used to focus the developer's attention on those processing steps and data elements that were used in computing the faulty output values. Finally, the dependency model of a plan can be used to perform plan optimization and runtime estimation. These capabilities allow scientists to spend less time developing data preparation procedures and more time on scientific analysis tasks. Because the scientific data processing modules (called fittings) evolve to match scientists' needs, issues of maintainability are of prime importance in PIPE. This paper describes the PIPE system and how issues in maintainability affected the knowledge representation used in PIPE to capture knowledge about the behavior of fittings.
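
    As an illustration of how such a dependency model can focus a developer's attention, the sketch below traces a faulty output back through its ancestors in a dependency graph. The step names are invented for the example and are not PIPE's actual fittings or Master Plumber representation.

        # Sketch of the dependency-model idea: trace a faulty output back to the
        # processing steps and data elements that produced it (hypothetical names).
        deps = {
            "spectral_density": ["windowed_data", "fft_step"],
            "windowed_data": ["deflagged_data"],
            "deflagged_data": ["raw_instrument_data", "noise_flags"],
            "fft_step": [],
            "noise_flags": ["raw_instrument_data"],
        }

        def suspects(faulty_output: str) -> set[str]:
            """All steps and data elements used in computing the faulty value."""
            seen: set[str] = set()
            stack = [faulty_output]
            while stack:
                node = stack.pop()
                for parent in deps.get(node, []):
                    if parent not in seen:
                        seen.add(parent)
                        stack.append(parent)
            return seen

        print(suspects("spectral_density"))   # everything upstream of the bad value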

    Intelligent assistance in scientific data preparation

    Scientific data preparation is the process of extracting usable scientific data from raw instrument data. This task involves noise detection (and subsequent noise classification and flagging or removal), extraction of data from compressed forms, and construction of derivative or aggregate data (e.g. spectral densities or running averages). A software system called PIPE provides intelligent assistance to users developing scientific data preparation plans in a programming language called Master Plumber. PIPE provides this assistance by using a process description to create a dependency model of the scientific data preparation plan. This dependency model can then be used to verify syntactic and semantic constraints on processing steps and so perform limited plan validation. PIPE also provides capabilities for using this model to assist in debugging faulty data preparation plans; in this case, the process model is used to focus the developer's attention on those processing steps and data elements that were used in computing the faulty output values. Finally, the dependency model of a plan can be used to perform plan optimization and runtime estimation. These capabilities allow scientists to spend less time developing data preparation procedures and more time on scientific analysis tasks.

    Scheduling of non-repetitive lean manufacturing systems under uncertainty using intelligent agent simulation

    World-class manufacturing paradigms emerge from specific types of manufacturing systems, with which they remain associated until they become obsolete. Since its introduction, the lean paradigm has been implemented almost exclusively in repetitive manufacturing systems employing flow-shop layout configurations. Due to its inherent complexity and combinatorial nature, scheduling is one application domain in which the implementation of manufacturing philosophies and best practices is particularly challenging. A study of the few reported attempts to extend leanness to the scheduling of non-repetitive manufacturing systems with functional shop-floor configurations confirms that these works adopt a similar approach: transforming the system, mainly through reconfiguration, to increase the degree of manufacturing repetitiveness and thus facilitate the adoption of leanness. This research proposes the use of leading-edge intelligent agent simulation to extend lean principles and techniques to the scheduling of non-repetitive production environments with functional layouts and no prior reconfiguration of any form. The simulated system is a dynamic job shop with stochastic order arrivals and processing times operating under a variety of dispatching rules. The modelled job shop is subject to uncertainty in the form of high-priority orders unexpectedly arriving at the system, order cancellations, and machine breakdowns. The effect of these stochastic disruptions on system performance before and after the introduction of leanness is analysed in terms of a number of time-, due-date-, and work-in-progress-related performance metrics.
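
    For intuition about dispatching rules under uncertainty, the toy sketch below applies one common rule (shortest processing time) to a single machine with stochastic processing times and random breakdowns. The rule, distributions, and breakdown model are assumptions for illustration, not the paper's agent-based job-shop model.

        # Toy illustration of dispatching-rule scheduling under uncertainty.
        import heapq, random

        random.seed(1)

        # Ten orders with exponentially distributed processing times.
        queue = [(random.expovariate(0.8), job_id) for job_id in range(10)]
        heapq.heapify(queue)

        clock, completed = 0.0, []
        while queue:
            p_time, job_id = heapq.heappop(queue)   # SPT rule: shortest job first
            clock += p_time
            if random.random() < 0.1:               # random machine breakdown
                clock += 5.0                        # repair delay before the next job
            completed.append((job_id, round(clock, 2)))

        print(completed)                            # completion time of each job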

    Roadmap on signal processing for next generation measurement systems

    Signal processing is a fundamental component of almost any sensor-enabled system, with a wide range of applications across different scientific disciplines. Time series data, images, and video sequences comprise representative forms of signals that can be enhanced and analysed for information extraction and quantification. The recent advances in artificial intelligence and machine learning are shifting the research attention towards intelligent, data-driven, signal processing. This roadmap presents a critical overview of the state-of-the-art methods and applications aiming to highlight future challenges and research opportunities towards next generation measurement systems. It covers a broad spectrum of topics ranging from basic to industrial research, organized in concise thematic sections that reflect the trends and the impacts of current and future developments per research field. Furthermore, it offers guidance to researchers and funding agencies in identifying new prospects.

    Empirical Evaluation of Vehicle Detection, Tracking And Recognition Algorithms Operating On Real Time Video Feeds

    A traffic surveillance camera system is an important part of an intelligent transportation system (Zhang et al., 2013). Such a system is capable of performing useful object detections on the incoming feed. These detected objects can then be used for tracking purposes, which forms the basis for monitoring important traffic data such as collisions, vehicle counts, pedestrian counts, and so on. Furthermore, additional information such as the weather conditions, time of day, and date can also be extracted from a live feed (Sun et al., 2004). Different algorithms can yield different results for any given video input. Moreover, parameters such as the resolution, frames-per-second (fps) count, and lighting conditions of the video input also affect performance. Therefore, it is imperative to compare the various image processing and deep learning algorithms and evaluate their performance before deploying them in real time.
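
    One building block of such an empirical evaluation is matching each detection against ground truth. The sketch below scores a single frame with intersection-over-union (IoU); the boxes, the 0.5 threshold, and the precision metric are illustrative assumptions rather than the paper's full evaluation protocol.

        # Sketch of one evaluation step: matching detections to ground truth by IoU.
        # Boxes are (x1, y1, x2, y2); the 0.5 threshold is a common convention.
        def iou(a, b):
            ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
            ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
            inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
            area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
            union = area(a) + area(b) - inter
            return inter / union if union else 0.0

        ground_truth = [(10, 10, 50, 50), (60, 60, 100, 100)]
        detections = [(12, 8, 48, 52), (200, 200, 220, 220)]

        # A detection counts as a true positive if it overlaps some truth box enough.
        true_pos = sum(any(iou(d, g) >= 0.5 for g in ground_truth) for d in detections)
        print(f"precision = {true_pos / len(detections):.2f}")   # 0.50 for this toy frame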