
    The Knowledge Life Cycle for e-learning

    In this paper, we examine the semantic aspects of e-learning from both pedagogical and technological points of view. We suggest that if semantics are to fulfil their potential in the learning domain, then a paradigm shift in perspective is necessary: from information-based content delivery to knowledge-based collaborative learning services. We propose a semantics-driven Knowledge Life Cycle that characterises the key phases in managing semantics and knowledge, show how it can be applied to the learning domain, and demonstrate the value of semantics via an example of knowledge reuse in learning assessment management.

    An Extension of NDT to Model Entity Reconciliation Problems

    Within the development of software systems, web applications may be among the most widespread at present, owing to the many advantages they provide, such as multiplatform support, speed of access, and no requirement for extremely powerful hardware. Because so many web applications are being developed, the volume of information generated daily is enormous. In managing all this information, the entity reconciliation problem arises: identifying objects that refer to the same real-world entity. This paper proposes a solution to this problem from a web perspective. To this end, the NDT methodology has been taken as a reference and extended with new activities, artefacts, and documents to cover this problem. Ministerio de Economía y Competitividad TIN2013-46928-C3-3-R; Ministerio de Economía y Competitividad TIN2016-76956-C3-2-R; Ministerio de Economía y Competitividad TIN2015-71938-RED
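
    The core problem the abstract names — deciding whether two records refer to the same real-world entity — can be sketched in a few lines. The records, field name, and similarity threshold below are hypothetical illustrations; the paper's actual contribution is the extended NDT activities and artefacts, not this matcher.

    ```python
    from difflib import SequenceMatcher

    def same_entity(a, b, threshold=0.85):
        """Toy entity-reconciliation check: match two records when the
        similarity of their normalised names exceeds a threshold.
        (Illustrative only; real pipelines compare many fields.)"""
        na = a["name"].strip().lower()
        nb = b["name"].strip().lower()
        return SequenceMatcher(None, na, nb).ratio() >= threshold
    ```

    In practice such pairwise checks are only one step; a full reconciliation process also covers blocking, conflict resolution, and provenance, which is what a methodology-level treatment like the NDT extension addresses.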

    Exploring the potential of dynamic mode decomposition in wireless communication and neuroscience applications

    The exponential growth of available experimental, simulation, and historical data from modern systems, including those typically considered divergent (e.g., neuroscience procedures and wireless networks), has created a persistent need for effective data mining and analysis techniques. Most such systems are high-dimensional and dynamical, exhibiting rich multiscale phenomena in both space and time. Engineering studies of complex linear and non-linear dynamical systems are especially challenging, as the behavior of the system is often unknown and complex. Studying such systems necessitates discovering and modeling the underlying evolving dynamics. In such cases, a simplified, predictive model of the flow evolution profile must be developed based on observations/measurements collected from the system. Consequently, data-driven algorithms have become an essential tool for modeling and analyzing complex systems characterized by high nonlinearity and dimensionality. The field of data-driven modeling and analysis of complex systems is rapidly advancing, and the associated investigations are poised to revolutionize the engineering, biomedical, and physical sciences. By applying such modeling techniques, a complex system can be simplified into low-dimensional models with spatial-temporal structures described using system measurements. These techniques enable complex system modeling without requiring knowledge of the dynamic equations governing the system's operation. The primary objective of the work detailed in this dissertation was characterizing, identifying, and predicting the behavior of the systems under analysis. In particular, characterization and identification entailed finding patterns embedded in system data; prediction required evaluating system dynamics. This work proposes the implementation of dynamic mode decomposition (DMD), a fully data-driven technique, to characterize dynamical systems from extracted measurements.
DMD employs singular value decomposition (SVD), which reduces the high-dimensional measurements collected from a system, and computes the eigenvalues and eigenvectors of a linearly approximated model. In other words, by estimating the underlying dynamics directly from measurements, DMD serves as a powerful tool for system characterization without requiring knowledge of the governing dynamical equations. Overall, the work presented herein demonstrates the potential of DMD for analyzing and modeling complex systems in the fields of wireless communication (i.e., wireless technology identification) and neuroscience (i.e., chemotherapy-induced peripheral neuropathy [CIPN] identification for cancer patients). In the former, a novel technique based on DMD was initially developed for wireless coexistence analysis. The scheme can differentiate various wireless technologies, including GSM and LTE signals in the cellular domain, IEEE802.11n, ac, and ax in the Wi-Fi domain, and Bluetooth and Zigbee in the personal wireless domain. By capturing embedded periodic features transmitted within the signal, the proposed DMD-based technique can identify a signal's time-domain signature. With regard to cancer neuroscience, a DMD-based scheme was developed to capture the pattern of plantar pressure variability due to the development of neuropathy resulting from neurotoxic chemotherapy treatment. The developed technique modeled gait pressure variations across multiple steps at three plantar regions, which characterized the development of CIPN in patients with uterine cancer. The obtained results demonstrated that DMD can effectively model various systems and characterize system dynamics. Given the advantages of fast data processing, minimal required data preprocessing, and minimal required signal observation time intervals, DMD has proven to be a powerful tool for system analysis and modeling.
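
    The SVD-based procedure the abstract describes (reduce the snapshots, then take eigenvalues and eigenvectors of a low-rank linear model) can be sketched as standard "exact DMD". This is a generic illustration of the technique, not the dissertation's code; the snapshot layout (one column per time step) and rank `r` are assumptions.

    ```python
    import numpy as np

    def dmd(X, r):
        """Exact DMD: fit a rank-r linear model x_{k+1} ≈ A x_k to snapshots.
        X has one measurement vector per column, ordered in time."""
        X1, X2 = X[:, :-1], X[:, 1:]             # time-shifted snapshot pairs
        U, s, Vh = np.linalg.svd(X1, full_matrices=False)
        U, s, Vh = U[:, :r], s[:r], Vh[:r, :]    # rank-r truncation
        # Project A onto the leading subspace: A_tilde = U* X2 V S^{-1}
        A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
        evals, W = np.linalg.eig(A_tilde)        # DMD eigenvalues: the dynamics
        Phi = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W  # exact DMD modes
        return evals, Phi
    ```

    For a purely oscillatory signal, the eigenvalues land on the unit circle at angles matching the oscillation frequency, which is exactly the kind of embedded periodic feature the wireless-identification scheme exploits.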

    User-centered visual analysis using a hybrid reasoning architecture for intensive care units

    One problem pertaining to Intensive Care Unit information systems is that, in some cases, a very dense display of data can result. To ensure the overview and readability of the increasing volumes of data, some special features are required (e.g., data prioritization, clustering, and selection mechanisms) together with the application of analytical methods (e.g., temporal data abstraction, principal component analysis, and detection of events). This paper addresses the problem of improving the integration of the visual and analytical methods applied to medical monitoring systems. We present a knowledge- and machine learning-based approach to support the knowledge discovery process with appropriate analytical and visual methods. It can potentially benefit the development of user interfaces for intelligent monitors that assist with the detection and explanation of new, potentially threatening medical events. The proposed hybrid reasoning architecture provides an interactive graphical user interface to adjust the parameters of the analytical methods based on the user's task at hand. The action sequences performed by the user on the graphical user interface are consolidated in a dynamic knowledge base with specific hybrid reasoning that integrates symbolic and connectionist approaches. These captured sequences of expert knowledge acquisition can facilitate knowledge emergence during similar experiences and positively impact the monitoring of critical situations. The provided graphical user interface, incorporating a user-centered visual analysis, is exploited to facilitate the natural and effective representation of clinical information for patient care.
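
    One of the analytical methods the abstract names, principal component analysis for event detection, can be illustrated by scoring each time sample of a multichannel record by its reconstruction error under a low-rank model. This is a generic sketch, not the paper's hybrid architecture; the channel layout and component count are assumptions.

    ```python
    import numpy as np

    def pca_anomaly_scores(X, n_components=2):
        """Score each sample of a (samples, channels) monitoring record by its
        reconstruction error under a rank-n_components PCA model; high scores
        flag samples that break the usual cross-channel correlations."""
        Xc = X - X.mean(axis=0)                  # center each channel
        U, s, Vh = np.linalg.svd(Xc, full_matrices=False)
        P = Vh[:n_components]                    # principal directions
        residual = Xc - (Xc @ P.T) @ P           # part the model cannot explain
        return np.linalg.norm(residual, axis=1)
    ```

    An event detector would then threshold these scores, and an interactive interface of the kind the paper describes could expose the component count and threshold as user-adjustable parameters.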

    Principles of Neuromorphic Photonics

    In an age overrun with information, the ability to process reams of data has become crucial. The demand for data will continue to grow as smart gadgets multiply and become increasingly integrated into our daily lives. Next-generation industries in artificial intelligence services and high-performance computing are so far supported by microelectronic platforms. These data-intensive enterprises rely on continual improvements in hardware. Their prospects are running up against a stark reality: conventional one-size-fits-all solutions offered by digital electronics can no longer satisfy this need, as Moore's law (exponential hardware scaling), interconnection density, and the von Neumann architecture reach their limits. With its superior speed and reconfigurability, analog photonics can provide some relief to these problems; however, complex applications of analog photonics have remained largely unexplored due to the absence of a robust photonic integration industry. Recently, the landscape for commercially-manufacturable photonic chips has been changing rapidly and now promises to achieve economies of scale previously enjoyed solely by microelectronics. The scientific community has set out to build bridges between the domains of photonic device physics and neural networks, giving rise to the field of neuromorphic photonics. This article reviews the recent progress in integrated neuromorphic photonics. We provide an overview of neuromorphic computing, discuss the associated technology (microelectronic and photonic) platforms, and compare their performance metrics. We discuss photonic neural network approaches and challenges for integrated neuromorphic photonic processors while providing an in-depth description of photonic neurons and a candidate interconnection architecture. We conclude with a future outlook of neuro-inspired photonic processing. Comment: 28 pages, 19 figures

    An Adaptive Design Methodology for Reduction of Product Development Risk

    An embedded system's interaction with its environment inherently complicates the understanding of requirements and their correct implementation, and product uncertainty is highest during the early stages of development. Design verification is an essential step in the development of any system, especially an embedded system. This paper introduces a novel adaptive design methodology, which incorporates step-wise prototyping and verification. With each adaptive step, the product-realization level is enhanced while the level of product uncertainty decreases, thereby reducing the overall costs. The backbone of this framework is the development of a Domain Specific Operational (DOP) Model and the associated Verification Instrumentation for Test and Evaluation, developed based on the DOP model. Together they generate functionally valid test sequences for carrying out prototype evaluation. The application of this method is sketched with the help of a case study, 'Multimode Detection Subsystem'. Design methodologies can be compared by defining and computing a generic performance criterion such as Average design-cycle Risk. For the case study, by computing the Average design-cycle Risk, it is shown that the adaptive method reduces the product development risk for a small increase in the total design cycle time. Comment: 21 pages, 9 figures

    Living Innovation Laboratory Model Design and Implementation

    A Living Innovation Laboratory (LIL) is an open and recyclable way for multidisciplinary researchers to remotely control resources and co-develop user-centered projects. In the past few years, several papers about LIL were published, attempting to discuss and define its model and architecture. There is broad agreement on the three characteristics of an LIL: user-centered, co-creation, and context-aware, which distinguish it from test platforms and other innovation approaches. Its existing model consists of five phases: initialization, preparation, formation, development, and evaluation. Goal Net is a goal-oriented methodology for formalizing a process. In this thesis, Goal Net is adopted to derive a detailed and systematic methodology for LIL. The LIL Goal Net Model breaks the five phases of LIL into more detailed steps. Big data, crowdsourcing, crowdfunding, and crowd testing take place at suitable steps to realize UUI, MCC, and PCA throughout the innovation process in LIL 2.0. It could become a guideline for any company or organization to develop a project in the form of an LIL 2.0 project. To prove the feasibility of the LIL Goal Net Model, it was applied to two real cases: one project is a Kinect game and the other is an Internet product. Both were transformed to LIL 2.0 successfully, based on the LIL Goal Net-based methodology. The two projects were evaluated by phenomenography, a qualitative research method for studying human experiences and their relations, in the hope of finding better ways to improve them. Through the phenomenographic study, the positive evaluation results showed that the new generation of LIL had more advantages in terms of effectiveness and efficiency. Comment: This is a book draft

    Recent Achievements in Numerical Simulation in Sheet Metal Forming Processes

    Purpose of this paper: Over the recent 10-15 years, Computer Aided Process Planning and Die Design have evolved into some of the most important engineering tools in sheet metal forming, particularly in the automotive industry. This emerging role is further emphasized by the rapid development of Finite Element Modelling. The purpose of this paper is to give a general overview of the recent achievements in this very important field of sheet metal forming and to introduce some specific results of this development activity. Design/methodology/approach: Concerning CAE activities in sheet metal forming, there are two main approaches: one may be regarded as knowledge-based process planning, the other as simulation-based process planning. The author attempts to integrate these two separate developments into a knowledge- and simulation-based approach by linking commercial CAD and FEM systems. Findings: Applying the above approach, a more powerful and efficient process planning and die design solution can be achieved, radically reducing the time and cost of the product development cycle and improving product quality. Research limitations: Due to the different modelling approaches in CAD and FEM systems, the biggest challenge is to enhance the robustness of the data exchange capabilities between the various systems to provide an even more streamlined information flow. Practical implications: The proposed integrated solutions have great practical importance for improving the global competitiveness of sheet metal forming in this very important segment of industry. Originality/value: The concept described in this paper may have specific value both for process planning and die design engineers.