72,924 research outputs found

    Electric field emissions of FPGA chip based on gigahertz transverse electromagnetic cell modeling and measurements

    Modern integrated circuits (ICs) are significant sources of undesired electromagnetic waves. Therefore, characterization of chip-level emission is essential to comply with EMC tests at the product level. A Gigahertz Transverse Electromagnetic (GTEM) cell is a common test instrument used to measure IC radiated emission, and its test cost is relatively low. Regular IC radiated emission measurements using a GTEM cell tend to neglect some significant emission sources. Thus, this research proposed an alternative methodology for measuring the field of the IC inside the GTEM cell in order to optimize the field measurements. This study also analysed the overall GTEM cell performance using transmission line theory. An FPGA chip was adopted as the IC under test because of its flexibility to be configured as any digital circuit. The investigations discovered that the impact of the FPGA board's supporting components and interconnection cables can be significantly reduced with appropriate shielding and grounding. The electric field at a far distance from the FPGA chip was predicted using the dipole moment technique. In particular, the dipole moment model represents the tiny horizontal and vertical radiating elements inside the FPGA chip as a Hertzian antenna and a small current loop. Equations to predict the horizontal and vertical electric fields were developed from the Hertzian antenna and small current loop models, which relate the tiny radiation sources to electric and magnetic dipole moments. The prediction was validated with 3-meter field measurements in a semi-anechoic chamber (SAC). In addition, a spiral-like pattern was developed to obtain a correction factor that further improves the correlation between prediction and SAC measurement. The results revealed that the correction factor effectively reduced the gap between the predicted and measured fields and boosted the correlation coefficient by 44%. The difference in peak values was also limited to less than 0 dB after correction. These results suggest a promising approach for future EMI testing of ICs using the cheaper GTEM cell.
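    The abstract does not reproduce the developed prediction equations, so for orientation only, the textbook far-field magnitudes of the two canonical sources it names are given below, with k the wavenumber, eta_0 the free-space impedance (about 377 ohms), r the distance and theta the polar angle:

    \[
      |E_\theta| \;=\; \frac{\eta_0\, k\,(I\,dl)}{4\pi r}\,\sin\theta
      \qquad \text{(Hertzian antenna, electric dipole moment } I\,dl\text{)}
    \]
    \[
      |E_\phi| \;=\; \frac{\eta_0\, k^2\, m}{4\pi r}\,\sin\theta
      \qquad \text{(small current loop, magnetic dipole moment } m = IA\text{)}
    \]

    Both terms fall off as 1/r in the far field, which is consistent with extrapolating dipole moments extracted from GTEM measurements out to the 3-meter SAC distance.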

    Block-Based Development of Mobile Learning Experiences for the Internet of Things

    The Internet of Things enables experts in given domains to create smart user experiences for interacting with the environment. However, developing such experiences requires strong programming skills, which are challenging for non-technical users to acquire. This paper presents several extensions to the block-based programming language used in App Inventor to make the creation of mobile apps for smart learning experiences less challenging. Such apps are used to process and graphically represent data streams from sensors by applying map-reduce operations. A workshop with students without previous experience in the Internet of Things (IoT) or mobile app programming was conducted to evaluate the proposed extensions. As a result, students were able to create small IoT apps that ingest, process and visually represent data in a simpler form than with App Inventor's standard features. In addition, an experimental study was carried out in a mobile app development course with academics from diverse disciplines. Results showed it was faster and easier for novice programmers to develop the proposed app using the new stream processing blocks.
    Spanish National Research Agency (AEI) - ERDF fund
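    The map-reduce pattern the paper's blocks expose can be illustrated outside App Inventor. The Python sketch below mirrors the pipeline described (map a unit conversion, filter alert readings, reduce to an aggregate); all names, values and thresholds are invented for illustration and are not the paper's API.

    from functools import reduce

    # Invented example stream: Celsius readings from a temperature sensor.
    readings = [21.5, 22.1, 35.0, 21.9, 36.2]

    fahrenheit = map(lambda c: c * 9 / 5 + 32, readings)   # map: unit conversion
    alerts = filter(lambda f: f > 90, fahrenheit)          # filter: keep hot readings
    peak = reduce(max, alerts, float("-inf"))              # reduce: aggregate to a peak

    print(f"Highest alert reading: {peak:.1f} F")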

    Electricity consumption forecasting using Adaptive Neuro-Fuzzy Inference System (ANFIS)

    Universiti Tun Hussein Onn Malaysia (UTHM) is a developing Malaysian technical university that has grown considerably since its formation in 1993. Accurate forecasting of its future electricity consumption is therefore crucial for energy management and saving. Although there are previous works on electricity consumption forecasting using the Adaptive Neuro-Fuzzy Inference System (ANFIS), most of them use multivariate data. In this study, we have only univariate data of UTHM electricity consumption, from January 2009 to December 2018, and wish to forecast the 2019 consumption. The univariate data were converted to multivariate form, and ANFIS was chosen because it combines the advantages of an Artificial Neural Network (ANN) and a Fuzzy Inference System (FIS). ANFIS yields a MAPE between actual and predicted electricity consumption of 0.4002%, which is relatively low compared to previous works on UTHM electricity forecasting using a time series model (11.14%), a first-order fuzzy time series model (5.74%), and multiple linear regression (10.62%).
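    The abstract says the univariate series was converted to multivariate form before ANFIS training but does not state the scheme. A common approach is a sliding window of lagged values; the sketch below assumes a 12-month window, which is an illustrative choice rather than the paper's stated parameter.

    import numpy as np

    def sliding_window(series, n_lags=12):
        """Each row: n_lags past values (inputs) followed by the next value (target)."""
        X, y = [], []
        for i in range(len(series) - n_lags):
            X.append(series[i:i + n_lags])
            y.append(series[i + n_lags])
        return np.array(X), np.array(y)

    monthly_kwh = np.random.rand(120)   # stand-in for the Jan 2009 - Dec 2018 data
    X, y = sliding_window(monthly_kwh)
    print(X.shape, y.shape)             # (108, 12) (108,)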

    Improving the Quality of Technology-Enhanced Learning for Computer Programming Courses

    Teaching computing courses is a major challenge for the majority of lecturers in Libyan higher learning institutions. These courses contain numerous abstract concepts that cannot be easily explained using traditional educational methods. This paper describes the rationale, design, development and implementation stages of an e-learning package (including multimedia resources such as simulations, animations, and videos) using the ASSURE model. This training package can be used by students before they attend practical computer lab sessions, preparing them by developing technical skills and applying the concepts and theories presented in lectures through supplementary study and exercises.

    Leachate treatment by conventional coagulation, electrocoagulation and two-stage coagulation (conventional coagulation and electrocoagulation)

    Leachate is widely explored and investigated because it is highly polluted and difficult to treat. Leachate treatment commonly involves advanced, complicated and high-cost processes. Conventional coagulation is widely used in wastewater treatment, but sludge production is its biggest constraint. Electrocoagulation is an alternative to the conventional method because it has the same application but produces less sludge and requires simple equipment. Thus, combining conventional coagulation and electrocoagulation can improve the efficiency of the coagulation process in leachate treatment. This article focuses on the efficiency of the single and combined treatments, as well as the improvement achieved by the combined treatment. Based on the review, the reduction in current density and coagulant dose was perceptible: reductions of as much as 50% in current density, treatment duration, and coagulant dose can be obtained using the combined treatment. The combined treatment is thus able to reduce cost and treatment duration at the same time. Hence, it offers an alternative technique for the removal of pollutants in landfill leachate treatment.

    Introducing instrumentation and data acquisition to mechanical engineering students using LabVIEW

    For several years, LabVIEW has been used within the Department of Mechanical Engineering at the University of Strathclyde as the basis for introducing the basic concepts and practice of data acquisition, and more generally instrumentation, to postgraduate engineering students and undergraduate project students. The objectives of introducing LabVIEW within the curriculum were to expose students to instrumentation and experimental analysis, and to create courseware that could be used flexibly for a range of students. It was also important that staff time for laboratory work be kept to manageable levels. A course module was developed which allows engineering students with little or no previous knowledge of instrumentation or programming to become acquainted with the basics of programming, experimentation and data acquisition. The basic course structure has been used to teach undergraduates and postgraduates as well as laboratory technical staff. The paper describes the objectives of using LabVIEW for teaching, the structure of the module developed, the response of students who have taken the course, and how delivery is intended to be expanded to greater student numbers.

    Spatial-temporal data modelling and processing for personalised decision support

    The purpose of this research is to model dynamic data without losing any of the temporal relationships, and to predict the likelihood of an outcome as far in advance of its actual occurrence as possible. To this end, a novel computational architecture for personalised (individualised) modelling of spatio-temporal data based on spiking neural network methods (PMeSNNr), with a three-dimensional visualisation of relationships between variables, is proposed. In brief, the architecture transfers spatio-temporal data patterns from a multidimensional input stream into internal patterns in the spiking neural network reservoir. These patterns are then analysed to produce a personalised model for either classification or prediction, depending on the specific needs of the situation. The architecture was constructed using MatLab© in several individual modules linked together to form NeuCube (M1). This methodology has been applied to two real-world case studies: first, to data for the prediction of stroke occurrences on an individual basis; second, to ecological data for aphid pest abundance prediction. The two main objectives when judging outcomes of the modelling are accurate prediction and achieving it at the earliest possible time point. The implications of these findings are significant for health care management and environmental control. As the case studies represent vastly different application fields, they reveal more of the potential and usefulness of NeuCube (M1) for modelling data in an integrated manner. This in turn can identify previously unknown (or less understood) interactions, both increasing the level of reliance that can be placed on the model created and enhancing our understanding of the complexities of the world around us without the need for oversimplification.
    Keywords: Personalised modelling; Spiking neural network; Spatial-temporal data modelling; Computational intelligence; Predictive modelling; Stroke risk prediction
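    The abstract describes turning a multidimensional input stream into internal spike patterns in the reservoir. A common first step in such pipelines is threshold-based (delta) spike encoding; the sketch below shows one channel, with the threshold chosen purely for illustration rather than taken from the thesis.

    import numpy as np

    def delta_encode(signal, threshold=0.1):
        """Emit +1/-1 spikes when the signal rises/falls by more than `threshold`."""
        spikes = np.zeros(len(signal), dtype=int)
        baseline = signal[0]
        for t in range(1, len(signal)):
            diff = signal[t] - baseline
            if diff > threshold:
                spikes[t], baseline = 1, signal[t]
            elif diff < -threshold:
                spikes[t], baseline = -1, signal[t]
        return spikes

    stream = np.sin(np.linspace(0, 4 * np.pi, 100))   # one channel of an input stream
    print(delta_encode(stream)[:20])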
