
    Designing an Energy Efficient Network Using Integration of KSOM, ANN and Data Fusion Techniques

    Energy is the major constraint in a wireless sensor network (WSN), affecting the overall feasibility and performance of the network. With the dynamic and demanding requirements of diverse applications, the need for an energy-efficient network persists. This paper therefore proposes a mechanism for optimizing energy consumption in WSNs through the integration of artificial neural network (ANN) and Kohonen self-organizing map (KSOM) techniques. Clusters are formed and relocated after each iteration to distribute energy effectively and reduce energy depletion at individual nodes. Furthermore, the backpropagation algorithm is used as a supervised learning method to optimize the approach and reduce the loss function. The simulation results show the effectiveness of the proposed energy-efficient network.
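
    The following is a minimal sketch of the kind of KSOM-based clustering step described in the abstract: a small one-dimensional Kohonen map groups nodes by position, and the most energy-rich node of each cluster is picked as cluster head. The field size, map size, learning schedule, and head-selection rule are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch: Kohonen self-organizing map (KSOM) clustering of sensor nodes
# by (x, y) position, then cluster-head selection by residual energy.
# All parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
nodes = rng.uniform(0, 100, size=(50, 2))       # node positions in a 100x100 field
energy = rng.uniform(0.5, 1.0, size=50)         # residual energy per node (J)

n_clusters, iters, lr0, sigma0 = 5, 500, 0.5, 2.0
weights = rng.uniform(0, 100, size=(n_clusters, 2))   # 1-D map of cluster prototypes

for t in range(iters):
    x = nodes[rng.integers(len(nodes))]
    lr = lr0 * np.exp(-t / iters)                                 # decaying learning rate
    sigma = sigma0 * np.exp(-t / iters)                           # shrinking neighborhood
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))          # best matching unit
    dist = np.abs(np.arange(n_clusters) - bmu)                    # grid distance to BMU
    h = np.exp(-(dist ** 2) / (2 * sigma ** 2))                   # neighborhood function
    weights += lr * h[:, None] * (x - weights)                    # pull prototypes toward x

assign = np.argmin(np.linalg.norm(nodes[:, None] - weights, axis=2), axis=1)
heads = [int(np.where(assign == c)[0][np.argmax(energy[assign == c])])
         for c in range(n_clusters) if np.any(assign == c)]
print("cluster heads:", heads)
```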

    Enhanced Home Energy Management Scheme (EHEM) in Smart Grids

    Wireless sensor networks (WSNs) have become one of the most important components in home environment applications, playing a major role in the creation and development of smart home environments. A smart home creates a home area network (HAN) that can be used in different applications, including smart grids. In this paper, we propose an enhancement to the in-Home Energy Management (iHEM) scheme, namely EHEM, to reduce energy consumption by shifting residents' demands to mid-peak or off-peak periods depending on appliance priorities and tolerable delays. The proposed system handles challenging cases by using an internal storage battery. Its performance is compared against the traditional iHEM scheme based on the total cost of power consumption. The obtained results show a slight improvement over the existing iHEM scheme.
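
    Below is a toy demand-shifting rule in the spirit of the idea described above: a deferrable appliance request is moved out of on-peak hours when its allowed delay permits it. The tariff values, appliance list, and delay limits are illustrative assumptions, not the EHEM scheme evaluated in the paper.

```python
# Hedged sketch: shift deferrable appliance start times to cheaper tariff periods.
ON_PEAK, MID_PEAK, OFF_PEAK = 0.18, 0.11, 0.06   # $/kWh, example time-of-use tariff

def tariff(hour):
    if 17 <= hour < 21:      # evening on-peak
        return ON_PEAK
    if 7 <= hour < 17:       # daytime mid-peak
        return MID_PEAK
    return OFF_PEAK          # night off-peak

def schedule(request_hour, max_delay_h):
    """Return the cheapest start hour within the appliance's tolerated delay."""
    candidates = [(request_hour + d) % 24 for d in range(max_delay_h + 1)]
    return min(candidates, key=tariff)

appliances = [("dishwasher", 18, 6), ("washing machine", 19, 4), ("oven", 18, 0)]
for name, hour, delay in appliances:
    start = schedule(hour, delay)
    print(f"{name}: requested {hour}:00 -> scheduled {start}:00 at {tariff(start):.2f} $/kWh")
```

    A priority-aware version would simply restrict `max_delay_h` per appliance class, which is how the scheme's "priorities and delays" could enter such a rule.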

    Bandwidth Allocation Mechanism based on Users' Web Usage Patterns for Campus Networks

    Managing bandwidth in campus networks has become a challenge in recent years. The limited bandwidth resource and the continuous growth in the number of users force IT managers to think about bandwidth allocation strategies. This paper introduces a mechanism for allocating bandwidth based on users' web usage patterns. The main purpose is to assign higher bandwidth to users who are inclined to browse educational websites than to those who are not. The proposed technique involves several stages: preprocessing of the weblogs, class labeling of the dataset, computation of the feature subspaces, training of the ANN for the LDA/GSVD algorithm, visualization, and bandwidth allocation. The proposed method was applied to real weblogs from a university's proxy servers. The results indicate that the method is useful in classifying users who used the internet for educational purposes and those who did not. The developed ANN for the LDA/GSVD algorithm outperformed the existing algorithm by up to 50%, which indicates that the approach is efficient. Furthermore, the results show that few users browsed educational content. Through this mechanism, users will be encouraged to use the internet for educational purposes, and IT managers can make better plans to optimize the distribution of bandwidth.
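
    The sketch below illustrates only the log-labeling and allocation ends of such a pipeline: each request is marked educational or not by domain, a per-user feature (share of educational browsing) is computed, and that share is mapped to a bandwidth cap. The domain whitelist, thresholds, and caps are illustrative assumptions; the paper itself projects features with LDA/GSVD and classifies them with an ANN.

```python
# Hedged sketch: weblog labeling and bandwidth-tier assignment (toy version).
from collections import defaultdict

EDU_DOMAINS = (".edu", "scholar.google.com", "jstor.org")   # assumed whitelist

weblog = [                                     # (user, requested URL) pairs
    ("alice", "https://www.university.edu/library"),
    ("alice", "https://scholar.google.com/scholar?q=wsn"),
    ("bob",   "https://videos.example.com/watch?v=123"),
    ("bob",   "https://www.jstor.org/stable/456"),
]

hits = defaultdict(lambda: [0, 0])             # user -> [educational, total]
for user, url in weblog:
    hits[user][1] += 1
    if any(d in url for d in EDU_DOMAINS):
        hits[user][0] += 1

def bandwidth_mbps(edu_share):
    """Map the educational-browsing share to a bandwidth cap (Mbps)."""
    return 20 if edu_share >= 0.7 else 10 if edu_share >= 0.3 else 4

for user, (edu, total) in hits.items():
    share = edu / total
    print(f"{user}: {share:.0%} educational -> {bandwidth_mbps(share)} Mbps")
```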

    Artificial Neural Networks, Support Vector Machine And Energy Detection For Spectrum Sensing Based On Real Signals

    A cognitive radio (CR) is an intelligent wireless communication system able to improve the utilization of the spectral environment. Spectrum sensing (SS) is one of the most important phases in the cognitive radio cycle; it consists of detecting the presence of signals in a particular frequency band. In order to detect primary user (PU) presence, this paper proposes a low-cost, low-power spectrum sensing implementation. The proposed platform is tested on real-world signals generated by a Raspberry Pi board and a 433 MHz wireless transmitter (ASK (amplitude-shift keying) and FSK (frequency-shift keying) modulation), with an RTL-SDR dongle used as the reception interface. We compare the performance of three methods for the SS operation: the energy detection (ED) technique, the artificial neural network (ANN), and the support vector machine (SVM). The received data are thus classified as a PU signal or noise by the ED method, and by training and testing the proposed ANN and SVM classification models. The algorithms are implemented in MATLAB. To determine the best ANN architecture, two different training algorithms are compared; we also investigate the effect of several SVM functions. The main objective is to find the best of the three methods for signal detection. The performance of the proposed system is evaluated through the probability of detection and the probability of false alarm. This comparative work has shown that SS by SVM can be more accurate than ANN and ED.
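
    The following is a textbook sketch of the energy detection method that the paper uses as one of its three detectors: the average signal energy is compared to a threshold set from the noise variance and a target false-alarm probability (Gaussian approximation for large N). The signal parameters are synthetic assumptions, not the Raspberry Pi / RTL-SDR captures used in the study, and the paper's own experiments are in MATLAB rather than Python.

```python
# Hedged sketch: energy detector for spectrum sensing.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
N, sigma2, pfa = 1000, 1.0, 0.05                      # samples, noise power, target Pfa

# Approximate ED threshold for a given false-alarm probability (large-N Gaussian approx.)
threshold = sigma2 * (1 + norm.isf(pfa) * np.sqrt(2 / N))

def detect(x):
    """Return True if the energy statistic says a primary user is present."""
    return np.mean(np.abs(x) ** 2) > threshold

noise = rng.normal(0, np.sqrt(sigma2), N)
signal = noise + np.sqrt(0.5) * np.sign(rng.standard_normal(N))   # noise + ASK-like symbols

print("noise only ->", detect(noise))      # expected: False most of the time
print("PU present ->", detect(signal))     # expected: True
```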

    Study of Computational and Experimental Methodologies for Cracks Recognition of Vibrating Systems using Modal Parameters

    Structural members and machine elements are mostly subjected to progressive static and dynamic loading, which may cause the initiation of defects in the form of cracks. The damage may be due to normal operation, accidents, or severe natural calamities such as earthquakes or storms, and it may lead to catastrophic failure or collapse of the structure. Identifying damage in structures is therefore important not only for safe operation but also to prevent the loss of economy and lives. Condition monitoring of engineering systems has attracted researchers and scientists seeking automated fault-diagnosis mechanisms based on the change in vibration response before and after damage. Structural steel is widely used in engineering systems such as bridges, railway coaches, ships, and automobiles, while glass-fibre-reinforced epoxy layered composites have become popular for constructing engineering structures because of their high stiffness and strength-to-weight ratio, better damage tolerance, and wear resistance. Therefore, layered composite and structural steel members are considered in the current study. A theoretical analysis has been performed to obtain the vibration signatures (natural frequencies and mode shapes) of multiply cracked composite and structural steel beams. The presence of a crack introduces additional flexibility, which is evaluated from the strain energy release rate given by linear fracture mechanics; this additional flexibility alters the dynamic signatures of the cracked beam. The local stiffness matrix is calculated as the inverse of the local dimensionless compliance matrix. Finite element analysis of the cracked cantilever beam has been carried out using the commercially available finite element package ANSYS. The analysis shows that factors such as the orientation, number, and position of the cracks affect the performance and effectiveness of the damage detection techniques. Automated artificial intelligence (AI) techniques, such as fuzzy controllers, neural networks, and hybrid AI systems, are developed as multiple-fault diagnosis systems using the vibration response of cracked cantilever beams. Experiments have been conducted to verify the performance and accuracy of the proposed methods, and good agreement is observed between the results.
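
    The snippet below sketches only the step stated in the abstract, namely obtaining a local stiffness matrix as the inverse of the crack's dimensionless compliance matrix, and shows on a toy two-degree-of-freedom model how added flexibility lowers natural frequencies. The compliance values, masses, and spring constants are illustrative assumptions, not the beam parameters of the thesis.

```python
# Hedged sketch: local stiffness from compliance, and frequency shift of a toy model.
import numpy as np

# Example local dimensionless compliance matrix at the crack section
C = np.array([[0.012, 0.003],
              [0.003, 0.020]])
K_local = np.linalg.inv(C)           # local stiffness matrix at the crack
print("local stiffness matrix:\n", K_local)

# Toy 2-DOF model: a crack adds flexibility, i.e. softens one spring stiffness
def natural_freqs(k1, k2, m=1.0):
    K = np.array([[k1 + k2, -k2], [-k2, k2]])
    M = np.eye(2) * m
    eigvals = np.linalg.eigvals(np.linalg.inv(M) @ K)
    return np.sqrt(np.sort(eigvals.real)) / (2 * np.pi)   # natural frequencies in Hz

print("intact model :", natural_freqs(1e4, 1e4))
print("cracked model:", natural_freqs(1e4, 0.8e4))        # crack softens k2, frequencies drop
```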

    Geothermal Paving Systems for Urban Runoff Treatment and Renewable Energy Efficiency

    Water and energy are two of the most precious and essential resources; they are inseparably connected and vital for the survival and well-being of humanity. Sustainable water resource and energy management emphasizes the need for a holistic approach to meeting the needs of present and future generations. In order to identify the needs and obstacles relating to water reuse and renewable energy initiatives, Hanson Formpave, in partnership with The University of Edinburgh, implemented a five-year pilot project between May 2005 and June 2010. The project addressed the use of sustainable urban drainage systems (SUDS), such as permeable pavement systems (PPS), and the integration of renewable energy tools, such as geothermal heat pumps (GHPs), focusing on water quality when the drainage system is combined with GHPs. Twelve tanked, laboratory-scale experimental PPS of different compositions were evaluated at The King's Buildings campus (The University of Edinburgh, Scotland). Variations in design included the presence of geotextile layers and geothermal heating/cooling applications. The experimental rigs were examined over a two-year period (March 2008 to April 2010). Two types of urban stormwater were used in the analysis: (i) gully pot liquor and (ii) gully pot liquor spiked with Canis lupus familiaris (dog) faeces. This urban wastewater represents the extreme worst-case scenario from a storm event that can occur on a permeable pavement parking lot. The pavement systems operated in batch flow to mimic weekly storm events and reduce pumping costs. Six PPS were located indoors in a controlled environment and six corresponding PPS were placed outdoors, allowing a direct comparison of controlled and uncontrolled environmental conditions. The outdoor rig experienced natural weather conditions, whilst the indoor rig operated under regulated temperature, humidity, and light. The project assessed the performance of these pavement rigs, with and without integrated ground-source heating and cooling, for water quality treatment from a physical, chemical, and microbiological perspective. The performance of the GHP was measured by the energy efficiency ratio (EER) for steady-state cooling and the coefficient of performance (COP) for the heating cycle. Both the combined PPS and GHP systems and the standalone systems significantly lowered all physicochemical and microbial water quality parameters, by 70-99.99%. Outflow concentrations for all pavement systems met the European Commission Urban Waste Water Treatment Directive (91/271/EEC). The presence of geotextiles resulted in a significant reduction of contaminants compared with PPS without them (p < 0.05). Photocatalytic disinfection with titanium dioxide (TiO2) was applied to the PPS effluent for further treatment and polishing of the stormwater; after disinfection, the water met the United States Environmental Protection Agency (US EPA) water recycling guidelines and the World Health Organisation (WHO) guidelines for potable water with regard to microbial contamination. An energy and temperature balance was developed for two PPS using a fourth-order Runge-Kutta numerical method to model the heat fluxes within the pavement system. Machine learning techniques, namely artificial neural networks (backpropagation feed-forward networks) and self-organising maps (SOM), were applied and successfully predicted the effluent concentrations of nutrients, biochemical oxygen demand (BOD), and microbial pollutants. The overall outcome of this research is a significant contribution to the development of a new generation of sustainable and eco-friendly pavements. The project demonstrates that PPS is one of the most appropriate systems for GHP installation and that the installation does not affect its efficiency for water pollutant removal.
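
    As a small illustration of the fourth-order Runge-Kutta integration mentioned above, the sketch below applies an RK4 step to a one-node pavement temperature balance (absorbed solar gain minus convective loss). The thermal constants, forcing, and single-node simplification are illustrative assumptions, not the calibrated energy balance developed in the thesis.

```python
# Hedged sketch: RK4 integration of a toy pavement surface temperature balance.
import numpy as np

def dT_dt(t, T, T_air=12.0, solar=250.0, h=15.0, rho_c_d=2.0e6 * 0.1):
    """Energy balance per unit area: absorbed solar minus convection, over heat capacity."""
    return (0.8 * solar * max(np.sin(2 * np.pi * t / 86400), 0) - h * (T - T_air)) / rho_c_d

def rk4_step(f, t, T, dt):
    k1 = f(t, T)
    k2 = f(t + dt / 2, T + dt / 2 * k1)
    k3 = f(t + dt / 2, T + dt / 2 * k2)
    k4 = f(t + dt, T + dt * k3)
    return T + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

T, dt = 10.0, 600.0                        # initial surface temperature (deg C), 10-min step
for step in range(int(86400 / dt)):        # integrate over one day
    T = rk4_step(dT_dt, step * dt, T, dt)
print(f"pavement temperature after 24 h: {T:.1f} deg C")
```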

    A system for recognizing human emotions based on speech analysis and facial feature extraction: applications to Human-Robot Interaction

    With the advance of artificial intelligence, humanoid robots have started to interact with ordinary people on the basis of a growing understanding of psychological processes. Accumulating evidence in Human-Robot Interaction (HRI) suggests that research is focusing on emotional communication between humans and robots to create social perception, cognition, desired interaction, and sensation. Furthermore, robots need to perceive human emotion and optimize their behaviour to help and interact with human beings in various environments. The most natural way to recognize basic emotions is to extract sets of features from human speech, facial expression, and body gesture. A system for the recognition of emotions based on speech analysis and facial feature extraction can therefore have interesting applications in Human-Robot Interaction; the HRI ontology explains how knowledge from the fundamental sciences is applied in physics (sound analysis), mathematics (face detection and perception), philosophy (behaviour theory), and robotics. In this project, we carry out a study to recognize the basic emotions (sadness, surprise, happiness, anger, fear, and disgust) and propose a methodology and software program for the classification of emotions based on speech analysis and facial feature extraction. The speech analysis phase investigates the appropriateness of using acoustic (pitch value, pitch peak, pitch range, intensity, and formant) and phonetic (speech rate) properties of emotive speech with the freeware program PRAAT, and consists of generating and analysing a graph of the speech signals. The proposed architecture investigates the appropriateness of analysing emotive speech with minimal use of signal processing algorithms. Thirty participants repeated five English sentences (with durations typically between 0.40 s and 2.5 s) in order to extract data on pitch (value, range, and peak) and rising-falling intonation; pitch alignments were evaluated and the results compared with intensity and speech rate. The facial feature extraction phase uses a mathematical formulation (Bézier curves) and geometric analysis of the facial image, based on measurements of a set of Action Units (AUs), to classify the emotion. The proposed technique consists of three steps: (i) detecting the facial region within the image, (ii) extracting and classifying the facial features, and (iii) recognizing the emotion; the new data are then merged with reference data to recognize the basic emotion. Finally, we combined the two proposed algorithms (speech analysis and facial expression) to design a hybrid technique for emotion recognition, which has been implemented in a software program that can be employed in Human-Robot Interaction. The efficiency of the methodology was evaluated through experimental tests on 30 individuals (15 female and 15 male, 20 to 48 years old) from different ethnic groups: (i) ten European adults, (ii) ten Asian (Middle Eastern) adults, and (iii) ten American adults. The proposed technique made it possible to recognize the basic emotion in most cases.
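
    The sketch below shows one simple way the two modality outputs could be combined in a hybrid technique like the one described above: a weighted average of per-emotion scores from the speech-based and facial-feature classifiers. The weights and example scores are illustrative assumptions, not the project's trained models or its actual fusion rule.

```python
# Hedged sketch: late fusion of speech and facial emotion scores.
import numpy as np

EMOTIONS = ["sadness", "surprise", "happiness", "anger", "fear", "disgust"]

def fuse(speech_probs, face_probs, w_speech=0.4, w_face=0.6):
    """Weighted average of the two modality scores, renormalized to sum to 1."""
    p = w_speech * np.asarray(speech_probs) + w_face * np.asarray(face_probs)
    return p / p.sum()

speech = [0.05, 0.10, 0.55, 0.10, 0.10, 0.10]   # example output of the speech model
face   = [0.02, 0.30, 0.48, 0.05, 0.10, 0.05]   # example output of the facial model

fused = fuse(speech, face)
print("fused scores :", dict(zip(EMOTIONS, fused.round(2))))
print("recognized   :", EMOTIONS[int(np.argmax(fused))])
```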

    PRESENT AND FUTURE PERVASIVE HEALTHCARE METHODOLOGIES: INTELLIGENT BODY DEVICES, PROCESSING AND MODELING TO SEARCH FOR NEW CARDIOVASCULAR AND PHYSIOLOGICAL BIOMARKERS

    The motivation behind this work comes from the area of pervasive computing technologies for healthcare and wearable healthcare IT systems, an emerging field of research that brings revolutionary paradigms to computing models in the 21st century. This thesis focuses on emerging personal health technologies and pattern recognition strategies for early diagnosis and personalized treatment and rehabilitation of individuals with cardiovascular and neurophysiological diseases. Attention was paid to the development of an intelligent system for the automatic classification of cardiac valve disease for screening purposes; promising results were reported, with the possibility of implementing a new screening strategy for the diagnosis of cardiac valve disease in developing countries. A novel assistive architecture for the elderly, able to non-invasively assess muscle fatigue by surface electromyography over a wireless platform during exercise on an ergonomic platform, is also presented. Finally, a wearable chest belt for ECG monitoring, used to investigate the psycho-physiological effects of the autonomic system, and a wearable technology for monitoring knee kinematics and recognizing ambulatory activities were characterized to evaluate the clinical reliability of the collected data. The potential clinical impact of this research is significant, since promising data show how such emerging personal technologies and methodologies are effective in several scenarios for early screening and for the discovery of novel diagnostic and prognostic biomarkers.

    Development of a context-aware internet of things framework for remote monitoring services

    Asset management is concerned with the management practices necessary to maximise the value delivered by physical engineering assets. Data generated by the Internet of Things (IoT) are increasingly regarded as an asset, and the value of this data asset also needs to be maximised. However, asset-generated data are often collected in a non-actionable form, and IoT data create challenges for data management and processing. One way to handle these challenges is to introduce context information management, in which data and service delivery are determined by resolving the context of a service or data request. This research aimed to develop a context-awareness framework and implement it in an architecture integrating IoT with cloud computing for industrial monitoring services. The overall aim was achieved through a methodological investigation consisting of four phases: establishing the research baseline, defining the experimentation materials and methods, designing and developing the framework, and validating it through a case study and expert judgement. The framework comprises three layers: the edge, context information management, and application layers. A maintenance context ontology has been developed for the framework, focused on modelling the failure analysis of mechanical components so as to drive the adaptation of monitoring services. The developed context-awareness architecture is expressed through business, usage, functional, and implementation viewpoints that frame the concerns of the relevant stakeholders. The framework was validated through a case study and expert judgement, which provided supporting evidence for its validity and applicability in industrial contexts. The outcomes of the work can be used in other industrially relevant application scenarios to drive maintenance service adaptation. Context-adaptive services can help manufacturing companies better manage the value of their assets while ensuring that they continue to function properly over their lifecycle.
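
    The snippet below is a minimal sketch of the kind of context-resolution step a context information management layer performs: an incoming monitoring request is enriched with asset context and routed to an adapted monitoring service. The entity names, attributes, and routing rule are illustrative assumptions, not the framework's ontology or implementation.

```python
# Hedged sketch: resolving request context to select an adapted monitoring service.
from dataclasses import dataclass

@dataclass
class Context:
    asset_id: str
    component: str        # e.g. "bearing"
    condition: str        # e.g. "high_vibration"

# Assumed registry mapping (component, condition) to an adapted monitoring service
SERVICE_REGISTRY = {
    ("bearing", "high_vibration"): "vibration-spectrum-analysis",
    ("bearing", "normal"): "routine-trend-logging",
    ("gearbox", "high_temperature"): "thermal-anomaly-detection",
}

def resolve(ctx: Context) -> str:
    """Pick the monitoring service that matches the resolved context."""
    return SERVICE_REGISTRY.get((ctx.component, ctx.condition), "default-threshold-alerting")

request = Context(asset_id="pump-07", component="bearing", condition="high_vibration")
print(f"{request.asset_id}: dispatching '{resolve(request)}'")
```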

    Studying Large Multi-Protein Complexes Using Single Molecule Localization Microscopy

    Biology would not be where it is today without fluorescence microscopy. It is arguably one of the most commonly used tools in the biologist's toolbox, and it has helped scientists study the localization of cellular proteins and other small structures for decades, but it is not without limitations. Owing to the diffraction limit, conventional fluorescence microscopy is restricted to micrometre-range structures, and science has long relied on electron microscopy and X-ray crystallography to study phenomena below this limit. However, many of life's processes occur between these two spatial domains. Super-resolution microscopy, the next stage in the evolution of fluorescence microscopy, has the potential to bridge this gap between micro and nano. It combines resolutions down to a few nanometres with the ability to view objects in their natural environment, making it an ideal tool for studying the large multi-protein complexes that carry out most of life's functions but are too complex and fragile for an electron microscope or a synchrotron. A form of super-resolution microscopy called single molecule localization microscopy (SMLM) shows especially high promise in this regard: with its ability to detect individual molecules, it combines the high resolution needed for structural studies with the quantitative readout required to obtain data on the stoichiometry of multi-protein complexes. This thesis describes new tools that expand the SMLM toolbox with the specific aim of studying multi-protein complexes. First, a novel fluorescent tagging system is developed that is a mix of genetic tagging and immuno-staining. The system, termed BC2, consists of a short, genetically encodable peptide that is targeted by a nanobody (the BC2 nanobody). The small tag does not disrupt the protein it is attached to, and the small nanobody can reach tight spaces, making it an excellent tag for dense multi-protein structures. Next, several new variants of commonly used green-to-red photoconvertible fluorescent proteins are presented; these variants, which can be converted with a combination of blue and infrared light, are especially useful for live-cell imaging and can be combined with photo-activatable fluorescent proteins to enable imaging of several targets with the same colour protein. Finally, the latter technique is applied to study the multi-protein kinetochore complex, providing first glimpses into its spatial organization and the stoichiometry of its subunits.