
    Parameterized Affect of Transmission-Range on Lost of Network Connectivity (LNC) of Wireless Sensor Networks

    Wireless Sensor Networks (WSNs) are made up of various types of sensor nodes. Recent developments in micro-electro-mechanical systems have given rise to new integrated circuitry, microprocessor hardware, nanotechnology, wireless technology, and advanced network routing protocols. Hospitals and health-service facilities, the armed forces, and even residential customers represent a potentially huge market for these devices. The problem is that existing sensor network nodes are incapable of providing the support needed to maximize the use of wireless technology. For this reason, many novel routing protocols for wireless sensor networks have been proposed recently; one family is hierarchical, or cluster-based, routing. In this paper, we analyze three hierarchical routing protocols: Low Energy Adaptive Clustering Hierarchy (LEACH), Power-Efficient Gathering in Sensor Information Systems (PEGASIS), and Virtual Grid Architecture (VGA). We analyze the performance of these protocols, including power consumption and overall network performance, and compare the routing protocols with one another. This comparison reveals the important features that need to be taken into consideration while designing and evaluating new routing protocols for sensor networks. The simulation results, using the same limited sensing-range value, show that PEGASIS outperforms the other protocols, while LEACH performs better than VGA. Furthermore, the paper investigates the power consumption of all protocols. On average, VGA has the worst power consumption when the sensing range is limited, but the best when the sensing range is increased. Using homogeneous nodes can greatly prolong the sensor network's lifetime, and the network lifetime increases as the number of clusters decreases.
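    The cluster-head rotation at the heart of LEACH can be sketched as follows. This is a minimal illustration of the standard LEACH threshold rule, not the simulation code used in the paper; the node count and the desired cluster-head fraction p are assumed values:

```python
import random

def leach_threshold(p, r):
    """Standard LEACH threshold T(n) for round r, where p is the
    desired fraction of cluster heads per round. The threshold rises
    as the round advances within an epoch of 1/p rounds, so every
    node eventually serves as a cluster head once per epoch."""
    return p / (1 - p * (r % round(1 / p)))

def elect_cluster_heads(nodes, p, r):
    """Each node independently becomes a cluster head for this round
    if a uniform random draw falls below the threshold."""
    t = leach_threshold(p, r)
    return [n for n in nodes if random.random() < t]

# Example: 100 nodes, 5% desired cluster heads, round 0.
heads = elect_cluster_heads(list(range(100)), 0.05, 0)
```

    Rotating the energy-hungry cluster-head role in this way is what spreads the power drain evenly across homogeneous nodes, which is consistent with the lifetime observations above.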

    Performance Evaluation of Routing Protocols in Wireless Sensor Networks

    The efficiency of sensor networks depends strongly on the routing protocol used. In this paper, we analyze three routing protocols: LEACH, PEGASIS, and VGA. Sensor networks are simulated using the Sensoria simulator. Several simulations are conducted to analyze the performance of these protocols, including power consumption and overall network performance. The simulation results, using the same limited sensing-range value, show that PEGASIS outperforms the other protocols, while LEACH performs better than VGA. Furthermore, the paper investigates the power consumption of all protocols. On average, VGA has the worst power consumption when the sensing range is limited, but the best when the sensing range is increased.

    Innovative and interactive assistive technology controlling system using eye detection and head movements

    This thesis is being archived as a Digitized Shelf Copy for campus access to current students and staff only. We currently cannot provide open access without the author's permission. If you are the author of this work and wish to provide open access or to have access removed, please contact the Wahlstrom Library to discuss permission.
    Assistive technology refers to any device that enables people with disabilities to live, work, study, or play independently and improves their functional capabilities. It offers assistance to people with a wide range of disabilities, including motor, vision, hearing, and speech impairments. People with severe physical disabilities may only be able to make small movements, which can serve as unconventional inputs for controlling assistive technologies. These movements can be converted into electrical signals and translated by a control unit into control commands. Despite the amount of research on finding robust alternative control and communication methods and applying them to assistive technology, it remains a challenging task that requires further investigation. The research presented here investigates advanced methods that enable people who can only make small movements to control different devices. It uses a combination of enhanced interaction models to provide an easy interface that replaces traditional control methods: an eye gaze direction classification model and a head movement detection model. The eye gaze direction classification model uses the Viola-Jones face detector and dynamic parameters in the Circular Hough Transform (CHT) to locate the iris, then uses low-level features and a classifier to categorize the eye gaze direction. The head movement detection model uses flex sensors and a PIC microcontroller to calculate the head flexion angle. An interactive device such as a tablet or smartphone can be used for the user interface. A central control unit links the assistive device with the sensors, the camera, and the interactive device. Such a technology has the advantage of being usable by people with high-level spinal injuries.
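    The head-movement model's flex-sensor reading can be mapped to a flexion angle by linear interpolation between two calibration points. The sketch below assumes a 10-bit ADC and illustrative calibration constants; the thesis does not publish the actual values:

```python
def adc_to_angle(adc_value, adc_flat=200, adc_bent=800,
                 angle_flat=0.0, angle_bent=90.0):
    """Linearly map a flex-sensor ADC reading to a head-flexion
    angle in degrees using two calibration points (reading when the
    sensor is flat, reading when fully bent). All four calibration
    constants here are assumed, illustrative values."""
    fraction = (adc_value - adc_flat) / (adc_bent - adc_flat)
    # Clamp to the calibrated range to tolerate sensor noise.
    fraction = max(0.0, min(1.0, fraction))
    return angle_flat + fraction * (angle_bent - angle_flat)

# A mid-range reading maps to the middle of the calibrated range.
print(adc_to_angle(500))  # 45.0
```

    On a real PIC microcontroller the same arithmetic would run over successive ADC samples, with the resulting angle forwarded to the central control unit as a control command.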

    Enhanced Eye Gaze Direction Classification Using a Combination of Face Detection, CHT and SVM

    Automatic estimation of eye gaze direction is an interesting research area in computer vision that is growing rapidly, with a wide range of potential applications. However, implementing a robust eye gaze classification system is still a very challenging task. This paper proposes a robust eye detection system that uses face detection to find the eye region. The Circular Hough Transform (CHT) is used to locate the center of the iris, with its parameters calculated dynamically from the detected face information. A new method for eye gaze direction classification using a Support Vector Machine (SVM) is introduced and combined with the Circular Hough Transform to complete the task. The experiments were performed on a database containing 4000 images of 40 subjects of different ages and genders. The algorithm achieved a classification accuracy of up to 92.1%.
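    The "dynamic parameters" idea can be sketched as deriving the CHT search radii from the detected face size, since iris size scales with face size in the image. The fractions below are assumed illustrative values, not the paper's published constants:

```python
def cht_radius_range(face_width_px, min_frac=0.02, max_frac=0.05):
    """Derive the min/max iris radii (in pixels) for the Circular
    Hough Transform from the detected face width. The fractions are
    assumed values; the paper computes its CHT parameters
    dynamically from the Viola-Jones face detection result."""
    r_min = max(1, int(face_width_px * min_frac))
    r_max = max(r_min + 1, int(face_width_px * max_frac))
    return r_min, r_max

# For a detected face 200 px wide, search for iris circles of
# radius 4-10 px; these bounds would be passed to a CHT
# implementation (e.g. OpenCV's HoughCircles minRadius/maxRadius).
print(cht_radius_range(200))  # (4, 10)
```

    Constraining the radius search this way shrinks the CHT accumulator space, making iris localization both faster and less prone to spurious circles; the located iris position then feeds the SVM's gaze-direction features.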

    Detection of Bleeding in Wireless Capsule Endoscopy Images Using Range Ratio Color

    Wireless Capsule Endoscopy (WCE) is a device that detects abnormalities in the colon, esophagus, small intestine, and stomach. Distinguishing bleeding from non-bleeding WCE images by human review is a hard and very time-consuming job. Consequently, automating the classification of bleeding frames not only expedites the process but also reduces the burden on doctors. Bleeding areas in WCE images can be detected from the purity of the red color. However, different parts of the small intestine exhibit various intensities of red, so it is not enough to rely on the red color feature alone. We select the RGB (Red, Green, Blue) color space because it uses raw-level values and is easy to work with. In this paper we define a range-ratio color condition over R, G, and B. We examine each pixel of an image, apply the range-ratio condition, and count the number of pixels that satisfy it. If the count is greater than zero, the frame is classified as bleeding; otherwise, it is non-bleeding. Our experimental results show that this method achieves very high accuracy in detecting bleeding images across the different parts of the small intestine.
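    The per-pixel decision rule described above can be sketched as a ratio test on the RGB channels. The specific thresholds below are assumed illustrative values, not the thresholds published in the paper:

```python
def is_bleeding_pixel(r, g, b, g_ratio=0.55, b_ratio=0.55, r_min=100):
    """Range-ratio test on one RGB pixel: flag it as bleeding when
    red is strong in absolute terms and both green and blue are
    small relative to red. All three thresholds are assumed,
    illustrative values."""
    return r >= r_min and g < g_ratio * r and b < b_ratio * r

def classify_frame(pixels):
    """Count the pixels satisfying the range-ratio condition; a
    frame with at least one such pixel is labeled bleeding."""
    count = sum(1 for (r, g, b) in pixels if is_bleeding_pixel(r, g, b))
    return "bleeding" if count > 0 else "non-bleeding"

# One red-dominant pixel is enough to flag the frame.
frame = [(180, 60, 50), (120, 110, 100)]
print(classify_frame(frame))  # bleeding
```

    Using ratios rather than a fixed red threshold is what lets the rule tolerate the varying red intensities seen in different parts of the small intestine.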

    A Novel Robotic System for Painting with Eyes

    In this paper we present the preliminary analysis and results of a novel robotic system that allows a user to paint with their eyes, using an eye tracker device and a collaborative robot. Eye tracking is a sensing technology that allows a computer to detect where a person is looking (the point of gaze). This technology is used to control the motion of a collaborative robot that paints on a canvas based on the detected input. The results show the feasibility of the robotic system and its possible future application as a tool for artistic and creative painting by people with disabilities who cannot use their hands or who have lost control of all or part of their muscles.