
    Titanic smart objects


    EYECOM: an innovative approach for computer interaction

    The world is innovating rapidly, and there is a need for continuous interaction with technology. Sadly, paralyzed people have few promising options for interacting with machines such as laptops, smartphones, and tablets. The few commercial solutions, such as Google Glass, are costly and beyond the means of many paralyzed persons. Towards this end, the thesis proposes a retina-controlled device called EYECOM. The proposed device is constructed from cost-effective yet robust off-the-shelf IoT components (i.e., Arduino microcontrollers, Xbee wireless sensors, IR diodes, and an accelerometer). The device can easily be mounted onto eyeglasses; a paralyzed person wearing it can interact with the machine using simple head movements and eye blinks. An IR diode positioned in front of the eye illuminates the eye region; the IR light reflected from the eye is converted into an electrical signal, and when the eyelids close, the reflection from the eye surface is disrupted, so this change in the recorded value registers a blink. To enable cursor movement on the computer screen, an accelerometer is used. The accelerometer is a small device, roughly the size of a phalanx (a thumb bone), which operates on the principle of axis-based motion sensing and can be worn as a ring by the paralyzed person. A microcontroller processes the inputs from the IR sensor and the accelerometer and transmits them wirelessly via an Xbee wireless sensor (i.e., a radio) to a second microcontroller attached to the computer. Using the proposed algorithm, the receiving microcontroller moves the cursor on the computer screen and performs actions ranging from opening a document to operating text-to-speech software.
EYECOM has features that can help paralyzed persons continue their contributions to the technological world and remain an active part of society. As a result, they will be able to perform a number of tasks without depending on others, from reading a newspaper on the computer to activating text-to-voice software
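The receiving-side logic described above can be sketched as follows. This is a hypothetical illustration, not the thesis's implementation: the threshold, gain, dead-zone, and function names are assumptions, and the real system runs on the microcontroller rather than in Python.

```python
# Hypothetical sketch of the computer-side microcontroller's logic:
# IR readings become blink (click) events, accelerometer tilt becomes
# cursor deltas. All constants are illustrative assumptions.

IR_BLINK_THRESHOLD = 300   # assumed ADC value below which the eyelid blocks reflection
CURSOR_GAIN = 10           # assumed pixels of cursor travel per g of tilt

def detect_blink(ir_values, threshold=IR_BLINK_THRESHOLD):
    """True if the reflected-IR signal dips below the threshold,
    i.e. the eyelid interrupted the reflection (a deliberate blink)."""
    return min(ir_values) < threshold

def tilt_to_cursor_delta(ax, ay, gain=CURSOR_GAIN, dead_zone=0.05):
    """Map accelerometer tilt (in g) to a cursor displacement in pixels.
    A small dead zone suppresses sensor noise when the hand is still."""
    dx = 0 if abs(ax) < dead_zone else int(ax * gain)
    dy = 0 if abs(ay) < dead_zone else int(ay * gain)
    return dx, dy

# A dip in reflected IR registers as a blink (click)...
print(detect_blink([512, 498, 120, 505]))    # True
# ...and tilting the ring-worn accelerometer moves the cursor.
print(tilt_to_cursor_delta(0.4, -0.02))      # (4, 0)
```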

    Gesture Based Home Automation for the Physically Disabled

    Paralysis and motor impairments can greatly reduce a patient's autonomy and quality of life while presenting a major recurring cost in home healthcare. Augmented with a non-invasive wearable sensor system and home-automation equipment, the patient can regain a level of autonomy at a fraction of the cost of home nurses. A system that utilizes sensor fusion, low-power digital components, and smartphone cellular capabilities can be adapted to patients with a variety of needs. This thesis develops such a system as a Bluetooth-enabled glove device that communicates with a remote web server to control smart devices within the home. Power consumption is treated as a major design constraint so that the system operates with little maintenance, allowing for greater patient autonomy. The system is evaluated in terms of power consumption and accuracy to demonstrate its viability as a home accessibility tool
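The glove-to-server pipeline described above can be sketched in a few lines. The gesture names, the flex-sensor encoding, and the command payload are assumptions made for illustration; the thesis's actual gesture set and protocol may differ.

```python
# A minimal sketch, assuming each flex sensor reads 0 (straight finger)
# to 1 (fully bent), of mapping a glove gesture to the JSON command the
# device would POST to the remote web server. Gesture labels and the
# command table are hypothetical.
import json

GESTURE_COMMANDS = {
    "fist":      {"device": "lights",     "action": "off"},
    "open_palm": {"device": "lights",     "action": "on"},
    "three_bent": {"device": "thermostat", "action": "raise"},
}

def classify_gesture(flex_readings, threshold=0.5):
    """Simple rule-based stand-in for the glove's recognizer:
    count bent fingers and pick a gesture label."""
    bent = sum(1 for r in flex_readings if r > threshold)
    if bent == len(flex_readings):
        return "fist"
    if bent == 0:
        return "open_palm"
    if bent == 3:
        return "three_bent"
    return "unknown"

def build_request(gesture):
    """Serialize the command the glove would send to the web server."""
    command = GESTURE_COMMANDS.get(gesture)
    return json.dumps(command) if command else None

print(build_request(classify_gesture([0.9, 0.8, 0.9, 0.7, 0.95])))
# → {"device": "lights", "action": "off"}
```

Keeping the glove side to classification and a single small request is also what makes the low-power goal plausible: the radio wakes only when a gesture is recognized.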

    Recognition of elementary arm movements using orientation of a tri-axial accelerometer located near the wrist

    In this paper we present a method for recognising three fundamental movements of the human arm (reach and retrieve, lift cup to mouth, rotation of the arm) by determining the orientation of a tri-axial accelerometer located near the wrist. Our objective is to detect the occurrence of such movements performed with the impaired arm of a stroke patient during normal daily activities as a means to assess their rehabilitation. The method relies on accurately mapping transitions of predefined, standard orientations of the accelerometer to corresponding elementary arm movements. To evaluate the technique, kinematic data was collected from four healthy subjects and four stroke patients as they performed a number of activities involved in a representative activity of daily living, 'making-a-cup-of-tea'. Our experimental results show that the proposed method can independently recognise all three of the elementary upper limb movements investigated with accuracies in the range 91–99% for healthy subjects and 70–85% for stroke patients
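The core idea of the method above can be sketched as follows, under the simplifying assumption that during static or slow movement the accelerometer mostly measures gravity, so the dominant axis (and its sign) identifies one of six standard orientations. The orientation labels and the transition table are illustrative, not the paper's actual mapping.

```python
# Sketch: quantize tri-axial accelerometer readings (in g) to standard
# orientations, then map orientation transitions to elementary arm
# movements. The TRANSITIONS table is a hypothetical example.

def orientation(ax, ay, az):
    """Quantize a gravity reading to one of six standard orientations
    by taking the dominant signed axis."""
    axes = {"+x": ax, "-x": -ax, "+y": ay, "-y": -ay, "+z": az, "-z": -az}
    return max(axes, key=axes.get)

TRANSITIONS = {
    ("+z", "+x"): "lift cup to mouth",
    ("+x", "+z"): "reach and retrieve",
    ("+z", "+y"): "rotation of the arm",
}

def recognise(samples):
    """Emit a movement label each time the quantized orientation changes."""
    movements, prev = [], None
    for ax, ay, az in samples:
        cur = orientation(ax, ay, az)
        if prev is not None and cur != prev:
            movements.append(TRANSITIONS.get((prev, cur), "unrecognised"))
        prev = cur
    return movements

# Wrist flat (gravity on +z), then rotated so +x points up:
print(recognise([(0.0, 0.1, 0.98), (0.05, 0.0, 1.0), (0.97, 0.1, 0.2)]))
# → ['lift cup to mouth']
```

Quantizing before matching is what gives the method its robustness: small tremors never change the dominant axis, so only deliberate reorientations of the wrist produce transitions.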

    Automated Detection of Cigarette Smoking Puffs from Mobile Sensors - A Multimodal Approach

    Smoking has been conclusively proven to be the leading cause of preventable deaths in the United States. Extensive research is conducted on developing effective smoking cessation programs. Most smoking cessation programs achieve low success rates because they are unable to intervene at the right moment. Identifying high-risk situations that may lead an abstinent smoker to relapse involves discovering the associations among the various contexts that precede a smoking session or a smoking lapse. In the absence of an automated method, detection of smoking events still relies on subject self-report, which is prone to under-reporting and imposes a burden on the subject. Automated detection of smoking events in the natural environment can revolutionize smoking research and lead to effective intervention. We investigate the feasibility of automated detection of smoking puffs from measurements obtained from a respiratory inductive plethysmography (RIP) sensor. We introduce several new features from respiration that help classify individual respiration cycles into puffs and non-puffs. We then propose supervised and semi-supervised support vector models to detect smoking puffs. We train our models on data collected from 10 daily smokers and show that our model can identify smoking puffs with an accuracy of 86.7%. We further show that the accuracy of smoking puff detection can be improved by fusing measurements from RIP and inertial sensors. We use measurements obtained from a wrist-worn accelerometer and gyroscope to find segments when the hand is at the mouth. These segments are used to identify respiration cycles that are potentially puff cycles. An SVM classifier is trained using 40 hours of data collected from 6 participants.
The 10-fold cross-validation results show that at a 90.3% true positive rate, the respiration-feature-based classifier produces on average 43.8 false puffs per hour, which is reduced to 3.7 false positives per hour when both wrist and respiration features are used. We also perform leave-one-subject-out cross-validation and show that the method generalizes well
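The fusion step described above, in which wrist-sensor segments gate the respiration cycles, can be sketched as follows. The data shapes are assumed (cycles and hand-at-mouth segments as (start, end) intervals in seconds), and the one-feature threshold classifier is a stand-in for the trained SVM, not the paper's actual model.

```python
# Sketch of RIP/inertial fusion: only respiration cycles that overlap a
# hand-at-mouth segment are passed to the puff classifier, which is how
# the wrist sensors cut down false positives. Intervals are (start, end)
# in seconds; the classifier here is a hypothetical stub.

def overlaps(cycle, segment):
    """True if the respiration cycle overlaps the hand-at-mouth segment."""
    return cycle[0] < segment[1] and segment[0] < cycle[1]

def candidate_cycles(cycles, hand_at_mouth_segments):
    """Keep only respiration cycles during which the hand was at the mouth."""
    return [c for c in cycles
            if any(overlaps(c, s) for s in hand_at_mouth_segments)]

def classify_puff(cycle_features):
    """Stand-in for the trained SVM: a single assumed threshold on one
    respiration feature (here, inhalation duration in seconds)."""
    return cycle_features["inhale_duration"] < 1.0

cycles = [(0.0, 3.5), (3.5, 7.0), (7.0, 10.0)]
hand_at_mouth = [(6.5, 9.0)]   # wrist sensors report hand near mouth here
print(candidate_cycles(cycles, hand_at_mouth))   # → [(3.5, 7.0), (7.0, 10.0)]
```

The gating is cheap relative to feature extraction, so applying it first also reduces the amount of respiration data the classifier must score.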