Mouse Simulation Using Two Coloured Tapes
In this paper we present a novel approach to Human Computer Interaction
(HCI) in which cursor movement is controlled using a real-time camera. Current
methods involve modifying mouse hardware, such as adding more buttons or
changing the position of the tracking ball. Instead, our method uses a camera
and computer vision techniques, such as image segmentation and gesture
recognition, to control mouse tasks (left and right clicking, double-clicking,
and scrolling), and we show that it can perform everything a current mouse
device can. The software is developed in Java. Recognition and pose estimation
in this system are user-independent and robust, as coloured tapes on the
fingers are used to perform actions. The software can be used as an intuitive
input interface to applications that require multi-dimensional control,
e.g. computer games.
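The colour-segmentation step the abstract describes can be sketched as thresholding each frame for the tape colour and taking the centroid of the matching pixels as the cursor position. The following is a minimal NumPy-only illustration; the function name and RGB threshold values are illustrative, not taken from the paper.

```python
import numpy as np

def track_tape(frame, lower, upper):
    """Return the (x, y) centroid of pixels whose RGB values fall
    within [lower, upper], or None if no pixels match."""
    mask = np.all((frame >= lower) & (frame <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 RGB frame with a red "tape" patch.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:60, 70:90] = (200, 30, 30)  # red patch
pos = track_tape(frame, lower=(150, 0, 0), upper=(255, 80, 80))
print(pos)  # -> (79.5, 49.5), the patch centroid
```

In practice a second tape colour would be tracked the same way, and the distance or relative motion between the two centroids mapped to click and scroll events.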
Using Haar-like feature classifiers for hand tracking in tabletop augmented reality
In this paper we propose a hand-interaction approach for tabletop
Augmented Reality applications. We detect the user's hands using
Haar-like feature classifiers and correlate their positions with the
fixed markers on the table. This allows the user to move, rotate,
and resize the virtual objects located on the table with their bare
hands.
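Haar-like feature classifiers rest on features computed in constant time from an integral image (summed-area table): any rectangle sum takes four lookups, and a feature is a difference of adjacent rectangle sums. The sketch below shows the idea with a horizontal two-rectangle feature; it is an illustration of the underlying technique, not the paper's trained cascade.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero row/column prepended, so any
    rectangle sum needs only four lookups."""
    ii = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return np.pad(ii, ((1, 0), (1, 0)))

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the rectangle with top-left (x, y), size w x h."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def haar_two_rect(ii, x, y, w, h):
    """Horizontal two-rectangle Haar-like feature: left half minus right half."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)

img = np.zeros((8, 8), dtype=np.int64)
img[:, :4] = 1                        # bright left half, dark right half
ii = integral_image(img)
print(haar_two_rect(ii, 0, 0, 8, 8))  # -> 32, a strong edge response
```

A trained cascade evaluates thousands of such features at many window positions and scales, rejecting non-hand windows early; frameworks such as OpenCV's `CascadeClassifier` wrap this pipeline.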
Interactive exploration of historic information via gesture recognition
Developers of interactive exhibits often struggle to find appropriate input devices
that enable intuitive control, permitting visitors to engage effectively with the
content. Recently, motion-sensing input devices such as the Microsoft Kinect and
Panasonic D-Imager have become available, enabling gesture-based control of computer
systems. These devices are attractive for exhibits since users can interact with
their hands and are not required to physically touch any part of the system. In this
thesis we investigate techniques that enable the raw data from these types of devices
to be used to control an interactive exhibit. Object recognition and tracking
techniques are used to analyse the user's hand, from which movement and click
gestures are processed. To show the effectiveness of these techniques, the gesture
system is used to control an interactive system designed to inform the public about
iconic buildings in the centre of Norwich, UK. We evaluate two methods of making
selections in the test environment.
At the time of experimentation these technologies were relatively new to the
image-processing environment. The techniques and methods developed in this thesis
have been detailed and published [3] at the VSMM (Virtual Systems and Multimedia)
2012 conference with the intention of furthering the area.
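One common way to turn raw hand-tracking data into a click event, in the spirit of the selection methods the thesis evaluates, is a "push" gesture: the hand moves sharply toward the sensor within a short window of frames. The detector below is a hypothetical sketch; the class name, 80 mm threshold, and 10-frame window are illustrative assumptions, not values from the thesis.

```python
from collections import deque

class PushClickDetector:
    """Fires a click when the tracked hand moves toward the sensor by
    more than `threshold_mm` over a window of recent frames.
    Threshold and window size are illustrative, not from the thesis."""
    def __init__(self, threshold_mm=80, window=10):
        self.threshold_mm = threshold_mm
        self.depths = deque(maxlen=window)

    def update(self, hand_depth_mm):
        """Feed one frame's hand depth (mm from sensor); return True on click."""
        self.depths.append(hand_depth_mm)
        if len(self.depths) < self.depths.maxlen:
            return False  # not enough history yet
        # Click when depth drops sharply (hand pushed toward the sensor).
        return self.depths[0] - self.depths[-1] > self.threshold_mm

det = PushClickDetector()
stream = [900] * 10 + [880, 850, 820, 790]   # hand pushes forward at the end
clicks = [det.update(d) for d in stream]
print(clicks.index(True))  # -> 13, the frame where the push registers
```

The alternative selection method commonly compared against push is dwell, where the click fires after the hand hovers within a small region for a fixed time; both fit the same per-frame `update` interface.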
A Framework for Gamification of Human Joint Remote Rehabilitation, Incorporating Non-Invasive Sensors
Patients who have suffered soft tissue injuries or undergone surgery often experience reduced muscle strength, flexibility, and pain in the affected area, which can interfere with daily activities. Rehabilitation exercises are crucial in reducing symptoms and returning patients to normal activities. This research presents a framework for human joint rehabilitation that enables clinicians to set engaging, gamified rehabilitation tasks for their patients, utilising non-invasive sensors and machine learning algorithms.
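A gamified task of the kind the framework describes typically scores each exercise repetition against a clinician-set target, turning sensor readings into immediate feedback. The function below is a hypothetical sketch of such scoring from joint-angle samples; the 90-degree target and percentage scale are illustrative assumptions, not part of the framework.

```python
def score_repetition(angles, target_rom=90.0):
    """Score one exercise repetition from joint-angle samples (degrees).
    The achieved range of motion is compared with a clinician-set target;
    the default target and 0-100 scale are illustrative."""
    rom = max(angles) - min(angles)          # range of motion achieved
    return round(min(rom / target_rom, 1.0) * 100)

# Knee-flexion samples from a single repetition (degrees).
rep = [5, 20, 45, 70, 68, 40, 15, 6]
print(score_repetition(rep))  # -> 72, i.e. 72% of the 90-degree target
```

In a full system, per-repetition scores like this would feed the game layer (points, progress bars) while the raw sensor traces go to the clinician for review.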