28 research outputs found

    Facial Expression Recognition Using SVM Classifier

    Facial feature tracking and facial action recognition from image sequences have attracted great attention in the computer vision field. Computational facial expression analysis is a challenging research topic required by many applications, such as human-computer interaction, computer graphics animation and automatic facial expression recognition. In recent years, many computer vision techniques have been developed to track or recognize facial activities at three levels. First, at the bottom level, facial feature tracking, which usually detects and tracks prominent landmarks surrounding facial components (e.g., mouth, eyebrows), captures detailed face shape information. Second, facial action recognition, i.e., recognizing the facial action units (AUs) defined in FACS, aims to identify meaningful facial activities (e.g., lid tightener, eyebrow raiser). At the top level, facial expression analysis attempts to recognize facial expressions that represent human emotional states. The proposed algorithm initially detects the eyes and mouth; features of the eyes and mouth are extracted using Gabor filters and Local Binary Patterns (LBP), and PCA is used to reduce the dimensionality of the features. Finally, an SVM is used to classify expressions and facial action units
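The abstract names an LBP -> PCA -> SVM pipeline but gives no parameters. A minimal sketch of that chain, assuming a basic 3x3 LBP operator and synthetic stand-ins for cropped eye/mouth patches (all sizes, thresholds and labels here are illustrative, not the authors' settings):

```python
# Sketch of the LBP -> PCA -> SVM pipeline described in the abstract.
# Patches and labels are synthetic; the paper's actual data and
# parameters are not given, so everything numeric here is a placeholder.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def lbp_image(img):
    """Basic 3x3 LBP: compare each pixel to its 8 neighbours, pack bits."""
    c = img[1:-1, 1:-1]
    neighbours = [img[:-2, :-2], img[:-2, 1:-1], img[:-2, 2:],
                  img[1:-1, 2:], img[2:, 2:], img[2:, 1:-1],
                  img[2:, :-2], img[1:-1, :-2]]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, n in enumerate(neighbours):
        code |= (n >= c).astype(np.uint8) << np.uint8(bit)
    return code

def lbp_histogram(patch):
    """Summarize a patch's LBP codes as a normalized 256-bin histogram."""
    hist, _ = np.histogram(lbp_image(patch), bins=256, range=(0, 256))
    return hist / hist.sum()

# Synthetic stand-ins for cropped eye/mouth regions (32x32 grayscale).
patches = rng.integers(0, 256, size=(40, 32, 32))
labels = np.array([i % 2 for i in range(40)])  # e.g. two expression classes

features = np.array([lbp_histogram(p) for p in patches])
reduced = PCA(n_components=5).fit_transform(features)  # dimensionality reduction
preds = SVC(kernel="rbf").fit(reduced, labels).predict(reduced)
print(features.shape, reduced.shape, preds.shape)
```

In a real system the Gabor responses would be concatenated with the LBP histograms before PCA; the sketch keeps only the LBP branch for brevity.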

    Feature Extraction Techniques for Human Emotion Identification from Face Images

    Emotion recognition has been a challenging problem over the years due to irregularities in model complexity and unpredictability between expression categories. Many emotion detection algorithms have been developed in the last two decades, yet they still face problems in accuracy, complexity and real-world implementation. In this paper, we propose two feature extraction techniques: mouth-region-based feature extraction and the Maximally Stable Extremal Regions (MSER) method. In the mouth-based method, the mouth area is calculated and emotions are classified based on that value. In the MSER method, features are extracted using connected components and then fed to a simple ANN for classification. Experimental results show that the mouth-area-based method achieves 86% accuracy, while the MSER-based method outperforms it with 89% accuracy on DEAP. Thus, it can be concluded that the proposed methods can be effectively used for emotion detection
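The abstract classifies emotions from the computed mouth area but states no thresholds. A toy sketch of that rule, where the mouth is given as a binary mask and both cutoff values and class assignments are invented placeholders, not the paper's:

```python
# Illustrative mouth-area rule: classify by the fraction of mouth pixels
# in a cropped region. The thresholds and emotion labels are invented;
# the paper does not publish its actual decision values.
import numpy as np

def classify_by_mouth_area(mouth_mask, open_thresh=0.35, smile_thresh=0.20):
    """Return an emotion label from the mouth-pixel fraction of the crop."""
    area_ratio = mouth_mask.mean()  # mask is binary: 1 = mouth pixel
    if area_ratio >= open_thresh:
        return "surprise"   # wide-open mouth -> large area
    if area_ratio >= smile_thresh:
        return "happiness"  # open smile -> medium area
    return "neutral"        # closed mouth -> small area

closed = np.zeros((20, 40)); closed[9:11, 10:30] = 1   # thin closed mouth
open_m = np.zeros((20, 40)); open_m[4:16, 8:32] = 1    # wide-open mouth
print(classify_by_mouth_area(closed), classify_by_mouth_area(open_m))
# -> neutral surprise
```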

    Collecting sensor-generated data for assessing teamwork and individual contributions in computing student teams

    The aim of this paper is twofold. First, the authors describe a series of experiments conducted in a dedicated smart-spaces laboratory, aiming to combine several sensors in collecting student data. Second, the paper shares key findings from the use of sensor-generated data as an instrument for assessing individual contributions as well as team performance. The early sections of the paper describe the setting of a smart-space laboratory and how it was used in two scenarios: in the first, student teams were monitored during a coordination meeting involving decision making, while in the second, students were observed during a team presentation. The discussion explains how sensors were used to monitor emotions (using facial image processing), stress (using galvanic skin response) and participation (based on the use of Kinect). The key contribution lies in an experiment setting that can be replicated with students from different educational backgrounds, but also in scenarios involving practitioners from different disciplines. The authors discuss the drivers for organizing this type of experiment and explain the reasoning behind the use of certain sensors and the value of collecting specific data sets. The later part of the paper describes how the analysis of collected data has produced visualizations of patterns that can be used in education for assessing student contribution, emotions and stress levels. Similar approaches could be used for project management, where student teams are replaced by software engineering teams in agile development scenarios (e.g. scrum stand-up meetings)

    Emotion Detector

    The face plays a significant role in social communication. It is a 'window' to human personality, emotions and thoughts. The verbal part contributes about 7% of a message, the vocal part about 34% and facial expression about 55%. Because of this, the face is a subject of study in many areas of science, such as psychology, behavioral science, medicine and, finally, computer science. In the field of computer science, much effort is put into automating the process of face detection and segmentation. Several approaches addressing the problem of facial feature extraction have been proposed. The main issue is to provide an appropriate face representation that remains robust with respect to the diversity of facial appearances. The objective of this report is to outline the problem of facial expression recognition, which is a great challenge in the area of computer vision. The advantages of creating a fully automatic system for facial action analysis are a constant motivation for exploring this field of science and will be discussed in this thesis

    Analysis on techniques used to recognize and identifying the Human emotions

    Facial expression is a major channel of non-verbal language in day-to-day communication. Statistical analysis shows that only 7 percent of a message is conveyed by verbal communication, while 55 percent is transmitted by facial expression. Emotional expression has been a research subject of physiology since Darwin's work on emotional expression in the 19th century. According to psychological theory, human emotion is classified into six major categories: happiness, fear, anger, surprise, disgust and sadness. Facial expressions, together with the nature of speech, play a foremost role in expressing these emotions. Researchers later developed a system based on the anatomy of the face, named the Facial Action Coding System (FACS), in the 1970s. Ever since the development of FACS there has been rapid progress in the domain of emotion recognition. This work is intended to give a thorough comparative analysis of the various techniques and methods that have been applied to recognize and identify human emotions. The results of this analysis will help to identify suitable techniques, algorithms and methodologies for future research directions. In this paper, an extensive analysis of the various recognition techniques used to address the complexity of recognizing facial expressions is presented. This work will also help researchers and scholars ease the problem of choosing techniques in the facial expression identification domain

    Understanding collaboration in Global Software Engineering (GSE) teams with the use of sensors: introducing a multi-sensor setting for observing social and human aspects in project management

    This paper discusses on-going research into the ways Global Software Engineering (GSE) teams collaborate on a range of software development tasks. The paper focuses on providing the means for observing and understanding GSE team member collaboration, including team coordination and member communication. Initially the paper provides the background on social and human issues relating to GSE collaboration. Next, the paper describes a pilot study involving a simulation of virtual GSE teams working together with the use of asynchronous and synchronous communication over a virtual learning environment. The study considered the use of multiple data collection techniques, including recordings of SCRUM meetings and of design and implementation tasks. The paper then discusses the use of a multi-sensor setting for observing human and social aspects of project management in GSE teams. The scope of the study is to provide project managers with the means for gathering data regarding GSE team coordination, including member emotions, participation patterns in team discussions and, potentially, stress levels

    Investigating the role of biometrics in education – the use of sensor data in collaborative learning

    This paper provides a detailed description of how a smart spaces laboratory has been used for assessing learners' performance in various educational contexts. The paper shares the authors' experiences from using sensor-generated data in a number of learning scenarios. In particular, the paper describes how a smart learning environment is created with a range of sensors measuring key data from individual learners, including (i) heartbeat, (ii) emotion detection, (iii) sweat levels, (iv) voice fluctuations and (v) duration and pattern of contribution via voice recognition. The paper also explains how biometrics are used to assess learners' contributions in certain activities and to evaluate collaborative learning in student groups. Finally, the paper instigates research into the role of visualizing biometrics as a medium for supporting assessment, facilitating learning processes and enhancing learning experiences. Examples of how learning analytics are created based on biometrics are also provided, resulting from a number of pilot studies that have taken place over the past couple of years
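The abstract lists several per-learner sensor streams but does not specify how they are combined into analytics. A hypothetical sketch of collapsing a session's samples into one per-learner record; the field names, units and summary statistics are invented for illustration, not the authors' scheme:

```python
# Hypothetical aggregation of the sensor streams listed above into one
# per-learner summary record. Field names and statistics are illustrative.
import statistics
from dataclasses import dataclass

@dataclass
class LearnerSample:
    heart_rate: float        # beats per minute
    gsr: float               # galvanic skin response (sweat level proxy)
    speaking_seconds: float  # contribution time from voice recognition

def summarize(samples):
    """Collapse one session's samples into simple descriptive statistics."""
    return {
        "mean_hr": statistics.mean(s.heart_rate for s in samples),
        "peak_gsr": max(s.gsr for s in samples),
        "talk_time": sum(s.speaking_seconds for s in samples),
    }

session = [LearnerSample(72, 1.1, 12.0), LearnerSample(90, 2.4, 30.5)]
print(summarize(session))
# -> {'mean_hr': 81.0, 'peak_gsr': 2.4, 'talk_time': 42.5}
```

Records of this shape could then feed the visualizations and learning analytics the paper describes.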