Use of Vocal Prosody to Express Emotions in Robotic Speech
Vocal prosody (pitch, timing, loudness, etc.) and its use to convey emotions are essential components of speech communication between humans. The objective of this dissertation research was to determine the efficacy of using varying vocal prosody in robotic speech to convey emotion. Two pilot studies and two experiments were performed to address the shortcomings of previous HRI research in this area. The pilot studies were used to determine a set of vocal prosody modification values for a female voice model using the MARY speech synthesizer to convey four emotions: anger, fear, happiness, and sadness. Experiment 1 validated that participants perceived these emotions, along with a neutral vocal prosody, at rates significantly higher than chance. Four of the vocal prosodies (anger, fear, neutral, and sadness) were recognized at rates approaching the recognition rate (60%) of emotions in person-to-person speech. During Experiment 2, the robot led participants through a creativity test while making statements using one of the validated emotional vocal prosodies. The ratings of the robot’s positive qualities and the creativity scores of the participant group that heard the nonnegative vocal prosodies (happiness, neutral) did not differ significantly from those of the participant group that heard the negative vocal prosodies (anger, fear, sadness). Experiment 2 therefore failed to show that emotional vocal prosody in a robot’s speech influenced either the participants’ appraisal of the robot or their performance on this specific task. At this time, robot designers and programmers should not expect vocal prosody alone to have a significant impact on the acceptability or quality of human-robot interactions. Further research is required to show whether multi-modal expressions of emotion by robots (vocal prosody combined with facial expressions, body language, or linguistic content) will be effective at improving human-robot interactions.
Relief Displacement of Airborne Objects
The increasing availability of unoccupied aircraft systems (UAS, also referred to as drones) has led to their use in taking vertical aerial photographs at relatively small spatial scales. These photographs can be used to measure the distances between objects appearing in them. However, relief displacement can cause an object above or below ground level to appear at a point in a vertical aerial photograph that is not directly in line with the object’s actual location, causing a measurement error. A UAS was used in this study as the photographed airborne object because its location and altitude could be controlled. We were interested in predicting the horizontal distance of the UAS’s appearance from the centre of a vertical aerial photograph. Predictions of where the photographed UAS would appear in vertical aerial photographs over both level and sloped surfaces matched measured appearance distances within 0.06–0.48 m. This study shows that the relief displacement formulas typically used to compute the height of a vertical structure appearing in a vertical aerial photograph can additionally be used to compute the actual location of an airborne object (e.g., a flying UAS, bird, bat) if the object’s altitude is known or can be estimated.
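The relationship the abstract refers to can be illustrated with the standard relief-displacement geometry for a vertical photograph: an object at altitude h below a camera at height H, whose true horizontal distance from the nadir point is R, appears (at ground scale) at distance R·H/(H − h) from the photo centre, and the formula can be inverted to recover R. A minimal sketch, with illustrative function names and values not taken from the study:

```python
def apparent_distance(true_distance_m: float, camera_height_m: float,
                      object_altitude_m: float) -> float:
    """Ground-scale distance from the photo centre at which an airborne
    object appears, displaced radially outward by relief displacement.
    Assumes a vertical photograph and the object below the camera."""
    if object_altitude_m >= camera_height_m:
        raise ValueError("object must be below the camera")
    return true_distance_m * camera_height_m / (camera_height_m - object_altitude_m)


def true_distance(apparent_distance_m: float, camera_height_m: float,
                  object_altitude_m: float) -> float:
    """Invert the displacement to recover the object's actual horizontal
    location, given a known or estimated altitude."""
    return apparent_distance_m * (camera_height_m - object_altitude_m) / camera_height_m


# Example: camera 100 m above ground, object flying at 40 m, true
# horizontal distance 30 m from nadir.
d = apparent_distance(30.0, 100.0, 40.0)      # object appears at 50 m
r = true_distance(d, 100.0, 40.0)             # recovers the true 30 m
print(d, r)
```

The same geometry underlies the familiar height formula d = r·h/H: the radial displacement grows with both the object's altitude and its distance from the photo centre, which is why an altitude estimate is enough to correct the measured position.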
Catch Composition and Potential Impact of Baited and Unbaited Commercially Fished Hoop Nets in Three Central Florida Lakes
Dataset Demonstrating Relief Displacement of Airborne Objects
Dataset used to demonstrate the effects of relief displacement on airborne objects depicted in vertical aerial photographs. The dataset includes photographs taken by an unoccupied aircraft system (UAS), ground control points used to georeference the photographs, and digital elevation models of the study area.
