Interfaces of the Agriculture 4.0
The introduction of information technologies in the environmental field is impacting and changing even a traditional sector like agriculture. Nevertheless, Agriculture 4.0 and data-driven decisions should meet user needs and expectations. The paper presents a broad theoretical overview, discussing both the strategic role of design applied to Agri-tech and the role of User Interface and Interaction design as enabling tools in the field. In particular, the paper suggests rethinking the HCD approach, moving toward a Human-Decentered Design approach that brings together user, technology and environment, and highlights the role of calm technologies as a way to position the farmer not as a final target and passive spectator, but as an active part of the process, easing the transition and appropriation from traditional cultivation methods to 4.0 ones.
Multimodal agent interfaces and system architectures for health and fitness companions
Multimodal conversational spoken dialogues using physical and virtual agents provide a potential interface to motivate and support users in the domain of health and fitness. In this paper, we present how such multimodal conversational Companions can be implemented to support their owners in various pervasive and mobile settings. In particular, we focus on different forms of multimodality and system architectures for such interfaces.
CoachAI: A Conversational Agent Assisted Health Coaching Platform
Poor lifestyle represents a health risk factor and is the leading cause of morbidity and chronic conditions. Its impact can be significantly altered by individual behavior change. Despite the current shift in healthcare towards long-lasting, modifiable behavior, increasing caregiver workload and individuals' continuous need for care make it necessary to ease caregivers' work while ensuring continuous interaction with users. This paper describes the design and validation of CoachAI, a conversational agent assisted health coaching system to support health intervention delivery to individuals and groups. CoachAI instantiates a text-based healthcare chatbot system that bridges the remote human coach and the users. This research makes three main contributions to preventive healthcare and healthy lifestyle promotion: (1) it presents a conversational agent to aid the caregiver; (2) it aims to decrease the caregiver's workload and enhance the care given to users by handling (automating) repetitive caregiver tasks; and (3) it presents a domain-independent mobile health conversational agent for health intervention delivery. We discuss our approach and analyze the results of a one-month validation study on physical activity, healthy diet and stress management.
A mobile fitness companion
The paper introduces a Mobile Companion prototype, which helps users to plan and keep track of their exercise activities via an interface based mainly on speech input and output. The Mobile Companion runs on a PDA and is based on a stand-alone, speaker-independent solution, making it fairly unique among mobile spoken dialogue systems, where the common solution is to run the ASR on a separate server or to restrict speech input to a specific set of users. The prototype uses a GPS receiver to collect position, distance and speed data while the user is exercising, and allows the data to be compared to previous exercises. It communicates over the mobile network with a stationary system, placed in the user's home. This allows plans for exercise activities to be downloaded from the stationary to the mobile system, and exercise result data to be uploaded once an exercise has been completed.
Conversational Sensing
Recent developments in sensing technologies, mobile devices and context-aware
user interfaces have made it possible to represent information fusion and
situational awareness as a conversational process among actors - human and
machine agents - at or near the tactical edges of a network. Motivated by use
cases in the domain of security, policing and emergency response, this paper
presents an approach to information collection, fusion and sense-making based
on the use of natural language (NL) and controlled natural language (CNL) to
support richer forms of human-machine interaction. The approach uses a
conversational protocol to facilitate a flow of collaborative messages from NL
to CNL and back again in support of interactions such as: turning eyewitness
reports from human observers into actionable information (from both trained and
untrained sources); fusing information from humans and physical sensors (with
associated quality metadata); and assisting human analysts to make the best use
of available sensing assets in an area of interest (governed by management and
security policies). CNL is used as a common formal knowledge representation for
both machine and human agents to support reasoning, semantic information fusion
and generation of rationale for inferences, in ways that remain transparent to
human users. Examples are provided of various alternative styles for user
feedback, including NL, CNL and graphical feedback. A pilot experiment with
human subjects shows that a prototype conversational agent is able to gather
usable CNL information from untrained human subjects.
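As an illustration of the kind of NL-to-CNL conversational flow the abstract describes, the following is a minimal sketch (not the paper's actual implementation): a free-text eyewitness report is mapped to simple CNL-style statements, which the agent echoes back for human confirmation. The toy grammar, the CNL phrasing and the function names are all assumptions made for this example.

```python
# Illustrative sketch of an NL -> CNL confirm loop, in the spirit of a
# conversational protocol between human observers and a machine agent.
# The vocabulary, CNL syntax and helper names here are assumptions,
# not the paper's actual system.
import re

def nl_to_cnl(report: str):
    """Extract (colour, thing) pairs from a free-text report and render
    them as simple CNL-style statements."""
    statements = []
    for colour, thing in re.findall(r"\b(red|blue|white)\s+(van|car|truck)\b",
                                    report.lower()):
        statements.append(f"there is a {thing} that has '{colour}' as colour")
    return statements

def confirm_message(statements):
    """Echo the machine's CNL interpretation back for human confirmation,
    keeping the inference transparent to the user."""
    if not statements:
        return "no interpretation found - please rephrase"
    return "I understood: " + "; ".join(statements) + ". Is this correct?"

report = "Witness saw a red van leaving the scene"
print(confirm_message(nl_to_cnl(report)))
```

A real system would use a much richer controlled-language grammar; the point of the sketch is only the round trip: NL in, a formal CNL interpretation out, and a confirmation turn back to the human.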
When the fingers do the talking: A study of group participation for different kinds of shareable surfaces