5 research outputs found

    Speech-centric multimodal interaction for easy-to-access online services: A personal life assistant for the elderly

    The PaeLife project is a European industry-academia collaboration whose goal is to provide the elderly with easy access to online services that make their lives easier and encourage their continued participation in society. To reach this goal, the project partners are developing a multimodal virtual personal life assistant (PLA) offering a wide range of services, from weather information to social networking. This paper presents the multimodal architecture of the PLA, the services it provides, and the work done on the speech input and output modalities, which play a key role in the application.
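
The abstract stresses that speech input and output are the central modalities of the PLA. As an illustration only, not the paper's actual architecture, the minimal sketch below shows how a speech-centric assistant might route a recognized utterance to one of several services and return a reply for text-to-speech; all class, function, and service names are hypothetical.

```python
# Illustrative sketch only: a speech-centric service dispatcher for a
# personal life assistant. All names (SpeechAssistant, weather_service)
# are hypothetical and not taken from the PaeLife project.
from typing import Callable, Dict


class SpeechAssistant:
    def __init__(self) -> None:
        # Map simple spoken keywords to service handlers.
        self.services: Dict[str, Callable[[str], str]] = {}

    def register_service(self, keyword: str, handler: Callable[[str], str]) -> None:
        """Register a service triggered by a keyword in the recognized speech."""
        self.services[keyword] = handler

    def handle_utterance(self, recognized_text: str) -> str:
        """Dispatch a recognized utterance to the first matching service."""
        text = recognized_text.lower()
        for keyword, handler in self.services.items():
            if keyword in text:
                return handler(text)
        return "Sorry, I did not understand that."


def weather_service(utterance: str) -> str:
    # Stand-in for a real weather lookup.
    return "Today it is sunny with a high of 21 degrees."


if __name__ == "__main__":
    assistant = SpeechAssistant()
    assistant.register_service("weather", weather_service)
    # In a real system the input would come from a speech recognizer and
    # the reply would be passed to a text-to-speech engine.
    print(assistant.handle_utterance("What is the weather like today?"))
```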

    Elderly Centered Design for Interaction – The Case of the S4S Medication Assistant

    Several aspects of older adults' lives can benefit from recent technological developments, but success in harnessing this potential depends on careful design and accessible, easy-to-use products. Design and development must be centered on the elderly and give adequate consideration to interaction. In this paper we follow this design approach and put it to the test by developing a concrete application aimed at lowering the high levels of non-adherence to medication in the elderly population. The "Medication Assistant" application was developed following an iterative method centered, from the start, on the elderly and on interaction design. The method repeats short development cycles that integrate the definition of scenarios and goals, requirements engineering, design, prototyping, and evaluation. Evaluation of the increasingly refined prototypes by end users is a key characteristic of the method. The evaluation results provide information on the strengths and weaknesses of the application and yield suggestions for changes and improvements, offering valuable support for further development. Results of the evaluation of the second prototype of the "Medication Assistant" are presented.

    Multimodal interaction: contributions to simplify the development of applications

    Doctoral thesis in Informatics Engineering. The way we interact with the devices around us in everyday life is constantly changing, driven by emerging technologies and methods that provide better and more engaging ways to interact with applications. Nevertheless, integrating these technologies, to enable their widespread use in current systems, presents a notable challenge and requires considerable know-how from developers. While the recent literature has made some advances in supporting the design and development of multimodal interactive systems, several key aspects have yet to be addressed before their full potential can be reached. Among these, a relevant example is the difficulty of developing and integrating multiple interaction modalities. In this work, we propose, design, and implement a framework enabling easier development of multimodal interaction. Our proposal fully decouples the interaction modalities from the application, allowing each part to be developed separately. The proposed framework already includes a set of generic modalities and modules ready to be used in new applications. Among the generic modalities, the speech modality received particular attention, given the increasing relevance of speech interaction, for example in scenarios such as AAL, and the complexity of its development. Additionally, our proposal addresses the management of multi-device applications and includes a method and corresponding module for the fusion of events. The development of the architecture and framework benefited from a rich R&D context including several projects, application scenarios, and international partners. The framework successfully supported the design and development of a wide set of multimodal applications, a notable example being AALFred, the personal assistant of the PaeLife project. These applications, in turn, drove the continuous improvement of the framework by supporting the iterative collection of new requirements, and allowed the framework to demonstrate its versatility and potential.
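
The framework's key idea, as described in the abstract, is to keep interaction modalities fully decoupled from the application and to fuse events coming from several modalities. The sketch below is a minimal, hypothetical illustration of that idea, not the thesis's actual API (which is not given here): modalities only publish events, the application only subscribes to fused events, and a simple fusion module combines events that arrive close together in time.

```python
# Minimal, hypothetical sketch of the decoupling-plus-fusion idea described
# in the abstract; class names and the time-window fusion rule are
# assumptions, not the framework's actual API.
import time
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class ModalityEvent:
    modality: str          # e.g. "speech" or "touch"
    payload: str           # recognized command or gesture
    timestamp: float = field(default_factory=time.time)


class FusionEngine:
    """Combines events from different modalities that occur within a time window."""

    def __init__(self, window_seconds: float = 2.0) -> None:
        self.window = window_seconds
        self.pending: List[ModalityEvent] = []
        self.listeners: List[Callable[[List[ModalityEvent]], None]] = []

    def subscribe(self, listener: Callable[[List[ModalityEvent]], None]) -> None:
        # The application registers here; it never talks to modalities directly.
        self.listeners.append(listener)

    def publish(self, event: ModalityEvent) -> None:
        # Modalities push events here; they know nothing about the application.
        self.pending = [e for e in self.pending
                        if event.timestamp - e.timestamp <= self.window]
        self.pending.append(event)
        for listener in self.listeners:
            listener(list(self.pending))


if __name__ == "__main__":
    engine = FusionEngine()
    engine.subscribe(lambda events: print(
        "Fused:", [(e.modality, e.payload) for e in events]))
    # A speech command and a touch selection arriving close together are
    # delivered to the application as one fused interpretation.
    engine.publish(ModalityEvent("speech", "open that"))
    engine.publish(ModalityEvent("touch", "calendar icon"))
```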