
    In private, secure, conversational FinBots we trust

    In the past decade, the financial industry has experienced a technology revolution. While we witness a rapid introduction of conversational bots for financial services, there is a lack of understanding of conversational user interface (CUI) features in this domain. The finance industry also deals with highly sensitive information and monetary transactions, presenting a challenge for developers and financial providers. Through a study on how to design text-based conversational financial interfaces with N=410 participants, we outline user requirements for trustworthy CUI design for financial bots. We posit that, in the context of finance, bot privacy and security assurances outweigh conversational capability, and we postulate implications of these findings. This work acts as a resource on how to design trustworthy FinBots and demonstrates how automated financial advisors can be transformed into trusted everyday devices, capable of supporting users' daily financial activities.

    Comment: Proceedings of the CHI 2021 Workshop on Let's Talk About CUIs: Putting Conversational User Interface Design into Practice, May 8, 2021, Yokohama, Japan

    "My sex-related data is more sensitive than my financial data and I want the same level of security and privacy": User Risk Perceptions and Protective Actions in Female-oriented Technologies

    The digitalization of the reproductive body has engaged a myriad of cutting-edge technologies in supporting people to know and tackle their intimate health. Generally understood as female technologies (aka female-oriented technologies or 'FemTech'), these products and systems collect a wide range of intimate data, which are processed, transferred, saved, and shared with other parties. In this paper, we explore how the "data-hungry" nature of this industry and the lack of proper safeguarding mechanisms, standards, and regulations for vulnerable data can lead to complex harms or faint agentic potential. We adopted mixed methods in exploring users' understanding of the security and privacy (SP) of these technologies. Our findings show that while users can speculate about the range of harms and risks associated with these technologies, they are not equipped or provided with the technological skills to protect themselves against such risks. We discuss a number of approaches, including participatory threat modelling and SP by design, in the context of this work and conclude that such approaches are critical to protecting users in these sensitive systems.

    Why Privacy is All But Forgotten - An Empirical Study of Privacy and Sharing Attitude

    Privacy and sharing are believed to share a dynamic and dialectical tension, where individuals have competing needs to be both open and closed in contact with others [8]. Online, technology can impact this dynamic process [68]. Indeed, a number of researchers observed that users' stated privacy attitudes do not match their behavior [2, 3, 23, 30, 64, 81]. In these studies, privacy attitude is compared with behavior via a number of concepts related to privacy. While it is known in psychology that attitudes are multidimensional constructs [10, 15, 76], the question arises whether user ambivalence with regard to privacy is due to different or contradictory cognitive and affective components of privacy and sharing attitude.

    We conducted an empirical study to investigate the difference between privacy attitude and sharing attitude. A US sample of N = 60 MTurk workers was assigned to two groups and asked to describe in a 250-word free-form response what [privacy/sharing] online means to them. Responses were coded in quantitative content analysis. The presence and frequency of codes were compared across conditions. Emotions and relationships to other parties were evaluated as predictors for a discriminative logistic regression classifying both attitudes.

    We found that privacy and sharing attitude differ significantly across a number of the extracted codes. Participants in the privacy condition were significantly more likely to express fear and significantly less likely to express happiness; for sharing attitude, the reverse is true. A discriminant logistic regression on a tone analysis of the participants' responses offers excellent discrimination between privacy and sharing attitude. We cross-validated this classifier with another sample of N′ = 54. The observed differences contribute an understanding of user states in privacy (and sharing) situations online and have implications for both privacy research and practice.
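    The discriminative-classifier step described above can be sketched as follows. This is an illustrative toy, not the authors' actual pipeline: the emotion-count features, the labels, and the data values are invented for demonstration, standing in for the coded tone features the study used.

```python
# Illustrative sketch (assumed features, invented data): a logistic regression
# that discriminates "privacy" from "sharing" free-form responses using
# emotion-tone counts, in the spirit of the classifier described in the study.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-response features: [mentions of fear, mentions of happiness]
X = np.array([
    [3, 0], [2, 1], [4, 0], [2, 0],   # responses from the privacy condition
    [0, 3], [1, 2], [0, 4], [0, 2],   # responses from the sharing condition
])
y = np.array(["privacy"] * 4 + ["sharing"] * 4)

clf = LogisticRegression().fit(X, y)

# A fear-heavy response should land in "privacy", a happiness-heavy one
# in "sharing", mirroring the emotion pattern the study reports.
preds = clf.predict(np.array([[3, 0], [0, 3]]))
print(preds)
```

    In the study itself the predictors came from a tone analysis of the coded responses and the classifier was cross-validated on a second sample; here the point is only the shape of the method: emotion features in, a binary privacy/sharing decision out.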