3 research outputs found

    WOLF: a Research Platform to Write NFC Secure Applications on Top of Multiple Secure Elements (With an Original SQL-Like Interface)

    No full text
    This article presents the WOLF (Wallet Open Library Framework) platform, which offers an original interface for NFC developers called "SE-QL". SE-QL is a SQL-like interface that eases and optimizes the development of NFC secure applications by making the heterogeneity of the Secure Element (SE) transparent. An SE may be embedded in the mobile device (eSE), hosted in the SIM card (UICC), implemented as "on-host" software, or located in the cloud (e.g. through HCE); each implementation has its own interface(s), which makes developing NFC secure applications cumbersome and complex. The proposed SE-QL interface solves this problem. This article demonstrates the feasibility and attractiveness of our approach based on an original high-level API.
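
    The abstract does not give SE-QL's concrete syntax, so the sketch below is only a hypothetical illustration of what a SQL-like, SE-agnostic query layer could look like from a Java developer's point of view; the class names (SeQlSession, SeQlResult) and the query grammar are invented for this example and are not the WOLF API.

    ```java
    // Hypothetical illustration only: SeQlSession, SeQlResult and the query
    // grammar below are invented for this sketch; they are NOT the WOLF/SE-QL API.
    import java.util.List;

    public class SeQlSketch {

        /** Minimal facade hiding which Secure Element (eSE, UICC, HCE, ...) answers. */
        interface SeQlSession {
            SeQlResult execute(String seQlQuery);
        }

        interface SeQlResult {
            List<String> rows();
        }

        public static void listPaymentCredentials(SeQlSession session) {
            // The point of a SQL-like interface: the same query works whether the
            // credential lives on the embedded SE, the SIM/UICC or a cloud SE (HCE).
            SeQlResult result = session.execute(
                    "SELECT alias, expiry FROM credentials WHERE type = 'PAYMENT'");
            for (String row : result.rows()) {
                System.out.println(row);
            }
        }
    }
    ```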

    User Data on Androïd Smartphone Must be Protected

    No full text
    International audience"In the world of mobile, there is no anonymity," says Michael Becker of the Mobile Marketing Association, an industry trade group. In recent work, Enck and colleagues have used information flow monitoring on a mobile device to show that, on average, over two thirds of the most popular applications of an Android market were responsible for data leakage [1]. We believe data leakages are mainly due to the intrinsic limitations of Android's security mechanisms. Here we describe "Blare", a tool that detects AndroĂŻd data leakages

    Prendre soin du consentement : tisser l'éthique dans le design d'un agent conversationnel (Caring for consent: weaving ethics into the design of a conversational agent)

    Full text link
    "Do you agree?" In the digital realm, this question calls for a yes or no answer to the processing of our personal data by an organization; that is, it asks for the user's consent. Conversational agents are dialogue systems (or bots) that use a great deal of personal data to talk with humans through a textual interface (chatbot) or a voice interface (voice assistant, sometimes called "voicebot"). Through an ethnography within a start-up that develops these kinds of agents, this doctoral dissertation explores how consent is woven into this organization's design practices. The dissertation asks: what can a "good experience" of consent with a conversational agent look like? It starts by reviewing the literature on the design of conversational experiences and on the ethics of consent. While a good conversational experience is supposed to be smooth and enjoyable, the experience of consent requires interruptions, sometimes uncomfortable ones, if it is to be sufficiently informed. How can these two versions of a "good experience" be brought together without one hurting or invalidating the other? Faced with the many limitations that make meaningful consent difficult in a digital context, the temptation to set consent aside is palpable, yet its ethical horizon of respect is worth caring for. With a perspective that views consent as a matter of caring, the dissertation then develops a theoretical framework that invites us to care for consent situations. Centered on the concept of matter of caring, this framework enriches the ventriloquial approach, which studies the communicative constitution of reality, with an ethics of care; it focuses attention on how certain matters of concern can be cared for in order to improve consent situations. By becoming a matter of caring, consent is no longer neglected as a concern that fails to count; it instead participates in changing how people interact with each other. Participating in bringing about this change is also at the core of the ethnographic practices that constitute this dissertation: based on my experience as an active participant in the start-up mentioned above, rather than a mere participant observer, I explain how ethnographic reflexive practice can be viewed relationally, that is, how an ethnographer can be actively engaged in the constitution of the organization she is studying. This engagement shaped the fieldwork material I analyze throughout the dissertation. More specifically, the last chapter delves into my fieldwork and its tensions around designing a good conversational experience and a good consent experience. Resisting the temptation to let consent fade away, I show that holding on to it as a matter of caring brings the focus onto the conditions under which the organization asks for consent: how it plans to process personal information and how it designs the interaction through which the bot asks its users for their consent. This is not a success story, but rather a story of vulnerability. Thus, this dissertation does not propose a model of consent, nor does it suggest ethical design guidelines for a conversational agent. Instead, it highlights the importance of providing the conditions that make room for consent interactions. Rather than freezing consent in a constrained state of formality, it invites us to think of consent as a conversation, with respect and flourishing on the horizon.