
    Peer Support Specialists and Service Users’ Perspectives on Privacy, Confidentiality, and Security of Digital Mental Health

    Although the digitalization of mental health systems is progressing, the ethical and social debate on the use of these mental health technologies has seldom been explored among end-users. This article explores how service users (e.g., patients and users of mental health services) and peer support specialists understand and perceive issues of privacy, confidentiality, and security in digital mental health interventions. Semi-structured qualitative interviews were conducted with service users (n = 17) and peer support specialists (n = 15) from a convenience sample at an urban community mental health center in the United States. We identified technology ownership and use, and a lack of technology literacy, including a limited understanding of privacy, confidentiality, and security, as the main barriers to engagement among service users. Peer support specialists demonstrated a high level of technology engagement, literacy with digital mental health tools, and a more comprehensive awareness of digital mental health ethics. We recommend peer support specialists as a potential resource to facilitate service users’ ethical engagement with digital mental health interventions. Finally, engaging potential end-users in the development cycle of digital mental health support platforms, together with stronger privacy regulation, may lead the field to a better understanding of effective uses of technology for people with mental health conditions. This study contributes to the ongoing debate on digital mental health ethics and data justice by providing a first-hand account of digital ethics from end-users’ perspectives.

    Health Information Systems in the Digital Health Ecosystem—Problems and Solutions for Ethics, Trust and Privacy

    Digital health information systems (DHIS) are increasingly members of ecosystems, collecting, using, and sharing huge amounts of personal health information (PHI), frequently without the data subject’s control or authorization. From the data subject’s perspective, there is therefore often no guarantee, and hence no trust, that PHI is processed ethically in Digital Health Ecosystems. This raises new ethical, privacy, and trust challenges to be solved. The authors’ objective is to find a combination of ethical principles and privacy and trust models that together enable the design and implementation of DHIS that act ethically, are trustworthy, and support users’ privacy needs. Research published in journals, conference proceedings, and standards documents is analyzed from the viewpoints of ethics, privacy, and trust. In that context, systems theory and systems engineering approaches, together with heuristic analysis, are deployed. The ethical model proposed is a combination of consequentialism, professional medical ethics, and utilitarianism. Privacy enforcement can be facilitated by defining privacy as a health-information-specific contextual intellectual property right, whereby a service user can express their own privacy needs using computer-understandable policies. Because privacy is a dynamic, indeterminate concept, both privacy and computational trust are modelled using linguistic values and fuzzy mathematics. The proposed solution, combining ethical principles, privacy as intellectual property, and computational trust models, shows a new way to achieve ethically acceptable, trustworthy, and privacy-enabling DHIS and Digital Health Ecosystems.
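    The computational trust model referred to above can be illustrated with a minimal sketch that maps crisp antecedent ratings onto linguistic values (low, medium, high) and defuzzifies them into a single trust score. The membership functions, term labels, and centroid defuzzification below are illustrative assumptions, not the authors’ actual model.

```python
# Minimal sketch of fuzzy "computational trust" using linguistic values
# (low / medium / high). Triangular membership functions and centroid
# defuzzification are illustrative assumptions, not the paper's model.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic terms for a trust rating on a 0..1 scale (assumed shapes).
TERMS = {
    "low":    (-0.01, 0.0, 0.5),
    "medium": (0.0,   0.5, 1.0),
    "high":   (0.5,   1.0, 1.01),
}

def fuzzify(rating):
    """Map a crisp rating in [0, 1] to degrees of membership per term."""
    return {term: tri(rating, *abc) for term, abc in TERMS.items()}

def aggregate_trust(ratings):
    """Combine several antecedent ratings (e.g. transparency, reputation,
    policy compliance) into one crisp trust score."""
    memberships = {t: 0.0 for t in TERMS}
    for r in ratings:
        for term, mu in fuzzify(r).items():
            memberships[term] += mu / len(ratings)
    # Defuzzify: weighted centroid of the term peaks.
    peaks = {"low": 0.0, "medium": 0.5, "high": 1.0}
    total = sum(memberships.values())
    return sum(memberships[t] * peaks[t] for t in TERMS) / total if total else 0.0

# Example: a service provider rated on three trust antecedents.
print(aggregate_trust([0.8, 0.6, 0.9]))  # crisp trust score in [0, 1]
```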

    Uncertain Terms

    Health apps collect massive amounts of sensitive consumer data, including information about users’ reproductive lives, mental health, and genetics. As a result, consumers in this industry may shop for privacy terms when they select a product. Yet our research reveals that many digital health tech companies reserve the right to unilaterally amend their terms of service and their privacy policies. This ability to make one-sided changes undermines the market for privacy, leaving users vulnerable. Unfortunately, the current law generally tolerates unilateral amendments, despite fairness and efficiency concerns. We therefore propose legislative, regulatory, and judicial solutions to better protect consumers of digital health tech and beyond

    Will I or Will I Not? Explaining the Willingness to Disclose Personal Self-Tracking Data to a Health Insurance Company

    Users of digital self-tracking devices increasingly benefit from multiple services related to their self-tracking data. Conversely, new digital as well as “offline” service providers, such as health insurance companies, depend on users’ willingness to disclose personal data in order to offer new services. Whereas previous research mostly investigated the willingness to disclose data in the context of social media, e-commerce, and smartphone apps, the aim of our research is to analyze how the privacy calculus of personal risks and benefits influences the willingness to disclose highly personal and confidential self-tracking data to health insurance companies. To do so, we develop a conceptual model based on the privacy calculus concept and validate it with a sample of 103 respondents in a scenario-based experiment using structural equation modeling. Our results reveal that privacy risks always have a negative impact on the willingness to disclose personal data, while the positive effects of privacy benefits partly depend on data sensitivity.
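    As a rough illustration of how such a privacy-calculus model can be specified and estimated, the sketch below writes the latent constructs in lavaan-style syntax and fits them with the semopy package; the indicator names and structural paths are assumptions for illustration, not the authors’ published measurement model.

```python
# Sketch of a privacy-calculus structural model in lavaan-style syntax,
# estimated with semopy. Indicator names are illustrative assumptions;
# the paper's actual measurement model may differ.
import pandas as pd
import semopy

MODEL_DESC = """
# Measurement model: latent risks, benefits, and willingness to disclose
PrivacyRisk    =~ risk1 + risk2 + risk3
PrivacyBenefit =~ benefit1 + benefit2 + benefit3
Willingness    =~ disclose1 + disclose2

# Structural model: the privacy calculus trade-off
Willingness ~ PrivacyRisk + PrivacyBenefit
"""

def fit_privacy_calculus(df: pd.DataFrame) -> pd.DataFrame:
    """Fit the model to survey data whose columns match the indicator
    names above and return the parameter estimates."""
    model = semopy.Model(MODEL_DESC)
    model.fit(df)
    return model.inspect()  # loadings, path coefficients, p-values

# Usage (assuming survey.csv holds one row per respondent):
# estimates = fit_privacy_calculus(pd.read_csv("survey.csv"))
# print(estimates)
```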

    Ethical Considerations for Participatory Health through Social Media: Healthcare Workforce and Policy Maker Perspectives

    Objectives: To identify the different ethical issues that should be considered in participatory health through social media from different stakeholder perspectives (i.e., patients/service users, health professionals, health information technology (IT) professionals, and policy makers) in any healthcare context. Methods: We implemented a two-round survey: open-ended questions in the first round were aggregated into a list of ethical issues that participants rated for importance in the second round, generating a ranked list of possible ethical issues in participatory health based on healthcare professionals’ and policy makers’ views of their own perspective and their beliefs about other stakeholders’ perspectives. Results: Twenty-six individuals responded in the first round of the survey. Multiple ethical issues were identified for each perspective. Data privacy, data security, and digital literacy were common themes across all perspectives. Thirty-three individuals completed the second round of the survey. Data privacy and data security were ranked among the three most important ethical issues in all perspectives. Quality assurance was the most important issue from the healthcare professionals’ perspective and the second most important issue from the patients’ perspective. Data privacy was the most important consideration for patients/service users. Digital literacy was ranked as the fourth most important issue in all but the policy makers’ perspective. Conclusions: The stakeholders’ opinions largely agreed that common ethical issues, such as data privacy, data security, and quality assurance, should be considered across the four groups (patients, healthcare professionals, health IT professionals, policy makers).
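    The second-round aggregation step can be sketched as a simple ranking of issues by mean importance within each stakeholder perspective; the column names and rating scale below are assumptions, not the study’s actual instrument.

```python
# Sketch of aggregating second-round importance ratings into a ranked
# list of ethical issues per stakeholder perspective. Column names and
# the rating scale are illustrative assumptions.
import pandas as pd

def rank_issues(ratings: pd.DataFrame) -> pd.DataFrame:
    """Expect one row per (respondent, perspective, issue) with an
    'importance' rating; return issues ranked by mean importance
    within each perspective."""
    summary = (
        ratings.groupby(["perspective", "issue"])["importance"]
        .mean()
        .reset_index()
        .sort_values(["perspective", "importance"], ascending=[True, False])
    )
    summary["rank"] = (
        summary.groupby("perspective")["importance"]
        .rank(method="dense", ascending=False)
        .astype(int)
    )
    return summary

# Example with a few toy mean ratings for one perspective:
toy = pd.DataFrame({
    "perspective": ["patients"] * 4,
    "issue": ["data privacy", "data security", "digital literacy", "quality assurance"],
    "importance": [4.8, 4.6, 3.9, 4.2],
})
print(rank_issues(toy))
```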

    Attempts Towards a Zero-Sum Game: A Recurring Imbalance Between Individual Privacy and the Fourth Amendment

    The digital era we live in today allows society to work, shop, socialize, and even monitor one’s health without having to leave the confines of one’s home. In a recent landmark privacy case, Carpenter v. United States, the individual privacy implications of the Fourth Amendment were strengthened when the Supreme Court held that the government must generally obtain a warrant before collecting more than six days of historical cell-site location information from a third-party service provider, like Verizon. Cell-site location information could implicate numerous Fourth Amendment concepts, such as the third-party doctrine, mosaic theory, and public exposure doctrine. Refusing to apply the third-party doctrine in its existing state, the Supreme Court advanced an alternative digital third-party doctrine to protect historical cell-site location information. Recognizing the technological advances and the ubiquitous use of technology by society, the Court’s decision attempts to level the playing field between individual privacy and law enforcement. This Article explores the Supreme Court’s selective valuation of privacy in physical and digital information. In doing so, this Article argues that a digital third-party doctrine will not resolve the tension between the Fourth Amendment and technology, as it is a direct departure from traditional expectations and proves unworkable. What will prove workable, however, is adhering to the common understanding that what enters the public, either through physical or digital information, remains public knowledge, and that which is public knowledge does not amount to a reasonable expectation of privacy.

    Analysis of Consumer Reuse Intention for Digital Healthcare Application Using The Extended TAM Approach

    Technological disruption means that conventional health services, which required consumers to meet health practitioners in person, can now be delivered through health service technology known as digital healthcare applications. In Indonesia, the use of digital healthcare applications increased rapidly when the Covid-19 pandemic hit the world, and conventional health care services no longer suit the current health care market. This study aimed to examine the factors that influence consumers’ intention to reuse digital healthcare applications. The methodology is quantitative, with data collected through an electronic survey of customers who use digital healthcare applications and analyzed using partial least squares structural equation modeling (PLS-SEM). The findings showed that perceived usefulness, trust, privacy, and perceived ease of use (PEOU) have a positive influence on the intention to reuse digital healthcare applications. However, this study did not test mediation or moderation among the variables, and the external variables examined were limited to trust and privacy, so other external variables that might affect the use of digital healthcare applications should be explored further.
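    A simplified stand-in for the PLS-SEM analysis is sketched below: construct scores are formed as item means and reuse intention is regressed on perceived usefulness, ease of use, trust, and privacy. The item names, and the use of ordinary least squares rather than PLS path weighting, are assumptions for illustration only.

```python
# Rough composite-score approximation of the extended TAM model: average
# each construct's items, then regress reuse intention on perceived
# usefulness (PU), perceived ease of use (PEOU), trust, and privacy.
# Simplified stand-in for PLS-SEM; item names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

CONSTRUCTS = {
    "pu":      ["pu1", "pu2", "pu3"],
    "peou":    ["peou1", "peou2", "peou3"],
    "trust":   ["trust1", "trust2"],
    "privacy": ["priv1", "priv2"],
    "reuse":   ["reuse1", "reuse2"],
}

def fit_extended_tam(items: pd.DataFrame):
    """Build composite construct scores and estimate the structural paths."""
    scores = pd.DataFrame(
        {construct: items[cols].mean(axis=1) for construct, cols in CONSTRUCTS.items()}
    )
    return smf.ols("reuse ~ pu + peou + trust + privacy", data=scores).fit()

# Usage (assuming responses.csv contains the Likert items above):
# result = fit_extended_tam(pd.read_csv("responses.csv"))
# print(result.summary())
```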

    Citizens' attitudes to contact tracing apps

    Citizens’ concerns about data privacy and data security breaches may reduce the adoption of COVID-19 contact tracing mobile phone applications, making them less effective. We implement a choice experiment (conjoint experiment) in which participants indicate which of two contact tracing apps they would install, varying the apps’ privacy-preserving attributes. Citizens do not always prioritise privacy and prefer a centralised National Health Service system over a decentralised system. In a further study asking about participants’ preference for digital-only versus human-only contact tracing, we find that a mixture of digital and human contact tracing is supported. We randomly allocated a subset of participants in each study to receive a stimulus priming data breaches as a concern before being asked about contact tracing. The salient threat of unauthorised access or data theft does not significantly alter preferences in either study. We suggest that COVID-19 and trust in a national public health service system mitigate respondents’ concerns about privacy.
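    A common way to analyse such forced-choice conjoint data is a linear probability model of the choice indicator on the apps’ attributes with respondent-clustered standard errors; the attribute and column names below are assumptions, not the study’s exact design.

```python
# Sketch of a standard conjoint (choice experiment) analysis: regress the
# binary "chose this app" indicator on the app's privacy-relevant
# attributes and cluster standard errors by respondent. Attribute names
# are illustrative assumptions, not the study's exact design.
import pandas as pd
import statsmodels.formula.api as smf

def estimate_attribute_effects(choices: pd.DataFrame):
    """Expect one row per (respondent, task, app profile) with a binary
    'chosen' column and categorical attribute columns."""
    model = smf.ols(
        "chosen ~ C(data_storage) + C(data_access) + C(sunset_clause)",
        data=choices,
    )
    return model.fit(
        cov_type="cluster",
        cov_kwds={"groups": choices["respondent_id"]},
    )

# Usage (assuming conjoint.csv holds the stacked choice data):
# result = estimate_attribute_effects(pd.read_csv("conjoint.csv"))
# print(result.summary())
```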

    How a service user knows the level of privacy and to whom trust in pHealth systems?

    pHealth is a data-driven (personal health information) approach that uses communication networks and platforms as its technical base. Its services often take place in a distributed, multi-stakeholder environment. Typical pHealth services for the user are personalized information and recommendations on how to manage specific health problems and how to maintain healthy behaviour (prevention). The rapid development of micro- and nano-sensor technology and signal processing makes it possible for pHealth service providers to collect a wide spectrum of personal health-related information, from vital signs to emotions and health behaviours. This development raises major privacy and trust challenges, especially because in pHealth, similarly to eCommerce and Internet shopping, it is commonly expected that the user automatically trusts the service provider and the information systems used. Unfortunately, this is a wrong assumption, because in pHealth’s digital environment it is almost impossible for the service user to know whom to trust and what the actual level of information privacy is. Therefore, the service user needs tools to evaluate the privacy and trustworthiness of the service provider and the information system used. In this paper, the authors propose a solution that derives privacy and trust from their antecedents and makes use of computational privacy and trust. To answer the question of which antecedents to use, two literature reviews were performed, and 27 privacy attributes and 58 trust attributes suitable for pHealth were found. A proposal for how to select a subset of these antecedents for real-life use is also provided.
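    The idea of deriving privacy and trust from a selected subset of antecedents can be sketched as a weighted aggregation of a user’s antecedent ratings; the antecedent names and weights below are illustrative assumptions, not the attribute lists identified in the paper’s literature reviews.

```python
# Sketch of "privacy and trust as results of their antecedents": a service
# user scores a provider on a selected subset of antecedents and the tool
# aggregates them into crisp privacy and trust values. Antecedent names
# and weights are illustrative assumptions, not the paper's attribute lists.

PRIVACY_ANTECEDENTS = {   # weight of each selected privacy antecedent
    "purpose_limitation": 0.4,
    "user_consent_control": 0.35,
    "data_minimisation": 0.25,
}

TRUST_ANTECEDENTS = {     # weight of each selected trust antecedent
    "provider_reputation": 0.3,
    "transparency_of_processing": 0.4,
    "regulatory_compliance": 0.3,
}

def composite_score(ratings: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted mean of the antecedent ratings (0..1 scale) that are present."""
    total_weight = sum(w for a, w in weights.items() if a in ratings)
    if total_weight == 0:
        return 0.0
    return sum(ratings[a] * w for a, w in weights.items() if a in ratings) / total_weight

# Example: a user's ratings of one pHealth provider.
ratings = {
    "purpose_limitation": 0.7,
    "user_consent_control": 0.5,
    "data_minimisation": 0.9,
    "provider_reputation": 0.6,
    "transparency_of_processing": 0.4,
    "regulatory_compliance": 0.8,
}
print("privacy:", composite_score(ratings, PRIVACY_ANTECEDENTS))
print("trust:  ", composite_score(ratings, TRUST_ANTECEDENTS))
```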
