
    Influence of Culture, Transparency, Trust, and Degree of Automation on Automation Use

    The reported study compares groups of 120 participants each from the United States, Taiwan, and Turkey interacting with versions of an automated path planner that vary in transparency and degree of automation. The nationalities were selected in accordance with the theory of Cultural Syndromes as representatives of Dignity (US), Face (Taiwan), and Honor (Turkey) cultures, and were predicted to differ in readiness to trust automation, in the degree of transparency required to use automation, and in willingness to use systems with high degrees of automation. Three experimental conditions were tested. In the first (highlight), path conflicts were highlighted, leaving rerouting to the participant. In the second (re-planner), the system requested permission to reroute when a path conflict was detected. The third (combined) condition increased the transparency of the re-planner by combining highlighting with rerouting, making the conflict on which a decision was based visible to the user. A novel framework relating transparency, stages of automation, and trust in automation is proposed, in which transparency plays a primary role in decisions to use automation and is supplemented by trust where information is otherwise insufficient. Hypothesized cultural effects and framework predictions were confirmed.

    Influence of cultural factors in dynamic trust in automation

    The use of autonomous systems has increased rapidly in recent decades. To improve human-automation interaction, trust has been closely studied, and research shows that trust is critical to the development of appropriate reliance on automation. To examine how trust mediates human-automation relationships across cultures, the present study investigated the influences of cultural factors on trust in automation. Theoretically guided empirical studies were conducted in the U.S., Taiwan, and Turkey to examine how cultural dynamics affect various aspects of trust in automation. The results revealed significant cultural differences in trust attitudes toward automation.

    Towards the development of an Inter-Cultural Scale to Measure Trust in Automation

    Trust is conceived as an attitude leading to intentions that result in user actions involving automation. It is generally believed that trust is dynamic and that a user’s prior experience with automation affects future behavior indirectly by causing changes in trust. Additionally, individual differences and cultural factors have frequently been cited as contributors influencing trust beliefs about using and monitoring automation. The presented research focuses on modeling human trust when interacting with automated systems across cultures. The initial trust assessment instrument, comprising 110 items spanning two perceptions (general vs. specific use of automation), has been empirically validated. Detailed results comparing items and dimensionality with our new pooled measure will be presented.

    Artificial intelligence and UK national security: Policy considerations

    RUSI was commissioned by GCHQ to conduct an independent research study into the use of artificial intelligence (AI) for national security purposes. The aim of this project is to establish an independent evidence base to inform future policy development regarding national security uses of AI. The findings are based on in-depth consultation with stakeholders from across the UK national security community, law enforcement agencies, private sector companies, academic and legal experts, and civil society representatives. This was complemented by a targeted review of existing literature on the topic of AI and national security. The research has found that AI offers numerous opportunities for the UK national security community to improve the efficiency and effectiveness of existing processes. AI methods can rapidly derive insights from large, disparate datasets and identify connections that would otherwise go unnoticed by human operators. However, in the context of national security and the powers given to UK intelligence agencies, use of AI could give rise to additional privacy and human rights considerations, which would need to be assessed within the existing legal and regulatory framework. For this reason, enhanced policy and guidance are needed to ensure that the privacy and human rights implications of national security uses of AI are reviewed on an ongoing basis as new analysis methods are applied to data.

    Artificial intelligence: opportunities and implications for the future of decision making

    Artificial intelligence has arrived. In the online world it is already a part of everyday life, sitting invisibly behind a wide range of search engines and online commerce sites. It offers huge potential to enable more efficient and effective business and government, but the use of artificial intelligence brings with it important questions about governance, accountability, and ethics. Realising the full potential of artificial intelligence and avoiding possible adverse consequences requires societies to find satisfactory answers to these questions. This report sets out some possible approaches, and describes some of the ways government is already engaging with these issues.

    Dialectic tensions in the financial markets: a longitudinal study of pre- and post-crisis regulatory technology

    This article presents the findings from a longitudinal research study on regulatory technology in the UK financial services industry. The financial crisis, with its serious corporate and mutual fund scandals, raised the profile of compliance as governmental bodies and institutional and private investors introduced a ‘tsunami’ of financial regulations. Adopting a multi-level analysis, this study examines how regulatory technology was used by financial firms to meet their compliance obligations, pre- and post-crisis. Empirical data collected over 12 years examine the deployment of an investment management system in eight financial firms. Interviews with public regulatory bodies, financial institutions, and technology providers reveal a culture of compliance with increased transparency, surveillance, and accountability. Findings show that dialectic tensions arise as the pursuit of transparency, surveillance, and accountability in compliance mandates is simultaneously rationalized, facilitated, and obscured by regulatory technology. Responding to these challenges, regulatory bodies continue to impose revised compliance mandates on financial firms to force them to adapt their financial technologies in an ever-changing multi-jurisdictional regulatory landscape.

    Influence of Cultural, Organizational, and Automation Capability on Human Automation Trust: A Case Study of Auto-GCAS Experimental Test Pilots

    This paper discusses a case study that examined the influence of cultural, organizational, and automation-capability factors on human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. Contributing factors include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of, and communications among, the organizations involved in the system development.

    Relation between Trust Attitudes Toward Automation, Hofstede’s Cultural Dimensions, and Big Five Personality Traits

    Automation has been widely used in interactions with smartphones, computers, and other machinery in recent decades. Studies have shown that inappropriate reliance on automation can lead to unexpected and even catastrophic results. Trust is conceived as an intervening variable between user intention and actions involving reliance on automation. It is generally believed that trust is dynamic and that an individual’s culture or personality may influence automation use through changes in trust. To better understand how cultural and individual differences may affect a person’s trust and resulting behaviors, the present study examined the effects of cultural characteristics and personality traits on reported trust in automation in U.S., Taiwanese, and Turkish populations. The results showed that individual differences significantly affected human trust in automation across the three cultures.