7 research outputs found

    Are Auditors' Reliance on Conclusions from Data Analytics Impacted By Different Data Analytic Inputs?

    Global stakeholders have expressed interest in increasing the use of data analytics throughout the audit process. While data analytics offer great promise in identifying audit-relevant information, auditors may not uniformly incorporate this information into their decision making. This study examines whether conclusions from two data analytic inputs, the type of data analytical model (anomaly vs. predictive) and the type of data analyzed (financial vs. nonfinancial), result in different auditor decisions. Findings suggest that conclusions from data analytical models and data analyzed jointly impact budgeted audit hours. Specifically, when financial data is analyzed, auditors increase budgeted audit hours more when predictive models are used than when anomaly models are used. The opposite occurs when nonfinancial data is analyzed: auditors increase budgeted audit hours more when anomaly models are used than when predictive models are used. These findings provide initial evidence that data analytics with different inputs do not uniformly impact auditors' judgments.

    Is Sophistication Always Better? Can Perceived Data Analytic Tool Sophistication Lead to Biased Judgements?

    The rise of technology-enabled data analytic tools creates opportunities for firms to improve audit quality related to complex estimates. To combat auditors' resistance to using technology-enabled tools, firms may promote the sophistication of such tools to their audit staff. However, little research has examined how auditors' perceived sophistication of an analytic tool impacts judgments about audit evidence. We conduct an experiment and find that, holding all other information constant, the preferences of an audit supervisor interact with the perceived sophistication of an analytic tool to jointly impact auditors' anticipated evaluation from a supervisor and, in turn, their evidence assessment decisions when auditing a complex estimate. As such, the promotion of tool sophistication by audit firms can affect the audit of complex estimates to a greater degree than would be expected. Implications for audit theory and practice are discussed.

    Data Analytics (Ab)Use in Healthcare Fraud Audits

    This study explores how government-adopted audit data analytic tools promote the abuse of power by auditors, enabling politically sensitive processes that encourage industry-wide normalization of behavior. In an audit setting, we investigate how a governmental organization enables algorithmic decision-making to alter power relationships and effect organizational and industry-wide change. While prior research has identified discriminatory threats emanating from the deployment of algorithmic decision-making, the effects of algorithmic decision-making on inherently imbalanced power relationships have received scant attention. Our results provide empirical evidence of how systemic and episodic power relationships strengthen each other, thereby enabling the governmental organization to effect social change that might be too politically prohibitive to enact directly. Overall, the results suggest that there are potentially negative effects caused by the use of algorithmic decision-making and the resulting power shifts, and these effects create a different view of the level of purported success attained through auditor use of data analytics.

    Do Different Data Analytics Impact Auditors' Decisions?

    Global stakeholders have expressed interest in increasing the use of data analytics throughout the audit process. While data analytics offer great promise in identifying audit-relevant information, auditors may not use this information to its full potential, resulting in a missed opportunity for possible improvements to audit quality. This article summarizes a study by Koreff (2022) that examines whether conclusions from different types of data analytical models (anomaly vs. predictive) and data analyzed (financial vs. non-financial) result in different auditor decisions. Findings suggest that when predictive models are used and identify a risk of misstatement, auditors increase budgeted audit hours more when financial data is analyzed than when non-financial data is analyzed. However, when anomaly models are used and identify a risk of misstatement, auditors' budgeted hours do not differ based on the type of data analyzed. These findings provide evidence that different data analytics do not uniformly impact auditors' decisions.

    The Sentinel Effect and Financial Reporting Aggressiveness in the Healthcare Industry

    The sentinel effect posits that the perception of increased oversight is associated with improved behavior. We consider the association between enhanced government oversight and financial reporting aggressiveness in the healthcare industry. Specifically, we examine the association between criminal cases (pending cases and successful cases) against healthcare providers and the quality of subjective accounts that require significant judgment and have been shown to be linked to healthcare earnings management: revenue accruals and the allowance for doubtful accounts. We find that heightened government oversight is associated with lower financial reporting aggressiveness.

    Three Studies Examining Auditors' Use of Data Analytics

    This dissertation comprises three studies, one qualitative and two experimental, that center on auditors' use of data analytics. Data analytics hold the potential for auditors to reallocate time spent on labor-intensive tasks to judgment-intensive tasks (Brown-Liburd et al. 2015), ultimately improving audit quality (Raphael 2017). Yet the availability of these tools does not guarantee that auditors will incorporate the data analytics into their judgments (Davis et al. 1989; Venkatesh et al. 2003). The first study investigates implications of using data analytics to structure the audit process for nonprofessionalized auditors. As the public accounting profession continues down a path of de-professionalization (Dirsmith et al. 2015), data analytics may increasingly be used as a control mechanism for guiding nonprofessionalized auditors' work tasks. Results of this study highlight negative ramifications of using nonprofessionalized auditors in a critical audit setting. The second study examines how different types of data analytics impact auditors' judgments. This study demonstrates the joint impact that the type of data analytical model and the type of data analyzed have on auditors' judgments, contributing to the literature and practice by demonstrating that data analytics do not uniformly impact auditors' judgments. The third study examines how auditors' reliance on data analytics is impacted by the presentation source and the level of risk identified. This study provides insights into the effectiveness of public accounting firms' development of data scientist groups to incorporate the data analytic skillset into audit teams. Collectively, these studies contribute to the literature by providing evidence on auditors' use of data analytics. Currently, the literature is limited to demonstrating that auditors are not effective at identifying patterns in data analytics visualizations when viewed before traditional audit evidence (Rose et al. 2017). The three studies in this dissertation highlight that not all data analytics influence judgments equally.

    Is sophistication always better? The impact of data analytic tool sophistication and supervisor preferences on the evaluation of complex estimates

    The rise of technology-enabled data analytic tools creates opportunities for firms to improve audit quality related to complex estimates. In an effort to combat auditors' resistance to using technology-enabled tools, firms may promote the sophistication of such tools to their audit staff. However, little research has examined how auditors consider the sophistication of an analytic tool when making judgments about audit evidence. We conduct an experiment and find that, holding all other information constant, the perceived sophistication of an analytic tool interacts with the preferences of an audit supervisor to jointly impact auditors' anticipated evaluation from a supervisor and, in turn, their evidence evaluation decisions when auditing a complex estimate. As such, the promotion of tool sophistication by audit firms can affect the audit of complex estimates to a greater degree than would normatively be expected. Implications for audit theory and practice are discussed.