
    Data analytics and algorithms in policing in England and Wales: Towards a new policy framework

    RUSI was commissioned by the Centre for Data Ethics and Innovation (CDEI) to conduct an independent study into the use of data analytics by police forces in England and Wales, with a focus on algorithmic bias. The primary purpose of the project is to inform CDEI’s review of bias in algorithmic decision-making, which is focusing on four sectors, including policing, and working towards a draft framework for the ethical development and deployment of data analytics tools for policing. This paper focuses on advanced algorithms used by the police to derive insights, inform operational decision-making or make predictions. Biometric technology, including live facial recognition, DNA analysis and fingerprint matching, is outside the direct scope of this study, as are covert surveillance capabilities and digital forensics technology, such as mobile phone data extraction and computer forensics. However, because many of the policy issues discussed in this paper stem from general underlying data protection and human rights frameworks, these issues will also be relevant to other police technologies, and their use must be considered in parallel to the tools examined in this paper. The project involved engaging closely with senior police officers, government officials, academics, legal experts, regulatory and oversight bodies and civil society organisations. Sixty-nine participants took part in the research in the form of semi-structured interviews, focus groups and roundtable discussions. The project has revealed widespread concern across the UK law enforcement community regarding the lack of official national guidance for the use of algorithms in policing, with respondents suggesting that this gap should be addressed as a matter of urgency. Any future policy framework should be principles-based and complement existing police guidance in a ‘tech-agnostic’ way. Rather than establishing prescriptive rules and standards for different data technologies, the framework should establish standardised processes to ensure that data analytics projects follow recommended routes for the empirical evaluation of algorithms within their operational context and are evaluated against legal requirements and ethical standards. The new guidance should focus on ensuring multi-disciplinary legal, ethical and operational input from the outset of a police technology project; a standard process for model development, testing and evaluation; a clear focus on the human–machine interaction and the ultimate interventions a data-driven process may inform; and ongoing tracking and mitigation of discrimination risk.

    On Evidence-based Risk Management in Requirements Engineering

    Background: The sensitivity of Requirements Engineering (RE) to its context makes it difficult to efficiently control problems therein, thus hampering an effective risk management that would allow for early corrective or even preventive measures. Problem: There is still little empirical knowledge about context-specific RE phenomena, which would be necessary for an effective context-sensitive risk management in RE. Goal: We propose and validate an evidence-based approach to assess risks in RE using cross-company data about problems, causes and effects. Research Method: We use survey data from 228 companies and build a probabilistic network that supports the forecast of context-specific RE phenomena. We implement this approach using spreadsheets to support a lightweight risk assessment. Results: Our results from an initial validation in 6 companies strengthen our confidence that the approach increases the awareness of individual risk factors in RE, and the feedback further allows for disseminating our approach into practice. Comment: 20 pages, submitted to 10th Software Quality Days conference, 201
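    The core idea of the evidence-based approach above can be sketched in a few lines: use cross-company survey observations to estimate the relative frequency of RE problems conditioned on a project context, and use those frequencies as a forecast for a new project in that context. The sketch below is a minimal illustration, not the paper's probabilistic network; all contexts, problem names and counts are invented for the example.

    ```python
    # Minimal sketch (hypothetical data) of evidence-based RE risk
    # forecasting: estimate how often each problem occurs within a
    # given project context, using cross-company survey records.
    from collections import Counter

    # Each record: (context, observed problem) from one surveyed company.
    # All values below are invented for illustration.
    survey = [
        ("agile", "incomplete requirements"),
        ("agile", "incomplete requirements"),
        ("agile", "moving targets"),
        ("plan-driven", "communication flaws"),
        ("plan-driven", "incomplete requirements"),
        ("plan-driven", "communication flaws"),
    ]

    def problem_risk(context, data):
        """Relative frequency of each problem within the given context."""
        in_ctx = [problem for ctx, problem in data if ctx == context]
        counts = Counter(in_ctx)
        total = len(in_ctx)
        return {problem: n / total for problem, n in counts.items()}

    # Forecast for a new agile project: "incomplete requirements" is the
    # dominant risk in this toy data set (2 of 3 agile observations).
    risks = problem_risk("agile", survey)
    print(risks["incomplete requirements"])  # 0.666...
    ```

    A full probabilistic network would additionally chain causes, problems and effects as conditional dependencies; the spreadsheet implementation mentioned in the abstract suggests exactly this kind of frequency-based, lightweight computation.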

    Better by design: Business preferences for environmental regulatory reform

    We present the preferences for environmental regulatory reform expressed by 30 UK businesses and industry bodies from 5 sectors. While five strongly preferred voluntary regulation, seven expressed doubts about its effectiveness, and 18 expressed no general preference between instrument types. Voluntary approaches were valued for flexibility and lower burdens, but direct regulation offered stability and a level playing field. Respondents sought regulatory frameworks that are coherent; balance clarity, prescription and flexibility; are enabled by positive regulatory relationships; are administratively efficient; are targeted according to risk magnitude and character; are evidence-based; and deliver long-term market stability for regulatees. Anticipated differences in performance between types of instrument can be undermined by poor implementation. The results underline the need for policy makers and regulators to tailor an effective mix of instruments for a given sector, and to overcome analytical, institutional and political barriers to greater coherence, to better coordinate existing instruments and to tackle new environmental challenges as they emerge.

    Gainsharing: A Critical Review and a Future Research Agenda

    This paper provides a critical review of the extensive literature on gainsharing. It examines the reasons for the fast growth in these programs in recent years and the major prototypes used in the past. Different theoretical formulations making predictions about the behavioral consequences and conditions mediating the success of these programs are discussed, and the supporting empirical evidence is examined. The large number of atheoretical case studies and practitioner reports on gainsharing are also summarized and integrated. The article concludes with a suggested research agenda for the future.

    Board composition, monitoring and credit risk: evidence from the UK banking industry

    This paper examines the effects of board composition and monitoring on credit risk in the UK banking sector. The study finds CEO duality, pay and board independence to have a positive and significant effect on the credit risk of UK banks. However, board size and the presence of women on the board have a negative and significant influence on credit risk. Further analysis using sub-samples divided into pre-financial-crisis, financial-crisis and post-crisis periods reinforces the robustness of our findings. Overall, the paper sheds light on the effectiveness of within-firm monitoring arrangements, particularly the effects of CEO power and board independence on credit risk decisions, thereby contributing to agency theory.

    Operator-based approaches to harm minimisation in gambling: summary, review and future directions

    In this report we give critical consideration to the nature and effectiveness of harm minimisation in gambling. We identify gambling-related harm as both personal (e.g., health, wellbeing, relationships) and economic (e.g., financial) harm that occurs from exceeding one’s disposable income or disposable leisure time. We have elected to use the term ‘harm minimisation’ as the most appropriate term for reducing the impact of problem gambling, given its breadth in regard to the range of goals it seeks to achieve, and the range of means by which they may be achieved. The extent to which an employee can proactively identify a problem gambler in a gambling venue is uncertain. Research suggests that indicators do exist, such as sessional information (e.g., duration or frequency of play) and negative emotional responses to gambling losses. However, the practical implications of requiring employees to identify and interact with customers suspected of experiencing harm are questionable, particularly as the employees may not possess the clinical intervention skills which may be necessary. Based on emerging evidence, behavioural indicators identifiable in industry-held data could be used to identify customers experiencing harm. A programme of research is underway in Great Britain and in other jurisdictions.

    A Survey on Economic-driven Evaluations of Information Technology

    The economic-driven evaluation of information technology (IT) has become an important instrument in the management of IT projects. Numerous approaches have been developed to quantify the costs of an IT investment and its assumed profit, to evaluate its impact on business process performance, and to analyze the role of IT in the achievement of enterprise objectives. This paper discusses approaches for evaluating IT from an economic-driven perspective. Our comparison is based on a framework distinguishing between classification criteria and evaluation criteria. The former allow for the categorization of evaluation approaches based on their similarities and differences. The latter, by contrast, represent attributes that allow one to evaluate the discussed approaches. Finally, we give an example of a typical economic-driven IT evaluation.

    A Substruction Approach to Assessing the Theoretical Validity of Measures

    Background: Validity is about the logic, meaningfulness, and evidence used to defend inferences made when interpreting results. Substruction is a heuristic or process that visually represents the hierarchical structure between theory and measures. Purpose: To describe substruction as a method for assessing the theoretical validity of research measures. Methods: Using Fawcett's Conceptual-Theoretical-Empirical Structure, an exemplar is presented of substruction from the Individual and Family Self-Management Theory to the Striving to be Strong study concepts and empirical measures. Results: Substruction tables display evidence supporting the theoretical validity of the instruments used in the study. Conclusion: A high degree of congruence between theory and measure is critical to support the validity of the theory and to support attributions made about moderating, mediating and causal relationships, and intervention effects.

    Empirical Evidence of RFID Impacts on Supply Chain Performance

    Purpose - The purpose of this paper is to investigate the actual benefits of radio frequency identification (RFID) on supply chain performance through empirical evidence. Design/methodology/approach - The research reviews and classifies the existing quantitative empirical evidence of RFID on supply chain performance. The evidence is classified by process (operational or managerial) and, for each process, by effect (automational, informational, and transformational). Findings - The empirical evidence shows that the major effects from the implementation of RFID are automational effects on operational processes, followed by informational effects on managerial processes. RFID implementation has not reached the transformational level on either operational or managerial processes. RFID has an automational effect on operational processes through inventory control and efficiency improvements. An informational effect for managerial processes is observed in improved decision quality, production control and the effectiveness of retail sales and promotions coordination. In addition, a three-stage model is proposed to explain the effects of RFID on the supply chain. Research limitations/implications - Limitations of this research include the use of secondary sources and the lack of consistency in performance measure definitions. Future research could focus on detailed case studies that investigate cross-functional applications across the organization and the supply chain. Practical implications - For managers, the empirical evidence presented can help them identify implementation areas where RFID can have the greatest impact. The data can be used to build the business case for RFID and therefore better estimate ROI and the payback period. Originality/value - This research fills a void in the literature by providing practitioners and researchers with a better understanding of the quantitative benefits of RFID in the supply chain.
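    The two-dimensional classification scheme described above (process × effect) amounts to filing each piece of evidence into a small matrix and reading off which cells are densest. The sketch below illustrates that bookkeeping; the example findings are invented placeholders, not results from the paper.

    ```python
    # Hypothetical sketch of the paper's classification scheme: each
    # piece of empirical evidence is filed by process (operational /
    # managerial) and effect (automational / informational /
    # transformational), then cells are compared by density.
    from collections import defaultdict

    evidence = [
        # (process, effect, finding) -- invented examples for illustration
        ("operational", "automational", "faster inventory counts"),
        ("operational", "automational", "reduced shrinkage"),
        ("managerial", "informational", "better promotion coordination"),
    ]

    matrix = defaultdict(list)
    for process, effect, finding in evidence:
        matrix[(process, effect)].append(finding)

    # In this toy data set the densest cell mirrors the survey's headline
    # result: automational effects on operational processes dominate.
    print(len(matrix[("operational", "automational")]))  # 2
    ```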