21 research outputs found
Technology investment decision making: an integrated analysis in UK Internet Banking
The research addresses the problem of technological investment decision making (TIDM) in UK banks. It focuses on Internet Banking technologies and uses interviews with bank executives and industry practitioners to form a coherent understanding of how technological decisions are made in practice and what role evaluation techniques play in that process. The aims of the research are (1) to identify and explain the discord between formal and practical evaluations of technologies, (2) to review the roles of expert professional groups in defining the norms of evaluation, and (3) to develop a model that reflects the reality of TIDM in UK banking. The ultimate aim is to contribute to reducing the ambiguity that notoriously characterises the evaluation of new technology. According to the theoretical framework, the TIDM problem is socially constructed by expert groups (actors) who either participate in decision-making or assume roles in developing methodologies for facilitating it. Its ultimate shape is the outcome of negotiations between these viewpoints, in light of expert power positions and political advocacy. Three classes of such "actors" are identified: (1) Practitioners, namely experts in Financial Institutions, (2) Observers, academic researchers, consultants and government bodies, and (3) the Community of Received Wisdom, comprising the commonly understood views on what TIDM is and how it should be made. A novel methodological approach is introduced as a variant of Grounded Theory.
Called Informed Grounded Theory (IGT), it proposes that viewpoints are by default informed by individuals' academic and professional training; thus, past theory should not be considered a contaminating factor for the data and their interpretation (as Grounded Theory proposes) but an integral part of it. The key findings of the research concern (1) the unconventional usage of financial and other formal methodologies in TIDM practice, (2) the highly political role of dominant expert groups and the resulting dynamics of their development, (3) the influence of wider economic cycles on how technological value is perceived, and (4) the changing role of the Finance function in technological investment justification. The core conclusion from these points is that TIDM in UK banks is an act of justification and advocacy far more than it is an assessment process; valuation techniques play an ancillary role in ascertaining views often founded on purely strategic or political grounds. The research recommends an interdisciplinary approach to improving TIDM methodologies. Unlike the traditional paradigm, which might be characterised as improvable measurement, where measurement precision is sought as the solution to valuation ambiguity, it is proposed that we seek improvement by taking explicit account of the perceptions of expert groups, as these are encoded into existing formal methodologies and thus offer only partial evaluations. By mobilising these partialities, newer approaches may provide for including socio-political as well as economic factors in technological valuation processes.
Why better models do not always lead to better decisions
It was the environmental academic Jerome Ravetz who said: "We believe in numbers, just because they are numbers." Other science historians, including Theodore Porter, have long held the same view. This statement could not be more relevant as the global economy recovers from the biggest financial crisis since 1929. Many in management privately agree that we use numerical decision models with near-religious devotion, to convince others that our decisions are scientifically robust, since few will doubt a well-founded, widely accepted mathematical model: we call on the legitimacy of arithmetic to persuade rather than prove.
Practitioners, observers and the community of received wisdom: The actor-based approach to technological investment decisions
This paper addresses the problem of technology valuation in UK financial institutions, specifically concerning the introduction of Internet Banking. The research looked into the prescribed processes and the respective established practices for Technological Investment Decision-Making (TIDM) in banks. Significant disparity between process and practice was found, on the grounds that the actual decisions are determined by experts' perceptions and are less about the normative assessment of economic value, as defined in academic literature and corporate handbooks. The research suggests that the TIDM problem is socially constructed (rather than externally addressed) by experts who either participate directly in decision-making or, alternatively, contribute to developing relevant methodologies. The TIDM problem is ultimately defined by the disparate perceptions of the problem that these different interested parties, or "actors", assume. Three classes of actors were identified: (1) Practitioners, namely expert professionals in Financial Institutions, (2) Observers, primarily academic researchers, consultants and government bodies, and (3) the Community of Received Wisdom, reflecting commonly understood views on what TIDM is and how it should be made. According to the Actor-based approach, the shape of the TIDM problem results from continuous negotiations between actors' viewpoints, in light of expert power positions, political advocacy and fit with the prevailing TIDM paradigms. These viewpoints are by default informed by experts' academic and professional backgrounds, which strongly influence both the received understanding of the TIDM problem, and the perceptions of practitioner and research experts.
The paper suggests that the Actor-based approach can contribute to improving TIDM: instead of seeking measurement precision as the solution to the valuation ambiguities that notoriously characterise technological investment, it is proposed that we take explicit account of the differently-informed perceptions of expert groups, as these are encoded into existing formal methodologies. By mobilising these disparities, newer approaches can combine socio-political together with economic factors in technological valuation.
Technological investment decision-making and the anomaly of practice in internet banking assessment
Computer-aided financial fraud detection: Promise and applicability in monitoring financial transaction fraud
Anti-money Laundering (AML) and Financial Fraud Detection (FFD) have been receiving increasing attention in the past few years, especially in light of the global financial crisis. Closer systems integration and a number of recent, rapid technological developments in areas like Big Data, High Frequency Trading, e-payments and mobile payment systems, to name a few, are now promising enhanced risk management through superior decision support for the global financial industry. At the same time, however, existing regulatory frameworks, national and international, appear to lack the connectivity and flexibility required to support integrated AML and FFD approaches. This is strongly evidenced by the disparate technological approaches to FFD across different Financial Institutions and their reluctance to share practice within the industry.
Focusing on Financial Transaction Fraud, this paper draws on the authors' past research work, which presented a prototype system that uses a workflow approach to identify abnormal financial transactions and applies Artificial Intelligence for classification. That work showed successful applicability in small-scale experiments, limited by the widespread concern that information sharing must be achieved within the broader sector in order to achieve improved results. Building on that work, this paper proposes that extending the approach across the transaction infrastructure will deliver higher-quality intelligent monitoring against Financial Transaction Fraud.
Following from that, we argue that the necessary technological maturity exists to support full-scale operable FFD systems working on large, disparate datasets. We then discuss the evidence in favour of the view that such systems can only be realised in the presence of wider regulatory consensus. There is, therefore, a need for a framework within which the technical infrastructure, business architecture and regulatory rules will harness that technological capability to deliver superior fraud prevention.
The paper first reviews computer-aided techniques and approaches for FFD available to the financial sector and discusses the business value of their application. It then addresses the main impediments to their full-scale applicability and uses an analytical framework to assess their significance in technological, business-specific and regulatory terms. A brief account of the authors' workflow-based approach is then provided and its capabilities are outlined.
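The two-stage monitoring idea described above can be illustrated with a minimal sketch: a first workflow stage flags transactions that deviate from an account's historical profile, which would then feed a downstream AI classifier. All names, data and the z-score threshold here are illustrative assumptions, not the authors' actual prototype.

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Transaction:
    account: str
    amount: float

def flag_anomalies(history, incoming, z_threshold=3.0):
    """First workflow stage: flag transactions whose amount deviates
    strongly from the account's historical profile. Flagged items would
    be passed on to an AI classifier in a fuller pipeline."""
    mu, sigma = mean(history), stdev(history)
    flagged = []
    for tx in incoming:
        z = abs(tx.amount - mu) / sigma if sigma else 0.0
        if z > z_threshold:
            flagged.append((tx, z))
    return flagged

# toy account history and two incoming transactions
history = [120.0, 95.0, 110.0, 130.0, 105.0, 99.0, 115.0]
incoming = [Transaction("acc-1", 118.0), Transaction("acc-1", 5000.0)]
for tx, z in flag_anomalies(history, incoming):
    print(f"flag {tx.account}: {tx.amount} (z={z:.1f})")
```

In a sector-wide deployment of the kind the paper argues for, the per-account profile would be replaced by statistics pooled across institutions, which is precisely where the information-sharing and regulatory issues arise.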
The history of banking technologies in the UK: patterns of technological investment decision-making and expertise
The central role of banking in the 2008 credit crisis has been the source of much controversy about the quality and robustness of decision-making in the Financial Services sector. This paper aims to surface the influence of the historical evolution of expertise in the banking sector on such decisions and, in so doing, to underline that decision-making activity is strongly linked to the views of the dominant expert groups in the industry in each era. The paper proposes that Technological Investment Decision Making (TIDM), viewed historically, has been highly contingent on both technological developments in banking and the subsequent developments in banking expertise that provide the pool of decision-makers in the industry. The paper adopts an historical perspective to illustrate that, counter to popular belief, TIDM is a socially constructed process rather than the outcome of a normative exercise. History demonstrates that there is no optimal TIDM method whose rigorous and accurate execution determines successful outcomes. On the contrary, in each era the "right way" to perform TIDM has always been underpinned by the standpoints and beliefs of the specialised practitioners who dominated the UK banking industry, and by the received wisdom of a community of expert professionals, administrators and think tanks dictating "realities" on the state of the economy, the role of banks and the value of technologies.
Impact of mobility in ad hoc protocol design
Protocols in ad hoc networks are not designed with mobility in mind. Recent research reveals that mobility impacts all layers of the protocol stack. Specifically, more realistic mobility models, extracted from real user traces for vehicular and pedestrian scenarios, show that wireless nodes tend to cluster around popular locations. The contributions of this paper are two-fold. First, it suggests cross-layer design as a promising approach to designing ad hoc protocols with mobility in mind, and accordingly provides a survey of the methodologies used in wireless cross-layer studies. Second, it presents a framework for cross-layer and flexible ad hoc protocol design, which integrates mobility into protocol design.
Towards the ensemble: IPCBR model in investigating financial bubbles
Asset value predictability remains a major research concern in financial markets, especially when considering the effect of unprecedented market fluctuations on the behaviour of market participants.
This paper presents preliminary results toward building a reliable forward problem with the ensemble IPCBR model, which leverages the capabilities of Case-Based Reasoning (CBR) and Inverse Problem Techniques (IPTs) to describe and model abnormal stock market fluctuations (often associated with asset bubbles) using datasets of historical stock market prices. The framework uses a rich set of past observations and geometric pattern descriptions, and then applies CBR to formulate the forward problem. Inverse problem formulation is then applied to identify a set of parameters that can statistically be associated with the occurrence of the observed patterns.
This research work presents a formative strategy aimed at determining the causes of behaviour rather than predicting future time-series points, which brings a novel perspective to the problem of asset bubble predictability and a deviation from the existing research trend. The results depict the stock dynamics and the statistical fluctuating evidence associated with the envisaged bubble problem.
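The CBR retrieval step underlying the forward problem can be sketched minimally: encode a price window as a small geometric feature vector and retrieve the nearest past case. The feature encoding, case base and labels below are invented for illustration and are not drawn from the paper.

```python
import math

# A "case" pairs a pattern description of a past price window
# (here: simple summary features) with the outcome observed afterwards.
case_base = [
    {"features": (0.02, 0.01, 0.40), "outcome": "normal"},
    {"features": (0.30, 0.06, 0.95), "outcome": "bubble-like"},
    {"features": (0.20, 0.05, 0.90), "outcome": "bubble-like"},
]

def describe(prices):
    """Encode a price window as (total growth, volatility proxy,
    persistence of return direction)."""
    rets = [(b - a) / a for a, b in zip(prices, prices[1:])]
    growth = (prices[-1] - prices[0]) / prices[0]
    vol = math.sqrt(sum(r * r for r in rets) / len(rets))
    persistence = sum(1 for a, b in zip(rets, rets[1:]) if a * b > 0) / (len(rets) - 1)
    return (growth, vol, persistence)

def retrieve(query, cases):
    """CBR retrieval: nearest past case by Euclidean distance in feature space."""
    return min(cases, key=lambda c: math.dist(query, c["features"]))

window = [100, 104, 109, 115, 123, 133]  # steadily accelerating prices
print(retrieve(describe(window), case_base)["outcome"])  # prints "bubble-like"
```

The inverse-problem step would then work backwards from such retrieved patterns to the parameters that statistically generate them; that optimisation is beyond this sketch.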
Chapter 4. Regtech frontiers: innovations, trends, and insights redefining compliance
This chapter discusses the contributions and challenges involving regulatory technology (regtech) in financial services. It explores the salient areas where regtech can and should focus, observing existing and forthcoming industry, technology, and legal developments. The chapter outlines regtech use cases to clarify how that industry sector is taking shape. It draws on developments in industry and academia, where significant research sets the tone and direction of technological solutions and regulatory drivers. A brief critical account of the benefits and challenges of regtech is offered. The chapter presents potential future directions, focusing on the salient areas of environmental, social, and governance (ESG), cryptocurrency, and decentralized compliance.
Using BiLSTM networks for context-aware deep sensitivity labelling on conversational data
Information privacy is a critical design feature for any exchange system, with privacy-preserving applications requiring, most of the time, the identification and labelling of sensitive information. However, privacy and the concept of "sensitive information" are extremely elusive terms, as they are heavily dependent upon the context they are conveyed in. To accommodate such specificity, we first introduce a taxonomy of four context classes to categorise relationships of terms with their textual surroundings by meaning, interaction, precedence, and preference. We then propose a predictive context-aware model based on a Bidirectional Long Short Term Memory network with Conditional Random Fields (BiLSTM + CRF) to identify and label sensitive information in conversational data (multi-class sensitivity labelling). We train our model on a synthetic annotated dataset of real-world conversational data categorised in 13 sensitivity classes that we derive from the P3P standard. We parameterise and run a series of experiments featuring word and character embeddings and introduce a set of auxiliary features to improve model performance. Our results demonstrate that the BiLSTM + CRF model architecture with BERT embeddings and WordShape features is the most effective (F1 score 96.73%). Evaluation of the model is conducted under both temporal and semantic contexts, achieving a 76.33% F1 score on unseen data and outperforming Google's Data Loss Prevention (DLP) system on sensitivity labelling tasks.
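The CRF layer in a BiLSTM + CRF architecture decodes the best label sequence from the per-token scores the BiLSTM emits, using Viterbi dynamic programming over label-transition scores. A minimal pure-Python sketch of that decoding step follows; the label set, transition weights and toy emission scores are invented for illustration, not taken from the trained model in the paper.

```python
def viterbi(emissions, transitions, labels):
    """CRF-style decoding: choose the label sequence maximising the sum of
    per-token emission scores (what a BiLSTM head would output) and
    label-to-label transition scores."""
    # best[t][y] = (best path score ending in label y at token t, backpointer)
    best = [{y: (emissions[0][y], None) for y in labels}]
    for t in range(1, len(emissions)):
        row = {}
        for y in labels:
            prev, score = max(
                ((p, best[-1][p][0] + transitions[(p, y)]) for p in labels),
                key=lambda item: item[1],
            )
            row[y] = (score + emissions[t][y], prev)
        best.append(row)
    # trace the highest-scoring path backwards
    label = max(labels, key=lambda y: best[-1][y][0])
    path = [label]
    for t in range(len(emissions) - 1, 0, -1):
        label = best[t][label][1]
        path.append(label)
    return list(reversed(path))

labels = ["O", "SENS"]
# transitions reward staying in the same label (sensitive spans are contiguous)
transitions = {("O", "O"): 1.0, ("O", "SENS"): 0.0,
               ("SENS", "O"): 0.0, ("SENS", "SENS"): 1.0}
# toy emission scores for three tokens of a conversation turn
emissions = [{"O": 2.0, "SENS": 0.0},
             {"O": 0.5, "SENS": 1.0},
             {"O": 0.0, "SENS": 2.0}]
print(viterbi(emissions, transitions, labels))  # prints ['O', 'SENS', 'SENS']
```

The transition scores are what distinguishes this from per-token softmax labelling: the middle token's weak emission is resolved in favour of "SENS" because the path score rewards a contiguous sensitive span.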