19 research outputs found

    Technology investment decision making: an integrated analysis in UK Internet Banking

    The research addresses the problem of technological investment decision making (TIDM) in UK banks. It focuses on Internet Banking technologies and uses interviews with bank executives and industry practitioners to form a coherent understanding of how technological decisions are made in practice and what role evaluation techniques play in that process. The aims of the research are (1) to identify and explain the discord between formal and practical evaluations of technologies, (2) to review the roles of expert professional groups in defining the norms of evaluation, and (3) to develop a model that reflects the reality of TIDM in UK banking. The ultimate aim is to contribute to reducing the ambiguity that notoriously characterises the evaluation of new technology. According to the theoretical framework, the TIDM problem is socially constructed by expert groups (actors) who either participate in decision-making or assume roles in developing methodologies for facilitating it. Its ultimate shape is the outcome of negotiations between these viewpoints, in light of expert power positions and political advocacy. Three classes of such "actors" are identified: (1) Practitioners, namely experts in financial institutions; (2) Observers, namely academic researchers, consultants and government bodies; and (3) the Community of Received Wisdom, comprising the commonly understood views on what TIDM is and how it should be made. A novel methodological approach is introduced as a variant of Grounded Theory. Called Informed Grounded Theory (IGT), it proposes that viewpoints are by default informed by individuals' academic and professional training; thus, past theory should not be considered a contaminating factor for the data and their interpretation (as Grounded Theory proposes) but an integral part of it. The key findings of the research concern (1) the unconventional usage of financial and other formal methodologies in TIDM practice, (2) the highly political role of dominant expert groups and the resulting dynamics of their development, (3) the influence of wider economic cycles on how technological value is perceived, and (4) the changing role of the Finance function in technological investment justification. The core conclusion from these points is that TIDM in UK banks is an act of justification and advocacy far more than it is an assessment process; valuation techniques play an ancillary role in ascertaining views often founded on purely strategic or political grounds. The research recommends an interdisciplinary approach to improving TIDM methodologies. Unlike the traditional paradigm, which might be characterised as "improvable measurement" and seeks measurement precision as the solution to valuation ambiguity, it is proposed that improvement be sought by taking explicit account of the perceptions of expert groups, as these are encoded into existing formal methodologies and thus offer only partial evaluations. By mobilising these partialities, newer approaches may provide for including socio-political as well as economic factors in technological valuation processes.

    Impact of mobility in ad hoc protocol design

    Protocols in ad hoc networks are typically not designed with mobility in mind. Recent research reveals that mobility impacts all the layers of the protocol stack. Specifically, more realistic mobility models, extracted from real user traces for vehicular and pedestrian scenarios, show that wireless nodes tend to cluster around popular locations. The contributions of this paper are two-fold. First, it suggests cross-layer design as a promising approach to designing ad hoc protocols with mobility in mind, and accordingly provides a survey of the methodologies used in wireless cross-layer studies. Second, it presents a framework for cross-layer and flexible ad hoc protocol design that integrates mobility into protocol design.
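    As an illustration of the cross-layer idea described above, the sketch below shows one way a mobility-derived link-stability estimate, maintained by the lower layers, could be exposed to the routing layer for next-hop selection. All names, constants, and the scoring formula are assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch (not the paper's framework) of a cross-layer hook:
# a shared state object exposes a per-neighbour link-stability estimate,
# derived from observed contact durations at the lower layers, which the
# routing layer then uses when ranking candidate next hops.
from dataclasses import dataclass, field

@dataclass
class CrossLayerState:
    # Mean observed contact duration (seconds) per neighbour, maintained
    # by the MAC/PHY layers as links come and go.
    contact_duration: dict[str, float] = field(default_factory=dict)

    def link_stability(self, neighbour: str) -> float:
        """Normalise contact duration into a [0, 1] stability score."""
        d = self.contact_duration.get(neighbour, 0.0)
        return d / (d + 30.0)  # 30 s half-saturation constant, assumed

def select_next_hop(candidates: dict[str, int], state: CrossLayerState) -> str:
    """Pick the candidate with the best blend of stability and hop count."""
    return max(
        candidates,
        key=lambda n: state.link_stability(n) - 0.1 * candidates[n],
    )

# Usage sketch: two candidate next hops with hop counts to the destination.
state = CrossLayerState(contact_duration={"nodeA": 120.0, "nodeB": 10.0})
print(select_next_hop({"nodeA": 3, "nodeB": 2}, state))
```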

    Towards the ensemble: IPCBR model in investigating financial bubbles

    Asset value predictability remains a major research concern in financial markets, especially when considering the effect of unprecedented market fluctuations on the behaviour of market participants. This paper presents preliminary results towards building a reliable forward problem for the ensemble IPCBR model, which leverages the capabilities of Case-Based Reasoning (CBR) and Inverse Problem Techniques (IPTs) to describe and model abnormal stock market fluctuations (often associated with asset bubbles) using datasets of historical stock market prices. The framework draws on a rich set of past observations and geometric pattern descriptions, applies CBR to formulate the forward problem, and then applies an Inverse Problem formulation to identify a set of parameters that can be statistically associated with the occurrence of the observed patterns. This research presents a formative strategy aimed at determining the causes of behaviour rather than predicting future time-series points, which brings a novel perspective to the problem of asset bubble predictability and marks a departure from the existing research trend. The results depict the stock dynamics and the statistical evidence of fluctuations associated with the envisaged bubble problem.
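    To make the case-based element concrete, the sketch below illustrates a retrieval step of CBR over geometric descriptors of historical price windows. The descriptors (slope, curvature, volatility), the distance metric, and the synthetic data are assumptions for illustration; they do not reproduce the paper's actual feature set or datasets.

```python
# Minimal, illustrative CBR retrieval over geometric descriptors of
# price windows; feature choices and data are assumed, not from the paper.
import numpy as np

def describe(window: np.ndarray) -> np.ndarray:
    """Geometric descriptor of a price window: slope, curvature, volatility."""
    x = np.arange(len(window))
    slope = np.polyfit(x, window, 1)[0]
    curvature = np.polyfit(x, window, 2)[0]
    volatility = np.std(np.diff(window) / window[:-1])
    return np.array([slope, curvature, volatility])

def retrieve(case_base: list[np.ndarray], query: np.ndarray, k: int = 3):
    """Return indices of the k historical cases closest to the query window."""
    q = describe(query)
    dists = [np.linalg.norm(describe(c) - q) for c in case_base]
    return np.argsort(dists)[:k]

# Usage sketch with synthetic random-walk price windows.
rng = np.random.default_rng(0)
history = [100 + np.cumsum(rng.normal(0, 1, 60)) for _ in range(50)]
query_window = 100 + np.cumsum(rng.normal(0.5, 1, 60))  # a steep run-up
print(retrieve(history, query_window))
```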

    An event-driven serverless ETL pipeline on AWS

    This work presents a serverless architecture for an event-driven Extract, Transform, and Load (ETL) pipeline and provides an evaluation of its performance over a range of dataflow tasks of varying frequency, velocity, and payload size. We design an experiment using generated tabular data across varying data volumes, event frequencies, and processing power in order to measure: (i) the consistency of pipeline executions; (ii) the reliability of data delivery; (iii) the maximum payload size per pipeline; and (iv) economic scalability (cost of chargeable tasks). We run 92 parameterised experiments on a simple AWS architecture, avoiding any AWS-enhanced platform features, to allow for an unbiased assessment of our model’s performance. Our results indicate that our reference architecture can achieve time-consistent data processing of event payloads of more than 100 MB, with a throughput of 750 KB/s across four event frequencies. It is also observed that, although the use of an SQS queue for data transfer enables easy concurrency control and data slicing, it becomes a bottleneck for large event payloads. Finally, we develop and discuss a candidate pricing model for usage of our reference architecture.
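    As a concrete illustration of one stage of such a pipeline, the sketch below shows a minimal AWS Lambda handler triggered by an SQS message that carries a slice of tabular data, transforms it, and writes the result to S3. The bucket name, payload layout, and transformation are assumptions for illustration and do not describe the paper's reference architecture.

```python
# Hypothetical sketch of one event-driven ETL stage: an SQS-triggered
# Lambda that transforms a batch of rows and stores the result in S3.
import csv
import io
import json
import boto3

s3 = boto3.client("s3")
OUTPUT_BUCKET = "etl-pipeline-output"  # assumed bucket name

def transform(row: dict) -> dict:
    """Example transform step: normalise keys and strip whitespace."""
    return {k.lower().strip(): v.strip() for k, v in row.items()}

def handler(event, context):
    # Each SQS record carries a JSON payload with a batch of rows.
    for record in event["Records"]:
        payload = json.loads(record["body"])
        rows = [transform(r) for r in payload["rows"]]

        # Serialise the transformed slice to CSV and load it into S3.
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
        s3.put_object(
            Bucket=OUTPUT_BUCKET,
            Key=f"processed/{payload['batch_id']}.csv",
            Body=buf.getvalue().encode("utf-8"),
        )
    return {"processed_records": len(event["Records"])}
```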