16 research outputs found

    Enterprise resource planning systems implementation and the implications for the internal audit function

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Corporate governance has received increased attention from both regulators and researchers in recent years, highlighting the significance of the internal audit function (IAF). Another transformative force on the IAF has been the spread of enterprise resource planning (ERP) systems, which threaten the legitimacy of the IAF if it is not suitably adapted. However, there is insufficient knowledge about the adaptations required if the IAF is to maintain its essential role in governance. This thesis extends our knowledge by exploring and theorising the adaptation of the IAF after ERP introduction. It uses institutional theory as a lens through which to investigate how the IAF responds to external governance pressures and to the internal pressures of the control logic following the introduction of an ERP system. Data were gathered from two listed companies in the food and beverage sector and two large banks operating in Egypt, where one of each pair is an international company and the other a national company. Interviews and focus groups were conducted with all stakeholders, complemented by careful analysis of internal and external documents related to the ERP and the IAF. The study finds that governance pressures related to the IAF determine its legitimisation criteria. There is little coercive governance pressure on the IAF in Egypt; however, international companies with operations in Egypt have introduced normative governance pressures as a result of their compliance with stock exchange rules in other jurisdictions, and mimetic behaviour has helped to transfer the IAF response to ERP implementation. ERP systems carry new control logics based on interlinked assumptions, which have affected the IAF.
The ERP system’s control logic is aligned with corporate governance goals and objectives, but further alignment is needed to make the best use of the ERP system in enhancing internal control. The introduction of an ERP system produces uncertainty about the IAF’s activities, which motivates it to adapt by changing its practice and structure. The changes in the IAF depend on the strategic responses adopted by the auditors, which range from acquiescence to defiance, and these responses were found to change over time. The differences in responses result in different outcomes for IAF adaptation: in the international companies the implementation of an ERP system motivated the IAF to become integrated and adopt a comprehensive scope, whereas in the national companies change was resisted and the role of the IAF was significantly diminished. The IAF’s legitimacy-maintaining strategies depend on the coercive and normative governance pressures, which give direction on how to maintain legitimacy. This study offers an explanation of how information systems contribute to the IAF’s professional stability or change, and of how macro-governance pressures can bind micro-IAF practice within organisations.

    Metaverse. Old urban issues in new virtual cities

    Recent years have seen the rise of early attempts to build virtual cities, utopias or affective dystopias in an embodied Internet, which in some respects appear to be the ultimate expression of the neoliberal city paradigm (even if virtual). Although there is an extensive disciplinary literature on the relationship between planning and virtual or augmented reality, linked mainly to the gaming industry, it often avoids design and value issues. The observation of some of these early experiences - Decentraland, Minecraft, Liberland Metaverse, to name a few - poses important questions and problems that are gradually becoming inescapable for designers and urban planners, and allows some partial considerations on the risks and potentialities of these early virtual cities.

    ANALYSING INTEGRATED PUBLIC FINANCIAL MANAGEMENT REFORMS: A CASE STUDY OF GHANA

    Public sector accounting reforms have a vast potential to impact the developing world (Hopper et al., 2016), but the explication of reform performance is only partial within the neo-classical economics framework. Accounting scholars have called for a deeper understanding of the context and conditions of accounting reforms in less developed countries and emerging economies (LDCs/EEs) (van Helden and Uddin, 2016; Nyamori et al., 2017; Hopper et al., 2016; Goddard and Mkasiwa, 2016; Manning and McCourt, 2013; Abdul-Rahaman et al., 1997). This research sets out to understand the context of public financial management (PFM) reforms using Ghana as an example of the LDCs/EEs. The tools embedded in the institutional logics approach (ILA) are mobilised, as a meta-theoretical framework encompassing symbolic interactionism and grounded theory, to offer a constructivist-interpretivist account of the reforms. The research found that accounting changes through the PFM reforms in Ghana have been challenging to implement, and reform outcomes have been poor to mediocre. Empirically, this study identified budget credibility as the core category in public sector accounting reforms in Ghana. The reform outcomes are reflected and constituted in the budget logics, such as over-centralization of controls, gaming, and unpredictability of the flow of funds to the ministries, departments and agencies (MDAs), that have constrained the reforms. This study relates the core category to the five other main empirical categories which emerged from the study, namely accountability, professionalism, financial re-engineering, automation, and policy credibility. Longitudinally, the research found that, dating back to the pre-independence era, only limited improvements have been made in the transformation of public financial management practices to support economic development in Ghana.
Opportunities to decentralize financial controls were not taken in the 1980s, and successive recent reforms have only centralized both the reforms and financial management controls. This research explains why sub-optimal accounting practices endure, the paradox of embeddedness, and the constraints on and possibilities of collective action to effect accounting change through reforms. The study examines the dynamics and interplay of budget credibility and the other categorical elements within the “vampire state”, and the impact of the hegemonic influences of the international community on local actors, institutionalised in the polities through the objectification and exteriorisation of culture. The implications of budget credibility are explored further through the development of a substantive theory on the reforms, a processual analysis, and interinstitutional orders comprising national community logics, state logics, and international community logics that shape public sector accounting change in Ghana. This research opens up the possibility of further theoretical and empirical studies in other resource-dependent countries where reforms are influenced by external donor development partners.

    Standardising the USGS volcano alert level system: acting in the context of risk, uncertainty and complexity

    A volcano alert level system (VALS) forms a key component of a volcano early warning system, which is used to communicate warning information from scientists to civil authorities managing volcanic hazards. In 2006, the United States Geological Survey (USGS) standardised its VALS, replacing all locally developed systems with a common standard. The emergence of this standardisation, and the resulting implications, are charted here in the context of managing the scientific complexities and diverse agencies involved in volcanic crises. The VALS concept embodies a linear, reductionist approach to decision-making, designed around warning levels that correspond to levels of volcanic activity. Yet complexities emerge as a consequence of the uncertain nature of the physical hazard, the contingencies of local institutional dynamics, and the plural social contexts within which each VALS is embedded, challenging its responsiveness to local knowledge and context. Research conducted at five USGS-managed volcano observatories in Alaska, Cascades, Hawaii, Long Valley, and Yellowstone explores the benefits and limitations standardisation brings to each observatory. It concludes that standardisation is difficult to implement for three reasons. Firstly, conceptually, natural hazard warning systems are complex and non-linear, and the VALS intervenes in an overall system characterised by emergent properties and the interaction of many agents, for which forecasting and prediction are difficult. Secondly, pragmatically, the decision to move between alert levels is based upon more than volcanic activity and scientific information, with broader social and environmental risks playing a key role in changing alert levels. Thirdly, empirically, the geographical, social and political context of each volcano observatory results in the standardised VALS being applied in non-standard ways.
It is recommended that, rather than further defining a standardised linear product, VALS should focus on developing systems based upon processes and best practice, designed to facilitate communication and interaction between scientists and users in context.

    A decision framework to mitigate vendor lock-in risks in cloud (SaaS category) migration.

    Cloud computing offers an innovative business model for enterprise IT service consumption and delivery. However, vendor lock-in is recognised as a major barrier to the adoption of cloud computing, due to the lack of standardisation. So far, solutions and efforts tackling the vendor lock-in problem have been predominantly technology-oriented. Few studies exist that analyse and highlight the complexity of the vendor lock-in problem in the cloud environment. Consequently, customers are unaware of the proprietary standards which inhibit the interoperability and portability of applications when taking services from vendors. The complexity of the service offerings makes it imperative for businesses to use a clear and well-understood decision process to procure, migrate and/or discontinue cloud services. To date, the expertise and technological solutions to simplify such transitions and facilitate good decision making to avoid lock-in risks in the cloud are limited. Moreover, few research investigations have provided a cloud migration decision framework to assist enterprises in avoiding lock-in risks when implementing cloud-based Software-as-a-Service (SaaS) solutions within existing environments. Such a decision framework is important to reduce complexity and variations in implementation patterns on the cloud provider side, while at the same time minimizing potential switching costs for enterprises by resolving integration issues with existing IT infrastructures. Thus, the purpose of this thesis is to propose a decision framework to mitigate vendor lock-in risks in cloud (SaaS) migration. The framework follows a systematic literature review and analysis to present research findings containing factual and objective information, and business requirements for vendor-neutral interoperable cloud services, and/or for making architectural decisions for secure cloud migration and integration.
The underlying research procedure consists of a survey based on qualitative and quantitative approaches, conducted to identify the main risk factors that give rise to cloud computing lock-in situations. Epistemologically, the research design consists of two distinct phases. In phase 1, qualitative data were collected using open-ended interviews with IT practitioners to explore the business-related issues of vendor lock-in affecting cloud adoption. The goal of phase 2 was to identify and evaluate the risks and opportunities of lock-in which affect stakeholders’ decision-making about migrating to cloud-based solutions. In synthesis, the survey analysis and the framework proposed by this research, through its step-by-step approach, provide guidance on how enterprises can avoid being locked in to individual cloud service providers. This reduces the risk of dependency on a cloud provider for service provision, especially if data portability, as the most fundamental aspect, is not enabled. It also ensures appropriate pre-planning and due diligence, so that the cloud service provider(s) with the most acceptable vendor lock-in risks are chosen, and that the impact on the business is properly understood (upfront), managed (iteratively), and controlled (periodically). Each decision step within the framework prepares the way for the subsequent one, helping a company gather the correct information to make the right decision before proceeding to the next step. The reason for such an approach is to support an organisation in planning and adapting the services to suit its business requirements and objectives. Furthermore, several strategies are proposed on how to avoid and mitigate lock-in risks when migrating to cloud computing.
The strategies relate to contracts, the selection of vendors that support standardised formats and protocols for data structures and APIs, negotiating service level agreements (SLAs) accordingly, and developing awareness of commonalities and dependencies among cloud-based solutions. The implementation of the proposed strategies and supporting framework has great potential to reduce the risks of vendor lock-in.
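The sequential, gated character of such a decision process — each step gathering the information the next one needs, and migration halting as soon as a criterion fails — can be sketched in a few lines. The step names, data fields and threshold below are illustrative assumptions for the sketch, not the framework's actual criteria:

```python
# A minimal sketch of a gated, step-by-step migration decision process.
# Step names and the SLA threshold are hypothetical, not from the thesis.
def run_framework(candidate, steps):
    """Run each decision step in order; stop at the first failure,
    since each step prepares the way for the subsequent one."""
    for name, check in steps:
        if not check(candidate):
            return f"stop: failed '{name}'"
    return "proceed with migration"

steps = [
    ("data portability", lambda c: c["export_formats"]),      # can we get data out?
    ("standard APIs",    lambda c: c["standard_apis"]),       # interoperable interfaces?
    ("acceptable SLA",   lambda c: c["sla_uptime"] >= 99.9),  # contractual terms
]

saas = {"export_formats": ["csv", "json"], "standard_apis": True, "sla_uptime": 99.95}
print(run_framework(saas, steps))  # → proceed with migration
```

A provider that fails an early gate (for example, offering no data export formats) is rejected before later, costlier checks are run — mirroring the pre-planning and due-diligence emphasis described above.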

    The Evolution of Smart Buildings: An Industrial Perspective of the Development of Smart Buildings in the 2010s

    Over the course of the 2010s, specialist research bodies failed to provide a holistic view of the changes in the prominent, industry-driven reasons for creating a smart building, with research tending to remain deeply involved in single issues or value drivers. Through an analysis of the author’s peer-reviewed and published works (book chapters, articles, essays and podcasts), supplemented with additional contextual academic literature, a model of how the key drivers for creating a smart building evolved in industry during the 2010s is presented. The critical research commentary within this thesis tracks the incremental advances of technology and their application to the built environment via academic movements, industrial shifts, and the author’s personal contributions. The thesis demonstrates, through the chronology and publication dates of the included research papers, that as the financial cost and complexity of sensors and cloud computing fell, smart buildings became increasingly prevalent. Initially, sustainability was the primary focus, with the use of HVAC analytics and advanced metering in the early 2010s. The middle of the decade saw an economic transformation of the commercial office sector, and the driver for creating a smart building became the delivery of flexible yet quantifiably used space. Driven by society’s emphasis on health, wellbeing and productivity, smart buildings pivoted their focus towards the end of the 2010s: smart building technologies were required to demonstrate the impacts of architecture on the human. This research has evidenced that smart buildings use data to improve performance in sustainability, in space usage, or for human-centric outcomes.

    Timely and reliable evaluation of the effects of interventions: a framework for adaptive meta-analysis (FAME)

    Most systematic reviews are retrospective and use aggregate data (AD) from publications, meaning they can be unreliable, lag behind therapeutic developments and fail to influence ongoing or new trials. Commonly, the potential influence of unpublished or ongoing trials is overlooked when interpreting results, or when determining the value of updating the meta-analysis or the need to collect individual participant data (IPD). Therefore, we developed a Framework for Adaptive Meta-analysis (FAME) to determine prospectively the earliest opportunity for reliable AD meta-analysis. We illustrate FAME using two systematic reviews in men with metastatic (M1) and non-metastatic (M0) hormone-sensitive prostate cancer (HSPC).
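Aggregate-data meta-analysis of the kind FAME builds on typically pools trial-level effect estimates by inverse-variance weighting. A minimal sketch of that standard calculation is below; the effect sizes and standard errors are hypothetical numbers for illustration, not results from the reviews described here, and a fixed-effect model is assumed for simplicity:

```python
import math

def fixed_effect_pool(effects, ses):
    """Inverse-variance (fixed-effect) pooling of trial-level effect
    estimates (e.g. log hazard ratios) with their standard errors."""
    weights = [1.0 / se ** 2 for se in ses]          # weight = 1 / variance
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))        # SE of the pooled estimate
    return pooled, pooled_se

# Hypothetical log hazard ratios and standard errors from three trials
effects = [-0.25, -0.10, -0.30]
ses = [0.10, 0.15, 0.20]
est, se = fixed_effect_pool(effects, ses)
lo, hi = est - 1.96 * se, est + 1.96 * se            # 95% confidence interval
print(f"pooled log HR = {est:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

Larger trials (smaller standard errors) dominate the pooled estimate, which is why an unpublished large trial can change the result — the situation FAME is designed to anticipate prospectively.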

    Modelling and analysis of heterogeneous data to improve process flow in the emergency department

    Emergency Departments (EDs) must treat growing numbers of patients quickly and efficiently. However, bottlenecks arise for many reasons, including the lack of information to process patients promptly, the lack of decision-makers, and the lack of timely decision-making, all of which disrupt the smooth flow of processes. Techniques used to address bottlenecks have yielded limited sustainability due to reliance on simplistic models as inputs, which do not account for the complexities and variations of the real system. This study aimed to address bottlenecks by developing a systematic, model-driven approach for assessing ED processes to improve waiting time as measured by the 4-hour quality indicator (4HQI). Using an exploratory framework, this study employed a mixed-method approach in examining heterogeneous data. Semi-structured interviews with 21 ED clinicians were conducted in a level-1 ED of an Acute Trust in the UK. Interview transcripts embedded with systems knowledge were used to develop role activity diagrams (RADs) to capture the granularity of care processes and identify bottlenecks through process mapping. Additionally, service utilisation data were analysed using logistic regression, a generalized linear model, and a decision tree. The impact of changes on waiting time was assessed using Discrete Event Simulation (DES). Process mapping revealed Majors, the unit that treats complex patients, to be the most problematic in the ED, and identified five bottlenecks in the unit: awaiting specialty input, tests outside the ED, awaiting transportation, bed search, and inpatient handover. The process maps further revealed that information available to the ED at the pre-hospital phase, and before entry into Majors, can be better utilised to address bottlenecks, especially those related to awaiting specialty input, tests outside the ED and awaiting transportation.
This led to exploring improvement suggestions that included: (1) introducing an advanced nurse practitioner at triage, (2) utilising pre-hospital information to reduce repeat testing, and (3) operating a discharge lounge. Results from the qualitative and quantitative analysis were integrated into a discrete event simulation (DES) model to evaluate the improvement suggestions, leading to reductions in the length of stay (LOS) for the given scenarios. Several statistical models for predicting LOS and breach of the 4HQI were also developed. The methodology developed entailed (1) qualitative process modelling to derive the systems model, (2) quantitative analysis of audit-level patient data to understand decision-making and patient flow, (3) integration of the qualitative and quantitative analysis results to derive improvement suggestions, and (4) simulation to analyse the suggestions. RADs served as a granular process-mapping technique for bottleneck identification and solution derivation in analysing complex systems, and their application helped to derive realistic models of the system. This is the first study to model the Majors unit. Furthermore, a methodology for indirect mapping of RAD to DES was developed to bridge the gap between the two methods, where RAD provides granular input to complement DES models. Monitoring patients’ length of stay as three time blocks was recommended, in addition to a model-based, data-informed alert system to support decision-making and patient flow. This study sheds light on the scientific and operational development of quality indicators. The identification of Majors as the most crowded unit highlights to ED managers and policymakers an area of focus for improvement initiatives given limited resources. This study modelled and analysed heterogeneous data to improve process flow in the ED; implementing its recommendations would enhance patient flow and reduce bottlenecks, thereby improving waiting times.
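The core of a DES model like the one described above is a queue of patients competing for scarce clinicians, from which lengths of stay and 4-hour breaches fall out. The toy below is a deliberately minimal sketch of that idea (deterministic arrivals, a flat treatment time, two clinicians — all illustrative assumptions, not the thesis's calibrated model):

```python
import heapq

def simulate_ed(arrivals, service_time, servers=2):
    """Tiny discrete event sketch: patients arrive at fixed times,
    are seen by the first free clinician, and we record each
    patient's length of stay (waiting time + treatment time)."""
    free_at = [0.0] * servers            # time each clinician next becomes free
    heapq.heapify(free_at)
    los = []
    for t in arrivals:
        earliest = heapq.heappop(free_at)  # soonest-available clinician
        start = max(t, earliest)           # wait if all clinicians are busy
        finish = start + service_time
        heapq.heappush(free_at, finish)
        los.append(finish - t)             # this patient's length of stay
    return los

# Hypothetical arrival times (minutes) and a flat 60-minute treatment time
los = simulate_ed(arrivals=[0, 10, 20, 30, 40], service_time=60, servers=2)
breaches = sum(1 for x in los if x > 240)  # 4-hour (240-minute) breaches
print(los, breaches)
```

A real model would draw arrival and treatment times from fitted distributions and route patients through multiple units (triage, Majors, discharge), but the mechanics — resource contention driving waiting time, and an indicator computed over simulated lengths of stay — are the same.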