
    Preliminary Results in a Multi-site Empirical Study on Cross-organizational ERP Size and Effort Estimation

    This paper reports on initial findings from an empirical study carried out with representatives of two ERP vendors, six ERP-adopting organizations, four ERP implementation consulting companies, and two ERP research and advisory services firms. Our study’s goal was to gain an understanding of the state of the practice in size and effort estimation for cross-organizational ERP projects. Based on key size and effort estimation challenges identified in a previously published literature survey, we explored some of the difficulties, fallacies and pitfalls these organizations face. We focused on collecting empirical evidence from the participating ERP market players to assess specific facts about state-of-the-art ERP size and effort estimation practices. Our study adopted a qualitative research method based on an asynchronous online focus group.

    Time-varying persistence in US inflation

    The persistence property of inflation is an important issue not only for economists, but especially for central banks, given that the degree of inflation persistence determines the extent to which central banks can control inflation. Further, it is not only the level of inflation persistence that matters in economic analyses; the question of whether persistence varies over time, for instance across business cycle phases, is equally pertinent, since assuming constant persistence across states of the economy is sure to lead to misguided policy decisions. Against this backdrop, we extend the literature on long-memory models of inflation persistence for the US economy over the monthly period 1920:1–2014:5 by developing an autoregressive fractionally integrated moving average–generalized autoregressive conditional heteroskedasticity (ARFIMA-GARCH) model with a memory coefficient that varies across expansions and recessions. In sum, we find that inflation persistence does vary across recessions and expansions, being significantly higher in the former than in the latter. As an aside, we also show that the persistence of inflation volatility is higher during expansions than in recessions. Understandably, our results have important policy implications.
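
    A minimal sketch of the kind of specification described above, in generic notation (the paper's exact lag orders, parameterization, and estimation details are not reproduced here): an ARFIMA mean equation for inflation with a regime-dependent memory parameter, coupled with a GARCH(1,1) variance equation.

```latex
% Hedged sketch: regime-dependent ARFIMA(0, d_t, 0)-GARCH(1,1) for inflation \pi_t.
% I_t is a recession indicator (1 in recessions, 0 in expansions);
% d_E and d_R are the expansion and recession memory parameters.
\begin{align}
(1-L)^{d_t}(\pi_t - \mu) &= \varepsilon_t,
    \qquad \varepsilon_t = \sigma_t z_t, \quad z_t \sim \mathrm{i.i.d.}(0,1),\\
d_t &= d_E\,(1 - I_t) + d_R\, I_t,\\
\sigma_t^2 &= \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2 .
\end{align}
```

    In this notation the headline finding reads d_R > d_E (higher inflation persistence in recessions); capturing the second finding, that volatility persistence is higher in expansions, would require an analogous regime dependence in the variance equation, which this sketch omits.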

    International financial centres, office market rents and volatility

    Despite continuing developments in information technology and the growing economic significance of the emerging Eastern European, South American and Asian economies, international financial activity remains strongly concentrated in a relatively small number of international financial centres. That concentration of financial activity requires a critical mass of office occupation and creates demand for high-specification, high-cost space. The demand for that space is increasingly linked to the fortunes of global capital markets. That linkage has been emphasised by developments in real estate markets, notably the expansion of global real estate investment, innovation in property investment vehicles and the growth of debt securitisation. The resultant interlinking of occupier, asset, debt and development markets within and across global financial centres is a source of potential volatility and risk. The paper sets out a broad conceptual model of these linkages and their implications for systemic market risk, and presents preliminary empirical results that support the proposed model.

    The Natural-Rate Hypothesis, the Rational-Expectations Hypothesis, and the Remarkable Survival of Non-Market-Clearing Assumptions

    Non-market-clearing models continue to dominate the analysis of macroeconomic fluctuations and discussions of macroeconomic policy. This situation is remarkable because non-market-clearing assumptions seem inconsistent with the essential presumption of neoclassical economic analysis that market outcomes exhaust opportunities for mutually advantageous exchange. Non-market-clearing models apparently have survived because they have evolved to incorporate both the natural-rate hypothesis and the rational-expectations hypothesis, and because the alternative "equilibrium" approach has failed empirically. This paper expands on these ideas and briefly discusses some of the problems that we face in attempting to evaluate the recent vintage of non-market-clearing models empirically. The main difficulties seem to involve accounting for shifts in the natural levels of real aggregates and specifying the timing of the past anticipations that determine the effects of current monetary policy.

    An Empirical Study of Operational Performance Parity Following Enterprise System Deployment

    This paper presents an empirical investigation into whether the implementation of packaged Enterprise Systems (ES) leads to parity in operational performance. Performance change and parity in operational performance are investigated in three geographically defined operating regions of a single firm. Order lead time, the elapsed time between receipt of an order and shipment to a customer, is used as the measure of operational performance. A single ES installation was deployed across all regions of the subject firm's operations. Findings illustrate parity as an immediate consequence of ES deployment. However, differences in rates of performance improvement following deployment eventually result in significant (albeit smaller than pre-deployment) performance differences. An additional consequence of deployment seems to be an increased synchronization of performance across the formerly independent regions.
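
    As a small illustration of the performance measure used here, the sketch below computes mean order lead time per operating region from order-level timestamps; the column names, regions, and data are hypothetical and not taken from the study.

```python
import pandas as pd

# Hypothetical order-level records: receipt and shipment timestamps plus region.
orders = pd.DataFrame({
    "region": ["North", "North", "South", "West", "South"],
    "received": pd.to_datetime(
        ["2024-01-02", "2024-01-05", "2024-01-03", "2024-01-04", "2024-01-08"]),
    "shipped": pd.to_datetime(
        ["2024-01-09", "2024-01-08", "2024-01-12", "2024-01-06", "2024-01-15"]),
})

# Order lead time: elapsed time between receipt of an order and shipment.
orders["lead_time_days"] = (orders["shipped"] - orders["received"]).dt.days

# Comparing this statistic across regions, before and after ES deployment,
# is the kind of comparison on which the parity analysis rests.
print(orders.groupby("region")["lead_time_days"].mean())
```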

    Validating adequacy and suitability of business-IT alignment criteria in an inter-enterprise maturity model

    Aligning the requirements of a business with its information technology is currently a major issue in enterprise computing. The existing literature identifies important criteria for judging the level of alignment between business and IT within a single enterprise. However, identifying such criteria in an inter-enterprise setting – or re-thinking the existing ones – is hardly addressed at all. Business-IT alignment in such settings poses new challenges because, in inter-enterprise collaborations, alignment is driven by economic processes rather than by centralized decision-making processes. In our research, we develop a maturity model for business-IT alignment in inter-enterprise settings that takes this difference into account. In this paper, we report on a multi-method approach we devised to validate the business-IT alignment criteria included in the maturity model. As independent feedback is critical for our validation, we used a focus group session and a case study as instruments to take the first step in validating these criteria. We present how we applied our approach, what we learnt, and the implications for our model.

    The price impact of economic news, private information and trading intensity

    In this paper we use three years of high-frequency data to investigate the role played by public and private information in the process of price formation in two secondary government bond markets. As public information, we examine the impact of regularly scheduled macroeconomic news announcements and identify those announcements with the greatest impact on these markets. As private information, we estimate the price impact of order flow. According to microstructure models, private information in this context is related to the subjective evaluation of information, and order flow can reflect differences of opinion among market participants. Thus, market participants may infer information about the subjective beliefs of other market participants by looking at the aggregate order flow. We then use a vector autoregressive model for prices and trades to empirically test the role played by intraday trading intensity and by the waiting time between consecutive transactions in the process of price formation.
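
    A minimal sketch of a price–trade vector autoregression in this spirit, using statsmodels on placeholder data; the variable definitions (mid-quote returns, signed order flow, a trading-intensity proxy), the lag choice, and the data are assumptions rather than the paper's specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 500

# Placeholder intraday series: signed order flow, a trading-intensity proxy,
# and mid-quote returns that partly load on order flow.
order_flow = rng.normal(size=n)
intensity = rng.exponential(scale=1.0, size=n)
returns = 0.3 * order_flow + rng.normal(scale=0.5, size=n)

data = pd.DataFrame({
    "returns": returns,
    "order_flow": order_flow,
    "intensity": intensity,
})

# Fit the VAR with lag length chosen by AIC, then trace impulse responses.
results = VAR(data).fit(maxlags=10, ic="aic")
irf = results.irf(20)

# The cumulative response of returns to an order-flow innovation after 20
# steps is one way to summarise the price impact of trades.
print(results.summary())
print(irf.cum_effects[-1])
```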

    Combining long memory and level shifts in modeling and forecasting the volatility of asset returns

    We propose a parametric state space model of asset return volatility with an accompanying estimation and forecasting framework that allows for ARFIMA dynamics, random level shifts and measurement errors. The Kalman filter is used to construct the state-augmented likelihood function and subsequently to generate forecasts, which are mean- and path-corrected. We apply our model to eight daily volatility series constructed from both high-frequency and daily returns. Full-sample parameter estimates reveal that random level shifts are present in all series. Genuine long memory is present in high-frequency measures of volatility, whereas little dynamic structure remains in the volatility measures constructed from daily returns. From extensive forecast evaluations, we find that our ARFIMA model with random level shifts consistently belongs to the 10% Model Confidence Set across a variety of forecast horizons, asset classes, and volatility measures. The gains in forecast accuracy can be very pronounced, especially at longer horizons.
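
    A hedged sketch of a volatility decomposition in this spirit, in generic notation (the exact state-space form, the truncation of the fractional filter, and the mean- and path-correction steps are not reproduced here): measured log volatility y_t is the sum of a random-level-shift component, a long-memory component, and measurement error.

```latex
% Sketch: log measured volatility = level shifts + long memory + measurement noise.
\begin{align}
y_t &= \tau_t + u_t + e_t,
    & e_t &\sim \mathrm{N}(0,\sigma_e^2) \quad \text{(measurement error)},\\
\tau_t &= \tau_{t-1} + \delta_t \eta_t,
    & \delta_t &\sim \mathrm{Bernoulli}(p), \ \eta_t \sim \mathrm{N}(0,\sigma_\eta^2) \quad \text{(random level shifts)},\\
(1-L)^{d} u_t &= \varepsilon_t,
    & \varepsilon_t &\sim \mathrm{N}(0,\sigma_\varepsilon^2) \quad \text{(ARFIMA long-memory dynamics)}.
\end{align}
```

    Writing a suitably truncated version of the fractional filter in state-space form is what lets the Kalman filter evaluate the state-augmented likelihood and generate the forecasts described above.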

    Building on CHASM: A Study of Using Counts for the Analysis of Static Models of Processes

    Process modelling is gaining increasing acceptance among software engineers as a useful discipline to facilitate both process understanding and improvement activities. This position paper builds upon previous work reported at the 1997 ICSE workshop on process models and empirical studies of software engineering (Phalp and Counsell 1997). In that paper, we argued that simple counts could be used to support the analysis of static process models, and we illustrated the idea with a coupling measure for Role Activity Diagrams, a graphical process modelling notation adapted from Petri Nets. At that time only limited empirical work had been carried out, based upon a single industrial study, in which we found high levels of coupling in an inefficient process (a more thorough description may be found in (Phalp and Shepperd 1999)). We now summarise a more recent study that applies a similar analysis of process coupling, again based on simple counts. In this study, we compared ten software prototyping processes drawn from eight different organisations. We found that this approach does yield insights into process problems that could be missed by qualitative analysis alone. This is particularly so when analysing real-world processes, which are frequently more complex than their textbook counterparts. One notable finding was that, despite differences in size and domain, role types across the organisations exhibited similar levels of coupling. Furthermore, where there were deviations in one particular role type, this led us to discover a relationship between project size and the coupling levels within that type of role. Given the simplicity of our approach and the complexity of many real-world processes, we argue that quantitative analysis of process models should be considered as a process analysis technique.
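
    As an illustration of the count-based analysis described above, the sketch below computes a simple coupling ratio per role from a Role Activity Diagram abstracted as cross-role interactions and role-internal actions; the actual measure of Phalp and Counsell (1997) may be defined differently, so treat this as an assumed, simplified variant.

```python
from collections import defaultdict

def role_coupling(interactions, internal_actions):
    """Simple coupling ratio per role (illustrative, not the paper's exact metric).

    interactions: list of (role_a, role_b) pairs, one per cross-role interaction.
    internal_actions: dict mapping role -> count of actions internal to that role.
    Returns: dict mapping role -> cross-role interactions / total activities.
    """
    cross = defaultdict(int)
    for a, b in interactions:
        cross[a] += 1
        cross[b] += 1

    coupling = {}
    for role, internal in internal_actions.items():
        total = cross[role] + internal
        coupling[role] = cross[role] / total if total else 0.0
    return coupling

# Example: a coordinating role that interacts with every other role shows a
# higher coupling ratio than a largely self-contained role.
print(role_coupling(
    interactions=[("PM", "Dev"), ("PM", "QA"), ("Dev", "QA")],
    internal_actions={"PM": 2, "Dev": 6, "QA": 4},
))
```

    Comparing such ratios across role types and organisations is the kind of quantitative check that, the paper argues, can surface process problems missed by qualitative inspection alone.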