    Dynamic Activity Analysis Model Based Win-Win Development Forecasting Under the Environmental Regulation in China

    The Porter Hypothesis states that environmental regulation may create win-win opportunities, that is, improve productivity and reduce undesirable output simultaneously. Based on the directional distance function, this paper proposes a novel dynamic activity analysis model to forecast the possibilities of win-win development in Chinese industry between 2009 and 2049. The evidence reveals that appropriate energy-saving and emission-abating regulation will result both in an improvement in the net growth of potential output and in steadily increasing growth of total factor productivity. This favors the Porter Hypothesis.

    Keywords: Dynamic Activity Analysis Model, Energy-Saving and Emission-Abating, Environmental Regulation, Win-Win Development
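
    The model builds on the directional distance function (DDF), which for a single decision-making unit reduces to a small linear program. The Python sketch below illustrates only that building block, with made-up data for three units and direction g = (1, 1); it is not the paper's dynamic forecasting model.

        import numpy as np
        from scipy.optimize import linprog

        def ddf_beta(x, y, b, k, g=(1.0, 1.0)):
            # Largest beta such that some convex combination of observed units
            # produces at least y_k + beta*g_y good output, emits at most
            # b_k - beta*g_b bad output, and uses at most x_k input.
            n = len(x)
            gy, gb = g
            c = np.zeros(n + 1)
            c[-1] = -1.0                              # linprog minimizes, so -beta
            A_ub = np.array([np.append(-y, gy),       # sum(l*y) >= y_k + beta*gy
                             np.append(b, gb),        # sum(l*b) <= b_k - beta*gb
                             np.append(x, 0.0)])      # sum(l*x) <= x_k
            b_ub = np.array([-y[k], b[k], x[k]])
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
            return res.x[-1]                          # beta = 0: unit is on the frontier

        x = np.array([3.0, 5.0, 4.0])                 # inputs (made up)
        y = np.array([2.0, 4.0, 3.0])                 # desirable outputs
        b = np.array([1.0, 2.0, 2.0])                 # undesirable outputs
        print(ddf_beta(x, y, b, k=2))                 # inefficiency of unit 2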

    The dynamics of hourly electricity prices

    The dynamics of hourly electricity prices in day-ahead markets are an important element of competitive power markets, which were only established in the last decade. In electricity markets, the market microstructure does not allow for continuous trading, since operators require advance notice in order to verify that the schedule is feasible and lies within transmission constraints. Instead, agents have to submit their bids and offers for delivery of electricity for all hours of the next day before a specified market closing time. We suggest the use of dynamic semiparametric factor models (DSFM) for the behavior of hourly electricity prices. We find that a model with three factors is already able to explain a high proportion of the variation in hourly electricity prices. Our analysis also provides insights into the characteristics of the market, in particular with respect to the driving factors of hourly prices and their dynamic behavior through time.

    Keywords: Power Markets, Dynamic Semiparametric Factor Models, Day-ahead Electricity Prices
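
    A DSFM writes the day-t price curve as a sum of time-varying loadings times nonparametric factor functions of the hour. As a rough stand-in for the kernel-based DSFM fit, the sketch below extracts three factors from a simulated days-by-hours panel by plain SVD, purely to illustrate the decomposition; all data and dimensions are made up.

        import numpy as np

        rng = np.random.default_rng(0)
        days, hours, K = 365, 24, 3
        grid = np.linspace(0, 1, hours)
        curves = np.vstack([np.ones(hours),              # level plus two shapes
                            np.sin(2 * np.pi * grid),
                            np.cos(2 * np.pi * grid)])
        load = rng.normal(size=(days, K)) * np.array([3.0, 1.0, 0.5])
        panel = load @ curves + 0.1 * rng.normal(size=(days, hours))

        centered = panel - panel.mean(axis=0)
        U, s, Vt = np.linalg.svd(centered, full_matrices=False)
        Z = U[:, :K] * s[:K]            # daily loadings Z_{t,l}
        m = Vt[:K]                      # hourly factor curves m_l(x)
        explained = (s[:K] ** 2).sum() / (s ** 2).sum()
        print(f"share of variation captured by {K} factors: {explained:.3f}")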

    Adaptive Interest Rate Modelling

    A good description of the dynamics of interest rates is crucial for pricing derivatives and hedging the corresponding risk. Interest rate modelling in an unstable macroeconomic context motivates one-factor models with time-varying parameters. In this paper, the local parametric approach is introduced to adaptively estimate interest rate models. This method can be used generally in time-varying coefficient parametric models. It is used not only to detect jumps and structural breaks, but also to choose, for each time point, the largest time-homogeneous interval within which the coefficients are statistically constant. We apply this adaptive approach to simulations and to real data. Using the three-month Treasury bill rate as a proxy for the short rate, we find that our method can detect both structural changes and stable intervals for homogeneous modelling of the interest rate process. In more unstable macroeconomic periods, the time-homogeneous interval cannot last long. Furthermore, our approach performs well in long-horizon forecasting.

    Keywords: CIR model, Interest rate, Local parametric approach, Time homogeneous interval, Adaptive statistical techniques
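
    The heart of the local parametric approach is to extend the estimation window at each time point for as long as a homogeneity test does not reject parameter constancy. The sketch below is a crude stand-in: it checks stability of the local rate level instead of running the paper's likelihood-ratio tests with simulated critical values, on a simulated short-rate series with one structural break. All function names and numbers are invented for illustration.

        import numpy as np

        def cir_fit(r):
            # OLS on the Euler-discretized CIR drift dr = a*(b - r) + noise
            dr, lag = np.diff(r), r[:-1]
            X = np.column_stack([np.ones_like(lag), lag])
            coef, *_ = np.linalg.lstsq(X, dr, rcond=None)
            a = -coef[1]
            return a, coef[0] / max(a, 1e-8)

        def longest_stable(r, t, windows=(40, 80, 160, 320), tol=0.005):
            # Grow the window while the local level stays within tol of the
            # level on the smallest window; stop at the first violation.
            level0 = r[t - windows[0]:t].mean()
            best = windows[0]
            for w in windows[1:]:
                if t - w < 0 or abs(r[t - w:t].mean() - level0) > tol:
                    break
                best = w
            return best

        rng = np.random.default_rng(1)
        r = np.empty(400)
        r[0] = 0.05
        for i in range(1, 400):
            mean = 0.05 if i < 300 else 0.08      # structural break at t = 300
            shock = 0.001 * np.sqrt(max(r[i - 1], 1e-8)) * rng.normal()
            r[i] = r[i - 1] + 0.5 * (mean - r[i - 1]) + shock
        w = longest_stable(r, t=400)
        print(w, cir_fit(r[400 - w:400]))         # chosen interval and (a, b)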

    Spatial Risk Premium on Weather Derivatives and Hedging Weather Exposure in Electricity

    Due to the dependency of energy demand on temperature, weather derivatives enable the effective hedging of temperature-related fluctuations. However, temperature varies in space and time, and therefore so do the contingent weather derivatives. The spatial derivative price distribution involves a risk premium. We examine functional principal components of temperature variation for this spatial risk premium. We employ a pricing model for temperature derivatives based on dynamics modelled via a vectorial Ornstein-Uhlenbeck process with seasonal variation. We use an analytical expression for the risk premia depending on variation curves of temperature in the measurement period. The dependence is exploited by a functional principal component analysis of the curves. We compute risk premia on cumulative average temperature futures for locations traded on the CME and fit to them a geographically weighted regression on functional principal component scores. This allows us to predict risk premia for non-traded locations and, on this basis, to adopt a hedging strategy, which we illustrate with the example of Leipzig.

    Keywords: risk premium, weather derivatives, Ornstein-Uhlenbeck process, functional principal components, geographically weighted regression
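
    A minimal sketch of the temperature dynamics underlying the pricing model: a Fourier seasonal mean plus a discretized (AR(1)) Ornstein-Uhlenbeck deviation process, fitted by least squares on simulated data. The risk-premium expression, the functional principal component analysis and the geographically weighted regression are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(2)
        t = np.arange(5 * 365)
        season = 10 + 8 * np.sin(2 * np.pi * t / 365 - 1.0)   # seasonal mean
        dev = np.zeros(len(t))
        for i in range(1, len(t)):                 # OU deviations, discretized
            dev[i] = 0.8 * dev[i - 1] + rng.normal(scale=2.0)
        temp = season + dev

        # Recover the seasonal curve by least squares on Fourier terms
        X = np.column_stack([np.ones(len(t)),
                             np.sin(2 * np.pi * t / 365),
                             np.cos(2 * np.pi * t / 365)])
        coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
        resid = temp - X @ coef
        phi = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])
        print(f"estimated mean-reversion coefficient: {phi:.2f}")   # ~0.8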

    Modeling Asset Prices

    As an asset is traded, its varying prices trace out an interesting time series. The price, at least in a general way, reflects some underlying value of the asset. For most basic assets, realistic models of value must involve many variables relating not only to the individual asset, but also to the asset class, the industrial sector(s) of the asset, and both the local economy and the general global economic conditions. Rather than attempting to model the value, we will confine our interest to modeling the price. The underlying assumption is that the price at which an asset trades is a "fair market price" that reflects the actual value of the asset. Our initial interest is in models of the price of a basic asset, that is, not the price of a derivative asset. Usually, instead of the price itself, we consider the relative change in price, that is, the rate of return, over some interval of time. The purpose of asset pricing models is not the prediction of future prices; rather, the purpose is to provide a description of the stochastic behavior of prices. Models of price changes have a number of uses, including, for investors, optimal construction of portfolios of assets and, for market regulators, maintaining a fair and orderly market. A major motivation for developing models of price changes of given assets is to use those models to develop models of fair value of derivative assets that depend on the given assets.

    Keywords: Discrete time series models, continuous time diffusion models, models with jumps, stochastic volatility, GARCH
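
    A short illustration of the basic objects discussed: rates of return computed from a simulated price path, and a GARCH(1,1) recursion as one standard model of their conditional variance. All parameter values below are invented.

        import numpy as np

        rng = np.random.default_rng(3)
        n, mu, sigma = 1000, 0.0002, 0.01
        log_price = np.cumsum(mu - sigma ** 2 / 2 + sigma * rng.normal(size=n))
        prices = 100 * np.exp(log_price)            # geometric Brownian motion
        r = np.diff(np.log(prices))                 # rates of return

        # GARCH(1,1): s2_t = omega + alpha * r_{t-1}^2 + beta * s2_{t-1}
        omega, alpha, beta = 1e-6, 0.08, 0.90
        s2 = np.empty_like(r)
        s2[0] = r.var()
        for i in range(1, len(r)):
            s2[i] = omega + alpha * r[i - 1] ** 2 + beta * s2[i - 1]
        print(f"mean conditional volatility: {np.sqrt(s2).mean():.4f}")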

    Local Quantile Regression

    Quantile regression is a technique to estimate conditional quantile curves. It provides a comprehensive picture of a response contingent on explanatory variables. In a flexible modeling framework, a specific form of the conditional quantile curve is not fixed a priori. Indeed, the majority of applications do not per se require specific functional forms. This motivates a local parametric rather than a global fixed model fitting approach. A nonparametric smoothing estimator of the conditional quantile curve requires balancing local curvature against stochastic variability. In this paper, we suggest a local model selection technique that provides an adaptive estimator of the conditional quantile regression curve at each design point. Theoretical results show that the proposed adaptive procedure performs as well as an oracle that would minimize the local estimation risk for the problem at hand. We illustrate the performance of the procedure by an extensive simulation study and consider two applications: tail dependence analysis for the Hong Kong stock market, and analysis of the distributions of the risk factors of temperature dynamics.
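
    A minimal sketch of the underlying estimator: a local linear quantile fit obtained by minimizing a kernel-weighted check (pinball) loss at each design point. The paper's adaptive, locally selected bandwidth is replaced here by a fixed bandwidth h, and the data are simulated.

        import numpy as np
        from scipy.optimize import minimize

        def local_quantile(x, y, x0, tau=0.9, h=0.2):
            w = np.exp(-0.5 * ((x - x0) / h) ** 2)   # Gaussian kernel weights
            def loss(p):                             # p = (intercept, slope)
                u = y - p[0] - p[1] * (x - x0)
                return np.sum(w * u * (tau - (u < 0)))   # check function
            fit = minimize(loss, x0=[np.quantile(y, tau), 0.0],
                           method="Nelder-Mead")
            return fit.x[0]                          # quantile estimate at x0

        rng = np.random.default_rng(4)
        x = rng.uniform(0, 1, 500)
        y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=500)
        print(local_quantile(x, y, x0=0.25, tau=0.9))  # near sin(pi/2) + 0.38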

    Generalized single-index models: The EFM approach

    Generalized single-index models are natural extensions of linear models and circumvent the so-called curse of dimensionality. They are becoming increasingly popular in many scientific fields, including biostatistics, medicine, economics and financial econometrics. Estimating and testing the model index coefficients beta is one of the most important objectives in the statistical analysis. However, the commonly used constraint on the index coefficients, ||beta|| = 1, represents a non-regular problem: the true index is on the boundary of the unit ball. In this paper we introduce the EFM approach, a method of estimating functions, to study the generalized single-index model. The procedure is to first relax the equality constraint to one with (d - 1) components of beta lying in an open unit ball, and then to construct the associated (d - 1) estimating functions by projecting the score function onto the linear space spanned by the residuals, with the unknown link being estimated by kernel estimating functions. Root-n consistency and asymptotic normality of the estimator obtained from solving the resulting estimating equations are established, and a Wilks-type theorem for testing the index is demonstrated. A noticeable result is that our estimator for beta has a smaller or equal limiting variance compared with the estimator of Carroll et al. (1997). A fixed-point iterative scheme for computing this estimator is proposed. This algorithm involves only one-dimensional nonparametric smoothers, thereby avoiding the data sparsity problem caused by high model dimensionality. Numerical studies based on simulations and on applications suggest that this new estimating system is quite powerful and easy to implement.

    Keywords: Generalized single-index model, index coefficients, estimating equations, asymptotic properties, iteration
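
    A sketch of the reparameterization idea: with ||beta|| = 1, write beta = (sqrt(1 - ||theta||^2), theta) with theta in the open unit ball, and profile out the unknown link with a kernel estimate. Plain least-squares profiling stands in here for the paper's estimating-function machinery; data and dimensions are made up.

        import numpy as np
        from scipy.optimize import minimize

        def nw(xs, ys, grid, h=0.3):
            # Nadaraya-Watson estimate of E[y|x] at the points in grid
            w = np.exp(-0.5 * ((grid[:, None] - xs[None, :]) / h) ** 2)
            return (w @ ys) / w.sum(axis=1)

        def profile_loss(theta, x, y):
            if theta @ theta >= 1.0:                 # stay in the open ball
                return np.inf
            beta = np.append(np.sqrt(1 - theta @ theta), theta)
            z = x @ beta                             # the single index
            return np.mean((y - nw(z, y, z)) ** 2)

        rng = np.random.default_rng(5)
        beta0 = np.array([0.8, 0.6])                 # true index, norm 1
        x = rng.normal(size=(400, 2))
        y = np.sin(x @ beta0) + 0.1 * rng.normal(size=400)
        theta = minimize(profile_loss, x0=np.array([0.0]), args=(x, y),
                         method="Nelder-Mead").x
        print(np.append(np.sqrt(1 - theta @ theta), theta))  # ~ (0.8, 0.6)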

    A Confidence Corridor for Sparse Longitudinal Data Curves

    Longitudinal data analysis is a central piece of statistics. The data are curves and they are observed at random locations. This makes the construction of a simultaneous confidence corridor (SCC, or confidence band) for the mean function a challenging task, on both the theoretical and the practical side. Here we propose a method based on local linear smoothing that is implemented in the sparse modelling situation, i.e., with only a few observations per curve. An SCC is constructed based on recent results obtained in applied probability theory. The precision and performance are demonstrated in a spectrum of simulations and applied to growth curve data. Technically speaking, our paper makes intensive use of recent insights into extreme value theory, which are also employed to construct a shoal of confidence intervals (SCI).

    Keywords: Longitudinal data, confidence band, Karhunen-Loève L² representation, local linear estimator, extreme value, double sum, strong approximation
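
    A sketch of the smoothing step only: pool the sparsely observed curves and estimate the mean function by local linear regression, here on simulated growth-curve-like data. The simultaneous corridor itself, which rests on extreme value theory, is not reproduced.

        import numpy as np

        def local_linear(x, y, grid, h=0.15):
            # Weighted least squares line at each grid point; the intercept
            # is the local linear estimate of the mean function there.
            fit = np.empty(len(grid))
            for j, g in enumerate(grid):
                w = np.exp(-0.5 * ((x - g) / h) ** 2)
                X = np.column_stack([np.ones_like(x), x - g])
                XtW = X.T * w
                fit[j] = np.linalg.solve(XtW @ X, XtW @ y)[0]
            return fit

        rng = np.random.default_rng(6)
        xs, ys = [], []
        for _ in range(100):                 # 100 curves, 3-6 points each
            t = rng.uniform(0, 1, rng.integers(3, 7))
            xs.append(t)
            ys.append(np.sqrt(t) + rng.normal(scale=0.2, size=len(t)))
        x, y = np.concatenate(xs), np.concatenate(ys)
        grid = np.linspace(0.05, 0.95, 10)
        print(np.round(local_linear(x, y, grid), 2))   # ~ sqrt(grid)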

    Yxilon: Designing The Next Generation, Vertically Integrable Statistical Software Environment

    Modern statistical computing requires the smooth integration of new algorithms and quantitative analysis results into all sorts of platforms, such as web browsers and standard or proprietary application software. Common statistical software packages often cannot be adapted to integrate into new environments, or simply fail to meet the demands of users, especially beginners. With Yxilon we propose a vertically integrable, modular statistical computing environment that provides the user with a rich set of methods and a diversity of interfaces, including a command-line interface, web clients and interactive examples in electronic books. This architecture allows users to rely on a single environment to organize data from a variety of sources, analyse them, and visualize or export the results to other software programs. The design of Yxilon is inspired by XploRe, a statistical environment developed by MD*Tech and Humboldt-Universität zu Berlin. Yxilon incorporates several ideas from recent developments and design principles in software engineering: a modular plug-in architecture, platform independence, and the separation of user interfaces from the computing engine.

    Keywords: Java, Client/Server, XploRe, Yxilon, electronic publishing, e-books
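
    A toy illustration (not Yxilon code, which is Java-based) of the stated design principle: statistical methods registered as plug-ins behind one dispatch entry point, so that command-line, web and e-book front ends all talk to the same computing engine.

        REGISTRY = {}

        def plugin(name):
            # Register a statistical method under a public name
            def register(fn):
                REGISTRY[name] = fn
                return fn
            return register

        @plugin("mean")
        def mean(data):
            return sum(data) / len(data)

        def dispatch(method, data):
            # Single entry point shared by every front end
            return REGISTRY[method](data)

        print(dispatch("mean", [1.0, 2.0, 3.0]))   # e.g. a CLI front-end call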