"Rotterdam econometrics": publications of the econometric institute 1956-2005
This paper contains a list of all publications over the period 1956-2005, as reported in the Rotterdam Econometric Institute Reprint series during 1957-2005.
Clearing price distributions in call auctions
We propose a model for price formation in financial markets based on clearing
of a standard call auction with random orders, and verify its validity for
prediction of the daily closing price distribution statistically. The model
considers random buy and sell orders, placed following demand- and supply-side
valuation distributions; an equilibrium equation then leads to a distribution
for clearing price and transacted volume. Bid and ask volumes are left as free
parameters, permitting possibly heavy-tailed or very skewed order flow
conditions. In highly liquid auctions, the clearing price distribution
converges to an asymptotically normal central limit, with mean and variance in
terms of supply/demand-valuation distributions and order flow imbalance. By
means of simulations, we illustrate the influence of variations in order flow
and valuation distributions on price/volume, noting a distinction between high-
and low-volume auction price variance. To verify the validity of the model
statistically, we predict a year's worth of daily closing price distributions
for 5 constituents of the Eurostoxx 50 index; Kolmogorov-Smirnov statistics and
QQ-plots demonstrate with ample statistical significance that the model
predicts closing price distributions accurately, and compares favourably with
alternative methods of prediction.
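The clearing mechanism the abstract describes can be illustrated with a minimal sketch: random buy and sell limit orders are drawn from demand- and supply-side valuation distributions, and the clearing price maximises executable volume. All distributions and parameter values here are hypothetical, chosen only to make the mechanics concrete; this is not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical valuation distributions: buyers value the asset slightly
# above sellers, so the order books overlap and trade can occur.
buy_limits = rng.normal(100.5, 1.0, 500)    # buyers' limit prices
sell_limits = rng.normal(99.5, 1.0, 500)    # sellers' limit prices

def clearing_price(buy_limits, sell_limits):
    """Find the price maximising executable volume, where
    demand(p) = number of buy orders with limit >= p and
    supply(p) = number of sell orders with limit <= p."""
    grid = np.sort(np.concatenate([buy_limits, sell_limits]))
    demand = np.array([(buy_limits >= p).sum() for p in grid])
    supply = np.array([(sell_limits <= p).sum() for p in grid])
    volume = np.minimum(demand, supply)
    i = np.argmax(volume)
    return grid[i], volume[i]

price, traded = clearing_price(buy_limits, sell_limits)
```

With many orders the clearing price concentrates near the point where the demand and supply curves cross, consistent with the asymptotic normality the abstract reports for highly liquid auctions.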
How efficiency shapes market impact
We develop a theory for the market impact of large trading orders, which we
call metaorders because they are typically split into small pieces and executed
incrementally. Market impact is empirically observed to be a concave function
of metaorder size, i.e., the impact per share of large metaorders is smaller
than that of small metaorders. We formulate a stylized model of an algorithmic
execution service and derive a fair pricing condition, which says that the
average transaction price of the metaorder is equal to the price after trading
is completed. We show that at equilibrium the distribution of trading volume
adjusts to reflect information, and dictates the shape of the impact function.
The resulting theory makes empirically testable predictions for the functional
form of both the temporary and permanent components of market impact. Based on
the commonly observed asymptotic distribution for the volume of large trades,
it says that market impact should increase asymptotically roughly as the square
root of metaorder size, with average permanent impact relaxing to about two
thirds of peak impact.
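The predicted functional form can be written down directly. The sketch below uses the standard square-root parametrisation with hypothetical parameter values (prefactor Y, daily volatility sigma, daily volume V); the two-thirds relaxation of permanent impact is taken from the abstract, everything else is illustrative.

```python
import numpy as np

def peak_impact(Q, sigma=0.02, V=1e6, Y=1.0):
    """Temporary (peak) impact of a metaorder of Q shares:
    roughly the square root of the order size as a fraction of volume.
    Y, sigma and V are illustrative placeholder values."""
    return Y * sigma * np.sqrt(Q / V)

def permanent_impact(Q, **kw):
    """Fair-pricing prediction: permanent impact relaxes to
    about two thirds of peak impact."""
    return (2.0 / 3.0) * peak_impact(Q, **kw)

# Concavity: doubling the metaorder size less than doubles its impact,
# so impact per share falls with size, as the abstract describes.
```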
Productivity analysis and functional specification of Pakistani textile industry
This study deals with the general functional characterization of the Pakistani textile industry. Previous studies assumed a constant elasticity of substitution functional form, the best-fitted model for the general manufacturing industries of Pakistan. There is only one study in which the textile industry is treated separately, but the functional form used is the same, and the investigation is carried out at the provincial level with data pooled across provinces.

The important point in this study is that the functional form is not restricted but is allowed to be estimated freely from the collected data. We assume that the productivity growth of Pakistani textiles can be properly specified by a non-homothetic, non-neutral-growth flexible functional form. This function is continuous, monotonic, concave and twice differentiable.

The duality between cost and production functions is utilized to evaluate the production function characterized by the above-mentioned properties. To discriminate among the flexible functional forms, the Box-Cox transformation is used in both the production and the cost functions. Non-neutral scale effects, biases of technical change, input price changes and output quantity changes are used to analyze productivity growth.

Total factor productivity is used as the growth index. It is calculated by the residual method as well as by parametric methods. The model used is a four-input model with a stochastic disturbance term, estimated by maximum likelihood; the error terms account for discrepancies in the cost-minimizing behavior of the function.

The four-input model has been used to estimate the Pakistani textile industry over 1965-1989. The inputs are capital, labor, energy and intermediate materials. The main hypotheses tested are neutrality of technical change and homothetic shifts of the function.

The elasticity results show that energy and capital exhibit complementary behavior, while both are substitutable for labor. The scale effects are labor- and energy-saving, and the technical change effects are energy- and capital-using and labor-saving. Capital and energy are also more own-price elastic than labor. The contribution of scale economies to productivity is almost the same in all the selected models, while the contribution of technical change effects varies and is not significant.
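The residual method for total factor productivity mentioned above is growth accounting: output growth minus the cost-share-weighted growth of the four inputs. The numbers below are hypothetical, used only to show the arithmetic, and are not the study's estimates.

```python
# Growth-accounting (residual) TFP index for a four-input model:
# TFP growth = output growth - sum of cost-share-weighted input growth.
# All figures are illustrative placeholders, not data from the study.
output_growth = 0.05
input_growth = {"capital": 0.04, "labor": 0.02, "energy": 0.06, "materials": 0.03}
cost_shares  = {"capital": 0.20, "labor": 0.25, "energy": 0.15, "materials": 0.40}

weighted_input_growth = sum(cost_shares[k] * input_growth[k] for k in input_growth)
tfp_growth = output_growth - weighted_input_growth   # the Solow-style residual
```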
EUROPEAN CONFERENCE ON QUEUEING THEORY 2016
This booklet contains the proceedings of the second European Conference on Queueing Theory (ECQT), held from the 18th to the 20th of July 2016 at the engineering school ENSEEIHT in Toulouse, France. ECQT is a biennial event where scientists and practitioners in queueing theory and related areas get together to promote research, encourage interaction and exchange ideas. The spirit of the conference is to be a queueing event organized from within Europe, but open to participants from all over the world. The technical program of the 2016 edition consisted of 112 presentations organized in 29 sessions covering all trends in queueing theory, including the development of the theory, methodology advances, computational aspects and applications. Another exciting feature of ECQT 2016 was the institution of the Takács Award for outstanding PhD theses on "Queueing Theory and its Applications".
Provably near-optimal algorithms for multi-stage stochastic optimization models in operations management
Thesis (Ph.D.) by Cong Shi, Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2012. Cataloged from the PDF version of the thesis. Includes bibliographical references (p. 157-165).

Many, if not most, of the core problems studied in operations management fall into the category of multi-stage stochastic optimization models, whereby one considers multiple, often correlated, decisions to optimize a particular objective function under uncertainty about the system's evolution over the future horizon. Unfortunately, computing the optimal policies is usually computationally intractable due to the curse of dimensionality. This thesis focuses on providing provably near-optimal and tractable policies for some of these challenging models arising in the context of inventory control, capacity planning and revenue management; specifically, on the design of approximation algorithms that admit worst-case performance guarantees.

In the first chapter, we develop new algorithmic approaches to compute provably near-optimal policies for multi-period stochastic lot-sizing inventory models with positive lead times, general demand distributions and dynamic forecast updates. The proposed policies have worst-case performance guarantees of 3 and typically perform very close to optimal in extensive computational experiments. We also describe a 6-approximation algorithm for the counterpart model under uniform capacity constraints.

In the second chapter, we study a class of revenue management problems in systems with reusable resources and advance reservations. A simple control policy called the class selection policy (CSP) is proposed, based on solving a knapsack-type linear program (LP). We show that the CSP and its variants perform provably near-optimally in the Halfin-Whitt regime. The analysis is based on modeling the problem as a loss network system with advance reservations; in particular, asymptotic upper bounds on the blocking probabilities are derived.

In the third chapter, we examine the problem of capacity planning in joint ventures to meet stochastic demand in a newsvendor-type setting. When resources are heterogeneous, there exists a unique revenue-sharing contract such that the corresponding Nash Bargaining Solution, the Strong Nash Equilibrium, and the system-optimal solution coincide. The optimal scheme rewards every participant proportionally to her marginal cost. When resources are homogeneous, there does not exist a revenue-sharing scheme which induces the system optimum. Nonetheless, we propose provably good revenue-sharing contracts, which suggest that the reward should be inversely proportional to the marginal cost of each participant.
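For background on the newsvendor-type setting mentioned in the third chapter, the classic single-firm newsvendor capacity rule is the critical-fractile quantile of the demand distribution. This sketch shows only that standard textbook rule with made-up numbers; it is not the thesis's joint-venture model or its revenue-sharing analysis.

```python
from statistics import NormalDist

def newsvendor_capacity(p, c, mu, sigma):
    """Classic newsvendor: with selling price p, unit cost c and
    demand ~ Normal(mu, sigma), the optimal capacity is the
    (p - c) / p quantile of the demand distribution."""
    critical_fractile = (p - c) / p
    return NormalDist(mu, sigma).inv_cdf(critical_fractile)

# Illustrative parameters: price 10, cost 4, demand ~ Normal(100, 20).
q = newsvendor_capacity(p=10.0, c=4.0, mu=100.0, sigma=20.0)
```

Because the critical fractile exceeds one half here, the optimal capacity is above mean demand; underage is costlier than overage.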
Deterministic and stochastic optimal inventory control with logistic stock-dependent demand rate
It has been suggested by many supply chain practitioners that in certain cases inventory can have a stimulating effect on demand. In mathematical terms this amounts to the demand being a function of the inventory level alone. In this work we propose a logistic growth model for the inventory-dependent demand rate and first solve the continuous-time deterministic optimal control problem of maximising the present value of the total net profit over an infinite horizon. It is shown that under a strict condition there is a unique optimal stock level which the inventory planner should maintain in order to satisfy demand. The stochastic version of the optimal control problem is considered next. A bang-bang type of optimal control problem is formulated and the associated Hamilton-Jacobi-Bellman equation is solved. The inventory level that signifies a switch in the ordering strategy is worked out in the stochastic case. Copyright © 2014 Inderscience Enterprises Ltd.
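The deterministic dynamics behind such a model can be sketched in a few lines: stock depletes at a logistic, stock-dependent demand rate and is replenished at a controlled rate. The parameter values, the chosen target level and the simple "replenish at the target's demand rate" policy are all hypothetical illustrations, not the paper's optimal control.

```python
# Logistic stock-dependent demand: D(I) = r * I * (1 - I / K).
# All parameter values below are illustrative placeholders.
r, K = 0.5, 100.0            # intrinsic demand rate and saturation level
dt, steps = 0.01, 10_000     # Euler discretisation over 100 time units

def demand(stock):
    return r * stock * (1.0 - stock / K)

target = 40.0                # stock level the planner wants to maintain
I = 30.0                     # start away from the target
for _ in range(steps):
    u = demand(target)       # replenish at the target level's demand rate
    I += dt * (u - demand(I))
# The stock converges to the target, where replenishment exactly
# balances demand (the lower root of u = D(I) is locally stable).
```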
Statistical Methodologies
Statistical practices have recently been questioned by numerous independent authors, to the extent that a significant fraction of accepted research findings may be called into question. This suggests that statistical methodology may have drifted too far toward engineering practice, with minimal concern for its foundations, interpretation, assumptions, and limitations, which may be jeopardized in the current context. Disguised by overwhelming data sets, advanced processing, and stunning presentations, the basic approach is often intractable to anyone but the analyst. The hierarchical nature of statistical inference, exemplified by Bayesian aggregation of prior and derived knowledge, may also be challenging. Conceptually simplified studies of the kind presented in this book could therefore provide valuable guidance when developing statistical methodologies, and also when applying the state of the art with greater confidence.
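A minimal example of the Bayesian aggregation of prior and derived knowledge mentioned above is the conjugate beta-binomial update; the numbers here are hypothetical and serve only to show how the prior and the data are combined.

```python
# Beta(a, b) prior on a success probability, updated with binomial data;
# conjugacy keeps the posterior in Beta form. Figures are illustrative.
def beta_binomial_update(a, b, successes, failures):
    """Return the Beta posterior parameters after observing the data."""
    return a + successes, b + failures

a_post, b_post = beta_binomial_update(a=2, b=2, successes=7, failures=3)
posterior_mean = a_post / (a_post + b_post)   # prior mean 0.5 pulled toward 0.7
```

The posterior mean sits between the prior mean and the observed frequency, which is exactly the hierarchical aggregation of prior and data-derived knowledge the text refers to.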