Judgments, forecasts and decisions: an analysis of fund managers over time
Decision theory is the study of models of judgement involved in, and leading to, deliberate and (usually) rational choice. In real estate investment there are normative models for the allocation of assets. These asset allocation models suggest an optimum allocation between the respective asset classes based on the investors' judgements of performance and risk. Real estate is selected, like other assets, on the basis of some criterion, commonly its marginal contribution to a mean-variance efficient multi-asset portfolio, subject to the investor's objectives and capital rationing constraints. However, decisions are made relative to current expectations and current business constraints. While a decision maker may believe in the optimum exposure levels dictated by an asset allocation model, the final decision may be influenced by factors outside the parameters of the mathematical model.
This paper discusses investors' perceptions of and attitudes toward real estate and highlights the important difference between theoretical exposure levels and pragmatic business considerations. It develops a model to identify 'soft' parameters in decision making which influence the optimal allocation for that asset class. This 'soft' information may relate to behavioural issues such as the tendency to mirror competitors, a desire to meet weight-of-money objectives, a desire to retain the status quo, and many other non-financial considerations.
The paper aims to establish the place of property in multi-asset portfolios in the UK and to examine the asset allocation process in practice, with a view to understanding the decision-making process and to examining investors' perceptions through an historic analysis of market expectations, a comparison with historic data, and an analysis of actual performance.
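For reference, the normative model this abstract sets against 'soft' decision factors is standard mean-variance allocation; the sketch below is the textbook form, not a formulation taken from the paper itself.

```latex
% Standard mean-variance allocation (reference form, not from the paper):
% weights w over n asset classes, expected returns mu, covariance Sigma.
\min_{w \in \mathbb{R}^{n}} \; w^{\top} \Sigma\, w
\quad \text{subject to} \quad
\mu^{\top} w = r_{\text{target}}, \qquad
\mathbf{1}^{\top} w = 1, \qquad w \ge 0 .
% The abstract's point: the implemented allocation deviates from the
% optimizer's w* because of "soft", non-financial considerations.
```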
Decision theory and real estate development: a note on uncertainty
Real estate development appraisal is a quantification of future expectations. The appraisal model relies upon the valuer/developer having an understanding of the future in terms of the marketability of the completed development and the cost of development. In some cases the developer has some degree of control over the possible variation in the variables, as with the cost of construction through the choice of specification. Other variables, however, such as the sale price of the final product, are totally dependent upon the vagaries of the market at the completion date. To address the risk of an outcome different from the one expected (modelled), the developer will often carry out a sensitivity analysis on the development. Traditional sensitivity analysis, however, has generally looked only at the best and worst scenarios and has focused on the anticipated or expected outcomes. This does not take into account uncertainty and the full range of outcomes that can occur. A fuller analysis should examine the uncertainty in each component of the appraisal and account for the appropriate distribution of each variable. Similarly, as many of the variables in the model are not independent, the variables need to be correlated. This requires a standardised approach, and we suggest that a generic forecasting software package, in this case Crystal Ball, allows the analyst to work with an existing development appraisal model set up in Excel (or another spreadsheet) and with a predetermined set of probability distributions. Without a full knowledge of the risk, developers are unable to determine the anticipated level of return that should be sought to compensate for it. This model allows the user a better understanding of the possible outcomes for the development. Ultimately the final decision will be made relative to current expectations and current business constraints, but by assessing the upside and downside risks more appropriately, the decision maker should be better placed to make a more informed and 'better' decision.
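The abstract does not reproduce the appraisal model, so the following is a minimal sketch of the approach it describes: a residual development appraisal (sale value less costs) re-run under sampled, correlated input distributions rather than single-point inputs. Plain NumPy stands in for the Excel/Crystal Ball pairing, and all figures, distributions and the 0.5 correlation are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # Monte Carlo trials

# Hypothetical input distributions. Sale value (GDV) and build cost are
# modelled as correlated lognormals: strong markets tend to raise both
# revenues and construction costs.
mean = [np.log(13.0e6), np.log(7.0e6)]   # log-scale medians: GDV, build cost
cov = [[0.15**2, 0.5 * 0.15 * 0.10],     # 0.5 correlation (assumed)
       [0.5 * 0.15 * 0.10, 0.10**2]]
gdv, build = np.exp(rng.multivariate_normal(mean, cov, size=n)).T

land = 3.0e6                              # fixed land cost (hypothetical)
fees = 0.12 * build                       # professional fees as % of build
finance = 0.07 * (land + build + fees)    # crude finance charge

profit = gdv - (land + build + fees + finance)

# Instead of a single "expected" outcome, report the distribution.
print(f"mean profit:  {profit.mean():,.0f}")
print(f"P(loss):      {(profit < 0).mean():.1%}")
print(f"5th-95th pct: {np.percentile(profit, 5):,.0f} "
      f"to {np.percentile(profit, 95):,.0f}")
```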
The uncertainty of valuation
Valuation is often said to be 'an art not a science', but this relates to the techniques employed to calculate value, not to the underlying concept itself. Valuation is the process of estimating price in the market place. Such an estimate will be affected by uncertainties: uncertainty in the comparable information available, uncertainty in the current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties translate into uncertainty in the output figure, the valuation. The degree of uncertainty will vary according to the level of market activity; the more active a market, the more credence will be given to the input information. In the UK the Royal Institution of Chartered Surveyors (RICS) is currently considering ways in which the uncertainty of the output figure, the valuation, can be conveyed to the user of the valuation, but as yet no definitive view has been taken beyond a single Guidance Note (GN5, RICS 2003) stressing the importance of recognising uncertainty in valuation without proffering any particular solution. One of the major problems is that valuation models (in the UK) are based upon comparable information and rely upon single-point inputs. They are not probability based, yet uncertainty is probability driven. In this paper we discuss the issues underlying uncertainty in valuations and suggest a probability-based model (using Crystal Ball) to address the shortcomings of the current model.
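As a companion sketch (again hypothetical, with NumPy standing in for Crystal Ball), the same idea applied to valuation: a simple income-capitalisation model whose single comparable-derived inputs are replaced by distributions, yielding an output distribution and a reportable range rather than one figure.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Hypothetical distributions around comparable-derived figures:
# rental value and all-risks yield for value = rent / yield.
rent = rng.triangular(190_000, 200_000, 215_000, size=n)   # per annum
yield_ = rng.normal(0.065, 0.004, size=n)                  # all-risks yield

value = rent / yield_

# A single-input model reports one number; the probability-based
# model reports a central estimate plus an uncertainty range.
print(f"point estimate (modes): {200_000 / 0.065:,.0f}")
print(f"median simulated value: {np.median(value):,.0f}")
print(f"90% interval: {np.percentile(value, 5):,.0f} "
      f"to {np.percentile(value, 95):,.0f}")
```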
Two-point correlation properties of stochastic "cloud processes"
We study how the two-point density correlation properties of a point particle distribution are modified when each particle is divided, by a stochastic process, into an equal number of identical "daughter" particles. We consider generically that there may be non-trivial correlations in the displacement fields describing the positions of the different daughters of the same "mother" particle, and then treat separately the cases in which there are, or are not, correlations also between the displacements of daughters belonging to different mothers. For both cases exact formulae are derived relating the structure factor (power spectrum) of the daughter distribution to that of the mother. These results can be considered as a generalization of the analogous equations obtained in ref. [1] (cond-mat/0409594) for the case of stochastic displacement fields applied to particle distributions. An application of the present results is that they give explicit algorithms for generating, starting from regular lattice arrays, stochastic particle distributions with an arbitrarily high degree of large-scale uniformity.
Comment: 14 pages, 3 figures
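The exact formulae are in the paper; the sketch below merely illustrates the construction numerically, under assumed parameters: a regular 1D lattice of "mothers", each split into q "daughter" particles with Gaussian displacements correlated within a cloud but independent between clouds (the first of the two cases above), followed by a direct estimate of the structure factor.

```python
import numpy as np

rng = np.random.default_rng(2)

N, q, L = 2048, 4, 2048.0        # mothers, daughters per mother, box size
sigma, rho = 0.5, 0.8            # displacement scale; intra-cloud correlation

mothers = np.arange(N) * (L / N)             # regular 1D lattice

# Displacements of the q daughters of one mother are drawn jointly:
# a common "cloud" shift plus an independent part gives pairwise
# correlation rho between daughters of the same mother, while
# different mothers stay independent.
common = rng.normal(0.0, sigma, size=(N, 1)) * np.sqrt(rho)
indep = rng.normal(0.0, sigma, size=(N, q)) * np.sqrt(1.0 - rho)
daughters = (mothers[:, None] + common + indep).ravel() % L

# Structure factor S(k) = |sum_j exp(-i k x_j)|^2 / n_particles
k = 2.0 * np.pi * np.arange(1, 200) / L
phases = np.exp(-1j * np.outer(k, daughters))
S = np.abs(phases.sum(axis=1)) ** 2 / daughters.size

print(S[:5])   # small at small k: large-scale uniformity is preserved
```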
A perturbation theory for large deviation functionals in fluctuating hydrodynamics
We study a large deviation functional of density fluctuation by analyzing
stochastic non-linear diffusion equations driven by the difference between the
densities fixed at the boundaries. By using a fundamental equality that yields
the fluctuation theorem, we first relate the large deviation functional with a
minimization problem. We then develop a perturbation method for solving the
problem. In particular, by performing an expansion with respect to the average
current, we derive the lowest order expression for the deviation from the local
equilibrium part. This expression implies that the deviation is written as the
space-time integration of the excess entropy production rate during the most
probable process of generating the fluctuation that corresponds to the argument
of the large deviation functional.Comment: 12page
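The abstract's minimization problem is of the kind familiar from macroscopic fluctuation theory. As a hedged sketch, in standard notation that need not match the paper's (diffusivity D, mobility sigma, stationary profile rho-bar fixed by the boundary densities):

```latex
% Sketch of the standard variational form (not copied from the paper):
% the large deviation functional of a density profile rho is the
% minimal path-space action over histories ending at rho.
I(\rho) \;=\; \inf_{(\hat\rho,\hat\jmath)}
\int_{-\infty}^{0}\! dt \int dx\;
\frac{\bigl(\hat\jmath + D(\hat\rho)\,\partial_x \hat\rho\bigr)^{2}}
     {4\,\sigma(\hat\rho)},
\qquad
\partial_t \hat\rho + \partial_x \hat\jmath = 0,
\quad \hat\rho(\cdot,-\infty)=\bar\rho,
\quad \hat\rho(\cdot,0)=\rho .
% The minimizer is the most probable history creating the fluctuation,
% which is the object the abstract's excess-entropy expression describes.
```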
Diffusion, super-diffusion and coalescence from a single step
From the exact single step evolution equation of the two-point correlation function of a particle distribution subjected to a stochastic displacement field $\mathbf{u}(\mathbf{x})$, we derive different dynamical regimes when $\mathbf{u}(\mathbf{x})$ is iterated to build a velocity field. First we show that spatially uncorrelated fields $\mathbf{u}(\mathbf{x})$ lead to both standard and anomalous diffusion equations. When the field $\mathbf{u}(\mathbf{x})$ is spatially correlated, each particle performs a simple free Brownian motion, but the trajectories of different particles turn out to be mutually correlated. The two-point statistical properties of the field $\mathbf{u}(\mathbf{x})$ induce two-point spatial correlations in the particle distribution satisfying a simple but non-trivial diffusion-like equation. These displacement-displacement correlations lead the system to three possible regimes: coalescence, simple clustering and a combination of the two. The existence of these different regimes, in the one-dimensional system, is shown through computer simulations and a simple theoretical argument.
Comment: RevTeX (iopstyle), 19 pages, 5 eps-figures
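A minimal numerical illustration of the correlated regime described above, with assumed parameters (this is not the paper's simulation): each step applies a Gaussian displacement field with exponential spatial covariance, so each particle alone is Brownian while nearby particles move almost in unison.

```python
import numpy as np

rng = np.random.default_rng(3)

N, steps, sigma = 100, 500, 0.05
ell = 1.0   # spatial correlation length of the displacement field

def correlated_step(x, ell):
    """One realisation of a Gaussian displacement field u(x) with
    covariance sigma^2 * exp(-|x_i - x_j| / ell), evaluated at the
    particle positions; ell -> 0 recovers the uncorrelated case."""
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)
    L = np.linalg.cholesky(C + 1e-9 * np.eye(len(x)))
    return sigma * (L @ rng.normal(size=len(x)))

x = np.sort(rng.uniform(0.0, 10.0, size=N))
for _ in range(steps):
    x = x + correlated_step(x, ell)

# Each particle alone is Brownian, but nearby particles receive nearly
# identical displacements, so their separations tend to shrink: a
# qualitative signature of the clustering/coalescence regimes.
gaps = np.diff(np.sort(x))
print(f"median gap after {steps} steps: {np.median(gaps):.4f} "
      f"(initial ~ {10.0 / N:.4f})")
```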
Average observational quantities in the timescape cosmology
We examine the properties of a recently proposed observationally viable
alternative to homogeneous cosmology with smooth dark energy, the timescape
cosmology. In the timescape model cosmic acceleration is realized as an
apparent effect related to the calibration of clocks and rods of observers in
bound systems relative to volume-average observers in an inhomogeneous geometry
in ordinary general relativity. The model is based on an exact solution to a
Buchert average of the Einstein equations with backreaction. The present paper
examines a number of observational tests which will enable the timescape model
to be distinguished from homogeneous cosmologies with a cosmological constant
or other smooth dark energy, in current and future generations of dark energy
experiments. Predictions are presented for: comoving distance measures; H(z);
the equivalent of the dark energy equation of state, w(z); the Om(z) measure of
Sahni, Shafieloo and Starobinsky; the Alcock-Paczynski test; the baryon
acoustic oscillation measure, D_v; the inhomogeneity test of Clarkson, Bassett
and Lu; and the time drift of cosmological redshifts. Where possible, the
predictions are compared to recent independent studies of similar measures in
homogeneous cosmologies with dark energy. Three separate tests with indications
of results in possible tension with the Lambda CDM model are found to be
consistent with the expectations of the timescape cosmology.Comment: 22 pages, 12 figures; v2 discussion, references added, matches
published versio
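For reference, the Om(z) diagnostic named above has the standard definition of Sahni, Shafieloo and Starobinsky (quoted from general knowledge, not from this paper):

```latex
% Om(z) diagnostic, with E(z) = H(z)/H_0:
\mathrm{Om}(z) \;=\; \frac{E^{2}(z) - 1}{(1+z)^{3} - 1}
% In spatially flat LambdaCDM, Om(z) = Omega_{m0} at all z, so any
% measured redshift dependence of Om(z) discriminates timescape-like
% models from a cosmological constant.
```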
Fick and Fokker--Planck diffusion law in inhomogeneous media
We discuss diffusion of particles in a spatially inhomogeneous medium. From
the microscopic viewpoint we consider independent particles randomly evolving
on a lattice. We show that the reversibility condition has a discrete geometric
interpretation in terms of weights associated to un--oriented edges and
vertices. We consider the hydrodynamic diffusive scaling that gives, as a
macroscopic evolution equation, the Fokker--Planck equation corresponding to
the evolution of the probability distribution of a reversible spatially
inhomogeneous diffusion process. The geometric macroscopic counterpart of
reversibility is encoded into a tensor metrics and a positive function. The
Fick's law with inhomogeneous diffusion matrix is obtained in the case when the
spatial inhomogeneity is associated exclusively with the edge weights. We
discuss also some related properties of the systems like a non-homogeneous
Einstein relation and the possibility of uphill diffusion
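For orientation, the two macroscopic laws contrasted in the title take the following standard continuum forms (a reference sketch; the paper's own derivation is on the lattice):

```latex
% Fick:           flux J = -D(x) grad(rho)
% Fokker-Planck:  flux J = -grad(D(x) rho) = -D(x) grad(rho) - rho grad(D)
\partial_t \rho = \nabla\!\cdot\!\bigl(D(x)\,\nabla\rho\bigr)
\qquad\text{vs.}\qquad
\partial_t \rho = \nabla^{2}\bigl(D(x)\,\rho\bigr)
% The laws coincide for constant D. The extra drift -rho grad(D) can
% push particles against the density gradient, which is one way
% "uphill diffusion" can arise.
```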
Low Power Front End for the Optical Module of a Neutrino Underwater Telescope
A proposal for a new system to capture signals in the Optical Module (OM) of an underwater neutrino telescope is described. It concentrates on the problems of power consumption and time precision. In particular, a solution for the interface between the photomultiplier (PMT) and the front-end electronics is presented.