Stochastic Frontier Models With Correlated Error Components
In the productivity modelling literature, the disturbances U (representing technical inefficiency) and V (representing noise) of the composite error W = V - U of the stochastic frontier model are assumed to be independent random variables. By employing the copula approach to statistical modelling, the joint behaviour of U and V can be parameterised, thereby allowing the data to determine the adequacy of the independence assumption. In this context, three examples of the copula approach are given: the first is algebraic (the Logistic-Exponential stochastic frontier model with margins coupled by the Fairlie-Gumbel-Morgenstern copula), and the second and third are empirically oriented, using data sets well known in productivity analysis. Analysed are a cross-section of cost data sampled from the US electrical power industry, and an unbalanced panel of data sampled from the US airline industry.
Keywords: Stochastic Frontier model; Copula; Copula approach; Sklar's theorem; Families of copulas; Spearman's rho.
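To make the copula construction concrete, the sketch below (a rough Python illustration with hypothetical parameter values, not the paper's code) draws (U, V) pairs from a Fairlie-Gumbel-Morgenstern copula with Exponential and Logistic margins and forms the composite error W = V - U; when theta is nonzero, the sampled U and V are correlated, which is exactly the dependence the usual independence assumption rules out.

```python
import numpy as np
from scipy.stats import expon, logistic

rng = np.random.default_rng(0)

def fgm_sample(theta, n, rng):
    """Draw (u1, u2) from the Fairlie-Gumbel-Morgenstern copula
    C(u1, u2) = u1*u2*(1 + theta*(1 - u1)*(1 - u2)), theta in [-1, 1],
    by inverting the conditional distribution C(u2 | u1)."""
    u1 = rng.uniform(size=n)
    w = rng.uniform(size=n)
    a = theta * (1.0 - 2.0 * u1)
    disc = np.sqrt((1.0 + a) ** 2 - 4.0 * a * w)
    small = np.abs(a) < 1e-12                      # conditional CDF is linear here
    u2 = np.where(small, w, ((1.0 + a) - disc) / np.where(small, 1.0, 2.0 * a))
    return u1, u2

# Hypothetical dependence parameter and margin scales, for illustration only
theta, sigma_u, sigma_v, n = 0.8, 0.5, 0.3, 10_000
p_u, p_v = fgm_sample(theta, n, rng)
u = expon.ppf(p_u, scale=sigma_u)        # technical inefficiency U >= 0
v = logistic.ppf(p_v, scale=sigma_v)     # symmetric noise V
w_composite = v - u                      # composite error W = V - U
print(np.corrcoef(u, v)[0, 1])           # nonzero when theta != 0
```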
Nonlinear adaptive control using non-parametric Gaussian Process prior models
Nonparametric Gaussian Process prior models, taken from Bayesian statistics methodology, are used to implement a nonlinear adaptive control law. The expected value of a quadratic cost function is minimised, without ignoring the variance of the model predictions. This leads to implicit regularisation of the control signal (caution) and to excitation of the system. The controller has dual features, since it is both tracking a reference signal and learning a model of the system from observed responses. The general method and its main features are illustrated on a simulation example.
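As a rough illustration of the cautious cost described here, the sketch below fits a Gaussian Process to input/response pairs from a hypothetical one-step plant and then chooses the control input by minimising E[(y - r)^2] = (mu - r)^2 + sigma^2, so the predictive variance is not ignored. The plant, kernel and reference value are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical one-step plant: next output as a function of the control input
plant = lambda u: np.sin(3 * u) + 0.05 * np.random.randn()

# Learn y = f(u) from a handful of observed input/response pairs
U = np.linspace(-1, 1, 8).reshape(-1, 1)
Y = np.array([plant(u) for u in U.ravel()])
gp = GaussianProcessRegressor(RBF(0.5) + WhiteKernel(0.01)).fit(U, Y)

def expected_cost(u, reference):
    """Cautious cost: E[(y - r)^2] = (mu - r)^2 + var, so the variance of the
    GP prediction is taken into account when choosing the control input."""
    mu, std = gp.predict(np.atleast_2d(u), return_std=True)
    return float((mu[0] - reference) ** 2 + std[0] ** 2)

res = minimize_scalar(expected_cost, bounds=(-1, 1), args=(0.5,), method="bounded")
print(res.x)   # input that trades tracking error against model uncertainty
```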
Symbolic Maximum Likelihood Estimation with Mathematica
Mathematica is a symbolic programming language that empowers the user to undertake complicated algebraic tasks. One such task is the derivation of maximum likelihood estimators, demonstrably an important topic in statistics at both the research and expository level. In this paper, a Mathematica package is provided that contains a function entitled SuperLog. This function utilises pattern-matching code that enhances Mathematica's ability to simplify expressions involving the natural logarithm of a product of algebraic terms. This enhancement to Mathematica's functionality can be of particular benefit for maximum likelihood estimation.
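The package itself is written in Mathematica, but the kind of simplification SuperLog performs, turning the log of a product of likelihood factors into a sum that is easy to differentiate, can be illustrated by analogy with a SymPy sketch (an illustration of the idea, not the paper's code), here for an Exponential(lambda) sample of five observations.

```python
import sympy as sym

# Log-likelihood of an i.i.d. Exponential(lambda) sample: the same "log of a
# product" simplification that SuperLog automates inside Mathematica.
lam = sym.symbols("lamda", positive=True)
xs = sym.symbols("x1:6", positive=True)          # a sample of five observations

likelihood = sym.prod(lam * sym.exp(-lam * xi) for xi in xs)
logL = sym.expand_log(sym.log(likelihood), force=True)   # 5*log(lam) - lam*(x1+...+x5)
print(sym.simplify(logL))

# Score equation and its root: the familiar MLE  lambda_hat = n / sum(x_i)
print(sym.solve(sym.Eq(sym.diff(logL, lam), 0), lam))
```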
Neural networks for modelling and control of a non-linear dynamic system
The authors describe the use of neural nets to model and control a nonlinear second-order electromechanical model of a drive system with varying time constants and saturation effects. A model predictive control structure is used. This is compared with a proportional-integral (PI) controller with regard to performance and robustness against disturbances. Two feedforward network types, the multilayer perceptron and radial-basis-function nets, are used to model the system. The problems involved in the transfer of connectionist theory to practice are discussed.
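For flavour, the sketch below builds the kind of feedforward one-step-ahead model the abstract describes, fitting a multilayer perceptron to data from a hypothetical surrogate of the drive system; the synthetic dynamics, layer size and training settings are all assumptions. A model-predictive controller would then optimise its future control moves against such a model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Hypothetical surrogate for the drive system: second-order dynamics with an
# input nonlinearity, excited by a random input sequence (illustration only).
n = 2000
u = rng.uniform(-2, 2, n)
y = np.zeros(n)
for k in range(2, n):
    y[k] = 0.6 * y[k - 1] - 0.1 * y[k - 2] + 0.5 * np.tanh(u[k - 1])

# NARX-style one-step-ahead model y[k] = f(y[k-1], y[k-2], u[k-1]) with an MLP
X = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
mlp = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=0).fit(X, y[2:])
print(mlp.score(X, y[2:]))   # in-sample fit; a real study would validate on held-out data
```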
Applications of inverse simulation to a nonlinear model of an underwater vehicle
Inverse simulation provides an important alternative to conventional simulation and to more formal mathematical techniques of model inversion. The application of inverse simulation methods to a nonlinear dynamic model of an unmanned underwater vehicle with actuator limits is found to give rise to a number of challenging problems. It is shown that this particular problem, in common with other applications that include hard nonlinearities in the model or discontinuities in the required trajectory, is best approached using a search-based optimization algorithm for inverse simulation in place of the more conventional Newton-Raphson approach. Results show that meaningful inverse simulation results can be obtained but that multi-solution responses exist. Although the inverse solutions are not unique, they are shown to generate the required trajectories when tested using conventional forward simulation methods.
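A minimal sketch of the search-based idea, using a hypothetical first-order model with a hard actuator limit rather than the paper's underwater-vehicle model: at each step a bounded scalar search finds the input that reproduces the demanded output, in place of a Newton-Raphson step that the saturation would upset. The dynamics, limits and demanded trajectory below are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical discrete-time model with a hard actuator limit:
#   x[k+1] = x[k] + dt * (-a*x[k] + sat(u[k]))
dt, a, u_max = 0.1, 0.8, 1.0
step = lambda x, u: x + dt * (-a * x + np.clip(u, -u_max, u_max))

def inverse_simulate(x_desired, x0=0.0):
    """For each point of the desired trajectory, search for the input that
    minimises the squared error between the model output and the target."""
    x, inputs = x0, []
    for target in x_desired:
        res = minimize_scalar(lambda u: (step(x, u) - target) ** 2,
                              bounds=(-2 * u_max, 2 * u_max), method="bounded")
        inputs.append(res.x)
        x = step(x, res.x)            # advance the model with the recovered input
    return np.array(inputs)

# Verify by forward simulation: replaying the recovered inputs should reproduce
# the demanded trajectory, up to what the actuator limit allows.
demand = 0.5 * np.sin(np.linspace(0, 2 * np.pi, 50))
u_hist = inverse_simulate(demand)
```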
Empowerment as a metric for Optimization in HCI
We propose a novel metric for optimizing human-computer interfaces, based on the information-theoretic capacity of empowerment, a task-independent universal utility measure. For agent-environment systems with stochastic transitions, empowerment measures how much influence an agent has on its environment, as far as that influence can be sensed by the agent's own sensors. It captures the uncertainty in human-machine systems arising from different sources (e.g. noise, delays, errors) as a single quantity. We suggest the potential empowerment has as an objective optimality criterion in user interface design optimization, contributing to more solid theoretical foundations for HCI.
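Empowerment is the Shannon capacity of the channel from the agent's actions to its subsequent sensor readings. The sketch below computes that capacity for a small discrete action-to-sensor channel with the Blahut-Arimoto algorithm; the toy channel matrix is an assumption for illustration, not a model taken from the paper.

```python
import numpy as np

def empowerment(p_s_given_a, n_iter=200):
    """Capacity (in bits) of the action -> sensor channel, via Blahut-Arimoto."""
    P = np.asarray(p_s_given_a, dtype=float)      # shape (n_actions, n_states)
    q = np.full(P.shape[0], 1.0 / P.shape[0])     # action distribution, start uniform

    def kl_rows(r):
        # per-action KL divergence D(P(.|a) || r), in bits, ignoring zero entries
        with np.errstate(divide="ignore", invalid="ignore"):
            logratio = np.where(P > 0, np.log2(P / r), 0.0)
        return (P * logratio).sum(axis=1), logratio

    for _ in range(n_iter):
        d, _ = kl_rows(q @ P)
        q = q * np.exp2(d)                        # Blahut-Arimoto reweighting
        q /= q.sum()
    _, logratio = kl_rows(q @ P)
    return float((q[:, None] * P * logratio).sum())

# Toy 3-action, 3-state channel: two nearly noiseless actions and one useless one
P = np.array([[0.90, 0.05, 0.05],
              [0.05, 0.90, 0.05],
              [1/3, 1/3, 1/3]])
print(empowerment(P))   # empowerment of this toy channel, in bits
```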
Model of Coordination Flow in Remote Collaborative Interaction
© 2015 IEEE. We present an information-theoretic approach for modelling coordination in human-human interaction and for measuring coordination flows in a remote collaborative tracking task. Building on Shannon's mutual information, coordination flow measures, for stochastic collaborative systems, how much influence the environment has on the joint control of the collaborating parties. We demonstrate the application of the approach on interactive human data recorded in a user study and reveal the amount of effort required for creating rigorous models. Our initial results suggest the potential coordination flow has - as an objective, task-independent measure - in supporting designers of human collaborative systems and in providing better theoretical foundations for the science of Human-Computer Interaction.
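Coordination flow builds on Shannon's mutual information. As a rough illustration only, the sketch below gives a plug-in histogram estimate of I(X; Y) in bits from paired samples, of the kind that could be applied to logged signals such as a reference trajectory and the parties' joint control; the toy data, variable names and binning are assumptions, and the paper's measure involves more structure than this.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in estimate of I(X; Y) in bits from paired samples, using a joint
    histogram (a rough estimator, adequate for a sketch)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# Stand-ins for an environment/reference signal and a joint control signal
rng = np.random.default_rng(0)
env = rng.standard_normal(5000)
joint_control = 0.7 * env + 0.3 * rng.standard_normal(5000)
print(mutual_information(env, joint_control))   # > 0 when control tracks the environment
```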
On Dependency in Double-Hurdle Models
In microeconometrics, consumption data is typically zero-inflated due to many individuals recording, for one reason or another, no consumption. A mixture model can be appropriate for statistical analysis of such data, with the Dependent Double-Hurdle model (DDH hereafter) one specification that is frequently adopted in econometric practice. Essentially, the DDH model is designed to explain individual demand through a sequential two-step process: a market participation decision (first hurdle), followed by a consumption level decision (second hurdle) - a non-zero correlation/covariance parameter allows for dependency between the hurdles. A significant feature of the majority of empirical DDH studies has been the lack of support for the existence of dependency. This empirical phenomenon is studied from a theoretical perspective using examples based on the bivariate normal, bivariate logistic, and bivariate Poisson distributions. The Fisher Information matrix for the parameters of the model is considered, especially the component corresponding to the dependency parameter. The main finding is that the DDH model contains too little statistical information to support estimation of dependency, even when dependency is truly present. Consequently, the paper calls for the elimination of attempts to estimate dependency using the DDH framework. The advantage of this strategy is that it permits flexible modelling; some possibilities are proposed.
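For reference, here is a sketch of the log-likelihood under the common bivariate-normal DDH specification (variable names and the exact parameterisation are assumptions on my part): participation d* = Z@alpha + u, latent consumption y* = X@beta + v, corr(u, v) = rho, with consumption observed only when both hurdles are cleared. The dependency parameter rho enters only through the terms below, which is where the weak Fisher information discussed in the abstract would show up.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def ddh_loglik(y, X, Z, beta, alpha, sigma, rho):
    """Log-likelihood of a Dependent Double-Hurdle model with bivariate-normal
    errors; y holds observed consumption, with zeros for non-consumers."""
    xb, za = X @ beta, Z @ alpha
    ll = np.empty(len(y))
    zero = y <= 0
    # Zero observations: 1 - P(participate AND positive desired consumption)
    cov = [[1.0, rho], [rho, 1.0]]
    p_both = np.array([multivariate_normal.cdf([a, b], mean=[0, 0], cov=cov)
                       for a, b in zip(za[zero], xb[zero] / sigma)])
    ll[zero] = np.log(1.0 - p_both)
    # Positive observations: density of y times P(participate | y)
    e = (y[~zero] - xb[~zero]) / sigma
    ll[~zero] = (norm.logpdf(e) - np.log(sigma)
                 + norm.logcdf((za[~zero] + rho * e) / np.sqrt(1.0 - rho ** 2)))
    return ll.sum()

# In practice this would be handed to a numerical optimiser, e.g.
# scipy.optimize.minimize over (beta, alpha, sigma, rho), to obtain the MLE.
```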
Adaptive, cautious, predictive control with Gaussian process priors
Nonparametric Gaussian Process models, a Bayesian statistics approach, are used to implement a nonlinear adaptive control law. Predictions, including propagation of the state uncertainty, are made over a k-step horizon. The expected value of a quadratic cost function is minimised over this prediction horizon, without ignoring the variance of the model predictions. The general method and its main features are illustrated on a simulation example.
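The paper propagates the state uncertainty over the k-step horizon; the sketch below illustrates the same idea much more crudely, by fitting a GP one-step model to data from a hypothetical plant and rolling it forward with Monte-Carlo samples so the predictive spread at the end of the horizon can be inspected. The plant, kernel, control sequence and sample count are assumptions, not the paper's analytic propagation scheme.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical one-step model y[k+1] = f(y[k], u[k]) learned from noisy samples
f = lambda y, u: 0.8 * y + 0.3 * np.tanh(u)
Xtr = rng.uniform(-1, 1, (100, 2))                       # columns: y[k], u[k]
ytr = f(Xtr[:, 0], Xtr[:, 1]) + 0.02 * rng.standard_normal(100)
gp = GaussianProcessRegressor(RBF([0.5, 0.5]) + WhiteKernel(1e-3)).fit(Xtr, ytr)

def rollout(y0, controls, n_samples=500):
    """Monte-Carlo propagation: sample trajectories over the horizon, feeding
    each sampled output back into the GP as the next state."""
    ys = np.full(n_samples, y0)
    for u in controls:                                   # k-step horizon
        mu, sd = gp.predict(np.column_stack([ys, np.full(n_samples, u)]),
                            return_std=True)
        ys = rng.normal(mu, sd)                          # draw from the GP predictive
    return ys.mean(), ys.std()                           # horizon-end mean and spread

print(rollout(0.0, controls=[0.5, 0.5, -0.2]))
```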