Iterative Unbiased FIR State Estimation: A Review of Algorithms
In this paper, we develop in part and review various iterative unbiased finite impulse response (UFIR) algorithms (both direct and two-stage) for the filtering, smoothing, and prediction of time-varying and time-invariant discrete state-space models in white Gaussian noise environments. The distinctive property of UFIR algorithms is that noise statistics are completely ignored. Instead, an optimal window size is required for optimal performance. We show that the optimal window size can be determined via measurements with no reference. UFIR algorithms are computationally more demanding than Kalman filters, but this extra computational effort can be alleviated with parallel computing, and the extra memory that is required is not a problem for modern computers. Under real-world operating conditions with uncertainties, non-Gaussian noise, and unknown noise statistics, the UFIR estimator generally demonstrates better robustness than the Kalman filter, even with suboptimal window size. In applications requiring large window size, the UFIR estimator is also superior to the best previously known optimal FIR estimators.
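The window-based, statistics-free character of UFIR estimation can be illustrated with a minimal batch sketch in Python. The constant-velocity model, window size, and noise level below are illustrative assumptions, not taken from the paper, and a full iterative UFIR algorithm is more elaborate than this:

```python
import numpy as np

# Illustrative constant-velocity model: x_{k+1} = A x_k, y_k = C x_k + v_k.
# No noise covariances appear anywhere: that is the defining UFIR property.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
C = np.array([[1.0, 0.0]])

def ufir_batch(y_window, A, C):
    """Batch unbiased FIR estimate of the state at the end of the window.

    Stacks y_{m+i} = C A^i x_m over the window, solves least squares for
    the window-start state x_m, then propagates it to the last step.
    """
    N = len(y_window)
    H = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(N)])
    x_m, *_ = np.linalg.lstsq(H, np.asarray(y_window).reshape(N, -1), rcond=None)
    return np.linalg.matrix_power(A, N - 1) @ x_m

rng = np.random.default_rng(0)
# True trajectory: position k, velocity 1; measurements are noisy positions.
y = [k + 0.5 * rng.standard_normal() for k in range(20)]
x_hat = ufir_batch(y, A, C)  # state estimate at the last step of the window
```

Only the model matrices and the window size enter the estimate; tuning the window size replaces tuning covariances, which is the source of the robustness discussed above.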
Improved Receding Horizon Fourier Analysis for Quasi-periodic Signals
In this paper, an efficient short-time Fourier analysis method for quasi-periodic signals is proposed via an optimal fixed-lag finite impulse response (FIR) smoother approach using a receding horizon scheme. In order to deal with time-varying Fourier coefficients (FCs) of quasi-periodic signals, a state space model including FCs as state variables is augmented with the variants of FCs. Through an optimal fixed-lag FIR smoother, FCs and their increments are estimated simultaneously and combined to produce final estimates. The lag size of the optimal fixed-lag FIR smoother is chosen to minimize the estimation error. Since the proposed estimation scheme carries out the correction process with the estimated variants of FCs, a smaller estimation error is highly likely compared with existing approaches that do not make use of such a process. It is shown through numerical simulation that the proposed scheme has better tracking ability for estimating time-varying FCs than existing ones.
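The augmentation described above, with FCs as state variables extended by their increments so the smoother can track time variation, can be sketched as follows. The function name, harmonic parameterisation, and matrix layout are illustrative assumptions, not the paper's notation:

```python
import numpy as np

def augmented_fc_model(n_harmonics, omega, t):
    """Build augmented state-space matrices for time-varying Fourier
    coefficients.

    State = [FCs; increments of FCs]: each FC accumulates its own
    increment at every step, while the increments follow a random walk
    (process noise omitted from this sketch).
    """
    m = 2 * n_harmonics + 1                        # a0 plus (a_k, b_k) pairs
    F = np.block([[np.eye(m), np.eye(m)],          # FC_{t+1} = FC_t + dFC_t
                  [np.zeros((m, m)), np.eye(m)]])  # dFC_{t+1} = dFC_t
    h = [1.0]
    for k in range(1, n_harmonics + 1):
        h += [np.cos(k * omega * t), np.sin(k * omega * t)]
    # Measurement row: sinusoids pick off the FCs; increments are unobserved.
    H = np.hstack([np.array(h), np.zeros(m)]).reshape(1, 2 * m)
    return F, H

F, H = augmented_fc_model(n_harmonics=2, omega=2 * np.pi / 50, t=3)
```

Running any FIR smoother (or a Kalman smoother, for comparison) on this augmented model estimates the FCs and their increments jointly, which is the correction process the abstract refers to.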
Diagnostic checking and intra-daily effects in time series models
A variety of topics on the statistical analysis of time
series are addressed in this thesis. The main emphasis is on the
state space methodology and, in particular, on structural time
series (STS) models. There are now many applications of STS models
in the literature and they have proved to be very successful.
The keywords of this thesis range from Kalman filtering,
smoothing and diagnostic checking to time-varying cubic splines
and intra-daily effects. Five separate studies are carried out for
this research project and they are reflected in Chapters 2 to 6.
All studies concern time series models which are placed in the state
space form (SSF) so that the Kalman filter (KF) can be applied for
estimation. The SSF and the KF play a central role in time series
analysis that can be compared with the important role of the
regression model and the method of least squares estimation in
econometrics. Chapter 2 gives an overview of the latest developments
in the state space methodology including diffuse likelihood
evaluation, stable calculations, etc.
Smoothing algorithms evaluate the full sample estimates of
unobserved components in time series models. New smoothing
algorithms are developed for the state and the disturbance vector of
the SSF which are computationally efficient and outperform existing
methods. Chapter 3 discusses the existing and the new smoothing
algorithms with an emphasis on theory, algorithms and practical
implications. The new smoothing results pave the way to use
auxiliary residuals, that is full sample estimates of the
disturbances, for diagnostic checking of unobserved components time
series models. Chapter 4 develops test statistics for auxiliary
residuals and it presents applications showing how they can be used
to detect and distinguish between outliers and structural change.
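For orientation, the filter-then-smooth pipeline these chapters build on can be sketched for the simplest structural model, the local level model. This textbook recursion is only a baseline; the thesis's new algorithms are more general and more computationally efficient:

```python
import numpy as np

def kalman_smooth_local_level(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e7):
    """Kalman filter plus fixed-interval smoother for the local level model
    y_t = mu_t + eps_t, mu_{t+1} = mu_t + eta_t (a minimal SSF sketch with
    a diffuse-like initial variance p0)."""
    n = len(y)
    a = np.zeros(n + 1); p = np.zeros(n + 1)
    a[0], p[0] = a0, p0
    v = np.zeros(n); f = np.zeros(n); k = np.zeros(n)
    for t in range(n):                     # forward (filter) pass
        v[t] = y[t] - a[t]                 # innovation
        f[t] = p[t] + sigma_eps2           # innovation variance
        k[t] = p[t] / f[t]                 # Kalman gain
        a[t + 1] = a[t] + k[t] * v[t]
        p[t + 1] = p[t] * (1 - k[t]) + sigma_eta2
    r = 0.0
    alpha = np.zeros(n)
    for t in reversed(range(n)):           # backward (smoothing) pass
        r = v[t] / f[t] + (1 - k[t]) * r
        alpha[t] = a[t] + p[t] * r         # full-sample (smoothed) state
    return alpha

y = np.array([4.1, 3.9, 4.3, 5.2, 5.0, 4.8])
mu_hat = kalman_smooth_local_level(y, sigma_eps2=1.0, sigma_eta2=0.1)
```

The smoothed states use the full sample at every time point; the smoothed disturbance estimates derived from the same backward pass are the auxiliary residuals used for diagnostic checking in Chapter 4.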
A cubic spline is a piecewise polynomial function of degree three
which is regularly used for interpolation and curve-fitting. It has also
been applied to piecewise regressions, density approximations, etc.
Chapter 5 develops the cubic spline further by allowing it to vary
over time and by introducing it into time series models. These
time-varying cubic splines are an efficient way of handling slowly
changing periodic movements in time series.
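Before the time-varying extension, it may help to see an ordinary (time-invariant) natural cubic spline computed directly. This is a standard textbook construction, not the Chapter 5 method:

```python
import numpy as np

def natural_cubic_spline(x, y, xq):
    """Evaluate the natural cubic spline through knots (x, y) at points xq:
    solve a linear system for the second derivatives at the knots, then
    evaluate the cubic pieces."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    h = np.diff(x)
    A = np.zeros((n, n)); rhs = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0                     # natural end conditions
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = h[i-1], 2*(h[i-1]+h[i]), h[i]
        rhs[i] = 6 * ((y[i+1]-y[i])/h[i] - (y[i]-y[i-1])/h[i-1])
    M = np.linalg.solve(A, rhs)                   # second derivatives
    idx = np.clip(np.searchsorted(x, xq) - 1, 0, n - 2)
    d = xq - x[idx]; hi = h[idx]
    return (M[idx]*(x[idx+1]-xq)**3 + M[idx+1]*d**3) / (6*hi) \
        + (y[idx]/hi - M[idx]*hi/6) * (x[idx+1]-xq) \
        + (y[idx+1]/hi - M[idx+1]*hi/6) * d

xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([0.0, 1.0, 0.0, 1.0])
vals = natural_cubic_spline(xs, ys, np.array([0.0, 1.0, 1.5, 3.0]))
```

The time-varying version of Chapter 5 lets the knot values evolve over time inside a state space model, so the spline's shape adapts to slowly changing periodic patterns.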
This method for modelling a changing periodic pattern is
applied in a structural time series model used to forecast hourly
electricity load demand, with the periodic movements being
intra-daily or intra-weekly. The full model contains other components,
including a temperature response which is also modelled using cubic
splines. A statistical computer package (SHELF) is developed to
produce, at any time, hourly load forecasts three days ahead.
Covariates and latents in growth modelling
Ph.D. University of KwaZulu-Natal, Pietermaritzburg, 2014. Growth curve models are natural models for the increment processes
taking place gradually over time. When individuals are observed over time it
is often apparent that they grow at different rates, even though they are
clones and no differences in treatment or environment are present.
Nevertheless, the classical growth curve model only deals with the average
growth and does not account for individual differences, nor does it have
room to accommodate covariates. Accordingly we strive to construct and
investigate tractable models which incorporate both individual effects and
covariates.
The study was motivated by plantations of fast growing tree species, and the
climatic and genetic factors that influence stem radial growth of juvenile
Eucalyptus hybrids grown on the east coast of South Africa. Measurement
of stem radius was conducted using dendrometers on eighteen sampled
trees of two Eucalyptus hybrid clones (E. grandis × E. urophylla, GU, and
E. grandis × E. camaldulensis, GC). Information on climatic data
(temperature, rainfall, solar radiation, relative humidity and wind speed)
was simultaneously collected from the study site.
We explored various functional statistical models which are able to handle
the growth, individual traits, and covariates. These models include partial
least squares approaches, principal component regression, path models,
fractional polynomial models, nonlinear mixed models and additive mixed
models. Each one of these models has strengths and weaknesses.
Application of these models is carried out by analysing the stem radial
growth data.
The partial least squares and principal component regression methods were
used to identify the most important predictors of stem radial growth. The
path models approach was then applied mainly to find indirect effects of
climatic factors. We further explored the tree-specific effects that are unique
to a particular tree under study by fitting a fractional polynomial model in
the context of linear mixed effects model. The fitted fractional polynomial
model showed that the relationship between stem radius and tree age is
nonlinear. The performance of fractional polynomial models was compared
with that of nonlinear mixed effects models.
Using nonlinear mixed effects models some growth parameters like inflection
points were estimated. Moreover, the fractional polynomial model fit was
almost as good as the nonlinear growth curves. Consequently, the fractional
polynomial model fit was extended to include the effects of all climatic
variables. Furthermore, the parametric methods do not allow the data to
decide the most suitable form of the functions. In order to capture the main
features of the longitudinal profiles in a more flexible way, a semiparametric
approach was adopted. Specifically, the additive mixed models
were used to model the effect of tree age as well as the effect of each climatic
factor.
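The fractional polynomial idea used above can be sketched in a few lines: try each candidate power from the usual set, regress, and keep the best fit. The helper name and data are illustrative; the thesis's models extend this with mixed effects and climatic covariates:

```python
import numpy as np

def fit_fp1(x, y, powers=(-2, -1, -0.5, 0, 0.5, 1, 2, 3)):
    """First-degree fractional polynomial fit: for each power p, regress
    y on x**p (log x when p == 0) and keep the power with the smallest
    residual sum of squares."""
    best = None
    for p in powers:
        z = np.log(x) if p == 0 else x ** p
        X = np.column_stack([np.ones_like(x), z])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        if best is None or rss < best[0]:
            best = (rss, p, beta)
    return best[1], best[2]

x = np.linspace(1, 10, 50)
y = 2.0 + 3.0 * np.sqrt(x)        # synthetic growth-like curve
p, beta = fit_fp1(x, y)           # recovers the square-root shape
```

Second-degree fractional polynomials add a second power term in the same way, which is how curved age trends such as the nonlinear stem-radius relationship can be captured within a linear mixed-effects framework.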
Zonal And Regional Load Forecasting In The New England Wholesale Electricity Market: A Semiparametric Regression Approach
Power system planning, reliability analysis and economically efficient capacity scheduling all rely heavily on electricity demand forecasting models. In the context of a deregulated wholesale electricity market, scheduling a region's bulk electricity generation is inherently linked to future values of demand. Predictive models are used by municipalities and suppliers to bid into the day-ahead market and by utilities in order to arrange contractual interchanges among neighboring utilities. These numerical predictions are therefore pervasive in the energy industry.
This research seeks to develop a regression-based forecasting model. Specifically, electricity demand is modeled as a function of calendar effects, lagged demand effects, weather effects, and a stochastic disturbance. Variables such as temperature, wind speed, cloud cover and humidity are known to be among the strongest predictors of electricity demand and as such are used as model inputs. It is well known, however, that the relationship between demand and weather can be highly nonlinear. Rather than assuming a linear functional form, the structural change in these relationships is explored. Those variables that indicate a nonlinear relationship with demand are accommodated with penalized splines in a semiparametric regression framework. The equivalence between penalized splines and a special case of the mixed model formulation allows for model estimation with currently available statistical packages such as R, STATA and SAS.
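A minimal sketch of a penalized spline fit, using a spline basis with a ridge penalty on the knot coefficients (the structure that admits the mixed-model estimation route), is shown below. The truncated-line basis, knot placement, and tuning values are all illustrative assumptions:

```python
import numpy as np

def pspline_fit(x, y, n_knots=10, lam=1.0):
    """Penalized spline via a truncated-line basis with a ridge penalty on
    the knot coefficients only; the unpenalized intercept and slope play
    the role of fixed effects in the mixed-model formulation."""
    knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
    X = np.column_stack([np.ones_like(x), x] +
                        [np.clip(x - k, 0, None) for k in knots])
    D = np.diag([0.0, 0.0] + [1.0] * n_knots)   # penalize only knot terms
    beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
    return X @ beta

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(200)
fitted = pspline_fit(x, y, n_knots=12, lam=0.1)
```

In the mixed-model reading, the penalized knot coefficients are random effects and the penalty weight corresponds to a variance ratio, which is why standard mixed-model software can estimate the smoothing amount from the data.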
Historical data are available for the entire New England region as well as for the smaller zones that collectively make up the regional grid. As such, a secondary research objective of this thesis is to explore whether an aggregation of zonal forecasts might perform better than forecasts produced from a single regional model. Prior to this research, neither the applicability of a semiparametric regression-based approach to load forecasting nor the potential improvement in forecasting performance resulting from zonal load forecasting had been investigated for the New England wholesale electricity market.
Statistical methods for NHS incident reporting data
The National Reporting and Learning System (NRLS) is the English and Welsh NHS's national repository of incident reports from healthcare. It aims to capture details of incident reports, at national level, and facilitate clinical review and learning to improve patient safety. These incident reports range from minor 'near-misses' to critical incidents that may lead to severe harm or death. NRLS data are currently reported as crude counts and proportions, but their major use is clinical review of the free-text descriptions of incidents. There are few well-developed quantitative analysis approaches for NRLS, and this thesis investigates these methods. A literature review revealed a wealth of clinical detail, but also systematic constraints of NRLS's structure, including non-mandatory reporting, missing data and misclassification. Summary statistics for reports from 2010/11 to 2016/17 supported this and suggest NRLS was not suitable for statistical modelling in isolation. Modelling methods were advanced by creating a hybrid dataset using other sources of hospital casemix data from Hospital Episode Statistics (HES). A theoretical model was established, based on 'exposure' variables (using casemix proxies), and 'culture' as a random effect. The initial modelling approach examined Poisson regression, mixture and multilevel models. Overdispersion was significant, generated mainly by clustering and aggregation in the hybrid dataset, but models were chosen to reflect these structures. Further modelling approaches were examined, using Generalized Additive Models to smooth predictor variables, regression tree-based models including Random Forests, and Artificial Neural Networks. Models were also extended to examine a subset of death and severe harm incidents, exploring how sparse counts affect models. Text mining techniques were examined for analysis of incident descriptions and showed how term frequency might be used.
Terms were used to generate latent topic models, used in turn to predict the harm level of incidents. Model outputs were used to create a 'Standardised Incident Reporting Ratio' (SIRR) and cast this in the mould of current regulatory frameworks, using process control techniques such as funnel plots and cusum charts. A prototype online reporting tool was developed to allow NHS organisations to examine their SIRRs, provide supporting analyses, and link data points back to individual incident reports.
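The SIRR construction, an observed-over-expected ratio judged against funnel-plot style limits, can be sketched as follows. The Poisson normal-approximation limits used here are one common choice and not necessarily the thesis's exact formula:

```python
import numpy as np

def sirr_with_limits(observed, expected, z=2.0):
    """Standardised incident reporting ratio O/E with approximate
    funnel-plot control limits around 1.0, using the Poisson normal
    approximation sd(O/E) = sqrt(E)/E (an illustrative choice)."""
    sirr = observed / expected
    half_width = z * np.sqrt(expected) / expected
    return sirr, 1.0 - half_width, 1.0 + half_width

# A unit reporting 130 incidents against a casemix-based expectation of 100:
sirr, lo_limit, hi_limit = sirr_with_limits(observed=130.0, expected=100.0)
```

Plotting each organisation's SIRR against its expected count, with the limits narrowing as the expected count grows, gives the familiar funnel: points outside the funnel (as in the example above) are flagged for review rather than ranked.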
The Telecommunications and Data Acquisition Report
Tracking and ground-based navigation; communications, spacecraft-ground; station control and system technology; capabilities for new projects; networks consolidation program; and network sustaining are described