
    Untenable nonstationarity: An assessment of the fitness for purpose of trend tests in hydrology

    The detection and attribution of long-term patterns in hydrological time series have been important research topics for decades. A significant portion of the literature regards such patterns as ‘deterministic components’ or ‘trends’, even though the complexity of hydrological systems does not allow easy deterministic explanations and attributions. Consequently, trend estimation techniques have been developed to make and justify statements about tendencies in the historical data, which are often used to predict future events. Testing trend hypotheses on observed time series is widespread in the hydro-meteorological literature, mainly because of the interest in detecting consequences of human activities on the hydrological cycle. This analysis usually relies on the application of null hypothesis significance tests (NHSTs) for slowly varying and/or abrupt changes, such as Mann-Kendall, Pettitt, or similar, to summary statistics of hydrological time series (e.g., annual averages, maxima, minima, etc.). However, the reliability of this application has seldom been explored in detail. This paper discusses misuse, misinterpretation, and logical flaws of NHST for trends in the analysis of hydrological data from three different points of view: historic-logical, semantic-epistemological, and practical. Based on a review of the NHST rationale and of basic statistical definitions of stationarity, nonstationarity, and ergodicity, we show that even if the empirical estimation of trends in hydrological time series is always feasible from a numerical point of view, it is uninformative and does not allow the inference of nonstationarity without assuming a priori additional information on the underlying stochastic process, according to deductive reasoning. This prevents the use of trend NHST outcomes to support nonstationary frequency analysis and modeling. We also show that the correlation structures characterizing hydrological time series might easily be underestimated, further compromising the attempt to draw conclusions about trends spanning the period of record. Moreover, even though adjusting procedures accounting for correlation have been developed, some of them are insufficient or are applied only to some tests, while others are theoretically flawed but still widely applied. In particular, using 250 unimpacted stream flow time series across the conterminous United States (CONUS), we show that the test results can dramatically change if the sequences of annual values are reproduced starting from daily stream flow records, whose larger sizes enable a more reliable assessment of the correlation structures.
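
    Since the abstract centers on applying the Mann-Kendall test to annual summary statistics, a minimal Python sketch of that test may help fix ideas. It uses the normal approximation without tie correction, and the 40-year gamma-distributed daily record is a purely illustrative assumption, not data from the study.

        # Minimal Mann-Kendall trend test on an annual summary series (no tie correction).
        import numpy as np
        from scipy.stats import norm

        def mann_kendall(x):
            """Return the MK S statistic, Z score, and two-sided p-value for series x."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            # S = number of increasing pairs minus number of decreasing pairs
            s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0        # variance of S under H0, no ties
            z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
            p_value = 2 * (1 - norm.cdf(abs(z)))
            return s, z, p_value

        # Hypothetical example: annual means derived from a synthetic daily record,
        # echoing the paper's point that the annual series hides the daily correlation structure.
        rng = np.random.default_rng(0)
        daily = rng.gamma(shape=2.0, scale=10.0, size=40 * 365)   # 40 years of synthetic daily flow
        annual = daily.reshape(40, 365).mean(axis=1)
        print(mann_kendall(annual))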

    Wavelet-based filtration procedure for denoising the predicted CO2 waveforms in smart home within the Internet of Things

    The operating costs of smart homes can be minimized by optimizing the management of the building's technical functions based on the current occupancy status of the individual monitored spaces of a smart home. To respect the privacy of the smart home residents, indirect methods (without using cameras and microphones) can be used for occupancy recognition of spaces in smart homes. This article describes a newly proposed indirect method to increase the accuracy of the occupancy recognition of monitored spaces of smart homes. The proposed procedure uses the prediction of the course of CO2 concentration from operationally measured quantities (indoor temperature and indoor relative humidity) using artificial neural networks with a multilayer perceptron algorithm. The mathematical wavelet transform method is used for additive noise canceling in the predicted CO2 concentration signal, with the objective of increasing the accuracy of the prediction. The calculated accuracy of CO2 concentration waveform prediction in the additive noise-canceling application was higher than 98% in selected experiments.
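
    As a rough illustration of the pipeline described above, the following hedged Python sketch predicts a CO2 waveform from indoor temperature and relative humidity with a multilayer perceptron and then denoises the prediction with a wavelet threshold. The synthetic data, the 'db4' wavelet, and the universal soft threshold are assumptions chosen for illustration, not the paper's exact settings.

        import numpy as np
        import pywt
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(1)
        n = 2000
        temp = 21 + 2 * np.sin(np.linspace(0, 20, n)) + rng.normal(0, 0.1, n)   # indoor temperature
        rh = 45 + 5 * np.cos(np.linspace(0, 15, n)) + rng.normal(0, 0.5, n)     # indoor relative humidity
        co2 = 400 + 20 * (temp - 21) + 3 * (rh - 45) + rng.normal(0, 25, n)     # synthetic CO2 target

        # Multilayer perceptron predicting the CO2 course from the two measured quantities
        X = np.column_stack([temp, rh])
        mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
        mlp.fit(X, co2)
        co2_pred = mlp.predict(X)                                  # noisy predicted waveform

        # Wavelet denoising: soft-threshold the detail coefficients (universal threshold)
        coeffs = pywt.wavedec(co2_pred, 'db4', level=4)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745             # noise estimate from finest level
        thr = sigma * np.sqrt(2 * np.log(len(co2_pred)))
        coeffs[1:] = [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
        co2_denoised = pywt.waverec(coeffs, 'db4')[:n]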

    Nonlinear Error Correction: The Case of Money Demand in the UK (1878-2000).

    This paper explores single-equation nonlinear error correction (NEC) models with linear and nonlinear cointegrated variables. Within the class of semiparametric NEC models, we use smoothing splines. Within the class of parametric models, we discuss the interesting properties of cubic polynomial NEC models and show how they can be used to identify unknown threshold points in asymmetric models and to check the stability properties of the long-run equilibrium. A new class of rational polynomial NEC models is also introduced. We find multiple long-run money demand equilibria. The stability observed in the money-demand parameter estimates over more than a century, 1878 to 2000, is remarkable.
    Keywords: Money Demand; Nonlinear Error Correction; Cubic Polynomials; Rational Polynomials; Smoothing Splines; Nonlinear Cointegration.
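
    A minimal sketch of the parametric idea, assuming a cubic polynomial in the lagged equilibrium error and purely synthetic data (the names m and y stand in for money and income; none of this reproduces the paper's dataset or estimates):

        # Single-equation NEC regression with a cubic polynomial error-correction term:
        #   dm_t = a0 + b1*u_{t-1} + b2*u_{t-1}^2 + b3*u_{t-1}^3 + c1*dy_t + e_t,  with u_t = m_t - y_t
        import numpy as np

        rng = np.random.default_rng(2)
        T = 300
        y = np.cumsum(rng.normal(0, 1, T))                     # I(1) "income" series
        m = np.zeros(T)                                        # "money" adjusts nonlinearly toward y
        for t in range(1, T):
            u_prev = m[t - 1] - y[t - 1]                       # lagged equilibrium error
            dm_t = 0.1 - 0.3 * u_prev - 0.2 * u_prev**3 + 0.5 * (y[t] - y[t - 1]) + rng.normal(0, 0.2)
            m[t] = m[t - 1] + dm_t

        dm = np.diff(m)
        dy = np.diff(y)
        u_lag = (m - y)[:-1]
        X = np.column_stack([np.ones(T - 1), u_lag, u_lag**2, u_lag**3, dy])
        beta, *_ = np.linalg.lstsq(X, dm, rcond=None)          # OLS estimate of the cubic NEC
        print(beta)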

    Single channel nonstationary signal separation using linear time-varying filters

    Data-based analysis of extreme events: inference, numerics and applications

    The concept of extreme events describes the above-average behavior of a process, for instance, heat waves in climate or weather research, earthquakes in geology, and financial crashes in economics. Studying the behavior of extremes is important in order to reduce their negative impacts. Key objectives include the identification of the appropriate mathematical/statistical model, description of the underlying dependence structure in the multivariate or the spatial case, and the investigation of the most relevant external factors. Extreme value analysis (EVA), based on Extreme Value Theory, provides the necessary statistical tools. Assuming that all relevant covariates are known and observed, EVA often deploys statistical regression analysis to study the changes in the model parameters. Modeling of the dependence structure implies a priori assumptions such as Gaussian, locally stationary, or isotropic behavior. Based on EVA and advanced time-series analysis methodology, this thesis introduces a semiparametric, nonstationary and non-homogeneous framework for statistical regression analysis of spatio-temporal extremes. The regression analysis accounts explicitly for systematically missing covariates; their influence is reduced to an additive nonstationary offset. The nonstationarity is resolved by the Finite Element Time Series Analysis Methodology (FEM). FEM approximates the underlying nonstationarity by a set of locally stationary models and a nonstationary hidden switching process with bounded variation (BV). The resulting FEM-BV-EVA approach goes beyond the a priori assumptions of standard methods based, for instance, on Bayesian statistics, Hidden Markov Models, or Local Kernel Smoothing. The multivariate/spatial extension of FEM-BV-EVA describes the underlying spatial variability by the model parameters, referring to hierarchical modeling. The spatio-temporal behavior of the model parameters is approximated by locally stationary models and a spatial nonstationary switching process. Further, it is shown that the resulting spatial FEM-BV-EVA formulation is consistent with the max-stability postulate and describes the underlying dependence structure in a nonparametric way. The proposed FEM-BV-EVA methodology was integrated into the existing FEM MATLAB toolbox. The FEM-BV-EVA framework is computationally efficient, as it deploys gradient-free MCMC-based optimization methods and numerical solvers for constrained, large, structured quadratic and linear problems. To demonstrate its performance, FEM-BV-EVA was applied to various test cases and real data and compared to standard methods. It was shown that parametric approaches lead to biased results if significant covariates are unresolved. Comparison to nonparametric methods based on smoothing regression revealed their weaknesses: the locality property and the inability to resolve discontinuous functions. Spatial FEM-BV-EVA was applied to study the dynamics of extreme precipitation over Switzerland. The analysis identified, among other findings, three major spatially dependent regions.
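
    The FEM-BV-EVA framework itself is a MATLAB toolbox and is not reproduced here; as a small, hedged sketch of one of its building blocks, the following Python code fits a GEV model to block maxima with a location parameter that depends linearly on a covariate, which is the simplest form of the nonstationary regression the abstract refers to. The synthetic maxima and the linear trend in the location parameter are illustrative assumptions, not the thesis' method or data.

        import numpy as np
        from scipy.stats import genextreme
        from scipy.optimize import minimize

        n = 80
        z = np.linspace(0.0, 1.0, n)                           # covariate (e.g., normalized time)
        maxima = genextreme.rvs(c=-0.1, loc=10 + 3 * z, scale=2.0, size=n, random_state=3)

        def neg_log_lik(params):
            """Negative log-likelihood of a GEV with location mu_t = mu0 + mu1 * z_t."""
            mu0, mu1, log_sigma, xi = params
            mu = mu0 + mu1 * z
            # scipy's shape convention: c = -xi
            return -genextreme.logpdf(maxima, c=-xi, loc=mu, scale=np.exp(log_sigma)).sum()

        res = minimize(neg_log_lik, x0=[10.0, 0.0, np.log(2.0), 0.05], method='Nelder-Mead')
        mu0_hat, mu1_hat, log_sigma_hat, xi_hat = res.x
        print('estimated trend in the location parameter:', mu1_hat)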

    Neural networks in economic modelling:An empirical study

    This dissertation addresses the statistical aspects of neural networks and their usability for solving problems in economics and finance. Neural networks are discussed in a framework of modelling which is generally accepted in econometrics. Within this framework a neural network is regarded as a statistical technique that implements a model-free regression strategy. Model-free regression seems particularly useful in situations where economic theory cannot provide sensible model specifications. Neural networks are applied in three case studies: modelling house prices; predicting the production of new mortgage loans; and predicting foreign exchange rates. From these case studies it is concluded that neural networks are a valuable addition to the econometrician's toolbox, but that they are no panacea.

    Data-based mechanistic modelling, forecasting, and control.

    This article briefly reviews the main aspects of the generic data-based mechanistic (DBM) approach to modeling stochastic dynamic systems and shows how it is being applied to the analysis, forecasting, and control of environmental and agricultural systems. The main advantage of this inductive approach to modeling is its wide range of applicability: it can be used to model linear, nonstationary, and nonlinear stochastic systems, and its exploitation of recursive estimation means that the modeling results are useful for both online and offline applications. To demonstrate the practical utility of the various methodological tools that underpin the DBM approach, the article also outlines several typical, practical examples in the area of environmental and agricultural systems analysis, where DBM models have formed the basis for simulation model reduction, control system design, and forecasting.
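
    As a generic illustration of the recursive estimation that the DBM approach exploits (not the specific estimation algorithms used in the DBM literature), the following Python sketch applies recursive least squares to a first-order discrete-time transfer function model, updating the parameter estimates one sample at a time so the same machinery serves online and offline use. The model, data, and initialization are illustrative assumptions.

        # Recursive least squares for y_t = a*y_{t-1} + b*u_{t-1} + e_t
        import numpy as np

        rng = np.random.default_rng(5)
        T = 500
        u = rng.normal(size=T)                         # input signal
        y = np.zeros(T)
        for t in range(1, T):
            y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1] + rng.normal(0, 0.05)

        theta = np.zeros(2)                            # parameter estimates [a, b]
        P = np.eye(2) * 1000.0                         # large initial covariance
        for t in range(1, T):
            phi = np.array([y[t - 1], u[t - 1]])       # regressor vector
            k = P @ phi / (1.0 + phi @ P @ phi)        # gain vector
            theta = theta + k * (y[t] - phi @ theta)   # update with the one-step prediction error
            P = P - np.outer(k, phi @ P)               # covariance update

        print(theta)                                   # estimates should approach [0.8, 0.5]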