
    Mathematical programming for piecewise linear regression analysis

    In data mining, regression analysis is a computational tool that predicts continuous output variables from a number of independent input variables by approximating their complex inner relationship. A large number of methods based on various methodologies have been successfully proposed, including linear regression, support vector regression, neural networks, and piecewise regression. Existing piecewise regression methods in the literature are usually restricted to problems of very small scale because of their inherently non-linear nature. In this work, a more efficient piecewise linear regression method is introduced, based on a novel integer linear programming formulation. The proposed method partitions one input variable into multiple mutually exclusive segments and fits one multivariate linear regression function per segment to minimise the total absolute error. Assuming both the single partition feature and the number of regions are known, a mixed integer linear model is proposed to simultaneously determine the locations of the multiple break-points and the regression coefficients of each segment. Furthermore, an efficient heuristic procedure is presented to identify the key partition feature and the final number of break-points. Seven real-world problems covering several application domains are used to demonstrate the efficiency of the proposed method. It is shown that the proposed piecewise regression method can be solved to global optimality for datasets of thousands of samples, and that it consistently achieves higher prediction accuracy than a number of state-of-the-art regression methods. Another advantage of the proposed method is that the learned model can be conveniently expressed as a small number of easily interpretable if-then rules. Overall, this work proposes an efficient rule-based multivariate regression method based on piecewise functions and achieves better prediction performance than state-of-the-art approaches. This novel method can benefit expert systems in various applications by automatically acquiring knowledge from databases to improve the quality of the knowledge base.
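
    As a minimal, illustrative sketch of the underlying idea (not the paper's mixed integer formulation, and with hypothetical function and variable names): if the partition feature is known and only a single breakpoint is sought, one can enumerate candidate breakpoints, fit an ordinary least-squares model per segment, and keep the split with the lowest total absolute error.

```python
import numpy as np

def piecewise_fit(x, X, y):
    """Brute-force one-breakpoint sketch (hypothetical helper, not the
    paper's MILP): partition on feature x, fit one least-squares model
    per segment over all features X, minimise total absolute error."""
    best_err, best_b = np.inf, None
    for b in np.unique(x)[1:-1]:            # interior breakpoint candidates
        mask = x <= b
        if mask.sum() < 2 or (~mask).sum() < 2:
            continue                        # segment too small to fit
        err = 0.0
        for m in (mask, ~mask):
            A = np.column_stack([X[m], np.ones(m.sum())])
            coef, *_ = np.linalg.lstsq(A, y[m], rcond=None)
            err += np.abs(A @ coef - y[m]).sum()
        if err < best_err:
            best_err, best_b = err, b
    return best_err, best_b
```

    The paper's contribution is precisely to avoid this enumeration: the mixed integer linear program places several break-points and fits all segment coefficients in a single optimisation.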

    Fuzzy Piecewise Linear Regression

    Fuzzy regression using possibilistic concepts allows the identification of models from uncertain data sets. However, some limitations remain concerning the possible evolution of the output spread with respect to the inputs. We present here a modified form of fuzzy linear model whose output can exhibit any kind of spread tendency. The formulation of the linear program used to identify the model introduces a modified criterion that assesses the model fuzziness independently of the collected data. These concepts are used in a global identification process in charge of building a piecewise model able to represent every kind of output evolution.
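
    For context, the classic Tanaka-style possibilistic regression that this line of work modifies can be written as a linear program: minimise the total output spread subject to every observation lying inside the fuzzy output interval. A sketch of that baseline (function name and data layout are illustrative, and this is not the modified criterion of the abstract):

```python
import numpy as np
from scipy.optimize import linprog

def tanaka_fuzzy_regression(X, y):
    """Sketch of Tanaka-style possibilistic linear regression: find
    symmetric triangular coefficients (centres a_j, spreads c_j >= 0)
    whose fuzzy output interval covers every observation while the
    total spread  sum_i sum_j c_j |x_ij|  is minimised."""
    X = np.column_stack([np.ones(len(X)), X])    # add intercept term
    n, p = X.shape
    absX = np.abs(X)
    # decision vector z = [a (free), c (>= 0)]
    cost = np.concatenate([np.zeros(p), absX.sum(axis=0)])
    # coverage:  X a - |X| c <= y   and   -(X a) - |X| c <= -y
    A_ub = np.block([[X, -absX], [-X, -absX]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * p + [(0, None)] * p
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p], res.x[p:]                  # centres a, spreads c
```

    One known limitation of this baseline, which the abstract addresses, is that the spread can only grow with the magnitude of the inputs rather than follow an arbitrary tendency.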

    Piecewise Linear Control Systems

    This thesis treats analysis and design of piecewise linear control systems. Piecewise linear systems capture many of the most common nonlinearities in engineering systems, and they can also be used for approximation of other nonlinear systems. Several aspects of linear systems with quadratic constraints are generalized to piecewise linear systems with piecewise quadratic constraints. It is shown how uncertainty models for linear systems can be extended to piecewise linear systems, and how these extensions give insight into the classical trade-offs between fidelity and complexity of a model. Stability of piecewise linear systems is investigated using piecewise quadratic Lyapunov functions, which are much more powerful than the commonly used quadratic Lyapunov functions. It is shown how piecewise quadratic Lyapunov functions can be computed via convex optimization in terms of linear matrix inequalities. The computations are based on a compact parameterization of continuous piecewise quadratic functions and conditional analysis using the S-procedure. A unifying framework for computation of a variety of Lyapunov functions via convex optimization is established based on this parameterization. Systems with attractive sliding modes and systems with bounded regions of attraction are also treated. Dissipativity analysis and optimal control problems with piecewise quadratic cost functions are solved via convex optimization. The basic results are extended to fuzzy systems, hybrid systems and smooth nonlinear systems. It is shown how Lyapunov functions with a discontinuous dependence on the discrete state can be computed via convex optimization. An automated procedure for increasing the flexibility of the Lyapunov function candidate is suggested based on linear programming duality. A Matlab toolbox that implements several of the results derived in the thesis is presented.
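
    As a toy illustration of the quadratic building block only (the thesis treats the far richer piecewise quadratic case through LMIs and the S-procedure; names here are illustrative), a quadratic Lyapunov function for a single stable linear system can be computed by vectorising the Lyapunov equation:

```python
import numpy as np

def lyapunov_quadratic(A, Q=None):
    """Solve the continuous Lyapunov equation  A^T P + P A = -Q  for P
    by Kronecker vectorisation.  For a stable A and Q > 0, the result
    P > 0 gives the quadratic Lyapunov function V(x) = x^T P x."""
    n = A.shape[0]
    if Q is None:
        Q = np.eye(n)
    I = np.eye(n)
    # vec(A^T P + P A) = (I kron A^T + A^T kron I) vec(P), column-major vec
    K = np.kron(I, A.T) + np.kron(A.T, I)
    P = np.linalg.solve(K, -Q.flatten(order="F")).reshape(n, n, order="F")
    return (P + P.T) / 2          # symmetrise against round-off
```

    Piecewise quadratic candidates replace the single P by one matrix per cell of the state-space partition, with continuity and S-procedure conditions expressed as LMIs, which is where convex optimisation solvers enter.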

    Automated Model Generation Approach Using MATLAB


    Support Vector Machines in R

    Being among the most popular and efficient classification and regression methods currently available, implementations of support vector machines exist in almost every popular programming language. Currently, four R packages contain SVM-related software. The purpose of this paper is to present and compare these implementations.


    Study on multi-SVM systems and their applications to pattern recognition

    Degree system: new; report number: Kō 3136; degree: Doctor of Engineering; date conferred: 2010/7/12; Waseda University degree number: Shin 541

    Nondeterministic hybrid dynamical systems

    This thesis is concerned with the analysis, control and identification of hybrid dynamical systems. The main focus is on a particular class of hybrid systems consisting of linear subsystems. The discrete dynamic, i.e., the change between subsystems, is unknown or nondeterministic and cannot be influenced, i.e. controlled, directly. However, changes in the discrete dynamic can be detected immediately, so that the current dynamic (subsystem) is known. In order to motivate the study of hybrid systems and show the merits of hybrid control theory, an example is given. It is shown that real-world systems such as anti-lock braking systems (ABS) are naturally modelled by this class of linear hybrid systems, and that purely continuous feedback is not suitable, since it cannot achieve maximum braking performance. A hybrid control strategy which overcomes this problem is presented. For this class of linear hybrid system with unknown discrete dynamic, a framework for robust control is established. The analysis methodology developed gives a robustness radius such that stability under parameter variations can be analysed. The controller synthesis procedure is illustrated in a practical example where the control for an active suspension of a car is designed. Optimal control for this class of hybrid system is introduced, and it is shown how a control law is obtained which minimises a quadratic performance index. The synthesis procedure is stated in terms of a convex optimisation problem using linear matrix inequalities (LMIs). The solution of the LMI not only returns the controller but also the performance bound. Since the proposed controller structures require knowledge of the continuous state, an observer design is proposed. It is shown that the estimation error converges quadratically while minimising the covariance of the estimation error, similar to the Kalman filter for discrete- or continuous-time systems. Further, we show that the synthesis of the observer can be cast into an LMI, which conveniently solves the synthesis problem.
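
    A simple sufficient condition related to this setting (a common quadratic Lyapunov function, which certifies stability under arbitrary switching) can be checked numerically; this is only a conservative check with illustrative names, not the less conservative LMI conditions derived in the thesis:

```python
import numpy as np

def common_lyapunov_check(P, modes):
    """Return True if V(x) = x^T P x certifies stability of the switched
    linear system x' = A_i x under arbitrary switching:
    P > 0 and A_i^T P + P A_i < 0 for every mode A_i."""
    def neg_def(M):
        return np.all(np.linalg.eigvalsh((M + M.T) / 2) < 0)
    if not np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0):
        return False
    return all(neg_def(A.T @ P + P @ A) for A in modes)
```

    When no common P exists, mode-dependent (piecewise) Lyapunov functions of the kind computed via LMIs in this line of work are needed.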

    VIVA: An Online Algorithm for Piecewise Curve Estimation Using ℓ⁰ Norm Regularization

    Many processes deal with piecewise input functions, which occur naturally as a result of digital commands, user interfaces requiring a confirmation action, or discrete-time sampling. Examples include the assembly of protein polymers and hourly adjustments to the infusion rate of IV fluids during treatment of burn victims. Estimation of the input is straightforward regression when the observer has access to the timing information. More work is needed if the input can change at unknown times. Successful recovery of the change timing is largely dependent on the choice of cost function minimized during parameter estimation. Optimal estimation of a piecewise input will often proceed by minimization of a cost function which includes an estimation error term (most commonly mean square error) and the number (cardinality) of input changes (number of commands). Because the cardinality (ℓ0 norm) is not convex, the ℓ2 norm (quadratic smoothing) and ℓ1 norm (total variation minimization) are often substituted because they permit the use of convex optimization algorithms. However, these penalize the magnitude of input changes and therefore bias the piecewise estimates. Another disadvantage is that global optimization methods must be run after the end of data collection. One approach to unbiasing the piecewise parameter fits would include application of total variation minimization to recover timing, followed by piecewise parameter fitting. Another method is presented herein: a dynamic programming approach which iteratively develops populations of candidate estimates of increasing length, pruning those proven to be dominated. Because the usage of input data is entirely causal, the algorithm recovers timing and parameter values online. A functional definition of the algorithm, which is an extension of Viterbi decoding and integrates the pruning concept from branch-and-bound, is presented. 
    Modifications are introduced to improve handling of non-uniform sampling, non-uniform confidence, and burst errors. Performance tests using synthesized data sets, as well as volume data from a research system recording fluid infusions, show a five-fold (piecewise-constant data) and twenty-fold (piecewise-linear data) reduction in error compared to total variation minimization, along with improved sparsity and reduced sensitivity to the regularization parameter. Algorithmic complexity and delay are also considered.
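
    A basic offline counterpart of the problem VIVA addresses (this is the classic O(n²) segmentation dynamic program with an explicit ℓ0 penalty on the number of segments, not the VIVA algorithm itself, which is online and prunes candidates Viterbi-style; names are illustrative):

```python
import numpy as np

def l0_piecewise_constant(y, lam):
    """Offline l0-penalised piecewise-constant fit: minimise
    SSE + lam * (number of segments) by dynamic programming.
    Returns the sorted interior change-point indices."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    s1 = np.concatenate([[0.0], np.cumsum(y)])
    s2 = np.concatenate([[0.0], np.cumsum(y ** 2)])

    def seg_cost(i, j):      # SSE of fitting one mean to y[i:j]
        m = j - i
        return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / m

    best = np.full(n + 1, np.inf)
    best[0] = 0.0
    cut = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        for i in range(j):
            c = best[i] + seg_cost(i, j) + lam
            if c < best[j]:
                best[j], cut[j] = c, i
    cps, j = [], n           # backtrack the chosen segment starts
    while j > 0:
        cps.append(cut[j])
        j = cut[j]
    return sorted(int(i) for i in cps if i > 0)
```

    Unlike ℓ1 (total variation) relaxations, the ℓ0 penalty does not bias the fitted levels toward each other; the price is a combinatorial search, which VIVA makes tractable online by pruning dominated candidates.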