(R, S)-conjugate solution to coupled Sylvester complex matrix equations with conjugate of two unknowns
In this work, we are concerned with (R, S)-conjugate solutions to coupled Sylvester complex matrix equations involving the conjugate of two unknowns. An iterative algorithm is constructed, and it is demonstrated that, when the two matrix equations are consistent, the algorithm yields the solutions for arbitrary initial (R, S)-conjugate matrices V1, W1. A necessary and sufficient condition is established to guarantee that the proposed method converges to the (R, S)-conjugate solutions. Finally, two numerical examples are provided to demonstrate the efficiency of the described iterative technique.
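The (R, S)-conjugate structure above can be made concrete with a small sketch. Following Trench's definition (an assumption here, not stated in the abstract), X is (R, S)-conjugate when conj(X) = R X S for real symmetric involutions R and S; the projection below is an illustrative construction, not the paper's algorithm.

```python
import numpy as np

# Illustrative sketch (assuming Trench's definition): X in C^{n x n} is
# (R,S)-conjugate if conj(X) = R @ X @ S, where R and S are real symmetric
# involutions (R @ R = S @ S = I).

rng = np.random.default_rng(0)
n = 4

# Simple real symmetric involutions: signature matrices.
R = np.diag([1.0, -1.0, 1.0, -1.0])
S = np.diag([-1.0, 1.0, 1.0, -1.0])

def project_rs_conjugate(X, R, S):
    """Project X onto the (R,S)-conjugate matrices: P = (X + R conj(X) S) / 2."""
    return 0.5 * (X + R @ np.conj(X) @ S)

X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
P = project_rs_conjugate(X, R, S)

# P satisfies the defining relation conj(P) = R P S, so an iteration started
# from such a matrix (like V1, W1 above) stays in the structured set.
assert np.allclose(np.conj(P), R @ P @ S)
```

The projection is idempotent, so applying it to an already (R, S)-conjugate matrix returns the matrix unchanged.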
A gradient-based iterative algorithm for solving coupled Lyapunov equations of continuous-time Markovian jump systems
In this paper, a new gradient-based iterative algorithm is proposed to solve the coupled Lyapunov matrix equations associated with continuous-time Markovian jump linear systems. A necessary and sufficient condition is established for the proposed gradient-based iterative algorithm to be convergent. In addition, the optimal value of the tunable parameter achieving the fastest convergence rate of the proposed algorithm is given explicitly. Finally, some numerical simulations are given to validate the obtained theoretical results.
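To illustrate the idea of a gradient-based iteration with a tunable step size, here is a minimal sketch on a single continuous-time Lyapunov equation A'X + XA + Q = 0, not the coupled Markovian-jump version studied in the paper; the matrices, step size, and iteration count are illustrative choices.

```python
import numpy as np

# Minimal sketch: gradient descent on the squared residual norm
# J(X) = ||A^T X + X A + Q||_F^2 for a single Lyapunov equation.

A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])   # Hurwitz, so a unique solution exists
Q = np.eye(2)

X = np.zeros_like(Q)
mu = 0.01                      # tunable step size; must be small enough
for _ in range(500):
    R = A.T @ X + X @ A + Q    # residual of the current iterate
    # gradient of ||R||_F^2 with respect to X is 2 (A R + R A^T)
    X -= mu * 2.0 * (A @ R + R @ A.T)

residual = np.linalg.norm(A.T @ X + X @ A + Q)
print(residual)  # near zero after convergence
```

As the abstract notes, the convergence rate hinges on the tunable parameter (here `mu`): too large and the iteration diverges, too small and progress is slow, which is why an explicit optimal value is useful.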
Toward Solution of Matrix Equation X=Af(X)B+C
This paper studies the solvability, existence of unique solution, closed-form solution and numerical solution of the matrix equation X = Af(X)B + C, with f(X) taken to be the conjugate, the transpose or the conjugate transpose of X, where X is the unknown. It is proven that the solvability of these equations is equivalent to the solvability of some auxiliary standard Stein equations of the form W = AWB + C, whose coefficient matrices have the same dimensions as those of the original equation. Closed-form solutions of the equation X = Af(X)B + C can then be obtained by utilizing standard results on the standard Stein equation. On the other hand, some generalized Stein iterations and accelerated Stein iterations are proposed to obtain numerical solutions of this equation. Necessary and sufficient conditions are established to guarantee the convergence of the iterations.
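A Stein iteration is simply the fixed-point recursion X_{k+1} = A f(X_k) B + C. The sketch below shows the simplest case f(X) = X (the standard Stein equation), with small illustrative matrices chosen so that ||A||·||B|| < 1, a sufficient condition for the map to be a contraction.

```python
import numpy as np

# Fixed-point (Stein-type) iteration X_{k+1} = A X_k B + C, the f(X) = X case.
# Since ||A|| * ||B|| < 1 here, the map is a contraction and the iteration
# converges from any initial guess.

A = np.array([[0.2, 0.1],
              [0.0, 0.3]])
B = np.array([[0.4, 0.0],
              [0.1, 0.2]])
C = np.ones((2, 2))

X = np.zeros_like(C)
for _ in range(200):
    X = A @ X @ B + C

residual = np.linalg.norm(X - A @ X @ B - C)
print(residual)  # essentially zero: X solves X = A X B + C
```

The generalized and accelerated variants mentioned in the abstract refine this basic scheme; the contraction condition above is only a simple sufficient criterion, not the paper's sharp necessary-and-sufficient one.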
Learning Output Kernels for Multi-Task Problems
Simultaneously solving multiple related learning tasks is beneficial under a
variety of circumstances, but the prior knowledge necessary to correctly model
task relationships is rarely available in practice. In this paper, we develop a
novel kernel-based multi-task learning technique that automatically reveals
structural inter-task relationships. Building on the framework of output
kernel learning (OKL), we introduce a method that jointly learns multiple
functions and a low-rank multi-task kernel by solving a non-convex
regularization problem. Optimization is carried out via a block coordinate
descent strategy, where each subproblem is solved using suitable conjugate
gradient (CG) type iterative methods for linear operator equations. The
effectiveness of the proposed approach is demonstrated on pharmacological and
collaborative filtering data.
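The abstract mentions conjugate gradient (CG) methods for linear operator equations. The sketch below is a generic matrix-free CG solver for an equation T(C) = Y, using a Sylvester-type operator T(C) = K C L + lam·C (symmetric positive definite under the trace inner product when K, L are SPD and lam > 0); this operator form and all names here are illustrative assumptions, not the paper's exact subproblem.

```python
import numpy as np

def cg_operator(apply_T, Y, iters=50, tol=1e-10):
    """Matrix-free conjugate gradient: solve T(C) = Y given only C -> T(C),
    where T is symmetric positive definite in the trace inner product."""
    C = np.zeros_like(Y)
    R = Y - apply_T(C)          # residual
    P = R.copy()                # search direction
    rs = np.sum(R * R)          # trace inner product <R, R>
    for _ in range(iters):
        TP = apply_T(P)
        alpha = rs / np.sum(P * TP)
        C += alpha * P
        R -= alpha * TP
        rs_new = np.sum(R * R)
        if np.sqrt(rs_new) < tol:
            break
        P = R + (rs_new / rs) * P
        rs = rs_new
    return C

# Illustrative SPD operator T(C) = K C L + lam * C.
rng = np.random.default_rng(1)
n, m = 5, 3
G = rng.standard_normal((n, n)); K = G @ G.T + n * np.eye(n)   # SPD
H = rng.standard_normal((m, m)); L = H @ H.T + m * np.eye(m)   # SPD
lam = 0.5
Y = rng.standard_normal((n, m))

C = cg_operator(lambda C: K @ C @ L + lam * C, Y)
res = np.linalg.norm(K @ C @ L + lam * C - Y)
print(res)  # near zero
```

The appeal of this matrix-free formulation, as in the block coordinate descent described above, is that the operator is only ever applied, never formed as an explicit (n·m) × (n·m) matrix.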
- …