LOBPCG : English Wikipedia
LOBPCG
Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) is a matrix-free method for finding the largest (or smallest) eigenvalues and the corresponding eigenvectors of a symmetric positive definite generalized eigenvalue problem
:A x = \lambda B x,
for a given pair (A, B) of complex Hermitian or real symmetric matrices, where
the matrix B is also assumed positive-definite.
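As a concrete illustration of the problem being solved (a minimal sketch; the random matrices and sizes here are illustrative assumptions, not part of the article), a small dense pair (A, B) can be solved directly with a dense generalized eigensolver, and each returned pair satisfies the equation above:

```python
import numpy as np
from scipy.linalg import eigh

# Random symmetric A and symmetric positive-definite B (illustrative only)
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                    # symmetric
N = rng.standard_normal((4, 4))
B = N @ N.T + 4 * np.eye(4)          # symmetric positive definite

# Dense direct solution of the generalized problem A x = lambda B x
vals, vecs = eigh(A, B)              # eigenvalues returned in ascending order
```

A dense solver like this is the reference for small problems; LOBPCG targets the large, sparse, matrix-free setting where forming or factoring A and B is impractical.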
==Algorithm==
The method performs an iterative maximization (or minimization) of the generalized Rayleigh quotient
:\rho(x) := \rho(A,B; x) := \frac{x^T A x}{x^T B x},
which results in finding the largest (or smallest) eigenpairs of A x = \lambda B x.
The direction of the steepest ascent, which is the gradient, of the generalized Rayleigh quotient is positively proportional to the vector
:r := Ax - \rho(x) Bx,
called the eigenvector residual. If a preconditioner T is available, it is applied to the residual giving vector
:w := Tr,
called the preconditioned residual. Without preconditioning, we set
T := I and so w := r. An iterative method
:x^{i+1} := x^i + \alpha^i T(Ax^i - \rho(x^i) Bx^i),
or, in short,
:x^{i+1} := x^i + \alpha^i w^i,
:w^i := Tr^i,
:r^i := Ax^i - \rho(x^i) Bx^i,
is known as preconditioned steepest ascent (or descent), where the scalar
\alpha^i is called the step size. The optimal step size can be determined by maximizing the Rayleigh quotient, i.e.,
:x^{i+1} := \arg\max_{y \in \operatorname{span}\{x^i, w^i\}} \rho(y)
(use \arg\min in case of minimizing), in which case the method is called locally optimal. To accelerate convergence further, the previous iterate can be added to the trial subspace, giving the three-term recurrence
:x^{i+1} := \arg\max_{y \in \operatorname{span}\{x^i, w^i, x^{i-1}\}} \rho(y)
(use \arg\min in case of minimizing). The maximization/minimization of the Rayleigh quotient in a 3-dimensional subspace can be performed numerically by the Rayleigh–Ritz method.
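The earlier claim that the residual r is a positive multiple of the gradient of the Rayleigh quotient can be checked numerically. A minimal sketch with random matrices (names and sizes are illustrative assumptions), comparing a central finite-difference gradient against 2r/(x^T B x):

```python
import numpy as np

# Random symmetric A and positive-definite B (illustrative assumptions)
rng = np.random.default_rng(1)
n = 5
M = rng.standard_normal((n, n)); A = (M + M.T) / 2
N = rng.standard_normal((n, n)); B = N @ N.T + n * np.eye(n)

def rho(x):
    """Generalized Rayleigh quotient rho(A, B; x)."""
    return (x @ A @ x) / (x @ B @ x)

x = rng.standard_normal(n)
r = A @ x - rho(x) * B @ x        # eigenvector residual

# Central finite-difference gradient of rho at x
eps = 1e-6
grad = np.array([(rho(x + eps * e) - rho(x - eps * e)) / (2 * eps)
                 for e in np.eye(n)])
# grad equals (2 / x^T B x) * r, i.e. a positive multiple of the residual
```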
As the iterations converge, the vectors x^i and x^{i-1} become nearly linearly dependent, making the Rayleigh–Ritz method numerically unstable in the presence of round-off errors. It is possible to substitute the vector x^{i-1} with an explicitly computed difference p^i = x^{i-1} - x^i, making the Rayleigh–Ritz method more numerically stable.
This is a single-vector version of the LOBPCG method. It is one possible generalization of the preconditioned conjugate gradient linear solvers to the case of symmetric eigenvalue problems. Even in the trivial case T = I and B = I, the resulting approximation with i > 3 will be different from that obtained by the Lanczos algorithm, although both approximations belong to the same Krylov subspace.
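The single-vector iteration described above can be sketched as follows. This is an illustrative implementation under stated assumptions, not a reference one: the function name and defaults are invented, and the trial basis is orthonormalized for round-off robustness before Rayleigh–Ritz:

```python
import numpy as np
from scipy.linalg import eigh

def lobpcg_single(A, B, x0, T=None, maxit=200, tol=1e-8):
    """Sketch of single-vector LOBPCG for the largest eigenpair of A x = lambda B x."""
    x = x0 / np.sqrt(x0 @ B @ x0)      # B-normalize the initial guess
    p = None                           # difference of iterates, replaces x^{i-1}
    for _ in range(maxit):
        rho = (x @ A @ x) / (x @ B @ x)
        r = A @ x - rho * B @ x        # eigenvector residual
        if np.linalg.norm(r) < tol:
            break
        w = r if T is None else T @ r  # preconditioned residual (T = I if absent)
        # Trial subspace span{x, w, p}; orthonormalize for numerical stability
        S = np.column_stack([x, w] if p is None else [x, w, p])
        S, _ = np.linalg.qr(S)
        # Rayleigh-Ritz on the projected pencil; take the largest Ritz pair
        vals, vecs = eigh(S.T @ A @ S, S.T @ B @ S)
        x_new = S @ vecs[:, -1]
        p = x_new - x                  # explicitly computed difference p^i
        x = x_new / np.sqrt(x_new @ B @ x_new)
    return (x @ A @ x) / (x @ B @ x), x
```

For minimization, the smallest Ritz pair (`vecs[:, 0]`) would be selected instead; a good preconditioner T substantially reduces the iteration count.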
Iterating several approximate eigenvectors together in a block in a similar locally optimal fashion gives the full block version of LOBPCG. It allows robust computation of eigenvectors corresponding to nearly multiple eigenvalues.
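SciPy ships a block implementation as `scipy.sparse.linalg.lobpcg`. A minimal usage sketch (the 1-D Laplacian test matrix, block size, and parameter choices are illustrative assumptions):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lobpcg

# 1-D discrete Laplacian: sparse, symmetric positive definite
n = 100
A = diags([2.0, -1.0, -1.0], [0, -1, 1], shape=(n, n)).tocsr()

# Block of 4 random initial vectors -> 4 eigenpairs iterated together
rng = np.random.default_rng(0)
X = rng.standard_normal((n, 4))

# largest=False requests the smallest eigenvalues; B defaults to identity
vals, vecs = lobpcg(A, X, largest=False, tol=1e-8, maxiter=2000)
```

Passing a preconditioner via the `M` argument (e.g. an approximate inverse of A) typically cuts the iteration count sharply for ill-conditioned operators like this Laplacian.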
