まぃふぇいばりっと

I do machine learning. I love Julia. These are rough notes on things I've studied.

Julia: a function that computes the r largest eigenvalues by several different methods.

I use this often in my research, so I'm leaving it here as a memo.

It computes the top `rank` eigenvalues of a matrix `K` in descending order. Five methods are supported: `lapack`, `lapack_sym`, `arpack`, `nys`, and `randomized`. Note that all methods except `lapack` and `arpack` require the input matrix to be symmetric, and `nys` additionally requires it to be positive definite.

using Arpack
using LinearAlgebra
function get_eigen_system(K, rank; eigen_method="lapack")
    n = size(K)[1]
    if eigen_method == "lapack"
        # general dense solver; sort in descending order and keep the top `rank` pairs
        f = eigen(K, sortby=-)
        lambda = f.values[1:rank]
        alpha = f.vectors[:, 1:rank]
    elseif eigen_method == "lapack_sym"
        # symmetric solver: compute only the `rank` largest eigenvalues
        # (the irange method needs a Symmetric wrapper and returns ascending order)
        lambda, alpha = eigen(Symmetric(K), n-rank+1:n)
        lambda = reverse(lambda)
        alpha = reverse(alpha, dims=2)
    elseif eigen_method == "arpack"
        lambda, alpha = eigs(K, nev=rank, which=:LR)
    elseif eigen_method == "nys"
        lambda, alpha = evdnys(K, rank, reconst=false)
    elseif eigen_method == "randomized"
        lambda, alpha = reigen(K, rank, reconst=false)
    else
        error("Eigenvalue Decomposition method $eigen_method is not defined")
    end
    return lambda, alpha
end
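As a quick sanity check (a sketch; it assumes `get_eigen_system` above is already loaded, and the test matrix is my own made-up example), two of the methods should agree on the top eigenvalues of a symmetric positive definite matrix:

```julia
using LinearAlgebra
using Arpack

# exactly symmetric matrix, shifted to be positive definite
n = 100
M = randn(n, n)
K = (M + M') ./ 2 + n * I

r = 5
lambda_full, _ = get_eigen_system(K, r, eigen_method="lapack")
lambda_arp, _  = get_eigen_system(K, r, eigen_method="arpack")

# sort both, just in case a method uses a different output order
top_lapack = sort(real.(lambda_full[1:r]), rev=true)
top_arpack = sort(real.(lambda_arp[1:r]), rev=true)
maximum(abs.(top_lapack .- top_arpack))  # the two should agree
```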

I've posted these many times before, but here are the implementations of `nys` and `randomized`.

using LinearAlgebra

"""
Randomized eigenvalue decomposition via the Nyström method.
See Fig. 6 in the [original paper](https://arxiv.org/abs/1607.01649)
or Algorithm 5.5 in [Halko et al.](https://epubs.siam.org/doi/10.1137/090771806).
This method is applicable only to positive definite matrices.

# Arguments
- `A` : input positive definite matrix
- `k` : target rank, the number of wanted eigenvalues
- `p` : over-sampling parameter; typically 10 or 2k
- `reconst` : if true, the output is a reconstructed matrix that approximates `A`

example:
    using Statistics  # for `mean`

    # make a positive definite matrix `A`.
    n = 5000; m = 25000; x = randn(m, n)
    demean_x = x .- mean(x, dims=1)
    A = Symmetric(demean_x' * demean_x ./ (m - 1))  # wrap to match `A::Symmetric`

    # run evdnys
    lowrank_mtx = evdnys(A, 10, reconst=true)
"""
function evdnys(A::Symmetric, k::Int; p=10, reconst=false)
    n = size(A, 1)
    G = randn(n, k+p)
    Y = A * G
    Q = qr(Y).Q*Matrix(I,n,k+p)
    B1 = A*Q
    B2 = Q'*B1
    C = cholesky(Symmetric(B2))
    # F = B1 * inv(C.U), computed with a triangular solve instead of an explicit inverse
    F = B1 / C.U
    U, Σ, _ = svd(F)

    if reconst
        return U * diagm(Σ .^ 2) * U'
    else
        # singular values come back in descending order; keep the top k
        return Σ[1:k] .^ 2, U[:, 1:k]
    end
end
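To see that the Nyström approximation is accurate when the matrix is close to low rank, here is a small check (a sketch; the test matrix is an assumption of mine, not from the paper):

```julia
using LinearAlgebra

# near-low-rank SPD matrix: rank ~20 plus a tiny ridge
n = 500
W = randn(n, 20)
A = Symmetric(W * W' + 1e-6 * I)

vals, vecs = evdnys(A, 10)              # approximate top eigenvalues via Nyström
exact = reverse(eigen(A).values)[1:10]  # exact top 10 for comparison

maximum(abs.(vals[1:10] .- exact) ./ exact)  # small relative error expected
```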

"""
Randomized eigenvalue decomposition proposed by N. Halko et al.
See Algorithm 5.6 in the
[original paper](https://epubs.siam.org/doi/10.1137/090771806).
This method is effective for matrices with decaying eigenvalues.

# Arguments
- `A` : input symmetric matrix
- `r` : target rank, the number of wanted eigenvalues
- `p` : over-sampling parameter; typically 10 or 2r
- `twopass` : if true, use the two-pass variant (one extra multiplication by `A`, but more accurate)
- `reconst` : if true, the output is a reconstructed matrix that approximates `A`

example:
    # make a symmetric matrix with decaying eigenvalues
    n = 150; A = rand(n, n); B = zeros(n, n)
    e, v = eigen(Symmetric(A))
    for i in 1:n
        B += e[i] * v[:, i] * v[:, i]' * (i > n-100 ? 1 : 1e-3)
    end

    # run reigen on the decaying-spectrum matrix `B`
    lowrank_mtx = reigen(B, 10, reconst=true)
"""
function reigen(A, r; p=10, twopass=false, reconst=false)
    n = size(A, 1)
    m = r + p

    Ω = randn(n, m)
    Y = A*Ω

    if twopass
        Q = qr!(Y).Q * Matrix(I,n,m)
        T = Q' * A * Q
    else
        Q = qr(Y).Q * Matrix(I,n,m)
        T = (Q' * Y) / ( Q' * Ω )
    end
    λ, S = eigen(T)
    # sort the eigenpairs in descending order and keep the top r;
    # the eigenvectors of A are recovered as U = Q*S (not Q*S')
    idx = sortperm(λ, by=real, rev=true)[1:r]
    λ = λ[idx]
    U = Q * S[:, idx]
    if reconst
        Λ = Diagonal(λ)
        return U * Λ * U'
    else
        return λ, U
    end
end
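And a similar check for `reigen` (again a sketch with a made-up test matrix; the estimated eigenvalues are sorted explicitly here so the comparison does not depend on the output order):

```julia
using LinearAlgebra

# symmetric test matrix with a rapidly decaying spectrum
n = 300
Qm = Matrix(qr(randn(n, n)).Q)       # random orthogonal basis
spectrum = [2.0^(-i) for i in 0:n-1]
A = Qm * Diagonal(spectrum) * Qm'

λ, _ = reigen(A, 5, twopass=true)
top = sort(real.(λ), rev=true)[1:5]  # top 5 estimated eigenvalues
exact = reverse(eigen(Symmetric(A)).values)[1:5]

maximum(abs.(top .- exact))          # should be tiny for a spectrum this steep
```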