In addition to (and as an integral part of) its support for multi-dimensional arrays, Julia provides native implementations of many common and useful linear algebra operations. These are made available by loading the LinearAlgebra standard library with using LinearAlgebra. The most fundamental operations, including tr, det, and inv, are supported:
julia> A = [1 2 3; 4 1 6; 7 8 1]
3×3 Matrix{Int64}:
 1  2  3
 4  1  6
 7  8  1

julia> tr(A)
3

julia> det(A)
104.0

julia> inv(A)
3×3 Matrix{Float64}:
 -0.451923   0.211538    0.0865385
  0.365385  -0.192308    0.0576923
  0.240385   0.0576923  -0.0673077
A matrix can be pre-factorized into a form that is better suited to the problem at hand (for performance or memory reasons). This can speed up operations such as solving linear systems or computing the matrix exponential, because the factored form is more efficient to work with.
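As a minimal sketch of this idea, the factorize function can compute an appropriate factorization up front, and subsequent solves reuse it instead of refactoring the matrix each time (the matrix and right-hand-side values here are illustrative):

```julia
using LinearAlgebra

# Pre-factorize once; for a general dense square matrix this yields an LU
# factorization, which is reused by every subsequent solve.
A = [1.5 2.0 -4.0; 3.0 -1.0 -6.0; -10.0 2.3 4.0]
F = factorize(A)

b1 = [1.0, 2.0, 3.0]
b2 = [4.0, 5.0, 6.0]

# Each solve reuses the stored factorization instead of refactoring A.
x1 = F \ b1
x2 = F \ b2
```

This pattern pays off whenever the same matrix must be solved against many right-hand sides.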
Special matrices:
Matrices with particular symmetries and structures arise frequently in linear algebra and are commonly associated with matrix factorizations. Julia's special matrix types allow fast computation with customised algorithms.
The following is a list of Julia’s special matrix types, many of which have hooks into specialised LAPACK routines.

Type Description
Symmetric Symmetric matrix
Hermitian Hermitian matrix
UpperTriangular Upper triangular matrix
UnitUpperTriangular Upper triangular matrix with unit diagonal
LowerTriangular Lower triangular matrix
UnitLowerTriangular Lower triangular matrix with unit diagonal
UpperHessenberg Upper Hessenberg matrix
Tridiagonal Tridiagonal matrix
SymTridiagonal Symmetric tridiagonal matrix
Bidiagonal Upper/lower bidiagonal matrix
Diagonal Diagonal matrix
UniformScaling Uniform scaling operator
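To make the special types concrete, here is a brief sketch of wrapping an ordinary array in some of these types so that specialised methods are dispatched (the matrix values are illustrative):

```julia
using LinearAlgebra

B = [1.0 2.0 3.0; 2.0 5.0 6.0; 3.0 6.0 9.0]

S = Symmetric(B)            # view of B as symmetric (upper triangle is used)
U = UpperTriangular(B)      # view of the upper triangle of B
D = Diagonal([1.0, 2.0, 3.0])

# Methods specialise on these wrapper types; for example, eigvals on a
# Symmetric matrix can use a symmetric-specific LAPACK routine, and
# multiplying by a Diagonal is just an elementwise row scaling.
vals = eigvals(S)
scaled = D * [1.0, 1.0, 1.0]
```

The wrappers are lightweight views, so no copy of the underlying data is made.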
Matrix factorizations:
Matrix factorization, also known as matrix decomposition, is one of the central concepts of linear algebra: it expresses a matrix as a product of other matrices.
The types of matrix factorizations implemented in Julia are listed in the following table for reference. Details of the associated methods can be found in the “Standard functions” section of the Linear Algebra documentation.
Type Description
BunchKaufman Bunch-Kaufman factorization
Cholesky Cholesky factorization
CholeskyPivoted Pivoted Cholesky factorization
LDLt LDL(T) factorization
LU LU factorization
QR QR factorization
QRCompactWY Compact WY form of the QR factorization
QRPivoted Pivoted QR factorization
LQ QR factorization of transpose(A)
Hessenberg Hessenberg decomposition
Eigen Spectral decomposition
GeneralizedEigen Generalized spectral decomposition
SVD Singular value decomposition
GeneralizedSVD Generalized SVD
Schur Schur decomposition
GeneralizedSchur Generalized Schur decomposition
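As a short illustration of two entries from the table, the sketch below computes a Cholesky factorization of a symmetric positive definite matrix and an LU factorization of the same matrix, then solves a system from the Cholesky object (the matrix and right-hand-side values are illustrative):

```julia
using LinearAlgebra

A = [4.0 2.0; 2.0 3.0]      # symmetric positive definite

C = cholesky(A)             # Cholesky factorization: C.L * C.U reconstructs A
F = lu(A)                   # LU with partial pivoting: F.L * F.U == A[F.p, :]

b = [1.0, 2.0]
x = C \ b                   # solve A*x == b using the precomputed factors
```

The factorization objects carry their factors as properties (C.L, C.U, F.L, F.U, F.p) and support \ directly, as shown.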
Standard functions:
Most of Julia’s linear algebra functions are implemented by calling the corresponding functions from LAPACK.
*: Matrix multiplication.
Example:
julia> [1 1; 0 1] * [1 0; 1 1]
2×2 Matrix{Int64}:
 2  1
 1  1
\: Matrix division using a polyalgorithm. For square A, the result X is such that A*X == B, and the structure of A determines which solver is used. If A is upper or lower triangular (or diagonal), no factorization is needed and the system is solved directly by forward or backward substitution. For non-triangular square matrices, an LU factorization is used.
For rectangular A, the result is the minimum-norm least-squares solution, computed with a pivoted QR factorization and a rank estimate based on the R factor. If A is sparse, a similar polyalgorithm is used. For indefinite matrices, the LDLt factorization does not use pivoting during the numerical factorization, so the procedure can fail even for invertible matrices.
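The branches of the polyalgorithm described above can be sketched as follows (all matrix and vector values are illustrative):

```julia
using LinearAlgebra

# Triangular A: solved directly by back-substitution, no factorization.
T = UpperTriangular([2.0 1.0; 0.0 3.0])
x = T \ [4.0, 6.0]

# General square A: an LU factorization is computed under the hood.
A = [1.0 2.0; 3.0 4.0]
y = A \ [5.0, 6.0]

# Rectangular A: minimum-norm least-squares solution via pivoted QR.
R = [1.0 1.0; 1.0 2.0; 1.0 3.0]
z = R \ [1.0, 2.0, 2.0]
```

In every case the same \ operator is used; the dispatch on the type and shape of the left operand selects the solver.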
LAPACK functions:
LinearAlgebra.LAPACK provides wrappers for some of the linear algebra functions in LAPACK. The names of functions that overwrite one of their input arrays end with a ‘!’.
In most cases, a function has four methods defined, one each for arrays with elements of type Float64, Float32, ComplexF64, and ComplexF32.
Note that Julia’s LAPACK API may, and likely will, change in the future. Since this API is not user-facing, there is no commitment to support or deprecate this particular set of functions in future releases.
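As a brief sketch of what such a wrapper looks like, gesv! solves A*X = B via an LU factorization, mutating both arguments in place, which is what the trailing ‘!’ signals (the matrix and right-hand-side values are illustrative):

```julia
using LinearAlgebra
using LinearAlgebra: LAPACK

A = [1.0 2.0; 3.0 4.0]
b = [5.0, 6.0]

# After the call, b holds the solution and A holds the LU factors;
# the original contents of both arrays are destroyed.
LAPACK.gesv!(A, b)
```

Because these wrappers mutate their inputs and accept only the four supported element types, they are best treated as low-level building blocks rather than user-facing functions.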