### Documentation and interfaces for publicly available functions

parent cddfb00c
```nroff
.TH "elpa_cholesky_complex" 3 "Wed Jun 29 2016" "ELPA" \" -*- nroff -*-
.ad l
.nh
.SH NAME
elpa_cholesky_complex \- Cholesky factorization of a complex hermitian matrix
.br
.SH SYNOPSIS
.br
.SS FORTRAN INTERFACE
use elpa1
.br
.br
.RI "success = \fBelpa_cholesky_complex\fP (na, a(lda,matrixCols), lda, nblk, matrixCols, mpi_comm_rows, mpi_comm_cols, wantDebug)"
.br
.RI " "
.br
.RI "With the definitions of the input and output variables:"
.br
.RI "integer, intent(in) \fBna\fP: global dimension of quadratic matrix \fBa\fP to solve"
.br
.RI "complex*16, intent(inout) \fBa\fP: locally distributed part of the matrix \fBa\fP. The local dimensions are \fBlda\fP x \fBmatrixCols\fP"
.br
.RI "integer, intent(in) \fBlda\fP: leading dimension of locally distributed matrix \fBa\fP"
.br
.RI "integer, intent(in) \fBnblk\fP: blocksize of cyclic distribution, must be the same in both directions"
.br
.RI "integer, intent(in) \fBmatrixCols\fP: number of columns of locally distributed matrices \fBa\fP and \fBq\fP"
.br
.RI "integer, intent(in) \fBmpi_comm_rows\fP: communicator for communication in rows. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "integer, intent(in) \fBmpi_comm_cols\fP: communicator for communication in columns. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "logical, intent(in) \fBwantDebug\fP: if .true., print more debug information in case of an error"
.RI "logical \fBsuccess\fP: return value indicating success or failure"
.br
.SS C INTERFACE
#include "elpa.h"
.br
#include <complex.h>
.br
.RI "\fBint\fP success = \fBelpa_cholesky_complex\fP (\fBint\fP na, \fBdouble complex *\fPa, \fBint\fP lda, \fBint\fP nblk, \fBint\fP matrixCols, \fBint\fP mpi_comm_rows, \fBint\fP mpi_comm_cols, \fBint\fP wantDebug );"
.br
.RI " "
.br
.RI "With the definitions of the input and output variables:"
.br
.RI "int \fBna\fP: global dimension of quadratic matrix \fBa\fP to solve"
.br
.RI "double complex *\fBa\fP: pointer to locally distributed part of the matrix \fBa\fP. The local dimensions are \fBlda\fP x \fBmatrixCols\fP"
.br
.RI "int \fBlda\fP: leading dimension of locally distributed matrix \fBa\fP"
.br
.RI "int \fBnblk\fP: blocksize of block cyclic distribution, must be the same in both directions"
.br
.RI "int \fBmatrixCols\fP: number of columns of locally distributed matrices \fBa\fP and \fBq\fP"
.br
.RI "int \fBmpi_comm_rows\fP: communicator for communication in rows. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "int \fBmpi_comm_cols\fP: communicator for communication in columns. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "int \fBwantDebug\fP: if 1, print more debug information in case of an error"
.br
.RI "int \fBsuccess\fP: return value indicating success (1) or failure (0)"
.SH DESCRIPTION
Does a Cholesky factorization of a complex, hermitian matrix. The ELPA communicators \fBmpi_comm_rows\fP and \fBmpi_comm_cols\fP are obtained with the \fBget_elpa_communicators\fP(3) function. The distributed quadratic matrix \fBa\fP has global dimensions \fBna\fP x \fBna\fP, and a local size \fBlda\fP x \fBmatrixCols\fP.
.br
.SH "SEE ALSO"
\fBget_elpa_communicators\fP(3)
```
```nroff
.TH "elpa_cholesky_real" 3 "Wed Jun 29 2016" "ELPA" \" -*- nroff -*-
.ad l
.nh
.SH NAME
elpa_cholesky_real \- Cholesky factorization of a real symmetric matrix
.br
.SH SYNOPSIS
.br
.SS FORTRAN INTERFACE
use elpa1
.br
.br
.RI "success = \fBelpa_cholesky_real\fP (na, a(lda,matrixCols), lda, nblk, matrixCols, mpi_comm_rows, mpi_comm_cols, wantDebug)"
.br
.RI " "
.br
.RI "With the definitions of the input and output variables:"
.br
.RI "integer, intent(in) \fBna\fP: global dimension of quadratic matrix \fBa\fP to solve"
.br
.RI "real*8, intent(inout) \fBa\fP: locally distributed part of the matrix \fBa\fP. The local dimensions are \fBlda\fP x \fBmatrixCols\fP"
.br
.RI "integer, intent(in) \fBlda\fP: leading dimension of locally distributed matrix \fBa\fP"
.br
.RI "integer, intent(in) \fBnblk\fP: blocksize of cyclic distribution, must be the same in both directions"
.br
.RI "integer, intent(in) \fBmatrixCols\fP: number of columns of locally distributed matrices \fBa\fP and \fBq\fP"
.br
.RI "integer, intent(in) \fBmpi_comm_rows\fP: communicator for communication in rows. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "integer, intent(in) \fBmpi_comm_cols\fP: communicator for communication in columns. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "logical, intent(in) \fBwantDebug\fP: if .true., print more debug information in case of an error"
.RI "logical \fBsuccess\fP: return value indicating success or failure"
.br
.SS C INTERFACE
#include "elpa.h"
.br
.RI "\fBint\fP success = \fBelpa_cholesky_real\fP (\fBint\fP na, \fBdouble *\fPa, \fBint\fP lda, \fBint\fP nblk, \fBint\fP matrixCols, \fBint\fP mpi_comm_rows, \fBint\fP mpi_comm_cols, \fBint\fP wantDebug );"
.br
.RI " "
.br
.RI "With the definitions of the input and output variables:"
.br
.RI "int \fBna\fP: global dimension of quadratic matrix \fBa\fP to solve"
.br
.RI "double *\fBa\fP: pointer to locally distributed part of the matrix \fBa\fP. The local dimensions are \fBlda\fP x \fBmatrixCols\fP"
.br
.RI "int \fBlda\fP: leading dimension of locally distributed matrix \fBa\fP"
.br
.RI "int \fBnblk\fP: blocksize of block cyclic distribution, must be the same in both directions"
.br
.RI "int \fBmatrixCols\fP: number of columns of locally distributed matrices \fBa\fP and \fBq\fP"
.br
.RI "int \fBmpi_comm_rows\fP: communicator for communication in rows. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "int \fBmpi_comm_cols\fP: communicator for communication in columns. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "int \fBwantDebug\fP: if 1, print more debug information in case of an error"
.br
.RI "int \fBsuccess\fP: return value indicating success (1) or failure (0)"
.SH DESCRIPTION
Does a Cholesky factorization of a real, symmetric matrix. The ELPA communicators \fBmpi_comm_rows\fP and \fBmpi_comm_cols\fP are obtained with the \fBget_elpa_communicators\fP(3) function. The distributed quadratic matrix \fBa\fP has global dimensions \fBna\fP x \fBna\fP, and a local size \fBlda\fP x \fBmatrixCols\fP.
.br
.SH "SEE ALSO"
\fBget_elpa_communicators\fP(3)
```
```nroff
.TH "elpa_solve_tridi" 3 "Wed Jun 29 2016" "ELPA" \" -*- nroff -*-
.ad l
.nh
.SH NAME
elpa_solve_tridi \- Solve tridiagonal eigensystem with divide and conquer method
.br
.SH SYNOPSIS
.br
.SS FORTRAN INTERFACE
use elpa1
.br
.br
.RI "success = \fBelpa_solve_tridi\fP (na, nev, d(na), e(na), q(ldq,matrixCols), ldq, nblk, matrixCols, mpi_comm_rows, mpi_comm_cols, wantDebug)"
.br
.RI " "
.br
.RI "With the definitions of the input and output variables:"
.br
.RI "integer, intent(in) \fBna\fP: global dimension of quadratic matrix \fBa\fP to solve"
.br
.RI "integer, intent(in) \fBnev\fP: number of eigenvalues/eigenvectors to be computed"
.br
.RI "real*8, intent(inout) \fBd(na)\fP: array d(na); on input the diagonal elements of the tridiagonal matrix, on output the eigenvalues in ascending order"
.br
.RI "real*8, intent(inout) \fBe(na)\fP: array e(na); on input the subdiagonal elements of the matrix, on exit destroyed"
.br
.RI "real*8, intent(inout) \fBq\fP: on exit \fBq\fP contains the eigenvectors. The local dimensions are \fBldq\fP x \fBmatrixCols\fP"
.br
.RI "integer, intent(in) \fBldq\fP: leading dimension of locally distributed matrix \fBq\fP"
.br
.RI "integer, intent(in) \fBnblk\fP: blocksize of cyclic distribution, must be the same in both directions"
.br
.RI "integer, intent(in) \fBmatrixCols\fP: number of columns of locally distributed matrices \fBa\fP and \fBq\fP"
.br
.RI "integer, intent(in) \fBmpi_comm_rows\fP: communicator for communication in rows. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "integer, intent(in) \fBmpi_comm_cols\fP: communicator for communication in columns. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "logical, intent(in) \fBwantDebug\fP: if .true., print more debug information in case of an error"
.RI "logical \fBsuccess\fP: return value indicating success or failure"
.br
.SS C INTERFACE
#include "elpa.h"
.br
.RI "\fBint\fP success = \fBelpa_solve_tridi\fP (\fBint\fP na, \fBint\fP nev, \fBdouble *\fPd, \fBdouble *\fPe, \fBdouble *\fPq, \fBint\fP ldq, \fBint\fP nblk, \fBint\fP matrixCols, \fBint\fP mpi_comm_rows, \fBint\fP mpi_comm_cols, \fBint\fP wantDebug );"
.br
.RI " "
.br
.RI "With the definitions of the input and output variables:"
.br
.RI "int \fBna\fP: global dimension of quadratic matrix \fBa\fP to solve"
.br
.RI "int \fBnev\fP: number of eigenvalues/eigenvectors to be computed"
.br
.RI "double *\fBd\fP: pointer to array d(na); on input the diagonal elements of the tridiagonal matrix, on output the eigenvalues in ascending order"
.br
.RI "double *\fBe\fP: pointer to array e(na); on input the subdiagonal elements of the matrix, on exit destroyed"
.br
.RI "double *\fBq\fP: on exit \fBq\fP contains the eigenvectors. The local dimensions are \fBldq\fP x \fBmatrixCols\fP"
.br
.RI "int \fBldq\fP: leading dimension of locally distributed matrix \fBq\fP"
.br
.RI "int \fBnblk\fP: blocksize of block cyclic distribution, must be the same in both directions"
.br
.RI "int \fBmatrixCols\fP: number of columns of locally distributed matrices \fBa\fP and \fBq\fP"
.br
.RI "int \fBmpi_comm_rows\fP: communicator for communication in rows. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "int \fBmpi_comm_cols\fP: communicator for communication in columns. Constructed with \fBget_elpa_communicators\fP(3)"
.br
.RI "int \fBwantDebug\fP: if 1, print more debug information in case of an error"
.br
.RI "int \fBsuccess\fP: return value indicating success (1) or failure (0)"
.SH DESCRIPTION
Solves the eigenproblem of a tridiagonal matrix and returns \fBnev\fP eigenvalues/eigenvectors. The ELPA communicators \fBmpi_comm_rows\fP and \fBmpi_comm_cols\fP are obtained with the \fBget_elpa_communicators\fP(3) function. The distributed quadratic matrix \fBq\fP has global dimensions \fBna\fP x \fBna\fP, and a local size \fBldq\fP x \fBmatrixCols\fP.
.br
.SH "SEE ALSO"
\fBget_elpa_communicators\fP(3)
```
```diff
@@ -100,16 +100,26 @@ module ELPA1
   ! imported from elpa1_auxilliary

-  public :: mult_at_b_real          !< Multiply real matrices A**T * B
-  public :: mult_ah_b_complex       !< Multiply complex matrices A**H * B
+  public :: elpa_mult_at_b_real     !< Multiply real matrices A**T * B
+  public :: mult_at_b_real          !< old, deprecated interface to multiply real matrices A**T * B
+  public :: elpa_mult_ah_b_complex  !< Multiply complex matrices A**H * B
+  public :: mult_ah_b_complex       !< old, deprecated interface to multiply complex matrices A**H * B

-  public :: invert_trm_real         !< Invert real triangular matrix
-  public :: invert_trm_complex      !< Invert complex triangular matrix
+  public :: elpa_invert_trm_real    !< Invert real triangular matrix
+  public :: invert_trm_real         !< old, deprecated interface to invert real triangular matrix
+  public :: elpa_invert_trm_complex !< Invert complex triangular matrix
+  public :: invert_trm_complex      !< old, deprecated interface to invert complex triangular matrix

-  public :: cholesky_real           !< Cholesky factorization of a real matrix
-  public :: cholesky_complex        !< Cholesky factorization of a complex matrix
+  public :: elpa_cholesky_real      !< Cholesky factorization of a real matrix
+  public :: cholesky_real           !< old, deprecated interface to do Cholesky factorization of a real matrix
+  public :: elpa_cholesky_complex   !< Cholesky factorization of a complex matrix
+  public :: cholesky_complex        !< old, deprecated interface to do Cholesky factorization of a complex matrix

+  public :: elpa_solve_tridi        !< Solve tridiagonal eigensystem with divide and conquer method
   public :: solve_tridi             !< Solve tridiagonal eigensystem with divide and conquer method

   ! Timing results, set by every call to solve_evp_xxx
```