hypre/parcsr_es/LOBPCG
blue.1 New version given by Andrew. 2003-03-21 23:04:22 +00:00
doc.txt New version given by Andrew. 2003-03-21 23:04:22 +00:00
eigval.mtx Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
eigvalhistory.mtx Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
hypre_arch.sh Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
HYPRE_lobpcg.h New version given by Andrew. 2003-03-21 23:04:22 +00:00
IJ_eigen_solver.c New version given by Andrew. 2003-03-21 23:04:22 +00:00
IJ_eigen_solver.sh Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
laplacian_10_10_10.mtx Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
lobpcg_matrix.c Check in newest stuff from Martin. 2003-03-22 02:11:52 +00:00
lobpcg_utilities.c Correct Andrew's error. 2003-03-25 00:20:30 +00:00
lobpcg.c Check in newest stuff from Martin. 2003-03-22 02:11:52 +00:00
lobpcg.h Check in newest stuff from Martin. 2003-03-22 02:11:52 +00:00
Makefile.beowulf_gcc_scali Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
Makefile.beowulf_mpich Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
Makefile.beowulf_pgcc_scali Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
Makefile.blue Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
Makefile.dec Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
Makefile.in Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
Makefile.linux Check in latest stuff from Andrew. 2003-03-25 00:11:46 +00:00
mmio.h Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
mpirun.beowulf_mpich.sh Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
mpirun.dec Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
mpirun.linux Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
readme.txt Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
resvec.mtx Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00
Xin.mtx Put lobpcg into hypre. 2003-02-20 21:45:10 +00:00

--------------------------------------------------------------
System interface:
See the file doc.txt for documentation of the system interface
between lobpcg and hypre.



--------------------------------------------------------------
LLNL comments: 

The LLNL file system is shared, so make sure you untar hypre
into a different directory for each cluster. A single directory can serve
both tera and gps because they are binary compatible.
To compile hypre, just
cd hypre-1.6.0/src
and run configure followed by make.
configure is tuned to work on every LLNL cluster without any special options.
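
A minimal sketch of the whole sequence, assuming the distribution is named
hypre-1.6.0.tar.gz and sits in your home directory (both are assumptions):

mkdir ~/blue                     # one directory per cluster (shared file system)
cd ~/blue
tar xzf ~/hypre-1.6.0.tar.gz     # tarball name is an assumption
cd hypre-1.6.0/src
./configure                      # no special options needed on LLNL clusters
make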

To compile lobpcg, create a new directory hypre-1.6.0/src/lobpcg,
uncompress/untar the lobpcg distribution file there, and run
make -f Makefile.cluster
where Makefile.cluster is the makefile matching your machine
(a sketch follows this list):
Makefile.blue for Blue,
Makefile.dec for Tera/GPS,
Makefile.linux for LX.
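
For example, on LX (the distribution file name lobpcg.tar.gz is an
assumption; use whatever name your copy has):

mkdir hypre-1.6.0/src/lobpcg
cd hypre-1.6.0/src/lobpcg
tar xzf ~/lobpcg.tar.gz          # file name is an assumption
make -f Makefile.linux           # pick the makefile for your cluster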

To run the lobpcg eigensolver on each machine:

On blue, use PSUB/POE, IBM's analog of PBS; follow the comments in blue.1.
You'll need to edit the line in blue.1 that points to the right directory
(a submission sketch follows the links below).
If you want to learn more, check
http://www.llnl.gov/asci/platforms/bluepac/psub.example.html
http://www.llnl.gov/asci/platforms/bluepac/
http://www.llnl.gov/computing/tutorials/workshops/workshop/poe/MAIN.html
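
A minimal submission sketch, assuming psub accepts the job script directly
(the exact psub options are site-specific; see the links above):

# after editing the directory line in blue.1, submit it:
psub blue.1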

On tera, gps, and lx, our shell script IJ_eigen_solver.sh works, so just run it.
On LX, you first need to enable passwordless ssh between the nodes
by executing from the command line:
kinit -f
It will prompt you for your password again.
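
A minimal LX session sketch (the ssh check and the node name are assumptions,
included only to verify that the credentials took):

kinit -f                  # prompts for your password again
ssh lx-node1 hostname     # optional check; node name is hypothetical
./IJ_eigen_solver.sh      # then run the driver script as usual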

To run larger problems on tera, gps, and lx, one needs to use PBS.
Start with
http://www.llnl.gov/icc/lc/OCF_resources.html
and in particular the tutorials at
http://www.llnl.gov/computing/training/#tutorials
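
A minimal PBS script sketch (the node counts, walltime, and launcher line
are assumptions; adapt them to the cluster's documented settings):

#!/bin/sh
#PBS -l nodes=4:ppn=2            # 4 nodes, 2 processes per node (assumed)
#PBS -l walltime=00:30:00        # 30-minute limit (assumed)
cd $PBS_O_WORKDIR                # start where the job was submitted
mpirun -np 8 IJ_eigen_solver -n 50 50 50

Save it as, say, job.pbs and submit it with qsub job.pbs.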



--------------------------------------------------------------
CU-Denver MATH Beowulf comments: 

There are several options for compiling hypre; see the
corresponding comments in:

Makefile.beowulf_gcc_scali   (gnu)
Makefile.beowulf_pgcc_scali  (pgi)
Makefile.beowulf_mpich  

and then compile lobpcg with the matching makefile.
To run the lobpcg eigensolver, the procedure depends
on whether scali or mpich was used for compilation.

a) For scali: 
Example (background job):
scasub -mpimon -np 6 -npn 2 IJ_eigen_solver -n 50 50 50

b) For mpich: 
Examples (interactive job):
mpirun -np 5 IJ_eigen_solver -solver 12 -itr 20
mpimon IJ_eigen_solver -- node1 2 node2 2
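
For an mpich run spread over specific nodes, a minimal sketch, assuming a
plain MPICH 1.x mpirun and a hand-written machines file (the file name and
node names are assumptions):

# machines: one node name per line, e.g.
#   node1
#   node2
mpirun -np 4 -machinefile machines IJ_eigen_solver -n 50 50 50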