Building VASP on Sol (fully manual)

There are two pages for using VASP on Sol: one covering the pre-built module (module load) and this page, which covers a fully manual build from source.

In most cases, using the pre-built vasp module will suffice. Building VASP yourself should only be required if you need to adjust the source code and then compile it.

The benefits of the fully manual approach include conveniences such as:

  1. the instructions are portable and work on any supercomputer/workstation

VASP can be compiled on Sol without any special permissions. You can build it entirely, start to finish, as your own unprivileged supercomputer user.

The steps outlined in this page compile VASP 6.4.1, though the scripts are not limited to this version. However, changing the VASP version (older or newer), or switching to a different source tree altogether, may warrant changes this tutorial cannot anticipate.

It is recommended to complete this tutorial with the unchanged files to familiarize yourself with the process and the steps involved.

Setting up the Compilation Process

We will start by copying the vasp-building scripts to our own scratch space, designated /scratch/$USER (e.g., /scratch/wdizon/). First, move to a compute node and copy the scripts:

wdizon@login01 $ interactive -c 20
wdizon@c001 $ cp -R /packages/apps/letsbuildvasp /scratch/$USER/

-c 20 is chosen because it is the maximum number of cores VASP will use for parallel compilation. This is a separate limitation from how many cores the built binaries can run on; the completed binary will not be limited by the number chosen here.
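If you request a different core count for the interactive session, it is reasonable to keep MAKE_PROCS in compiler_variables in step with it. A minimal sketch, assuming a hypothetical 10-core session instead of the default 20:

# hypothetical: request a 10-core build session instead of 20
wdizon@login01 $ interactive -c 10
# then, in compiler_variables, match the number of parallel make jobs:
export MAKE_PROCS=10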

Let’s review the files we have copied:

$ cd /scratch/$USER/letsbuildvasp
$ ls
build_deps*  build_vasp*  compiler_variables  src/  tarballs/
$ cat compiler_variables
#!/bin/bash

export VASP_VER=6.4.1
export VASP_SRC=/scratch/$USER/letsbuildvasp
export VASP_INSTALL=/scratch/$USER/vasp_compiles

export OPENBLAS_VER=0.3.24
export SCALAPACK_VER=2.2.0
export MPICH_VER=3.0.4
export FFTW_VER=3.3.10

# vasp downloads typically extract to a directory following
# the format vasp-X.Y.Z. In some cases, this naming convention
# is not followed, or the source tree might be an archive
# of a differently named directory. If this is the case,
# enter the full extracted directory name here, instead.
# default: VASP_TARBALL_DIRNAME=vasp.6.4.1
# e.g., VASPv5.1
export VASP_TARBALL_DIRNAME=vasp.${VASP_VER}

# COMPILER SETUP
export DOWNLOAD_SOURCES=true
export EXTRACT_SOURCES=true
export BUILD_VASP=true

export MAKE_PROCS=20

export CC=gcc
export CXX=g++
export FC=gfortran
export FCFLAGS=-m64
export F77=gfortran
export FFLAGS=-m64

The file compiler_variables is the only file that will generally require any user editing (see the example after the list below). That said, the defaults in this file are known to work and to properly compile the following versions using MPICH and the GCC compiler without any modifications at all:

  • VASP 6.4.1
  • OpenBLAS 0.3.24
  • ScaLAPACK 2.2.0
  • MPICH 3.0.4
  • FFTW 3.3.10
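As a hypothetical illustration (no such change is needed for this tutorial), building a different VASP release would mean editing the version variables near the top of compiler_variables. The version number and directory name below are placeholders, not tested values:

# hypothetical edits to compiler_variables for a different release
export VASP_VER=6.4.2                          # placeholder version
export VASP_TARBALL_DIRNAME=vasp.${VASP_VER}   # default naming convention
# if the archive extracts to a differently named directory, name it explicitly:
# export VASP_TARBALL_DIRNAME=my_vasp_source   # placeholder directory name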

Extracting and Compiling Libraries

Execute the following lines:

cd /scratch/$USER/letsbuildvasp
./build_deps

If there are no issues (no conflicting module loads or other environment problems), the last few lines should indicate SUCCESS twice. All known builds of VASP on Sol have been built with the separate system compiler (gcc 9.4.0), which means there is one module load required for operation:

module load gcc-9.4.0-gcc-11.2.0

This is because there is a known issue with gcc 8.x that prevents VASP from operating correctly.

If you see SUCCESS twice, as above, this means at least the following:

  1. All files have been extracted from their tarballs (from /scratch/$USER/letsbuildvasp/tarballs/6.4.1 into /scratch/$USER/letsbuildvasp/src).

  2. The GCC compiler was able to successfully compile a C and a Fortran test program, showing readiness to continue.

If you do not see output matching above, do not continue.

If necessary, start a new terminal session, ensure no conflicting modules are loaded (module purge), then load the required module (module load gcc-9.4.0-gcc-11.2.0).
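Put together, a clean retry of the dependency build might look like the following sketch, based on the commands shown above:

wdizon@c001 $ module purge
wdizon@c001 $ module load gcc-9.4.0-gcc-11.2.0
wdizon@c001 $ cd /scratch/$USER/letsbuildvasp
wdizon@c001 $ ./build_deps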

What’s happened so far

At this point in the process, the dependency libraries (MPICH, OpenBLAS, ScaLAPACK, and FFTW) have been successfully built. They are stored in /scratch/$USER/vasp_compiles/libraries. Should you choose to have multiple VASP builds, these libraries can be reused to save the time of recompiling them.
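As a quick sanity check, you can list the library install location; you should see one subdirectory per dependency (the exact names may vary, and only mpich is referenced later in this page):

$ ls /scratch/$USER/vasp_compiles/libraries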

Compiling VASP

The remaining step is to compile VASP itself; this is consolidated into a single script. You can start the process with the following lines:

cd /scratch/$USER/letsbuildvasp
./build_vasp

Indication of Success

At the end of the script, you should see Compilation Complete and you will be returned to the prompt.

Usage Notes

Alternate Modules

These steps build MPICH manually and do not use any system modules (e.g., from module load). Using these binaries will therefore often require full paths to the compiled binaries in your SBATCH scripts and interactive sessions. Example:

mpiexec or mpiexec.hydra might be your preferred MPI launcher, but you must invoke it by its full path:

/scratch/$USER/vasp_compiles/libraries/mpich/bin/mpiexec
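To avoid retyping that long path, one optional convenience (a sketch, not something the build scripts require) is to store the launcher path in a shell variable for the session or job script:

# optional: keep the launcher path in a variable
MPIEXEC=/scratch/$USER/vasp_compiles/libraries/mpich/bin/mpiexec
$MPIEXEC -np 12 ./vasp_std   # example core count; match -np to your allocation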

$USER variable

The $USER variable expands to your login username, which matches your ASURITE ID. It is used here to simplify copy/paste operations, rather than expecting you to type in, for example, /scratch/wdizon, which is a completely permissible and workable alternative.
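For example, you can confirm what the variable expands to in your own session:

$ echo /scratch/$USER
/scratch/wdizon   # your own ASURITE ID appears here instead of wdizon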

Repeated Builds

If your VASP source code changes but your libraries/dependencies remain constant, you can speed up your testing by following these steps (full overview; a command-level sketch follows the list):

  1. Identify the tarball needed for the VASP source code and place it in /scratch/$USER/letsbuildvasp/tarballs/6.4.1

  2. In compiler_variables, make any changes as desired.

  3. Run ./build_deps to completion.

  4. Identify the newly created directory name in /scratch/$USER/letsbuildvasp/src

  5. Run ./build_vasp

  6. Test and use VASP; when you need to make changes…

    1. Make changes to source in /scratch/$USER/letsbuildvasp/src/<dir>

    2. ./build_vasp

    3. Repeat #6
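In command form, the edit-and-rebuild loop from steps 5 and 6 looks roughly like this (<dir> is a placeholder for the extracted source directory, e.g. vasp.6.4.1 with the default settings):

cd /scratch/$USER/letsbuildvasp
# 1. edit the source files under src/<dir> with the editor of your choice
# 2. rebuild
./build_vasp
# 3. test the rebuilt binaries, then repeat steps 1-2 as needed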

Now that it is built:

VASP's files are located at /scratch/$USER/vasp_compiles/vasp-6.4.1/run.

VASP is built against MPICH and launched with mpiexec.hydra, e.g., /scratch/$USER/vasp_compiles/libraries/mpich/bin/mpiexec.hydra -np 12 ./vasp_std

If you run this interactively, be sure to choose -c <num cores> to match -np <num cores>. If you are submitting this as a batch job, make sure your #SBATCH -c <num cores> matches.
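As a minimal sketch of a batch submission, the pieces above fit together as follows. The job name and walltime are assumptions, and any partition/account directives your allocation requires are omitted; adjust them to your situation:

#!/bin/bash
#SBATCH -J vasp_run              # hypothetical job name
#SBATCH -c 12                    # must match the -np value below
#SBATCH -t 0-04:00:00            # assumed walltime; adjust as needed

# the one module load required for operation (see above)
module load gcc-9.4.0-gcc-11.2.0

cd /scratch/$USER/vasp_compiles/vasp-6.4.1/run
/scratch/$USER/vasp_compiles/libraries/mpich/bin/mpiexec.hydra -np 12 ./vasp_std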