Building VASP on Sol (pre-compiled libraries)

There are two pages for using VASP on Sol. The process outlined on this page uses pre-compiled libraries vetted by Research Computing staff, so the user compiles only the VASP binaries. Building VASP (fully manual) describes a process in which the libraries themselves are built by the user and placed in user storage (e.g., scratch).

The pre-compiled library approach offers several conveniences:

  1. the libraries are known to work

  2. it saves the user from worrying about dependency trees and library compilation time

  3. user configuration (vasp_variables) is more straightforward

  4. multiple VASP instances can easily be compiled, each in its own independent directory

VASP can be compiled on Sol without any special permissions; you can build it entirely, start to finish, as an ordinary, unprivileged supercomputer user.

The steps outlined on this page compile VASP 6.4.1, though the scripts are not limited to only this version. However, changing the VASP version (older or newer), or switching to a different source tree altogether, may warrant changes this tutorial cannot anticipate.

It is recommended to complete this tutorial with the unchanged files to familiarize yourself with the process and the steps involved.

Setting up the Compilation Process

We will start by copying over the vasp-building scripts to our own scratch space. This space is designated as /scratch/$USER, such as /scratch/wdizon/. Let’s start by moving to a compute node to copy the scripts:

wdizon@login01 $ interactive -c 20
wdizon@c001 $ cp -R /packages/apps/letsbuild/vasp /scratch/$USER/

-c 20 is chosen because 20 is the maximum number of cores the VASP build will use for parallel compilation. This is a separate limitation from how many cores the built binaries can run on; the completed binary is not limited by the number chosen here.

Let’s review the files we have copied:

$ cd /scratch/$USER/vasp
$ ls
build*  check_sanity@  compiler_variables@  extracted_src  tarballs/  vasp_variables
$ cat vasp_variables
#!/bin/bash

# VASP source code is located at:
export VASP_SOURCE_EXTRACTED=/scratch/$USER/vasp-src

# During `configure`, this script will copy the above source
# paths into the below directory structure: $VASP_INSTALL/$VASP_DESTDIR
# Any changes to source code should happen in $VASP_SOURCE_EXTRACTED.

# Base directory for all vasp compilations (many may exist in parallel)
export VASP_INSTALL=/scratch/$USER/vasp_compiles

# Install VASP into $VASP_INSTALL with the directory name $VASP_DESTDIR
# Rename this for each compilation you want to exist independently
# e.g., VASP-6.4.1-patched
# e.g., VASP-6.4.1-custom
export VASP_DESTDIR=VASP-6-custom

The file vasp_variables is the only file that generally requires user editing. That said, the defaults in this file are known to work and to properly compile the following versions with MPICH and the GCC compiler, without any modifications at all:

  • VASP 6.4.1
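For instance, pointing a second, independent build at its own directory only requires changing VASP_DESTDIR. The sketch below edits a mock copy of vasp_variables so the example is self-contained; on Sol you would edit /scratch/$USER/vasp/vasp_variables directly, and the directory name VASP-6.4.1-custom is just the example suggested in the file's comments:

```shell
# Sketch: give a second, independent compilation its own install directory
# by changing VASP_DESTDIR. This edits a mock copy of vasp_variables so the
# example is self-contained; on Sol, edit /scratch/$USER/vasp/vasp_variables.
workdir=$(mktemp -d)
cat > "$workdir/vasp_variables" <<'EOF'
#!/bin/bash
export VASP_SOURCE_EXTRACTED=/scratch/$USER/vasp-src
export VASP_INSTALL=/scratch/$USER/vasp_compiles
export VASP_DESTDIR=VASP-6-custom
EOF

# Rename the destination directory for this compilation
sed -i 's/^export VASP_DESTDIR=.*/export VASP_DESTDIR=VASP-6.4.1-custom/' "$workdir/vasp_variables"
grep '^export VASP_DESTDIR=' "$workdir/vasp_variables"
# → export VASP_DESTDIR=VASP-6.4.1-custom
```

Because each build writes into $VASP_INSTALL/$VASP_DESTDIR, renaming VASP_DESTDIR before each run is what lets multiple compilations coexist.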

Testing the Environment is Sane

Load a module for the compiler. VASP 6.4.1 has incompatibilities with GCC 8.5.0, so we will be loading an additional module: gcc-9.4.0-gcc-11.2.0

Execute the following lines:

$ cd /scratch/$USER/vasp
$ module load gcc-9.4.0-gcc-11.2.0
$ ./check_sanity
setting up compiler vars...
testing compiler sanity...
SUCCESS test 1 fortran only fixed format
Assume Fortran 2003: has FLUSH, ALLOCATABLE derived type, and ISO C Binding
SUCCESS test 2 fortran only free format
SUCCESS test 3 C only
C function called by Fortran
Values are xx = 2.00 and ii = 1
SUCCESS test 4 fortran calling c
SUCCESS csh test
SUCCESS perl test
SUCCESS sh test
testing compiled netcdf+mpich sanity...
C function called by Fortran
Values are xx = 2.00 and ii = 1
SUCCESS test 1 fortran + c + netcdf
C function called by Fortran
Values are xx = 2.00 and ii = 1
status = 2
SUCCESS test 2 fortran + c + netcdf + mpi

If there are no issues (no conflicting module loads or other environment problems), the last few lines should indicate SUCCESS twice. All known builds of VASP on Sol have been built with this compiler (GCC 9.4.0), because a known issue with GCC 8.x prevents VASP from operating correctly.

If you see SUCCESS twice, as above, this means at least the following:

  1. All files have been extracted from their tarballs (/scratch/$USER/vasp/tarballs/6.4.1 into /scratch/$USER/vasp/extracted_src)

  2. The GCC compiler successfully compiled C and Fortran test programs, showing readiness to continue.

If you do not see output matching above, do not continue.

If necessary, start a new terminal session, ensure no conflicting modules are loaded (module purge), then load the compiler module again (module load gcc-9.4.0-gcc-11.2.0).
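A clean retry from a fresh terminal might look like the following; the module name is the one used above:

```shell
module purge                        # drop any conflicting modules
module load gcc-9.4.0-gcc-11.2.0    # reload only the required compiler module
cd /scratch/$USER/vasp
./check_sanity                      # re-run the sanity checks
```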

Compiling VASP

The remaining step is to compile VASP itself; this is consolidated into a single script. You can start the process with the following line:

$ ./build

You will be prompted about which BLAS library to use; type b or l. Both options are the same library (they point to the same filepath).

Indication of Success

At the end of the script, you should see SUCCESS and Compilation Complete, and you will be returned to the prompt.

Usage Notes

Alternate Modules

Using this approach requires two modules:

mpiexec or mpiexec.hydra may be your preferred MPI launcher, and you can invoke either directly once the modules are loaded. Note that letsbuild/vasp is not loaded for the compilation process, only at runtime.

$USER variable

The $USER variable expands to your login username, which matches your ASURITE ID. It is used here to simplify copy/paste operations rather than expecting you to type in, for example, /scratch/wdizon, which is a completely permissible and workable alternative.
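A quick way to see this expansion for yourself; the username wdizon is just the example used throughout this page:

```shell
# $USER expands to your login (ASURITE) username; "wdizon" stands in for
# whatever your actual username is.
USER=wdizon
echo /scratch/$USER/vasp    # the shell substitutes the value of $USER
# → /scratch/wdizon/vasp
```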

Repeated Builds

If your VASP source code changes but your libraries and dependencies remain constant, you can speed up your testing by following these steps (full overview):

  1. Identify the tarball needed for the VASP source code and place it in /scratch/$USER/vasp-src

  2. In vasp_variables, make any changes as desired.

  3. Load the modules and run ./build to completion
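The rebuild cycle above can be sketched as a short command sequence (paths and module name are the ones used in this tutorial; the tarball name is hypothetical):

```shell
cp vasp-6.4.1.tgz /scratch/$USER/vasp-src/   # step 1: hypothetical tarball name
cd /scratch/$USER/vasp
$EDITOR vasp_variables                       # step 2: adjust VASP_DESTDIR, etc.
module load gcc-9.4.0-gcc-11.2.0             # step 3: load the compiler module
./build                                      # step 3: recompile to completion
```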

Now that it is built:

VASP's files are located at /scratch/$USER/vasp_compiles/VASP-6-custom/bin.

VASP is built using MPICH and launched with mpiexec.hydra, e.g., mpiexec.hydra -np 12 ./vasp_std

If you run this interactively, be sure to choose -c <num cores> to match -np <num cores>. If you are submitting this with a batch job, make sure your #SBATCH -c <num cores> matches.
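As a sketch, a batch script for the run above might look like the following. The time limit and run directory are assumptions; the module names are the ones referenced earlier on this page:

```shell
#!/bin/bash
#SBATCH -c 12                # must match -np below
#SBATCH -t 0-01:00:00        # hypothetical time limit

module load gcc-9.4.0-gcc-11.2.0
module load letsbuild/vasp   # runtime module, per the Alternate Modules note

cd /scratch/$USER/myrun      # hypothetical run directory with your VASP inputs
mpiexec.hydra -np 12 /scratch/$USER/vasp_compiles/VASP-6-custom/bin/vasp_std
```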