Application Guide: ABAQUS

ABAQUS Licensing

ABAQUS is licensed using a combination of Execute Tokens (for compute jobs) and Interactive Seats (for running an ABAQUS graphical user interface).
Advanced Research Computing funds 175 Execute Tokens and 16 Interactive Seats for general use on BlueBEAR. Additionally, research groups can purchase their own licences to guarantee access to ABAQUS tokens or seats.

The number of Execute Tokens required is dependent on the number of cores being used. This relationship is given as follows:

\[ \text{Tokens} = int(b \times N^{0.422}) \]
  • \(b\) = base value for a single core (see table below)
  • \(N\) = number of cores (equivalent to --ntasks in Slurm)
  • \(int\) denotes truncation (rounding down), not rounding to the nearest integer
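
For example, a Standard job on 8 cores (\(b = 5\)) requires \( int(5 \times 8^{0.422}) = int(12.02) = 12 \) Execute Tokens, which matches the table below.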

An overview of the Execute Tokens required is given in the following table:

| Number of Cores | Standard, Explicit and CFD | Foundation | Aqua and Design |
| --- | --- | --- | --- |
| 1 | 5 | 3 | 6 |
| 2 | 6 | 4 | 8 |
| 4 | 8 | 5 | 10 |
| 8 | 12 | 7 | 14 |
| 12 | 14 | 8 | 17 |
| 16 | 16 | 9 | 19 |
| 24 | 19 | 11 | 22 |
| 32 | 21 | 12 | 25 |
| 64 | 28 | 17 | 34 |
| 128 | 38 | 23 | 46 |

Querying ABAQUS Licences

Information on the ABAQUS licence features that are available can be found by executing the following command:

Non-BEAR licences

Please be aware that these licence information commands show ABAQUS licence usage across the entire University. Output will therefore include licences owned by specific users or groups, which other users cannot access and which may be allocated to non-BEAR systems.

abaqus licensing r

Information on current licence usage can be found by executing this command:

abaqus licensing ru

(N.B. for both of these commands you will first need to load an ABAQUS module.)
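
For example, a typical query sequence is shown below (the module version is the one used later in this guide; run module avail ABAQUS to see which versions are installed):

module load ABAQUS/2021-hotfix-2117
abaqus licensing ru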

The above command produces a large amount of output. Search for lines such as the following to determine licence availability for the feature that you require:

Users of cae:  (Total of 38 licenses issued;  Total of 36 licenses in use)
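
For example, to check the availability of a specific feature such as cae, you can filter the usage output with grep:

abaqus licensing ru | grep "Users of cae"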

Insufficient Licences

Where there are insufficient licences available, a message similar to the following will be displayed in your Slurm output file:

"standard" license request queued for the License Server.
Total time in queue: 600 seconds.

Depending on the likely wait for an available licence and your job's remaining walltime, you may wish to resubmit your job at a later time.
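
If you decide not to wait, you can cancel the queued analysis with scancel and resubmit it later (the job ID below is purely illustrative):

scancel 1234567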

ABAQUS Example

Running ABAQUS as a batch job requires an input file that defines the analysis to be performed.
The following example shows how to run a simple ABAQUS job on BlueBEAR via Slurm, using the force_shearflex_beam3d_xpl.inp input file included with the ABAQUS installation.

#!/bin/bash
#SBATCH --account=_project-account_
#SBATCH --qos=bbshort
#SBATCH --nodes=1
#SBATCH --ntasks=8
#SBATCH --time=0-00:10:00

set -e

module purge; module load bluebear
module load ABAQUS/2021-hotfix-2117 # (1)!

if [ ! -f "force_shearflex_beam3d_xpl.inp" ]; # (2)!
  then
    cp $EBROOTABAQUS/doc/English/SIMAINPRefResources/force_shearflex_beam3d_xpl.inp .
fi

abaqus job=force_shearflex_beam3d_xpl cpus=${SLURM_NTASKS} interactive # (3)!
  1. This example is only valid for the 2021-hotfix-2117 version.
  2. Checks whether the input file already exists in the working directory and, if not, copies it there from the ABAQUS installation.
  3. ABAQUS command:

    • The job name matches the input file name (without the .inp extension).
    • cpus sets the number of cores used, here taken from Slurm's ${SLURM_NTASKS}.
    • The interactive option runs the analysis in the foreground so that its output is written to the Slurm output file.
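
To run the example, save the script to a file (named abaqus_example.sh here purely for illustration) and submit it to Slurm with sbatch:

sbatch abaqus_example.sh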

ABAQUS jobs generate several output files, which can be inspected after your job has completed:

| Extension | Purpose |
| --- | --- |
| .odb | Output database used for post-processing results in ABAQUS/CAE |
| .dat | Printed output with a summary of results, warnings and solver messages |
| .msg | Message file with detailed step-by-step analysis progress |
| .sta | Status file showing increment summaries and convergence information |
| .fil | Results file (ASCII or binary) used for custom post-processing or external tools |
| .res | Restart file to continue or recover an interrupted analysis |
| .log | Log file with timestamps and module execution information |

You can view the results by loading the .odb file in the ABAQUS/CAE GUI. You can also view results locally, either by transferring your .odb file or by mounting your BEAR project directory on your local machine.
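
If an Interactive Seat is free and you have a graphical session, the output database from the example above can be opened in ABAQUS/CAE with a command along these lines:

abaqus cae database=force_shearflex_beam3d_xpl.odb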