2.3. Using Altair’s Sample Script

Altair provides a short sample script for running Fluent under PBS Professional. The script reads several pieces of required information from a configuration file (outlined in the section that follows); alternatively, these pieces of information can be specified on the command line. The fluent_args option does not need to specify the number of CPUs when the default script is used: the script determines this for the user, based on the PBS Professional resource requests.

2.3.1. Configuration File Example

The sample script uses a configuration file called pbs_fluent.conf if no command line arguments are present. This configuration file should be present in the directory from which the jobs are submitted (which is also the directory in which the jobs are executed). The following is an example of what the content of pbs_fluent.conf can be:

  input="example_small.flin"
  case="Small-1.65m.cas"
  fluent_args="3d -pinfiniband"
  outfile="fluent_test.out"
  mpp="true"

The following is an explanation of the parameters:

input

is the name of the input file.

case

is the name of the .cas file that the input file will utilize.

fluent_args

are any extra Fluent arguments. As shown in the previous example, you can specify the interconnect with the -p<interconnect> option. The available interconnects include ethernet (the default) and infiniband. The MPI implementation is selected automatically, based on the specified interconnect.

outfile

is the name of the file to which the standard output will be sent.

mpp="true"

will tell the job script to execute the job across multiple processors.
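For comparison, a shared-memory (SMP) job simply omits the mpp setting. A minimal pbs_fluent.conf for such a run might look like the following (the file names here are illustrative, not taken from a real case):

```shell
# Sample pbs_fluent.conf for a single-node SMP run; note the absence of mpp.
input="example_small.flin"
case="Small-1.65m.cas"
fluent_args="3d"
outfile="fluent_smp.out"
```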

2.3.2. Altair’s Sample Script

Altair’s sample script is not intended to be a complete solution for running Fluent jobs under PBS Professional, but rather a simple starting point. It runs jobs out of the directory from which they are submitted (PBS_O_WORKDIR). Take care to submit jobs from locations that will be available on every execution node (for example, via NFS).

 #!/bin/sh
 cd $PBS_O_WORKDIR
 
 #We assume that if they didn't specify arguments then they should use the
 #config file
 if [ "xx${input}${case}${mpp}${fluent_args}zz" = "xxzz" ]; then
   if [ -f pbs_fluent.conf ]; then
     . pbs_fluent.conf
   else
     printf "No command line arguments specified, "
     printf "and no configuration file found. Exiting\n"
     exit 1
   fi
 fi
 
 #Set up the license information (Note: you need to substitute your own
 #port and server in this command)
 export ANSYSLMD_LICENSE_FILE=<port>@<license_server>
 
 #Augment the Fluent command line arguments
 case "$mpp" in
   true)
     #MPI job execution scenario
     num_nodes=$(sort -u "$PBS_NODEFILE" | wc -l)
     cpus=$(expr $num_nodes \* $NCPUS)
     #Default arguments for mpp jobs, these should be changed to suit your
     #needs.
     fluent_args="-t${cpus} $fluent_args -cnf=$PBS_NODEFILE"
     ;;
   *)
     #SMP case
     #Default arguments for smp jobs, should be adjusted to suit your
     #needs.
     fluent_args="-t$NCPUS $fluent_args"
     ;;
 esac
 #Default arguments for all jobs
 fluent_args="-ssh -g -i $input $fluent_args"

 echo "---------- Going to start a fluent job with the following settings:
 Input: $input
 Case: $case
 Output: $outfile
 Fluent arguments: $fluent_args"
 
 #run the solver
 /ansys_inc/v242/fluent/bin/fluent $fluent_args  > $outfile

Note that for versions of Fluent prior to 12.0, the final line of the sample script should be changed to the following:

  /usr/apps/Fluent.Inc/bin/fluent $fluent_args > $outfile
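The multiprocessor branch of the script derives the total CPU count from the PBS node file. The arithmetic can be illustrated stand-alone with a mock node file (the node names and NCPUS value below are assumptions for the example):

```shell
# Mock node file: PBS Professional lists one line per allocated chunk,
# so the same host can appear more than once.
nodefile=$(mktemp)
printf 'node01\nnode01\nnode02\nnode02\n' > "$nodefile"
NCPUS=4    # CPUs per node, as PBS Professional would set it for the job

num_nodes=$(sort -u "$nodefile" | wc -l | tr -d ' ')   # unique hosts -> 2
cpus=$(expr $num_nodes \* $NCPUS)                      # 2 * 4 = 8
printf '%s\n' "-t${cpus}"                              # prints -t8
rm -f "$nodefile"
```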

2.3.3. Submitting Altair’s Sample Script

  • To submit the script with command line arguments:

    • qsub -l <resource_requests> -v input=<input_file>,case=<case_file>,fluent_args=<fluent_arguments>,outfile=<output_file>[,mpp="true"] fluent-job.sh

  • To submit the script without command line arguments:

    • Edit pbs_fluent.conf to suit your needs

    • qsub -l <resource_requests> fluent-job.sh

Note that the resources necessary for the job (that is, <resource_requests>) must be specified with the proper qsub syntax. For more information about requesting resources, see the PBS Professional 7.1 Users Guide, Section 4.3.
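As a concrete illustration, a two-node run with four CPUs per node might be submitted as follows (the select statement and file names are assumptions for the example; adjust them to your site and PBS Professional version):

```shell
# Submit an MPI Fluent job on 2 nodes x 4 CPUs; the whole -v list is quoted
# because the fluent_args value contains a space.
qsub -l select=2:ncpus=4 \
     -v "input=example_small.flin,case=Small-1.65m.cas,fluent_args=3d -pinfiniband,outfile=fluent_test.out,mpp=true" \
     fluent-job.sh
```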

2.3.4. Epilogue/Prologue

PBS Professional provides the ability to run scripts immediately before a job starts (the prologue) and immediately after it ends (the epilogue), even if the job is removed. The epilogue is a good place to put functionality that cleans up after a job and ends any job-related processes that are still running. For instance, you could include functionality to clean up a job's working/scratch area if the job is deleted before completion. Fluent provides a way to end job-related processes in the form of a shell script written to the work directory (cleanup-fluent-<host-pid>). It can be useful to have the epilogue call this script to ensure that all errant processes are cleaned up after a job completes.
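The cleanup step could be wired into an epilogue along the lines of the following sketch, written here as a shell function for illustration (the function name and the assumption that the job's work directory is passed in are ours; a real epilogue is a standalone script that PBS Professional invokes with the job id, from which the directory would be derived):

```shell
#!/bin/sh
# Hypothetical helper: clean up after a Fluent job, given its work directory.
fluent_epilogue() {
    jobdir="$1"
    [ -d "$jobdir" ] || return 0
    # Run any cleanup-fluent-<host-pid> scripts Fluent left in the work
    # directory, ending errant job-related processes.
    for script in "$jobdir"/cleanup-fluent-*; do
        [ -x "$script" ] && "$script"
    done
    # Remove the job's working/scratch area.
    rm -rf "$jobdir"
}
```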