Setting Up Client-to-HPC Communication and File Transfers


Introduction

To submit solve jobs to an HPC resource (for example, a cluster) via Ansys Remote Solve Manager (RSM), you must establish how RSM clients will communicate with the remote HPC resource, and how job files will be transferred from client machines to that resource.

When a job is submitted from a client application (such as Workbench or Mechanical) to RSM, the client must be able to communicate with the remote HPC resource. You can configure direct client-to-HPC communication, or use SSH or a custom mechanism if your environment requires it.

Upon job submission, a client working directory is created to house all job input files. The files in client working directories must be made accessible to the HPC resource. You can accomplish this in one of two ways:

  • In the client application, specify that working directories should be created under an HPC staging directory.

  • Make it possible for files to be transferred from the working directory to an HPC staging directory.

The HPC staging directory is a shared directory that is accessible to all execution nodes. Client files are transferred to this directory first, and from there are made available to all nodes used by the job.
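As an illustration, on a Linux cluster the staging directory is commonly exported over NFS so that every execution node can mount it at the same path. The commands below are a hypothetical sketch, not an RSM-specific procedure; the directory name `/staging` and the hostname `headnode` are placeholders for your own environment.

```shell
# Hypothetical example: export a staging directory over NFS so all
# execution nodes can reach it. Paths and hostnames are placeholders.

# On the machine hosting the staging directory:
sudo mkdir -p /staging
echo "/staging *(rw,sync,no_subtree_check)" | sudo tee -a /etc/exports
sudo exportfs -ra          # re-export everything listed in /etc/exports

# On each execution node, mount the share at the same path:
sudo mkdir -p /staging
sudo mount -t nfs headnode:/staging /staging
```

Using an identical mount path on every node avoids path-translation issues when jobs reference files in the staging area.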

RSM handles the following two functions independently:

  • Client-to-HPC communication. Refers mainly to the submission of jobs from client machines to the cluster submit host. You will need to determine if your organization's IT policy requires SSH or custom communication between clients and the HPC resource. Otherwise, a form of direct communication will be used. See Setting Up Client-to-HPC Communication.

  • Client-to-HPC file transfers. Refers to the transfer of job files from the client working directory to the HPC staging directory. You will need to decide how job files will be handled, and take any necessary steps to ensure that the files will be available to the cluster. See Setting Up Client-to-HPC File Transfers.

If you choose to use SSH for client-to-HPC communication, that does not mean that you have to use SSH for file transfers as well. RSM provides the flexibility to configure these functions separately.

This tutorial describes the available communication options and file handling methods in RSM, and the steps you need to take to ensure that client-to-HPC communication and file transfers are successful. The steps presented apply to all cluster types, including Ansys RSM Cluster (ARC), Windows HPC, and Linux clusters running LSF, PBS, Torque with Moab, or Altair Grid Engine (formerly UGE).

Assumptions

These instructions assume the following:

  • You have installed and properly configured a cluster, and the execution nodes can access the cluster head node.

  • For a Linux cluster, you have passwordless SSH set up between the head node and execution nodes. Consult an IT professional for assistance with setting up passwordless SSH.
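If passwordless SSH is not yet in place, a typical setup looks like the following. This is a generic OpenSSH sketch, not an RSM requirement beyond what is stated above; the node name `node01` is a placeholder, and you would repeat the copy step for each execution node.

```shell
# Hypothetical sketch of passwordless SSH from the head node to an
# execution node named "node01". Hostnames are placeholders.

# 1. Generate a key pair with no passphrase (skip if one already exists):
ssh-keygen -t ed25519 -N "" -f ~/.ssh/id_ed25519

# 2. Copy the public key to each execution node:
ssh-copy-id -i ~/.ssh/id_ed25519.pub node01

# 3. Verify: this should print the node's hostname without a password prompt.
ssh node01 hostname
```

If the verification step still prompts for a password, check the permissions on `~/.ssh` (700) and `~/.ssh/authorized_keys` (600) on the execution node.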

  • You are a local administrator of the cluster and know how to share directories and map network drives. If you do not know how to perform these tasks, contact your systems administrator for assistance.
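For reference, sharing a directory and mapping it as a network drive on Windows can be done from an elevated command prompt as shown below. This is a generic Windows sketch with placeholder names (`HPCStaging`, `C:\HPCStaging`, `headnode`); your site's share names, paths, and permissions will differ, and a real deployment should grant access more narrowly than `Everyone`.

```
:: Hypothetical Windows example: share the staging directory, then map it
:: as a network drive on a client machine. All names are placeholders.

:: On the machine hosting the staging directory (run as administrator):
net share HPCStaging=C:\HPCStaging /GRANT:Everyone,FULL

:: On a client machine, map the share to drive letter S: and make it
:: persist across logons:
net use S: \\headnode\HPCStaging /persistent:yes
```

Mapping the share to the same drive letter on every client keeps paths consistent when jobs reference files under the staging directory.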