Configuring a Parallel HFSS Regions Simulation in HFSS 3D Layout

From the SIwave sweep setup, complete these steps to configure a parallel solve.

  1. Configure a distributed solution setup.
  2. Set up an SIwave regions simulation.
  3. From the Edit Frequency Sweep window, select HFSS (user-defined regions), and then select Solve regions in parallel.
    Note: If the Solve regions in parallel option is unavailable, ensure that Generate regions schematic is deselected. These options are mutually exclusive.

  4. Click Configure to open the Parallel HFSS Region Compute Resource Allocation window.

  5. Specify the number of regions to solve in parallel, and allocate a percentage of the total resources to each region. See the following example.

    The percentage values are converted to specific machine names, CPU counts, and memory footprints during simulation setup.

  6. Click OK to return to the Edit Frequency Sweep window.
  7. Click OK to close the setup.
Example

Suppose localhost has been configured to use 36 cores and 90% of available memory, and cdcebudevw15 has been configured to use 48 cores and 90% of available memory.

In total, there are 84 cores at our disposal, so each region configured to use 25% of total available resources gets 21 cores.

Memory is divided among the regions using the following formula:

HFSS region memory % = (% memory to use on machine) × (# cores to use for region) / (# cores available on machine)
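
For example, using the numbers above, a region that receives its 21 cores from localhost (36 cores available, 90% of memory configured) is allotted 90% × 21/36 = 52.5% of that machine's memory.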

Cores are allotted to regions starting with the first machine in the distributed solve setup (localhost, in this example). Once all cores on that machine have been exhausted, cores are pulled from the next machine in the list (cdcebudevw15, in this example). Once all cores on all machines have been exhausted, the solver loops back to the first machine and starts again.

To avoid resource contention, you must balance the compute resource percentage assigned to each region with the number of regions being solved in parallel. Working through this core/memory assignment logic for the example above yields the per-region core and memory allocation sketched below.
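
HFSS 3D Layout performs this allocation automatically during simulation setup; the following is only a minimal Python sketch of the round-robin core assignment and memory formula described above, using this example's machine list. The function name, data layout, and the assumption of four regions at 25% each are illustrative and are not part of the product.

```python
# Minimal sketch (not Ansys code) of the core/memory allocation logic described above.
# Assumptions: two machines as configured in the example, four regions at 25% each.

def allocate_regions(machines, region_core_fraction, num_regions):
    """Assign cores and a per-machine memory percentage to each region, round-robin over machines."""
    total_cores = sum(m["cores"] for m in machines)
    cores_per_region = int(total_cores * region_core_fraction)  # e.g. 84 x 0.25 = 21

    allocations = []
    machine_index = 0
    free = [m["cores"] for m in machines]  # cores still unassigned on each machine

    for _ in range(num_regions):
        remaining = cores_per_region
        pieces = []
        while remaining > 0:
            # Once all cores on all machines are exhausted, loop back to the first machine.
            if all(f == 0 for f in free):
                free = [m["cores"] for m in machines]
                machine_index = 0
            # Skip machines whose cores are already used up.
            if free[machine_index] == 0:
                machine_index = (machine_index + 1) % len(machines)
                continue
            m = machines[machine_index]
            take = min(remaining, free[machine_index])
            free[machine_index] -= take
            remaining -= take
            # HFSS region memory % = (% memory on machine) x (cores used) / (cores on machine)
            mem_pct = m["memory_pct"] * take / m["cores"]
            pieces.append((m["name"], take, round(mem_pct, 1)))
        allocations.append(pieces)
    return allocations


machines = [
    {"name": "localhost",    "cores": 36, "memory_pct": 90.0},
    {"name": "cdcebudevw15", "cores": 48, "memory_pct": 90.0},
]

# Four regions, each using 25% of the 84 total cores (21 cores per region).
for i, region in enumerate(allocate_regions(machines, 0.25, 4), start=1):
    print(f"Region {i}: {region}")
```

Running the sketch prints one line per region, listing each machine the region draws cores from, the number of cores taken from that machine, and the memory percentage computed from the formula above.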