Large Scale DSO Command Line Syntax
Large Scale DSO operates through a non-graphical batch application called desktopjob. You can run the desktopjob command line to perform parametric analysis with DSO. The command-line interface supported by this batch program is consistent with the command line used for current DSO jobs. Running desktopjob -help lists all available command-line options, as shown below:
Command Line Syntax:
desktopjob.exe <options> <project-path-on-shared-drive>
Note that the project path can point to an archive file.
Options:
- -help
Prints the help text.
- -cmd
Specifies the command to run.
Available choices: dso
- -ng
Runs the analysis in non-graphical mode.
- -monitor
Outputs progress and messages to standard output/error.
- -waitforlicense
Queues the job until licenses are available.
- -preserve
Preserves the local storage space of the distributed job for investigation into the job's run. If the local storage directory (for example, the temp directory) is provisioned by the scheduler, ensure that the scheduler is also configured to preserve the job's local storage. This storage must be deleted manually.
- -batchoptions
Overrides the Tools > Options entries through either a batchoptions file or a batchoptions string.
Batch options specific to Large Scale DSO include:
- LargeScaleDSO/FailedVarRetryCount
Retries failed variations. To have each task re-simulate its failed variations after its assigned variations are finished, specify a positive integer retry count. The value must be 0 or a positive integer; the default is 0, which disables re-simulation of failed variations.
- LargeScaleDSO/VarRedistribution
Controls redistribution. Value 0 (the default) disables redistribution; value 1 enables it.
- LargeScaleDSO/RedistributionLimit
The minimum estimated remaining time (in minutes) for variations to be redistributed to another task. The value must be a positive integer; the default is 3 minutes.
Example:
-batchoptions <config-file-on-shared-drive>
-batchoptions "'name1'='val1' 'n2'='v2' "
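As a convenience when scripting job submission, the quoted name/value string shown in the example above can be assembled programmatically. The sketch below is illustrative only and not part of desktopjob; the helper name is hypothetical, while the option names come from this document.

```python
# Sketch: assemble a -batchoptions argument string for the Large Scale DSO
# options described above. The outer double quotes mirror the documented
# shell syntax; build_batchoptions itself is a hypothetical helper.

def build_batchoptions(options):
    """Format a dict of batch options as one quoted argument string."""
    pairs = " ".join(f"'{name}'='{value}'" for name, value in options.items())
    return f'"{pairs}"'

args = build_batchoptions({
    "LargeScaleDSO/FailedVarRetryCount": 2,   # retry failed variations twice
    "LargeScaleDSO/VarRedistribution": 1,     # enable redistribution
    "LargeScaleDSO/RedistributionLimit": 5,   # redistribute if >= 5 min remain
})
print(args)
```

The resulting string can then be passed as the value of -batchoptions on the desktopjob command line.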
- -jobid
Specify a custom job ID for the job. The job's output is organized into a folder named after the job ID. This parameter is ignored when running under a scheduler.
- -machinelist
- In the context of Ansoft RSM:
Specify machines for distributed analysis. The machine list is specified either inline (as comma-separated machine names) or through a file. Multiple cores are specified by repeating the name of a machine or by embedding the number of cores in the machine name, using a colon separator.
Example 1:
-machinelist "list=m1,m1,m1,m2,m2,m3"
Example 2:
-machinelist "list=m1:3,m2:2,m3"
Example 3:
-machinelist "list=m1:1:3,m2:2:2"
Example 4:
-machinelist "file=machines.txt"
- In the context of a scheduler such as LSF:
Specify the portion of the total machines to use for distributed analysis. The remaining machines are used for overhead or shared-memory multiprocessing.
Manual Example:
-machinelist "Num=10"
Auto Example:
-machinelist "NumCores=40"
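The inline colon syntax shown in the Ansoft RSM examples above can be illustrated with a short parser. This is a sketch for illustration only, not part of desktopjob, and it handles only the "name" and "name:cores" entry forms of Examples 1 and 2.

```python
# Sketch: expand the inline "list=..." machine specification from
# Examples 1 and 2 into per-machine core counts. Illustrative only.

def expand_machine_list(spec):
    """Map a 'list=...' machine specification to {machine: cores}."""
    entries = spec.removeprefix("list=").split(",")
    cores = {}
    for entry in entries:
        name, _, count = entry.partition(":")  # "m1:3" -> ("m1", "3")
        cores[name] = cores.get(name, 0) + (int(count) if count else 1)
    return cores

# Examples 1 and 2 describe the same allocation:
assert expand_machine_list("list=m1,m1,m1,m2,m2,m3") == {"m1": 3, "m2": 2, "m3": 1}
assert expand_machine_list("list=m1:3,m2:2,m3") == {"m1": 3, "m2": 2, "m3": 1}
```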
- In the context of Ansoft RSM:
- -auto
Run the leaf jobs in auto mode. See Submitting Large Scale DSO Job Examples.
- -numdistributedvariations
Specify the number of parallel leaf jobs to run in auto mode. This option is required when running under a scheduler. When not running under a scheduler and this option is not specified, the number of tasks for each machine must be specified in the machine list; the total number of tasks is then the number of parallel variations (leaf jobs) to run.
Example:
-machinelist NumCores=40 -auto -NumDistributedVariations 10
- -usefolderasinput
Choose this option if the job's input represents an entire folder rather than just the project file.
- -maxfolderInMB
Specify the maximum input folder size that is allowed for a valid job (in MB). By default, the maximum size allowed for input is 10 MB. Specify a value of 0 to remove this size restriction and enable inputs of any size. This option applies when -usefolderasinput is used.
- -workdir
Specifies the shared-drive folder for status and result files generated by the analysis. By default, the results folder of the input project is used as the work directory.
- -mergecsv [acrossDPs | singleDP | both]
acrossDPs: Merge report csv files for all design points (variations). One file is created per trace, across all variations.
singleDP: Merge csv files within a single design point (variation). One file is created per variation, per set of traces that can be merged.
both: Merge all traces that have the same primary sweep for all design points (variations) into one csv file.
Interpolation note: If the primary sweep values are not uniformly spaced when mergecsv is enabled, the trace values are re-sampled uniformly. The number of sample points is specified using the '-batchoptions' syntax, as shown below:
-batchoptions "'LargeScaleDSO/NumTracePoints'=500"
-batchoptions "'LargeScaleDSO/NumTracePoints'='PrimarySweepName:200'"
-batchoptions "'LargeScaleDSO/NumTracePoints'='ReportName1:Trace1:100;ReportName1:TraceName2:200'"
- -abort
Abort a running job identified through the job's working directory. Example: -abort <projectresultsfolder-path>/<jobid>. For a complete discussion of methods for aborting jobs or specific tasks, see the discussion of Aborting a Large Scale DSO Simulation under Large Scale DSO for Parametric Analysis.
- -repackageresults
Choose this option to add simulation results to the input archive file. Note: this option applies only if an archive file is provided as input.
- -batchsolve
Solves the specified parametric setup.
Syntax for the setup:
<design-name>:Optimetrics:<parametric-setup>
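Putting the pieces together, a complete non-graphical parametric DSO invocation can be assembled as an argument list before launching it. This is a sketch only: the design name, setup name, machine names, and project path below are all hypothetical placeholders, not values from this document.

```python
# Sketch: assemble a complete desktopjob command line for a parametric DSO
# run. All concrete names (design, setup, machines, path) are hypothetical.

cmd = [
    "desktopjob.exe",
    "-cmd", "dso",                 # run a DSO job
    "-ng",                         # non-graphical mode
    "-monitor",                    # progress to standard output/error
    "-machinelist", "list=m1:3,m2:2,m3",
    "-batchsolve", "MyDesign:Optimetrics:ParametricSetup1",
    r"\\share\projects\model.aedt",  # project path on a shared drive
]
print(" ".join(cmd))
# On a machine with desktopjob installed, this list could be passed to
# subprocess.run(cmd, check=True).
```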