- 1. Introduction
- 2. Design of Experiments
- 3. Metamodeling techniques
- 3.1. Regression models for scalar outputs
- 3.1.1. Polynomial regression
- 3.1.2. Box-Cox transformation
- 3.1.3. Moving Least Squares approximation
- 3.1.4. Kriging
- 3.1.5. Radial basis functions
- 3.1.6. Support Vector Regression
- 3.1.7. Sparse Grid Approximation
- 3.1.8. Genetic Aggregation Response Surface
- 3.1.9. Neural networks
- 3.1.10. Deep Infinite Mixture Gaussian Process (DIM-GP)
- 3.2. Model quality measures
- 3.2.1. Residual analysis
- 3.2.1.1. Residual sum of squares
- 3.2.1.2. Mean squared error (MSE)
- 3.2.1.3. Root mean squared error (RMSE)
- 3.2.1.4. Relative root mean squared error
- 3.2.1.5. Maximum residual
- 3.2.1.6. Maximum Relative Residual
- 3.2.1.7. Relative Maximum Absolute Error
- 3.2.1.8. Relative Average Absolute Error
- 3.2.1.9. PRESS residuals for polynomial regression
- 3.2.1.10. R2 for prediction of polynomial regression
- 3.2.2. Generalized cross-validation and Akaike's final prediction error
- 3.2.3. k-fold cross-validation
- 3.2.4. Coefficient of Determination (R2)
- 3.2.5. Coefficient of Prognosis
- 3.3. Model Sensitivity measures
- 3.4. MOP principle and competition
- 3.5. Adaptive metamodeling
- 3.6. Metamodels Glossary
- 4. Optimization methods
- 4.1. Optimization setup
- 4.2. Single-objective optimization
- 4.2.1. Accompanying example: optimization of a damped oscillator
- 4.2.2. Gradient-based methods
- 4.2.2.1. Non-Linear Programming by Quadratic Lagrangian (NLPQLP)
- 4.2.2.2. Mixed-Integer Sequential Quadratic Programming (MISQP)
- 4.2.2.3. Leapfrog optimizer for constrained minimization (LFOPC)
- 4.2.2.4. DAKOTA Coliny Solis-Wets
- 4.2.2.5. DAKOTA CONMIN
- 4.2.2.6. DAKOTA OPT++ Finite Differences Newton (FDN)
- 4.2.2.7. DAKOTA OPT++ Quasi Newton (QN)
- 4.2.2.8. DAKOTA OPT++ Polak-Ribiere Conjugate Gradient (PR)
- 4.2.3. Pattern search methods
- 4.2.4. Response surface based methods
- 4.2.5. Nature-inspired methods
- 4.2.5.1. Evolutionary algorithms
- 4.2.5.2. Darwin algorithm
- 4.2.5.3. EVOLVE
- 4.2.5.4. DAKOTA Coliny Evolutionary Algorithm (EA)
- 4.2.5.5. Covariance Matrix Adaptation
- 4.2.5.6. Particle Swarm Optimization
- 4.2.5.7. Stochastic Design Improvement
- 4.2.5.8. Adaptive Simulated Annealing
- 4.2.5.9. Differential Evolution
- 4.2.6. Hybrid methods
- 4.3. Multi-objective optimization
- 4.3.1. Pareto optimization
- 4.3.2. Weighted sum
- 4.3.3. ε-Constraint
- 4.3.4. Response Surface based methods
- 4.3.5. Nature-inspired methods
- 4.3.6. Hybrid methods
- 4.3.7. Performance metrics
- 4.3.7.1. Number of nondominated points
- 4.3.7.2. Spread
- 4.3.7.3. Standard deviation of crowding distance
- 4.3.7.4. Min/Max of objectives
- 4.3.7.5. Hypervolume
- 4.3.7.6. Number of common points
- 4.3.7.7. Number of new nondominated solutions
- 4.3.7.8. Number of old dominated solutions n(Q)
- 4.3.7.9. Consolidation ratio
- 4.3.7.10. Improvement ratio
- 4.4. Optimization Glossary