Date of Award

2019

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Computer Science: Modeling and Simulation

Committee Chair

Mikel D. Petty

Committee Member

Huaming Zhang

Committee Member

Sara Graves

Committee Member

Buford Shipley

Committee Member

Bryan Mesmer

Subject(s)

Simulation methods, Regression analysis, Kalman filtering, Machine learning

Abstract

Optimization is the search for the input settings or configuration that will produce a desired result from the system of interest. When direct system experimentation is prohibitive, experts construct a system model that represents those aspects relevant to the problem and abstracts away those that are not. This model serves as an objective function: a mapping from input settings or configuration to the value of the system. Most objective functions are deterministic; however, some objective functions represent variability through stochastic behavior. An optimization algorithm determines the optimal system inputs through repeated evaluation of the objective function with different input values. Response surface methodology is a distinctive application of machine learning techniques (specifically, regression) that enables efficient optimization of computationally expensive objective functions. Response surface methodology replaces the objective function with a computationally inexpensive and deterministic mathematical approximation, such as a spline, which can then be easily searched to find the optimum. The deterministic approximation that emulates the objective function is referred to as a surrogate model. The five surrogate modeling techniques most commonly found in the literature (polynomial regression, spline regression, artificial neural networks, radial basis functions, and kriging) are analyzed in a broad empirical comparison. Data on the representation accuracy of each surrogate modeling technique are presented across 18 test functions at 6 levels of noise. Then a novel adaptation of surrogate model training procedures utilizing Kalman filtering is developed, improving the numerical stability of model creation, particularly on data sets with significant noise. The effect of filter-based training procedures is evaluated across the same suite of test functions for the two highest-accuracy surrogate types.
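The response surface methodology described above can be illustrated with a minimal sketch; the 1-D toy objective, sample sizes, and polynomial surrogate here are illustrative assumptions, not the dissertation's actual test functions or settings:

```python
# Sketch of response surface methodology: sample a noisy, "expensive"
# objective at a few design points, fit a cheap deterministic surrogate
# (here, quadratic polynomial regression), then search the surrogate for
# the optimum instead of the objective itself.
import numpy as np

rng = np.random.default_rng(0)

def noisy_objective(x):
    # Stochastic toy objective: true minimum at x = 2, plus Gaussian noise.
    return (x - 2.0) ** 2 + rng.normal(scale=0.1, size=np.shape(x))

# Evaluate the expensive objective at a small set of design points.
x_train = np.linspace(-1.0, 5.0, 25)
y_train = noisy_objective(x_train)

# Surrogate model: deterministic least-squares quadratic fit.
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=2))

# Optimizing the inexpensive surrogate (dense grid search, for simplicity)
# recovers an input near the true optimum at x = 2.
x_grid = np.linspace(-1.0, 5.0, 1001)
x_opt = x_grid[np.argmin(surrogate(x_grid))]
```

Once fitted, the surrogate can be evaluated millions of times at negligible cost, which is the central economy the abstract describes.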
In addition, the effects of using noisy gradient information for model training, as well as of different training algorithms, are examined for their potential to accelerate convergence. Finally, an optimization algorithm using the modified surrogate model is compared to other conventional optimization algorithms. This comparison assesses which optimization algorithm is best suited to a given problem based on metrics that can be computed from function data, enabling practitioners to select the optimization algorithm best suited to their problem.
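The filter-based training idea can be sketched for a linear-in-parameters surrogate; the basis, noise levels, and toy data here are assumptions for illustration, not the dissertation's exact procedure:

```python
# Sketch of Kalman-filter-style surrogate training: the surrogate's weights
# are treated as the state of a linear Kalman filter and updated one noisy
# observation at a time, which tends to be numerically stable on noisy data.
import numpy as np

def kalman_fit(phi, y, obs_noise=0.1, prior_var=10.0):
    """Recursive Kalman estimate of linear-in-parameter surrogate weights.

    phi : (n, k) design matrix of basis-function values
    y   : (n,) noisy observations of the objective
    """
    k = phi.shape[1]
    w = np.zeros(k)               # state estimate (surrogate weights)
    P = prior_var * np.eye(k)     # state covariance (weight uncertainty)
    R = obs_noise ** 2            # observation-noise variance
    for h, target in zip(phi, y):
        S = h @ P @ h + R                 # innovation variance
        K = P @ h / S                     # Kalman gain
        w = w + K * (target - h @ w)      # measurement update of the weights
        P = P - np.outer(K, h @ P)        # covariance update
    return w

# Toy check: recover the weights of y = 1 + 2x from noisy samples.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
phi = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=x.size)
w = kalman_fit(phi, y)
```

Because each observation is absorbed through a gain weighted by the current covariance, noisy measurements are discounted rather than fitted exactly, which is the stability property the abstract attributes to filter-based training.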

Garrison Code Repository.zip (54 kB)
Supplementary Dissertation Code
