Derivative and GA-based methods in metamodeling of back-propagation neural networks for constrained approximate optimization

Jongsoo Lee, Heeseok Jeong, Seongkyu Kang

Research output: Contribution to journal › Article › peer-review

22 Citations (Scopus)

Abstract

Artificial neural networks (ANN) have been extensively used as global approximation tools in the context of approximate optimization. ANN training traditionally minimizes the absolute difference between target and approximate outputs; as a result, when an ANN is used as a metamodel for inequality constraint functions, the approximate optimal solution can in fact be infeasible. This paper develops an efficient back-propagation neural network (BPN)-based metamodel that ensures the constraint feasibility of the approximate optimal solution. The BPN architecture is optimized via two approaches, a derivative-based method and a genetic algorithm (GA), to determine the interconnection weights between layers in the network. The proposed approach is verified on a standard ten-bar truss problem. Finally, a GA-based approximate optimization of a suspension with an optical flying head is conducted to enhance shock resistance in addition to dynamic characteristics.
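The core idea of the GA-based approach can be sketched as follows: a GA searches over the interconnection weights of a small feed-forward network, using a fitness function that penalizes underestimation of the constraint value more heavily than overestimation, so that the metamodel errs on the feasible side. This is an illustrative sketch only, with a toy constraint function, network size, and penalty weight chosen for demonstration; it is not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inequality constraint g(x) <= 0 to be metamodeled (hypothetical example).
def g(x):
    return np.sin(3 * x) + 0.5 * x - 0.2

X = np.linspace(-1, 1, 40).reshape(-1, 1)
y = g(X).ravel()

H = 6                        # hidden neurons in a 1-H-1 network
n_w = H + H + H + 1          # input weights + hidden biases + output weights + output bias

def predict(w, X):
    """Evaluate the 1-H-1 tanh network encoded by the flat weight vector w."""
    W1 = w[:H].reshape(1, H)
    b1 = w[H:2 * H]
    W2 = w[2 * H:3 * H].reshape(H, 1)
    b2 = w[3 * H]
    h = np.tanh(X @ W1 + b1)
    return (h @ W2).ravel() + b2

def fitness(w, alpha=5.0):
    e = predict(w, X) - y
    # One-sided penalty: underestimating g makes infeasible designs look
    # feasible, so negative errors are penalized extra (illustrative choice).
    return np.mean(np.abs(e)) + alpha * np.mean(np.maximum(-e, 0.0))

# Simple elitist GA over the flat weight vectors.
pop = rng.normal(0.0, 0.5, size=(60, n_w))
init_best = min(fitness(w) for w in pop)
for gen in range(200):
    order = np.argsort([fitness(w) for w in pop])
    elite = pop[order[:20]]
    children = []
    for _ in range(len(pop) - len(elite)):
        a, b = elite[rng.integers(20, size=2)]
        c = np.where(rng.random(n_w) < 0.5, a, b)   # uniform crossover
        c = c + rng.normal(0.0, 0.05, n_w)          # Gaussian mutation
        children.append(c)
    pop = np.vstack([elite, children])

best = min(pop, key=fitness)
print("initial best penalized fitness:", init_best)
print("final best penalized fitness:  ", fitness(best))
```

Because the elite individuals are carried over unchanged each generation, the best penalized fitness is non-increasing; a derivative-based alternative would instead backpropagate through the same one-sided loss.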

Original language: English
Pages (from-to): 29-40
Number of pages: 12
Journal: Structural and Multidisciplinary Optimization
Volume: 35
Issue number: 1
DOIs
Publication status: Published - 2008 Jan

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design
  • Control and Optimization
