Derivative free algorithms for large scale non-smooth optimization and their applications
Date: 2013
Author: Tor, Ali Hakan
In this thesis, various numerical methods are developed to solve nonsmooth and, in particular, nonconvex optimization problems. More specifically, three numerical algorithms are developed for solving nonsmooth convex optimization problems, and one algorithm is proposed to solve nonsmooth nonconvex optimization problems. In general, the main differences between algorithms of smooth optimization lie in the calculation of search directions, the line searches for finding step sizes, and the stopping criteria. In nonsmooth optimization, however, there is one additional difference between algorithms: they may use different generalizations of the gradient. In order to develop algorithms for solving nonsmooth convex optimization problems, we use the concept of the codifferential. Although a codifferential calculus exists, the calculation of the whole codifferential is not an easy task. Therefore, in the first numerical method, only a few elements of the codifferential are used to calculate search directions. In order to reduce the number of codifferential evaluations, in the second method elements of the codifferential calculated in previous iterations are used to calculate search directions. In both the first and second methods, the problem of calculating search directions is reduced to the solution of a certain quadratic programming problem. The size of this problem can increase significantly as the number of variables increases. In order to avoid this problem, in the third method, called the aggregate codifferential method, the number of elements of the codifferential used to find search directions is fixed. Such an approach significantly reduces the complexity of codifferential methods and makes them applicable to large scale problems of nonsmooth optimization. These methods are applied to some well-known nonsmooth optimization test problems, such as min-max and general-type nonsmooth optimization problems.
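The abstract describes minimizing nonsmooth min-max functions, where the search direction comes from a generalized gradient rather than an ordinary gradient. The sketch below illustrates the simplest such scheme, a subgradient method with diminishing step sizes on a small piecewise-linear test function; it is not the thesis's codifferential algorithms, and the problem data and function names are made up for illustration.

```python
import numpy as np

def f_and_subgrad(x, A, b):
    """For f(x) = max_i (a_i . x + b_i), return f(x) and one subgradient.
    The gradient a_i of any active affine piece is a valid subgradient."""
    vals = A @ x + b
    i = np.argmax(vals)            # index of an active piece
    return vals[i], A[i]

def subgradient_method(A, b, x0, iters=500):
    """Subgradient descent with diminishing steps 1/k; since f can
    increase along a subgradient step, track the best iterate seen."""
    x, best_x, best_f = x0, x0, np.inf
    for k in range(1, iters + 1):
        fx, g = f_and_subgrad(x, A, b)
        if fx < best_f:
            best_f, best_x = fx, x
        x = x - (1.0 / k) * g
    return best_x, best_f

# Hypothetical test problem: f(x) = max(x1 + x2, -x1, -x2),
# which is nonsmooth at its minimizer, the origin (f* = 0).
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.zeros(3)
x_star, f_star = subgradient_method(A, b, np.array([1.0, 1.0]))
```

The `best_x` bookkeeping is the design point worth noting: unlike gradient descent on smooth functions, a subgradient step gives no descent guarantee per iteration, only convergence of the best value found.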
The obtained numerical results are visualized using performance profiles. In addition, these methods are validated by comparing them with the subgradient and bundle methods using the results of numerical experiments. The convergence of the methods is analyzed. Finally, the first method is extended to minimize nonsmooth convex functions subject to linear inequalities using slack variables. The notion of a quasisecant is used to design an algorithm for solving nonsmooth nonconvex unconstrained optimization problems. In this method, to find a descent direction, the subgradient algorithm is applied to the solution of a set of linear inequalities. The convergence of the proposed method is analyzed, and numerical experiments are carried out using general-type nonsmooth optimization test problems. To validate this method, the results are compared with those obtained by the subgradient method.
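Performance profiles, the visualization mentioned above, compare solvers across a whole test set at once: for each problem, each solver's cost is divided by the best cost on that problem, and the profile plots the fraction of problems solved within a factor tau of the best. A minimal sketch, with made-up timing data:

```python
import numpy as np

def performance_profile(T, taus):
    """T: (n_problems, n_solvers) cost matrix, e.g. CPU times.
    Returns rho with shape (len(taus), n_solvers): the fraction of
    problems each solver solves within a factor tau of the best solver."""
    # Per-problem performance ratio: 1.0 means best on that problem.
    ratios = T / T.min(axis=1, keepdims=True)
    return np.array([(ratios <= tau).mean(axis=0) for tau in taus])

# Hypothetical results: 4 problems, 2 solvers.
T = np.array([[1.0, 2.0],
              [3.0, 1.5],
              [2.0, 2.0],
              [1.0, 4.0]])
rho = performance_profile(T, [1.0, 2.0, 4.0])
```

Here `rho[0]` gives each solver's share of problems on which it was (jointly) fastest, and the profile rises toward 1.0 as tau grows, which is the curve typically plotted against log(tau).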
Subject Keywords
Nonsmooth optimization, Mathematical optimization
URI
http://etd.lib.metu.edu.tr/upload/12615700/index.pdf
https://hdl.handle.net/11511/22602
Collections
Graduate School of Natural and Applied Sciences, Thesis
Suggestions
Discrete gradient method: Derivative-free method for nonsmooth optimization
Bagirov, A. M.; Karasözen, Bülent; Sezer, M. (2008-05-01)
A new derivative-free method is developed for solving unconstrained nonsmooth optimization problems. This method is based on the notion of a discrete gradient. It is demonstrated that the discrete gradients can be used to approximate subgradients of a broad class of nonsmooth functions. It is also shown that the discrete gradients can be applied to find descent directions of nonsmooth functions. The preliminary results of numerical experiments with unconstrained nonsmooth optimization problems as well as th...
Derivative free optimization methods for optimizing stirrer configurations
Uğur, Ömür; SCHAEFER, M.; YAPICI, KEREM (2008-12-16)
In this paper a numerical approach for the optimization of stirrer configurations is presented. The methodology is based on a flow solver, and a mathematical optimization tool, which are integrated into an automated procedure. The flow solver is based on the discretization of the incompressible Navier-Stokes equations by means of a fully conservative finite-volume method for block-structured, boundary-fitted grids, for allowing a flexible discretization of complex stirrer geometries. Two derivative free opt...
Derivative free optimization of stirrer configurations
Schaefer, M.; Karasözen, Bülent; Uğur, Ömür; Yapici, K. (2005-07-01)
In the present work a numerical approach for the optimization of stirrer configurations is presented. The methodology is based on a parametrized grid generator, a flow solver, and a mathematical optimization tool, which are integrated into an automated procedure. The grid generator allows the parametrized generation of block-structured grids for the stirrer geometries. The flow solver is based on the discretization of the incompressible Navier-Stokes equations by means of a fully conservative finite-volume ...
DERIVATIVE FREE MULTILEVEL OPTIMIZATION
Karasözen, Bülent (2015-01-01)
Optimization problems with different levels arise by discretization of ordinary and partial differential equations. We present a trust-region based derivative-free multilevel optimization algorithm. The performance of the algorithm is shown on a shape optimization problem and global convergence to the first order critical point is proved.
Exact decomposition algorithms for nonlinear location and hub location problems
Gündoğdu, Emine; Gürel, Sinan; Department of Industrial Engineering (2018)
Developing exact solution algorithms to solve difficult optimization problems is one of the most important subjects in the operations research literature. In this dissertation, we develop Benders decomposition based exact solution algorithms (BDTAs) for handling nonlinearity in three selected nonlinear integer location/hub location problems. The first and second problem include nonlinear capacity constraints, while in the last problem, both objective function and the capacity constraints are nonlinear. In o...
Citation Formats
IEEE
A. H. Tor, “Derivative free algorithms for large scale non-smooth optimization and their applications,” Ph.D. - Doctoral Program, Middle East Technical University, 2013.