Discrete gradient method: Derivative-free method for nonsmooth optimization
Date: 2008-05-01
Authors: Bagirov, A. M.; Karasözen, Bülent; Sezer, M.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Item Usage Stats: 223 views, 250 downloads
A new derivative-free method is developed for solving unconstrained nonsmooth optimization problems. This method is based on the notion of a discrete gradient. It is demonstrated that the discrete gradients can be used to approximate subgradients of a broad class of nonsmooth functions. It is also shown that the discrete gradients can be applied to find descent directions of nonsmooth functions. The preliminary results of numerical experiments with unconstrained nonsmooth optimization problems as well as the comparison of the proposed method with the nonsmooth optimization solver DNLP from CONOPT-GAMS and the derivative-free optimization solver CONDOR are presented.
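For intuition, the sketch below shows how a derivative-free descent loop of this general kind can be organized: a finite-difference surrogate stands in for a subgradient, and its negation is used as a search direction with a simple backtracking step. This is only an illustration under assumed choices (a plain coordinate-wise forward difference rather than the paper's discrete gradient construction, an ad hoc step parameter `lam`, and a toy test function); it is not the authors' algorithm.

```python
import numpy as np

def finite_difference_surrogate(f, x, lam=1e-6):
    """Coordinate-wise forward-difference surrogate for a subgradient of f at x.

    NOTE: this is only an illustrative stand-in; it is NOT the discrete
    gradient construction defined in the paper.
    """
    n = x.size
    v = np.empty(n)
    fx = f(x)
    for i in range(n):
        e = np.zeros(n)
        e[i] = lam
        v[i] = (f(x + e) - fx) / lam
    return v

def descent(f, x0, lam=1e-6, max_iter=200, tol=1e-6):
    """Derivative-free descent loop with a simple backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        v = finite_difference_surrogate(f, x, lam)
        if np.linalg.norm(v) < tol:              # surrogate stationarity test
            break
        d = -v / np.linalg.norm(v)               # normalized search direction
        t, fx = 1.0, f(x)
        while t > 1e-12 and f(x + t * d) >= fx:  # halve the step until descent
            t *= 0.5
        if t <= 1e-12:                           # no descent step found: stop
            break
        x = x + t * d
    return x

if __name__ == "__main__":
    # Toy nonsmooth test function: f(x) = |x1| + 2|x2|, minimized at the origin.
    f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
    print(descent(f, [3.0, -2.0]))
```

In the method described in the abstract, the search direction is derived from discrete gradients (which approximate subgradients) rather than from a single forward-difference vector; the sketch omits that machinery.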
Subject Keywords: Nonsmooth optimization, Derivative-free optimization, Subdifferential, Discrete gradients
URI: https://hdl.handle.net/11511/32669
Journal: JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS
DOI: https://doi.org/10.1007/s10957-007-9335-5
Collections: Graduate School of Applied Mathematics, Article
Suggestions (OpenMETU Core)
Derivative free algorithms for large scale non-smooth optimization and their applications
Tor, Ali Hakan; Karasözen, Bülent; Department of Mathematics (2013)
In this thesis, various numerical methods are developed to solve nonsmooth and in particular, nonconvex optimization problems. More specifically, three numerical algorithms are developed for solving nonsmooth convex optimization problems and one algorithm is proposed to solve nonsmooth nonconvex optimization problems. In general, main differences between algorithms of smooth optimization are in the calculation of search directions, line searches for finding step-sizes and stopping criteria. However, in nonsmoo...
Aggregate codifferential method for nonsmooth DC optimization
Tor, Ali Hakan; Bagirov, Adil; Karasözen, Bülent (2014-03-15)
A new algorithm is developed based on the concept of codifferential for minimizing the difference of convex nonsmooth functions. Since the computation of the whole codifferential is not always possible, we use a fixed number of elements from the codifferential to compute the search directions. The convergence of the proposed algorithm is proved. The efficiency of the algorithm is demonstrated by comparing it with the subgradient, the truncated codifferential and the proximal bundle methods using nonsmooth o...
Derivative free optimization methods for optimizing stirrer configurations
Uğur, Ömür; SCHAEFER, M.; YAPICI, KEREM (2008-12-16)
In this paper a numerical approach for the optimization of stirrer configurations is presented. The methodology is based on a flow solver, and a mathematical optimization tool, which are integrated into an automated procedure. The flow solver is based on the discretization of the incompressible Navier-Stokes equations by means of a fully conservative finite-volume method for block-structured, boundary-fitted grids, for allowing a flexible discretization of complex stirrer geometries. Two derivative free opt...
DERIVATIVE FREE MULTILEVEL OPTIMIZATION
Karasözen, Bülent (2015-01-01)
Optimization problems with different levels arise by discretization of ordinary and partial differential equations. We present a trust-region based derivative-free multilevel optimization algorithm. The performance of the algorithm is shown on a shape optimization problem and global convergence to the first order critical point is proved.
Continuous optimization approaches for clustering via minimum sum of squares
Akteke-Ozturk, Basak; Weber, Gerhard Wilhelm; Kropat, Erik (2008-05-23)
In this paper, we survey the usage of semidefinite programming (SDP), and nonsmooth optimization approaches for solving the minimum sum of squares problem which is of fundamental importance in clustering. We point out that the main clustering idea of support vector clustering (SVC) method could be interpreted as a minimum sum of squares problem and explain the derivation of semidefinite programming and a nonsmooth optimization formulation for the minimum sum of squares problem. We compare the numerical resu...
Citation Formats
IEEE: A. M. Bagirov, B. Karasözen, and M. Sezer, “Discrete gradient method: Derivative-free method for nonsmooth optimization,” JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, pp. 317–334, 2008, Accessed: 00, 2020. [Online]. Available: https://hdl.handle.net/11511/32669.