Dictionary learning for efficient classification with 1-sparse representations
Date: 2018
Author: Engin, Ege
Item Usage Stats: 272 views, 132 downloads
Abstract
Sparse representations aim to express a given signal as a linear combination of a small number of signals that capture its characteristics well. Dictionary models that allow sparse representations have proven quite useful for the treatment and analysis of data in recent years. In particular, learning dictionaries adapted to the characteristics of each data class in a supervised learning problem, and representing the data with the learned dictionaries, significantly improves the accuracy of classifiers. However, large dictionary sizes and the complexity of computing sparse representations may limit the applicability of these methods, especially on platforms with limited storage and computational resources. In this thesis, we study the problem of supervised dictionary learning for fast and efficient classification of test samples. To achieve low computational complexity and efficient use of memory, our method learns analytically represented supervised dictionaries that allow accurate classification of test samples based on 1-sparse representations. We adopt a representation of dictionary atoms in a two-dimensional analytical basis, where the atoms are learned with respect to an objective involving their distance to samples from the same class and from different classes, as well as an incoherence term encouraging variability between dictionary atoms. The performance of the proposed method is evaluated in experiments on several image datasets. Comparison with reference supervised and unsupervised dictionary learning methods suggests that the proposed method provides satisfactory classification performance under 1-sparse signal representations.
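The classification principle the abstract describes can be illustrated with a minimal sketch, which is not the thesis implementation: each class has its own dictionary of unit-norm atoms, and a test sample receives the label of the class whose single best-matching atom yields the smallest 1-sparse reconstruction error. The random dictionaries here merely stand in for the learned, analytically represented ones; all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_class_dictionary(n_atoms, dim):
    """Random unit-norm atoms standing in for a learned class dictionary."""
    D = rng.standard_normal((dim, n_atoms))
    return D / np.linalg.norm(D, axis=0)

def classify_1sparse(x, dictionaries):
    """Assign x the class whose best single atom gives the smallest
    1-sparse reconstruction error ||x - <x, d> d||."""
    best_class, best_err = -1, np.inf
    for c, D in enumerate(dictionaries):
        corr = D.T @ x               # correlation of x with every atom
        k = np.argmax(np.abs(corr))  # best single atom (1-sparse support)
        err = np.linalg.norm(x - corr[k] * D[:, k])
        if err < best_err:
            best_class, best_err = c, err
    return best_class

# Toy check: a sample lying exactly along an atom of class 1
# is reconstructed with zero error by that class's dictionary.
dim = 16
dicts = [make_class_dictionary(8, dim) for _ in range(3)]
x = dicts[1][:, 3].copy()
print(classify_1sparse(x, dicts))  # -> 1
```

Because each representation uses only one atom, classification reduces to a single matrix-vector product per class followed by a maximum search, which is what makes the approach attractive on storage- and compute-limited platforms.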
Subject Keywords: Supervised dictionary learning, Machine learning, Sparse coding
URI
http://etd.lib.metu.edu.tr/upload/12622110/index.pdf
https://hdl.handle.net/11511/27309
Collections
Graduate School of Natural and Applied Sciences, Thesis
Suggestions
Dimension reduced robust beamforming for towed arrays
Topçu, Emre; Candan, Çağatay; Department of Electrical and Electronics Engineering (2015)
Adaptive beamforming methods are used to obtain higher signal to interference plus noise ratio at the array output. However, these methods are very sensitive to steering vector and covariance matrix estimation errors. To overcome this issue, robust methods are usually employed. On the other hand, implementation of these robust methods can be computationally expensive for arrays with large number of sensors. Reduced dimension techniques aim to lower the computational load of adaptive beamforming algorithms w...
Triple stub circuit topology as simultaneous insertion phase, amplitude and impedance control circuit for phased array applications
Unlu, M.; Demir, Şimşek; Akın, Tayfun (Institution of Engineering and Technology (IET), 2012-10-23)
This study shows that the well-known triple stub circuit topology can also be used for controlling the insertion phase and amplitude of a given signal simultaneously, as well as preserving its impedance transformation ability. The triple stub circuit topology, which is nothing but an extension of the conventional double stub loaded-line phase shifter, results in one more degree of freedom to its solution when it is solved for its insertion phase. This additional degree of freedom not only brings the impedan...
Noise Estimation for Hyperspectral Imagery using Spectral Unmixing and Synthesis
DEMİRKESEN, CAN; Leloğlu, Uğur Murat (2014-09-25)
Most hyperspectral image (HSI) processing algorithms assume a signal to noise ratio model in their formulation which makes them dependent on accurate noise estimation. Many techniques have been proposed to estimate the noise. A very comprehensive comparative study on the subject is done by Gao et al. [1]. In a nut-shell, most techniques are based on the idea of calculating standard deviation from assumed-to-be homogenous regions in the image. Some of these algorithms work on a regular grid parameterized wit...
Estimation of partially observed multiple graph signals by learning spectrally concentrated graph kernels
Turhan, Gülce; Vural, Elif; Department of Electrical and Electronics Engineering (2021-3-31)
Graph models provide flexible tools for the representation and analysis of signals defined over domains such as social or sensor networks. However, in real applications data observations are often not available over the whole graph, due to practical problems such as broken sensors, connection loss, or storage problems. In this thesis, we study the problem of estimating partially observed graph signals on multiple graphs. We consider possibly multiple graph domains over which a set of signals is available wi...
Synchronizing linear systems via partial-state coupling
Tuna, Sezai Emre (2008-08-01)
A basic result in the synchronization of linear systems via output coupling is presented. For identical discrete-time linear systems that are detectable from their outputs and neutrally stable, it is shown that a linear output feedback law exists under which the coupled systems globally asymptotically synchronize for all fixed connected (asymmetrical) network topologies. An algorithm is provided to compute such a feedback law based on individual system parameters.
Citation Formats
IEEE
E. Engin, “Dictionary learning for efficient classification with 1-sparse representations,” M.S. - Master of Science, Middle East Technical University, 2018.