Bayesian learning under nonnormality
Date: 2004
Author: Yılmaz, Yıldız Elif
Item Usage Stats: 272 views, 109 downloads
Abstract
The naive Bayes classifier and maximum likelihood hypotheses in Bayesian learning are considered when the errors have non-normal distributions. For the location and scale parameters, efficient and robust estimators obtained with the modified maximum likelihood (MML) technique are used. In the naive Bayes classifier, the error distributions are allowed to differ from class to class and from feature to feature, and the Generalized Secant Hyperbolic (GSH) and Generalized Logistic (GL) distribution families are used in place of the normal distribution. It is shown that the non-normal naive Bayes classifier obtained in this way classifies the data more accurately than the one based on the normality assumption. Furthermore, maximum likelihood (ML) hypotheses obtained under the assumption of non-normality also produce better results than the conventional ML approach.
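As a rough, illustrative sketch of the idea described above (and not the thesis's actual method), the following Python snippet builds a naive Bayes classifier whose per-feature, per-class likelihoods are logistic rather than Gaussian. The logistic distribution and the simple median/IQR location-scale estimates are stand-ins chosen for brevity; the thesis instead uses the Generalized Secant Hyperbolic and Generalized Logistic families fitted with modified maximum likelihood (MML) estimators.

import numpy as np
from scipy import stats

class NonNormalNaiveBayes:
    """Naive Bayes with logistic (non-Gaussian) per-feature likelihoods.
    Illustrative sketch only: the thesis uses GSH/GL families with MML estimators."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_ = {}
        self.params_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            # Robust per-feature location/scale (median and IQR-based scale);
            # for a logistic distribution, IQR = 2 * ln(3) * scale.
            loc = np.median(Xc, axis=0)
            scale = stats.iqr(Xc, axis=0) / (2.0 * np.log(3.0))
            scale = np.where(scale > 0, scale, 1e-6)  # guard degenerate features
            self.params_[c] = (loc, scale)
            self.priors_[c] = Xc.shape[0] / X.shape[0]
        return self

    def predict(self, X):
        # Class score = sum of per-feature log-likelihoods + log prior.
        scores = []
        for c in self.classes_:
            loc, scale = self.params_[c]
            loglik = stats.logistic.logpdf(X, loc=loc, scale=scale).sum(axis=1)
            scores.append(loglik + np.log(self.priors_[c]))
        return self.classes_[np.argmax(np.vstack(scores), axis=0)]

# Hypothetical usage on synthetic logistic-noise data:
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.logistic(0.0, 1.0, size=(200, 3)),
                   rng.logistic(1.5, 1.0, size=(200, 3))])
    y = np.repeat([0, 1], 200)
    clf = NonNormalNaiveBayes().fit(X, y)
    print((clf.predict(X) == y).mean())  # training accuracy of the sketch

The IQR-to-scale conversion and the small positive floor on the scale are only there to keep the sketch numerically sane; any robust or MML-style estimator could be substituted for them.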
Subject Keywords: Electronic computers; Computer science
URI: http://etd.lib.metu.edu.tr/upload/3/12605582/index.pdf ; https://hdl.handle.net/11511/15006
Collections: Graduate School of Natural and Applied Sciences, Thesis
Suggestions
Comparison of rough multi layer perceptron and rough radial basis function networks using fuzzy attributes
Vural, Hülya; Alpaslan, Ferda Nur; Department of Computer Engineering (2004)
The hybridization of the soft computing methods of Radial Basis Function (RBF) neural networks, Multi Layer Perceptron (MLP) neural networks with back-propagation learning, fuzzy sets, and rough sets is studied in the scope of this thesis. Conventional MLP, conventional RBF, fuzzy MLP, fuzzy RBF, rough fuzzy MLP, and rough fuzzy RBF networks are compared. In the fuzzy neural networks implemented in this thesis, the input data and the desired outputs are given fuzzy membership values as the fuzzy properties "low...
Random Set Methods: Estimation of Multiple Extended Objects
Granstrom, Karl; Lundquist, Christian; Gustafsson, Fredrik; Orguner, Umut (Institute of Electrical and Electronics Engineers (IEEE), 2014-06-01)
Random set-based methods have provided a rigorous Bayesian framework and have been used extensively in the last decade for point object estimation. In this article, we emphasize that the same methodology offers an equally powerful approach to estimation of so-called extended objects, i.e., objects that result in multiple detections on the sensor side. Building upon the analogy between Bayesian state estimation of a single object and random finite set (RFS) estimation for multiple objects, we give a tutorial...
Domain adaptation on graphs by learning graph topologies: theoretical analysis and an algorithm
Vural, Elif (The Scientific and Technological Research Council of Turkey, 2019-01-01)
Traditional machine learning algorithms assume that the training and test data have the same distribution, while this assumption does not necessarily hold in real applications. Domain adaptation methods take into account the deviations in data distribution. In this work, we study the problem of domain adaptation on graphs. We consider a source graph and a target graph constructed with samples drawn from data manifolds. We study the problem of estimating the unknown class labels on the target graph using the...
Deriving a dynamic programming algorithm for batch scheduling in the refinement calculus
Aktuğ, İrem; Oğuztüzün, Mehmet Halit S.; Department of Computer Engineering (2003)
Refinement Calculus is a formalization of stepwise program construction. In this approach a program is derived from its specification by applying refinement rules. The Refinement Calculator, developed at TUCS, Finland, provides tool support for the Refinement Calculus. This thesis presents a case study aiming to evaluate the applicability of the theory and the performance of the tool. The Refinement Calculator is used for deriving a dynamic programming algorithm for a single-machine batch scheduling problem. A qua...
Sphere-packing bound for block-codes with feedback and finite memory
Como, Giacomo; Nakiboğlu, Barış (2010-07-23)
A lower bound is established on the error probability of fixed-length block-coding systems with finite-memory feedback, which can be described in terms of a time-dependent finite state machine. It is shown that the reliability function of such coding systems over discrete memoryless channels is upper-bounded by the sphere-packing exponent.
Citation Formats
IEEE
Y. E. Yılmaz, “Bayesian learning under nonnormality,” M.S. - Master of Science, Middle East Technical University, 2004.