Parameter estimation in generalized partial linear models with Tikhonov regularization
Date
2010
Author
Kayhan, Belgin
Item Usage Stats: 111 views, 45 downloads
Regression analysis refers to techniques for modeling and analyzing several variables in statistical learning. There are various types of regression models. In our study, we analyze Generalized Partial Linear Models (GPLMs), which decompose the input variables into two sets and additively combine a classical linear model with a nonlinear model part. By separating the linear part from the nonlinear one, an inverse problem method, Tikhonov regularization, is applied to the nonlinear submodel separately, within the entire GPLM. This particular representation of the submodels provides both better accuracy and better stability (regularity) under noise in the data. We aim to smooth the nonparametric part of the GPLM by using a modified form of Multivariate Adaptive Regression Splines (MARS), which is very useful for high-dimensional problems and does not impose any specific relationship between the predictor and dependent variables. Instead, it estimates the contributions of the basis functions so that both the additive and interaction effects of the predictors are allowed to determine the dependent variable. The MARS algorithm has two steps: the forward and backward stepwise algorithms. In the first one, the model is built by adding basis functions until a maximum level of complexity is reached. The backward stepwise algorithm then removes the least significant basis functions from the model. In this study, we propose to use a penalized residual sum of squares (PRSS) instead of the backward stepwise algorithm and construct the PRSS for MARS as a Tikhonov regularization problem. In addition, we provide numerical examples with two data sets; one exhibits interaction effects and the other does not. As well as studying the regularization of the nonparametric part, we also discuss theoretically the regularization of the parametric part.
Furthermore, we make a comparison between Infinite Kernel Learning (IKL) and Tikhonov regularization by using two data sets, with the difference consisting in the (non-)homogeneity of the data set. The thesis concludes with an outlook on future research.
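As a rough illustration of the core idea in the abstract (not code from the thesis), the sketch below fits a Tikhonov-regularized least-squares model over MARS-style hinge basis functions max(0, x − t). The knot locations, the synthetic data, and the penalty weight `lam` are all illustrative assumptions:

```python
# Minimal sketch: Tikhonov (ridge) regularization applied to a
# hinge-basis (MARS-style) expansion of a one-dimensional signal.
# Knots, data, and the penalty weight are arbitrary choices for
# illustration, not values from the thesis.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)  # noisy target

# Hinge basis functions max(0, x - t) at fixed knots t, plus an intercept
knots = np.linspace(0.1, 0.9, 9)
B = np.column_stack([np.ones_like(x)] + [np.maximum(0.0, x - t) for t in knots])

# Tikhonov-regularized least squares: minimize ||B c - y||^2 + lam ||c||^2
lam = 1e-2
c = np.linalg.solve(B.T @ B + lam * np.eye(B.shape[1]), B.T @ y)
y_hat = B @ c

rmse = float(np.sqrt(np.mean((y_hat - y) ** 2)))  # residual error of the fit
print(rmse)
```

The penalty term `lam * np.eye(...)` is what stabilizes the solve under noise; a larger `lam` trades fidelity to the data for a smoother, more regular coefficient vector, which mirrors the accuracy/stability trade-off the abstract describes.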
Subject Keywords
Parameter estimation.
URI
http://etd.lib.metu.edu.tr/upload/12612530/index.pdf
https://hdl.handle.net/11511/20143
Collections
Graduate School of Applied Mathematics, Thesis
Suggestions
OpenMETU
Core
Parameter estimation in generalized partial linear models with conic quadratic programming
Çelik, Gül; Weber, Gerhard Wilhelm; Department of Scientific Computing (2010)
In statistics, regression analysis is a technique, used to understand and model the relationship between a dependent variable and one or more independent variables. Multiple Adaptive Regression Spline (MARS) is a form of regression analysis. It is a non-parametric regression technique and can be seen as an extension of linear models that automatically models non-linearities and interactions. MARS is very important in both classification and regression, with an increasing number of applications in many areas...
Analysis Window Length Selection For Linear Signal Models
Yazar, Alper; Candan, Çağatay (2015-05-19)
A method is presented for the selection of analysis window length, or the number of input samples, for linear signal modeling without compromising the model assumptions. It is assumed that the signal of interest lies in a known linear space and noisy samples of the signal are provided. The goal is to use as many signal samples as possible to mitigate the effect of noise without violating the assumptions on the model. An application example is provided to illustrate the suggested method.
CMARS: a new contribution to nonparametric regression with multivariate adaptive regression splines supported by continuous optimization
Weber, Gerhard-Wilhelm; Batmaz, İnci; Köksal, Gülser; Taylan, Pakize; Yerlikaya-Ozkurt, Fatma (2012-01-01)
Regression analysis is a widely used statistical method for modelling relationships between variables. Multivariate adaptive regression splines (MARS) especially is very useful for high-dimensional problems and fitting nonlinear multivariate functions. A special advantage of MARS lies in its ability to estimate contributions of some basis functions so that both additive and interactive effects of the predictors are allowed to determine the response variable. The MARS method consists of two parts: forward an...
ON ROBUSTNESS OF CONCURRENT LEARNING ADAPTIVE CONTROL TO TIME-VARYING DISTURBANCES AND SYSTEM UNCERTAINTIES
Sarsilmaz, S. Burak; Kutay, Ali Türker; Yucelen, Tansel (2017-11-09)
In this paper, we study the robustness characteristics of a recently developed concurrent learning model reference adaptive control approach to time-varying disturbances and system uncertainties. Specifically, the commonly-used constant (or slowly time-varying) assumption on disturbances and system uncertainties for this particular adaptive control approach is replaced with its bounded counterpart with piecewise continuous and bounded derivatives. Based on the Lyapunov's direct method, we then show that the...
TRACEMIN-Fiedler: A Parallel Algorithm for Computing the Fiedler Vector
Manguoğlu, Murat; Saied, Faisal; Sameh, Ahmed (2010-06-25)
The eigenvector corresponding to the second smallest eigenvalue of the Laplacian of a graph, known as the Fiedler vector, has a number of applications in areas that include matrix reordering, graph partitioning, protein analysis, data mining, machine learning, and web search. The computation of the Fiedler vector has been regarded as an expensive process as it involves solving a large eigenvalue problem. We present a novel and efficient parallel algorithm for computing the Fiedler vector of large graphs bas...
Citation Formats
IEEE
B. Kayhan, “Parameter estimation in generalized partial linear models with Tikhonov regularization,” M.S. - Master of Science, Middle East Technical University, 2010.