Learning to Increment A Contextual Model

2018-12-07
Kalkan, Sinan
Doğan, Fethiye Irmak
Bozcan, İlker
In this paper, we summarize our efforts on the incremental construction of latent variables in context (topic) models. With our models, an agent can incrementally learn a representation of critical contextual information. We demonstrate that a learning-based formulation outperforms rule-based models and generalizes well across many settings and to real data.
32nd Conference on Neural Information Processing Systems (NIPS 2018)
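The abstract does not spell out the learned criterion for adding new latent variables, so the following is only a rough, hypothetical sketch of the simpler rule-based alternative it mentions: contexts are kept as normalized concept-frequency vectors, and a new context is spawned whenever no existing one matches an incoming observation closely enough. The cosine-similarity test, the fixed threshold, and the merging step are illustrative assumptions, not the paper's method.

```python
import numpy as np

def increment_contexts(observations, contexts, threshold=0.6):
    """Hypothetical rule-based baseline for growing a context model.

    observations: iterable of 1-D concept-frequency vectors (same length).
    contexts: list of unit-norm context vectors, updated in place.
    A new context is added when the best cosine similarity to the
    existing contexts falls below `threshold`.
    """
    for obs in observations:
        obs = np.asarray(obs, dtype=float)
        obs = obs / (np.linalg.norm(obs) + 1e-12)
        if contexts:
            sims = [float(obs @ c) for c in contexts]
            best = int(np.argmax(sims))
            if sims[best] >= threshold:
                # close enough: merge the observation into that context
                contexts[best] = contexts[best] + obs
                contexts[best] /= np.linalg.norm(contexts[best])
                continue
        contexts.append(obs)  # no good match: add a new context
    return contexts
```

The threshold plays the role that, per the abstract, the paper replaces with a learned decision of when to increment the model.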

Suggestions

On a Minimal Spanning Tree Approach in the Cluster Validation Problem
Barzily, Zeev; Volkovich, Zeev; Öztürk, Başak; Weber, Gerhard Wilhelm (2009-01-01)
In this paper, a method for the study of cluster stability is proposed. We draw pairs of samples from the data according to two sampling distributions. The first distribution corresponds to the high-density zones of the data-element distribution and is thus associated with the cluster cores. The second one, associated with the cluster margins, is related to the low-density zones. The samples are clustered and the two obtained partitions are compared. The partitions are considered to be consistent if the obt...
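The truncated abstract describes the general resampling idea: cluster two samples and check whether the resulting partitions agree. The sketch below illustrates only that generic stability check (random subsamples, k-means, adjusted Rand index on the shared points); it does not implement the paper's minimal-spanning-tree statistic or its core/margin sampling distributions, and the subsample fraction and number of pairs are arbitrary choices.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

def stability_score(X, k, n_pairs=20, sample_frac=0.8, seed=0):
    """Generic resampling-based cluster-stability check.

    X: (n, d) ndarray of data points.  For each of `n_pairs` rounds,
    two random subsamples are clustered independently and their label
    assignments are compared on the points they share.
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    m = int(sample_frac * n)
    scores = []
    for _ in range(n_pairs):
        idx_a = rng.choice(n, m, replace=False)
        idx_b = rng.choice(n, m, replace=False)
        shared = np.intersect1d(idx_a, idx_b)
        km_a = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X[idx_a])
        km_b = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X[idx_b])
        scores.append(adjusted_rand_score(km_a.predict(X[shared]),
                                          km_b.predict(X[shared])))
    return float(np.mean(scores))
```

Higher average agreement across pairs indicates a more stable choice of k, which is the spirit of the comparison described in the abstract.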
Modeling and implementation of local volatility surfaces in Bayesian framework
Animoku, Abdulwahab; Uğur, Ömür; Yolcu-Okur, Yeliz (2018-06-01)
In this study, we focus on the reconstruction of volatility surfaces via a Bayesian framework. Apart from classical methods, such as parametric and non-parametric models, we study the Bayesian analysis of the (stochastically) parametrized volatility structure in the Dupire local volatility model. We systematically develop and implement novel mathematical tools for handling the classical methods of constructing local volatility surfaces. The most critical limitation of the classical methods is obtaining negativ...
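As a rough companion to this abstract, the sketch below estimates Dupire local volatility from a grid of European call prices using finite differences (numpy.gradient) and clips negative variances, which is presumably the kind of limitation the truncated sentence refers to. The grid layout, zero-dividend assumption, and clipping are illustrative choices, not the paper's Bayesian procedure.

```python
import numpy as np

def dupire_local_vol(call_prices, strikes, maturities, r=0.0):
    """Finite-difference estimate of Dupire local volatility.

    call_prices: (n_T, n_K) grid of call prices C(T, K); strikes and
    maturities are the corresponding 1-D coordinate arrays.  Assumes no
    dividends.  Negative variances are clipped to zero, which is the
    numerical artifact classical methods can suffer from.
    """
    C = np.asarray(call_prices, dtype=float)
    K = np.asarray(strikes, dtype=float)
    T = np.asarray(maturities, dtype=float)
    dC_dT = np.gradient(C, T, axis=0)
    dC_dK = np.gradient(C, K, axis=1)
    d2C_dK2 = np.gradient(dC_dK, K, axis=1)
    local_var = (dC_dT + r * K[None, :] * dC_dK) / (0.5 * K[None, :] ** 2 * d2C_dK2)
    return np.sqrt(np.clip(local_var, 0.0, None))
```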
Cluster stability using minimal spanning trees
Barzily, Zeev; Volkovich, Zeev; Akteke-Oeztuerk, Basak; Weber, Gerhard Wilhelm (2008-05-23)
In this paper, a method for the study of cluster stability is proposed. We draw pairs of samples from the data according to two sampling distributions. The first distribution corresponds to the high-density zones of the data-element distribution and is associated with the cluster cores. The second one, associated with the cluster margins, is related to the low-density zones. The samples are clustered and the two obtained partitions are compared. The partitions are considered to be consistent if the obtained ...
Learning on the border: Active learning in imbalanced data classification
Ertekin Bolelli, Şeyda; Bottou, Leon; Giles, C Lee (2007-10-06)
This paper is concerned with the class imbalance problem, which is known to hinder the learning performance of classification algorithms. The problem occurs when there are significantly fewer observations of the target concept. Various real-world classification tasks, such as medical diagnosis, text categorization and fraud detection, suffer from this phenomenon. Standard machine learning algorithms yield better prediction performance with balanced datasets. In this paper, we demonstrate th...
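A minimal uncertainty-sampling sketch in the spirit of this abstract: query the unlabeled pool points closest to the current SVM decision boundary, where the informative minority-class examples tend to lie. The linear kernel, batch size, and the labeled/pool split are assumptions for illustration, not necessarily the authors' exact procedure.

```python
import numpy as np
from sklearn.svm import SVC

def select_closest_to_boundary(clf, X_pool, batch_size=10):
    """Return indices of pool samples nearest the decision boundary,
    i.e. with the smallest |decision_function| values."""
    margins = np.abs(clf.decision_function(X_pool))
    return np.argsort(margins)[:batch_size]

# Usage sketch (X_labeled, y_labeled, X_pool are hypothetical arrays):
# clf = SVC(kernel="linear").fit(X_labeled, y_labeled)
# query_idx = select_closest_to_boundary(clf, X_pool)
# ...obtain labels for X_pool[query_idx], add them to the training set,
# refit, and repeat.
```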
Learning Context on a Humanoid Robot using Incremental Latent Dirichlet Allocation
Çelikkanat, Hande; Orhan, Guner; Pugeault, Nicolas; Guerin, Frank; Şahin, Erol; Kalkan, Sinan (Institute of Electrical and Electronics Engineers (IEEE), 2016-03-01)
In this paper, we formalize and model context in terms of a set of concepts grounded in the sensorimotor interactions of a robot. The concepts are modeled as a web using a Markov Random Field (MRF), inspired by the concept web hypothesis for representing concepts in humans. On this concept web, we treat context as a latent variable of Latent Dirichlet Allocation (LDA), a widely used method in computational linguistics for modeling topics in texts. We extend the standard LDA method in order to make ...
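For orientation only, the sketch below uses gensim's online LDA to fold a new batch of "concept documents" into an existing topic model, which is the flavor of incremental updating the abstract points at. Note that gensim keeps the number of topics fixed, whereas the paper extends LDA itself; the toy concept labels and the batch split are invented for illustration.

```python
from gensim.corpora import Dictionary
from gensim.models import LdaModel

# Toy "documents": bags of concept labels a robot might observe.
batch_1 = [["cup", "graspable", "on-table"], ["ball", "rollable"]]
batch_2 = [["book", "on-shelf", "flat"], ["cup", "graspable"]]

dictionary = Dictionary(batch_1 + batch_2)
corpus_1 = [dictionary.doc2bow(doc) for doc in batch_1]
corpus_2 = [dictionary.doc2bow(doc) for doc in batch_2]

# Fit on the first batch, then fold in the second batch incrementally.
lda = LdaModel(corpus=corpus_1, id2word=dictionary, num_topics=2,
               passes=5, random_state=0)
lda.update(corpus_2)

# The inferred topic mixture of a document is the "context" analogue here.
print(lda.get_document_topics(corpus_2[0]))
```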
Citation Formats
S. Kalkan, F. I. Doğan, and İ. Bozcan, “Learning to Increment A Contextual Model,” Montréal, Canada, 2018, p. 1, Accessed: 00, 2021. [Online]. Available: http://www.kovan.ceng.metu.edu.tr/~sinan/publications/NIPS2018_CL.pdf.