Semantic dimensionality reduction during language acquisition: A window into concept representation
Download: Rojda_Ozcan_Semantic_Dimensionality_Reduction.pdf
Date: 2022-08-31
Author: Özcan, Rojda
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Item Usage Stats: 363 views, 229 downloads
We explore the dimensionality of the semantic representations derived from the Eve fragment of the CHILDES database to gain insight into whether semantic dimensionality reduction (DR) occurs during language acquisition and, if so, what this reduction of dimensions might look like. These representations come in the form of lambda terms (LTs), so we begin by seeking alternative representations for them that are better suited to the application of DR techniques. Since they make the data set more amenable to DR, we prepare a version of the original data set in which the LTs are converted into De Bruijn terms (DBTs). After deriving vectorial representations for the LTs and DBTs (for the DBTs, we prepare two different versions of the data set), we study the dimensionality of the resulting three data sets using Principal Component Analysis, cosine similarity comparisons, and Affinity Propagation clustering, and report the results.
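The conversion from lambda terms to De Bruijn terms mentioned in the abstract can be illustrated with a minimal sketch (this is an illustration of the standard technique, not the thesis code; the tuple encoding of terms is an assumption of this example). Each bound variable name is replaced by a number counting how many binders lie between the variable occurrence and the binder that introduces it:

```python
def to_de_bruijn(term, env=()):
    """Convert a named lambda term to a De Bruijn term.

    `term` is a nested tuple (hypothetical encoding):
      ('var', name) | ('lam', name, body) | ('app', f, x)
    """
    kind = term[0]
    if kind == 'var':
        # The index is the distance, in binders, to the matching 'lam'.
        return ('var', env.index(term[1]))
    if kind == 'lam':
        # The binder's name is dropped; inside the body it sits at index 0.
        return ('lam', to_de_bruijn(term[2], (term[1],) + env))
    # Application: translate both subterms under the same environment.
    return ('app', to_de_bruijn(term[1], env), to_de_bruijn(term[2], env))

# λx. λy. x  becomes  λ λ 1
print(to_de_bruijn(('lam', 'x', ('lam', 'y', ('var', 'x')))))
# → ('lam', ('lam', ('var', 1)))
```

Because the resulting terms contain no variable names, alpha-equivalent terms get identical representations, which is presumably what makes the DBT data sets more uniform inputs for the vectorization and DR steps the abstract describes.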
Subject Keywords
Language acquisition, Semantics, Dimensionality reduction, De Bruijn index, Language of thought
URI
https://hdl.handle.net/11511/99545
Collections
Graduate School of Informatics, Thesis
Suggestions
Sentiment and Context-refined Word Embeddings for Sentiment Analysis
Deniz, Ayca; Angin, Merih; Angın, Pelin (2021-01-01)
Word embeddings have become the de-facto tool for representing text in natural language processing (NLP) tasks, as they can capture semantic and syntactic relations, unlike their precedents such as Bag-of-Words. Although word embeddings have been employed in various studies in recent years and proven to be effective in many NLP tasks, they are still immature for sentiment analysis, as they suffer from insufficient sentiment information. General word embedding models pre-trained on large corpora with methods...
Learning semi-supervised nonlinear embeddings for domain-adaptive pattern recognition
Vural, Elif (2019-05-20)
We study the problem of learning nonlinear data embeddings in order to obtain representations for efficient and domain-invariant recognition of visual patterns. Given observations of a training set of patterns from different classes in two different domains, we propose a method to learn a nonlinear mapping of the data samples from different domains into a common domain. The nonlinear mapping is learnt such that the class means of different domains are mapped to nearby points in the common domain in order to...
Semantic Communications in Networked Systems: A Data Significance Perspective
Uysal, Elif; Kaya, Onur; Ephremides, Anthony; Gross, James; Codreanu, Marian; Popovski, Petar; Assaad, Mohamad; Liva, Gianluigi; Munari, Andrea; Soret, Beatriz; Soleymani, Touraj; Johansson, Karl Henrik (2022-7-01)
We present our vision for a departure from the established way of architecting and assessing communication networks, by incorporating the semantics of information, defined not necessarily as the meaning of the messages, but as their significance, possibly within a real-time constraint, relative to the purpose of the data exchange. We argue that research efforts must focus on laying the theoretical foundations of a redesign of the entire process of information generation, transmission, and usage for networke...
Learning by optimization in random neural networks
Atalay, Mehmet Volkan (1998-10-28)
The random neural network model proposed by Gelenbe has a number of interesting features in addition to a well established theory. Gelenbe has also developed a learning algorithm for the recurrent random network model using gradient descent of a quadratic error function. We present a quadratic optimization approach for learning in the random neural network, particularly for image texture reconstruction.
Process ontology development using natural language processing: a multiple case study
Gurbuz, Ozge; Rabhi, Fethi; Demirörs, Onur (2019-09-17)
Purpose Integrating ontologies with process modeling has gained increasing attention in recent years since it enhances data representations and makes it easier to query, store and reuse knowledge at the semantic level. The authors focused on a process and ontology integration approach by extracting the activities, roles and other concepts related to the process models from organizational sources using natural language processing techniques. As part of this study, a process ontology population (PrOnPo) metho...
Citation Formats
IEEE
R. Özcan, “Semantic dimensionality reduction during language acquisition: A window into concept representation,” M.S. - Master of Science, Middle East Technical University, 2022.