Memory Networks as Neurocomputational Models of Cognitive Functions
Download
MEMORY_NETWORKS_AS_NEUROCOMPUTATIONAL_MODELS_OF_COGNITIVE_FUNCTIONS_Sinan_Altinuc.pdf
Date
2024-07-16
Author
ALTINUÇ, Sinan Onur
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
This thesis develops a cognitive computational framework for exploring memory networks as neurocomputational models of syntax and rule learning, with a particular focus on Neural Turing Machines (NTMs). It contrasts NTMs with recurrent neural networks such as the long short-term memory (LSTM) and Legendre memory unit (LMU) architectures. By using artificial grammars to evaluate syntactic processing capabilities, and specifically by assessing the rule-learning and generalization abilities of these neural architectures, it studies the role of memory mechanisms in cognitive models, particularly the role of external memory. The experiments involve training neural networks on various types of grammar, including regular grammars, context-free grammars (CFGs), and mildly context-sensitive grammars (MCSGs), with a focus on hierarchical structures and long-distance dependencies. Comparative analysis in the thesis shows that, unlike standard recurrent networks, neural networks with external memory structures such as NTMs were able to learn the underlying rules and thereby generalize CFGs beyond the training data. Supported also by a meta-analysis, the thesis shows that memory-augmented networks can, with a few exceptions, generalize grammars with MCSG properties. This capacity of the memory networks is compared with linguistic and neuroscientific studies to understand the structural biases that might be required for these tasks and their potential neural correlates in the human brain. The thesis suggests that the ability to store explicit representations in an external memory may be a key factor for rule-based symbolic operations in neural networks, with implications for neuroscience and AI.
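The length-generalization setup described in the abstract, training on short strings from a formal grammar and testing on strictly longer ones, can be illustrated with a toy corpus generator. The sketch below is hypothetical (it is not code from the thesis) and uses the canonical context-free language aⁿbⁿ, whose long-distance dependency (every `a` must be matched by a later `b`) cannot be captured by a regular grammar; the helper names `anbn` and `make_split` and the length cutoffs are illustrative choices:

```python
def anbn(n: int) -> str:
    """Generate the string a^n b^n, a canonical context-free pattern
    with a long-distance dependency (each 'a' must match a 'b')."""
    return "a" * n + "b" * n


def make_split(train_max: int = 10, test_max: int = 20):
    """Train on short strings, test on strictly longer ones.

    A model generalizes only if it performs well on lengths it never
    saw during training, i.e. if it learned the rule, not the strings.
    """
    train = [anbn(n) for n in range(1, train_max + 1)]
    test = [anbn(n) for n in range(train_max + 1, test_max + 1)]
    return train, test


train, test = make_split()
```

Under this protocol, memorization cannot explain success on the test set, which is what makes the split a probe of rule learning rather than pattern matching.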
Subject Keywords
Memory Networks, Rule Learning, Generalization, Computational Models of Mind, Cognitive Models, Working Memory
URI
https://hdl.handle.net/11511/110953
Collections
Graduate School of Informatics, Thesis
Citation Formats
S. O. ALTINUÇ, “Memory Networks as Neurocomputational Models of Cognitive Functions,” Ph.D. - Doctoral Program, Middle East Technical University, 2024.