NON-EUCLIDEAN VECTOR PRODUCT FOR NEURAL NETWORKS
Date
2018-04-20
Author
Afrasiyabi, Arman
Badawi, Diaa
Nasır, Barış
Yildiz, Ozan
Yarman Vural, Fatoş Tunay
Çetin, Ahmet Enis
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Item Usage Stats
163 views, 0 downloads
Abstract
We present a non-Euclidean vector product for artificial neural networks. The vector product operator does not require any multiplications while still providing correlation information between two vectors. Ordinary neurons require the inner product of two vectors. We propose a class of neural networks with the universal approximation property over the space of Lebesgue integrable functions, based on the proposed non-Euclidean vector product. In this new network, the "product" of two real numbers is defined as the sum of their absolute values, with the sign determined by the sign of the product of the numbers. This "product" is used to construct a vector product in R^N. The vector product induces the ℓ1 norm. The additive neural network successfully solves the XOR problem. Experiments on the MNIST and CIFAR datasets show that the classification performance of the proposed additive neural network is comparable to that of the corresponding multi-layer perceptron and convolutional neural networks.
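The operator described in the abstract is simple enough to sketch directly. The following is a minimal NumPy illustration under the abstract's definition, not the authors' code: the names mf_scalar, mf_dot, and additive_neuron, and the bias and activation in the example neuron, are assumptions made for the example.

```python
# A minimal sketch, assuming NumPy, of the multiplication-free "product"
# described in the abstract. Function names and the bias/activation in the
# example neuron are illustrative assumptions, not the authors' implementation.
import numpy as np

def mf_scalar(a, b):
    # Scalar "product": (|a| + |b|), signed by the sign of a*b.
    # The sign factor is written with np.sign for clarity; in hardware it
    # could be realized with sign-bit logic instead of a multiplication.
    return np.sign(a) * np.sign(b) * (np.abs(a) + np.abs(b))

def mf_dot(x, y):
    # Vector "product" in R^N: the element-wise operator summed over components.
    # Note mf_dot(x, x) = 2 * ||x||_1, i.e. the operator induces the l1 norm.
    return np.sum(np.sign(x) * np.sign(y) * (np.abs(x) + np.abs(y)))

def additive_neuron(w, x, b=0.0, act=np.tanh):
    # Hypothetical additive neuron: the usual inner product w.x is replaced
    # by the non-Euclidean vector product (bias and activation are assumed).
    return act(mf_dot(w, x) + b)

x = np.array([0.5, -2.0, 1.0])
print(mf_dot(x, x))          # 7.0
print(2 * np.abs(x).sum())   # 7.0, i.e. twice the l1 norm of x
```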
Subject Keywords
Non-Euclidean operator, Additive neural networks, Multiplication-free operator
URI
https://hdl.handle.net/11511/55281
Conference Name
IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Collections
Department of Computer Engineering, Conference / Seminar
Suggestions
Asymptotic Behavior of Lotz-Rabiger and Martingale Nets
Emelyanov, Eduard (2010-09-01)
Using Theorem 1 (of convergence) in [1], we prove several results on LR- and M-nets by a unified approach to these nets that appear as the two extreme types of asymptotically abelian nets.
Representing temporal knowledge in connectionist expert systems
Alpaslan, Ferda Nur (1996-09-27)
This paper introduces a new temporal neural network model which can be used in connectionist expert systems. Also, a variation of the backpropagation algorithm, called the temporal feedforward backpropagation algorithm, is introduced as a method for training the neural network. The algorithm was tested using training examples extracted from a medical expert system. A series of experiments were carried out using the temporal model and the temporal backpropagation algorithm. The experiments indicated that the alg...
Domain-Structured Chaos in a Hopfield Neural Network
Akhmet, Marat (World Scientific Pub Co Pte Lt, 2019-12-30)
In this paper, we provide a new method for constructing chaotic Hopfield neural networks. Our approach is based on structuring the domain to form a special set through the discrete evolution of the network state variables. In the chaotic regime, the formed set is invariant under the system governing the dynamics of the neural network. The approach can be viewed as an extension of the unimodality technique for one-dimensional maps, thereby generating chaos from higher-dimensional systems. We show that the dis...
Dual-Band Antenna Array Optimizations Using Heuristic Algorithms and the Multilevel Fast Multipole Algorithm
Onol, Can; Gokce, Ozer; Ergül, Özgür Salih (2015-05-24)
We consider design and simulations of dual-band antenna arrays and their optimizations via heuristic algorithms, particularly, genetic algorithms (GAs) and particle swarm optimization (PSO) methods. As shown below, these arrays consist of patch antennas of different sizes, depending on the target frequencies. The resulting radiation problems are solved iteratively, where the matrix-vector multiplications are performed efficiently with the multilevel fast multipole algorithm (MLFMA). MLFMA allows for realist...
Binarized Weight Networks for Inverse Problems
Ozkan, Savas; Becek, Kadircan; Inci, Alperen; Kutukcu, Basar; Ugurcali, Faruk; Kaya, Mete Can; Akar, Gözde (2020-01-01)
In this paper, we present a binarized neural network structure for inverse problems. In this structure, memory requirements and computation time are significantly reduced with a negligible performance drop compared to full-precision models. For this purpose, a unique architecture is proposed based on residual learning. Precisely, it opts to reconstruct only the error between the input and output images, which eventually centralizes the responses around zero. To this end, this provides several advantages fo...
Citation Formats
IEEE
A. Afrasiyabi, D. Badawi, B. Nasır, O. Yildiz, F. T. Yarman Vural, and A. E. Çetin, “NON-EUCLIDEAN VECTOR PRODUCT FOR NEURAL NETWORKS,” presented at the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Calgary, Canada, 2018, Accessed: 00, 2020. [Online]. Available: https://hdl.handle.net/11511/55281.