Efficient physics-based learned reconstruction methods for real-time 3D near-field MIMO radar imaging
Date
2024-01-01
Author
Manisali, Irfan
Oral, Okyanus
Öktem, Sevinç Figen
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Near-field multiple-input multiple-output (MIMO) radar imaging systems have recently gained significant attention. These systems generally reconstruct the three-dimensional (3D) complex-valued reflectivity distribution of the scene from sparse measurements, so imaging quality relies heavily on the image reconstruction approach. Existing analytical reconstruction approaches suffer from either high computational cost or low image quality. In this paper, we develop novel non-iterative deep learning-based reconstruction methods for real-time near-field MIMO imaging. The goal is to achieve high image quality at low computational cost in compressive settings. The developed approaches have two stages. In the first approach, a physics-based initial stage performs the adjoint operation to back-project the measurements into the image space, and a deep neural network (DNN)-based second stage converts the 3D back-projected measurements into a magnitude-only reflectivity image. Since scene reflectivities often have random phase, the DNN directly processes the magnitude of the adjoint result. A 3D U-Net is used as the DNN to jointly exploit range and cross-range correlations. To comparatively evaluate the significance of exploiting physics in a learning-based approach, two additional, purely learning-based methods that replace the physics-based first stage with fully connected layers are also developed. Performance is further analyzed by changing the second-stage DNN architecture to use complex-valued processing (instead of magnitude-only processing), 2D convolution kernels (instead of 3D), and a ResNet architecture (instead of a U-Net). Moreover, we develop a synthesizer to generate a large-scale dataset of 3D extended targets for training the neural networks. We illustrate the performance through experimental data and extensive simulations.
The results show the effectiveness of the developed physics-based learned reconstruction approach compared to commonly used approaches in terms of both runtime and image quality at highly compressive settings. Our source code and dataset are made available at https://github.com/METU-SPACE-Lab/Efficient-Learned-3D-Near-Field-MIMO-Imaging upon publication to advance research in this field.
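The physics-based first stage described in the abstract can be sketched as follows. This is a minimal, hypothetical NumPy illustration, not the paper's actual radar forward model: the forward operator A is simulated as a random complex matrix, and the toy dimensions (64 measurements, an 8x8x8 voxel grid) are assumptions chosen only to make the shapes concrete. The sketch shows the adjoint back-projection A^H y and the magnitude extraction that, in the paper, feeds the learned 3D U-Net second stage.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions, not from the paper): 64 sparse measurements,
# an 8x8x8 voxel reflectivity grid.
m, nx, ny, nz = 64, 8, 8, 8
n = nx * ny * nz

# Simulated complex forward operator A (stand-in for the MIMO radar model).
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))

# Scene reflectivities with random phase, as noted in the abstract.
x_true = rng.standard_normal(n) * np.exp(1j * rng.uniform(0, 2 * np.pi, n))
y = A @ x_true  # sparse measurements

# Stage 1: adjoint back-projection A^H y, reshaped onto the 3D voxel grid.
x_adj = (A.conj().T @ y).reshape(nx, ny, nz)

# Because the phase is random, only the magnitude of the adjoint result is
# passed to the learned second stage (a 3D U-Net in the paper).
dnn_input = np.abs(x_adj)
print(dnn_input.shape)  # (8, 8, 8)
```

Because this first stage is a fixed linear operation, it adds negligible runtime; the learned second stage then only has to map the back-projected magnitude volume to a clean reflectivity image.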
Subject Keywords
3D inverse imaging problems, Deep learning, Near-field microwave imaging, Radar imaging, Sparse MIMO array
URI
https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85175846278&origin=inward
https://hdl.handle.net/11511/106097
Journal
Digital Signal Processing: A Review Journal
DOI
https://doi.org/10.1016/j.dsp.2023.104274
Collections
Department of Electrical and Electronics Engineering, Article
Citation Formats
IEEE
I. Manisali, O. Oral, and S. F. Öktem, “Efficient physics-based learned reconstruction methods for real-time 3D near-field MIMO radar imaging,” Digital Signal Processing: A Review Journal, vol. 144, pp. 0–0, 2024, Accessed: 00, 2023. [Online]. Available: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85175846278&origin=inward.