Real time FPGA implementation of Full Search video stabilization method
Date
2012-04-20
Author
ÖZSARAÇ, İsmail
Ulusoy, İlkay
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Item Usage Stats: 181 views, 0 downloads
Abstract
The Full Search video stabilization method is implemented on an FPGA to demonstrate its real-time performance. The method is also implemented and tested in MATLAB, and the FPGA results are compared against the MATLAB results to evaluate accuracy. The input video is PAL, with a frame period of 40 ms. The FPGA implementation produces new stabilization data at every PAL frame, which allows it to be classified as real time. Simulation and hardware tests show that the FPGA implementation matches the accuracy of the MATLAB implementation.
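For readers unfamiliar with the technique, "Full Search" here refers to exhaustive block matching over a search window, typically scored with a sum of absolute differences (SAD). The sketch below is a minimal Python/NumPy illustration of that search, not the paper's implementation: the block size, search range, frame content, and the way block motions would be aggregated into a global stabilization shift are all assumptions; the actual work performs this search in FPGA hardware (and MATLAB) within the 40 ms PAL frame budget.

```python
import numpy as np

def full_search_sad(ref_block, search_area):
    """Exhaustively match ref_block against every position inside search_area
    and return the offset (dy, dx) with the minimum sum of absolute differences."""
    bh, bw = ref_block.shape
    sh, sw = search_area.shape
    ref = ref_block.astype(np.int32)          # widen to avoid uint8 wraparound
    best_offset, best_sad = (0, 0), np.inf
    for dy in range(sh - bh + 1):
        for dx in range(sw - bw + 1):
            cand = search_area[dy:dy + bh, dx:dx + bw].astype(np.int32)
            sad = np.abs(cand - ref).sum()
            if sad < best_sad:
                best_sad, best_offset = sad, (dy, dx)
    return best_offset, best_sad

# Illustrative parameters (assumptions, not the paper's values):
# a 16x16 block searched over a +/-8 pixel window of the previous frame.
rng = np.random.default_rng(0)
prev_frame = rng.integers(0, 256, size=(576, 720), dtype=np.uint8)  # PAL-sized frame
curr_frame = rng.integers(0, 256, size=(576, 720), dtype=np.uint8)

y, x, B, R = 280, 352, 16, 8
block = curr_frame[y:y + B, x:x + B]
area = prev_frame[y - R:y + B + R, x - R:x + B + R]
(dy, dx), sad = full_search_sad(block, area)

# A stabilizer would combine such per-block motions into one global shift;
# in the paper the whole search must complete within one 40 ms PAL frame period.
print("displacement:", (dy - R, dx - R), "SAD:", sad)
```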
Subject Keywords
Field programmable gate arrays, Streaming media, Application software, PSNR, Real time systems, Computer languages, Motion estimation
URI
https://hdl.handle.net/11511/42558
DOI
https://doi.org/10.1109/siu.2012.6204649
Collections
Department of Electrical and Electronics Engineering, Conference / Seminar
Suggestions
Real time color blending of rendered and captured video
Reinhard, Erik; Akyüz, Ahmet Oğuz; Colbert, Mark; Hughes, Charles; Oconnor, Matthew (2004-12-04)
Augmented reality involves mixing captured video with rendered elements in real-time. For augmented reality to be effective in training and simulation applications, the computer generated components need to blend in well with the captured video. Straightforward compositing is not sufficient, since the chromatic content of video and rendered data may be very different such that it is immediately obvious which parts of the composited image were rendered and which were captured. We propose a simple and effecti...
Real time color based object tracking
Özzaman, Gökhan; Erkmen, İsmet; Department of Electrical and Electronics Engineering (2005)
A method for real time tracking of non-rigid arbitrary objects is proposed in this study. The approach builds on and extends work on multidimensional color histogram based target representation, which is enhanced by spatial masking with a monotonically decreasing kernel profile prior to back-projection. The masking suppresses the influence of the background pixels and induces a spatially smooth target model representation suitable for gradient-based optimization. The main idea behind this approach is that a...
Real-time hardware-in-the-loop simulation of electrical machine systems using FPGAs
Üşenme, Serdar; Dilan, R.A.; Dölen, Melik; Koku, Ahmet Buğra (2009-11-18)
This study focuses on the development of an integrated software and hardware platform that is capable of performing real-time simulation of dynamic systems, including electrical machinery, for the purpose of hardware-in-the-loop simulation (HILS). The system to be controlled is first defined using a block diagram editor. The defined model is then compiled and downloaded onto an FPGA ("Field Programmable Gate Array") based hardware platform, which is to interface with the controller under test and carry out the...
MULTI-RESOLUTION MOTION ESTIMATION FOR MOTION COMPENSATED FRAME INTERPOLATION
Guenyel, Bertan; Alatan, Abdullah Aydın (2010-09-29)
A multi-resolution motion estimation scheme is proposed for tracking of the true 2D motion in video sequences for motion compensated image interpolation. The proposed algorithm utilizes frames with different resolutions and adaptive block dimensions for efficient representation of motion. Firstly, motion vectors for each block are assigned as a result of predictive search in each pass. Then, the outlier motion vectors are detected and corrected at the end of each pass. Simulation results with respect to dif...
USER DIRECTED VIEW SYNTHESIS ON OMAP PROCESSORS
Yildiz, Mursel; Akar, Gözde (2010-06-09)
In this paper, we propose a system for user directed real time view synthesis for hand-held devices. Stored image frames with corresponding depth maps are used as input to the system. The user's viewpoint choice is captured using a GYRO based system. OMAP3530 microprocessor is used as the main processor which processes suggested view synthesis algorithm with occlusion handling and frame enhancement techniques. Proposed algorithms are implemented on DSP core and ARM core of OMAP3530 separately and their performa...
Citation Formats
IEEE
İ. Özsaraç and İ. Ulusoy, “Real time FPGA implementation of Full Search video stabilization method,” 2012, Accessed: 00, 2020. [Online]. Available: https://hdl.handle.net/11511/42558.