Music driven real-time 3D concert simulation

Date: 2006-01-01

Authors: Yılmaz, Erdal; Çetin, Yasemin; Erdem, Cigdem Eroglu; Erdem, Tanju; Ozkan, Mehmet
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Item Usage Stats: 173 views, 0 downloads
Music visualization has always attracted interest, and it became more popular in recent years after PCs and MP3 songs emerged as an alternative to existing audio systems. Most PC-based music visualization tools employ visual effects such as bars, waves, and particle animations. In this work we define a new music visualization scheme that aims to create a life-like interactive virtual environment simulating a concert arena, combining research areas such as crowd animation, facial animation, character modeling, and audio analysis.
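The abstract describes driving a virtual concert scene from audio analysis. As a minimal, hypothetical sketch (not the authors' implementation — the function names, frame sizes, and synthetic signal are all illustrative assumptions), short-time RMS energy can be extracted from an audio signal and normalized into a [0, 1] parameter suitable for driving an animation property such as crowd excitement:

```python
import numpy as np

def frame_energies(signal, frame_len=1024, hop=512):
    """Short-time RMS energy per frame of a mono audio signal."""
    n_frames = 1 + (len(signal) - frame_len) // hop
    return np.array([
        np.sqrt(np.mean(signal[i * hop: i * hop + frame_len] ** 2))
        for i in range(n_frames)
    ])

def energy_to_param(energies):
    """Normalize frame energies to [0, 1], e.g. to drive a crowd
    'excitement' or jump-height parameter per animation frame."""
    lo, hi = energies.min(), energies.max()
    if hi - lo < 1e-12:
        return np.zeros_like(energies)
    return (energies - lo) / (hi - lo)

# Synthetic test signal: a quiet 220 Hz tone with two loud "beats".
sr = 8000
t = np.linspace(0, 1, sr, endpoint=False)
sig = 0.1 * np.sin(2 * np.pi * 220 * t)
sig[2000:2500] *= 8.0   # beat 1
sig[6000:6500] *= 8.0   # beat 2

params = energy_to_param(frame_energies(sig))
peak_frame = int(np.argmax(params))  # frame where the animation would peak
```

A real-time version would compute this per incoming audio buffer and smooth the parameter over a few frames to avoid jittery animation.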
URI: https://hdl.handle.net/11511/55036

Journal: MULTIMEDIA CONTENT REPRESENTATION, CLASSIFICATION AND SECURITY

Collections: Graduate School of Natural and Applied Sciences, Article
Suggestions
Interactive object based analysis and manipulation of digital video
Eren, Pekin Erhan; Fu, Yue; Tekalp, Ahmet Murat (1998-12-09)
With the advent of MPEG-4, object-based natural/synthetic hybrid multimedia content is becoming more ubiquitous. In this paper, we address object-based interactive analysis of natural video for editing/authoring natural/synthetic hybrid content. Boundary and local motion of video objects are described by snake and 2-D mesh representations, respectively. The 2-D mesh modeling in effect performs a mapping of a natural video object into a computer graphics representation, namely geometry with motion and a text...
Musical instrument recognition with wavelet envelopes
Hacıhabiboğlu, Hüseyin (2002-09-16)
Automatic recognition of instrument type from raw audio data containing monophonic music is a fundamental problem for audio content analysis. There are many methods for the solution of this problem, which use common spectro-temporal properties like cepstral coefficients or spectral envelopes. A new method for instrument recognition utilising short-time amplitude envelopes of wavelet coefficients as feature vectors is presented. The classification engine is a distinctively small multilayer perceptron (MLP) n...
Towards Improvement of Interaction Aesthetics of Mobile Music Listening Journeys
Sen, Guzin; Sener, Bahar (2015-07-31)
Mobile music listening can be traced back to the introduction of Sony Walkman that upgraded music players both with privacy and portability. With the mobile listening media, our daily journeys in public environment have become more privatized, aestheticized and contented. It is a challenge to perform such a private activity in public environment with many people and audio-visual stimuli around. The journeys with music become additionally challenging with the music players' interfaces confined into tiny butt...
Designing for new generation electronic musical instruments: Strategies to improve interaction, user experience and live performance
Öke, Ethem Hürsu; Pedgley, Owaın Francıs; Şener Pedgley, Bahar; Department of Industrial Design (2020-10-30)
Since the turn of the 21st century, ground-breaking advancements in technology have led to the emergence of a completely new ‘species’ of electronic musical instrument. These instruments, which are heavily driven by or dependent on technology, have been accompanied by an interdisciplinary movement in the field of musical instrument research and design, interconnecting music-making to disciplines including, but not limited to, industrial design, interaction design, user experience (UX) design, computer sc...
Instrument based wavelet packet decomposition for audio feature extraction
Hacıhabiboğlu, Hüseyin (2001-09-10)
Feature extraction from audio data is a major concern in computer assisted music applications and content based audio retrieval. For general non-stationary signals, wavelet packet decomposition is used with entropy functions for best basis search. Musical instruments have well defined frequency ranges. Thus when audio data containing a solo instrument is concerned, wavelet packet decomposition may be adapted to that instrument's individual characteristics. The method discussed in this paper uses a number of...
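The two wavelet-based entries above build audio features from subband amplitudes. As a rough illustrative sketch — using a hand-rolled Haar transform rather than the wavelet packet machinery those papers describe, with all names and signal choices being assumptions — the mean absolute detail-coefficient amplitude per decomposition level gives a simple subband feature vector:

```python
import numpy as np

def haar_step(x):
    """One level of a Haar wavelet transform: approximation + detail."""
    x = x[: len(x) // 2 * 2]          # drop a trailing odd sample
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2)
    detail = (even - odd) / np.sqrt(2)
    return approx, detail

def subband_envelope_features(signal, levels=4):
    """Mean absolute detail amplitude at each decomposition level --
    a crude stand-in for wavelet-envelope feature vectors."""
    feats = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        feats.append(np.mean(np.abs(detail)))
    return np.array(feats)

# A low tone concentrates energy in the coarse (later) subbands,
# while white noise spreads energy evenly across all of them.
sr = 4096
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 192 * t)
noise = np.random.default_rng(0).normal(size=sr)

f_tone = subband_envelope_features(tone)
f_noise = subband_envelope_features(noise)
```

Such a feature vector could then be fed to a small classifier (the wavelet-envelope paper above uses a multilayer perceptron for this role).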
Citation Formats
IEEE
E. Yılmaz, Y. Çetin, C. E. Erdem, T. Erdem, and M. Ozkan, “Music driven real-time 3D concert simulation,” MULTIMEDIA CONTENT REPRESENTATION, CLASSIFICATION AND SECURITY, pp. 379–386, 2006, Accessed: 00, 2020. [Online]. Available: https://hdl.handle.net/11511/55036.