Abstract
In this paper, we present an automated music video generation framework that synchronizes emotion between video and music. After a user uploads a video or a piece of music, the framework segments it and predicts the emotion of each segment. The preprocessing results are stored in the server's database. The user can then select a set of videos and music from the database, and the framework generates a music video: for each music segment, the system finds the most closely associated video segment by comparing low-level features and emotion differences. We compare our work against a similar music video generation method in a user preference study and show that our method produces preferable results.
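The matching step described above pairs each music segment with the closest video segment. A minimal sketch of that idea is shown below; the feature and emotion representations (a plain feature vector and a valence–arousal pair), the Euclidean distance, and the weights `w_feat`/`w_emo` are illustrative assumptions, not the paper's exact method.

```python
# Hedged sketch of emotion-aware segment matching: for each music segment,
# pick the video segment minimizing a weighted sum of low-level feature
# distance and emotion distance. Representations and weights are assumptions.
from dataclasses import dataclass
import math


@dataclass
class Segment:
    features: list      # low-level features (assumed vector form)
    emotion: tuple      # e.g. (valence, arousal) -- an assumption


def _dist(a, b):
    # Euclidean distance between two equal-length vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def match_segments(music_segs, video_segs, w_feat=0.5, w_emo=0.5):
    """Return, for each music segment, the index of the best video segment."""
    matches = []
    for m in music_segs:
        best = min(
            range(len(video_segs)),
            key=lambda i: (w_feat * _dist(m.features, video_segs[i].features)
                           + w_emo * _dist(m.emotion, video_segs[i].emotion)),
        )
        matches.append(best)
    return matches
```

In this sketch the same video segment may be reused for several music segments; a fuller implementation might also penalize reuse or enforce temporal coherence.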
Original language | English |
---|---|
Title of host publication | 2016 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2016 - Conference Proceedings |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 2594-2597 |
Number of pages | 4 |
ISBN (Electronic) | 9781509018970 |
DOIs | |
Publication status | Published - 2017 Feb 6 |
Event | 2016 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2016 - Budapest, Hungary Duration: 2016 Oct 9 → 2016 Oct 12 |
Publication series
Name | 2016 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2016 - Conference Proceedings |
---|---|
Other
Other | 2016 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2016 |
---|---|
Country/Territory | Hungary |
City | Budapest |
Period | 16/10/9 → 16/10/12 |
Bibliographical note
Publisher Copyright: © 2016 IEEE.
All Science Journal Classification (ASJC) codes
- Computer Vision and Pattern Recognition
- Artificial Intelligence
- Control and Optimization
- Human-Computer Interaction