Emotional image and musical information retrieval with interactive genetic algorithm

Research output: Contribution to journal › Article › peer-review

36 Citations (Scopus)


Several techniques in artificial intelligence have shown great potential for developing useful human-computer interfaces, but such systems still fall far short of matching human performance, especially in terms of emotion, intuition, and inspiration. To overcome this shortcoming, we present a promising technique called the interactive genetic algorithm (IGA), which performs optimization with human evaluation and allows the user to obtain what he has in mind through repeated interaction. To demonstrate the usefulness of the IGA for developing emotional human-computer interfaces, we have applied it to the problems of image and music information retrieval. Several experiments show that our approach allows us to design and search digital media described not only explicitly, but also through abstract impressions such as "cheerful" or "gloomy." We expect that the same approach can be applied to many other problems in musical information retrieval and manipulation based on intuition and inspiration.
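The abstract describes an IGA as a genetic algorithm in which the fitness function is replaced by human evaluation: the user rates each candidate, and selection, crossover, and mutation breed the next generation. The sketch below illustrates that loop under simplifying assumptions not taken from the paper — candidates are encoded as real-valued gene vectors in [0, 1], and `rate_fn` is a hypothetical callback standing in for the user's rating of each candidate (the article's actual media encodings differ).

```python
import random


def evolve(population, rate_fn, mutation_rate=0.2):
    """One IGA generation: rate_fn (the human in the loop) scores each
    candidate; roulette-wheel selection, one-point crossover, and random
    mutation then produce the next population of the same size."""
    scores = [rate_fn(ind) for ind in population]
    total = sum(scores)

    def select():
        # Roulette-wheel selection proportional to the user's ratings.
        r = random.uniform(0, total)
        acc = 0.0
        for ind, score in zip(population, scores):
            acc += score
            if acc >= r:
                return ind
        return population[-1]

    next_pop = []
    while len(next_pop) < len(population):
        a, b = select(), select()
        point = random.randrange(1, len(a))       # one-point crossover
        child = a[:point] + b[point:]
        child = [g if random.random() > mutation_rate else random.random()
                 for g in child]                   # per-gene mutation
        next_pop.append(child)
    return next_pop
```

In an actual IGA session, `rate_fn` would block on user input (e.g., a 1-to-5 rating of a displayed image or a played melody), which is why IGA systems keep populations small and generations few to limit user fatigue.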

Original language: English
Pages (from-to): 702-711
Number of pages: 10
Journal: Proceedings of the IEEE
Issue number: 4
Publication status: Published - April 2004

Bibliographical note

Funding Information:
Manuscript received February 5, 2003; revised November 7, 2003. This work was supported by the Brain Science and Engineering Research Program sponsored by the Korean Ministry of Science and Technology. S.-B. Cho is with the Department of Computer Science, Yonsei University, Seoul 120-749, Korea (e-mail: sbcho@cs.yonsei.ac.kr). Digital Object Identifier 10.1109/JPROC.2004.825900

All Science Journal Classification (ASJC) codes

  • Computer Science(all)
  • Electrical and Electronic Engineering


