


With the growth in digital representations of music, and of music stored in these representations, it is increasingly attractive to search collections of music. One mode of search is by similarity, but, for music, similarity search presents several difficulties: in particular, for melodic query support, deciding what part of the music is likely to be perceived as the theme by a listener, and deciding whether two pieces of music with different sequences of notes represent the same theme. In this paper we propose a three-stage framework for matching pieces of music. We use the framework to compare a range of techniques for determining whether two pieces of music are similar, by experimentally testing their ability to retrieve different transcriptions of the same piece of music from a large collection of MIDI files. These experiments show that comparison techniques differ widely in their effectiveness and that, by instantiating the framework with appropriate music manipulation and comparison techniques, pieces of music that match a query can be identified in a large collection.
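
To give a rough sense of the kind of comparison stage such a framework might instantiate, the sketch below (not the implementation evaluated in this paper) assumes melodies are available as MIDI note-number sequences, standardises each one as a sequence of pitch intervals so that transposed transcriptions of the same piece still align, and scores a query against a piece with a simple local-alignment dynamic program. The function names and scoring values are illustrative assumptions only.

# Illustrative sketch of interval-based melodic matching; all names and
# scoring parameters are assumptions, not the paper's chosen techniques.

def to_intervals(midi_pitches):
    """Convert a list of MIDI note numbers into successive pitch intervals."""
    return [b - a for a, b in zip(midi_pitches, midi_pitches[1:])]

def local_alignment_score(query, piece, match=1, mismatch=-1, gap=-1):
    """Smith-Waterman-style local alignment over two interval sequences."""
    rows, cols = len(query) + 1, len(piece) + 1
    table = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = table[i - 1][j - 1] + (
                match if query[i - 1] == piece[j - 1] else mismatch)
            score = max(0, diag, table[i - 1][j] + gap, table[i][j - 1] + gap)
            table[i][j] = score
            best = max(best, score)
    return best

# A query phrase still matches a transposed copy of itself, because the
# interval representation is unchanged by transposition.
query = [60, 62, 64, 65, 67]            # C D E F G
piece = [67, 69, 71, 72, 74, 72, 71]    # same phrase a fifth higher, extended
print(local_alignment_score(to_intervals(query), to_intervals(piece)))

In a full system, the score from such a comparison would be computed against every candidate melody extracted and standardised in the earlier stages, and the pieces would be ranked by that score.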
