Hierarchical Clustering of Music towards Human Mood

D. Rajesh, Karpagalakshmi R. C.

Abstract


This paper presents methods for mining music along the mood dimension. Mood is an emerging metadata type and access point in music digital libraries (MDL) and online music repositories, and there is growing interest in developing and evaluating Music Information Retrieval (MIR) systems that can provide automated access to the mood dimension of music. Mood as a music access feature is, however, neither well understood nor standardized. To understand it better, we develop methods to evaluate automated mood access techniques. This paper explores the relationships that mood has with genre, artist, and usage metadata. There are important consistencies within the genre-mood and artist-mood relationships, and these consistencies lead us to develop a cluster-based approach that derives a relatively small set of mood categories from the data. The emotional component of music has been recognized as a highly important factor, and music information behavior studies have also identified mood as an important criterion people use when seeking music. Music evokes various human emotions, or moods, through low-level musical features. In fact, a typical piece of music carries one or more moods, and this can be used as an important factor for determining similarity between pieces. In this paper, we propose a new music retrieval scheme based on the mood change pattern.
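The cluster-based approach described in the abstract can be illustrated with a minimal sketch. The feature set and song data below are hypothetical (the paper does not specify them): each song is represented by three normalized low-level mood features (tempo, energy, valence), and songs are grouped by single-linkage agglomerative (hierarchical) clustering until a small number of mood clusters remains.

```python
import math

# Hypothetical low-level mood features per song: (tempo, energy, valence),
# each normalized to [0, 1]. These values are illustrative only.
songs = {
    "song_a": (0.90, 0.80, 0.70),  # upbeat
    "song_b": (0.85, 0.75, 0.80),  # upbeat
    "song_c": (0.20, 0.30, 0.20),  # mellow
    "song_d": (0.25, 0.20, 0.15),  # mellow
    "song_e": (0.50, 0.90, 0.30),  # aggressive
}

def agglomerate(features, k):
    """Single-linkage agglomerative clustering down to k clusters.

    Starts with each song in its own cluster, then repeatedly merges
    the two clusters whose closest members are nearest in feature space.
    """
    clusters = [[name] for name in features]
    while len(clusters) > k:
        best = None  # (distance, i, j) of the closest cluster pair
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(math.dist(features[a], features[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters[j]  # merge the closest pair
        del clusters[j]
    return clusters

clusters = agglomerate(songs, 2)
```

With these toy features, stopping at two clusters separates the low-tempo, low-energy (mellow) songs from the rest; a real system would extract the features from audio and choose the cut level from the dendrogram rather than fixing k.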

 

Keywords: music mood classification, audio features, mood labels.



DOI: https://doi.org/10.26483/ijarcs.v1i4.154





Copyright (c) 2016 International Journal of Advanced Research in Computer Science