On Mon, 8 Aug 2005, Richard Smith wrote:
>> Indeed, manual classification other than a subjective/personal rating system is useless.
>
> Useless for what _you_ guys do with pyTone. For my purposes, classification is essential and the core of what Dancebox will do for the user. My classifications are much more specific than what a single genre field allows, and they let a song span multiple classifications.
I should have said a time-waster instead of useless (unless you have a limited set of files). Can you give a list of the different characteristics that you use for classification?
I bet most (if not all) of the objective characteristics can be deduced automatically. I've been looking at some literature on this. Search for 'automatic classification of music', 'music fingerprinting' or 'acoustic fingerprinting'.
There is plenty more literature, research and even example code out there. Some approaches look only at specific frequency bands (like the ones drums occupy), some use voice-recognition patterns to find instrumental music, and others build neural networks for the classification itself.
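To give an idea of what "deduced automatically" could look like in practice, here is a rough Python sketch. It assumes the librosa library for the audio analysis (not something pyTone or Dancebox uses, as far as I know), and the characteristic names are just my own picks:

import librosa

def extract_characteristics(path):
    """Derive a few objective characteristics straight from the audio signal."""
    y, sr = librosa.load(path, mono=True)           # decode to a mono waveform
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)  # estimated beats per minute
    return {
        "tempo_bpm": float(tempo),
        # spectral centroid roughly tracks the perceived "brightness" of the timbre
        "brightness": float(librosa.feature.spectral_centroid(y=y, sr=sr).mean()),
        # zero-crossing rate is a cheap proxy for noisiness/percussiveness
        "noisiness": float(librosa.feature.zero_crossing_rate(y).mean()),
        # RMS energy as a crude loudness measure
        "loudness": float(librosa.feature.rms(y=y).mean()),
    }

Such a dictionary of numbers per song is the raw material the classification below would work on.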
Characteristics include:
 - perceived tempo (very slow, slow, medium, fast, very fast)
 - mood (happy/neutral/sad)
 - emotion (soft/neutral/aggressive)
 - complexity (low, medium, high)
 - focus (vocals, both, instrumental)

or, more broadly:

 - Timbre-related
 - Rhythm-related
 - Pitch-related
Based on a good set of characteristics, each genre will have upper and lower limits for every characteristic, and it's quite possible that some genres overlap (so that a song can fall into both the Jazz and Blues genres).
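As a toy example of what those per-genre limits could look like in code (the numbers and characteristic names are invented purely for illustration):

# Each genre gets a (lower, upper) range per characteristic; values are made up.
GENRE_LIMITS = {
    "Jazz":  {"tempo_bpm": (60, 180), "complexity": (0.5, 1.0), "noisiness": (0.0, 0.4)},
    "Blues": {"tempo_bpm": (50, 140), "complexity": (0.3, 0.8), "noisiness": (0.0, 0.5)},
}

def matching_genres(song):
    """Return every genre whose limits the song falls within; overlap is allowed."""
    return [genre for genre, limits in GENRE_LIMITS.items()
            if all(lo <= song.get(name, 0) <= hi for name, (lo, hi) in limits.items())]

# A song with these (made-up) characteristics lands in both Jazz and Blues:
print(matching_genres({"tempo_bpm": 100, "complexity": 0.6, "noisiness": 0.2}))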
I'm pretty sure it can be done (and gradually improved over time), and it need not be limited to genre classification (which can also be done against an online database like freedb).
Based on user feedback, a centralized neural network could be gradually improved until it works for the complete body of music that exists digitally.
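A crude sketch of how such feedback-driven improvement could work, with scikit-learn's SGDClassifier standing in for the neural network (the genre list and feature layout are my own assumptions, not anything either project defines):

import numpy as np
from sklearn.linear_model import SGDClassifier

GENRES = ["Jazz", "Blues", "Techno"]    # hypothetical label set
model = SGDClassifier(loss="log_loss")  # incremental classifier standing in for
                                        # the "centralized neural network"

def learn_from_feedback(characteristics, corrected_genre):
    """Fold a single user correction into the shared model without full retraining."""
    X = np.array([characteristics])
    y = np.array([corrected_genre])
    model.partial_fit(X, y, classes=np.array(GENRES))

# e.g. a user reports that this song is really Blues:
learn_from_feedback([100.0, 0.6, 0.2], "Blues")

The point is only that corrections can be folded in incrementally; a real service would of course need to aggregate feedback from many users before trusting it.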
Kind regards,
--   dag wieers,  dag@wieers.com,  http://dag.wieers.com/   --
[all I want is a warm bed and a kind word and unlimited power]