Being creatures who crave organization, humans like to put things into nice, neat piles. That includes music, of course. Who doesn’t sort sounds with labels like “rock,” “hip hop,” and “pop,” along with hundreds of genres, sub-genres, sub-sub-genres, and even sub-sub-sub-genres? That has worked reasonably well for decades, but maybe there’s something better, something more…scientific.
Researchers from McGill, Cambridge, and Stanford have come up with just that. Rather than genres, they rate music along three dimensions. The first is “Arousal,” which describes the energy of the music. “Valence” captures the emotion of the music on a scale from sad to happy. And “Depth” measures the intellect or sophistication in the music.
By using these metrics, the researchers think they can help streaming platforms sharpen their music recommendation algorithms.
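To make the idea concrete, here is a minimal sketch of how a recommender could use three such scores. This is not the researchers’ actual method; the track names, scores, and the simple nearest-neighbor matching are all invented for illustration. Each track gets hypothetical arousal, valence, and depth values between 0 and 1, and recommendations are ranked by distance to a listener’s preferred point in that space.

```python
from dataclasses import dataclass
import math


@dataclass
class Track:
    """A track scored on the three dimensions (all values here are made up)."""
    title: str
    arousal: float  # energy: 0 = calm, 1 = intense
    valence: float  # emotion: 0 = sad, 1 = happy
    depth: float    # sophistication: 0 = simple, 1 = complex


def distance(track: Track, profile: tuple[float, float, float]) -> float:
    """Euclidean distance between a track and a listener's preferred point."""
    a, v, d = profile
    return math.sqrt(
        (track.arousal - a) ** 2
        + (track.valence - v) ** 2
        + (track.depth - d) ** 2
    )


def recommend(tracks: list[Track], profile: tuple[float, float, float], n: int = 3) -> list[Track]:
    """Return the n tracks closest to the listener's profile."""
    return sorted(tracks, key=lambda t: distance(t, profile))[:n]


# Illustrative catalog with invented scores.
catalog = [
    Track("Upbeat pop single", arousal=0.8, valence=0.9, depth=0.3),
    Track("Melancholy ballad", arousal=0.2, valence=0.2, depth=0.6),
    Track("Dense jazz suite", arousal=0.5, valence=0.5, depth=0.9),
]

# A listener who prefers calm, thoughtful, slightly sad music.
print(recommend(catalog, profile=(0.3, 0.4, 0.8), n=2))
```

Real streaming services use far richer signals, but the sketch shows why three continuous scores can be easier for an algorithm to work with than hundreds of overlapping genre labels.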