Music Beat Analysis


Did you ever wonder how the music visualizer in your media player works? Watching it pulsate in sync with the beats of a song is almost as entertaining as listening to the song itself! Researchers have been working on beat detection in audio signals for many years, and the available techniques range from simple (and less accurate) to sophisticated and highly accurate. All of them, though, perform some form of signal processing and frequency analysis, applications highly suited to GPU computing.

The beat visualizer described here was developed by researchers at Rice University, and is simple and fast. An incoming signal is broken down into six frequency bands for analysis. After smoothing out these bands and performing further signal processing steps (such as convolution and filtering), the algorithm proceeds to find the band with the highest energy.
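The pipeline described above can be sketched in plain MATLAB. This is an illustrative outline only, not the Rice code: the band edges, smoothing window, and function name are assumptions, and it relies on `butter` and `hann` from the Signal Processing Toolbox.

```matlab
% Sketch of the band-energy approach (illustrative; band edges,
% window length, and names are assumptions, not the Rice code).
function loudest = find_loudest_band(x, fs)
    lo = [1   200 400  800 1600 3200];        % assumed lower band edges (Hz)
    hi = [200 400 800 1600 3200 fs/2 - 1];    % assumed upper band edges (Hz)
    nBands = 6;
    energy = zeros(1, nBands);
    win = hann(round(0.01 * fs));             % ~10 ms smoothing window
    for k = 1:nBands
        [b, a]  = butter(2, [lo(k) hi(k)] / (fs/2));   % bandpass filter
        band    = filter(b, a, x);                     % isolate the band
        env     = conv(abs(band), win, 'same');        % smooth the envelope
        energy(k) = sum(env .^ 2);                     % energy in this band
    end
    [~, loudest] = max(energy);               % band with the highest energy
end
```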

Four plots are shown during the analysis:

  1. Top Left: Input audio waveform
  2. Top Right: Output of a few functions of the input waveform
  3. Bottom Left: Six spheres corresponding to the six frequency bands; each expands when a beat is detected in its band.
  4. Bottom Right: A sphere that pulses when a beat is detected overall.

Music Beat Visualizer with Jacket

Converting the MATLAB-based visualizer to Jacket was made possible mainly through the clever use of two of Jacket’s functions:

  1. The CLASS function, used to run the same code on either the CPU or the GPU depending on the input type
  2. The GFOR function, used to execute serial loops in parallel
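The two techniques combine naturally: CLASS selects the execution path, and GFOR runs the per-band loop body for all six bands at once. The sketch below shows the general pattern; the function name and the use of `conv` inside the loop are assumptions for illustration, not the shipped example code.

```matlab
% Illustrative pattern (assumed names, not the shipped example):
% dispatch on CLASS, parallelize the band loop with GFOR.
function y = smooth_bands(bands, h)
    if strcmp(class(bands), 'gsingle')       % input is a Jacket GPU array
        y = gzeros(size(bands));
        gfor k = 1:size(bands, 2)            % all six bands run in parallel
            y(:, k) = conv(bands(:, k), h, 'same');
        gend
    else                                     % plain CPU path for double/single
        y = zeros(size(bands));
        for k = 1:size(bands, 2)            % ordinary serial loop
            y(:, k) = conv(bands(:, k), h, 'same');
        end
    end
end
```

Because the dispatch keys off the class of the input, the same function serves both backends; passing a GPU array (e.g. `gsingle` data) takes the GFOR path, while ordinary MATLAB arrays fall through to the serial loop.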

The original MATLAB code was adapted for Jacket by Vidhur Vohra, a grad student at Georgia Tech. He shared with us his thoughts on converting the code to Jacket:

…The main reason for the speed up is that previously the bands were put through the functions serially, but with the capabilities of JACKET/CUDA it was possible to put all six bands through most calculations at once, thereby greatly reducing the time for computation.

As a result, the code has been observed to run up to 6 times faster on a Tesla C2075 (vs. a Core 2 Quad CPU). The computationally intensive core of the visualizer was observed to run 15 times faster!

Observed times:

Computational core (rhythm.m)

  • Core 2 Quad (Q9400): 0.37 s per iteration
  • NVIDIA Tesla C2075: 0.024 s per iteration

Overall:

  • Core 2 Quad (Q9400): 11 iterations per second
  • NVIDIA Tesla C2075: 64 iterations per second

The Music Visualizer Example will soon be available with Jacket 1.9! If you can’t wait to try it out, you may download and install Jacket’s nightly installers from this link (find out about our Nightlies here) and type:

cd [Jacket Location]
addpath examples/music_visualizer_example
music_visualizer_example gpu

We would like to thank the following for their efforts:

  1. Kileen Cheng, Bobak Nazer, Jyoti Uppuluri, and Ryan Verret at Rice University
  2. MIT Media Lab for the original beat-detection algorithm
  3. Vidhur Vohra, Georgia Institute of Technology

References:
CPU source code: adapted from “Beat This”, a beat-detection algorithm developed at the MIT Media Lab and adapted by researchers at Rice University.

This code project won first prize at the Georgia Tech GPU Coding Challenge. AccelerEyes is proud to have hosted this event and engaged the vibrant student community at Georgia Tech. Stay tuned for more such events at http://www.accelereyes.com!
