News usually filters through Boing Boing, Slashdot, and Reddit – and indeed, this story already has. But oddly enough, I learned of this item by meeting its author in person in Somerville, Massachusetts. He has digital analysis that he believes can show whether a track was recorded to a click track.
Paul Lamere is a developer at Echo Nest, a brainy think-tank of music geeks developing new ways of processing musical metadata in the cloud. Whereas services like Last.fm focus mainly on content and community, Echo Nest’s API wants to make the computers in the cloud smarter about how they listen to your music. We’ve had a look at their work twice before:
The Remix API crunches rhythmic information at a number of levels. Since we first saw it, that API has grown into an SDK (read: something you can program more directly), assembled in Python. The Python-based SDK is now capable of creating the world’s most unlistenable mash-ups, among other things – some oddly compelling. On Friday, I got to listen to tunes with every other eighth note removed and Michael Jackson crossed with other tunes – that is, until the programmers in the office started to complain because they were about to lose their minds. (Echo Nest uses a Sonos system to pipe music office-wide. I hope we can give you a preview of those clips soon.)
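To give a sense of how simple a trick like “every other eighth note removed” can be, here is a minimal sketch of the idea in plain Python. This is not the Remix SDK’s actual API – the real SDK gets its segments from Echo Nest’s server-side analysis – so the `(start, duration)` segment tuples below are a stand-in I’ve invented for illustration. The edit itself boils down to list slicing and re-timing:

```python
# Hedged sketch: the "every other eighth note" trick, reduced to list
# operations over hypothetical (start, duration) segments in seconds.
# The real Remix SDK supplies segments from Echo Nest's analysis; here
# we fabricate them so the logic is self-contained.

def drop_alternate(segments):
    """Keep segments 0, 2, 4, ... and discard the rest."""
    return segments[::2]

def resequence(segments):
    """Re-time the kept segments back-to-back, roughly as an editor
    would when rendering the shortened audio."""
    out, cursor = [], 0.0
    for _start, dur in segments:
        out.append((cursor, dur))
        cursor += dur
    return out

# Eight hypothetical eighth notes, 0.25 s each (one bar at 120 BPM).
eighths = [(i * 0.25, 0.25) for i in range(8)]
kept = resequence(drop_alternate(eighths))
print(kept)  # four segments, re-timed to start at 0.0, 0.25, 0.5, 0.75
```

In the actual SDK the same pattern applies to real audio quanta rather than bare tuples; the point is that once the analysis hands you a timeline of musical events, a mash-up is just list manipulation.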
Remix SDK (currently Python)
But perhaps the most interesting thing this team has done so far is Paul’s work on plotting rhythmic analysis. Plots of tempo deviation, measured in beat durations, yield two interesting revelations:

1. Much of the music you know has a lot of rhythmic variation. (Dizzy Miss Lizzie by the Beatles, anyone? No Ringo Starr jokes, please.)
2. A lot of other music has disturbingly little rhythmic variation.

In search of the click track [Music Machinery]
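The measurement behind those plots is straightforward to sketch. Assuming you have beat onset times (the Echo Nest analysis provides these; the onsets below are synthetic), each beat’s duration and its deviation from the track’s mean gives you the raw material for a deviation plot. A click-tracked recording shows near-zero deviation; a human drummer drifts:

```python
# Minimal sketch of per-beat tempo deviation, with synthetic onsets.
# Real onset times would come from an analysis service or beat tracker.

def beat_deviations(onsets):
    """Deviation of each beat duration from the mean, in seconds."""
    durations = [b - a for a, b in zip(onsets, onsets[1:])]
    mean = sum(durations) / len(durations)
    return [d - mean for d in durations]

# "Click track": perfectly even 0.5 s beats (120 BPM).
steady = [i * 0.5 for i in range(9)]
# "Human drummer": same average tempo, with a little push and pull.
loose = [0.0, 0.49, 1.02, 1.50, 2.03, 2.49, 3.01, 3.52, 4.00]

print(max(abs(d) for d in beat_deviations(steady)))  # 0.0
print(max(abs(d) for d in beat_deviations(loose)))   # ~0.04
```

Plot those deviations over the length of a song and the difference is obvious at a glance: a flat line for the click-tracked take, a wandering one for the live drummer, which is exactly the contrast Paul’s charts make visible.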
Yes, indeed, the use of click tracks (and, I suspect, metronomes, drum machines, quantized loops, and the rest) seems to be sucking some of the rhythmic spice out of music. You’ve already heard complaints about the “loudness wars” that have crushed dynamic range. But after decades of drum machines and digital tech, there’s surprisingly little complaint about quantized rhythmic values. Okay, perhaps I should scratch that – some people complain an awful lot. What we haven’t had until now is a visual representation of what’s going on.
Note/update: Just for the record, I’m not opposed to quantized beats. We’re very big fans of techno around here. The post Paul wrote begins, “Sometime in the last 10 or 20 years, rock drumming has changed.” Note, rock drumming. I think there are all sorts of rhythmic possibilities in different musical expressions.
I could go on, but I’m not having a very smart day. (The evening pot of coffee is on; I have high hopes.) Instead, I’m curious what people think of Paul’s methodology. This was just a programmer following a line of thought with some experimental code, so I’m sure he wouldn’t claim it’s an entirely scientific method. But that said, do you think his conclusions are correct? Is there more to be said about this subject?
For that matter, would there be a way to do more scientific work along these lines?
As for the engine that powered this: the Remix API and SDK from Echo Nest should be capable of quite a lot more, from gorgeous animated visualizations like the album art for Matmos we saw last year to unusual new collaborative Web remix apps. The one catch: the analysis must be performed on Echo Nest’s servers, so you can’t use it without sending your content to the cloud. You do get the metadata back, though, so I still think some sort of self-remixing application might be possible, too. I’m eager to see a Java version of the SDK and not just Python, because that would make it easier to add 3D elements or work with tools like Processing. Can I get an amen?
Paul’s blog is well worth checking out for lots of commentary on a variety of music-enthusiast topics: