Saturday, January 21, 2012

Pazz and Jop 2011 - the results

My first thought when I clicked on the main results page was that somebody had hacked into the Village Voice's website and put their friend's band at #1. It took about ten seconds for me to realize that the whole thing wasn't some elaborate joke. Who the hell are tUnE-yArDs? Obviously I hadn't paid close enough attention when they (or rather, she) had turned up on other year-end lists (which I admit to not having perused too carefully, other than searching for some sort of end of year consensus). So I immediately watched the video for "Bizness" and was appalled. tUnE-yArDs are horrible.

What's even more amazing is that "w h o k i l l" received nearly the same number of points as (and more points per mention than) Animal Collective did in 2009 (and yes, there were nearly identical numbers of voters in those years). Does this mean that tUnE-yArDs were a phenomenon nearly on par with Animal Collective, who seemed virtually inescapable for all of 2009? In the past few days, I've seen that I wasn't the only one who was completely in the dark about tUnE-yArDs' "success". She seems to be the perfect poster girl for a year with such a surprising lack of consensus. Everyone who doesn't love her either hates her or has no idea who she is, and you could say the same about most of the P&J top ten albums this year.

Here are my top ten albums and their placements in the poll:

M83, "Hurry Up, We're Dreaming" (#19)
PJ Harvey, "Let England Shake" (#2)
Wolves in the Throne Room, "Celestial Lineage" (#144)
The Caretaker, "An Empty Bliss Beyond This World" (#84)
Death in Vegas, "Trans-Love Energies" (#419)
Modeselektor, "Monkeytown" (#1670)
I Break Horses, "Hearts" (#819)
Tim Hecker, "Ravedeath, 1972" (#39)
Surgeon, "Breaking the Frame" (#613)
Mogwai, "Hardcore Will Never Die, But You Will" (#116)

In Centricity rankings, I was #378, roughly the median of the 700-person voter list. This puts me in a position similar to 2009, when I voted for two very high ranking albums. That was enough to put me in the middle of the Centricity rankings even though most of the albums I voted for finished way down the list. Tim Hecker at #39 is astoundingly high. Even though the album has been a constant on the year-end lists of indie and electronic music sites, I don't think anyone would have predicted that it would finish in the top 40 of a generalist poll like P&J. The Caretaker also finished surprisingly high thanks to one of the highest points-per-mention ratios in the poll (Tim Hecker got more than three times as many mentions as The Caretaker).

The rest of my top ten took their fairly predictable positions. There are always ten or so Mogwai fans who manage to get their albums into the top 200. I cast the only vote for Modeselektor (note that #1670 = third last, behind only single mentions with six and five points), not unlike two years ago when I was one of two voters for the eponymous Moderat album. The P&J electorate simply do not care for them. If there was ever an album that I assumed I'd be the only one to vote for, it was "Breaking the Frame", but it garnered two other votes, one of them from longtime electronic music critic Todd Burns, who basically only votes for techno year in and year out.

As for my singles ballot, "Till the World Ends" had a somewhat shocking top ten finish (somehow this became the go-to Britney single), Gaga and Nicki Minaj were huge, and the rest were "songs" that only I voted for (except for "Try To Sleep", with five other mentions). I'm glad I stopped listing songs from my top ten albums list -- PJ Harvey placed eight songs from "Let England Shake", and tUnE-yArDs placed seven from "w h o k i l l". Sometimes you can't avoid voting for songs from the year's best albums, but when they clog up the songs list like this, it makes for a boring list.

Glenn McDonald noted that only four voters finished with Centricity higher than 0.7 this year, compared to 26 in 2010 (recall that a voter whose top ten matched the P&J top ten -- in content, not necessarily in the same order -- would have a Centricity score of 1). Let's quantify this a bit more, because in 2009 there were only 12 voters over 0.7, and that was the year of GAPDY (Girls, AnCo, Phoenix, Dirty Projectors, YYY), when loads of people were up in arms about Pitchfork taking over P&J and everyone supposedly voting for exactly the same five indie albums.

Consider the number of voters with Centricity equal to or above the following cutoffs in 2009-11:

Year    >= 0.75    >= 0.65    >= 0.6    Median
2011          0         13        36     0.295
2010         16         42        74     0.289
2009          7         19        30     0.233


The most telling drop off is above 0.75 -- nobody's individual taste even came close to representing the consensus, in the strictest sense. But if you take a more moderate definition of consensus, like > 0.6, then 2011 and the supposed hivemind of 2009 are very similar.
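For concreteness, counting voters at or above each cutoff is a two-liner. Here's a sketch in Python; the scores below are invented for illustration, not actual ballot data (the real poll has around 700 voters, each with a Centricity score between 0 and 1):

```python
from statistics import median

def cutoff_counts(scores, cutoffs=(0.75, 0.65, 0.6)):
    # For each cutoff, count how many voters scored at or above it.
    return {c: sum(1 for s in scores if s >= c) for c in cutoffs}

# Hypothetical mini-electorate of seven voters:
scores = [0.12, 0.29, 0.31, 0.44, 0.61, 0.66, 0.78]

print(cutoff_counts(scores))  # {0.75: 1, 0.65: 2, 0.6: 3}
print(median(scores))         # 0.44
```

Run over the full voter lists for 2009-11, this is exactly the tabulation in the table above.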

At the high end of the Centricity rankings, 2010 looks like a huge outlier; I think this is because the enormous number of votes for Kanye West skews the rankings. The median Centricity scores for 2010 and 2011 are almost identical, i.e. the outliers on the high and low ends of the consensus scale balance each other out. After starting to write this, I noticed that Glenn had tabulated something similar for 2008-11. Take out 2010, and the degree of consensus doesn't change much over the past four years. His "consensus" number is clearly weighted toward the high end of the Centricity rankings (i.e. the number of voters who tended to make consensus picks), whereas his "diversity" number is probably more like the standard deviation of all the Centricity scores, which would tend to be mostly clustered around the median for that year.

Two of my comments were printed this year, which was a nice surprise, especially considering I didn't have much time to work on them. Even when there isn't enough time to work on detailed comments, I always try to fit in something about my #1 album. I figure it's my duty to at least try to justify that pick. This year they picked my "weird" comment too, although the theme of getting older and failing to keep up with new music and remember the names of artists/songs turned out to be a mildly popular one, judging by the other comments expressing similar sentiments.

I haven't read all the essays and comments yet, but Chris Molanphy's The Incredible Shrinking Album is unique and, I think, very important. We don't tend to talk about sales figures when breaking down these lists, which is strange, because the story of the music industry as a whole is one of constantly shrinking sales. tUnE-yArDs' likely record as the weakest selling P&J #1 seems like it will be tough to break, but as Chris points out, in a year where two of the top twenty albums were free, downloadable mixtapes, and two of the top ten singles barely exist outside of YouTube, can we really say that with much confidence?

And finally, where art thou gone, b-factor? tUnE-yArDs scored a 6, Bon Iver an 8. Obviously this was never intended to be a predictive factor (just because your upcoming album has a b-factor of 20 doesn't mean you're destined for success), but rather a claim that an artist who does break through is quite likely to have a b-factor in a particular range. Is this what happens in years where consensus doesn't form? That is, can we expect a critical free-for-all where the usual rules of tastemaking don't apply?

Saturday, January 14, 2012

Clearing the critical bar

It may have started when I wrote about M83 and Spiritualized in my Top Albums of 2011. I can't remember exactly what sparked it, but I've been listening obsessively to Spiritualized all week -- live recordings, mainly. I revisited those spine-tingling Acoustic Mainline shows from '06-'07 (and a pristine quality recording of a special acoustic performance at a festival in Iceland in 2010). I heard a complete performance of "LAGWAFIS", recorded in New York in July 2010, that far exceeded my expectations (I'm not a fan of the complete album performance fad, but if you can count on any band to make it interesting and not just do a note-for-note run-through, it's Spiritualized). I listened to recordings of the fall 2001 tour with the audacious 13-member band. I happily suffered through poorly recorded early gigs from 1991 and 1999 (the latter was a one-off gig, I believe, with full choir accompaniment). In the pre-YouTube era you would always cycle through the same recordings of your favourite bands every time you wanted to go on a listening binge, but now you can discover a nearly bottomless pit of new treasures -- TV and festival performances, interviews, cell phone clips, etc.

What was I thinking when I compared M83 with Spiritualized? M83 are amazing, easily the most consistently excellent band of the past several years, but nobody tops Spiritualized when they're at their best. At least not this week.

But this post is supposed to be about M83's sudden jump into elite status, judging by the unexpectedly good showing (at least to me) of "Hurry Up, We're Dreaming" in many year-end polls and lists. But it could be applied equally well to Spiritualized c. 1997. And a host of other bands.

It's not like M83 were unknowns before this year. They had a healthy base of support among critics, enough to put "Saturdays = Youth" in the top 30 in Pazz and Jop in 2008, which indicates a high level of exposure but not an "everybody has an opinion on your music" level of exposure. So what exactly changed this year? Was "Hurry Up, We're Dreaming" a huge step forward compared to their other albums? Not really. You can argue about how to rank their albums or what have you, but most longtime followers of the band wouldn't say that their music suddenly took a jump into a different league.

A similar thing happened with Spiritualized in 1997. Now obviously the differences between "Ladies and Gentlemen We Are Floating in Space" and their earlier albums are a lot more dramatic compared with M83 and "Hurry Up, We're Dreaming". They went from making semi-instrumental space rock to more of a free jazz/rock hybrid complete with actual love songs, so it doesn't take much imagination to understand why they became a lot more accessible to a lot more people nearly overnight. I think a band can wake up one day and discover -- not just because of their sustained run of excellent music -- that it's their time to be in the spotlight. But still, why exactly did the breakthrough happen for Spiritualized in 1997, and not before or after?

A great album and a classic album are not at all the same thing. Bands make great albums all the time. Making a great album is largely a question of talent, and being able to reach the right audience who will appreciate that talent. But classic albums are much rarer. These are the albums that people other than the band's usual fans will hear and remember. It's not just about who has the most talent. It's also a question of timing.

There might be no better timing than the release of a debut album. Everyone loves debut albums. They're about the thrill of hearing a great new band for the first time, and the collective experience of discovering and getting excited about them at almost the exact same time as everyone else. The band will never be that flawless ever again: over time they'll hopefully make a lot more great music, but they'll also make a ton of mistakes and missteps. With their debut album, though, without the burden of having to overcome past screw-ups or reputations, they couldn't be more perfect.

On the other hand, the sophomore slump is real. That saying about how bands have their whole lives to write their debut album but only 18 months to write and record the follow-up is true in the sense that the second album almost never seems to be the best in any band's catalog.

By their third or fourth album, a band is somewhat established and has built up some name recognition. They have enough of a following to justify keeping the band going at least on a part-time basis, so they're not likely to fold up shop for lack of money or interest. Each new album will be hotly anticipated by fans, and by this point you can be fairly confident the band has a sustainable career ahead of them. Critics, meanwhile, don't want to get behind an obvious failure. It would be a bit embarrassing to heap loads of "album of the year!!" praise on a band and album that ceased to exist one year later. They also don't want to laud some journeyman band that's been around forever but has never achieved much success. You want to get behind a band when they're about to reach their peak, not after they've already peaked, and not after too many years when you're still waiting for them (perhaps hopelessly) to peak.

Further down the road, once a band has been around for a number of years and released a bunch of albums, overfamiliarity sets in and they aren't as newsworthy anymore. They reach the point where they cater mainly to their existing fan base (which nonetheless may be large and extremely loyal) but don't pick up many new fans. Non fans, including critics who are familiar with their music but wouldn't necessarily listen to it in their spare time, see them as a band to be respected but not adored or fawned over. They can be very successful, but not a phenomenon in the critical sense.

A picture is starting to form of two "sweet spots" in a band or artist's career. The first is the release of their debut album, and the second occurs about 5-7 years and 3-5 albums into their career.

So let's consider the following simple formula:

(# of years since debut + 1) x (# of albums)

I couldn't think of a funny acronym (CRIt-LOve Peak Prediction factor = CRILOPP factor ... ugh, forget it) so I'll call it the b-factor for now.

"Years since debut" is the number of years since the band's debut album was released. I thought about using the total number of years the band had been active, but most bands toil away in obscurity for a while without anyone really knowing who they are. The debut album marks the first time that a larger audience can be made aware of their music, and that's what we're trying to capture here. "Number of albums" is fairly self-explanatory, although one might need to differentiate between "proper" studio albums and other releases such as live albums, EPs or mini-albums (particularly for a new band), compilations, soundtrack work, etc. Obviously there is room for interpretation here, but I tried to stick closely to official studio albums as best I could.

A debut album, therefore, always has a b-factor of one. Very large b-factors (we'll try to quantify this later on) indicate that a band is past their likely peak, or at least past the time when they can reasonably expect a critical or popular breakthrough. Going by the sweet spot estimation of 5-7 years and 3-5 albums, we'd expect the ideal time for critical blowjob end-of-year chart topping success to occur for b-factors of 25 +/- 10.

[aside: I think it's more realistic to weight the two halves of the b-factor differently. (# of albums) feels more discriminatory than (# of years since debut), i.e. over-familiarity comes more from releasing a lot of albums than by taking a lot of time between albums. So maybe it would be better to use something like (# of years since debut +1) x (# of albums)^(3/2). But in the interest of keeping the calculation basic (something that doesn't require a calculator) and for using nice, easy to remember integer numbers, we'll stick with the simpler formula for now.]

As a test, here are the b-factors for the #1 albums on Pitchfork's year-end critics polls, from 1999-2011:

Year  Artist                  Album                                Years  Albums  b-factor
2011  Bon Iver                "Bon Iver"                               3       2         8
2010  Kanye West              "My Beautiful Dark Twisted Fantasy"      6       5        35
2009  Animal Collective       "Merriweather Post Pavilion"             9       9        90
2008  Fleet Foxes             "Fleet Foxes"                            0       1         1
2007  Panda Bear              "Person Pitch"                           8       3        27
2006  The Knife               "Silent Shout"                           5       4        24
2005  Sufjan Stevens          "Illinois"                               5       5        30
2004  Arcade Fire             "Funeral"                                0       1         1
2003  The Rapture             "Echoes"                                 0       1         1
2002  Interpol                "Turn on the Bright Lights"              0       1         1
2001  Microphones             "The Glow Pt. 2"                         3       5        20
2000  Radiohead               "Kid A"                                  7       4        32
1999  The Dismemberment Plan  "Emergency and I"                        4       3        15


Out of thirteen albums, four are debuts, and another seven fall within a fairly narrow b-factor range from 15-35.
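The formula is simple enough to express as a one-liner, so here's a quick Python sketch that spot-checks a few of the table entries. The weighted variant from the aside is included too, purely as an illustration of the alternative:

```python
def b_factor(years_since_debut, num_albums):
    # (# of years since debut + 1) x (# of albums)
    return (years_since_debut + 1) * num_albums

def b_factor_weighted(years_since_debut, num_albums):
    # The variant from the aside, penalizing album count more
    # heavily: (# of years since debut + 1) x (# of albums)^(3/2)
    return (years_since_debut + 1) * num_albums ** 1.5

# Spot-checks against the table of Pitchfork #1s:
assert b_factor(3, 2) == 8    # Bon Iver, "Bon Iver" (2011)
assert b_factor(6, 5) == 35   # Kanye West, "MBDTF" (2010)
assert b_factor(9, 9) == 90   # Animal Collective, "MPP" (2009)
assert b_factor(0, 1) == 1    # any debut album
```

Note that under either formula a debut album always scores exactly 1, which is why the four debuts in the table sit in a category of their own.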

Getting back to the examples discussed at the start of this post, M83's "Hurry Up, We're Dreaming" has a b-factor of 55 (note that I didn't count "Digital Shades, Pt. 1") and Spiritualized's "Ladies and Gentlemen, We Are Floating In Space" has a b-factor of 18. If we consider recent Pazz and Jop polls, we see many of the same patterns: 15 for TV on the Radio's "Dear Science" (2008), 50 for Outkast's "Speakerboxxx/The Love Below" (2003), and 32 for Wilco's "Yankee Hotel Foxtrot" (2002) (discounting the Woody Guthrie covers albums with Billy Bragg, but even if they were counted, YHF would only score a 48). The numbers seem to suggest that bands peak and/or have their critical breakthrough at b-factors in the 15-35 range, with a significant tail of the distribution extending up to around 50-60.

I doubt I'm the first to notice all this, in which case, consider this article to be my musings on the subject.

Obviously this is not meant to be some kind of grand theory of music crit everything. There are a number of instances where the reasoning behind the b-factor wouldn't really apply, for example:

-- Side projects or solo records from established artists. Example: Michael Jackson. "Thriller" was his sixth solo album, but his second as the "adult" Michael who became a megastar. However, he was already a household name with the Jackson 5 before that. Defining his "debut" is a murky issue. A different example: LCD Soundsystem. "Sound of Silver" (#1 on P&J in 2007) has a b-factor of only six, but James Murphy and LCD Soundsystem had been around for years prior to their official debut album, making their name via singles and DFA Records.

-- Megastars in general. Kanye's "My Beautiful Dark Twisted Fantasy" falls into the 15-35 range, but all his albums have been critical and popular smashes. Bon Iver may fall into this category as well.

-- Career resurgences by artists such as Bob Dylan, Brian Wilson, or PJ Harvey's "Let England Shake" (b-factor 160, which would be higher if you count "4 Track Demos" and the John Parish collaborations). However, you could easily argue that an artist making their second, third, etc. breakthrough is already in a different category, and that b-factors should only be relevant for an artist's initial breakthrough.

What would be a "large" b-factor, where overfamiliarity sets in and a breakthrough becomes extremely unlikely? Based on the examples considered here, anything over 50-60 is already unlikely to break through; a safer cutoff would be about 100. You can argue that Animal Collective are an exception because they're not a "band" in the usual sense of the word (members come and go from album to album, and most of their albums don't feature everyone in the collective), but I'm not sure I'd subscribe to that reasoning. "Merriweather Post Pavilion" and its extremely high b-factor (90) may be the kind of breakthrough we won't see again for a long time.

Friday, January 13, 2012

Diary of Musical Thoughts Podcast Episode 7, A 2011 mix

I've been making year-end mixes for the past few years, but this is the first time I've posted any of them anywhere. And now that Mixcloud has gotten rid of the 100 MB size limit, I'm going to be using it as my platform for posting mixes in the near future.

I like these year-end mixes to have a decent flow to them, which means I have to map out the track selection and ordering fairly precisely, contrary to my usual preferred method of making mixes. The idea is to fit in a bunch of different styles of music, and to go beyond my known favourite albums and songs, i.e. not to take just the best tracks from my top ten lists or whatever. It's more of a "what music sounded like to me in 2011" mix.

Tracklist info is in the comments, and on the mixcloud page!

Wednesday, January 04, 2012

Biggest Tours of 2011

Pollstar published the year end top 25 tours for North America and worldwide.

Bands are always ranked in order of total gross ticket sales, but wouldn't it make more sense to rank them by total number of tickets sold? Total gross largely reflects how high ticket prices are, and if Celine Dion wants to gouge her fans by charging an average of $166 per ticket, then that's more reflective of what the typical Celine fan earns in a year, not of how "big" her tour was or how many fans she has. On the North American list, her tour finished 10th in money but only 23rd in total tickets sold.

U2, Taylor Swift, and Kenny Chesney were far and away the top N.A. tour performers in terms of both money and tickets sold. But Jason Aldean, Journey, Katy Perry, and Trans-Siberian Orchestra finished 4th, 5th, 6th, and 8th in total tickets sold, while the top money earner among them was Journey at #15. They have huge fan bases and they made an effort to keep prices affordable. They deserve some credit for that.

Even average gross per show would be better than total gross as an indicator of how "big" a band was in a given year. IOW, did they tour constantly and make their money from playing arenas, or are they big enough to play in the world's largest stadiums? U2 easily topped the chart in that category, earning more than $7M per show, followed, oddly enough, by the Dave Matthews Band Caravan jam band traveling circus.
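To make the point concrete, here's a sketch of how the three rankings (total gross, tickets sold, gross per show) can disagree. All three tours and every figure below are invented for illustration; none of this is from the Pollstar list:

```python
# Hypothetical tours: (name, total gross in $M, tickets sold in
# thousands, number of shows).
tours = [
    ("Arena Act",   120, 1500, 80),  # cheap tickets, tons of shows
    ("Legacy Act",  150,  600, 20),  # expensive tickets, few shows
    ("Stadium Act", 200, 1800, 30),  # big venues, moderate prices
]

by_gross    = sorted(tours, key=lambda t: -t[1])
by_tickets  = sorted(tours, key=lambda t: -t[2])
by_per_show = sorted(tours, key=lambda t: -(t[1] / t[3]))

print([t[0] for t in by_gross])     # ['Stadium Act', 'Legacy Act', 'Arena Act']
print([t[0] for t in by_tickets])   # ['Stadium Act', 'Arena Act', 'Legacy Act']
print([t[0] for t in by_per_show])  # ['Legacy Act', 'Stadium Act', 'Arena Act']
```

The "Legacy Act" here plays the Celine Dion role: second in gross, last in tickets sold, and first in gross per show, depending entirely on which yardstick you choose.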

Kanye West and Jay-Z somehow got away with charging more per ticket than anyone except Celine Dion (whose fans are mostly rich white people), Paul McCartney (possibly the biggest legend touring today, who only played nine N.A. shows), and the Dave Matthews Band Caravan (which was a series of three day festivals so it doesn't really compare). Congrats to Kanye and Jay-Z for loving money so much.

Cirque du Soleil have three entries in the N.A. top 25. Really? This made me realize that average tickets and gross are calculated per city, not per show, which is also a bit odd.

Worldwide, U2 are once again the biggest, and there aren't too many surprises except for maybe Foo Fighters. Are they really the 12th biggest band in the world? I wouldn't have guessed (and with one of the lowest average ticket prices on the top 25). Everyone outside of N.A. is probably looking at the list, scratching their heads, and saying "who is Kenny Chesney?" (at #8, he was the highest ranking act who didn't play a concert outside of N.A.).

The Take That reunion tour grossed just $8M less than U2, and played to nearly as many fans. Just, wow. And Elton John, who is no spring chicken, played 110 shows in 2011, more than anyone on the top 25 except for Katy Perry and Cirque du Soleil (which is really a category unto itself).