Tuesday, September 29, 2009

2004 was a strange year, plus Pitchfork's Best 200 Albums of the 2000's AKA T-3 days to "Kid A"'s inevitable win

Or maybe you haven't heard about how parts of "Kid A" were made with computers, thereby making it a prophetic statement on the futuristic info-age that we now live in. Correlation vs. causality, people! Anyway, let's just prepare to take our medicine and move on to the '10s.

I'm getting some bad vibes so far from that list (through #200-101). There are some placings that seem inconsistent with previous charts, for example, the Mountain Goats' "The Sunset Tree," which was #24 in 2005 but #102 of the entire decade, ahead of a few other albums that ranked much higher on the '05 chart. Not to pick on the Mountain Goats, but I wasn't aware of any slow-burning critical consensus behind that record, so I found that placing a bit odd, though I won't lose sleep over it. After all, we're looking at, on average, twenty albums per year for this list (actually more from 2000-8, since 2009 will surely be underrepresented), and these lists aren't compiled by crunching statistics from past charts. They represent the opinions of actual people (and not even the same people who were writing for the site in 2005) whose feelings can change over the years.

No, what's weird is the inconsistency between the placings of albums on the 2008 chart and their placings on the Top 200. For example, TVOTR's "Dear Science" (#6 in 2008, #140 of the decade) was outranked by Erykah Badu's "New Amerykah Part One" (#13 in 2008, #133 of the decade) and M83's "Saturdays = Youth" (#8 in 2008, #111 of the decade). This represents the opinions of virtually the same people who compiled the 2008 list, and how much could their opinions have changed in the past nine months? Is a TVOTR backlash setting in as a reaction to that album's more mainstream critical acceptance (e.g. winning P&J 2008)? If so, shouldn't a serious critics' list be above those sorts of about-face evaluations? Aren't these lists supposed to stand the test of time, to serve as critical benchmarks? People do rely on P&J as a benchmark of the "critical mindset" for a given year, and you can quibble about details (who did or didn't vote, genres not represented, points schemes), but given the size of the poll, it's hard to think of a better representation. Where's the value in lists drawn up by critics who change their minds every year?

Maybe I'm overanalyzing (probably). Maybe I shouldn't read too much into half of a list. Maybe I'm the only one who thinks that #200-101 contains a lot of second- and third-best albums by a lot of acts, suggesting that more albums by those acts will be appearing in the top 100, making for a somewhat predictable list, not to mention one that will be lacking in variety due to all of the repeat acts. But the bad vibes aren't going away ...

As for my own ongoing introspection, I'm not sure how to evaluate 2004. It's looking like a freakishly abnormal year with a great many good albums but very few great ones. A puzzling combination, to be sure. Or consider this: for me, that year was a turning point, when the majority of my new music consumption shifted from CDs and music stores to computers and the internet. Suddenly, I needed to find a way to process more music in a shorter time span than ever before, and it's possible that I couldn't adjust that quickly. I heard a lot of good music, but I was overwhelmed and couldn't connect strongly with most of it. By 2005, and in the following years, my musical body clock found a way to handle the volume, leaving 2004 as my personal transitional anomaly.
