Mathematically proving which movies are best

09 May

Coming out of The Avengers this past weekend, my friends and I were giddy. We all read comic books as kids (some of us still do), so we all loved the movie. Indeed, we were quick to praise it as the best film of all time.

We were, of course, suffering from the sort of irrational exuberance that one typically feels after having one’s adrenal glands stimulated for several hours. Once post-film sobriety settled in, we did return to sanity. As good as The Avengers was and is, surely it’s not the best movie of all time. Or is it?

For much of cinema’s history, the quality of a film has been a purely subjective discussion. Beauty is, after all, in the eye of the beholder and everyone likes different things.

But I started wondering if there is a way to mathematically prove which movies are good, if not the best. In an age where data on everything is gathered and mined, surely there’s some way to apply actual science to such a subjective debate.

When it comes to movies, I believe there is. Or rather, such a system is emerging.

First, some factors should not be counted. Oscar nominations, for example, should be excluded from any empirical attempt because they are decided by a relatively small group of people. Films can also rack up wins in technical categories, which doesn't necessarily make them good.

Box office take or any other revenue should also be irrelevant. Otherwise, Michael Bay would have several of the best movies ever made. Ahem. Excuse me, I think I just threw up on myself.

Turning to resources that should be counted, there are two large websites that should be integral: Metacritic and the Internet Movie Database. Metacritic launched in 2001 as an aggregation site for reviews, while IMDb started in 1990 as a general repository of film information. Metacritic compiles reviews from well-respected critics, assigns them a numeric value out of 100, then averages them. The Avengers, for example, has a score of 69, which means the 42 critics counted (at the time of this writing) have been generally positive about it.

IMDb, on the other hand, asks regular people to assign ratings to movies out of 10. The Avengers scores considerably better with the general public, with close to 100,000 website visitors giving it an average of 8.8 as of this writing. Scores on both sites do change slightly over time as more reviews are added to the averages.

Putting the two together, I think, is a nice composite. Metacritic is a measure of what professional film critics think while IMDb provides insight into what the general public feels. Most importantly, both sites have crowd-sourcing at their cores – by averaging things out between dozens of critics or thousands of movie-goers, individual preferences and biases are smoothed out and a general picture emerges. So, if an average Metacritic or IMDb score is high and there are enough reviews in the sample, chances are the movie is good. The more reviews, the more accurate the measure.

For kicks, I thought I’d put the system to the test. I grabbed the top 100 movies from both Metacritic and IMDb and set to averaging the scores (I multiplied IMDb’s out-of-10 ratings by 10, so an 8.2 became an 82, to put both sites on the same 100-point scale).

I did come across one big problem: the further back you go, the more inconsistent Metacritic’s sample sizes become. While modern movies have dozens of reviews to work with, older films sometimes have only a few, and before 1970 things get sketchier still. Some classic films, such as The Manchurian Candidate (1962), have scores based on a decent number of reviews (94, from 14 critics), while others have none at all. Citizen Kane (1941) is often called the best film of all time, yet it has no Metacritic score.

This isn’t a problem on IMDb, where older movies often garner as many reviews as new ones. Citizen Kane, for example, gets an 8.6 from nearly 175,000 reviewers.

With that issue in mind, I decided to limit my sampling to films made from 1970 onward, after which review sample sizes are generally good.
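The whole system can be captured in a few lines of code. This is a minimal sketch of the approach described above; the film list is illustrative sample data, except The Avengers’ figures (Metacritic 69, IMDb 8.8), which are quoted earlier in this post.

```python
# Composite movie scoring: Metacritic score (out of 100) plus the
# IMDb rating scaled from 10 to 100 points, limited to post-1970
# films that have scores on both sites.

def composite_score(metacritic, imdb):
    """Scale the IMDb rating to 100 points and add the Metacritic score."""
    return metacritic + round(imdb * 10)

films = [
    # (title, year, Metacritic score, IMDb rating) -- sample data
    ("The Avengers", 2012, 69, 8.8),
    ("Citizen Kane", 1941, None, 8.6),  # no Metacritic score: excluded
]

ranked = sorted(
    ((title, composite_score(mc, imdb))
     for title, year, mc, imdb in films
     if year >= 1970 and mc is not None),  # post-1970, scored on both sites
    key=lambda pair: pair[1],
    reverse=True,
)

print(ranked)  # [('The Avengers', 157)]
```

With real top-100 data from both sites plugged into `films`, sorting by the composite is all it takes to produce the list below.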

It should come as no surprise that the critics’ favourites differed significantly from the general public’s. Professional movie reviewers, in general, seem to really like foreign films, while audiences gravitate toward action and genre films. Movies such as Fight Club and Aliens are on IMDb’s top list, but not on Metacritic’s counterpart.

The top movie on IMDb is The Shawshank Redemption, which doesn’t even show up in the critics’ top 100. Audiences liked The Empire Strikes Back better than the first Star Wars, which is the reverse of what critics thought.

One bizarre thing I found while fiddling with these numbers was the inexplicable absence of a Metacritic rating for Back to the Future. IMDb users generally liked the movie, giving it an 8.5, which was enough to make the top 100. I expected critics to score it significantly lower, but there’s no rating on the site at all. It’s the only highly rated, post-1970 film where this happened.

Another interesting tidbit I discovered seems to speak to the notion of not considering Oscar wins as a barometer of a good movie. Only 13 of the top-rated movies on the composite list won the Best Picture award, which is a pretty low percentage.

Here then, with no further ado, are the top 10 best movies since 1970, according to the composite scores of the two sites:

  1. The Godfather (192)
  2. Pulp Fiction (183)
  3. Fanny and Alexander (182)
  4. The Lord of the Rings: The Return of the King (182)
  5. Schindler’s List (182)
  6. The Conformist (181)
  7. Pan’s Labyrinth (181)
  8. Shoah (180)
  9. A Separation (179)
  10. Spirited Away (179)

There you have it. According to my rudimentary system, The Godfather is the best movie made since 1970, beating out The Avengers (157) by a relatively wide margin. If you want to see the complete top 100, click here (opens PDF file).

I’m sure there are other issues with this system and I’d love to hear additional suggestions in the comments below. Nevertheless, trying to apply data to what has always been a subjective discussion is a fun exercise if you’re into such nerdy things. As sample sizes grow and widen in the future, I suspect such conversations will become even less subjective.


Posted on May 9, 2012 in movies


