There's no perfect way to quantify whether or not a movie is underrated. This is largely due to the way that the two most commonly cited rating aggregators work.
IMDb, the most commonly visited movie site on the internet, generates an average score based on user feedback. While that does a pretty good job of measuring a movie's popularity, it doesn't reliably measure the movie's quality.
The other commonly referenced review site is Rotten Tomatoes. Their standard metric is a simple percentage of accredited movie critics who give the movie a positive review, and it is the metric most commonly cited by the media. There is a litany of problems with their methodology as well, but given that some quality control measures are in place (some of which I take issue with), I will use the Rotten Tomatoes (RT) score throughout the article.
In order to quantify how underrated a movie was by the critics, I have assigned number grades to the 40 or so best movies I've seen this year and compared them with their RT scores. Each movie's number below is its RT score minus my grade, so the more negative the number, the more underrated the film. Here are the results.
10) The American -12 (RT: 65, Me: 77)
9) Ondine -20 (RT: 70, Me: 90)
8) Valhalla Rising -21 (RT: 69, Me: 90)
7) The Book of Eli -25 (RT: 48, Me: 73)
6) Dinner for Schmucks -26 (RT: 44, Me: 70)
5) Never Let Me Go -27 (RT: 66, Me: 93)
4) Prince of Persia -37 (RT: 36, Me: 73)
3) The Wolfman -47 (RT: 33, Me: 80)
2) The Tourist -57 (RT: 20, Me: 77)
1) Jonah Hex -64 (RT: 13, Me: 77)
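For anyone curious how mechanical this ranking really is, it boils down to one subtraction and a sort. Here is a minimal sketch in Python (the titles and scores are the ten from the list above; the variable names are my own):

```python
# Each entry: (title, Rotten Tomatoes score, my grade)
movies = [
    ("The American", 65, 77),
    ("Ondine", 70, 90),
    ("Valhalla Rising", 69, 90),
    ("The Book of Eli", 48, 73),
    ("Dinner for Schmucks", 44, 70),
    ("Never Let Me Go", 66, 93),
    ("Prince of Persia", 36, 73),
    ("The Wolfman", 33, 80),
    ("The Tourist", 20, 77),
    ("Jonah Hex", 13, 77),
]

# "Underrated" delta: RT score minus my grade.
# More negative means the critics underrated it more.
ranked = sorted(movies, key=lambda m: m[1] - m[2])

for rank, (title, rt, me) in enumerate(ranked, start=1):
    print(f"{rank}) {title} {rt - me} (RT: {rt}, Me: {me})")
```

Sorting ascending puts the most negative delta first, which is why Jonah Hex (-64) tops the list and The American (-12) barely makes the cut.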