Movie Review Aggregators Grow in Popularity, But Do They Matter?

Rotten Tomatoes is the best known of three movie review aggregators, but Metacritic and Movie Review Intelligence are increasingly in the mix

With media splintering more with each passing year, movie review aggregator sites like Rotten Tomatoes and Metacritic have grown in popularity, regularly popping up in studio marketing materials and box office reporting.

But how important are they? Are they any more reliable gauges of quality – or success – than traditional film critics were in their heyday?

Depends on who you ask. Studios certainly pay attention to top aggregators, even if they don't always like to assign much importance to them. Film critics like the exposure on Rotten Tomatoes, Metacritic and Movie Review Intelligence, but they are uneasy with some of the methodologies used.

“I don’t know if they mean anything,” one studio distribution chief told TheWrap, before proceeding to rattle off scores that the studio’s movies had recently received.

Another executive said that studios pay attention for a very simple reason: because viewers do.

“In the old days, you used to follow a critic,” the executive said, but that has changed. “People don’t want to read 40 reviews, so in less than 60 seconds, you can get a sense that 80 percent of the critics thought this movie was great – or not.”

Also read: Rotten Tomatoes, a Division of Warner Bros. — Can It Be Unbiased?

Rotten Tomatoes gets some 9 million unique hits a month and is the most populist of the top three aggregators. Metacritic and newer aggregator Movie Review Intelligence draw from a smaller critics pool and audience; Metacritic gets about 2.5 million worldwide hits and Movie Review Intelligence less than 5 percent of that.

Another site, the Movie Review Query Engine, offers aggregation along with other services. While that site offers information for consumers, the company itself is primarily a business-to-business firm.

Sites like Rotten Tomatoes, on the other hand, concentrate on moviegoers.

“I know that almost everybody in the industry is checking our site,” Matt Atchity, Rotten Tomatoes’ editor-in-chief, told TheWrap. “I know that filmmakers are checking our site the day a movie opens, because they’ll challenge any review that they think is borderline.”

Film critics like the aggregators for broadening their reach, but even they have mixed feelings about them.

"They are good for critics in that they call attention to our work and find us more readers," Roger Ebert, the Pulitzer Prize-winning movie critic for the Chicago Sun-Times, told TheWrap in an email.

He’s less convinced of the value of the aggregated rankings.

"People quote the Tomato Meter," Ebert said. "But what does it really mean?"

The Tomato Meter gives a very simple read on how many of its critics gave the movie in question a favorable, or “fresh,” rating. Metacritic and Movie Review Intelligence take the nuances of individual reviews into account. Their scores are generally similar – though Rotten Tomatoes’ methodology lends itself to higher high scores and lower low scores.

“We read through all the reviews we can find and give everything a simple up-or-down rating,” Atchity explained to TheWrap. To be considered fresh, a movie needs to hit 60 percent; below that, it’s rotten.
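The arithmetic behind the Tomato Meter is simple enough to sketch. Assuming each review has already been judged fresh or rotten, a minimal illustration (the function and its inputs are hypothetical, not Rotten Tomatoes’ actual code) looks like this:

```python
def tomato_meter(reviews):
    """Percentage of reviews marked 'fresh' -- an illustrative sketch,
    not Rotten Tomatoes' actual implementation."""
    fresh = sum(1 for r in reviews if r == "fresh")
    score = round(100 * fresh / len(reviews))
    # A movie is "fresh" overall at 60 percent or above; otherwise "rotten".
    return score, "fresh" if score >= 60 else "rotten"

# 19 favorable reviews out of 20 yields a 95 -- the kind of score
# "The King's Speech" received.
print(tomato_meter(["fresh"] * 19 + ["rotten"]))
```

Note that the score says nothing about how enthusiastic each review was, only which side of the up-or-down line it fell on.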

The site looks at the work of about 500 critics, although most movies get at most 250 reviews. If there’s a question about whether a review is positive or negative, the staff will ask the critic. And if the critic says it’s neither, Rotten Tomatoes (which is owned by Flixster, in turn owned by Warner Bros.) will push for an answer.

“If you get a 90 percent Rotten Tomatoes, it sounds like a four-star review,” the executive said. “But what it really means is that 90 percent of the people who reviewed it, reviewed it favorably.”
So studios sometimes tout their Rotten Tomatoes score rather than individual reviews because the Tomato Meter can look better.

Indeed, “The King’s Speech,” which got a 95 from Rotten Tomatoes, is considered “fresh,” and so is “Big Miracle,” Universal’s family film, which got a 70.

But while both are “fresh,” most readers would draw a distinction between a 70 and a 95.

Shawn Levy, the film critic with The Oregonian, told TheWrap that he’s happy to be included on these aggregators.

In fact, his newspaper changed its publication schedule to accommodate them, running reviews on Thursday evenings instead of Friday mornings, so they can be included on Metacritic. But he suggested that the huge number of critics that Rotten Tomatoes draws from detracts from the site.

"By throwing open the floodgates, some of these sites have diluted the field," he said.

Atchity does acknowledge that “there is less space for subtlety” in Rotten Tomatoes.

It used to be fairly easy to be a reviewer for the site. Any member of a critics’ organization was eligible.

“That’s no longer the case,” Atchity said. “That’s merely a part of the qualification.”

Like Rotten Tomatoes, Metacritic’s staff of six reads reviews and decides whether they’re positive or negative. But they try to address how good – or how bad – a movie is.

Sometimes it’s easy: if critics give a movie four stars out of a possible five, Metacritic automatically gives the review an 80. But 40 percent of Metacritic’s reviewers do not assign scores, so the staff gives the review a number itself.

“There is a process of us reading these reviews and evaluating them,” Marc Doyle, Metacritic’s editor-in-chief, told TheWrap.

Metacritic, which is owned by CBS Interactive, reads far fewer reviews than Rotten Tomatoes – 44.

And while at Rotten Tomatoes, all critics are created equal, Metacritic assigns scores to critics as well as to their reviews.

The greater a critic’s stature, the more influential that critic’s opinion will be on the Metacritic score.
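The two steps described above, converting a starred review to a 0-100 number and averaging reviews with more weight given to prominent critics, can be sketched as follows. The weights here are invented for illustration; Metacritic evaluates reviews by hand and does not disclose its exact formula.

```python
def stars_to_score(stars, max_stars=5):
    """Convert a starred review to a 0-100 score, e.g. 4 of 5 stars -> 80."""
    return round(100 * stars / max_stars)

def weighted_average(reviews):
    """Weighted average of (score, weight) pairs, where a higher weight
    stands in for greater critic stature. Weights are hypothetical."""
    total_weight = sum(w for _, w in reviews)
    return round(sum(s * w for s, w in reviews) / total_weight)

# Two heavily weighted 80s pull the result well above the lone 40:
# (80*2 + 80*2 + 40*1) / 5 = 72.
print(weighted_average([(80, 2.0), (80, 2.0), (40, 1.0)]))
```

An unweighted average of the same three scores would be about 67, which is the whole point of weighting: the prominent critics’ opinions move the number more.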

Movie Review Intelligence also draws on a relatively small number of critics – 51. And like Metacritic, it assigns each critic his or her own level of importance.

But instead of weighting critics based on stature, it looks at their readership.

Anybody working for a large-circulation publication – even someone with no previous experience – would be considered especially important in Movie Review Intelligence’s formula.

“The mainstream taste needs to be heard as part of the mix,” David Gross, the site’s editor and publisher, told TheWrap. “I proudly include People magazine and Us Weekly and AP, along with the Wall Street Journal, New York Times, LA Weekly and Slant magazine. There are some wonderful small publications that are doing fantastic critical work – but I think all voices need to be heard.”

So how do their ratings stack up?

All three gave high marks to “The Artist,” the highest-scoring best picture nominee across all three sites: an 89 from Metacritic, a 92.2 from Movie Review Intelligence and a 97 from Rotten Tomatoes. In a potentially ominous sign for its Oscar prospects, the sites were far more mixed on “Extremely Loud & Incredibly Close,” which got a 46 on Metacritic, a 62.2 on Movie Review Intelligence and a 45 at Rotten Tomatoes.

Gross said that for the time being, he's pleased with his site's niche.

"The other sites are giants," Gross told TheWrap, "and so we’re happy to be the independent.”
