The best review is one that is closest to how you ultimately feel about the topic. Consequently, individual critiques aren’t for everyone. Which? It’s hard to decide before seeing it, let alone its subject. Therefore, you must turn to reviews themselves to tell you how you might feel about something. How do you choose reviews that accurately display your feelings? Easy. Work backwards. Start by imagining how you would feel, then pick the reviews that reflect that.

There is no shortage of everyone telling you how you should feel by proxy, if not in proximity. Without critics, we’d have to make decisions for ourselves. Value judgments, if not valuable ones. Life is composed of decision makers and decision takers. Individuals harness both. Several things are out of your control, if not out of control. Consensus advises we don’t choose to be conceived. “You don’t have a say? You don’t say.”

We take it on faith that birth was elected in good faith. As an idea, you are decidedly decided upon or against the grain. Being is unbecoming if not becoming. You get an A for affordable, not effort. Making the grade is above your pay grade. Everybody likes to trust in the Creator, if not their creators. Authority is dubious when it doesn’t know what it’s doing except itself. In turn, take the salt of the Earth with a grain of salt.

Choice is determined, but experience is indeterminate. We do have a little agency, if not free agency. The thoughts are not a figment of your imagination. Making up your mind isn’t made up. It isn’t automated autonomy. The question is whether you will, not if you can. Ideas travel fast, but thinking takes time, if not just your time. I’m capable of snap judgment or synapse judgment. The divide is favorable. I seldom consider everything.

One of the vital issues is opportunity. Few have access to sufficient time. Mercifully, the species neutralizes temporal shortcomings. I have the memory for impressions even if I don’t have the memory of getting them myself. I make room for space, if not time. You may not have a person occupy your mind and think on its behalf, but they can have your thoughts themselves. Instead of developing opinions, you outsource their formation to a web designer.

In other words, admit views of expert ideologues. The dispute conveys possibility. Select amid myriad persuasions or persuasiveness. Let the Right One In. How do you know better? Critics endure bad movies, but we endure bad critics. Luckily, cyberspace extends various averages. I like IMDb, Rotten Tomatoes, and Metacritic. By unifying scores, you generally get a tolerable sense of merit. Each imparts different information. It’s important to recognize the difference.

This distinction has been discussed at length. I don’t need to mull, but it’s worth mentioning. IMDb is useful in dispensing details. Cast, crew. Specs, too. Its rating is user-generated, meaning the score depends on who acts. I’d expect the passionate disciples to show. Those who enjoy are likeliest to score. They’ll probably score high. If you’re lukewarm on something, you probably don’t care enough to report it. In turn, popular movies are often rated higher.

Take Interstellar and 2001. Both are contentious. I think the former is excellent, but prefer the latter. More importantly, I suspect critics share the preference. Interstellar has a Metascore of 74 and an IMDb rating of 8.6. On RT, the movie is at 71% (67% Top Critics). Odyssey, on the other hand, is at 86 on Metacritic, 96% Fresh, and 8.3 on IMDb. Genre fans turn out quickly and boost results. Interstellar (804,949 votes) holds over twice the votes of 2001 (396,391).
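The three sites use different scales, so any side-by-side comparison implies a normalization step. A minimal Python sketch using the figures above; the equal-weight blending is my own assumption for illustration, not any site's formula:

```python
# Put the three sites' scales on a common 0-100 footing. Equal-weight
# blending is an assumption for illustration, not any site's method.

def normalize(metascore, tomatometer, imdb):
    """Return all three scores on a 0-100 scale (IMDb uses 0-10)."""
    return {"Metacritic": metascore, "RT": tomatometer, "IMDb": imdb * 10}

def blended(scores):
    """Unweighted mean of the normalized scores."""
    return sum(scores.values()) / len(scores)

interstellar = normalize(74, 71, 8.6)   # figures quoted above
odyssey = normalize(86, 96, 8.3)

print(round(blended(interstellar), 1))  # 77.0
print(round(blended(odyssey), 1))       # 88.3
```

Even blended crudely, the critic-favored Odyssey comes out ahead; the gap between the two films is wider than any single site shows.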

The immediate gush of fangirls and boys used to be even more forceful when they could rate films before release. Users from 4chan and Reddit hiked The Interview to 9.9 prior to seeing it. Evidently, this was a gesture of protest. Nevertheless, the inflated early score evened out long-term. Bias is manifest in The Dark Knight. Users skew male, but, by far, the heftiest lot was 18-29. Roughly 592,775 of 1,106,676 male voters were aged 18-29 and rated DK 9.2. The second-biggest group is males 30-44, with 390,903.

By comparison, if not comparably, 117,489 of 194,383 female voters, aged 18-29, decided on 8.9. IMDb filters insofar as it doesn’t count irregular votes in the Top 250 films, for example. Users must vote regularly on IMDb before their votes count. In plenty of cases, ratings flatten over time. My flaw is consistency. Lawrence of Arabia is 8.4 by 128,371 males and 17,079 females. A medium can be accurately reviewed by fewer people, if not lesser minds. Ipso facto, miscellany.
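Those demographic splits roll up into the headline number as a vote-weighted mean. A sketch using only the two Dark Knight buckets quoted above; the 30-44 male bucket is left out because its average isn't given:

```python
# Vote-weighted mean across demographic buckets. Counts and averages are
# the Dark Knight figures quoted above; buckets without a quoted average
# are omitted.

buckets = [
    (592_775, 9.2),  # males 18-29
    (117_489, 8.9),  # females 18-29
]

total_votes = sum(n for n, _ in buckets)
weighted_mean = sum(n * avg for n, avg in buckets) / total_votes
print(round(weighted_mean, 2))  # 9.15 -- the larger bucket dominates
```

With five times the votes, the 18-29 males pull the combined figure almost all the way to their own 9.2.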

You want different views. It’s easier to decipher harmony, but cacophony counts, too. Unique views in equivalent reviews: solid consensus. Insofar as the fans do, it’s those who care to review. Trolls care for the wrong reasons. Surely, there is a difference between trolls and passionate distaste, if not for them. Some trolls are positive. There is a happy medium between derision and unwarranted praise. That medium is the real thing.

Underneath the sappy or negative film is a movie. In order to see that, you need to see relative unity. Contemporary relativism is at issue. Majority. It’s not a thing-in-itself, it’s just a thing. Hopefully, it’s a just thing. Minority is not error. In history, many get it wrong. The many or the most. There is something to be said for majority depending on what it says itself. If the critics agree, they have basis. In other words, they must say the same to do so.

It’s up to you whether that basis is solid. The danger in relativism is error. Individuals have unique tastes. Different ingredients appeal to different palates. As such, what one person identifies as good, another dismisses as bad. The perception is varied. With entertainment, however, you see indelible egotism like invisible ink: because it’s subjective, there is no correct or even better answer.

Even if validity is relative, you still weigh justification. Multiple views have a right to truth, but we can objectively discern similarity. The right to truth isn’t being true. On IMDb, it’s difficult to suss the outliers and suss out the liars. Rotten Tomatoes is fruitful in particular or general. Critics are taken at their words, but they also take a number. RT labels everything “fresh” or “rotten”: 60% and above or 59% and below, respectively. 75% and up is “Certified Fresh.”

The score, the “Tomatometer,” identifies the percentage of critics giving positive reviews. The Tomatometer is suspect. It offers the critical vibe, not their scores. A 95% review counts the same as a 60% one. Ideally, movies collect a smaller spectrum. RT is better at informing whether you’ll enjoy something than how much. Specifically, whether you should see a movie at all. Additionally, Rotten Tomatoes displays user approval. For audiences and critics alike, it also illustrates the average rating.
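The loss of magnitude is easy to demonstrate. In this sketch, the 0.6 fresh bar is RT's published cutoff, but the individual review scores are invented for illustration:

```python
# The Tomatometer records only whether each review clears the fresh bar,
# not how enthusiastic it is. The 0.6 cutoff is RT's published threshold;
# the review scores below are invented for illustration.

FRESH_BAR = 0.6

def tomatometer(reviews):
    """Percent of reviews counted as fresh (positive)."""
    return 100 * sum(r >= FRESH_BAR for r in reviews) / len(reviews)

def average(reviews):
    """Mean review score on a 0-100 scale."""
    return 100 * sum(reviews) / len(reviews)

raves = [0.95, 0.90, 0.85, 0.20]   # three raves, one pan
shrugs = [0.60, 0.62, 0.61, 0.20]  # three shrugs, one pan

print(tomatometer(raves), tomatometer(shrugs))  # 75.0 75.0 -- identical
print(average(raves), average(shrugs))          # ≈ 72.5 vs ≈ 50.75
```

Both films land at 75% fresh, yet one averages in the seventies and the other barely clears fifty; the average rating RT displays alongside the Tomatometer is what recovers that difference.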

In a given window, RT extends a blurb for each review alongside its original score. If it doesn’t have an original score, the critique is left as is. By contrast, Metacritic estimates ratings for analyses that don’t have them. A Metascore is weighted according to the eminence of the source. Apparently, it averages a lot of critics and distills them into a select curation. If Rotten Tomatoes tells you whether you will like something, Metacritic says how much.
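A weighted Metascore can be sketched like so; Metacritic doesn't publish its weights, so the outlets and weights here are entirely hypothetical:

```python
# A weighted-average Metascore. Metacritic's actual weights are
# unpublished, so the outlets and weights below are hypothetical.

reviews = [
    # (hypothetical outlet, score out of 100, assumed weight)
    ("BigPaper",  90, 1.5),  # "eminent" source, counted more heavily
    ("MidBlog",   70, 1.0),
    ("SmallSite", 40, 0.5),
]

weighted_sum = sum(score * w for _, score, w in reviews)
total_weight = sum(w for _, _, w in reviews)
metascore = round(weighted_sum / total_weight)
print(metascore)  # 75 -- the eminent rave pulls the score up
```

Without the weights, the plain mean of those three scores would be about 67, so the eminent source's rave lifts the result noticeably.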

The basis for the post is a subreddit called “Data Is Beautiful.” OP showed an interesting graph combining marks for all three websites. Busy, but comprehensive. The interactive version is telling. From 0 to 50, Metascores tend to be higher than the Tomatometer. Past 50, however, it flips. I wonder if that has to do with a preponderance of rating extremes. A firmly positive or negative score is more distinctive or, perhaps, compelling than an average one.

IMDb tends to score higher until the upper echelon. This makes sense because the content, especially the weak sort, is rated by fans. Their scores are likely to be generous. Even though you might think the inverse holds true, top movies attract more votes overall. Therefore, they include additional dissenting opinions. I am only getting at the film. If you want to get into it, go see it. Many views never amount to yours, but keep them in mind.