This Calculator KNOWS your All-Time Favorite Movie 😱

My Personal 1–100 Movie Rating "Calculator": how I created my own film rating rubric + a template for you to use or tweak

Potato Paisan
11 min read · Feb 15, 2022

In my last post I explained why I rate every movie I've seen and how it helps me avoid reliance on pesky review aggregators like Rotten Tomatoes and Metacritic. Now, it's time for the HOW.

Brief summary of "why I rate" a.k.a. "The Problem"

I've been rating movies on a scale of 1 through 100 since 2014 (I've rated over 2,300) and I'm here not only to preach the gospel of adopting your own rating practice, but also to share my own approach and how I devised it.

SpudScore vs CinemaScore. Left: my calculator; right: a polling card for CinemaScore, the A+ to F Hollywood scoring juggernaut that's been chugging along for 40 years, helping executives predict what'll be deemed a damsel vs. a dud.

I use a personal system to rate movies on Criticker.com; until I'm proven otherwise, it seems to be the only online film database, ranking service, and recommendation engine of its kind (i.e., without the inherent bias that comes from conflicts of interest).

One of the main reasons Criticker hasn't taken off (outside of it being designed for freaks like me) is probably that it's not the most attractive website. Before you knock it, hear me out on this: the user experience is actually fantastic. It's far easier to navigate, with fewer clicks and more powerful features than Letterboxd, IMDb, and Metacritic combined.

Something deep in my soul knows that The Handmaiden (2016) is a 91/100 for me... and that The Green Mile (1999) is an 81/100... why?

After years of commitment to this hobby, it's only natural that I've become obsessed with ratings, scoring, and what it all means. This curiosity, and an intense desire to better understand what it is about film that I love, is what first led me to investigate, and then reverse engineer, my ratings.

A while ago, I realized that I had a super-speedy, internal gut-rating system that was oddly consistent. (I had friends test me against my list by shouting random movie titles at me for me to rate; 9 times out of 10, I'd be within 0–2 points of what I had originally rated something.)

What was going on there? Was there any underlying mental math that my mind was doing? Could I isolate the formula, or improve and tweak it, to help me in my divine mission for cinephilic bliss!? I've always had conviction in why I rate things, but how to rate is a different beast entirely, and one that I've struggled with since it introduces tons of landmines.

"the gall to reduce art to a score!" / "only an antisemite would rank Schindler's List below a 90" / "how could Manhattan be so high up there, Woody Allen is a pedophile" / "that movie is sexist, how could you call yourself a feminist!?"

I share this to preface that I don't have easy answers to all the complex questions of the world, and I recognize this is a sticky subject. I'm just trying to enjoy film, discover new things, and learn and evolve as I go.

"How To Rate™" is a complex and contentious topic... so let's get into it, starting with a pretty straightforward 3-step approach :).

1) Define the ideal outcomes of a rating system, 2) Choose an approach to achieve said outcomes, 3) Execute approach / make a crazy calculator thingy

Step 1: Define the ideal outcomes of the rating system

First things first, what exactly do I want to achieve with this thing?

I want the system I use to drive recommendations for me that... sate my own tastes, while also feeding my passion to expand my horizons and experience the best that all film has to offer.

I distilled 3 things to strive for (A, B, & C, listed and detailed in the diagram below). Each comes with its own challenge.

My mission was to design a rating system that would ultimately yield film recommendations that were high-quality, like The Good, the Bad and the Ugly (1966); that were sometimes aligned with my tastes, like The Favourite (2018); and that came from all genres and eras, like Forbidden Planet (1956).

Step 2: Choose an approach to achieve said outcomes

To get what I wanted, I'd have to define "quality"... then fairly weight my assessments through the lens of both my own personal preference and my understanding of "film criticism," all while controlling for the variables of release date and genre. (AHH!)

The approach I devised (summarized below) was to 1) reverse engineer the concept of quality, 2) incorporate a personal preference rating across all of those quality categories, and 3) add rating paths for more objective measures of "this film compared to all film" and "this film compared to films in the same genre."

I didn't want to end up with only recos for sci-fi or romances with happy endings (even though I do love them).

Step 3: Execute approach / make a crazy calculator thingy

A. Make some rating categories to help encompass the makings of "quality film" (yikes).

I am not a film scholar, a writer, a filmmaker, or a critic, so I apologize in advance for brutalizing any of these aforementioned crafts.

After much trial and error, I culled a very long list of things relevant to the goodness of film down into 3 major categories, each with its own sub-categories: 10 sub-categories in total (I love round numbers). All 3 must come together in a symphony of perfection to make a masterpiece... like Dr. Strangelove (1964).

Acting, Filmmaking, and Storytelling were deemed captains of the 3 parent groups. There are a lot of reasons it netted out this way, but I needed to hit all of these, and they fit nicely into different sections because the people who own each of the three groups are usually very different: acting talent, moviemaking/directing talent, and writing talent.

You can't have Dr. Strangelove without Peter Sellers' performances (plural), the visually arresting cinematography, or the brilliantly satirical script.

Stills from Dr. Strangelove, depicting marvelous work in the 3 big categories: Acting, Filmmaking, and Storytelling.

I set up all sub-categories to be rated on a scale of 1–10: 10 sub-categories × 10 points each = 100 possible points in a "quality" score. (A rough sketch of that structure follows the list below.)

  • Acting: I incorporated choreography, casting, stunts, singing, and dancing into both the Lead and Supporting Roles sub-categories.
  • Filmmaking: focused on the sub-categories that could be relevant to the vast majority of all feature films. For example, special effects are important to consider, but films that don't have them shouldn't be docked.
  • Storytelling: distilled into 4 sub-categories, this is effectively weighted equally to Filmmaking. While that may seem odd, I've structured it so that each of the Storytelling components is brought to life via the technical aspects of the filmmaking and is somewhat dependent on it. The medium is the message, but storytelling is the courier.
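
For the structurally minded, here is the rough shape of those 100 "quality" points as a Python sketch rather than the actual spreadsheet. The three parent groups and their sub-category counts (2 + 4 + 4) come from this post; only Lead Roles, Supporting Roles, Photography / Effects, and Plot / Dialogue are named here, so the other sub-category names are placeholders I've invented to fill out the 10 slots.

```python
# Rough sketch only -- the real rubric lives in a spreadsheet, and every
# name marked "placeholder" is my guess, not the actual sub-category.
QUALITY_CATEGORIES = {
    "Acting": ["Lead Roles", "Supporting Roles"],
    "Filmmaking": ["Photography / Effects", "Direction (placeholder)",
                   "Sound / Music (placeholder)", "Editing (placeholder)"],
    "Storytelling": ["Plot / Dialogue", "Characters (placeholder)",
                     "Point of View (placeholder)", "Theme (placeholder)"],
}

def quality_score(sub_scores: dict[str, int]) -> int:
    """Sum ten 1-10 sub-category ratings into a single 'quality' score."""
    expected = {s for subs in QUALITY_CATEGORIES.values() for s in subs}
    assert set(sub_scores) == expected, "rate every one of the 10 sub-categories"
    assert all(1 <= v <= 10 for v in sub_scores.values()), "each rating is 1-10"
    return sum(sub_scores.values())  # 10 sub-categories x 10 pts = 100 max
```

A perfect 10 in every sub-category maxes out at 100, which is the whole point of the "10 × 10" math above.
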
The main snag: no matter how I sliced it, I needed two different systems for Narrative vs. Documentary films (sorry). This post is already too long, so I shall share my Documentary Abacus next.

I've strongly emphasized the overall POV of the film in this system: what truth, higher purpose, opinion, or perspective does a work of art hold? The characters could be full of life, but if the work lacks a cohesive message, then what's the point? (Dr. Strangelove nails this, of course.)

Up next: what I prefer, and how to fold those ratings in.

B. Add in personal preference and figure out how you want to weight it

We all have different experiences and preferences; that's OK. I'd rather not pretend they don't exist, or subscribe to some lame Rolling Stone top 100 films of the century. YAWN.

I hope most people reading this would agree that Citizen Kane (1941) is a better film than Pitch Perfect (2012) [not that it's bad! I don't hate it!], but we'd probably have a lot of people split on La La Land (2016) vs. Uncut Gems (2019).

Getting to rate The Old Guard (2020) a 76/100, above Seabiscuit (2003) at a 75/100, on my own private Idaho, felt like a teeny tiny triumph.

I may end up writing many more posts on just the subject of taste... because the study of personal preference/enjoyment in films, music, books, etc. is a fascinating and complex one. Consider personal values, upbringing, culture, race, religion, privilege, neuroscience... We're getting into nature vs. nurture territory, folks!

"Apparently Sci-Fi fans are more likely to listen to Lady Gaga alongside Rap and Classic Rock, while Fantasy fans have a more uniform musical palette." (The Next Web)

Fascinating facts to dig into at a later date: our music tastes are "cultural in origin, not hardwired in the brain," and our taste in music can help predict our taste in film.

Tastes have the capacity to change and evolve, too. The great thing about my ratings is that I can revisit them and change them as I do. For now, I'll just share a smattering of things I know I love (we are all very different people, and my ratings skew towards the following):

These are a few of my favorite things

I'm drawn to self-aware nihilistic comedies, like Dr. Strangelove (1964) and In Bruges (2008). While I want to watch more Almodóvar and sci-fi, I also want to discover new genres (I recently got into anime); and while I want some cheap romance, I also want to balance that desire for instant gratification with the long-term fulfillment I get from watching something truly magnificent (and sad) like Blue Is the Warmest Color (2013).

Ultimately... this isn't a science, it's a fun exercise, so I landed on this: 1/3 (33.33%) of the score would be based on my own opinions and preferences.

The remaining pieces of the score must therefore be dedicated to an attempt at more "objective" measures...

C. Handle the genre issue + the release date / "time passes" and "things change" issues

Objectivity is impossible, but in order to counterbalance my personal biases, I wanted to incorporate more "technical" ratings against the key categories of Filmmaking, Acting, and Storytelling.

I first played around with the idea of a single 1–10 rating for each of the 10 sub-categories, like "Photography / Effects" and "Plot / Dialogue," but this fell flat, because I knew I wanted to take into account all 3 of the following:

  • This Film versus All Film >> The medium of film, and how we use it, may actually improve over time (or not); it's a new medium in the grand scheme of human storytelling.
  • This Film versus Its Genre >> I wanted to reduce the typical genre bias you see in critic reviews (why do they hate action!?) and let best-in-class movies rise to the top.
  • This Film versus Its Era >> Lastly, I wanted to make sure release date wasn't an unfair advantage or disadvantage. Just because Star Wars' (1977) special effects aren't cutting edge anymore doesn't mean its pioneering excellence shouldn't hold its value, or shouldn't be weighted as significant in a rating system!

My next speed bump was realizing that not every one of the 10 sub-categories is equally relevant to the 3 rating types above. If I took "release era" into account in all 10 sub-categories, it just didn't make sense! "Plot / Dialogue" ratings don't really get impacted by time and technology the way the technical Filmmaking sub-categories do. For example, I think it is fair to measure Dune (2021) against Star Wars (1977) in terms of "Plot / Dialogue," but not in terms of "Photography / Special Effects."

Where this led me was to Comparative Ratings for Genre, All Film, and Release Era, where the Release Era ratings only apply to the Filmmaking sub-categories.

Weighing all of these: I decided that of the remaining 66.66% (after Preference was factored in), 33.33% would be reserved for Genre-on-Genre, and the final 33.33% would cover All Film and Release Era together. [There's not really any science here; it just felt right.]
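
If you'd rather see that weighting as arithmetic than as spreadsheet cells, here is a hedged sketch. The three 1/3 splits and the "Release Era only counts for the Filmmaking sub-categories" rule come from this post; the function shape, the parameter names, and the even 50/50 split inside that last third are my own assumptions, not the actual calculator's formula.

```python
# Hedged sketch of the final blend, assuming each rating path has already
# been rolled up to a 0-100 number. Not the real spreadsheet formula.
def spud_score(preference: float,        # my personal-preference score, 0-100
               vs_genre: float,          # this film vs. its genre, 0-100
               vs_all_film: float,       # this film vs. all film, 0-100
               filmmaking_vs_era: float  # Filmmaking sub-categories vs. era, 0-100
               ) -> float:
    """Blend the rating paths into a single 1-100 score."""
    # Preference and Genre each take a full third of the final score;
    # All Film and Release Era share the remaining third (assumed 50/50 here).
    last_third = (vs_all_film + filmmaking_vs_era) / 2
    return round(preference / 3 + vs_genre / 3 + last_third / 3, 1)

# Hypothetical numbers: spud_score(95, 88, 90, 85) comes out to about 90.2
```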

Make a copy here > https://docs.google.com/spreadsheets/d/1jdzN26qsyTp-r3oXdFBBqDM4E3ZWK-oxJZ9PWYmGH9Y/edit?usp=sharing

So here it is!! You'll note, I LOVE DUNE! Without my personal preference ratings in here, this wouldn't hit a 90%, but with it... it's in my top favorites of all time.

In Closing

I've made a publicly available version of my calculator if you want to play around with it, make a copy of it, use it, burn it... or even improve it!

LINK TO Potato Paisan's Spud Scoring System for Film

I could keep writing about my genre and sub-genre categories, or break down every individual rating in that Dune rubric... but it's time to wrap this up!

Note: this is the first time I've ever shared any of this. I'd love to hear your thoughts, opinions, feedback, and ideas! I don't think my rubric is right for everyone, and I certainly don't think it's perfect, but if jotting this all down and sharing it helps anyone increase their enjoyment of film, then why not.

In future posts I'd love to dig into the science of personal preference OR my Documentary Rating Rubric OR my categorizations for Genre, Theme, Setting, Period, Type, and Philosophy (a work in progress) OR anything you want! Any requests?

Written by Potato Paisan

Strong Potatopinions, Spud Scores, and Tater Tastes on film. 🎥🍿🥔
