Nostalgia and Why Your Feelings Are Wrong

We’re still off this week. But that means we get to share this gem from way back in Issue 6: Looking Back. Hopefully in a world of reboots and remakes it will offer some perspective. Enjoy!

Nostalgia is a weird word. In Greek it means, for lack of a better term, “to ache for home.” We now use it to mean being wistful for a moment and place in time. It is one of the most common human emotions, especially as people get older. As the years pass, time finds a way to turn the sunniness of yesterday into days so full of warmth and gladness that it becomes easy to consider them the best that have ever been. It is then a very human notion to want the past and to occasionally feel an unrelenting yearning for yesteryear.

Wrong Yoda was.

Of course, it’s all bullshit. Nostalgia is just a feeling, and no matter what Yoda may have said about trusting them, feelings lie. In fact, the past was not all that it was cracked up to be and those days were not categorically better, no matter how good they look when separated by a gulf of decades. If anything, trusting in that feeling of nostalgia is one of the few ways that I can think of to actually kill the zeitgeist.

I am fully aware that wallowing in the impossibly loving and warm glow of so many perfect yesterdays is delightful – but so is eating ice cream for dinner. Neither option is terribly healthy for a grown-up. The thing is, that day was not as warm as you thought. And it’s not time that makes it all look rosy and grand; it’s actually your brain and a cognitive tweak known as the Nostalgia Bias. It works on the principle that good feelings stick out more than boring ones. It’s the same reason that it is easier to remember that awesome dinner you had at the chef’s table 4 years ago than it is to remember what you had for dinner yesterday.

Everything is Gonna Work Out

Awful memories will stay right where you left them, but there is a tendency for most people to consider those to be outliers. So when the bad stuff is the exception, what people remember as good becomes the rule. This is a side effect of a different, and equally sexy, cognitive issue called the Optimism Bias, the short of which is that people tend to assume that things are going to work out, even if the math is against them.

Optimism Bias, not to be confused with the Optimus Prime Bias, which is adorbs.

Basically, the Nostalgia Bias and the Optimism Bias combine to render your past into a sort of Greatest Hits mixtape. Birthdays, fun vacations, and those sorts of experiences all come to the forefront when people start to think about the “good old days.” The B-Sides get left out because they were boring or awful, and the Optimism Bias makes us forget about them. Greatest Hits are great, don’t get me wrong, but a weird function of nostalgia is to compare your memory’s Greatest Hits to a regular release. Of course this just makes your “new” stuff (with all of those B-Sides still intact) seem bad by comparison.

But wait! It’s not just your memory about events that triggers this effect. It’s also your memory about things themselves. Carey Morewedge did some studies for the Journal of Behavioral Decision Making (I would have bought a copy, but I couldn’t make up my mind) that looked into this effect. A group of people were asked their thoughts on sitcoms from the 80s, 90s, and 00s and to rate them on a scale of 1-10. Unsurprisingly, people tended to rate things from longer ago as being of a better quality than those of today. The easy assumption (which we will pickaxe momentarily) is that TV was better in the 80s. Odds are that the actual reason has to do with syndication. Young people who watched 80s sitcoms probably did so exclusively as reruns. The only shows that got syndicated were those that reached 100 episodes, and a show usually has to be pretty good to get that far without being cancelled. Older people who watched the sitcoms when they were new have since forgotten the awful TV that the 80s produced.

Of course, that nostalgia is making things seem better than they actually were. So, in an effort to drop some cold hard facts on everybody, I went back to the data. The Internet Movie Database (or, if you’re the cool kids – the IMDB) has a sexy little function that allows a person to get vital information out of it regarding year of release and overall review score. I had NitWitty R&D pull the review score information for movies all the way back to the 50s. Then I had them look into the numbers, correct for films that didn’t have enough reviews to be statistically relevant, and deliver the data to me. What they found was fascinating.


  • 1950s: Average Score – 7.202
  • 1960s: Average Score – 6.995
  • 1970s: Average Score – 6.800
  • 1980s: Average Score – 6.203
  • 1990s: Average Score – 6.147
  • 2000s: Average Score – 6.331

While I would like to point out that these numbers would seem to imply that the 00s are some sort of renaissance of film, that’s not really the case. If anything it has to do with sample sizes and the fact that more people have seen those films than obscure action trash from the 90s. Another artifact that could be pointed out is that the 1960s and before are clearly a high-water mark, but that’s also an issue of sample size (and included in the chart above for curiosity more than anything else). The deviation is probably caused by a selection bias, which works like this: if the only movies from the 60s you know are Mary Poppins and The Sound of Music, those are the only things that get reviewed after the fact on the IMDB. It’s a fallacy to assume that all the movies from the 60s are as good as those, because the sample size is so small. The clear standouts, like anything by Hitchcock, are also way over-represented because they are historically relevant. This would be a problem if we didn’t have a bunch of data – but we do.

The data shows that, on average, things are average. There is some statistical difference between decades, but those differences are for the most part irrelevant in practice. While it’s easy to get all befuddled by the power of nostalgia and say that the 1970s were a great time for film, could anybody honestly say that they were 4% better? Or for that matter, if I could use a magic box and create movies with exact review scores of 6.5 and 6.9, could anybody tell the difference, and even then wouldn’t it come down to personal preference? In other words, whatever differences the numbers show are purely for show. Which proves the point – movies, and things in general, were not better back in the proverbial day. Statistically speaking.
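If you want to check the back-of-the-napkin math yourself, here’s a quick Python sketch using the decade averages listed above. Treating the 50s and 60s as small-sample decades to be dropped is my own assumption, per the selection-bias caveat:

```python
# Decade-average IMDB scores from the list above (out of 10).
decade_avgs = {
    "1950s": 7.202, "1960s": 6.995, "1970s": 6.800,
    "1980s": 6.203, "1990s": 6.147, "2000s": 6.331,
}

# Drop the small-sample decades, then see how far apart
# the remaining decade averages really are.
usable = {k: v for k, v in decade_avgs.items()
          if k not in ("1950s", "1960s")}
spread = max(usable.values()) - min(usable.values())
print(f"1970s-2000s spread: {spread:.3f} points out of 10")
# -> 1970s-2000s spread: 0.653 points out of 10
```

About two-thirds of a point on a 10-point scale – which is the whole “purely for show” point.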

Back in my Day…

A different function of nostalgia is how something affects a person at a specific time in their life. If a time was really good for you, the assumption is that the time was really good, period. This is a mistake, as shown in another paper by Morewedge. He did a study where people were asked to think about TV shows from the year they graduated and rank them on a 1-10 scale, and then do the same for today’s shows.

TVs used to look like this, because the old days sucked.

This is a fun experiment you can try at home. Think about the year you graduated from high school and the TV you watched back then, and give it a score from 1 to 10. Now think of TV that’s on today and do the same. Odds are the immediate reaction is to think that the TV you watched back then was really good, and that today’s is just okay but inferior. Morewedge got the same result. It turns out that, on average, people assumed that TV from when they graduated was better. Yet when they stopped to think about it specifically, they realized it wasn’t true.

It’s the same sort of bias we see in the IMDB data. Only in this instance it has to do with the time. For most people, graduating high school is probably the most momentous thing they have done in their lives up to that point, and for some people it’s their high water mark. This tends to associate a whole lot of good feelings with that time and the cultural accoutrements that go with it, including what was on TV.

But if you think about it rationally, this isn’t true. Case in point – when I graduated from high school the shows I was watching were Batman: The Animated Series, Animaniacs, and reruns of NewsRadio and Seinfeld. Generally good shows, but I would let them all wander into the desert alone to die so I could have Archer, The Venture Bros., Game of Thrones, and Last Week Tonight. Well, maybe not Batman; that show really holds up. Anyway, if you at home take that extra step (go on, think about what you watched then and what you’re watching now and whether you’d trade) you realize that nostalgia doesn’t survive real scrutiny. In other words (unless you’re thinking about Batman: TAS) your feelings have lied to you.

Culture Clash of the Colossus

Although if you want to be fair, it’s not your fault. It’s hard to compare the merits of things over vast temporal distances because they come from different times. The crusty zeitgeists that memories and art emerge from are fundamentally different. This sometimes leads people to make assumptions about the past that are maybe not true. For example, there are a lot of arguments that the PS2 was at the center of the Greatest Gaming Age™. Most of that has to do with the library of the PS2. For those of you that are unfamiliar with that library (if so, may I be the first to welcome you to the planet Earth) here are some of the hits: Ico, Shadow of the Colossus, Metal Gear Solid 3: Snake Eater, Final Fantasy X, Ratchet and Clank, Psychonauts, Silent Hill 2, Grand Theft Auto III, and God of War.

Seems like a golden age, no? But while it is true that the PS2 had an insane number of quality games, much of that was just because it had such a huge library. Games were still being made for the console more than a decade after its release, so just one classic per year gave the PS2 a higher-than-average number of hits based purely on volume. Yet the average was still average – but that nostalgia bias is making you think right now, “that can’t be right.”

That was my thought too, so I had the gnomes at NitWitty R&D go back into the mines and get me some PS2 Metacritic Data. They brought me a spreadsheet that we can all enjoy. Feel free to play with it, but once the numbers were all crunched, these were the averages:

    • 2000: Average Score – 71.14
    • 2001: Average Score – 70.62
    • 2002: Average Score – 68.46
    • 2003: Average Score – 70.14
    • 2004: Average Score – 68.92
    • 2005: Average Score – 68.37
    • 2006: Average Score – 65.14
    • 2007: Average Score – 63.56
    • 2008: Average Score – 64.38
    • 2009/10: Average Score – 66.87
    • Total: Average Score – 68.03


So let’s look at these numbers. Basically, the number you want is the Total Average for reviews. It’s just 68.03 for the lifetime of the system. The games people think about when they get all starry-eyed over the PS2 are the ones I mentioned previously (which according to the data are all in the 90+ range). The average, though, is a D+. Worse, that’s actually being kind: the average User Score is 5.5/10.
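For the curious, you can sanity-check the lifetime figure from the yearly numbers above. A plain unweighted mean of the ten yearly averages won’t exactly match the quoted 68.03, since the real lifetime total weights each year by how many games shipped (counts I don’t have), but it lands in the same neighborhood:

```python
# Yearly PS2 Metacritic averages from the list above.
yearly_avgs = [71.14, 70.62, 68.46, 70.14, 68.92,
               68.37, 65.14, 63.56, 64.38, 66.87]

# Unweighted mean of the yearly averages; the quoted lifetime
# total (68.03) weights each year by its release count instead.
mean = sum(yearly_avgs) / len(yearly_avgs)
print(f"unweighted mean: {mean:.2f}")  # -> unweighted mean: 67.76
```

Either way you slice it, the number hovers in the high 60s.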

That the PS2 had some great releases is not up for debate, but saying it was any kind of golden age isn’t correct either. What is easy to miss is that the gaming cultures of then and now are wildly different. It’s unfair to say that the PS2 had (this many) hits and the Xbox 360 had (this many fewer). The gaming landscapes they existed in make such a direct comparison impossible.

Hey, look! Toriel everybody!

Today, while individual platforms don’t have the same number of “A+” games that the PS2 had, the democratization of distribution has opened up whole new avenues for release. Case in point – Undertale and Minecraft are both games that simply could not have existed in the world that had the PS2. The distribution systems just weren’t in place, and there was no way that games of such modest budgets would have passed Sony’s QA checks. So yeah, the PS2 numerically had the highest number of overall hits, but the entire market lived within its long shadow. Without that monolith in the way, a lot more titles on other platforms can see the light of day. In aggregate, the number of A+ games released per year is the same as it has ever been; it’s just spread around a bit. While the culture is different than it used to be, in terms of numbers and overall quality the needle hasn’t budged.

The reason that I write this is not to tear down the past. If anything I want to celebrate the past for what it was – occasionally good, with a lot of garbage we’ve thankfully all forgotten about. No matter what our feelings tell us, the past was not perfect; in fact, it’s not even statistically different. But nostalgia can be harmless, so go ahead and indulge in it from time to time. Heck, give yourself a whole month. But realize that it doesn’t show anything true; it’s a feel-good funhouse mirror that casts everything in a golden glow and makes it look 10 pounds lighter. Just know that as soon as somebody decides that The There and Then is better than The Here and Now, it becomes a self-fulfilling prophecy.


Eric Carr

Occasionally has mad notions, and more often than not runs with them. Welcome to one of those.
