Every generation of gamers eventually asks the same question: Was gaming really better back in the day? Or are we just looking at the past through a nostalgia filter? The truth, like most things, sits somewhere in the middle. The ’90s, 2000s, and 2010s all represent different eras of gaming, each with its own strengths, flaws, and defining moments that shaped how we play today.
The 1990s were about foundations
This was the era where genres were being invented in real time. Platformers, fighting games, RPGs, survival horror — all took shape here. Games weren’t cinematic; they were mechanical. You played for mastery, not story beats. Titles like Super Mario World, Street Fighter II, Final Fantasy VII, and Resident Evil didn’t just entertain; they set templates the industry still follows. But let’s not romanticize it too much. Saves were limited, checkpoints were cruel, and accessibility barely existed. The challenge was real, but so was the frustration.
The 2000s were where gaming found its identity
Consoles became powerful enough to support open worlds, voice acting, cinematic storytelling, and online multiplayer. This was the era of GTA: San Andreas, Halo 2, Metal Gear Solid 2 & 3, Resident Evil 4, and Call of Duty 4. Games stopped feeling like “games” and started feeling like experiences. Multiplayer became social. LAN parties, early Xbox Live, and PlayStation Network changed how people connected. For many players, this is the golden era — not because it was perfect, but because it balanced creativity, innovation, and fun before monetization became the dominant focus.
The 2010s were arguably the most transformative decade of all
This is when gaming became mainstream culture. Streaming, esports, digital storefronts, and indie games exploded. We saw massive open worlds like Skyrim and The Witcher 3, narrative-heavy experiences like The Last of Us, and experimental indies like Undertale and Journey. The industry diversified — not just in genres, but in voices, stories, and styles. At the same time, this is when cracks started showing: microtransactions, unfinished launches, live-service fatigue, and corporate overreach became part of the conversation.
So… was gaming better back then?
In some ways, yes. Older games often felt more complete at launch. There was less emphasis on monetization, fewer day one patches, and more creative risk-taking from major studios. You bought a game, and what was on the disc was the experience. No seasons, no battle passes, no roadmaps. But modern gaming has its own advantages. Accessibility has improved. Indie developers can compete with AAA studios. Games are more inclusive, visually stunning, and narratively ambitious than ever before. And despite industry problems, players today have more variety and choice than at any point in history.
The truth is gaming wasn’t better — it was different. Each era reflected the technology, business models, and player expectations of its time. What made those older eras special wasn’t just the games — it was how we played them. No guides. No YouTube walkthroughs. Just trial, error, and discovery. And maybe that’s what we miss most.
Written by StoneyThaGreat


