About

Drawing on some experience, we enumerate a few ideas regarding game design and some of the more common pitfalls to avoid when creating a game. This section is split up by game genre, with the general pitfalls that apply to all genres being mentioned on this page. The writing style is kept mostly historical: wherever something might be read as an opinion, even one that can be backed up, the pages also point to interesting events in game history that go beyond the scope of this section.

Eye Candy

Contrary to common belief, and perhaps mainly due to superficial knowledge of the market, people who play games are not all that enticed by the quality of the graphics. If that were true, then the creators of Angry Birds (Forbes) would not be that wealthy. One very recent and rather stunning flop at the time of writing (2025) is Frostpunk 2, a highly acclaimed and anticipated game given that Frostpunk 1 was a massive success. The difference between Frostpunk 1 and Frostpunk 2 is a whopping rating gap of 9.7 versus 7.1 (from https://steampeek.hu).

We would wager that the developers' decision to bump up the graphics requirements between Frostpunk 1 and Frostpunk 2 was a very bad one, given that this is a top-down city builder with "colony" and "simulator" elements, such that nobody truly cared what the graphics were like. This is without mentioning that retro-gaming elements have now re-surged as a source of inspiration, such that many games contain pixel art and various other elements that could not be described as the bleeding edge of graphical capability. Whilst Frostpunk is mentioned in our gallery of unique games, there are memorable flops such as "Doom Eternal", where the developers lost a decade to making eye-candy before finally releasing the game; by the time "Doom Eternal" came out, players were not even tracking the game anymore, such that the sequel to a very popular title did not receive the expected attention relative to other games on the market that did not obsess over graphics and released what they had sooner.

In fact, it might be the case that, similar to robots, games in general have an "Uncanny Valley" of their own, as coined by Masahiro Mori (a robotics engineer), where a human's response to a game with characters that look too realistic is, counter-intuitively for those who push for better and better graphics, mostly negative rather than positive. In other words, the more realistic a game gets, the more a human being is put off, to the point of not really wanting to play the game at all. The same "Uncanny Valley" principle carries over to engineering games, or games with an extremely high degree of complexity, which counter-intuitively end up becoming a niche because the tasks to be carried out in the game resemble "real life work" too closely.

"Eye Candy" and bigger and better graphics are the equivalent of "quantum mechanics" for people that are not even physicists by trade nor could they solve an equation for the life of them yet take solace in discussing a domain that does not scale with creativity, nor ingenuity, nor literary capabilities but with the sheer amount of resources that you can throw into a blending-machine that churns and crunches numbers more than often to approximate curves and make things seem nice and round.

When graphics cards started appearing, they seemed counter-intuitive to many gamers, such as Commodore Amiga owners, because they did not seem to change too much visually. Even console owners already had "3D graphics" aplenty, and it was not really clear what this new device on the market would do. The idea here is that of convenience: whilst Donkey Kong Country was pre-rendered from 3D models and turned into sprites, making the experience look 3D for all intents and purposes, graphics cards helped game developers by only asking of them to describe the scene, after which the computer rendered each frame on its own, without requiring the developer to be able to translate 3D graphics into a 2D image (for instance, having the artistic perception to know where and at what angle to place a shadow in a drawing given multiple light sources, something that a graphics engine and a graphics card can work out on their own).

Trivially, and until only very recently, monitors have had no depth component aside from color depth, and everything is actually displayed in 2D, so what could 3D graphics possibly bring to the table as "new" if it cannot be experienced in three dimensions? True 3D, as in virtual reality or augmented reality achieved through various optical tricks, is only becoming a reality just now, and all games that everyone plays to this date, except VR/AR, are really 2D. The difference is that earlier games either pre-rendered or motion-captured everything and then transformed it into sprites, or hired graphics artists knowledgeable in painting scenes with the correct placement of shading and shadows to make the scene look 3D.
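
To make the point about "translating 3D graphics to a 2D scene" concrete, the short sketch below is a deliberately naive pinhole projection; the function name, focal length and screen size are purely illustrative assumptions, and real pipelines add clipping, rasterization, shading and shadows on top. It shows the kind of per-frame bookkeeping that graphics engines and graphics cards took off the developer's hands:

    # A minimal sketch of perspective projection: mapping a 3D point in camera
    # space onto a 2D screen. Real graphics pipelines do this (plus clipping,
    # rasterization, shading and shadowing) for millions of vertices per frame;
    # the names and numbers below are purely illustrative.

    def project_point(x, y, z, focal_length=1.0, screen_w=640, screen_h=480):
        """Project a camera-space point (x, y, z) to integer pixel coordinates.

        Assumes the camera looks down the +z axis and z > 0 (the point is in
        front of the camera); no clipping or depth-buffering is performed.
        """
        if z <= 0:
            raise ValueError("point is behind the camera")
        # Perspective divide: farther points shrink toward the screen center.
        ndc_x = (focal_length * x) / z
        ndc_y = (focal_length * y) / z
        # Map from normalized device coordinates (-1..1) to pixel coordinates.
        px = int((ndc_x + 1.0) * 0.5 * screen_w)
        py = int((1.0 - (ndc_y + 1.0) * 0.5) * screen_h)  # screen y grows downward
        return px, py

    # Two points at the same (x, y) but different depths land on different pixels,
    # which is what creates the impression of depth on a flat monitor.
    print(project_point(1.0, 1.0, 2.0))   # closer point, farther from screen center
    print(project_point(1.0, 1.0, 10.0))  # distant point, nearer to screen center

The point is simply that depth is something the machine computes into a flat image, not something the monitor actually displays.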

Either way, it must be noted that the benefits of having the latest and greatest in graphics technology follow a bell curve: most of the time, if you find yourself after the 2000s, you are already past the point of diminishing returns. In fact, one of our favorite conspiracy theories is that lots of developers and game companies strike a chain of deals, running up through various layers of software creators, in order to push for certain operating system or graphics card upgrades. The reality is that even if one took away most of the advanced graphics from most current games, most people would still be happy to play them and, to come full circle, note that "Angry Birds", a game with no claim to graphics, was played by a wide array of individuals, not only hardened gamers that "could put up with the modest graphics more than casual players".

Political Squabbles over Game Difficulty Settings

Recent years have seen politics hit the gaming scene, and one of the talking points has been heated debate over the level of difficulty in games. One camp considers that players should get better at gaming in order to surpass the baseline difficulty implemented by the game, usually by refusing to allow the player to set a lower difficulty at all, while the other camp considers that difficulty settings should not really matter because games are just a matter of entertainment.

Both parties are right, but in different contexts. In a competitive gaming setting, the same default setting is applied to everyone participating, so that everyone has to beat the same difficulty level. Similarly, in terms of achievements and online rewards, achievements are very often locked to a given minimal difficulty setting, in order to prevent players from switching to a lower difficulty and then obtaining the same rewards as a player that has played the game on a higher setting. On the other hand, touching on the point of sharing "Save Games", something that we offer all the time, having saves around that allow players to skip large portions of a game helps game journalists, vloggers and others that would like to replay or revisit a specific scene in a game (it is why we invented Horizon, in the end!). Similarly, a common design pattern is to create a game that can be played by a single player but then also played over a network by multiple players, such that a save game that completes the game progression and offers a completely explored map is a blessing to those that want to load up the game and, say, have a versus match without having to replay the story again.
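
Returning to the difficulty-locked achievements mentioned above, a minimal sketch of how such gating is often wired up might look like the following (every identifier here is hypothetical and no specific game or platform API is implied); the important detail is that the lowest difficulty ever used during the run is what counts, so briefly lowering the setting forfeits the reward:

    # A minimal sketch of difficulty-gated achievements; all names are hypothetical.
    # The idea is to track the lowest difficulty used at any point of the
    # playthrough, so briefly lowering the setting forfeits the reward.

    DIFFICULTY_ORDER = ["easy", "normal", "hard", "nightmare"]

    class Playthrough:
        def __init__(self, starting_difficulty):
            self.lowest_difficulty = starting_difficulty

        def change_difficulty(self, new_difficulty):
            # Remember the lowest setting ever used, not just the current one.
            if DIFFICULTY_ORDER.index(new_difficulty) < DIFFICULTY_ORDER.index(self.lowest_difficulty):
                self.lowest_difficulty = new_difficulty

        def unlocks(self, achievement_min_difficulty):
            # The reward only unlocks if the whole run stayed at or above the minimum.
            return (DIFFICULTY_ORDER.index(self.lowest_difficulty)
                    >= DIFFICULTY_ORDER.index(achievement_min_difficulty))

    run = Playthrough("hard")
    run.change_difficulty("normal")   # dropping the difficulty mid-run...
    run.change_difficulty("hard")
    print(run.unlocks("hard"))        # ...forfeits the "hard" achievement: False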

However, perhaps the worst part of these political debates is that they have pushed game design into a stage where players do not know what to pick because there is no baseline anymore. Even though one could claim this is great, because players are then able to just mix-and-match what they like in terms of difficulty, especially in games that allow partial customization of difficulty levels, it becomes unclear what the creator of the game intended. Games are a literary art-form expressed via technology, and many games, in particular those that have a story-mode progression, are told like a tale or read like a book, such that the actual difficulty level becomes tied to the literature. If you have played the Metro series, it is clear that the lack of ammo, combined with the lack of oxygen masks and the difficulty of the enemies, renders a bleak story of survival under conditions of desolation and despair - which is exactly what the whole literature around Metro revolves around as a post-apocalyptic artwork. Had the difficulty setting been pushed any lower, given human subjectivity, players might have remembered the game as a walk in the park and not as a harsh game. When we play games, we always check for the baseline or, in the formula found often in games, "the way it was meant to be played", and that is the difficulty that is attempted first.

Lastly, game difficulty settings have historically been more a matter of number crunching than of a more all-encompassing notion of "difficulty". That is to say, for a game developer to make a game "more difficult", they would typically bump up, say, the amount of life points that an enemy has, increase the enemies' damage, or even cripple the player's units. However, all of the former is just a matter of addition and subtraction, such that the game gets "more difficult" but mostly only in terms of scale. A more "all-encompassing" notion of difficulty would also include, say, more difficult puzzles or other unexpected turns of events, something that is not too easy to realize technically in a linear progression. Later, more advanced games, albeit at increased cost, manage to adapt to the player in order to understand their playing style and dynamically create challenges. We suppose that the takeaway lesson is that number crunching is not the pinnacle of "game difficulty" and that "difficulty", in general, could engage the player in terms other than sheer endurance.
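
To make the "addition and subtraction" point concrete, a minimal sketch of purely numeric difficulty scaling might look like the following (all names and multipliers are illustrative assumptions, not taken from any particular game):

    # A minimal sketch of purely numeric difficulty scaling: the encounter stays
    # the same, only the numbers change. All names and multipliers are illustrative.

    DIFFICULTY_MULTIPLIERS = {
        # difficulty: (enemy_health, enemy_damage, player_resources)
        "easy":      (0.75, 0.75, 1.25),
        "normal":    (1.00, 1.00, 1.00),
        "hard":      (1.50, 1.25, 0.75),
        "nightmare": (2.00, 1.50, 0.50),
    }

    def scale_encounter(base_enemy_hp, base_enemy_dmg, base_player_ammo, difficulty):
        hp_mul, dmg_mul, res_mul = DIFFICULTY_MULTIPLIERS[difficulty]
        return {
            "enemy_hp": round(base_enemy_hp * hp_mul),
            "enemy_dmg": round(base_enemy_dmg * dmg_mul),
            "player_ammo": round(base_player_ammo * res_mul),
        }

    # The "nightmare" encounter is not smarter or more surprising, just more of a grind.
    print(scale_encounter(100, 10, 60, "normal"))
    print(scale_encounter(100, 10, 60, "nightmare"))

An adaptive approach would instead observe how the player actually performs and change the encounter itself, which is considerably more expensive to build and test - hence why plain rescaling remains so common.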

Pretentiousness

We have established that games are really just literature with more engagement than a paperback book. However, just like with books, reading thousands of Barbara Cartland romance novels that merely permute characters, offer little takeaway and are written in a simple vocabulary for all to understand, is very different from reading fewer but more referential works, such as the philosophers, the writers etched in history, or the other grand works that constitute the fundamentals of culture.

With that said, just like with books, it is important for people who are starstruck by the superficial to understand that the sheer quantity of games played does not matter too much, that some games are more referential than others, and that every player should ask themselves whether they should sink their precious lifetime into playing a certain game. Going further, being too demanding of players, making the story arc too thin or the game loop too small, while claiming that players are ungrateful for abandoning the game, up to telling them to "git gud", is just pretentiousness at work, because a game might simply be intrinsically bad relative to others.

In some ways, this is equivalent to Hollywood producing a bad movie that nobody wants to watch and, when the creators realize that they have negative income, blaming everything from "piracy" to "hackers" for not receiving the income that they expected on paper.

The early versions of the Assassin's Creed games expected the player to perform a fixed set of limited tasks in one city, then another, and then another, and, well, then yet another city, and so on, for many more cities or towns. Any player complaining that the game is really repetitive and dull would have been met with a political backlash ostracizing them for being too superficial, or for supposedly hating the fact that the lead game developer is a woman, without anyone bothering to look at the creation itself with some constructive criticism. In fact, the repetitiveness of Assassin's Creed extends over no fewer than the first four-or-so games, with little variation except for technical enhancements such as graphics or improvements to the physics engine. Another example reaches back to "Prince of Persia: Warrior Within", an interesting wall-grappling game that, just like its predecessor in the same saga, very nicely implemented time-shift effects where the player could roll back a scene. Unfortunately, aside from the time-shift innovation that is a hallmark of "Prince of Persia", "Prince of Persia: Warrior Within" had a map layout that expected the player to reach the top, gain some power, and then perform the exact same missions in reverse, albeit with the help of the extra power. The decision to expect the player to repeat the game map in reverse made the game extremely repetitive, such that few players actually plowed through the same maps again.

In the end, expecting too much of a player, and even going overboard by calling them superficial for not comprehending the depths of your work, is a bad design choice. Often, the pretentiousness can be observed as, say:

Clearly, all games are unique and they end up generating a niche, regardless of how wide and broad that niche might be; however, making a game too pretentious is a guarantee that it will receive attention from a select audience rather than a general audience.
