Kill All Fairies

Tackorama's entry for Ludum Dare 33 (Aug 2015).
The game jam theme was You Are The Monster.

Download Kill All Fairies from GitHub

Download Kill All Fairies from GameJolt

Single player | Mac, Windows, Linux | Free to play


Her impressive elegant queenage, Frogmella Imanokcuf, hereby grants clemency most extraordinaire to all monsters of the realm upon successful execution of all fairies in thy precious kingdom within two minutes.

PS Beware of the sugar plum fairy.


Final marks

Rank Category Score
455 Audio (Jam) 2.96
502 Humour (Jam) 2.83
883 Mood (Jam) 2.60
892 Theme (Jam) 2.88
966 Fun (Jam) 2.43
1043 Innovation (Jam) 1.92
989 Overall (Jam) 2.63

Comments about the marks

The happy part

I am pleased with the humour, theme, and audio marks as I thought these were the strongest aspects of the game. The fact that other developers got the joke(s) and liked them meant a lot. I aimed to make a game that was light in tone in contrast to the theme and succeeded in that.

The surprising part

I was surprised by the similarity between the averages of the marks (leaving out the innovation mark). I suspect a lot of lazy marking went on, where developers would assess the game overall and then award that same value in each category, which defeats the bloody point of having separate categories at all.

Looking at other games' breakdowns of marks I see a very similar pattern of uniform marks across categories. At this point I have to bitch about graphics - I removed my game from this category, so I can bitch at will and will do so to illustrate my point. I look at others' games and think

4/5 stars for that?! Really? Have you cleaned your screen lately?
Then I see the other category marks and surprise, surprise, they are roughly the same starrage. Lazy marking. QED. This is a common behaviour that seems to have rewarded some visually heinous entries and punished others.

The bitchy part

Turning now to my audio mark of 2.96... not bad, you might think. But there is some really bad, crackly, grunty, muffled audio on entries that scored higher. Why did they score higher? Lazy marking. The reviewer liked the game so every category got the same mark. FFS.

The serious part

On a more serious and less bitchy note, accurate feedback is essential, as the category breakdowns are meant to provide marks for the separate components of a game entry. When the marks are relatively uniform (see the table above, and indeed most other entries) that really doesn't tell me much apart from (1) there's lots of lazy marking going on, and (2) only the overall mark is worth paying attention to.

On a positive note, at least my game got reviewed, and the comments were much more helpful and welcoming than the marks. I have to wonder if the category breakdowns are worth it at all, I mean, what would I lose if I entered my game for the lowest number of categories? Surely I stand to gain by removing weaker categories?

The disappointing part

The lazy marking hurts because I think there was a vast difference in quality between the seven categories I was scored in, and I feel I've been denied some crucial information. But I cannot really do anything with these marks. Perhaps that is a good thing, as they are only numbers. (Irritatingly middle-of-the-road numbers as well.)


If there is one lesson to be learned from these marks it is this:

The secret to getting a good score has nothing to do with individual game components...make a game that "they" like and "they" will be lazy fuckers and give high marks across the board.
Cynical, slightly bitter, but painfully true.

Still, I finished my game, which is more than some managed, and it is complete. I loved almost every minute of it. And it makes players smile. Job well done.