2010 IGF Finalists: Main Competition

By: Derek Yu

On: January 4th, 2010

The results are in. Your main competition finalists for 2010 are:

Seumas McNally Grand Prize:

  • Joe Danger
  • Monaco
  • Rocketbirds: Revolution!
  • Super Meat Boy!
  • Trauma

Excellence in Visual Art:

  • Limbo
  • Owlboy
  • Rocketbirds: Revolution!
  • Shank
  • Trauma

Excellence in Audio:

  • Closure
  • Rocketbirds: Revolution!
  • Shatter
  • Super Meat Boy!
  • Trauma

Excellence in Design:

  • AaaaaAAaaaAAAaaAAAAaAAAAA!!! — A Reckless Disregard For Gravity
  • Cogs
  • Miegakure
  • Monaco
  • Star Guard

Nuovo Award:

  • A Slow Year
  • Closure
  • Enviro-Bear 2000
  • Today I Die
  • Tuning

Technical Excellence:

  • Closure
  • Heroes of Newerth
  • Joe Danger
  • Limbo
  • Vessel

Congratulations to all the finalists! Now… DISCUSS.

  • http://vacuumflowers.com/ Sparky

    I like the idea of honorable mentions. It could both help publicize deserving games and help people who almost made it feel better about not being selected.

    I also like Paul’s suggestion of having a few more finalists. It seems to me that there’s room in the spotlight for more games than are currently finalists, but likely the organizers of the IGF have a better feel for that than I do.

    @Terry:
    I’m looking forward to VVVVVV immensely; its IGF status has absolutely no effect on that :)

  • TK

    I can definitely understand certain games receiving multiple nominations (it happens in film and music awards all the time). However, I get the feeling that certain games weren’t really evaluated based on specific criteria for each category, but that the judges enjoyed the game as a whole and awarded high scores for all categories.

    The audio category feels especially random this year compared to the past, when many of the finalists featured audio that was far more central to the game (either the music was phenomenal, the implementation was unique, or there was something that really caused the audio to stand out). This year, that doesn’t seem to be the case at all. The audio in these games is good and fits quite well; however, in most cases I don’t see how the audio was “excellent, innovative or impressive”… it was just good. There were many other contestants this year whose audio truly stood out. Looking at these comparisons, it’s really difficult for me to believe that the judges actually evaluated the audio as a separate category.

    I hope that in the future more consideration will be given to the specific criteria for each category, and that judges don’t just give high marks in all sections if they like the game as a whole.

  • Proc

    I’m cheering for Rocketbirds; the art, lighting, and style are just top notch.
    Honourable mentions are a nice idea, but in the end we are all putting our kids up for adoption.
    I had a glimmer of hope of seeing Emberwind under audio, knowing how much effort we put in to get it spot on, but getting noticed out of 300 entries is epic. Bravo to the finalists.

    @David Rosen: Agree with you about HoN: acclaim, sure, but a nomination? Hmmm.

    @VVVVVV: I’m very fond of Terry’s work, but Metal Storm with excellent level design isn’t “omg”.

    @FISH: “fez was one, fro what i hear.” The guys at PixelJunk told you that, didn’t they. =p

    @Ntero: The certification process is paid for by publishers in any fair deal.

  • paul eres

    “But I stand by our voting process – it’s the aggregate of all those judges, and that makes it fairer than one opinion, ultimately.”

    nobody’s suggesting only one judge, though; that isn’t what the objection was.

    the theoretical way i’d ideally structure it is to find 20 or so judges willing to play every single game, and give them more time to judge all the games. individual games would still be judged by as large a variety of people; it’s just that it would be the same people, rather than a random set. i don’t think that’d be any less fair or less of an aggregate than a dozen or so judges, picked at random out of 150, playing each game.

    but anyway, since the igf has been using its own strange process for so long, i don’t really expect it to change; i mainly went into it to explain why the results are what they are.

    also, it’s not just people being upset about personal favorites; it’s about large numbers of people expecting results that at least marginally reflect the community’s tastes. i.e. there are even people who don’t like vvvvvv who think it should have been a finalist and are surprised it wasn’t.

  • Jim267

    I checked out a trailer of Trauma and it looks awesome!! I think it’s going to be the big surprise of this competition!

  • http://www.glaielgames.com Tyler Glaiel

    @paul
    “it’s about large numbers of people expecting results that at least marginally reflect the community’s tastes”

    the community can’t really have an accurate taste when half the games aren’t out yet.

  • paul eres

    agreed, but that’s why the other complaint arises: that this seems to be a contest for unfinished, unreleased games rather than games the public can play. unlike, say, the emmys or oscars or pulitzers or other awards in other media, where the public has access to the things being given awards.

    i’m not saying it’d work better to restrict it only to released games, as there are advantages to the way it works now; just that it’d be nice if indie games also had something similar to those, for finished games only.

  • robolee

    I’d say the minimum number of games a judge needs to play is 30. They’d write up what they like and dislike about each in some kind of judges’ comments system, then go back after playing them all and replay them in quick succession (say 10 minutes per game) to rank them in order from most liked to most disliked (replaying quick games to reassess the order of any two). This list gets sent in; 15 of the games will also have been covered by one other judge, and the other 15 by somebody else.

    If they can’t manage 30 games in 6 weekends (2.5 per day if you only played them at weekends), then why would they be considered as a judge?

    Because each judge covers 15 of another judge’s games, you only need 20 judges minimum (2 reviews per game), though you’ll probably want 40 judges. (A rough sketch of this assignment scheme follows after this comment.)

    Another thing: you should encourage people who are playing the same game to share their opinions on it, so that if one doesn’t like a game, the other may point out something about it they hadn’t realized. And you should encourage people to play more than the minimum number of games.
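
    A minimal sketch of that two-pass slate assignment, in Python, taking the thread’s numbers (300 entries, 30 games per judge, two reviews per game, a 15-game offset between the passes) as assumptions; the function and judge names are purely illustrative:

        import random

        NUM_GAMES = 300                  # entry count assumed from the thread
        GAMES_PER_JUDGE = 30             # robolee's suggested workload
        OFFSET = GAMES_PER_JUDGE // 2    # 15-game overlap between the two passes

        def assign_slates(num_games=NUM_GAMES):
            games = list(range(num_games))
            random.shuffle(games)  # randomize which judge sees which games
            judges_per_pass = num_games // GAMES_PER_JUDGE  # 300 / 30 = 10
            slates = {}
            # Pass 1: judges 0..9 each take a consecutive 30-game block.
            for j in range(judges_per_pass):
                block = games[j * GAMES_PER_JUDGE:(j + 1) * GAMES_PER_JUDGE]
                slates[f"judge_{j}"] = block
            # Pass 2: judges 10..19 take blocks shifted by 15 (wrapping
            # around the end), so each shares exactly 15 games with one
            # pass-1 judge and 15 with another, as robolee describes.
            for j in range(judges_per_pass):
                start = j * GAMES_PER_JUDGE + OFFSET
                block = [games[(start + k) % num_games]
                         for k in range(GAMES_PER_JUDGE)]
                slates[f"judge_{judges_per_pass + j}"] = block
            return slates

        slates = assign_slates()
        print(len(slates))  # 20 judges minimum: 2 * 300 / 30

    Every game lands in exactly two slates held by different judges, which is where the 20-judge minimum comes from.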

  • DanMacDonald

    It would be interesting to have some regional game festivals that participants submitted games to prior to the IGF, where all this controversy could happen, and then the winners from those competitions would have the ability to move on to the IGF.

    If there really are that many good games being submitted, it’s a shame that they don’t get some light shone on them simply because of the logistics of trying to do everything in a single competition/festival.

  • http://www.igf.com Steve Swink

    @robolee: There were more than 150 judges this year. With the current arrangement, each game gets played by more judges than you’re advocating, and for longer. The judges are required to leave comments. Also, there is a sophisticated judging backend: judges can and do leave comments for other judges. There was quite a bit of awesome, lively discussion in that channel this year.

    @DanMacDonald: I’d be interested in seeing a survey with lots of participants: “Top games that didn’t make IGF finalist but should have.” There are always a couple really wonderful games that don’t make finalist.

  • http://blurst.com Matthew

    There are 20 IGF finalists this year. There were more than 20 great games entered into IGF 2010.

    If we have X finalist slots and Y great games, we’re always going to be at X < Y, and that’s always going to trigger this conversation. Game A, B, or C not being a finalist isn’t, by itself, an indictment that the IGF judging process is broken…

    Nor is it any indication that Game A, B, or C wasn’t ranked #6 or #7 in a particular category. Not being a finalist isn’t the same thing as being in last place.

  • http://www.beatnikgames.co.uk Robin

    well done everyone who made it! great to see Joe Danger and SMB on the list…

  • Jeremy

    Not surprised about VVVVVV. Has an 8-bit platformer ever made finalist (apart from Nuovo)? What’s so great about it? From the video it just looks like a platformer with bad graphics where you control gravity. And that feature was already done 18 years ago in Mega Man 5…???

    I think it’s mainly cuz Terry is a popular member here.

  • ShawnF

    First off – super happy to see how much the judging process has improved. HUGE strides from previous years. It was especially nice to see that judges couldn’t pick their own games.

    Personally not a fan of some of the picks, but can’t really complain just because I don’t like the taste of the judges.

    @Paul Eres: I think you’re being a little unrealistic. There are now 300+ entries… assuming judges put in an hour per game (which imo is about the minimum it takes to properly judge a game, although after talking to some entrants I know who tracked IGF judge playtime, most judges only play about 15 minutes :/), you’re talking 300 hours of time. That’s a full-time job for 8 weeks. Definitely not feasible. It would be nice to find a way to thin the herd a little so judges got to see a bigger sampling of games, though.

    @Jeremy: Well, Star Guard is pre-8bit; does that count?

  • Foppy

    Can’t they just count the number of lines in the source code for a fair, objective comparison? :D

  • Jasper

    I wouldn’t even try to send my game to the IGF if only a few judges test it for 15 minutes…

    By the way, a third of this list is disrespectful to the rest of the 300 entries.

    Participant selection should be better.

  • http://www.indedependant.info Olympi

    Am I the only one who thinks there are too many platformers and/or puzzle games on the list?

  • TK

    My suggestion for improving the judging process is to instruct the judges to evaluate each category for what it is, and to try their best to follow the guidelines set forth in the competition rules for each category. I say this because this year I really felt that in a lot of cases the judges might have just liked a game as a whole and then given it universally high marks… even though in individual categories I really felt that the games did not capture the spirit of the category’s guidelines.

    I’m going to again point out the audio example in comparison to years past. The selection process for the audio category is as follows: “Scores will be based on the innovation, quality, and impressiveness of each Entered Game’s music and sound effects.” I can honestly say that most of this year’s audio finalists had audio that was good and fitting, but not innovative. I could not in good faith say that the music/sound effects were nearly as impressive as other entrants or finalists from the past. Some barely featured any music at all. In the past we had finalists that attempted to do something unique and innovative with the audio aspect of the game, or we had phenomenal music and SFX. I cannot say the same for the finalists as a whole this year. It really just looks as though the judges enjoyed certain games and gave them universally high scores as a result, without evaluating individual categories… and this is disappointing.

    What has to happen is that the guidelines written in the IGF rules need to be changed, or the judges need to start evaluating each category specifically. As it stands, this is very disheartening to people who built their entire game around music and sound and really put effort into trying to do something innovative.

  • http://www.godatplay.com God at play

    @bateleur: Excellence in Design is the “gameplay” category.

    I think Laremere has a great idea regarding how to move more judging time to the better games. You could have a select few who “take one for the team” and judge the games that obviously aren’t going to make it, and then bring on more judges over time until you’re down to the top 50-100 or so.

    Plus, judging games longer would be feasible if you increase the submission fee. That extra money could go to paying judges to commit more time to judging.

  • zeek

    VVVVVV is fun, but it didn’t really blow my socks off. I’m actually really shocked so many people are complaining about its absence.

  • ShawnF

    @TK: The judging guidelines actually specifically say that each category is meant to be judged in isolation from the rest. Even the Overall category for the Grand Prize is its own thing, not an average of the other scores. This is one place where the IGF’s system is fine; they just can’t defeat the stupidity of judges who don’t read the rules. :(

    As far as getting more judging time goes, I agree that that’s a real issue right now. The judging window was quite short, which I think is part of the reason why so many judges put in so little time.

    Maybe just have it go in waves, almost like a tournament? Something like:

    Wave1: Judges are instructed to play their games 15-30 minutes and rate each category roughly on a 1-5 scale.

    Wave2: The top ~50% (or whatever) is taken. Games are reassigned. Judges are instructed to play each of their games for at least one hour and use the current, more detailed judging system and provide detailed feedback.

    Wave3: Bottom 50% are eliminated again, the rest move on. Games are all reassigned to new judges. Possibly keep the old scores as well and average out the Wave2/Wave3 judge scores so that basically you’re just adding a bunch of eyeballs to the top chunk of games (aka, the ones that actually have a shot at winning).

    I dunno. There’s no real time pressure for the IGF, so there’s no reason not to stretch out the judging period if it gets better results. (A rough sketch of this wave idea follows after this comment.)

    And on a different note, I’d be happy to see the entry fee go up to help weed out more games. I think a lot of people are seeing how almost random the finalist list seems to be, so lots of people are entering just because… well, why the hell not? I realize that this isn’t in “the indie spirit” or whatever, but meh. If you can’t spring for $250 or whatever to enter your game, then your chances of actually being a finalist are pretty much nil anyway.
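
    For concreteness, a minimal Python sketch of that three-wave structure, assuming 300 entries and a 50% cut after each of the first two waves; the scorer is a random stand-in for real judge ratings, and names like run_wave and tournament are purely illustrative:

        import random

        def run_wave(games, quick):
            """One judging pass: return a 1-5 score per game. A random draw
            stands in for real ratings so the sketch stays runnable; `quick`
            would distinguish the 15-30 minute Wave 1 pass from the
            hour-long detailed waves."""
            return {g: random.uniform(1, 5) for g in games}

        def tournament(games):
            history = {g: [] for g in games}
            pool = list(games)
            for wave in range(3):
                scores = run_wave(pool, quick=(wave == 0))
                for g, s in scores.items():
                    history[g].append(s)
                if wave < 2:  # drop the bottom 50% after Waves 1 and 2
                    pool.sort(key=lambda g: history[g][-1], reverse=True)
                    pool = pool[: len(pool) // 2]
            # Final ranking averages the Wave 2 and Wave 3 scores, so the
            # top chunk of games gets the most judge eyeballs on it.
            pool.sort(key=lambda g: sum(history[g][1:]) / 2, reverse=True)
            return pool

        survivors = tournament([f"game_{i}" for i in range(300)])
        print(len(survivors))  # 300 -> 150 -> 75 games left by Wave 3

    The upside of this shape is that total judge hours stay roughly flat while the plausible winners get scored by far more judges than any game does under a single flat pass.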

  • TK

    “This is one place where the IGF’s system is fine; they just can’t defeat the stupidity of judges who don’t read the rules. :(”

    @ShawnF: That is very true, and it’s such a pity. My friends and I were eagerly anticipating the audio finalists because we’re all musicians and had been following several of the audio-centric entries… it was very disappointing, because it really did seem like the judges didn’t follow their own criteria for evaluating audio. The most obvious criterion not met by most of this year’s audio finalists, I think, was innovation (especially compared to years past). What’s the purpose of even having an audio category if it just ends up being a catchall for games the judges like as a whole?

    I don’t think the finalists need to be specifically audio games, but the audio should at least stand out and meet the listed criteria. The judges should at least be issuing scores based on the “innovation, quality, and impressiveness of each Entered Game’s music and sound effects.” There were so many games this year where you could see that music and sound were a major aspect of game development. I can only ask that in the future the organizers facilitate more emphasis being placed on the audio category. I’m sure it’s possible; after all, I felt that audio was accurately judged in years past… many innovative, stand-out audio games were recognized in other IGF competitions.

  • bateleur

    @ShawnF – There is one reason not to stretch out the judging period: the judges are volunteers!

    I was already quite impressed this year by the number of judges the IGF managed to find. Making the task of judging more time consuming doesn’t seem like a good plan if we want to see the improved level of fairness continue.

  • Emerlan

    I played both Enviro-Bear 2000 (Nuovo) and Star Guard (Design).

    I don’t understand…
    I thought my game was a little better than that… I worked so much over the year on that game: wrote my very own engine, tried my best at design and graphics, paid a lot of money for good music, created the sfx myself, and all.

    Not even an email to tell me that I failed pathetically. What a shame for me to show my family what won a finalist place over my game, when it is just what I hoped mine would be.

    I led myself astray.

  • Mr. Podunkian

    “Sadly I think this game suffers from being just another puzzle platformer, which basically hurts every game that isn’t Braid.”

    – feedback from IGF judges regarding kyle pulver’s “verge”

    cool judgin’ bro, i’m glad we have the best people on the job.

  • bateleur

    Mileage may vary – the feedback I got was really helpful. Things I learned from it:

    * The game was submitted way too early in the production process. The judges were looking for something complete.

    * Doing a special “everything unlocked” version for IGF was a really bad idea, because it caused the game to make next to no sense when played.

    * The walking animations look wrong to some people.

    * The character dialogue distracts from rather than supports the gameplay.

    And comments about the game which I did feel were questionable were balanced out by the multiple perspectives, so hardly a cause for alarm.

  • ShawnF

    @bateleur: Actually, the fact that the judges are volunteers is one of the main reasons I think it’d be a good idea to stretch out the judging period. I only had 3 weekends to judge 14 games, and one of those weekends I was out of town. If I had had a longer judging period, I could have devoted more time to each game.

    On a different note, since this has become the bitch-about-the-IGF thread: I think it’s unfortunate that they don’t reveal any of the scores. If they really have faith in the fairness of their judging process, there’s no reason not to show the scores. Or, if they don’t want to hurt anyone’s feelings over a sucky score, they could just show the top 25 games with the highest average scores. This would have a lot going for it:

    – It’d be something interesting/fun for fans to look at, helping to drive traffic and interest to their site.

    – It’d be a nice ego boost for entries that did well but didn’t quite make Finalist.

    – Larger websites that only cover a few indie games would have a larger pool of “good” indies to choose from. Normally these kinds of sites mostly cover finalists, and this could widen the pool of games that actually receive press (which could be the difference between a dev being self-supporting and failing completely).

    At the very least it’d be nice if the entrants themselves could see their own frickin scores. Numbers are a lot more concrete than polite text feedback.