I’ve tried to keep news about the Natural Landscape Photography Awards to a minimum as, although it has taken up a lot of my bandwidth, continuous updates would soon get boring, I imagine. However, it’s over! (Well, the “awards” part of it, anyway.) Just over a week ago, we announced the winners and runners-up of all of the categories, plus a few extra awards, which I’ll come onto in a bit. We are all incredibly happy with the quality of the winners chosen by our panel of judges, and we’ll have a gallery of these at the end of the article.
Did the competition meet our expectations and goals, though? Well, yes, and surpassed them in many ways, including ending up with four times as many entries as EpsonPanoAwards and similar numbers to many of the other landscape photography competitions and, in fact, some high-profile generic photography competitions. There were a few surprises along the way and we’ve learned a lot in this first year but, judging by the feedback we have had, not only have we hit mostly the right notes but we also seem to have got the timing just about right, as it appears that demand for something like this has been simmering away.
Steve Alterman : Photograph Of The Year, Winner
Our main goal was always just to create a space where the sort of photography that we appreciate so much can shine. To that end, the competition isn’t really complete yet. We’re currently in the process of designing the ‘awards book’, which we hope will be much more than just a ‘Best of NLPA’ catalogue and more of a well-rounded representation of our future goals and aspirations.
I wrote about the foundations of the competition in February of this year, when we originally announced it. You can read about this in issue 225, so I won’t repeat all of that.
What might be interesting is what changed along the way and what we learned in the process. Here are a few points.
During both the early pre-judging and the main remote judging by our very talented panel of judges, it quickly became clear that an averaging scoring system wouldn’t bring out some of the best photography. In the pre-judging, we had many images where one judge would give an image 5 stars and another would give it 0 stars. Such an image would then generally not do well, as its average score couldn’t compete with an image that got mostly 3s or 4s. But it was these “polarising” images that the judges got passionate about. In the end, we decided that anything given a full 5 stars by a single judge was automatically put forward, regardless of the other judges’ scores. Images with high average scores still did well, but this change promoted the best of these polarising images too.
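The promotion rule described above can be sketched in a few lines of code. This is purely an illustration, not the organisers’ actual tooling: the data shapes and the average-score threshold are assumptions for the example.

```python
# Sketch of the NLPA-style promotion rule (illustrative only):
# an image advances if ANY single judge awards it 5 stars,
# otherwise it must clear an average-score threshold.
# The threshold value of 3.5 is an assumption, not from the article.

def promote(scores_by_image, avg_threshold=3.5):
    """scores_by_image: dict mapping image id -> list of judge scores (0-5)."""
    promoted = []
    for image_id, scores in scores_by_image.items():
        # A single 5-star score overrides a low average.
        if max(scores) == 5 or sum(scores) / len(scores) >= avg_threshold:
            promoted.append(image_id)
    return promoted

entries = {
    "polarising": [5, 0, 1],   # one judge loved it -> promoted
    "consistent": [4, 3, 4],   # high average -> promoted
    "middling":   [3, 2, 3],   # neither -> dropped
}
print(promote(entries))  # ['polarising', 'consistent']
```

Under pure averaging, the “polarising” entry (average 2.0) would have lost to the “middling” one (average 2.67); the single-5-star override reverses that, which is exactly the behaviour the judges wanted.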
Knowing this, we rationed the number of 5-star scores that each judge could apply. When we got these main scores back, we realised that there were few images the judges unanimously agreed on. Because of this, we changed our final process so that each judge would present their favourite images from each category to the other judges, who would then discuss them. Although it was still up to each judge to pick the image they wanted to promote to the final round, it made sense for them to take into account what the other judges said. In this way, all the final images would be a favourite of at least one judge. The judges said they much preferred a subjective approach like this.
Michael Frye : Grand Landscape Winner
One of the really exciting things about submissions reaching such a high level is that we are able to produce a high-quality hardback book of the high-ranking images from the competition. This means we can ensure that many of each judge’s favourite images will be showcased, and we are soliciting comments from the judges about why they found them so interesting. In many ways, we consider the book the best representation of the awards because it allows us to showcase a wide range of images that meet the vision we had for the competition.
For the top 10% of images, we requested RAW files to verify that the images met our published criteria. If images did not meet the criteria, it was normally for one of the following four reasons.
The extent to which we post-process our images is very much a personal thing, and our goal in checking for “over-processing” wasn’t to discard images that exceeded a certain threshold. It was more about whether the post-processing had been done in a way that could deceive the viewer. We felt it important that there still be a connection to the scene, so if the post-processing broke that connection, we chose between making the judges aware of what the raw file looked like or rejecting the image if it was obvious that the connection was lost.
Using the dodge and burn tool to accentuate features or to make the most of the lighting in a photograph is quite normal. Using post-processing to create light that didn’t exist during capture is potentially a step too far. Only a couple of images did this so much that they were automatically rejected. A few were mentioned to the judges where appropriate.
We were fairly flexible with minor cloning used for cleaning work. All cloning of lens flare, dust and other ‘non-subject matter’ was allowed, as was minor cleaning work, so if a small twig was cloned out of a part of an image that wasn’t critical to its success, we let it pass. If we felt that the cloning had affected or distorted the main subject matter of the image, then we either referred it to the judges or disqualified the image. This was more about not punishing people who may have done some cloning work but forgotten about it, rather than saying it’s OK to clone things in general. Only a few images had such minor cloning, and it did not affect the scoring in any way.
This rule covers distortions to the frame that aren’t simulating perspective-correction lenses or correcting for lens distortions. The most obvious of these is mountain stretching, but there are also many uses of the warp tool to pull distractions out of frame or make a composition more symmetric (i.e. warping the subject into the centre of the frame or adjusting the height of features from side to side). We were fairly strict with this rule, but we did allow minor adjustments that looked ‘accidental’, that didn’t affect the subject matter or that were the result of stitching panoramas, etc.
Of all the images we raw-checked in the Grand Landscape category, only 16 out of 358 were automatically rejected, and 18 were ‘borderline’ and would have been referred to the judges had they been chosen. Of the borderline ones, I think the judges would have been fine with keeping the majority, though they may have scored a little lower.
The final number of rejected images worked out at 6%, which is a little lower than in raw checking I’ve done for other competitions.
Franka Gabler : Intimate/abstract Winner
The very last part of the judging was done live over Zoom. Managing 10 people in a Zoom session and keeping everything on track was a little bit of a challenge, especially having rejigged our approach just before the meeting, but our panel of judges helped make it work smoothly.
What became apparent as each judge showed their favourite images and the others commented on them was that each judge’s opinion of an image changed as it was discussed. This may have been because the initial reaction was influenced by the subject matter or by the image being of an iconic location. Or perhaps the composition was more complex and took a little time to appreciate. It should also be noted that, as each judging round progressed, the judges spent nearly an hour looking at all of the images selected.
Our takeaway from this is that perhaps we should find ways to integrate a little more face-to-face time before the final rounds, in order to get some reflection going between judges. We are definitely going to extend the judging process to make sure judges have longer to ‘live with’ their picks, and we may include a ‘review’ session a few days after judging to see if people are still happy with the results.
All of the judges commented on how useful it was to use Lightroom for the judging process. Being able to filter, order and quickly compare images allowed them to review their choices and reassess images. For instance, one judge tagged images they thought needed a little more time to assess and came back to just those later in the week to review their ratings.
Paul Hammett : Nightscape Winner
One of the things we, as organisers, dislike about the way many competitions run is the lack of feedback for any entrants beyond the winners. Some competitions do give out some general ‘certificates’ on a ‘bronze/silver/gold’ level but what that means is often hidden away or not explained.
We wanted to give something constructive to participants who got beyond the first stages. To this end, we spent a while creating certificates for each entry that give specific feedback on what stage in the process the image reached and explain in detail what each stage means.
You can see an example certificate above, and a key to the different parts, along with a more extensive description of the judging process, is available on the awards website.
Paul Hoelen : Aerial Winner
We’ve had an incredible response from our participants and also from the general press, with features in some of the mainstream newspapers (although not the ones I read) and also in the likes of Petapixel, DPReview, etc. In fact, Petapixel weren’t going to write about us until they discovered just how many entrants we had had.
One of our primary goals for next year will be to offer some seminars on how judges assess images and what you can look for in your own photography to adapt to this, whether through the photography itself, the editing and/or your image choices. It shouldn’t need repeating that winning competitions isn’t the end goal of photography, but understanding what other photographers like is at least interesting in and of itself and, I think, can go some way towards learning how to be a better visual communicator.
I'm going to split the gallery over two issues and just include the individual winners in this article. If you want to see all of the photos, including commended, and read some comments from the winners, please visit the website at Natural Landscape Photography Awards.